My Take On Assistive Tech For The Hackaday Prize

We’re in the last few weeks for entries in the 2016 Hackaday Prize — specifically, the challenge is to show off your take on assistive technology. This is a hugely broad category and I’ve been thinking about it for a while. I’m sure there’s a ton of low-hanging fruit that’s not obvious to everyone. This would be a great time to hit up the comments below and leave your “hey, I always thought someone should make…” ideas. I’m looking forward to reading them, and they might just inspire someone to spend the next couple of weeks hammering out a prototype to enter.

For me, it’s medication. I knew this could be a challenging problem, having gone through a few cycles of prescription medicines in my life. But recently I helped out a family member who was suddenly on many medications, taken at eight different times a day — including medications dosed once, twice, three, and six times per day. This was further compounded by sleep deprivation (having to set alarms at night to take the medicine) and the drowsy/woozy effects of the medicine itself. I can tell you firsthand that this is really tough for anyone to deal with, and it’s incredibly easy to make a mistake or not be able to remember whether you took a dose.

Pill Organizers Do No More, No Less

We’ve seen a number of pill organizers before and that’s what I reached for in this case. However, that organizer only had four slots for each day. I didn’t hack it (other than writing on the doors with a Sharpie to note when to take each) but even if buttons or LEDs were added, I’m not convinced it would be a marked improvement.

What you see above is my proposal for the medicine problem. Smartphones have become ubiquitous and the processing power and cameras of even budget phones are mind-blowing. I think it is entirely possible to write an app that uses computer vision to recognize pills and check them against the schedule. This may mean whipping the phone out of your pocket, or designing a pill box that has a phone stand next to it (saying that makes me think of using an RPi and a Pi camera). Grab your pills and validate them under the camera.
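To make the idea concrete, here’s a toy sketch of what the recognition step might look like in OpenCV. To be clear, this is my own illustration and not a real recognizer: the known-pill table, the circle-matching rule, and all the threshold values are made up.

```python
# Toy sketch of camera-based pill checking with OpenCV: segment round
# objects in the frame and compare their size and average color against
# a known-pill table. A real app would need far more robust recognition;
# every value in KNOWN_PILLS is invented.
import cv2
import numpy as np

KNOWN_PILLS = {                 # name: (approx radius in px, mean BGR color)
    "metoprolol": (24, (200, 200, 240)),
    "amoxicillin": (30, (60, 100, 220)),
}

def identify(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=40, param1=100, param2=40,
                               minRadius=15, maxRadius=60)
    found = []
    if circles is None:
        return found
    for x, y, r in np.round(circles[0]).astype(int):
        patch = frame[y - r:y + r, x - r:x + r]
        color = patch.reshape(-1, 3).mean(axis=0)
        for name, (radius, ref) in KNOWN_PILLS.items():
            # crude match on size plus color distance
            if abs(r - radius) < 6 and np.linalg.norm(color - ref) < 60:
                found.append(name)
    return found
```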

Useful Augmented Reality

The screen of the phone would use augmented reality to overlay information about the pills it sees — you know, like Pokemon Go but in a way that enriches your life (‘Pills, catch ’em all!’). New pills can be learned on the fly, taking the user to a screen to identify the pill and its dosing schedule. Taking the validation picture records when the medicine was taken, and the natural extension of this system is a pharmacy’s ability to push your dose schedule to your account when you pick up the prescription. A stretch goal would be keeping an eye out for drug interactions.
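The scheduling half is mostly bookkeeping. Here’s a minimal sketch of dose recording and “what’s due now”, where every name and field is invented for illustration and the taken-log is assumed to reset each day:

```python
# Minimal dose bookkeeping: each medication lists the hours it should be
# taken, and a successful camera validation logs that slot as done.
# All names and fields are invented; assume "taken" resets each day.
schedule = {
    "metoprolol": {"hours": [8, 20], "taken": []},       # twice a day
    "amoxicillin": {"hours": [6, 14, 22], "taken": []},  # three times a day
}

def record_dose(pill, scheduled_hour):
    """Log that the dose slotted for scheduled_hour was validated and taken."""
    schedule[pill]["taken"].append(scheduled_hour)

def doses_due(now_hour):
    """Medications with a scheduled hour that has passed and no matching log."""
    return [pill for pill, info in schedule.items()
            if any(h <= now_hour and h not in info["taken"]
                   for h in info["hours"])]

record_dose("metoprolol", 8)
print(doses_due(15))   # ['amoxicillin']: the 6:00 and 14:00 slots are unlogged
```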

This is all very much like how hospitals do it — they scan barcodes on the packaging and the patient’s bracelet and record each dose. This would be an easier user experience and, quite frankly, I think companies already in this space (like Snapchat and Niantic) could whip this up in a single-day hackathon, no problem.

Is it the perfect system? Maybe not. But there is no perfect system or we’d be using it by now. We need you, the world’s talent pool, to step up and make life a little better. Do it in prototype form by October 3rd and you’ll be eligible for one of twenty $1000 cash prizes and a chance at winning the Hackaday Prize. But even if you don’t build a single thing, one idea could be the spark that lets others change the world for the better. So let’s hear it!

Hackaday Prize Entry: LipSync, Smartphone Access For Quadriplegic People

For most of us, our touch-screen smartphones have become an indispensable accessory. Without thinking we tap and swipe our way through our digital existence; the promise of ubiquitous, truly portable computing has finally been delivered.

Smartphones present a problem, though, for some people with physical impairments. A touchscreen requires manual dexterity on a scale we able-bodied people take for granted, but it remains a useless glass slab to someone unable to use their arms.

LipSync is a project that aims to address the problem of smartphone usage for one such group, quadriplegic people. It’s a mouth-operated joystick for the phone’s on-screen cursor, with sip-and-puff vacuum control for simulating actions such as screen taps and the back button.

To the smartphone itself, the device appears as a standard Bluetooth pointing device, while at its business end the joystick and pressure sensor both interface to a Bluetooth module through an Arduino Micro. The EAGLE board and schematic files are available on the project’s hackaday.io page linked above, and there is a GitHub repository for the code.
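The real firmware is an Arduino sketch, but the sip-and-puff part reduces to a threshold decision around ambient pressure. Here’s a minimal illustration of that logic in Python; the neutral point, dead band, and sample values are assumptions, not numbers from the project:

```python
# Illustrative sip-and-puff classifier: pressure readings above or below an
# ambient baseline become "puff" (screen tap) and "sip" (back button) events.
# NEUTRAL and THRESHOLD are hypothetical ADC values.
NEUTRAL = 512      # reading at ambient pressure
THRESHOLD = 60     # deviation that counts as a deliberate action

def classify(sample):
    if sample > NEUTRAL + THRESHOLD:
        return "puff"    # positive pressure: simulate a screen tap
    if sample < NEUTRAL - THRESHOLD:
        return "sip"     # vacuum: simulate the back button
    return None          # inside the dead band: no event

print(classify(600), classify(430), classify(520))   # puff sip None
```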

Technology is such a part of our lives these days, and it’s great to see projects like this bridge the usability gaps for everyone. Needless to say, it’s a perfect candidate for the Assistive Technology round of the Hackaday Prize.
Hackaday Prize Entry: The Internet Of Meat

We’ve only just begun to see the proliferation of smart kitchen gadgets. Dumb crock pots, with the intelligence of a bimetallic strip, are being replaced by smart sous vide controllers. The next obvious step is barbecue. For his Hackaday Prize entry, [armin] is building a smart, eight-channel BBQ controller for real barbecue, with smoke and fans, vents and metal boxes.

This BBQ controller has been in the works for years now, starting with a thread in a German barbecue forum. The original build featured an original Raspberry Pi, and could relay temperatures from inside a slab of meat to anywhere within range of a WiFi network.

For his Hackaday Prize entry, [armin] is working on a vastly improved version. The new version supports eight temperature probes, temperature logging and plotting, a webcam, alarms, a web interface, a 433MHz radio, and PWM fan control. Yes, if you’re very, very clever you can use this project to build a barbecue that will cycle a fan and open and close a damper while monitoring the temperature of a brisket, then email you when it’s done. It’s the Internet of Meat, and it’s the most glorious thing we’ve seen yet.
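Controllers like this usually boil down to a PID loop per probe driving the blower. Here’s a minimal sketch of one such loop, assuming hypothetical gains, setpoint, and stand-in I/O functions (this is not [armin]’s code):

```python
# Minimal PID pit-controller sketch: read the pit probe, compare against the
# setpoint, and drive the blower fan's PWM duty cycle. Gains, setpoint, and
# both I/O stubs are hypothetical.
import random
import time

SETPOINT = 110.0             # target pit temperature in deg C, low and slow
KP, KI, KD = 4.0, 0.02, 1.0  # made-up tuning constants

def read_pit_temp():
    """Stand-in for a thermocouple/NTC probe read."""
    return 100.0 + random.uniform(-5.0, 5.0)

def set_fan_duty(duty):
    """Stand-in for a PWM write to the blower fan."""
    print(f"fan duty: {duty:.0f}%")

integral = 0.0
last_error = 0.0
while True:
    error = SETPOINT - read_pit_temp()
    integral += error
    derivative = error - last_error
    last_error = error
    duty = max(0.0, min(100.0, KP * error + KI * integral + KD * derivative))
    set_fan_duty(duty)
    time.sleep(1.0)          # one control step per second
```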

Hackaday Prize Entry: Mouse Controlled Microscope

You might imagine that all you would need to operate a microscope is a good set of eyes. Unfortunately, if you are an amputee, that may not be the case. Veterinary lab work, for example, requires control of focus as well as the ability to move the sample in both X and Y directions, and these are not tasks that can easily be performed simultaneously with only a single hand.

[ksk]’s solution to this problem is to use geared stepper motors and an Arduino Mega to allow the manual functions of the microscope to be controlled from a computer mouse or trackball. The motors are mounted on the microscope controls with a custom 3D-printed housing. A rotary selector on the control box containing the Arduino allows the user to select a slow or fast mode for fine or coarse adjustment.
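The heart of such a setup is a mapping from mouse counts to step counts, scaled by the selected mode. A sketch of that mapping, with invented scale factors and a stand-in for the Arduino’s stepper commands:

```python
# Sketch of the mouse-to-stage mapping: X/Y mouse deltas become X/Y stage
# steps and the scroll wheel drives focus, all scaled by the coarse/fine
# selector. SCALE values and step() are invented stand-ins.
SCALE = {"fine": 1, "coarse": 10}   # steps per mouse count

def step(axis, count):
    """Stand-in for sending a step count to one geared stepper."""
    if count:
        print(f"{axis}: {count:+d} steps")

def handle_mouse(dx, dy, wheel, mode="fine"):
    s = SCALE[mode]
    step("x", dx * s)          # slide the stage left/right
    step("y", dy * s)          # slide the stage forward/back
    step("focus", wheel * s)   # turn the focus knob

handle_mouse(3, -2, 1, mode="coarse")   # x: +30, y: -20, focus: +10
```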

It’s fair to say that this project is still a work in progress; we’re featuring it in our series of posts looking at Hackaday Prize entries. However, judging by the progress reported so far it’s clear that this is a project with significant potential, and we can see the finished product being of use to anyone operating a microscope.

We’ve featured one or two mouse-controlled projects over the years, though none controlling microscopes. Here’s one mouse-controlled robot arm, and we’ve covered another arm with a 3D mouse.

Hackaday Prize Entry: Text To Speech The Hard Way

Studies have shown that reading to children leads to improved academic performance later in life, a trait that will make them more competitive in the workforce and ultimately happier human beings. It follows, then, that having a robot read to children will also lead to happier and more productive adults, while normalizing the cyborg uprising and AI apocalypse of 2037.

It’s a good thing the above paragraph is a complete non-sequitur and has nothing to do with this Hackaday Prize entry. The TextEye, [Markus]’ entry for the Assistive Technology portion of the Hackaday Prize, is a handheld device that translates the written word into speech, useful for anyone who either can’t see well or can’t read gooder. Yes, it will also read to children, but so did Teddy Ruxpin.

If you’re keeping track, this isn’t the first time [Markus] has entered this project in a Hackaday Prize contest. The first time was six months ago in the Hackaday / Adafruit Raspberry Pi Zero contest. [Markus] was inspired by a group of blind computer science students using specialized hardware that allowed them to study the same thing as everyone else.

Since the first few project logs, a lot has changed in this project. You can buy a Pi Zero easily now, and the updated Pi Zero 1.3 comes with a camera connector. [Markus] is swapping out his Pi Model A and USB webcam for the Pi Zero and Pi camera. The software remains the same — GraphicsMagick, Tesseract OCR, Festival, and WiringPi handle reading text and turning those words into speech — with a slight refactoring of the code. It’s a great use for the Pi Zero and an excellent example of assistive technology, and we’re happy to see it again in the Hackaday Prize.
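For a feel of how that toolchain fits together, here’s a sketch of the OCR-to-speech chain as shell tools called from Python (WiringPi’s GPIO handling is omitted). The file names are invented, and the actual project wires things up around the Pi camera rather than like this:

```python
# Sketch of the TextEye pipeline: GraphicsMagick cleans up the camera frame,
# Tesseract OCRs it, and Festival speaks the result. File names are
# hypothetical stand-ins.
import subprocess

# 1. Preprocess the photo (grayscale + contrast) for better OCR results
subprocess.run(["gm", "convert", "page.jpg", "-colorspace", "gray",
                "-normalize", "page.png"], check=True)

# 2. OCR: writes the recognized text to out.txt
subprocess.run(["tesseract", "page.png", "out"], check=True)

# 3. Text to speech through the speaker
subprocess.run(["festival", "--tts", "out.txt"], check=True)
```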

Hackaday Prize Entry: AutoFan Saves Tired Drivers With Face Recognition

Long distance driving can be tedious at times. The glare of the sun and the greenhouse effect of all your car’s windows make it hot and dry. You turn on the fan, or the air conditioning if you have it, and that brings relief. Soon enough you’ve got another problem: the cold, dry air is uncomfortable on your eyes. Eventually, as you become more tired, you find yourself needing the air on your face more and more to stay alert. You thus spend most of the journey fiddling with your vents or adjusting the climate controls. Wouldn’t it be great if the car could do all that for you?

AutoFan is a project from [hanno] that aims to automate this process intelligently. It has a fan with steerable louvres, driven by a Raspberry Pi 2 with an attached webcam. The Pi computes the position of the driver’s face and ensures the air from the fan is directed to one side of it. If it sees the driver’s blink rate increasing, a sign that they are becoming tired, it directs the air at their face.

The build logs go into detail on the mathematics of calculating servo angles and correcting for camera lens distortion in OpenCV. They also discuss the Python code used to take advantage of the multicore architecture, and to control the servos. The prototype fan housing can be seen in the video below the break, complete with an unimpressed-looking cat. For those of you interested in the code, he has made it available in a GitHub repository.
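As a taste of the face-tracking step, here’s a short OpenCV sketch that finds a face with the stock Haar cascade and turns its position into a normalized steering offset. AutoFan’s own servo mapping and lens correction are more involved; everything here is a simplified stand-in:

```python
# Find the driver's face in each webcam frame with OpenCV's bundled Haar
# cascade, then derive a horizontal offset a servo loop could steer the
# louvres toward (or away from). The servo mapping itself is omitted.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    for (x, y, w, h) in faces:
        face_center = x + w / 2
        # Normalized offset from frame center, -1..1: the input to the
        # louvre steering loop.
        offset = (face_center - frame.shape[1] / 2) / (frame.shape[1] / 2)
        print(f"face offset: {offset:+.2f}")
```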


Hackaday Prize Entry: High End Preamps

While compact discs are seeing an uptick in popularity thanks to a convenient format that offers lossless, high-quality audio at a 44.1 kHz sample rate with 16-bit depth, some people are still riding the vinyl bandwagon of 2010. With that comes a need for the best hardware, and that means expensive cartridges and preamps designed by someone who knows what they’re doing.

For this year’s Hackaday Prize, [skrodahl] is building a really, really good preamplifier for moving coil turntable cartridges. It’s already built, it’s already tested, and the results are good: it produces between 36 and 46 dB of gain, -110 dB of dynamic range, and a signal-to-noise ratio of 79.46 dB relative to a 5 mV input. That puts this preamplifier into the same territory as preamps sold with serial numbers, crystal lattices, and other audiophile nonsense.
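For context on the gain figures, voltage gain in decibels is 20 times the log of the voltage ratio, so it’s easy to see what the preamp does to a cartridge’s few millivolts:

```python
# Voltage gain in dB is 20*log10(Vout/Vin); inverting it shows what the
# quoted 36-46 dB range means as a plain multiplier.
for db in (36, 46):
    ratio = 10 ** (db / 20)
    print(f"{db} dB is a voltage gain of about x{ratio:.0f}")
# 36 dB is a voltage gain of about x63
# 46 dB is a voltage gain of about x200
```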

The quality of this preamp comes from the design, and like any good open hardware project, [skrodahl] has made the schematic and PCB layout of this preamp completely open. It’s a great preamp, and a great entry for the Hackaday Prize.