The World Health Organization estimates that around 90% of the 285 million or so visually impaired people worldwide live in low-income situations with little or no access to assistive technology. For his Hackaday Prize entry, [Tiendo] has created a simple and easily reproducible way-finding device for people with reduced vision: a bracelet that detects nearby objects and alerts the wearer to them.
It does its job using an ultrasonic distance sensor and an Arduino Pro Mini. The bracelet has two feedback modes: audio and haptic. In audio mode, the bracelet will begin to beep when an object is within 2.5 meters. And it behaves the way you’d expect—get closer to the object and the beeping increases; back away and it decreases. Haptic mode involves two tiny vibrating disk motors attached to small PVC cuffs that fit on the thumb and pinky. These motors will buzz differently based on the person’s proximity to a given object. If an object is 1 to 2.5 meters away, the pinky motor will vibrate. Closer than that, and it switches over to the thumb motor.
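The distance-to-feedback logic described above is simple enough to sketch. The 2.5 m audio threshold and the 1 m pinky/thumb crossover come from the article; the function name, return shape, and the linear beep-interval scaling are our assumptions for illustration, not [Tiendo]'s actual firmware.

```python
def feedback(distance_m):
    """Map a measured distance to bracelet feedback.

    Returns (beep_interval_s, active_motor): the beep interval for audio
    mode and which vibration motor buzzes in haptic mode.
    """
    if distance_m > 2.5:
        return None, None  # nothing in range: stay silent
    # Audio mode: beep faster as the object gets closer. Scale the
    # interval linearly from 1.0 s at 2.5 m down to 0.1 s at contact.
    beep_interval = 0.1 + 0.9 * (distance_m / 2.5)
    # Haptic mode: pinky motor for 1 to 2.5 m, thumb motor when closer.
    motor = "pinky" if distance_m >= 1.0 else "thumb"
    return beep_interval, motor
```

On an Arduino Pro Mini the same mapping would drive a piezo and two motor pins inside the main loop.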
To add to the thriftiness of this project, [Tiendo] re-used other objects where he could. The base of the bracelet is a cuff made from PVC. The nylon chin strap and plastic buckle from a broken bike helmet make it adjustable to fit any wrist. To keep the PVC cuff from chafing, he slipped small pieces from an old pair of socks on to the sides.
It’s easy to see why this project is a finalist in our Best Product contest. It’s a simple, low-cost assistive device made from readily available and recycled materials, and it can be built by anyone who knows a little bit about electronics. Add in the fact that it’s lightweight and frees up both hands, and you have a great product that can help a lot of people. Watch it beep and buzz after the break. Continue reading “Hackaday Prize Entry: A Bracelet for the Blind”
Chorded keysets can be found all over. For instance, Braille writers and court stenographers both use them. These chorded keyboards create each letter by pressing a combination of keys rather than just one, making for much smaller keyboards. [Christine] got the idea to create a wearable rig that uses an accelerometer and vibe motor attached to each finger to serve as a one-handed, no-look, silent keyer. Forget small keyboards, this project does away with the keyboard altogether, relying on the accelerometers to keep track of your fingers.
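The core of any chorded keyer is a lookup from a combination of fingers to a character. The table below is a hypothetical mapping of our own invention, purely to illustrate the idea; it is not [Christine]'s actual chord layout.

```python
# Hypothetical chord table: each letter is a set of simultaneously
# tapped fingers. Using frozenset makes the combination hashable and
# order-independent, which is exactly what a chord is.
CHORDS = {
    frozenset({"index"}): "e",
    frozenset({"index", "middle"}): "t",
    frozenset({"thumb", "index"}): "a",
    frozenset({"thumb", "middle", "ring"}): "o",
}

def decode_chord(tapped_fingers):
    """Return the letter for a set of simultaneously tapped fingers,
    or None if the chord isn't in the table."""
    return CHORDS.get(frozenset(tapped_fingers))
```

In the real device the `tapped_fingers` set would come from thresholding the per-finger accelerometer readings within a short time window.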
[Christine]’s prototype consists of a Bluno BLE controller, a GSM module, and a few accelerometers and motors. The vibration motors not only provide haptic feedback so you know you tapped something, but also replay the chords so you can double-check what you’re writing.
Typically, one-handed keyboards rely on button presses, with no-look use dependent on memorizing the layout—think of a 10-key pad. [Christine]’s project lets you type on any surface or none at all, making it handy for typing while you work with the other hand. It also has great potential for visually impaired users.
Is it possible to effectively communicate tactile pedagogical messages in a heuristic tele-haptic proto-sculpting environment? Let’s try rephrasing that. What if you could use a robot to help teach someone a creative skill? Imagine guiding someone’s hand with a paintbrush. Now imagine guiding a bunch of peoples’ hands with paintbrushes, using a series of linked robots.
From [Morgan Rauscher] comes Art-Bot 2.0 — a creative learning tool that provides an entirely new way to teach painting, sculpting, or pretty much anything requiring dexterity or a tool. We covered Art-Bot 1.0 a few years ago, but in case you’ve forgotten, it was an eight-foot-tall, chainsaw-wielding robot inside an enclosure. Even children, using the remote, could play with chainsaws.
Constructed with the help of the Hexagram Institute, Art-Bot 2.0 is made up of three rugged, servo-driven robot arms. One is for the teacher to guide movements, a second performs those same movements on a workpiece, and a third lets a student feel what is happening.
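The linked-arm scheme boils down to a leader/follower control loop: the teacher's joint angles become setpoints that the other two arms ease toward each tick. This is a minimal sketch of that idea under our own assumptions (the gain value and simple proportional smoothing are ours, not details from [Morgan Rauscher]'s build).

```python
def mirror_step(leader_angles, follower_angles, gain=0.3):
    """One control tick of a leader/follower arm link: move each
    follower joint a fraction of the way toward the matching leader
    joint. Repeated every tick, the follower converges on the leader's
    pose without jerky jumps."""
    return [f + gain * (l - f) for l, f in zip(leader_angles, follower_angles)]
```

Running this against the student's arm in reverse — feeding the workpiece arm's resistance back as setpoints — is what would let the student *feel* the motion as well as see it.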
Continue reading “Tele-Haptic Proto-Sculpting: Using Robots To Teach”
The “absorbed device user” meme, like someone following Google Maps on a smartphone so closely that they walk out into traffic, is becoming all too common. Not only can an interface that requires face time be a hazard to your health in traffic, it’s also not particularly useful to the visually impaired. Haptic interfaces can help the sighted and the visually impaired alike, but a smartphone really only has one haptic trick – vibration. But a Yale engineer has developed a 3D printed shape-shifting navigation tool that could be a haptics game changer.
Dubbed the Animotus by inventor [Ad Spiers], the device is a hand-held cube split into two layers. The upper layer can swivel left or right and extend or retract, giving the user both tactile and visual clues as to which direction to walk and how far to the goal. For a field test of the device, [Ad] teamed up with a London theater group in an interactive production of the play “Flatland”, the bulk of which was staged in an old church in total darkness. As you can see in the night-vision video after the break, audience members wearing tracking devices were each given an Animotus to allow them to navigate through the interactive sets. The tracking data indicated users quickly adapted to navigation in the dark while using the Animotus, and some became so attached to their device that they were upset by the ending of the play, which involved its mock confiscation and destruction.
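The two cues the Animotus presents map cleanly onto two numbers: the relative bearing to the goal (swivel) and the remaining distance (extension). Here is a sketch of that computation; the parameter names, the flat 2D coordinates, and the 10 m full-extension scaling range are our assumptions for illustration, not [Ad Spiers]' implementation.

```python
import math

def animotus_cues(pos, heading_deg, goal, max_range=10.0):
    """Compute the two haptic cues for an Animotus-like device:
    how far to swivel the top layer (relative bearing to the goal)
    and how far to extend it (distance, scaled to the device's travel).
    """
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Relative bearing normalized to (-180, 180]; positive means the
    # goal is counterclockwise (to the user's left).
    rel = (bearing - heading_deg + 180) % 360 - 180
    distance = math.hypot(dx, dy)
    extension = min(distance / max_range, 1.0)  # 0 = retracted, 1 = full
    return rel, extension
```

The wearer's position and heading would come from the theater's tracking system; the device only ever needs these two scalars.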
Performing art applications aside, there’s plenty of potential for haptics with more than one degree of freedom. Imagine a Bluetooth interface to the aforementioned Google Maps, or an electronic seeing-eye dog that guides a user around obstacles using an Animotus and a camera. There’s still plenty of utility in traditional haptics, though, as this Hackaday Prize semi-finalist shows.
Continue reading “Experimental Theater Helps Field Test Haptic Navigation Device”
There are 3.6 million deafblind people in the world, and by far their greatest problem is one of communication. For his entry for the Hackaday Prize, our own miracle worker on hackaday.io, [Anderson], is creating a system that enables haptic communication for a variety of devices. It’s called Tact Tiles, and instead of creating a single device, he is building an entire system that enables a multitude of communication devices for deafblind people.
The basic unit of the Tact Tiles system is a small, touch-sensitive vibrating pad. These tiny PCBs can be fitted to just about anything, including a wired glove or whatever haptic interface anyone can dream up. The core of the device is a small PCB that can control 32 of these vibrating pads and communicates with a smartphone or computer over a Bluetooth connection.
With a little bit of software, the Tact Tiles can be configured in any way imaginable, from mapping individual tiles to letters of the alphabet, to mapping gestures to letters, or any combination in between. [Anderson] has a great video demoing the possibilities of his device; you can check that out below.
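A minimal sketch of the "tiles to letters" configuration idea: each letter activates a subset of pads, Braille-style. The cell patterns below follow standard Braille dot numbering for a–d, but the function names and data layout are our illustration, not the actual Tact Tiles software.

```python
# Standard Braille dot numbers for the first few letters:
# a = dot 1, b = dots 1+2, c = dots 1+4, d = dots 1+4+5.
BRAILLE = {"a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}}

def tiles_for(word):
    """Return the pad-activation pattern for each letter of a word,
    ready to be sent over Bluetooth to the pad controller."""
    return [BRAILLE[ch] for ch in word]
```

The same lookup could just as easily map gestures to letters instead; that flexibility is the point of splitting the system into generic pads plus software.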
Continue reading “Hackaday Prize Semifinalist: Tact Tiles”
For his project entered in the Hackaday Prize, [Neil] is working on a navigation aid for the blind. He’s calling his device Pathfinder, and it’s designed to allow greater freedom of motion for the disabled.
Pathfinder is a relatively simple device, with a cheap, off the shelf ultrasonic distance sensor, an ATMega, and a few passives. On its own, the ultrasonic distance sensor is only accurate to about 5%. By incorporating a temperature sensor, [Neil] was able to nail down the accuracy of his sensor to about 1%. Impressive!
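The reason temperature helps is that ultrasonic ranging converts an echo time into a distance using the speed of sound in air, which varies with temperature (roughly c = 331.3 + 0.606·T m/s, with T in °C). A sketch of the compensated conversion, with the function name and signature being our own:

```python
def distance_m(echo_time_s, temp_c):
    """Temperature-compensated ultrasonic ranging. A fixed-speed
    conversion drifts by a few percent across everyday temperatures,
    which is consistent with the improvement [Neil] saw. The measured
    echo time is the round trip, hence the divide by two."""
    c = 331.3 + 0.606 * temp_c  # speed of sound in air, m/s
    return c * echo_time_s / 2
```

Between 0 °C and 30 °C the speed of sound changes by about 5%, so folding in a cheap temperature sensor recovers most of that error.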
For the machine-to-human interface, [Neil] chose haptic feedback: small vibration motors tucked away inside a wristband. It’s by far the easiest way to add the output needed, and with a haptic motor driver, it’s easy to add specialized drive patterns to the vibration motors.
You can check out [Neil]’s quarterfinal entry video for the Pathfinder below.
Continue reading “Hackaday Prize Semifinalist: Haptic Navigation”
[Roy Shilkrot] and his fellow researchers at the MIT Media Lab have developed the FingerReader, a wearable device that aids in reading text. Worn on the index finger, it receives input from print or digital text and outputs spoken words – and it does this on-the-go. The FingerReader consists of a camera and sensors that detect the text. A series of algorithms the researchers created are used along with character recognition software to create the resulting audio feedback.
There is a lot of haptic feedback built into the FingerReader. It was designed with the visually impaired as the primary users, for times when Braille is not practical or simply unavailable. The FingerReader requires the wearer to touch the tip of their index finger to the print or digital screen, tracing the line. As the user does so, the FingerReader is busy calculating where lines of text begin and end, taking pictures of the words being traced, and converting them to text and then to spoken words. As the user reaches the end of a line of text or begins a new line, it vibrates to let them know. If a user’s finger begins to stray, the FingerReader can vibrate from different areas using two motors, along with an audible tone, to alert them and help them find their place.
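The stay-on-the-line feedback reduces to comparing the fingertip's vertical position against the tracked text baseline and choosing which motor should buzz. This is a sketch under our own assumptions: the 15-pixel tolerance and the returned direction labels are ours for illustration, not values from the MIT Media Lab prototype.

```python
def drift_alert(finger_y, baseline_y, tolerance_px=15):
    """Return the drift direction ("up"/"down") so the matching motor
    can vibrate, or None when the finger is still on the line. Image
    coordinates grow downward, so a positive offset means the finger
    has drifted below the baseline."""
    offset = finger_y - baseline_y
    if abs(offset) <= tolerance_px:
        return None  # on the line, no alert
    return "down" if offset > 0 else "up"
```

In the real device `baseline_y` would come from the line-detection step the researchers describe, recomputed as each new frame is captured.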
The current prototype needs to be connected to a laptop, but the researchers are hoping to create a version that only needs a smartphone or tablet. The videos below show a demo of the FingerReader. For a proof-of-concept, we are very impressed. The FingerReader reads text of various fonts and sizes without a problem. While the project was designed primarily for the blind or visually impaired, the researchers acknowledge that it could be a great help to people with reading disabilities or as a learning aid for English. It could make a great on-the-go translator, too. We hope that [Roy] and his team continue working on the FingerReader. Along with the Lorm Glove, it has the potential to make a difference in many people’s lives. Considering our own lousy eyesight and family’s medical history, we’ll probably need wearable tech like this in thirty years!
Continue reading “Trace Your Book or Kindle with the FingerReader”