A Virtual Cane for the Visually Impaired

[Roman] has created an electronic cane for the visually impaired. Blind and visually impaired people have used canes and walking sticks for centuries. However, it wasn’t until the 1920s and 1930s that the white cane came to be synonymous with the blind. [Roman] is attempting to improve on the white cane design by bringing modern electronics to the table. With a mixture of hardware and clever software running on an Android smartphone, [Roman] has created a device that could help a blind person navigate.

The white cane has been replaced with a virtual cane, consisting of a 3D printed black cylinder. The cane is controlled by an ATmega328 running the Arduino bootloader and [Roman's] code. Peeking out from the end of the handle is a Maxbotix ultrasonic distance sensor. Distance information is reported to the user via a piezo buzzer and a vibration motor. An induction coil allows for charging without fumbling for tiny connectors. A Bluetooth module connects the virtual cane to the other half of the system, an Android phone.
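
The firmware itself is Arduino C on the ATmega328 and hasn’t been published here, but the core idea is easy to sketch: read the range from the ultrasonic sensor and shorten the gap between buzzer/vibration pulses as obstacles get closer. Here is a minimal illustration of that mapping in Python; the thresholds and timings are our assumptions, not [Roman]’s actual values.

# Illustration only: map an ultrasonic range reading to the interval between
# buzzer/vibration pulses, the way the cane's firmware presumably does.
# Thresholds and timings are assumptions, not [Roman]'s actual values.

MAX_RANGE_CM = 300   # Maxbotix sensors typically report out to a few meters
MIN_RANGE_CM = 20    # anything closer is treated as "right in front of you"


def feedback_interval_ms(distance_cm):
    """Closer obstacles -> shorter gaps between feedback pulses."""
    distance_cm = max(MIN_RANGE_CM, min(distance_cm, MAX_RANGE_CM))
    span = (distance_cm - MIN_RANGE_CM) / (MAX_RANGE_CM - MIN_RANGE_CM)
    return 50 + span * (1000 - 50)   # 50 ms up close, 1000 ms at max range


if __name__ == "__main__":
    for d in (25, 75, 150, 290):
        print(f"{d:>3} cm -> pulse every {feedback_interval_ms(d):.0f} ms")

On the real hardware the same mapping would drive the piezo and the vibration motor directly.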

[Roman's] Android app runs solely on voice prompts and speech synthesis. Navigation commands such as “Take me to <address>” use the phone’s GPS and the Google Maps API to retrieve route information. [Roman's] app then speaks the directions for the user to follow. Help can be summoned by simply stating “Send <contact name> my current location.” In the event that the user drops their virtual cane, “Find my device” will send a Bluetooth command to the cane. Once the command is received, the cane will reveal its position by beeping and vibrating.
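
The cane-side half of “Find my device” only needs the Bluetooth module to hand a short command to the ATmega328. Here is a rough picture of the trigger, written as a Python script talking to an RFCOMM serial port rather than as the actual Android code; the port path and the command string are placeholders we made up for illustration.

# Rough illustration of the "Find my device" trigger. The real app runs on
# Android; this Python/pyserial stand-in assumes the cane's Bluetooth module
# is bound to an RFCOMM serial port and that the firmware reacts to a simple
# text command -- both the port path and the command are made-up placeholders.
import serial

RFCOMM_PORT = "/dev/rfcomm0"   # assumed binding of the cane's Bluetooth link
FIND_COMMAND = b"FIND\n"       # hypothetical command the firmware listens for


def ping_cane():
    with serial.Serial(RFCOMM_PORT, 9600, timeout=2) as link:
        link.write(FIND_COMMAND)   # cane answers by beeping and vibrating


if __name__ == "__main__":
    ping_cane()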

We’ve said it before, and we’ll say it again. Using technology to help disabled people is one of the best hacks we can think of. Hackaday alum [Caleb Kraft] has been doing just that with his work at The Controller Project. [Roman] is still actively improving his cane. He’s already won a gold medal at the Niagara Regional Science and Engineering Fair. He’s entered his project in several more science events, including the Canada Wide Science Fair and the Google Science Fair. Good luck [Roman]!

Audiobook player uses only NFC tags for control

[Martynas Mickevičius] has a grandmother who is visually impaired. She enjoys listening to audiobooks and has been doing so using a DVD player for quite some time. The problem is that there is no way for her to save her position in between listening sessions. He set out to help by building a dedicated audiobook reader that doesn’t have any buttons.

The project was inspired by a one-button reader we featured back in November. Like that project, [Martynas] chose to use the inexpensive, yet powerful Raspberry Pi. The main difference comes in the control method. He’s using an NFC tag reader, which is mounted in the top portion of the RPi case. The image above shows the rig during prototyping, but his final version is all bundled up in the pink enclosure and only needs the power and audio cables connected to it. See for yourself in the demo after the jump.

Each book has its own NFC tag. When she’s done listening she can simply cut the power, and playback will resume in the same place the next time the device is plugged in. The tag setup is a vast improvement since it allows an entire library to be stored on the SD card, with each title chosen by its own tag. With this hardware in place it should be trivial to code extensions to the system, like a script that uses text-to-speech to announce which book is being played before playback starts.
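
[Martynas] hasn’t posted the script itself here, but the tag-to-book logic boils down to something like the sketch below, assuming the audio goes through mpd (the Music Player Daemon) via python-mpd2 and that a small helper stands in for whatever NFC reader is attached. The UID-to-folder table, file names, and the reader stub are all made-up placeholders.

# Sketch of the tag-to-book logic, assuming playback goes through mpd via
# python-mpd2 and that read_tag_uid() stands in for whatever NFC reader is
# attached. The UID-to-folder table and file names are made-up placeholders.
import json
import os
import time

from mpd import MPDClient   # python-mpd2

BOOKS = {                                  # hypothetical tag UID -> book folder
    "04a2249b1c2d80": "books/war_and_peace",
    "04b7110a6f3e81": "books/sherlock_holmes",
}
STATE_FILE = "positions.json"              # survives power cuts for resume


def read_tag_uid():
    """Placeholder for the real NFC reader; here we just type a UID in."""
    return input("Scan tag (enter UID): ").strip()


def load_positions():
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {}


def save_position(uid, client):
    status = client.status()
    positions = load_positions()
    positions[uid] = {"song": int(status.get("song", 0)),
                      "elapsed": float(status.get("elapsed", 0.0))}
    with open(STATE_FILE, "w") as f:
        json.dump(positions, f)


def play_book(uid, client):
    client.clear()
    client.add(BOOKS[uid])                 # queue every track of the book
    saved = load_positions().get(uid)
    if saved:                              # pick up where she left off
        client.play(saved["song"])
        client.seekcur(saved["elapsed"])
    else:
        client.play()


if __name__ == "__main__":
    client = MPDClient()
    client.connect("localhost", 6600)
    uid = read_tag_uid()
    play_book(uid, client)
    while True:                            # checkpoint so a power cut loses little
        time.sleep(10)
        save_position(uid, client)

Periodically writing the song index and elapsed time to disk is what lets her cut the power mid-chapter and pick up right where she left off.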

One-button audiobook player made from a Raspberry Pi

[Michael Clemens] was looking for gifts for his grandmother’s 90th birthday. She is visually impaired and loves listening to audiobooks. The problem is that she doesn’t really get the hang of using electronics. He made things easy by building her a one-button audiobook player.

The Raspberry Pi board is a perfect solution for this project. It’s cheap, it has an audio port, it has storage for the books on the system SD card, and it runs Linux. The last part is key as it made things very simple when [Michael] started pulling together the various components.

When the RPi powers up it drops immediately into a Python script which loads the audio track and leaves the music player daemon paused. The yellow button seen above works as a play/pause toggle when clicked. If the listener misses something, she can hold the button for more than four seconds to go back one track. Loading new books is easy too: [Michael] copies the files onto a thumb drive with a special volume label. When the drive is plugged into the RPi’s USB port, the script automatically copies the book over and starts playing it once the drive is removed. He included a video demo on his project page linked above.
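
For a sense of how little code the button handling takes, here is a rough reconstruction of the press-versus-hold behavior described above. This is not [Michael]’s actual script; it assumes the button is wired to a GPIO pin with the internal pull-up enabled and that mpd already has the book queued up and sitting in pause.

# Rough reconstruction of the one-button behavior, not [Michael]'s actual
# script: a short press toggles play/pause, a hold of more than four seconds
# skips back one track. The GPIO pin number is an assumption, and mpd is
# expected to already have the book queued up and sitting in pause.
import time

import RPi.GPIO as GPIO
from mpd import MPDClient   # python-mpd2

BUTTON_PIN = 17              # assumed wiring: button to ground, internal pull-up
HOLD_SECONDS = 4.0           # hold longer than this to go back one track

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

client = MPDClient()
client.connect("localhost", 6600)

try:
    while True:
        if GPIO.input(BUTTON_PIN) == GPIO.LOW:           # button pressed
            pressed_at = time.time()
            while GPIO.input(BUTTON_PIN) == GPIO.LOW:    # wait for release
                time.sleep(0.05)
            held = time.time() - pressed_at
            if held > HOLD_SECONDS:
                client.previous()                        # back one track
            elif client.status()["state"] == "play":
                client.pause(1)                          # pause playback
            else:
                client.pause(0)                          # resume playback
        time.sleep(0.05)
finally:
    GPIO.cleanup()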

Rubik’s Cube for the blind

Check out this Rubik’s Cube for the blind. The idea didn’t start off as an accessibility hack, but as a way for [Brian Doom] to keep track of where each face goes while manipulating the puzzle. The modified cube gave him tactile feedback, and it was his ability to work it in dim lighting that made him realize it could be useful to others.

Now when we first thought of a puzzle for the blind, the term ‘Braille’ immediately jumped to mind. But this doesn’t use it. That’s great, because not all visually impaired people can read Braille. Instead, this cube uses dimension and texture to identify each of the puzzle faces. There are mushroom-shaped knobs, Phillips screws, adhesive rubber bumpers, raised-text labels from a label maker, and a few other items, one for each color. This doesn’t prevent those with sight from playing either; it’s something of a Rubik’s cube for all. Well, all except for the robots made to solve a stock cube.

[via Dvice]

Vanishing point robot guidance

Students at the National University of Computer and Emerging Sciences in Pakistan have been working on a robot to assist the visually impaired. It looks pretty simple, just a mobile base that carries a laptop and a webcam. The bot doesn’t have a map of its environment, but instead uses vanishing point guidance. As you can see in the image above, each captured frame is analyzed for indicators of perspective, which can be extrapolated all the way to the vanishing point where the green lines above intersect. Here it’s using stripes on the floor, as well as the corners where the walls meet the ceiling to establish these lines. From the video after the break you can see that this method works, and perhaps with a little bit of averaging they could get the bot to drive straight with less zig-zagging.
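
For the curious, the frame-by-frame processing can be approximated in a few dozen lines of OpenCV. The sketch below is not the students’ code, just one common way to do it: pull long edges out with a probabilistic Hough transform, intersect the lines that slope toward the horizon, and take the median intersection as the vanishing point. All of the thresholds are guesses.

# One common way to estimate a vanishing point from a single frame -- not the
# students' code, and every threshold here is a guess: detect long edges with
# a probabilistic Hough transform, intersect the lines that slope toward the
# horizon, and take the median intersection point.
import itertools

import cv2
import numpy as np


def vanishing_point(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=10)
    if lines is None:
        return None

    # Keep segments that slope toward the horizon; drop near-vertical and
    # near-horizontal ones, which never converge usefully.
    candidates = []
    for x1, y1, x2, y2 in lines[:, 0].astype(float):
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if 15 < angle < 75 or 105 < angle < 165:
            candidates.append((x1, y1, x2, y2))

    # Intersect every pair of candidate segments (extended to full lines).
    points = []
    for (x1, y1, x2, y2), (x3, y3, x4, y4) in itertools.combinations(candidates, 2):
        denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        if abs(denom) < 1e-6:
            continue                       # (nearly) parallel lines never meet
        px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / denom
        py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / denom
        points.append((px, py))

    if not points:
        return None
    return tuple(np.median(np.array(points), axis=0))   # robust to outliers

Steering then amounts to comparing the x-coordinate of that point against the center of the frame and correcting toward it, which is also where a little temporal averaging would smooth out the zig-zag.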

Similar work on vanishing point navigation is being done at the University of Minnesota. [Pratap R. Tokekar's] robot can also be seen after the break, zipping along the corridor and even making turns when it runs out of hallway.

Data plotting for the visually impaired

This setup helps to represent data in a meaningful way for visually impaired people. It uses a combination of physical objects to represent data clusters, and audio feedback when manipulating those objects. In the video after the break you’ll see that the cubes can orient themselves to represent data clusters. The table top acts as a graphing field, with a textured border as a reference for the user. A camera mounted below the clear surface allows image processing software to calculate the locations of the cubes. Each cube is motorized and contains an Arduino and a ZigBee module, listening for positioning information from the computer that is doing the video processing. Once in position, the user can move the cubes, with modulated noise giving a measure of how near each one is to the heart of a data cluster.
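
The audio side is essentially a distance-to-modulation mapping: the vision system reports where a cube sits on the table, and the noise changes character as the cube approaches the nearest cluster centroid. A toy sketch of that mapping follows; the centroids, units, and modulation range are invented for illustration, not taken from the team’s work.

# Toy sketch of the audio-feedback mapping, not the team's code: the vision
# system reports a cube's table position, and the feedback noise is modulated
# faster as the cube nears the closest cluster centroid. The centroids, units
# and modulation range below are invented for illustration.
import math

CLUSTER_CENTROIDS = [(12.0, 30.0), (55.0, 18.0), (40.0, 62.0)]   # table cm, hypothetical


def distance_to_nearest_cluster(cube_xy):
    return min(math.dist(cube_xy, c) for c in CLUSTER_CENTROIDS)


def modulation_rate_hz(cube_xy, max_distance=80.0):
    """Closer to a cluster's heart -> faster modulation of the feedback noise."""
    d = min(distance_to_nearest_cluster(cube_xy), max_distance)
    return 2.0 + (1.0 - d / max_distance) * 18.0    # 2 Hz far away, 20 Hz on target


if __name__ == "__main__":
    for pos in [(13.0, 29.0), (70.0, 70.0)]:
        print(pos, "->", f"modulate at {modulation_rate_hz(pos):.1f} Hz")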

The team plans to conduct further study on the usefulness of this interactive data object. We certainly see potential for hacking, as this uses off-the-shelf components that are both inexpensive and easy to find. It reminds us of a multitouch display with added physical tokens.
