[Jakob Kilian] is working on a glove that he hopes will let the blind “see” their surroundings.
One of the most fascinating examples of the human brain’s plasticity is its ability to map one sense onto another. Some people, for example, report being able to see sound, giving them a seemingly supernatural ability to distinguish tones. This effect has also been observed in the visually impaired. In past experiments, grids of electrodes were placed on the tongue or mechanical actuators on the lower back, and the signal from a camera was translated into patterns of shocks or movement. The interesting effect is that users quickly learned to distinguish objects from this low-resolution input. As they continued to use the devices, they actually reported seeing the objects as their visual centers took over interpreting the input.
Most of these projects are quite bulky and have the usual look of something escaped from a university laboratory. [Jakob]’s project appears to trend toward a much more user-friendly device. A grid of haptic actuators sits on the back of the user’s hand along with a depth camera. Not only is it fairly unobtrusive, but the back of the hand is very sensitive to touch, and the camera ends up in a prime position for a look around at the world.
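The core trick is simple enough to sketch. Below is a minimal Python example of the idea, not [Jakob]’s actual firmware: it assumes a 3×3 actuator grid, a depth camera delivering frames in millimetres, and a made-up useful range of 0.3–3 m, then collapses each frame into one vibration intensity per actuator, with closer obstacles buzzing harder.

```python
import numpy as np

GRID_ROWS, GRID_COLS = 3, 3          # one cell per haptic actuator (assumed layout)
NEAR_MM, FAR_MM = 300.0, 3000.0      # assumed useful depth range of the camera

def depth_to_haptics(depth_mm: np.ndarray) -> np.ndarray:
    """Collapse a depth frame (H x W, millimetres) into per-actuator
    vibration intensities from 0.0 (nothing nearby) to 1.0 (very close)."""
    h, w = depth_mm.shape
    # Crop so the frame divides evenly into the actuator grid.
    cells = depth_mm[: h - h % GRID_ROWS, : w - w % GRID_COLS]
    cells = cells.reshape(GRID_ROWS, h // GRID_ROWS, GRID_COLS, w // GRID_COLS)
    # Zeros mean "no reading"; treat them as far away.
    cells = np.where(cells == 0, FAR_MM, cells)
    # The nearest valid return in each cell dominates.
    nearest = cells.min(axis=(1, 3))
    # Closer obstacles -> stronger vibration, clipped to the usable range.
    intensity = (FAR_MM - nearest) / (FAR_MM - NEAR_MM)
    return np.clip(intensity, 0.0, 1.0)

# Example with a fake 240x320 frame: a near object in the upper-left corner.
frame = np.full((240, 320), 2500.0)
frame[:80, :100] = 400.0
print(depth_to_haptics(frame).round(2))
```

Taking the nearest return in each cell rather than the average is a deliberate choice in this sketch: a thin obstacle like a door frame or a table edge still makes itself felt instead of being averaged away.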
[Jakob] admits that, as an interaction designer, his hardware hacking skills are still growing. To us, the polish and thought that went into this are already quite impressive, so it’s no wonder he’s one of the Hackaday Prize Finalists.
This is quite clever and, I’d say, well within the reach of a hacker. You can buy laser range sensors from eBay, or ultrasonic modules, or even go for a vision-based system for object detection.
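Just to show how low the bar is, here’s a rough Python sketch for reading a single HC-SR04 ultrasonic module on a Raspberry Pi with RPi.GPIO (the pin numbers are my own pick, wire it however suits your build):

```python
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24   # assumed BCM pin numbers

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm() -> float:
    """Fire one ultrasonic ping and return the measured distance in cm."""
    GPIO.output(TRIG, True)              # 10 microsecond trigger pulse
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    start = end = time.time()
    while GPIO.input(ECHO) == 0:         # wait for the echo pulse to begin...
        start = time.time()
    while GPIO.input(ECHO) == 1:         # ...and to end
        end = time.time()

    # Sound travels roughly 343 m/s and the ping covers the distance twice.
    return (end - start) * 34300 / 2

try:
    while True:
        print(f"{read_distance_cm():.1f} cm")
        time.sleep(0.2)
finally:
    GPIO.cleanup()
```

Of course one sensor only gives you a single fat cone of distance; the interesting part of [Jakob]’s glove is putting a whole grid of feedback, driven by a depth camera, behind it.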
I wonder whether it wouldn’t be better to mount this on modified glasses instead of the hand. Having the vision apparatus on your head is how humans see already, and it might be less obtrusive?