A team from the University of Chicago brings us a new spin on sensory substitution with the “Seeing with the Hands” project, which turns input from the external environment into tactile sensations. Specifically, the focus here is on substituting vision with sensations on the hand, aimed at blind and visually impaired people. The prototype is quite inspiration-worthy!
On the input side, we have a wrist-mounted camera, sprinkled with a healthy amount of image processing, of course. As for the output, no vibromotors or actuators are in use. Instead, small amounts of current are passed through the skin, triggering the touch receptors electrically. A 5×6 array of such “tactile pixels” is placed on the back of the hand and fingers, and the examples provided show it to be a decent substitution.
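To make the data path concrete, here is a minimal sketch of how a camera frame could be collapsed onto a 5×6 electrotactile grid. It is purely illustrative: the OpenCV capture, the Canny edge-detection step, and the set_electrode(row, col, level) callback are assumptions standing in for whatever the actual project does.

```python
# Illustrative sketch only: reduce a camera frame to a 5x6 grid of
# stimulation levels, then hand each value to an electrode driver.
import cv2
import numpy as np

ROWS, COLS = 5, 6  # the "tactile pixel" grid on the back of the hand/fingers

def frame_to_levels(frame):
    """Collapse one camera frame into a 5x6 array of levels in 0.0-1.0."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Edge detection is just one plausible pre-processing choice here;
    # the real project's image processing may differ entirely.
    edges = cv2.Canny(gray, 50, 150)
    small = cv2.resize(edges, (COLS, ROWS), interpolation=cv2.INTER_AREA)
    return small.astype(np.float32) / 255.0

def drive_array(levels, set_electrode):
    """Push each level out through a (hypothetical) electrode driver callback."""
    for r in range(ROWS):
        for c in range(COLS):
            set_electrode(r, c, float(levels[r, c]))

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # stand-in for the wrist-mounted camera
    ok, frame = cap.read()
    if ok:
        # Print instead of stimulating, since there's no real driver here.
        drive_array(frame_to_levels(frame), lambda r, c, v: print(r, c, round(v, 2)))
    cap.release()
```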
How well this technique works depends on the type of image processing being used, as well as the “resolution” of the tactile pixels, but it’s a fun concept nevertheless, and the study preprint has some great stories to tell. This one’s far from the first sensory substitution device we’ve covered, though quite a few of the others were mechanical in nature – the fewer moving parts, the better, we reckon!
I built a dev kit to do this and (what’s apparently called) sensory weaving a few years back as an experiment. It was nothing more than an nRF52 breakout with a few dozen PWM outputs going to SMA connectors (and associated daughterboards to stick the vibromotors to). Pretty cool to see two projects like this in one day; maybe I should reboot mine.
(kinda OT because it’s aimed at the hearing impaired)
TLDR: Idea – visualizing 3D/surround game sound via a circular 360° audio FFT/spectrogram
For ~a year now I’ve been playing around with this idea.
Take a project/piece of software like https://github.com/ensingerphilipp/CanetisRadar-Improved, but display it as a ring/border around the whole game (many games can be shrunk with SpecialK, even without window borders), and instead of just displaying volume/loudness, do a 360° spectrogram/FFT while subtracting out the sounds that come from every direction at once, so only directional audio stands out (see the sketch below).
Depending on how well (or if) this works, it could be implemented with AR glasses as well.
Kinda like https://hackaday.com/2017/09/22/hackaday-prize-entry-haptivision-creates-a-net-of-vibration-motors/ but visual.
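For what it’s worth, here is a rough sketch of how the per-direction spectra for such a ring could be computed, assuming a 7.1 channel mix already captured as a NumPy array and fixed nominal speaker angles (LFE left out since it carries no direction). The channel layout, the angles, and the min-across-channels subtraction are all illustrative assumptions; the actual rendering as a border around the game window (via SpecialK, an overlay, or AR glasses) is left out.

```python
# Rough sketch: turn one block of surround audio into per-angle spectra
# for a ring-style display. Not a working overlay.
import numpy as np

# Nominal speaker angles in degrees (an assumption); LFE is omitted.
CHANNEL_ANGLES = {"C": 0, "FR": 30, "SR": 90, "RR": 150, "RL": 210, "SL": 270, "FL": 330}

def ring_spectra(block, n_bins=32):
    """block: float array of shape (channels, samples), one row per speaker.
    Returns {angle_degrees: coarse spectrum} with the omnidirectional part removed."""
    spectra = {}
    for (_, angle), samples in zip(CHANNEL_ANGLES.items(), block):
        mag = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
        # Pool the FFT bins down to a coarse spectrum for display.
        usable = (len(mag) // n_bins) * n_bins
        spectra[angle] = mag[:usable].reshape(n_bins, -1).mean(axis=1)
    # "Subtract sounds coming from everywhere": drop the per-bin level common
    # to every channel, so only directional content remains visible.
    common = np.min(np.stack(list(spectra.values())), axis=0)
    return {angle: np.clip(s - common, 0.0, None) for angle, s in spectra.items()}

if __name__ == "__main__":
    fake_block = np.random.randn(len(CHANNEL_ANGLES), 4096)  # placeholder audio
    for angle, spectrum in sorted(ring_spectra(fake_block).items()):
        print(f"{angle:3d}°  loudest bin: {int(np.argmax(spectrum))}")
```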
Hmm, I was thinking about doing something like this for tactile sense in prosthetics: taking some representation of the strain gauges on the prosthetic’s surface and mapping their readings onto the user’s skin.