On November 6th, Northwestern University introduced a groundbreaking leap in haptic technology, and it’s worth every bit of attention now, even two weeks later. Full details are in their original article. This innovation brings tactile feedback into the future with a hexagonal matrix of 19 mini actuators embedded in a flexible silicone mesh. It’s the stuff of dreams for hackers and tinkerers looking for the next big thing in wearables.
What makes this patch truly cutting-edge? First, it offers multi-dimensional feedback: pressure, vibration, and twisting sensations—imagine a wearable that can nudge or twist your skin instead of just buzzing. Unlike the simple, one-note “buzzers” of old devices, this setup adds depth and realism to interactions. For those in the VR community or anyone keen on building sensory experiences, this is a game changer.
But the real kicker is its energy management. The patch incorporates a ‘bistable’ mechanism, meaning each actuator rests in either of two stable positions without drawing continuous power, recycling elastic energy stored in the skin itself. Think of it like a rubber band that snaps back, releasing its stored energy during operation. The result? Longer battery life and efficient power usage—perfect for tinkering with extended use cases.
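To make “multi-dimensional feedback” concrete, here’s a purely illustrative Python sketch of what a command frame for a 19-site hex array might look like. The field names, value ranges, and site indices are all made-up assumptions for illustration, not the published Northwestern interface.

```python
from dataclasses import dataclass

# Hypothetical model of one actuator command in a 19-element hex array.
# Field names and ranges are assumptions, not the actual device API.
@dataclass
class ActuatorCmd:
    site: int          # 0..18, position in the hexagonal matrix
    pressure: float    # normal indentation, normalized 0..1
    vib_hz: float      # vibration frequency in Hz, 0 = off
    twist_deg: float   # skin-twist angle, signed degrees

def nudge_left():
    """Example 'frame': twist three (hypothetical) left-edge sites."""
    return [ActuatorCmd(site=s, pressure=0.2, vib_hz=0.0, twist_deg=-15.0)
            for s in (0, 7, 12)]  # left-edge indices chosen arbitrarily
```

The point is just that each site carries several independent parameters per frame, rather than the single on/off of an old-school buzzer.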
And it’s not all fun and games (though VR fans should rejoice). This patch turns sensory substitution into practical tech for the visually impaired, using LiDAR data and Bluetooth to transmit surroundings into tactile feedback. It’s like a white cane but integrated with data-rich, spatial awareness feedback—a boost for accessibility.
Fancy more stories like this? Earlier this year we wrote about these lightweight haptic gloves, which (for those keeping count) feature a similar hexagonal array of 19 sensors. A pattern for success? You can read the original article on TechXplore here.
I don’t know why hexagonal things attached to your skin always freak me out a little. Maybe it’s because, at a passing glance, you might think you’ve had bees installed.
Trypophobia, try googling it.
don’t google it
Or hives.
If your implant is running Windows CE then it contains a couple of hives within its registry 🍯
Trypophobia much?
Love a good bit of haptics, especially for sensory substitution (or expansion, addition).
Being able to harvest power will be great for all-day wear on this type of tech in the future, since sensory throughput can be demanding energy-wise. And being able to use smaller patches of skin to convey many more bits of information makes the future of this field so much brighter and more interesting.
For information throughput, you have to think of it in terms of dimensions of feedback: how much information can get through at a single moment in time.
More means of discernible feedback means more compressibility and intelligibility of the signal in the time dimension, because information can be spread across all the other tactile dimensions (stretch, pinch, twist, vibration, frequency, spatial positioning, etc.).
There are some caveats, such as certain types of signals only being discernible in combination with other effects (frequency discrimination is worse than intensity, which is worse than spatial), but overall, having many more means to route and articulate information is hugely useful.
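A back-of-the-envelope way to see why extra dimensions help: if each independently discriminable channel multiplies the number of distinguishable tactile “symbols”, the information per symbol is the log of that product. The level counts below are made-up assumptions, and real channels are neither fully independent nor noiseless, so treat this as a loose upper bound, not a measurement.

```python
from math import log2, prod

def bits_per_symbol(levels_per_dimension):
    """Upper bound on information per tactile 'symbol':
    log2 of the product of discriminable levels across
    all assumed-independent feedback dimensions."""
    return log2(prod(levels_per_dimension))

# Hypothetical discriminable-level counts per channel:
# 4 intensities, 3 frequencies, 19 spatial sites, 2 twist directions.
channels = [4, 3, 19, 2]
print(round(bits_per_symbol(channels), 2))  # ~8.83 bits per symbol
```

Drop any one dimension and the per-symbol capacity falls by that channel’s log2 contribution, which is exactly why a stretch/twist/vibrate patch beats a plain buzzer for throughput.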
I’ve personally been daily-driving a custom DIY wristband rig for exploring such things for the past couple of years: it gives me a cone of thermal feedback up to 20′ away, OBDII data from my cars, and a cone of lidar feedback (among other things), though only via wideband vibrotactile arrays rather than all the cool new tricks this tech offers.
Sensory substitution/addition and the like are great for enhancing situational awareness in real life, not just replacing senses: now I can feel where and how far away people are behind me, outside the biological field of view. That’s great for noisy environments, or cases where noise-cancelling headphones would otherwise reduce that awareness, and it goes above and beyond.
Situational awareness augmentation is just one aspect of what this sort of tech and field can offer outside the realm of VR, even just pulling from scientific literature going back as far as the ’60s. (And there’s a lot more potential than even that highlights.)
It might be normal in the future to walk past people wearing this type of technology, visible or not, feeding them custom information streams. It enables closer interlinking and tighter feedback loops with the various technologies we already use every day, not just sensory modification.
I wonder if you could use this for braille? Literally a book at your fingertips, in a finger cot or something. You could simulate a finger moving across a page while your hand sits in your lap.
You could, although seeing as only about 10% of legally blind individuals can read braille, it may be more worthwhile to design an entirely new set of feedback patterns specific to these devices, rather than being constrained, for backwards compatibility, to the dimensions of articulation braille offers.
Something similar to braille, absolutely though – in terms of both usage and how the feedback occurs.
For example, Neosensory has a wristband that can articulate auditory and speech information over an array of four vibrotactile motors. You could have an audiobook ingested through one of these devices, although direct auditory ingest will conflict mentally with other simultaneous auditory information, so a more custom conceptual language that isn’t a direct 1:1 mapping of anything in reality may work better. (Neosensory also has a wristband specifically built to accentuate phonemes rather than mapping raw audio frequencies to the skin.)
So it could support a lot of information throughput, potentially enabling more people to access information than braille as currently designed does, because it has so many more means of conveyance.
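On the braille angle specifically, here’s a minimal sketch of streaming standard six-dot cells to a six-actuator patch, one cell per “touch frame”. The (partial) dot table follows standard braille numbering (dots 1–3 left column, 4–6 right), but the framing scheme and function names are hypothetical, not any shipping device’s API.

```python
# Partial braille table: which dots are raised per letter.
# Standard numbering: 1-3 left column top-down, 4-6 right column.
BRAILLE = {
    'a': {1}, 'b': {1, 2}, 'c': {1, 4},
    'd': {1, 4, 5}, 'e': {1, 5}, 'h': {1, 2, 5},
}

def cell_frames(text):
    """Yield one 6-element on/off frame per character,
    suitable for driving a 6-actuator patch cell by cell."""
    for ch in text.lower():
        dots = BRAILLE.get(ch, set())   # unknown chars -> blank cell
        yield [dot in dots for dot in range(1, 7)]

frames = list(cell_frames("bad"))
```

A richer patch could of course layer vibration or twist on top of the on/off dots, which is where the “new pattern language” idea starts to beat plain braille emulation.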
Does make me wonder how good braille-style feedback could be from a display better aligned to it, like one of these:
https://hackaday.com/2024/05/23/lightweight-haptic-gloves-with-electro-osmotic-pump-arrays/
Maybe not on the fingers (as an all-day wearable that wouldn’t work well), but perhaps the skin of the inner wrist could be just as discriminating for information ingest.
Consider vision instead. TL;DR: sensory substitution systems that downsample video to electrotactile arrays.
Here’s an old, but digestible reference:
Successful tactile based visual sensory substitution use functions independently of visual pathway integrity https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2014.00291/full
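The core downsampling step such systems rely on is simple enough to sketch: block-average a grayscale frame down to a coarse grid, one cell per electrode/tactor, then quantize to a few intensity levels. The grid size and level count below are arbitrary assumptions, not taken from any particular device.

```python
import numpy as np

def frame_to_tactor_grid(frame, rows=10, cols=10, levels=4):
    """Block-average a grayscale frame (2-D uint8 array) to a
    rows x cols grid and quantize each cell to 0..levels-1."""
    h, w = frame.shape
    trimmed = frame[: h - h % rows, : w - w % cols]  # drop remainder pixels
    blocks = trimmed.reshape(rows, h // rows, cols, w // cols)
    means = blocks.mean(axis=(1, 3))                 # average each block
    return np.round(means / 255 * (levels - 1)).astype(int)

# Toy frame: dark left half, bright right half.
demo = np.zeros((120, 160), dtype=np.uint8)
demo[:, 80:] = 255
grid = frame_to_tactor_grid(demo)  # left columns -> 0, right columns -> 3
```

Run per video frame, each grid cell’s value becomes one tactor’s drive level — the same downsample-and-map idea as the tongue-display work, just targeting skin.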
I remember, some years ago, a guy with a backpack full of actuators that gave him real-time feedback about the stock exchange so he could react on the go.
That would be David Eagleman, who owns and runs Neosensory, a company with about 70 things in the works related to sensory substitution, expansion, and addition.
Here is his talk about that vest:
https://www.youtube.com/watch?v=4c1lqFXHvqI
Publicly, they only offer wristbands that give access to auditory information as an assistive device, although with a rudimentary Bluetooth API that enables a bit more expansion and exploration.
Here is a write-up of his on the topic/field from last year:
https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2022.1055546/full
I believe he also has a hand in the “BrainPort Vision Device”, which was being piloted with much less sophisticated hardware a decade ago. If I remember right, that work was based on a much earlier academic breakthrough using the tongue as a vision input; I remember a reporter giving a first-hand account.
If they’re going to go for something that looks gross they should go all the way and paint them various shades of pink and red, some with a little greenish white dot in the middle.