A few weeks ago [Jacob Merz] sent me an email about his sensory expansion project, which allows the wearer to “hear” infrared light by mapping it to specific tones. Although it’s only a rough prototype, [Jacob’s] device reflects a larger realm of technological possibilities: the development of a type of “peripheral” for the human body. EDIT: Updated gallery to include new photos and added link to Jacob’s new site.
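The core mapping in a device like this is simple to sketch: read an IR level, then scale it into an audible frequency range. A minimal, hypothetical version in Python (the 10-bit range and the 200–2000 Hz span are illustrative assumptions, not details from [Jacob’s] actual build):

```python
# Hypothetical sketch of the IR-to-tone idea. On real hardware the IR
# level would come from a photodiode on an ADC pin; here it's just a
# number. The frequency range is an assumption, not Jacob's actual choice.

def ir_to_frequency(ir_level, lo_hz=200.0, hi_hz=2000.0, max_level=1023):
    """Linearly map a 10-bit IR reading (0-1023) to a tone frequency in Hz."""
    ir_level = max(0, min(ir_level, max_level))  # clamp to the ADC's range
    return lo_hz + (hi_hz - lo_hz) * ir_level / max_level
```

On a microcontroller, the returned frequency would feed straight into a tone generator driving a piezo or earphone.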
You’re going to want to listen to [David Eagleman’s] TEDxAlamo talk, particularly around 10 minutes in, where he talks about the sonic glasses. [Eagleman] claims that the human brain, if given a consistent input that corresponds to the real world, can decipher the signal into usable information. The sonic glasses, which provide a type of sonar to the blind wearer, eventually just…work. Your brain can “learn” its own drivers for input devices.
If the sonic glasses sound more familiar than a 1970s invention should, you’re probably thinking of [Neil Harbisson], who built a similar device to allow him to “hear” colors. Strictly speaking, though [Harbisson] claims to be the first UK cyborg, he’d certainly encounter resistance from [Donna Haraway’s] A Cyborg Manifesto, which argues that the concept of the “cyborg” is not new to our era: humans have always used tools to expand their abilities, and even the simplest ones should count toward the classification of cyborg.
This week’s Hacking and Philosophy is much more open-ended. I invite you to speculate on these technologies and how they are integrated into the human body: from prostheses that seek to replace missing limbs to eager engineering students and tech enthusiasts implanting neodymium magnets into their fingers.
1. Where is this all headed? Are humans going to have essentially plug-and-play devices?
2. What other examples of implanting tech are out there?
3. [Eagleman] is working on some other examples of “sensory substitution.” Could you hack together something useful like this? Or have you already?
[Eagleman’s] TED Talk:
[Harbisson’s] TED Talk:
[Jacob Merz’s] device:
If you wear glasses that flip your vision 180 degrees, your brain will adapt to this input and “flip” it back. Take the glasses off and your view is upside down.
It takes about a week for most people but yeah, it is pretty awesome to do that.
It doesn’t really “flip” – you simply learn to cope with it and feel normal about looking at everything upside down.
Actually, the amazing thing is that taking the glasses off doesn’t flip it again – you now know BOTH ways.
“I listen to color”
*ahem* … like synesthesia?
You’re absolutely right, and that’s the intention: but it’s an artificial implementation that becomes second nature. I’ll have to check for sources, but I vaguely remember reading studies suggesting that people whose synesthesia assigns colors to numbers perform better at math; if we can harness/install our own synesthetic experiences, perhaps we can leverage that into a learning advantage?
When one sense is diminished, other senses become more acute. This is a well-established concept. It would be so cool to sculpt that deliberately, just by adding a new, more acute sense to our repertoire.
Nah. That’s a myth. Blind people don’t hear better than sighted people. They’re just listening more.
Shouldn’t they be taking the Meds?
Ah, sonic glasses. So old school. Hearing colours in the 70’s? Nah, trivial.
See, way back in 1971, a guy called Professor Kay, a researcher and university lecturer in Christchurch, New Zealand (Yep, that place where they had those big earthquakes just a couple years back that near-leveled the place) developed the sonic spectacles. You can see some details, and the spectacles, at http://zabonne.co.nz/?action=product&id=10685&category=10062
They used ultrasonic sounders to audibly map the surroundings, feeding back in stereo to the wearer. The user could get a sound image of his surroundings, apparently with a fair level of resolution. Endless sets of students did associated projects in their final year of electrical engineering degree studies all aiding the overall development in a small way.
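As a rough illustration of the principle (not Kay’s actual signal processing, which was far more sophisticated), a single ultrasonic echo could be rendered as a tone whose pitch encodes distance and whose stereo balance encodes bearing:

```python
import math

# Speculative sketch of a sonic-spectacles style mapping: nearer objects
# sound higher pitched, and the left/right balance follows the bearing.
# All constants are illustrative assumptions, not Kay's design values.

def echo_to_stereo(distance_m, bearing_deg, max_range_m=5.0):
    """Return (frequency_hz, left_gain, right_gain) for one echo.

    bearing_deg runs from -90 (hard left) to +90 (hard right) and is
    panned with a constant-power law so loudness stays roughly even.
    """
    d = max(0.0, min(distance_m, max_range_m))
    freq = 2000.0 - 1800.0 * (d / max_range_m)   # 2 kHz up close, 200 Hz far
    pan = (bearing_deg + 90.0) / 180.0           # 0.0 = left, 1.0 = right
    left = math.cos(pan * math.pi / 2.0)
    right = math.sin(pan * math.pi / 2.0)
    return freq, left, right
```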
These specs were subsequently funded, developed and manufactured in small numbers by a company in the city headed, I think, by one of his fellow researchers RP Smith, and they were then used by some in the blind community there. An article was published in the media mid or late 70s about a near-blind sheep farmer on a farm bike (a tricycle thing powered by a motorbike engine in wide use) who was using one of these on the farm. Can’t find the article now….Lost in the annals of time…
I also think there was a subsequent “sonic cane” but can’t find any reference to it now either.
The same company went on to make other aids for the lesser-abled. It was called, at one stage, Sensory Aids Ltd. No laughter, please.
So, same old, same old. I don’t suggest anyone tries to grab the patents for these now, folks.
I should have scrolled down to comments to see yours. Kudos for remembering history and warning everyone that the patent trolls who love to take old tech, throw in bizspeak and sell it to idiots with more twitter followers than sense (cough apple cough). Keep on keepin on, chief :)
I used my first Dragon NaturallySpeaking to tell me which neighbors pulled up, by recording the .wavs for their cars and having it listen. Lots of garbage printed out, but then there was the entry for F-150 and Toyota Camry where it heard my neighbors pull up lol. Got an A on that one. No one cared that I was modeling it on the Navy’s stuff (no Russian diesels in the area lol). Those same folks are probably mad about the NSA. Oh well. It is always amazing what you can learn in school if ya pay attention :)
It’s already used quite heavily in assistive technology. It’s also used massively in everyday technology. I don’t think this is so much a new thing as that people are starting to put on paper what people already knew – there just wasn’t really a word and a terminology for it. A tool, complex or simple, starts functioning as an extension of you, as opposed to something you have just consciously learned how to operate. Learning something new (physically speaking) is usually the path from “knowing how to operate” to “not really knowing so much as it just does itself on a lower level in the brain”. There are things we can do to speed this up (“don’t think so much about what you’re doing, just let it happen!” as we’ve all heard teachers proclaim since forever) and it’s neat that it’s being studied more rigorously.
But the “now” tech is already doing this. I don’t sit and think about where the keys are on my keyboard, I just think words and BAM, they appear in this text box. When I drive, I don’t sit and go “Hmm, I shall turn left soon, I’d better put on my turn signal, depress my right foot on that pedal and prepare to turn this round thing counter-clockwise in a bit” I just sort of.. decide that’s what I want and lower-level stuff takes care of the “Aha! So then we need to set this whole chain of events in motion then!”. It’s as natural as walking, speaking, catching a ball.. immensely complicated matters that, in the conscious mind is handled by simply deciding to and issuing a lower-level (driver, if you will by the posts terms) command to some other brain fragment that’s in charge of that.
I take it no one on HaD is handicapable. These kinds of synesthesia morphs have been around for well over 10 years for the hearing and seeing impaired. Ugh. The photo makes me not want to read past the blurb.
Oh for fucking sake HackADay, don’t turn this site into fucking engadget, we’ve had this shit for years, now I wanna see the shit we won’t have for years, and I’m not just talking “OMG FLEXIBLE SCREENS! SOO FUTURISTIC!”
I’m afraid I wasn’t quite clear in my point with this post, considering the numerous comments that seem to think it’s backward-reflecting and not forward-thinking.
Did you watch the Eagleman video? He’s suggesting adding new senses entirely, even to the point of having real time stock quotes just..accessible in your mind.
My hope with this post was to emphasize how the brain can sort of “snap” into understanding a signal input after a few weeks’ time, which seems to indicate a huge potential to add specific … enhancements?
As I said, old fucking news.
I have finger implants; I can feel electrical fields.
If I could be fucked getting a scalpel, I could rig the S.M.A.R.T. data from the drives in my server up to a nerve, and with a little training feel when they fail, and where they are in my rack, without even opening my eyes.
Old.
Fucking.
News.
As you are a Hackaday author I believe it is your duty to watch this video, then tell me that your article showed us anything even remotely “New”:
http://www.youtube.com/watch?v=p_JpPMIriAI
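For what it’s worth, the monitoring half of that idea is easy to prototype before committing to a scalpel. A speculative sketch: poll each drive’s reallocated-sector count (which would come from e.g. smartctl, stubbed out here) and map it to a per-drive stimulus intensity, one channel per rack position. The thresholds are arbitrary assumptions:

```python
# Speculative sketch of "feel your drives fail": map SMART
# reallocated-sector counts to 0.0-1.0 stimulus intensities, one per
# drive position. Real counts would be parsed from smartctl output;
# both thresholds below are made-up values for illustration.

WARN_SECTORS = 1    # any reallocated sector is worth a tingle
FAIL_SECTORS = 50   # this many means the drive is on its way out

def stimulus_level(reallocated_sectors):
    """Map one drive's reallocated-sector count to a stimulus intensity."""
    if reallocated_sectors >= FAIL_SECTORS:
        return 1.0
    if reallocated_sectors >= WARN_SECTORS:
        return 0.25 + 0.75 * reallocated_sectors / FAIL_SECTORS
    return 0.0

def rack_stimuli(sector_counts):
    """One intensity per drive, ordered by physical rack position."""
    return [stimulus_level(n) for n in sector_counts]
```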
I suppose I don’t see the reason for such a hostile tone; that’s not to say I’m ridiculing you, I just don’t see what provoked it. This is a column–Hacking and Philosophy–that promotes discussion by choosing a subject and engaging with that topic. That biohacking isn’t a novel concept to you isn’t the point. In fact, none of the posts in this column’s history have ever claimed to address something “brand new,” and this week’s is no different. There are specific questions to guide the discussion, as there have been with every installment.
In the case of biohacking, the field is certainly an emergent one, a point supported by each video’s treatment of the technologies and their implementations. I’ve seen Lepht’s talk before among many others, and I certainly welcome anyone to share similar links.
I’d much rather have an open conversation about your finger implants in the context of Eagleman’s talk: how the sensation(s) became integrated into your daily experience and if something just “clicked” after a few weeks like he mentions.
“I suppose I don’t see the reason for such a hostile tone; that’s not to say I’m ridiculing you, I just don’t see what provoked it.”
BECAUSE BAD DAY AT THE OFFICE ITERNET BVGHA RWJGSA FMA DON’T TAKE IT PERSONALY
“This is a column–Hacking and Philosophy–that promotes discussion by choosing a subject and engaging with that topic. That biohacking isn’t a novel concept to you isn’t the point. In fact, none of the posts in this column’s history have ever claimed to address something “brand new,” and this week’s is no different. There are specific questions to guide the discussion, as there have been with every installment.”
You might as well have said “Did you know that you can wire stuff other than your ~10 senses up to nerves or your other existing senses and gain a new sense? Wow! Such amazement!”
“I’d much rather have an open conversation about your finger implants in the context of Eagleman’s talk: how the sensation(s) became integrated into your daily experience and if something just “clicked” after a few weeks like he mentions.”
It’s essentially what he said.
If you want a less permanent idea of what it’s like, wear a pair of glasses that invert your vision, or superglue some neodymium magnets to your fingers.
I’ve experimented with the Southpaw idea, and speakers from earphones seemed to give me a response, but they are too bulky for practical use.
I’d put some magnets in my arm, but there are fewer nerves there, so I might not even feel it. So I’d probably need an implant for each direction, rather than 4 electromagnets with 1 implant.
What kind of experiences have changed (for better or worse) from the magnet implants, in your experience?
I’ve toyed with the idea of building in some kind of compass feedback so you would innately just “know” which direction you were facing, but I’m not sure to what degree of accuracy that could be nailed down and what kind of feedback would be best suited (sound for all its convenience is a bit of a hassle, I feel, especially if it’s unacceptable to have headphones on depending on your day-to-day situation).
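One non-audio option is the vibrotactile route taken in compass-belt experiments like feelSpace: a ring of pager motors worn around the waist, with whichever motor currently faces north buzzing gently. The motor count below is an assumption, and the heading would come from a tilt-compensated magnetometer in practice:

```python
# Sketch of a haptic compass belt: quantize the current heading to one
# of N motors spaced evenly around the waist, so the accuracy question
# reduces to "how many motors?". Eight motors gives 45-degree resolution.

N_MOTORS = 8  # one motor every 45 degrees; an illustrative choice

def active_motor(heading_deg, n_motors=N_MOTORS):
    """Return the index (0 = north) of the motor to drive for a heading."""
    step = 360.0 / n_motors
    # round to the nearest motor, wrapping 360 back around to 0
    return int(((heading_deg % 360.0) + step / 2.0) // step) % n_motors
```

The nice property of this scheme is that the feedback is continuous and passive: nothing to listen to, and nothing that a loud environment can drown out.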
“What kind of experiences have changed (for better or worse) from the magnet implants, in your experience? ”
Mainly being able to get around in the dark better.
“I’ve toyed with the idea of building in some kind of compass feedback so you would innately just “know” which direction you were facing, but I’m not sure to what degree of accuracy that could be nailed down and what kind of feedback would be best suited (sound for all its convenience is a bit of a hassle, I feel, especially if it’s unacceptable to have headphones on depending on your day-to-day situation).”
Sonic is always easiest, but I don’t like the idea of it, not because of the bulk, but because some music, sounds, practically anything could throw you off, and if you had your balance rigged up, could literally knock you over. We’re gonna see some funny shit if this goes commercial.
For things like IR, UV, distance, sure. It would work fine. But directionally you’d want something implanted. Probably not magnets, since they are, again, pretty bulky.
Maybe an inductive electrode, but body-proofing it might be a task. I’ll email Amal Graafstra, maybe he’ll have a few good ideas…
I’d say stranded electrode implant, but you’d want to keep the thing very clean, and held onto your arm securely, since the whole “WTF Did you do that for Eww Eww Eww” thing.
You can have stuff going out of your skin, but you have to be careful.
For the record, I found this article (and discussion) interesting, as it is an area that I am considering getting into. I lost vision in one eye 2 years ago, and the lack of depth perception and narrowed field of view have had some surprising effects.
I figure if I’m going to try to recover some of my sensing through technology, I might as well include something that is ‘better than human’ too.
Ryan raised some important points about using sound that I hadn’t considered. Thanks.