Many of us will have seen robots or prosthetics operated by the electrical impulses detected from a person’s nerves or brain. In one form or another they are a staple of both mass-market technology news coverage and science fiction.
The point the TV journalists and the sci-fi authors fail to address, though, is this: how does it work? On a simple level they might say that the signal from an individual nerve is picked up just as though it were a wire in a loom, and sent to the prosthetic. But that’s a for-the-children explanation which is rather evidently not possible with a few electrodes on the skin. How do they really do it?
A project from [Bruce Land]’s Cornell University students [Michael Haidar], [Jason Hwang], and [Srikrishnaa Vadivel] seeks to answer that question. They’ve built an interface that allows them to control a robotic hand using signals gathered from electrodes placed on their forearms. And their write-up is a fascinating read, for within that project lie a multitude of challenges, of which the hand itself is only a minor one that they solved with an off-the-shelf kit.
The interface itself had to solve the problem of picking up the extremely weak nerve impulses while simultaneously avoiding interference from mains hum and fluorescent lights. They go into detail about their filter design, and their use of isolated power supplies to reduce this noise as much as possible.
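Their exact filter design is in the write-up, but the core trick of notching out mains hum can be sketched in a few lines. The sample rate, notch frequency, and Q below are illustrative assumptions, not the team’s actual values:

```python
import math

def notch_coefficients(f0, fs, q):
    """Biquad notch filter coefficients (standard audio-cookbook form)."""
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    b = [1.0, -2.0 * math.cos(w0), 1.0]
    a = [1.0 + alpha, -2.0 * math.cos(w0), 1.0 - alpha]
    # Normalize so a[0] == 1
    return [bi / a[0] for bi in b], [ai / a[0] for ai in a]

def biquad_filter(x, b, a):
    """y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]"""
    y = []
    x1 = x2 = y1 = y2 = 0.0
    for xn in x:
        yn = b[0]*xn + b[1]*x1 + b[2]*x2 - a[1]*y1 - a[2]*y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

# 60 Hz notch at an assumed 1 kHz sample rate
fs = 1000.0
b, a = notch_coefficients(60.0, fs, q=30.0)
hum = [math.sin(2 * math.pi * 60.0 * n / fs) for n in range(2000)]  # mains hum
emg = [math.sin(2 * math.pi * 10.0 * n / fs) for n in range(2000)]  # stand-in for the wanted signal
print(max(abs(v) for v in biquad_filter(hum, b, a)[1000:]))  # hum nearly eliminated
print(max(abs(v) for v in biquad_filter(emg, b, a)[1000:]))  # wanted signal mostly intact
```

A narrow notch like this only handles the hum itself; the team’s isolated supplies attack the broadband noise a filter can’t touch.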
Even with the perfect interface, though, they still had to train their software to identify different finger movements. Plotting the readings from their two electrodes as the axes of a graph, they were able to map regions of the graph corresponding to individual muscles. Finally, the answer that displaces the for-the-children explanation.
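Carving the two-electrode plane into per-muscle regions is, in essence, nearest-centroid classification. Here is a minimal sketch of that idea; the calibration points and labels are invented for illustration, not the team’s data:

```python
import math

# Hypothetical calibration centroids: (electrode1, electrode2) amplitude pairs
# recorded while the user flexed each finger. Values are made up.
centroids = {
    "rest":  (0.1, 0.1),
    "index": (0.8, 0.2),
    "pinky": (0.2, 0.9),
}

def classify(sample):
    """Return the label of the nearest calibrated region in the 2D electrode plane."""
    def dist(c):
        return math.hypot(sample[0] - c[0], sample[1] - c[1])
    return min(centroids, key=lambda label: dist(centroids[label]))

print(classify((0.75, 0.25)))  # falls in the "index" region
```

More electrodes would simply add axes to the space; the nearest-region idea stays the same.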
There are several videos linked from their write-up, but the one we’re leaving you with below is a test performed in a low-noise environment. They found their lab had so much noise that they couldn’t reliably demonstrate all fingers moving, and we think it would be unfair to show you anything but their most successful demo. But it’s also worth remembering how hard it was to get there.
Continue reading “Move A Robotic Hand With Your Nerve Impulses”
A normal person who owns an $80,000 car might be inclined never to take it out of the garage. But normal often isn’t what we do around here, so seeing a Tesla S driven by mind control is only slightly shocking.
[Casey_S] appears to be the owner of the Tesla S in question, but if he’s not he’ll have some ‘splaining to do. He took the luxury car (essentially a gigantic battery and computer in a car-shaped case) to a hackathon in Berkeley last week and promptly fitted it with the gear needed to drive the car remotely. Yes, the Model S has steering motors built in, but Tesla hasn’t been forthcoming with an API to access such functions. So [Casey_S] and his team had to cobble together a steering servo from a windshield wiper motor and a potentiometer mounted to a frame made of 2x4s. Linear actuators attach to the brake and accelerator pedals, and everything talks to an Arduino.
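A wiper motor with a potentiometer for position feedback is the classic DIY servo, and the loop behind it is plain proportional control. This sketch models that loop in Python for clarity; the gain and the simulated motor response are assumptions, not measurements from [Casey_S]’s rig:

```python
def steering_command(target, pot_reading, kp=3.0, max_pwm=255):
    """Proportional controller: drive the wiper motor toward the target pot angle."""
    error = target - pot_reading
    pwm = kp * error
    # Clamp to the motor driver's PWM range (sign selects direction)
    return max(-max_pwm, min(max_pwm, pwm))

# Crude simulation of the motor/pot plant to show the loop converging
angle = 0.0
for _ in range(200):
    pwm = steering_command(target=90.0, pot_reading=angle)
    angle += 0.02 * pwm  # assumed motor movement per control tick
print(round(angle, 1))  # settles at the 90-degree target
```

On the real hardware the same arithmetic would run in the Arduino loop, reading the pot on an analog pin and writing the clamped value to an H-bridge.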
The really interesting part is that the whole thing is controlled by an electroencephalography helmet and a machine learning algorithm that detects when the driver thinks “forward” or “turn right.” It translates those thoughts to variables that drive the actuators. Unfortunately, space constraints kept [Casey_S] from really putting the rig through its paces, but the video after the break shows that the system worked well enough to move the car forward and steer a little.
There haven’t been too many thought-controlled cars featured here before, but we have covered a wheelchair with an EEG interface.
Continue reading “Think Your Way to Work in a Mind-Controlled Tesla”
Losing a limb often means getting fitted for a prosthetic. Although there have been some scientific and engineering advances (compare a pirate’s peg leg to “blade runner” Oscar Pistorius’ legs), prosthetics are still just inert attachments to your body. Researchers at Johns Hopkins hope to change all that. In the Journal of Neural Engineering, they announced a proof-of-concept design that allowed a person to control prosthetic fingers using mind control.
Continue reading “Mind-Controlled Prosthetic Arm”
[Chip Audette] owns (at least) two gadgets: one of those remote-control helium-filled flying sharks (an Air Swimmer), and an OpenBCI EEG system that can read brain waves and feed the data to a PC. Given that information, it can hardly surprise you that [Chip] decided to control his flying fish with his brain.
Before you get too excited, you have to (like [Chip]) alter your expectations. While an EEG has a lot of information, your direct thoughts are (probably) not readable. However, certain actions create easily identifiable patterns in the EEG data. In particular, closing your eyes creates a strong 10Hz signal across the back of the head.
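Detecting that eyes-closed alpha rhythm amounts to asking how much 10 Hz energy the signal contains, and a single-bin Goertzel filter answers that cheaply. The sample rate and test signals below are illustrative assumptions, not [Chip]’s actual setup:

```python
import math

def goertzel_power(samples, target_hz, fs):
    """Signal power at a single frequency bin, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_hz / fs)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

fs = 250.0  # assumed EEG sample rate
t = [n / fs for n in range(500)]  # a 2-second window
eyes_closed = [math.sin(2 * math.pi * 10.0 * ti) for ti in t]        # strong 10 Hz alpha
eyes_open = [0.2 * math.sin(2 * math.pi * 4.0 * ti) for ti in t]     # no alpha content
print(goertzel_power(eyes_closed, 10.0, fs) > goertzel_power(eyes_open, 10.0, fs))
```

Thresholding that power reading gives a one-bit "eyes closed" switch, which is all a go/no-go shark command needs.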
Continue reading “School of Friends Use Thought Control on a Shark”
Producing micro robotics is not yet easy or cost-effective, but why do we need to when we can just control the minds of cockroaches? A team of researchers from North Carolina State University is calling this augmented Madagascar hissing cockroach an Insect Biobot in their latest research paper (PDF). It’s not the first time the subject has come up. There have already been proofs in research and even more amateur endeavors. But the accuracy and control seen in the video after the break is beyond compare.
The roach is being controlled to perfectly follow a line on the floor. One of the things that makes this iteration work so well is that the microcontroller includes a new type of ADC-based feedback loop for the stimulation of the insect brain. This helps to ensure that the roach will not grow accustomed to the stimulation and stop responding to it. Since this variety of insect can live for about two years, this breakthrough makes it into a reusable tool. We’re not sure what that tool will be used for, but perhaps the next plague of insects will be controlled by man, and not Mother Nature.
Continue reading “Mind-controlling cockroaches”
[Rafael Mizrahi and Anat Sambol] decided that Angry Birds was missing one crucial element – mind control. They grabbed a copy of the game for their netbook and [Rafael] strapped on an Emotiv EPOC headset to see if he could play it without using a mouse or keyboard. While he was able to move the cursor around with his thoughts, he found that Emotiv’s EmoKey software lacked any sort of mouse button support. Undaunted, they turned to the Internet for help and found that he could map the Emotiv’s output to his mouse via another application, GlovePie.
As you can see in the video below their efforts were successful, though we doubt [Rafael] will be completely giving up his mouse just yet. With some more refinement, we imagine [Rafael] will be blasting pigs to kingdom come in no time.
If you are interested in trying this yourself, be aware that only the SDK version of the EPOC headset can be paired with third-party applications; the standard consumer version is locked into using solely authorized software.
Continue reading if you would like to see a video of their Angry Birds neural interface in action.
Continue reading “Flinging birds and slaying pigs with your thoughts”
At one point or another, who hasn’t had a dream in which they could fly, simply by thinking about it? [Yehuda Duenyas, aka XXXY] is currently working on a project at Rensselaer Polytechnic Institute which can allow you to do just that.
As part of a thesis project dubbed the “Infinity Simulator“, he has constructed a system that allows people to fly about using the elaborate rigging system at RPI’s Experimental Media and Performing Arts Center. His project allows users to glide through the air, walk up walls, and otherwise live out their flying fantasies, with mere thoughts.
An EEG headset is placed on the user, along with other wearable sensors which enhance the audio and visual experience of the person in flight. With enough concentration, the rigging system sweeps people off their feet, sending them soaring anywhere their mind desires. To us it sounds a bit like pretending to be Superman while using The Force, though the installation is described on the EMPAC web site as a “live-action stunt show crossed with a video game.” Either way, sign us up!
Hopefully we will see some video of the completed project in the near future, but in the meantime keep reading to see a behind-the-scenes preview of the flying rig in action.
Continue reading “Fly like Superman using The Force”