Project Perceives Pondering, Prints Poetry

If poetry is your thing, this hack might convince you that your brain is more advanced than the rest of us poor sots. [Roni Brandini] designed a system that prints lines of poetry when you concentrate. The Mind Poetry project uses an EEG headset from Mattel’s Mindflex toy and pipes your brain’s signals to an Arduino Mega 2560. The system then looks for patterns of brain waves that indicate concentration. As you maintain your concentration, the system continues to print lines of poetry to a small display.

Tapping into the Mindflex

[Roni] follows the standard Mindflex hack process by tapping into the data transmission pin on the Mindflex board. Optoisolation is provided by a PC817 to make sure wall power can’t accidentally bleed over into your own wetware. You could get away with just using batteries, but isolation is still a best practice.

The Arduino Brain Library is used to decipher the signal. The Mindflex picks up brain waves from roughly 1 Hz to 50 Hz, which is enough bandwidth to approximately determine mental state. For example, Theta waves are in the 4 Hz to 7 Hz range and can indicate a relaxed, meditative state. Low Beta waves range from 13 Hz to 17 Hz and indicate an alert, focused mental state. The Mindflex system is also generous in that it provides derived meditation and attention scores, ranging from 0 to 100.
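For the curious, pulling those numbers off the headset only takes a few lines. Here’s a minimal sketch assuming the Brain library’s usual API (a Brain object wrapped around a serial port, with readAttention(), readMeditation(), and per-band getters); [Roni]’s real firmware layers the poetry logic on top of something like this.

```cpp
#include <Brain.h>

// The Mindflex's data pin (after the optoisolator) feeds the Mega's RX1.
Brain brain(Serial1);

void setup() {
  Serial.begin(9600);   // USB serial for logging
  Serial1.begin(9600);  // the Mindflex streams packets at 9600 baud
}

void loop() {
  // update() returns true whenever a complete EEG packet arrives (roughly once a second)
  if (brain.update()) {
    Serial.print("attention: ");
    Serial.print(brain.readAttention());   // derived score, 0-100
    Serial.print("  meditation: ");
    Serial.print(brain.readMeditation());  // derived score, 0-100
    Serial.print("  low beta: ");
    Serial.println(brain.readLowBeta());   // raw band power, 13-17 Hz
  }
}
```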

It’s difficult to get a high level of precision with this sensor and sampling setup, so the code uses [Roni]’s custom recipe of the meditation score, the attention score, and the Low Beta value. He finds it most effective to trigger actions based on the relationship between these readings rather than on their absolute values. For example, a simultaneous uptick in both Low Beta power and the attention score indicates concentration.
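In code, a relational trigger like that can be as simple as comparing each new packet against the previous one. The snippet below is only an illustration of the idea, with made-up thresholds rather than [Roni]’s actual recipe.

```cpp
// Illustrative only: fire on a simultaneous rise in attention and Low Beta,
// rather than on any absolute reading. The floor value is hypothetical.
unsigned long lastLowBeta = 0;
byte lastAttention = 0;

bool isConcentrating(byte attention, unsigned long lowBeta) {
  bool rising = (attention > lastAttention) && (lowBeta > lastLowBeta);
  lastAttention = attention;
  lastLowBeta = lowBeta;
  return rising && attention > 50;  // hypothetical floor to reject noise
}
```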

Mindflex Brainwave Chart

If the wearer is concentrating, the system prints lines of poetry to the display and charts the three values. As an added bit of gamification, it’ll tell you how many times you broke concentration before completing the poem. One can imagine a game that tries to break concentration by printing other phrases or even activating an array of mechanical distractions.

If poetry isn’t your thing, you’re in luck. The “Mind Poetry” project also makes some headway (pun intended) with processing the EEG headset’s signals and triggering actions. This means you don’t have to be into the poetry scene to reap the benefits. You now have the bones of a hack that lets you control things with your brain muscles and without your muscle muscles.

For inspiration, check out some other Mindflex hacks that let you order drinks with your mind (recommended), shock the heck out of people (not recommended), or even move around your skirt (uh… you do you?).

Continue reading “Project Perceives Pondering, Prints Poetry”

Move A Robotic Hand With Your Nerve Impulses

Many of us will have seen robotics or prosthetics operated by the electrical impulses detected from a person’s nerves, or their brain. In one form or another they are a staple of both mass-market technology news coverage and science fiction.

The point the TV journalists and the sci-fi authors fail to address, though, is this: how does it work? On a simple level, they might say that the signal from an individual nerve is picked up just as though it were a wire in a loom, and sent to the prosthetic. But that’s a for-the-children explanation, and it’s rather evidently not possible with a few electrodes on the skin. How do they really do it?

A project from [Bruce Land]’s Cornell University students [Michael Haidar], [Jason Hwang], and [Srikrishnaa Vadivel] seeks to answer that question. They’ve built an interface that allows them to control a robotic hand using signals gathered from electrodes placed on their forearms. And their write-up is a fascinating read, for within that project lie a multitude of challenges, of which the hand itself is only a minor one that they solved with an off-the-shelf kit.

The interface itself had to solve the problem of picking up the extremely weak nerve impulses while simultaneously avoiding interference from mains hum and fluorescent lights. They go into detail about their filter design, and their use of isolated power supplies to reduce this noise as much as possible.
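Their write-up covers that analog chain in detail; for a flavor of what mains rejection looks like on the digital side, here’s a generic 60 Hz IIR notch filter built from the standard RBJ biquad formulas. It’s a sketch of the general technique, not the team’s actual filter.

```cpp
#include <math.h>

// Generic biquad notch filter (RBJ cookbook) for suppressing mains hum.
// Illustrative only; the Cornell project used its own filter chain.
struct Notch {
  float b0, b1, b2, a1, a2;
  float x1 = 0, x2 = 0, y1 = 0, y2 = 0;

  Notch(float fs, float f0, float q) {
    float w0 = 2.0f * M_PI * f0 / fs;
    float alpha = sinf(w0) / (2.0f * q);
    float a0 = 1.0f + alpha;
    b0 = 1.0f / a0;
    b1 = -2.0f * cosf(w0) / a0;
    b2 = 1.0f / a0;
    a1 = -2.0f * cosf(w0) / a0;
    a2 = (1.0f - alpha) / a0;
  }

  float process(float x) {
    float y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
    x2 = x1; x1 = x;
    y2 = y1; y1 = y;
    return y;
  }
};

// Example: 1 kHz sample rate, notch centered on 60 Hz, fairly narrow (Q = 30).
Notch mainsNotch(1000.0f, 60.0f, 30.0f);
```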

Even with the perfect interface, though, they still had to train their software to identify different finger movements. Plotting the readings from their two electrodes as the axes of a graph, they were able to map regions of that graph corresponding to individual muscles. Finally, the answer that displaces the for-the-children explanation.
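Put another way, the classifier simply asks which region of that two-electrode plot the current reading lands in. A toy version of the idea, with entirely hypothetical boundaries, could be as blunt as this:

```cpp
// Toy region classifier: each electrode's smoothed amplitude is one axis,
// and rectangular regions of the plane map to movements. The boundaries
// here are hypothetical; the students derived theirs from their own data.
enum Gesture { REST, THUMB, INDEX_FINGER, FIST };

Gesture classify(float electrode1, float electrode2) {
  if (electrode1 < 0.2f && electrode2 < 0.2f) return REST;
  if (electrode1 > 0.6f && electrode2 < 0.3f) return THUMB;
  if (electrode1 < 0.3f && electrode2 > 0.6f) return INDEX_FINGER;
  return FIST;
}
```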

There are several videos linked from their write-up, but the one we’re leaving you with below is a test performed in a low-noise environment. They found their lab had so much noise that they couldn’t reliably demonstrate all fingers moving, and we think it would be unfair to show you anything but their most successful demo. But it’s also worth remembering how hard it was to get there.

Continue reading “Move A Robotic Hand With Your Nerve Impulses”

Think Your Way To Work In A Mind-Controlled Tesla

A normal person who owns an $80,000 car might be inclined to never take it out of the garage. But normal often isn’t what we do around here, so seeing a Tesla S driven by mind control is only slightly shocking.

[Casey_S] appears to be the owner of the Tesla S in question, but if he’s not, he’ll have some ‘splaining to do. He took the luxury car (really a gigantic battery and computer in a car-shaped case) to a hackathon in Berkeley last week and promptly fitted it with the gear needed to drive the car remotely. Yes, the Model S has steering motors built in, but Tesla hasn’t been forthcoming with an API to access such functions. So [Casey_S] and his team had to cobble together a steering servo from a windshield wiper motor and a potentiometer mounted to a frame made of 2x4s. Linear actuators attach to the brake and accelerator pedals, and everything talks to an Arduino.

The really interesting part is that the whole thing is controlled by an electroencephalography helmet and a machine learning algorithm that detects when the driver thinks “forward” or “turn right.” It translates those thoughts to variables that drive the actuators. Unfortunately, space constraints kept [Casey_S] from really putting the rig through its paces, but the video after the break shows that the system worked well enough to move the car forward and steer a little.
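The actuator end of a rig like this is the easy part: the laptop running the EEG classifier sends a command, and the Arduino moves the steering servo and pedal actuators. The sketch below is a purely hypothetical version of that last link, with made-up serial commands, pins, and values rather than [Casey_S]’s actual code.

```cpp
#include <Servo.h>

// Hypothetical glue between the PC-side EEG classifier and the actuators.
// Assume the classifier sends single characters over serial:
// 'F' = forward, 'R' = turn right, 'S' = stop.
Servo steering;                  // stand-in for the wiper-motor steering rig
const int ACCEL_PIN = 5;         // PWM to the accelerator linear actuator driver
const int BRAKE_PIN = 6;         // PWM to the brake linear actuator driver

void setup() {
  steering.attach(9);
  Serial.begin(115200);
}

void loop() {
  if (Serial.available()) {
    switch (Serial.read()) {
      case 'F': analogWrite(ACCEL_PIN, 80); analogWrite(BRAKE_PIN, 0);   break;
      case 'R': steering.write(110);  /* nudge the wheel right of center */ break;
      case 'S': analogWrite(ACCEL_PIN, 0);  analogWrite(BRAKE_PIN, 255); break;
    }
  }
}
```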

There haven’t been too many thought-controlled cars featured here before, but we have covered a wheelchair with an EEG interface.

Continue reading “Think Your Way To Work In A Mind-Controlled Tesla”

Mind-Controlled Prosthetic Arm

Losing a limb often means getting fitted for a prosthetic. Although there have been some scientific and engineering advances (compare a pirate’s peg leg to “blade runner” Oscar Pistorius’ legs), they are still just inert attachments to your body. Researchers at Johns Hopkins hope to change all that. In the Journal of Neural Engineering, they announced a proof-of-concept design that allowed a person to control prosthetic fingers using mind control.

Continue reading “Mind-Controlled Prosthetic Arm”

School Of Friends Use Thought Control On A Shark

[Chip Audette] owns (at least) two gadgets: one of those remote-control helium-filled flying sharks (an Air Swimmer), and an OpenBCI EEG system that can read brain waves and feed the data to a PC. Given that information, it can hardly surprise you that [Chip] decided to control his flying fish with his brain.

Before you get too excited, you have to (like [Chip]) alter your expectations. While an EEG has a lot of information, your direct thoughts are (probably) not readable. However, certain actions create easily identifiable patterns in the EEG data. In particular, closing your eyes creates a strong 10 Hz signal across the back of the head.
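Spotting that eyes-closed alpha burst doesn’t take heavy math: watch the power in a narrow band around 10 Hz and trip a threshold when it jumps. Here’s a generic single-bin Goertzel sketch of the idea, assuming a common 250 Hz OpenBCI sample rate and a made-up threshold; [Chip]’s actual processing may differ.

```cpp
#include <math.h>

// Single-bin Goertzel: estimate the power near one frequency over a block
// of EEG samples. Sample rate and threshold below are assumptions.
float goertzelPower(const float *samples, int n, float fs, float freq) {
  float coeff = 2.0f * cosf(2.0f * M_PI * freq / fs);
  float s1 = 0, s2 = 0;
  for (int i = 0; i < n; i++) {
    float s0 = samples[i] + coeff * s1 - s2;
    s2 = s1;
    s1 = s0;
  }
  return s1 * s1 + s2 * s2 - coeff * s1 * s2;  // power in that bin
}

bool eyesClosed(const float *block, int n) {
  // 250 Hz sample rate, 10 Hz alpha bin, hypothetical threshold
  return goertzelPower(block, n, 250.0f, 10.0f) > 1.0e6f;
}
```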

Continue reading “School Of Friends Use Thought Control On A Shark”

Mind-controlling Cockroaches

Producing micro robotics is not yet easy or cost-effective, but why do we need to when we can just control the minds of cockroaches? A team of researchers from North Carolina State University is calling this augmented Madagascar hissing cockroach an Insect Biobot in their latest research paper (PDF). It’s not the first time the subject has come up; there have already been proofs of concept in research and even more amateur endeavors. But the accuracy and control seen in the video after the break are beyond compare.

The roach is being controlled to perfectly follow a line on the floor. One of the things that makes this iteration work so well is that the microcontroller includes a new type of ADC-based feedback loop for the stimulation of the insect brain. This helps to ensure that the roach will not grow accustomed to the stimulation and stop responding to it. Since this variety of insect can live for about two years, this breakthrough makes it into a reusable tool. We’re not sure what that tool will be used for, but perhaps the next plague of insects will be controlled by man, and not Mother Nature.
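The paper has the real details, but the gist of closed-loop stimulation is easy to caricature: read back what’s actually being delivered through the ADC and nudge the drive toward a setpoint, rather than firing a fixed pulse and hoping. The snippet below is purely illustrative of that idea and is not the researchers’ implementation.

```cpp
// Caricature of an ADC-based stimulation feedback loop: measure the delivered
// stimulus and adjust the PWM drive toward a target. Pins, gains, and the
// target value are all hypothetical.
const int STIM_PWM_PIN   = 3;    // drives the stimulation circuit
const int STIM_SENSE_PIN = A0;   // ADC readback of the delivered stimulus
int drive = 0;

void setup() {
  pinMode(STIM_PWM_PIN, OUTPUT);
}

void loop() {
  const int target = 512;                         // desired ADC reading (hypothetical)
  int error = target - analogRead(STIM_SENSE_PIN);
  drive = constrain(drive + error / 16, 0, 255);  // crude proportional step
  analogWrite(STIM_PWM_PIN, drive);
  delay(10);
}
```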

Continue reading “Mind-controlling Cockroaches”

Flinging Birds And Slaying Pigs With Your Thoughts


[Rafael Mizrahi] and [Anat Sambol] decided that Angry Birds was missing one crucial element: mind control. They grabbed a copy of the game for their netbook, and [Rafael] strapped on an Emotiv EPOC headset to see if he could play it without using a mouse or keyboard. While he was able to move the cursor around with his thoughts, he found that Emotiv’s EmoKey software lacked any sort of mouse button support. Undaunted, they turned to the Internet for help and found that the Emotiv’s output could be mapped to the mouse via another application, GlovePIE.

As you can see in the video below, their efforts were successful, though we doubt [Rafael] will be giving up his mouse completely just yet. With some more refinement, we imagine [Rafael] will be blasting pigs to kingdom come in no time.

If you are interested in trying this yourself, be aware that only the SDK version of the EPOC headset can be paired with third-party applications; the standard consumer version is locked into using only authorized software.

Continue reading if you would like to see a video of their Angry Birds neural interface in action.

Continue reading “Flinging Birds And Slaying Pigs With Your Thoughts”