Mind Control… No, Not Like That

[Vintage Geek] found an interesting device from 1996 called “MindDrive” which claims you can control your computer with your brain. Oddly, though, it doesn’t connect to your head. Instead, it has a little finger sensor that looks like a pulse-ox sensor. Did it work? The video below will show you what it can and can’t do.

The company claims the device is the result of seven years of research. We suspect it is little more than a galvanic skin response sensor, like a kid’s toy lie detector. There is a gold sensor and a Velcro strap. It is hard to imagine that “thinking left” would produce a change at your fingertip that the device could reliably interpret.


Mind-Controlled Flamethrower

Mind control might seem like something out of a sci-fi show, but like the tablet computer, universal translator, or virtual reality device, it is actually a technology that has made it into the real world. While these devices often require advanced and expensive equipment to interpret brain waves properly, with the right machine learning system it’s possible to build something like this mind-controlled flamethrower on a much smaller budget. (Video, embedded below.)

[Nathaniel F] was already experimenting with brain-computer interfaces and machine learning, and wanted to see if he could build something practical combining the two technologies. Instead of turning to a professional EEG machine to read brain patterns, he picked up a much less expensive Mindflex headset and paired it with a machine learning system running TensorFlow to make up for some of its shortcomings. The processing is done by a Raspberry Pi 4, which sends commands to an Arduino to fire the flamethrower when it detects the proper thought patterns. Don’t forget the flamethrower part of this build either: it was designed and built entirely by [Nathaniel F] as well, using gas and an arc lighter.
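
To make the data flow concrete, here’s a minimal Python sketch of what the Raspberry Pi side of such a pipeline could look like. It is an illustration rather than [Nathaniel F]’s actual code: the model file, the get_band_powers() stub, and the one-byte serial command are all assumptions made for the example.

```python
# Illustrative sketch only, not [Nathaniel F]'s code. The model file, the
# get_band_powers() stub, and the one-byte serial command are assumptions.
import time

import numpy as np
import serial            # pyserial
import tensorflow as tf

model = tf.keras.models.load_model("mind_model.h5")         # hypothetical trained classifier
arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # Arduino that owns the ignition

def get_band_powers():
    """Stand-in for parsing the Mindflex's eight EEG band-power values."""
    return [0.0] * 8     # replace with real headset data

while True:
    features = np.array(get_band_powers(), dtype=np.float32).reshape(1, -1)
    fire_prob = float(model.predict(features, verbose=0)[0][0])
    if fire_prob > 0.9:            # only act on a confident "fire" prediction
        arduino.write(b"F")        # Arduino handles gas and arc lighter timing
    time.sleep(0.25)
```

Keeping the Pi down to a single command byte and leaving ignition timing to the Arduino is the sensible split for anything involving open flame.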

The build took many hours of training to gather enough data for the neural network, and it works as the proof of concept he was hoping for, but [Nathaniel F] notes that it could be improved by replacing the outdated Mindflex with a better EEG headset. For now though, we appreciate seeing sci-fi in the real world in projects like this, or in other mind-controlled projects like this one, which converts a prosthetic arm into a mind-controlled music synthesizer.


Mind-Controlled Beer Pong Gets Easier As You Drink

Wouldn’t it be nice if beer pong could somehow get easier the more you drink? You know, so you can drink more? [Ty Palowski] has made it so with automated, mind-controlled beer pong.

[Ty] started by making a beer pong table that moves the cups back and forth at both ends. An Arduino Nano controls a stepper motor that drives a slider, and the cups move with the slider through the magic of magnets. The mind control part came cheaper than you might think. Back in 2009, Mattel released a game called Mindflex that uses an EEG headset and your brain waves to guide a foam ball on a stream of air through a little obstacle course. These headsets are available for about $12 on eBay, or at least they were before this post went up.

[Ty] cracked open the headset and added an HC-06 Bluetooth module to talk to the Arduino. He’s using a program called Brainwave OSC to get the raw data from the headset and break it into levels of concentration and relaxation. The Arduino program monitors the attention level, and when a certain threshold of focus is reached, it moves the cups back and forth at a predetermined speed ranging from 1 to an impossible-looking 10. Check out the two videos after the break. The first one covers the making of the automatic beer pong part, and the second is where [Ty] adds mind control.
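
As a rough sketch of that logic (not [Ty]’s actual code, which lives on the Arduino), the attention-to-speed mapping could look like this in Python, using the python-osc package to receive whatever Brainwave OSC broadcasts. The /attention address, the threshold, and the one-byte serial protocol to the Nano are assumptions.

```python
# Hedged sketch of the attention-to-speed logic, not [Ty]'s code. The
# /attention address, the threshold, and the serial protocol are assumptions.
import serial                                    # pyserial
from pythonosc import dispatcher, osc_server

nano = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
THRESHOLD = 60                                   # attention score needed to start the cups moving

def on_attention(address, attention):
    if attention >= THRESHOLD:
        # Map attention 60..100 onto the cup speed settings 1..10
        speed = 1 + round((attention - THRESHOLD) / (100 - THRESHOLD) * 9)
        nano.write(bytes([int(speed)]))          # Nano drives the stepper at this speed
    else:
        nano.write(bytes([0]))                   # park the slider

disp = dispatcher.Dispatcher()
disp.map("/attention", on_attention)             # address path is a guess
osc_server.BlockingOSCUDPServer(("127.0.0.1", 9000), disp).serve_forever()
```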

We’ve seen a different headset — the hacker-friendly NeuroSky Mindwave — pop up a few times. Here’s one that’s been hacked to induce lucid dreaming.



Project Perceives Pondering, Prints Poetry

If poetry is your thing, this hack might convince you that your brain is more advanced than the rest of us poor sots. [Roni Brandini] designed a system that prints lines of poetry when you concentrate. The Mind Poetry project uses an EEG headset from Mattel’s Mindflex toy and pipes your brain’s signals to an Arduino Mega 2560. The system then looks for patterns of brain waves that indicate concentration. As you maintain your concentration, the system continues to print lines of poetry to a small display.

Tapping into the Mindflex

[Roni] follows the standard Mindflex hack process by tapping into the data transmission pin on the Mindflex board. Optoisolation is provided by a PC817 to make sure wall power can’t accidentally bleed over into your own wetware. You could get away with just using batteries, but isolation is still a best practice.

The Arduino Brain Library is used to decipher the signal. The Mindflex picks up brain waves from roughly 1 Hz to 50 Hz, which is enough bandwidth to approximately determine mental state. For example, Theta waves are in the 4 Hz to 7 Hz range and can indicate a relaxed, meditative state. Low Beta waves range from 13 Hz to 17 Hz and indicate an alert, focused mental state. The Mindflex system is also generous in that it provides derived meditation and attention scores, ranging from 0 to 100.
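
In the project itself the Arduino Mega handles everything, but if you wanted to pull the same numbers into a PC, the Brain Library streams each update as a single CSV line. The sketch below is only an illustration: the field order follows the library’s documented output (signal quality, attention, meditation, then the eight power bands), but double-check it, and the serial port is a placeholder.

```python
# Illustrative parser for the Brain Library's CSV output. The field order
# follows the library's documentation but double-check it; the port is a placeholder.
import serial                      # pyserial

FIELDS = ["signal_quality", "attention", "meditation",
          "delta", "theta", "low_alpha", "high_alpha",
          "low_beta", "high_beta", "low_gamma", "high_gamma"]

port = serial.Serial("/dev/ttyACM0", 9600, timeout=2)

while True:
    values = port.readline().decode(errors="ignore").strip().split(",")
    if len(values) != len(FIELDS):
        continue                   # skip partial or malformed packets
    try:
        reading = dict(zip(FIELDS, (int(v) for v in values)))
    except ValueError:
        continue                   # skip garbled numbers
    print(reading["attention"], reading["meditation"], reading["low_beta"])
```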

It’s difficult to get a high level of precision with this sensor and sampling system, so the code uses [Roni]’s custom recipe of meditation score, attention score, and Low Beta value. He finds it most effective to trigger actions based on the relationship between these scores instead of focusing on the readings themselves. For example, an uptick in both Low Beta waves and the attention score indicates concentration.
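
That kind of relative trigger is simple to express in code. The Python sketch below compares each new reading against a short rolling baseline; the window length and the 20% jump thresholds are invented for illustration and are not [Roni]’s actual recipe.

```python
# Sketch of a "relationship, not raw readings" trigger. The window length and
# the 20% jump thresholds are invented for illustration, not [Roni]'s numbers.
from collections import deque

attention_history = deque(maxlen=10)    # last ten attention scores
low_beta_history = deque(maxlen=10)     # last ten low-beta power values

def concentrating(attention, low_beta):
    """Return True when both values jump above their recent averages."""
    triggered = False
    if len(attention_history) == attention_history.maxlen:
        attention_base = sum(attention_history) / len(attention_history)
        low_beta_base = sum(low_beta_history) / len(low_beta_history)
        triggered = (attention > 1.2 * attention_base and
                     low_beta > 1.2 * low_beta_base)
    attention_history.append(attention)
    low_beta_history.append(low_beta)
    return triggered
```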

Mindflex Brainwave Chart

If the wearer is concentrating, the system prints lines of poetry to the display and charts the three values. As an added bit of gamification, it’ll tell you how many times you broke concentration before you completed the poem. One can imagine a game that tries to break your concentration by printing other phrases or even activating an array of mechanical distractions.

If poetry isn’t your thing, you’re in luck. The “Mind Poetry” project also makes some headway (pun intended) with processing the EEG headset’s signals and triggering actions, which means you don’t have to be into the poetry scene to reap the benefits. You now have the bones of a hack that lets you control things with your brain muscles and without your muscle muscles.

For inspiration, check out some other Mindflex hacks that let you order drinks with your mind (recommended), shock the heck out of people (not recommended), or even move your skirt around (uh… you do you?).


Move A Robotic Hand With Your Nerve Impulses

Many of us will have seen robotics or prosthetics operated by the electrical impulses detected from a person’s nerves, or their brain. In one form or another they are a staple of both mass-market technology news coverage and science fiction.

The point the TV journalists and the sci-fi authors fail to address, though, is this: how does it work? On a simple level they might say that the signal from an individual nerve is picked up just as though it were a wire in a loom, and sent to the prosthetic. But that’s a for-the-children explanation that is rather evidently not possible with a few electrodes on the skin. How do they really do it?

A project from [Bruce Land]’s Cornell University students [Michael Haidar], [Jason Hwang], and [Srikrishnaa Vadivel] seeks to answer that question. They’ve built an interface that allows them to control a robotic hand using signals gathered from electrodes placed on their forearms. And their write-up is a fascinating read, for within that project lie a multitude of challenges, of which the hand itself is only a minor one that they solved with an off-the-shelf kit.

The interface itself had to solve the problem of picking up the extremely weak nerve impulses while simultaneously avoiding interference from mains hum and fluorescent lights. They go into detail about their filter design, and their use of isolated power supplies to reduce this noise as much as possible.
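
Their filtering is done in analog hardware, but the same idea is easy to demonstrate digitally. Here’s a hedged SciPy sketch that notches mains hum out of a sampled signal; the sample rate, notch frequency, and Q factor are arbitrary example values, not figures from the project.

```python
# Digital stand-in for the analog mains-hum filtering described above.
# Sample rate, notch frequency, and Q factor are arbitrary example values.
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 1000.0                                # sample rate in Hz (example)
mains = 60.0                               # use 50.0 in much of the world

# Synthetic "nerve signal": weak noise buried under mains hum
t = np.arange(0, 2.0, 1.0 / fs)
raw = 0.05 * np.random.randn(t.size) + 0.5 * np.sin(2 * np.pi * mains * t)

b, a = iirnotch(w0=mains, Q=30.0, fs=fs)   # narrow notch at the mains frequency
cleaned = filtfilt(b, a, raw)              # zero-phase filtering

print("RMS before:", np.std(raw), "after:", np.std(cleaned))
```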

Even with the perfect interface, though, they still had to train their software to identify different finger movements. Plotting the readings from their two electrodes as the axes of a graph, they were able to map regions of the graph corresponding to individual muscles. Finally, the answer that displaces the for-the-children explanation.
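
A minimal version of that region-mapping idea is a nearest-centroid classifier over the two electrode readings. The sketch below is only an illustration: the centroid coordinates are invented, and the students’ actual processing is more involved.

```python
# Toy nearest-centroid classifier over two electrode readings, echoing the
# regions-of-the-graph idea. The centroid coordinates are invented examples.
import math

# (electrode_1, electrode_2) centroids you would learn during a training pass
CENTROIDS = {
    "rest":  (0.10, 0.12),
    "index": (0.80, 0.20),
    "pinky": (0.25, 0.85),
}

def classify(e1, e2):
    """Return the label whose training centroid is closest to this reading."""
    return min(CENTROIDS, key=lambda name: math.dist((e1, e2), CENTROIDS[name]))

print(classify(0.78, 0.25))   # -> "index"
```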

There are several videos linked from their write-up, but the one we’re leaving you with below is a test performed in a low-noise environment. They found their lab had so much noise that they couldn’t reliably demonstrate all fingers moving, and we think it would be unfair to show you anything but their most successful demo. But it’s also worth remembering how hard it was to get there.


Think Your Way To Work In A Mind-Controlled Tesla

When you own an $80,000 car, you might be inclined to never take it out of the garage. But normal often isn’t what we do around here, so seeing a Tesla Model S driven by mind control is only slightly shocking.

[Casey_S] appears to be the owner of the Tesla Model S in question, but if he’s not, he’ll have some ‘splaining to do. He took the gigantic battery and computer in a car-shaped case, er, luxury car to a hackathon in Berkeley last week and promptly fitted it with the gear needed to drive the car remotely. Yes, the Model S has steering motors built in, but Tesla hasn’t been forthcoming with an API to access such functions. So [Casey_S] and his team had to cobble together a steering servo from a windshield wiper motor and a potentiometer mounted to a frame made of 2x4s. Linear actuators attach to the brake and accelerator pedals, and everything talks to an Arduino.
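
The steering rig is, in effect, a homemade closed-loop servo: the potentiometer reports where the wheel is, and the Arduino drives the wiper motor until it matches the commanded position. Here’s a hedged Python sketch of that loop; the gain, deadband, and hardware stubs are invented, and the team’s real code runs on the Arduino.

```python
# Sketch of the closed-loop steering idea: potentiometer feedback plus a
# proportional controller on the wiper motor. The gain, deadband, and the
# hardware stubs are invented; the team's real code runs on the Arduino.
KP = 4.0                      # proportional gain (example value)
DEADBAND = 0.02               # ignore tiny errors so the motor doesn't chatter

def read_potentiometer():
    """Stand-in for the pot on the steering frame, returning 0.0..1.0."""
    return 0.5

def set_motor(power):
    """Stand-in for driving the wiper motor, power in -1.0..1.0."""
    print(f"motor power: {power:+.2f}")

def steer_toward(target):
    """One control step: push the wheel toward the target pot position."""
    error = target - read_potentiometer()
    if abs(error) < DEADBAND:
        set_motor(0.0)
    else:
        set_motor(max(-1.0, min(1.0, KP * error)))

steer_toward(0.65)            # e.g. a gentle nudge to the right
```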

The really interesting part is that the whole thing is controlled by an electroencephalography helmet and a machine learning algorithm that detects when the driver thinks “forward” or “turn right.” It translates those thoughts to variables that drive the actuators. Unfortunately, space constraints kept [Casey_S] from really putting the rig through its paces, but the video after the break shows that the system worked well enough to move the car forward and steer a little.
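
In outline, the mapping from classifier output to actuator setpoints can be as plain as a lookup table. The sketch below is illustrative only: the class names, setpoint values, and send() helper are assumptions, not the team’s implementation.

```python
# Sketch of the thought-class-to-setpoint mapping. Class names, setpoint
# values, and the send() helper are assumptions, not the team's implementation.
COMMANDS = {
    "forward":    {"steering": 0.50, "throttle": 0.20, "brake": 0.0},
    "turn_right": {"steering": 0.65, "throttle": 0.10, "brake": 0.0},
    "stop":       {"steering": 0.50, "throttle": 0.00, "brake": 1.0},
}

def send(setpoints):
    """Stand-in for pushing setpoints to the Arduino over serial."""
    print(setpoints)

def act_on(thought):
    # Anything the classifier isn't sure about defaults to stopping the car
    send(COMMANDS.get(thought, COMMANDS["stop"]))

act_on("turn_right")
```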

There haven’t been too many thought-controlled cars featured here before, but we have covered a wheelchair with an EEG interface.


Mind-Controlled Prosthetic Arm

Losing a limb often means getting fitted for a prosthetic. Although there have been some scientific and engineering advances (compare a pirate’s peg leg to “blade runner” Oscar Pistorius’ legs), prosthetics are still just inert attachments to your body. Researchers at Johns Hopkins hope to change all that. In the Journal of Neural Engineering, they announced a proof-of-concept design that allowed a person to control prosthetic fingers using mind control.
