TOBE: Tangible Out-of-Body Experience With Biosignals

TOBE is a toolkit for creating Tangible Out-of-Body Experiences, created by [Renaud Gervais] and others and presented at TEI ’16, the Tenth International Conference on Tangible, Embedded, and Embodied Interaction. The goal is to expose users’ inner states using physiological signals such as heart rate or brain activity. The toolkit covers the creation of a 3D printed avatar that displays visual representations of physiological signals (ECG, EDA, EEG, EOG, and breathing), the construction of those sensors on open hardware platforms such as Bitalino or OpenBCI, and signal processing software built on OpenViBE.
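
To make that concrete, here is a minimal sketch of how one raw signal could become part of the avatar’s display: estimate heart rate from ECG R-peaks, then map it onto a color for the avatar’s “heart”. This is an illustration only, not the toolkit’s actual pipeline (TOBE does its signal processing in OpenViBE), and the sample rate and peak threshold are assumptions.

    import numpy as np
    from scipy.signal import find_peaks

    FS = 250  # assumed ECG sample rate in Hz

    def heart_rate_bpm(ecg, fs=FS):
        """Estimate heart rate from the spacing of R-peaks in a raw ECG segment."""
        peaks, _ = find_peaks(ecg,
                              height=np.mean(ecg) + 2 * np.std(ecg),
                              distance=int(0.4 * fs))  # ~400 ms refractory period
        if len(peaks) < 2:
            return None
        rr = np.diff(peaks) / fs  # R-R intervals in seconds
        return 60.0 / np.mean(rr)

    def heart_color(bpm, lo=50.0, hi=120.0):
        """Map BPM onto a blue (calm) to red (excited) gradient for the avatar."""
        t = float(np.clip((bpm - lo) / (hi - lo), 0.0, 1.0))
        return (int(255 * t), 0, int(255 * (1 - t)))  # (R, G, B)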

In their research paper, the team identified the relevant signals and mental states and organized them into three types:

  • States perceived by both self and others, e.g. eye blinks. Even if these signals may seem redundant, since one could simply look at the person to see them, they are crucial for associating the feedback with its user.
  • States perceived only by self, e.g. heart rate or breathing. Mirroring these signals lends the feedback a sense of presence.
  • States hidden from both self and others, e.g. mental states such as cognitive workload. These metrics hold the most promising applications, since they are largely unexplored.

By visualizing their own inner states, and being able to share them, users can develop a better understanding of themselves as well as of others. Observing their avatar in different contexts lets users see how they react in different scenarios such as stress, work, or play. Bring several users together and they can see how each responds to the same stimuli, for example. Continue reading “TOBE: Tangible Out-of-Body Experience With Biosignals”

Think Your Way To Work In A Mind-Controlled Tesla

A normal person who owns an $80,000 car might be inclined to never take it out of the garage. But normal often isn’t what we do around here, so seeing a Tesla Model S driven by mind control is only slightly shocking.

[Casey_S] appears to be the owner of the Tesla Model S in question, but if he’s not, he’ll have some ‘splaining to do. He took the luxury car, a gigantic battery and computer in a car-shaped case, to a hackathon in Berkeley last week and promptly fitted it with the gear needed to drive the car remotely. Yes, the Model S has steering motors built in, but Tesla hasn’t been forthcoming with an API to access such functions. So [Casey_S] and his team had to cobble together a steering servo from a windshield wiper motor and a potentiometer mounted to a frame made of 2x4s. Linear actuators attach to the brake and accelerator pedals, and everything talks to an Arduino.

The really interesting part is that the whole thing is controlled by an electroencephalography helmet and a machine learning algorithm that detects when the driver thinks “forward” or “turn right.” It translates those thoughts to variables that drive the actuators. Unfortunately, space constraints kept [Casey_S] from really putting the rig through its paces, but the video after the break shows that the system worked well enough to move the car forward and steer a little.
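
The team’s code doesn’t appear to be published, but the control flow is easy to picture: the classifier labels each EEG window, and a PC-side bridge forwards that label to the Arduino as a command byte. A hypothetical sketch, with the one-byte serial protocol and the classify() callback both invented for illustration:

    import serial  # pyserial

    COMMANDS = {"forward": b"F", "turn_right": b"R", "stop": b"S"}

    def drive(classify, port="/dev/ttyACM0", baud=115200):
        """classify is assumed to block until the next EEG decision and
        return one of the COMMANDS keys; anything else maps to stop."""
        with serial.Serial(port, baud, timeout=1) as arduino:
            while True:
                label = classify()
                arduino.write(COMMANDS.get(label, COMMANDS["stop"]))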

There haven’t been too many thought-controlled cars featured here before, but we have covered a wheelchair with an EEG interface.

Continue reading “Think Your Way To Work In A Mind-Controlled Tesla”

Hackaday Prize Entry: Lucid Dreaming Research

Lucid dreaming is one of the rare psychological phenomena terrible sci-fi frequently gets right. Yes, lucid dreaming does exist, and one of the best ways to turn a normal dream into a lucid dream is to fixate on a particular object, sound, or smell. For their Hackaday Prize entry, [Jae] is building a device to turn the electronic enthusiast community on to lucid dreaming. It’s a research platform that allows anyone to study their own dreams and access a world where you can do anything.

The core of this project is an 8-channel EEG used to measure the electrical activity in the brain during sleep. These EEG electrodes are fed into a 24-bit ADC, which is sampled 250 times per second by an ARM Cortex M4F microcontroller. The captured data is recorded or sent over a Bluetooth connection to a PC or smartphone, where a familiar sound can be played (think of the briefcase in Inception), or some other signal that tells the dreamer they’re dreaming.
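
The project logs don’t spell out the detection criteria, but a crude stand-in shows the shape of the PC-side logic: watch band powers in the incoming stream and fire the cue when the spectrum looks REM-like. The threshold here is purely illustrative:

    import numpy as np
    from scipy.signal import welch

    FS = 250  # sample rate from the article

    def band_power(x, lo, hi, fs=FS):
        f, pxx = welch(x, fs=fs, nperseg=2 * fs)
        return pxx[(f >= lo) & (f < hi)].sum()

    def looks_like_rem(window):
        """Crude stand-in: theta power well above alpha. Illustrative only."""
        return band_power(window, 4, 8) > 2 * band_power(window, 8, 12)

    # when looks_like_rem(last_30_seconds) holds, play the familiar cue sound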

We’ve seen a few similar builds in the past, most famously a NeuroSky MindWave headset turned into a comfortable single-channel EEG-type device. The NeuroSky hardware is limited, though, and a setup with proper amplifiers and ADCs will be significantly more helpful in debugging the meatspace between [Jae]’s ears.

Hacklet 105 – More Mind And Brain Hacks

A mind is a terrible thing to waste – but an awesome thing to hack. We last visited brain hacks back in July of 2015. Things happen fast on Hackaday.io. Miss a couple of days, and you’ll miss a bunch of great new projects, including some awesome new biotech hacks. This week, we’re checking out some of the best new mind and brain hacks on Hackaday.io.

We start with [Daniel Felipe Valencia V] and Brainmotic. Brainmotic is [Daniel’s] entry in the 2016 Hackaday Prize. Smart homes and the Internet of Things are huge buzzwords these days. [Daniel’s] project aims to meld this technology with electroencephalography (EEG): your mind will be able to control your home. This would be great for anyone, but it’s especially important for the handicapped. Brainmotic uses the open hardware OpenBCI as its brain interface. [Daniel’s] software and hardware will create a bridge between this interface and the user’s home.
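
The bridge itself isn’t specified in the project, but conceptually it turns a classifier decision into a home-automation message. A sketch using MQTT, with the broker address, topic layout, and payload all invented for illustration:

    import paho.mqtt.publish as publish

    def send_home_command(device, state, broker="192.168.1.10"):
        """Publish e.g. home/livingroom_lamp/set = ON to the house broker."""
        publish.single(f"home/{device}/set", payload=state, hostname=broker)

    # e.g., when the EEG classifier detects the "toggle lamp" intent:
    # send_home_command("livingroom_lamp", "ON")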


Next we have [Angeliki Beyko] with Serial / Wireless Brainwave Biofeedback. EEG used to be very expensive to implement, but things have gotten cheap enough that we now have brain-controlled toys on the market. [Angeliki] is hacking these toys into useful biofeedback tools. These tools can be used to visualize, and even control, the user’s state of mind. [Angeliki’s] weapon of choice is the MindFlex series of toys. With the help of a Punch Through LightBlue Bean, she was able to get the EEG headsets talking over Bluetooth. A bit of fancy software on the PC side allows the brainwave signals received by the MindFlex to be interpreted as simple graphs. [Angeliki] even went on to create a Mind-Controlled Robotic Xylophone based on this project.
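
Under the hood, NeuroSky chips like the one in the MindFlex speak the documented ThinkGear serial protocol: two 0xAA sync bytes, a payload length, the payload, and a one-byte checksum. Here’s a sketch of a parser that pulls out the attention and meditation values for graphing; the port name is an assumption:

    import serial  # pyserial

    def read_thinkgear(port="/dev/rfcomm0", baud=9600):
        with serial.Serial(port, baud) as s:  # blocking reads
            while True:
                if s.read(1) != b"\xaa" or s.read(1) != b"\xaa":
                    continue                       # hunt for the sync bytes
                length = s.read(1)[0]
                if length > 169:                   # spec: payloads top out at 169
                    continue
                payload = s.read(length)
                if (~sum(payload)) & 0xFF != s.read(1)[0]:
                    continue                       # bad checksum, resync
                i = 0
                while i < len(payload):
                    code = payload[i]
                    if code == 0x04:               # attention, 0-100
                        print("attention:", payload[i + 1]); i += 2
                    elif code == 0x05:             # meditation, 0-100
                        print("meditation:", payload[i + 1]); i += 2
                    elif code >= 0x80:             # multi-byte rows carry a length
                        i += 2 + payload[i + 1]
                    else:                          # other single-byte rows
                        i += 2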

Next is [Stuart Longland], who hopes to protect brains with Improved Helmets. Traumatic Brain Injury (TBI) is in the spotlight of medical technology these days. As bad as it may be, TBI is just one of several types of head and neck injuries one can sustain in a bicycle or motorcycle accident. Technology to reduce injury exists and is included in some new helmets, but many of these technologies, such as MIPS, are patented. [Stuart] is working to create a more accurate model of the head within the helmet, and of the brain within the skull. From this data he intends to create a license-free protection system which can be used in new helmets as well as retrofitted to existing ones.

Finally we have [Tom Meehan], whose entry in the 2016 Hackaday Prize is Train Your Brain with Neurofeedback. [Tom] is hoping to improve quality of life for people suffering from epilepsy, autism, ADHD, and other conditions through the use of neurofeedback. Like [Angeliki] above, [Tom] is hacking hardware from NeuroSky, in this case the MindWave headset. [Tom’s] current goal is to pull data from the TGAM1 board inside the MindWave. Once he obtains EEG data, a Java application running on the PC side will allow him to display users’ EEG information. This is a brand-new project with updates coming quickly, so it’s definitely one to watch!

If you want more mind hacking goodness, check out our freshly updated brain hacking project list! Did I miss your project? Don’t be shy, just drop me a message on Hackaday.io. That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!

Brain Waves Can Answer Spock’s (and VR’s) Toughest Question

In Star Trek IV: The Voyage Home, the usually unflappable Spock found himself stumped by one question: How do you feel? If researchers at the University of Memphis and IBM are correct, computers by Spock’s era might not have to ask. They’d know.

[Pouya Bashivan] and his colleagues used a relatively inexpensive EEG headset and machine learning techniques to determine if, with limited hardware, the computer could derive a subject’s mental state. This has several potential applications including adapting virtual reality avatars to match the user’s mood. A more practical application might be an alarm that alerts a drowsy driver.
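
In broad strokes, that kind of pipeline reduces each EEG window to band-power features and hands them to an off-the-shelf classifier. A sketch of the idea, not the paper’s actual features or model:

    import numpy as np
    from scipy.signal import welch
    from sklearn.ensemble import RandomForestClassifier

    BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

    def features(window, fs=128):
        """window: (channels, samples) array -> flat vector of band powers."""
        f, pxx = welch(window, fs=fs, nperseg=fs)
        return np.hstack([pxx[:, (f >= lo) & (f < hi)].mean(axis=1)
                          for lo, hi in BANDS.values()])

    # X = np.vstack([features(w) for w in windows]); y = mood labels
    # clf = RandomForestClassifier().fit(X, y)
    # state = clf.predict([features(new_window)])[0]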

Continue reading “Brain Waves Can Answer Spock’s (and VR’s) Toughest Question”

Kay Igwe Explains Brain Gaming Through SSVEP

We had some incredible speakers at the Hackaday SuperConference. One of the final talks was given by [Kay Igwe], a graduate electrical engineering student at Columbia University. [Kay] has worked in nanotechnology as well as semiconductor manufacturing for Intel. These days, she’s spending her time playing games – but not with her hands.

Many of us love gaming, and probably spend way too much time on our computers, consoles, or phones playing games. But what about people who don’t have the use of their hands, such as ALS patients? Bringing gaming to the disabled is what prompted [Kay] to work on Control iT, a brain interface for controlling games. Brain-computer interfaces invoke images of electroencephalography (EEG) machines. Usually that means tons of electrodes, gel in your hair, and data buried in the noise.

[Kay Igwe] is exploring a very interesting phenomenon that uses flashing lights to elicit specific, easy-to-detect brain waves. This type of interface is very promising, and it’s the topic of the talk she gave at this year’s Hackaday SuperConference. Check out the video of her presentation, then join us after the break as we dive into the details of her work.
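
The trick behind SSVEP: stare at a target flickering at, say, 10 Hz, and the EEG over your visual cortex gains power at 10 Hz and its harmonics, so the computer only has to find which stimulus frequency wins. Real systems often use multichannel methods such as canonical correlation analysis; this single-channel power comparison, with assumed stimulus frequencies, just shows the idea:

    import numpy as np

    def ssvep_pick(signal, fs, stim_freqs=(8.0, 10.0, 12.0), tol=0.2):
        """Return the flicker frequency with the most (windowed) spectral power."""
        spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal)))) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

        def power_at(f0):                  # fundamental plus 2nd harmonic
            return sum(spectrum[np.abs(freqs - f) < tol].sum()
                       for f in (f0, 2 * f0))

        return stim_freqs[int(np.argmax([power_at(f) for f in stim_freqs]))]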

Continue reading “Kay Igwe Explains Brain Gaming Through SSVEP”

School Of Friends Use Thought Control On A Shark

[Chip Audette] owns (at least) two gadgets: one of those remote control helium-filled flying sharks (an Air Swimmer), and an OpenBCI EEG system that can read brain waves and feed the data to a PC. Given that information, it can hardly surprise you that [Chip] decided to control his flying fish with his brain.

Before you get too excited, you have to (like [Chip]) alter your expectations. While an EEG carries a lot of information, your direct thoughts are (probably) not readable. However, certain actions create easily identifiable patterns in the EEG data. In particular, closing your eyes creates a strong 10 Hz signal across the back of the head.
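
That 10 Hz rhythm is the classic eyes-closed alpha wave, and spotting it takes nothing fancier than a band-power comparison. A sketch, with the sample rate and threshold as assumptions:

    import numpy as np
    from scipy.signal import welch

    def eyes_closed(window, fs=250):
        """True when ~10 Hz alpha power dominates its neighboring bands."""
        f, pxx = welch(window, fs=fs, nperseg=2 * fs)
        alpha = pxx[(f >= 8) & (f <= 12)].mean()
        rest = pxx[(f >= 4) & (f < 8)].mean() + pxx[(f > 12) & (f <= 30)].mean()
        return alpha > 1.5 * rest  # illustrative threshold; calibrate per user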

Continue reading “School Of Friends Use Thought Control On A Shark”