TOBE: Tangible Out-of-Body Experience With Biosignals

TOBE is a toolkit for creating Tangible Out-of-Body Experiences, created by [Renaud Gervais] and others and presented at TEI ’16, the Tenth International Conference on Tangible, Embedded, and Embodied Interaction. The goal is to expose users’ inner states through physiological signals such as heart rate or brain activity. The toolkit covers the creation of a 3D-printed avatar that displays visual representations of physiological signals (from ECG, EDA, EEG, EOG, and breathing sensors), the construction of those sensors on open hardware platforms such as Bitalino or OpenBCI, and the signal processing, which is handled in OpenViBE.
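As a rough illustration of the sensing side, here is a minimal sketch (not the toolkit's actual code, which lives in OpenViBE scenarios) of turning a raw ECG window from a Bitalino-class board into a heart-rate figure an avatar could display; the sampling rate, threshold, and function names are all assumptions:

    # Hypothetical sketch: estimate heart rate from raw ECG samples so a
    # TOBE-style avatar can display it. Not code from the actual toolkit.
    import numpy as np
    from scipy.signal import find_peaks

    def estimate_bpm(ecg_samples, fs=100.0):
        """Estimate beats per minute from a window of raw ECG samples."""
        # Normalize, then find R-peaks: tall spikes at least 0.4 s apart
        # (caps detection around 150 BPM, plenty for a seated demo).
        x = (ecg_samples - np.mean(ecg_samples)) / np.std(ecg_samples)
        peaks, _ = find_peaks(x, height=1.5, distance=int(0.4 * fs))
        if len(peaks) < 2:
            return None  # too few beats in this window to estimate a rate
        rr = np.diff(peaks) / fs       # seconds between successive beats
        return 60.0 / np.mean(rr)      # average BPM over the window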

In their research paper, the team identified the relevant signals and mental states and organized them into three types:

  • States perceived by both self and others, e.g. eye blinks. Even though these signals may seem redundant, since one can simply look at the person to see them, they are crucial for associating the feedback with its user (see the sketch after this list).
  • States perceived only by self, e.g. heart rate or breathing. Mirroring these signals lends the feedback a sense of presence.
  • States hidden from both self and others, e.g. mental states such as cognitive workload. These metrics hold the most promising applications, since they are mostly unexplored.
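For instance, a blink from the first category shows up on an EOG channel as a large, brief deflection, so even a crude detector can flag it. The following sketch is purely illustrative, with a guessed threshold and refractory period rather than values from the paper:

    # Hypothetical blink detector on a z-scored EOG trace; threshold and
    # refractory period are illustrative guesses, not the paper's values.
    import numpy as np

    def detect_blinks(eog, fs=250.0, threshold=3.0):
        """Return sample indices where a blink deflection begins."""
        z = (eog - np.mean(eog)) / np.std(eog)
        above = z > threshold
        onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
        kept, last = [], -np.inf
        for i in onsets:
            if i - last > 0.3 * fs:   # ignore re-crossings within 0.3 s
                kept.append(int(i))
                last = i
        return kept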

By visualising their own inner states and being able to share them, users can develop a better understanding of themselves as well as of others. Analysing their avatar in different contexts lets users see how they react in situations such as stress, work, or play. Bring several users together and they can see how each responds to the same stimuli, for example. Continue reading “TOBE: Tangible Out-of-Body Experience With Biosignals”

School Of Friends Use Thought Control On A Shark

[Chip Audette] owns (at least) two gadgets: one of those remote-control helium-filled flying sharks (an Air Swimmer), and an OpenBCI EEG system that can read brain waves and feed the data to a PC. Given that combination, it can hardly surprise you that [Chip] decided to control his flying fish with his brain.

Before you get too excited, you have to (like [Chip]) adjust your expectations. While an EEG carries a lot of information, your direct thoughts are (probably) not readable. However, certain actions create easily identifiable patterns in the EEG data. In particular, closing your eyes creates a strong 10 Hz signal across the back of the head.
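A minimal sketch of how that eyes-closed signature could become a control signal: compare alpha-band (8-12 Hz) power against broadband power on an occipital channel and fire a command when the ratio spikes. The channel, window length, and 3x threshold below are assumptions, not details from [Chip]'s build:

    # Hypothetical eyes-closed detector: alpha power rises sharply over
    # occipital sites when the eyes close. Thresholds are guesses.
    import numpy as np
    from scipy.signal import welch

    def eyes_closed(eeg_window, fs=250.0, ratio_threshold=3.0):
        """True if 8-12 Hz power dominates an occipital EEG window."""
        nperseg = min(len(eeg_window), int(fs))
        freqs, psd = welch(eeg_window, fs=fs, nperseg=nperseg)
        alpha = psd[(freqs >= 8) & (freqs <= 12)].mean()
        broadband = psd[(freqs >= 1) & (freqs <= 30)].mean()
        return alpha / broadband > ratio_threshold

Eyes closed could then map to, say, "climb", and eyes open to "level off".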

Continue reading “School Of Friends Use Thought Control On A Shark”

Brains Controlling Labyrinths Without Hands

[Daniel], [Gal], and [Maxim] attended a hackathon last weekend – Brainihack 2015 – that focused on neuroscience-themed builds in a day-and-a-half-long build-off. The trio are communications systems engineering and computer science students with no background in neuroscience whatsoever. You can’t build an fMRI in a day and a half, so they ended up winning best project in the open source category with a brain-controlled labyrinth game.

The labyrinth itself is entirely 3D printed and much, much simpler than the usual wooden-maze-with-holes design generally associated with labyrinth puzzles. It’s really just a plastic spiral for a ball to follow. There’s a reason for this simplicity: the team is using EEG to detect brain waves and tilt the labyrinth on its X and Y axes.

The team is using OpenBCI as the interface between their brains and a pair of servos. This is actually an interesting piece of tech; unlike toys such as the NeuroSky MindWave and the Star Wars Force Trainer, the OpenBCI gives you eight input channels that can attach anywhere on the scalp. The team used these inputs to measure alpha waves and steady-state visually evoked potentials (SSVEP), and mapped them to the pair of servos on the labyrinth frame.
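SSVEP relies on flickering visual targets at known rates; attending to one boosts that frequency in the occipital EEG, so classification reduces to comparing band powers. A hypothetical two-target classifier along those lines (the flicker frequencies and the servo mapping are assumptions, not the team's values):

    # Hypothetical SSVEP classifier: pick whichever flicker frequency
    # dominates the occipital spectrum. Frequencies are made up here.
    import numpy as np
    from scipy.signal import welch

    STIMULUS_FREQS = [7.0, 13.0]   # e.g. left target vs. right target

    def ssvep_target(eeg_window, fs=250.0):
        """Return the index of the attended stimulus frequency."""
        nperseg = min(len(eeg_window), int(2 * fs))
        freqs, psd = welch(eeg_window, fs=fs, nperseg=nperseg)
        powers = [psd[(freqs >= f - 0.5) & (freqs <= f + 0.5)].mean()
                  for f in STIMULUS_FREQS]
        return int(np.argmax(powers))   # 0 or 1, mapped to a servo angle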

It’s a great build, a wonderful demonstration of a device that outputs real EEG signals, and the team won a prize. What’s not to like?