Brain Waves Can Answer Spock’s (and VR’s) Toughest Question

In Star Trek IV: The Voyage Home, the usually unflappable Spock found himself stumped by one question: How do you feel? If researchers at the University of Memphis and IBM are correct, computers by Spock’s era might not have to ask. They’d know.

[Pouya Bashivan] and his colleagues used a relatively inexpensive EEG headset and machine learning techniques to determine whether, with limited hardware, a computer could infer a subject’s mental state. This has several potential applications, including adapting virtual reality avatars to match the user’s mood. A more practical application might be an alarm that alerts a drowsy driver.
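The usual recipe for this kind of classifier is to turn raw EEG into band-power features (alpha, beta, and so on) and let a model sort the states apart. Here is a minimal sketch of that idea on synthetic data — this is an illustration of the general approach, not the authors’ actual pipeline:

```python
# Sketch: telling mental states apart from EEG band power.
# The "relaxed" and "alert" epochs below are synthetic, not real EEG.
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

np.random.seed(0)
fs = 128  # Hz, typical of consumer EEG headsets
t = np.arange(fs * 4) / fs
# Synthetic "relaxed" epoch: strong 10 Hz alpha rhythm plus noise
relaxed = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(len(t))
# Synthetic "alert" epoch: strong 20 Hz beta rhythm plus noise
alert = np.sin(2 * np.pi * 20 * t) + 0.3 * np.random.randn(len(t))

for label, epoch in [("relaxed", relaxed), ("alert", alert)]:
    alpha = band_power(epoch, fs, 8, 12)    # alpha band, 8-12 Hz
    beta = band_power(epoch, fs, 13, 30)    # beta band, 13-30 Hz
    guess = "relaxed" if alpha > beta else "alert"
    print(label, "->", guess)
```

A real system would feed features like these into a trained classifier rather than a hand-set threshold, but the feature extraction step looks much the same.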

Continue reading “Brain Waves Can Answer Spock’s (and VR’s) Toughest Question”

Kay Igwe Explains Brain Gaming Through SSVEP


We had some incredible speakers at the Hackaday SuperConference. One of the final talks was given by [Kay Igwe], a graduate electrical engineering student at Columbia University. [Kay] has worked in nanotechnology as well as semiconductor manufacturing for Intel. These days, she’s spending her time playing games – but not with her hands.

Many of us love gaming, and probably spend way too much time on our computers, consoles, or phones playing games. But what about people who don’t have the use of their hands, such as ALS patients? Bringing gaming to the disabled is what prompted [Kay] to work on Control iT, a brain interface for controlling games. Brain-computer interfaces evoke images of electroencephalography (EEG) machines. Usually that means tons of electrodes, gel in your hair, and data which is buried in the noise.

[Kay Igwe] is exploring a very interesting phenomenon that uses flashing lights to elicit very specific, easy-to-detect brain waves. This type of interface is very promising and is the topic of the talk she gave at this year’s Hackaday SuperConference. Check out the video of her presentation, then join us after the break as we dive into the details of her work.
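The phenomenon is the steady-state visually evoked potential (SSVEP): a light flickering at f Hz drives a measurable brain response at f Hz, so picking out which target the user is watching reduces to comparing spectral power at the candidate flicker frequencies. A minimal sketch on synthetic data (a real system would use filtered EEG from an occipital electrode):

```python
# Sketch of SSVEP detection: find which candidate flicker frequency
# dominates the spectrum. The EEG here is synthetic.
import numpy as np

def detect_ssvep(eeg, fs, candidates):
    """Return the candidate frequency with the most spectral power."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2
    powers = {f: psd[np.argmin(np.abs(freqs - f))] for f in candidates}
    return max(powers, key=powers.get)

np.random.seed(1)
fs = 250          # Hz, a common biosignal sample rate
flicker = 15.0    # Hz, the target the user is attending to
t = np.arange(fs * 2) / fs
eeg = 0.5 * np.sin(2 * np.pi * flicker * t) + np.random.randn(len(t))

print(detect_ssvep(eeg, fs, [10.0, 12.0, 15.0]))  # -> 15.0
```

With one flicker frequency per on-screen target, this single comparison becomes a selector — effectively a gaze-driven button pad.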

Continue reading “Kay Igwe Explains Brain Gaming Through SSVEP”

Hacklet 56 – Brain Hacks

The brain is the most powerful – and least understood – computer known to man. For these very reasons, working with the mind has long been an attraction for hackers, makers, and engineers. Everything from EEG to magnetic stimulus to actual implants has found its way into projects. This week’s Hacklet is about some of the best brain hacks on Hackaday.io!

[Paul Stoffregen], father of the Teensy, is hard at work on Biopotential Signal Library, his entry in the 2015 Hackaday Prize. [Paul] isn’t just hacking his own mind, he’s creating a library and reference design using the Teensy 3.1. This library will allow anyone to read electroencephalogram (EEG) signals without having to worry about line-noise filtering, signal processing, and all the other details that make recording EEG signals hard. [Paul] is making this happen by having the Teensy’s Cortex-M4 processor perform interrupt-driven acquisition and filtering in the background. This leaves the user’s Arduino sketch free to actually work with the data, rather than acquiring it. The initial hardware design will collect data from TI ADS129x chips, which are 24-bit ADCs with 4 or 8 simultaneous channels. [Paul] plans to add more chips to the library in the future.
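One of the details a library like this hides is mains-hum removal. As a rough sketch of that step, here it is in Python with SciPy, assuming 60 Hz interference on a 250 Hz-sampled channel — an illustration of the concept, not [Paul]’s actual filter:

```python
# Sketch: notching mains hum out of a biosignal channel.
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 250.0                             # sample rate, Hz (assumed)
b, a = iirnotch(60.0, Q=30.0, fs=fs)   # narrow notch centered at 60 Hz

t = np.arange(int(fs * 2)) / fs
eeg = np.sin(2 * np.pi * 10 * t)        # 10 Hz "alpha" component to keep
hum = 0.8 * np.sin(2 * np.pi * 60 * t)  # mains interference to remove
clean = filtfilt(b, a, eeg + hum)       # zero-phase filtering

def amp_at(x, f):
    """Amplitude of the spectral component at frequency f."""
    return np.abs(np.fft.rfft(x))[int(f * len(x) / fs)] * 2 / len(x)

# The 60 Hz component is strongly attenuated; the 10 Hz one survives.
print(round(amp_at(clean, 60.0), 2), round(amp_at(clean, 10.0), 2))
```

On the Teensy the equivalent filter runs sample-by-sample inside the acquisition interrupt, which is exactly the bookkeeping the library spares the user.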

 

Next up is [Jae Choi] with Lucid Dream Communication Link. [Jae] hopes to create a link between the dream world and the real world. To do this, they are utilizing BioEXG, a device [Jae] designed to collect several types of biological signals. Data enters the system through several active probes. These probes use common pogo pins to make contact with the wearer’s skin. [Jae] says the active probes were able to read EEG signals even through their thick hair! Communication between dreams and the real world will be accomplished with eye movements. We haven’t heard from [Jae] in a while – so we hope they aren’t caught in limbo!

[Qquuiinn] is working from a different angle to build bioloop, their entry in the 2015 Hackaday Prize. Rather than using EEG signals, [Qquuiinn] is going with galvanic skin response (GSR), which is easy to measure compared to EEG. [Qquuiinn] is using an Arduino Pro Mini to perform all the signal acquisition and processing. This biofeedback signal has been used for decades by devices like polygraph “lie detector” machines. GSR values change as the sweat glands become active, providing a window into a person’s psychological or physiological stress levels. [Qquuiinn] hopes bioloop will be useful both to individuals and to mental health professionals.
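Part of why GSR is so approachable is the math: put the skin in a voltage divider against a known resistor and an ADC reading converts directly to conductance. A back-of-the-envelope sketch — the 3.3 V supply, 10-bit ADC, and 100 kΩ reference resistor are assumptions for illustration, not [Qquuiinn]’s actual circuit:

```python
# Sketch: converting a voltage-divider ADC reading to skin conductance.
VCC = 3.3          # supply voltage (assumed)
ADC_MAX = 1023     # 10-bit ADC, as on an Arduino Pro Mini
R_REF = 100_000.0  # known series resistor, ohms (assumed)

def skin_conductance_uS(adc_reading):
    """Convert an ADC count (voltage across R_REF) to microsiemens."""
    v = adc_reading * VCC / ADC_MAX   # voltage at the divider tap
    r_skin = R_REF * (VCC - v) / v    # solve the divider for R_skin
    return 1e6 / r_skin               # conductance in microsiemens

# A midscale reading means R_skin is about equal to R_REF:
print(round(skin_conductance_uS(512), 2))  # -> roughly 10 uS
```

Typical resting skin conductance sits in the single-digit-to-tens of microsiemens range, so a 100 kΩ reference puts the interesting readings comfortably mid-scale.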

Finally we have [Marcin Byczuk] with Biomonitor. Biomonitor can read both EEG and electrocardiogram (EKG) signals. Unlike the other projects on today’s Hacklet, Biomonitor is wireless. It uses a Bluetooth radio to transmit data to a nearby PC or smartphone. The main processor in Biomonitor is an 8-bit ATmega8L. Since the 8L isn’t up to a lot of signal processing, [Marcin] does much of his filtering the old fashioned way – in hardware. Carefully designed op-amp based active filters provide more than enough performance when measuring these types of signals. Biomonitor has already found its way into academia, being used in both the PalCom project and brain-computer interface research.
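Sizing that kind of analog filtering comes down to the standard RC relations. A quick sketch computing the corner frequency of a single-pole op-amp low-pass — the component values here are hypothetical, not Biomonitor’s:

```python
# Sketch: corner frequency of a first-order RC low-pass stage.
import math

def lowpass_corner_hz(r_ohms, c_farads):
    """-3 dB corner of a single-pole RC stage: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

# Most EEG content of interest sits below ~40 Hz;
# 39 kOhm with 100 nF lands close to that corner:
print(round(lowpass_corner_hz(39e3, 100e-9), 1))  # -> 40.8 Hz
```

Doing the band-limiting with op-amps before the ADC is exactly what lets an 8-bit micro get away with so little digital processing.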

If you want more brain hacking goodness, check out our brain hacking project list! Did I miss your project? Don’t be shy, just drop me a message on Hackaday.io. That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!

Human-Machine Interface Projects At TEI 2016

For many of us, interacting with computers may be as glorious as punching keys and smearing touch screens with sweaty fingers and really bad posture. While functional, it’s worth reimagining a world where our conversation with technology is far more intuitive, ergonomic, and engaging. Enter TEI, an annual conference devoted to human-computer interaction and a landmark for novel projects that reinvent the conventional ways we engage our computers. TEI isn’t just another sit-down conference to soak in a wealth of paper talks. It’s an interactive weekend that combines these talks with a host of workshops provided by the speakers themselves.

Last year’s TEI brought us projects like SPATA, digital calipers that sped up our CAD modeling by eliminating the need for a third hand, and TorqueScreen, a force-feedback mechanism for tablets and other handhelds.

Next February’s conference will be no exception, with more new ways to interact with novel technology. To get a sense of what’s to come, here’s a quick peek back at last year’s projects:

Continue reading “Human-Machine Interface Projects At TEI 2016”

Brains Controlling Labyrinths Without Hands

[Daniel], [Gal] and [Maxim] attended a hackathon last weekend – Brainihack 2015 – which focused on neuroscience-themed builds in a day-and-a-half-long build-off. The trio are communications systems engineering and computer science students with no background in neuroscience whatsoever. You can’t build an fMRI in a day and a half, so they ended up winning best project in the open source category with a brain-controlled labyrinth game.

The labyrinth itself is entirely 3D printed and much, much simpler than the usual ‘wooden maze with holes’ design generally associated with labyrinth puzzles. It’s really just a plastic spiral for a ball to follow. There’s a reason for this simplicity: the team is using EEG to detect brain waves and move the labyrinth on the X and Y axes.

The team is using OpenBCI for the interface between their brains and a pair of servos. This is actually an interesting piece of tech; unlike toys such as the NeuroSky MindWave and the Star Wars Force Trainer, the OpenBCI gives you eight input channels that attach anywhere on the scalp. The team used these inputs to measure alpha waves and steady-state visually evoked potentials to control the pair of servos on the labyrinth frame.
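A minimal sketch of one plausible control mapping, with normalized alpha-band power steering one servo axis. The baseline, scaling, and servo range here are assumptions for illustration, not the team’s actual values:

```python
# Sketch: mapping relative alpha power onto a hobby-servo angle.
def alpha_to_servo_angle(alpha_power, baseline, span=40.0):
    """Map alpha power (relative to a resting baseline) to 90 deg +/- span."""
    ratio = alpha_power / baseline        # > 1 when the user relaxes
    angle = 90.0 + (ratio - 1.0) * span   # center the servo at 90 degrees
    return max(0.0, min(180.0, angle))    # clamp to the servo's travel

print(alpha_to_servo_angle(15.0, baseline=10.0))  # -> 110.0
print(alpha_to_servo_angle(2.0, baseline=10.0))   # -> 58.0
```

In practice the baseline would be measured during a short eyes-open calibration, since resting alpha power varies a lot between people.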

It’s a great build, a wonderful demonstration of a device that outputs real EEG signals, and it won the team a prize. What’s not to like?

My First Brainf*ck


There was a time – not too long ago – when a ‘my first computer’ required the use of machine code and an understanding of binary. While an introduction to computers is now just how to put a Raspberry Pi image on an SD card, a few people are keeping the dream of memorizing opcodes alive. One such person is [Johan von Konow], creator of My First Brainfuck, an ultra-small, low-cost programmable computer.

My First Brainfuck is an Arduino shield designed to have all the features of a normal computer, but without all those messy mnemonics that make assembly programming so easy. This computer is programmed in Brainfuck, a purposely obtuse programming language that, while being incredibly esoteric and difficult to program in, can be very, very rewarding.

[Johan] has a short tutorial showing how his computer works and how the Brainfuck language operates. There are only eight commands in Brainfuck, perfect for such a minimal user interface, but with enough patience, nearly anything can be written in this difficult language.
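The eight commands are simple enough that a complete interpreter fits in a few lines. Here is a minimal Python sketch of the language itself (the input command `,` is stubbed out), entirely separate from [Johan]’s firmware:

```python
# Sketch: a tiny Brainfuck interpreter covering > < + - . [ ]
# (the input command ',' is treated as a no-op here).
def brainfuck(code, tape_len=30000):
    tape, ptr, pc, out = [0] * tape_len, 0, 0, []
    while pc < len(code):
        c = code[pc]
        if c == '>': ptr += 1
        elif c == '<': ptr -= 1
        elif c == '+': tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.': out.append(chr(tape[ptr]))
        elif c == '[' and tape[ptr] == 0:   # jump forward past matching ]
            depth = 1
            while depth:
                pc += 1
                depth += {'[': 1, ']': -1}.get(code[pc], 0)
        elif c == ']' and tape[ptr] != 0:   # jump back to matching [
            depth = 1
            while depth:
                pc -= 1
                depth += {']': 1, '[': -1}.get(code[pc], 0)
        pc += 1
    return ''.join(out)

# A loop multiplies 8 by 8, then one more increment reaches 65 ('A'):
print(brainfuck("++++++++[>++++++++<-]>+."))  # -> A
```

Everything reduces to moving a pointer along a tape of byte cells and looping on zero tests, which is why such a spartan front panel is all the hardware the language needs.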

Right now there are a few examples showing how to play a scale on the on-board buzzer, displaying a Larson scanner on the LEDs, and a few more programs will be published in the future.

Putting The Brains Of A Reverse Geocache On The Outside


A reverse geocache – a box that only opens in a specific geographical area – is a perennial favorite here at Hackaday. We see a ton of different implementations, but most of the time the builds are reasonably similar. Of course, dedicating a GPS receiver solely to a reverse geocache isn’t an inexpensive prospect, so [Eric] came up with a better solution. He’s using a smartphone as the brains of his geocache, allowing him to keep the GPS and display outside the locked box.

The build began by finding an old box and modifying it so it can be locked with a servo. The only other bits of electronics inside the box are an IOIO board, a battery pack, and an I2C EEPROM for storing a few settings. On the phone side of things, [Eric] wrote an Android app to serve as the programming interface, UI, and GPS receiver for his reverse geocache. It’s exactly like all the other reverse geocaches we’ve seen, only this time the controls are wireless.
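The unlock test at the heart of any reverse geocache is the same regardless of where the brains live: great-circle distance from the current fix to the secret target, with the latch released only inside some radius. A sketch of that check — the coordinates and radius are made-up examples, not [Eric]’s:

```python
# Sketch: the "are we there yet?" check of a reverse geocache.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_unlock(fix, target, radius_km=0.5):
    """True when the current GPS fix is within radius_km of the target."""
    return haversine_km(*fix, *target) <= radius_km

target = (48.8584, 2.2945)                         # the secret spot
print(should_unlock((48.8580, 2.2950), target))    # nearby -> True
print(should_unlock((40.7128, -74.0060), target))  # far away -> False
```

In [Eric]’s split design the phone runs this comparison and simply tells the IOIO board when to drive the servo, which is what lets the GPS stay outside the box.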

[Eric] put up a video demoing his reverse geocache. You can check that out after the break.

Continue reading “Putting The Brains Of A Reverse Geocache On The Outside”