We had some incredible speakers at the Hackaday SuperConference. One of the final talks was given by [Kay Igwe], a graduate electrical engineering student at Columbia University. [Kay] has worked in nanotechnology as well as semiconductor manufacturing for Intel. These days, she’s spending her time playing games – but not with her hands.
Many of us love gaming, and probably spend way too much time on our computers, consoles, or phones playing games. But what about people who don’t have the use of their hands, such as ALS patients? Bringing gaming to the disabled is what prompted [Kay] to work on Control iT, a brain interface for controlling games. Brain-computer interfaces evoke images of electroencephalography (EEG) machines. Usually that means tons of electrodes, gel in your hair, and data buried in the noise.
[Kay Igwe] is exploring a very interesting phenomenon that uses flashing lights to elicit very specific, and easy to detect brain waves. This type of interface is very promising and is the topic of the talk she gave at this year’s Hackaday SuperConference. Check out the video of her presentation, then join us after the break as we dive into the details of her work.
[Kay] is taking a slightly different approach from EEG-based systems. She’s using Steady State Visually Evoked Potential (SSVEP), a long name for a simple concept. Visual data is processed in the occipital lobe, located at the back of the brain. It turns out that if a person looks at a light flashing at, say, 50 Hz, their occipital lobe will produce a strong electrical signal at 50 Hz, or a multiple thereof. Even stimuli flashing at 75 Hz, too fast to be consciously perceived as flicker, still generate electrical “flashes” in the brain. The signal comes from neurons firing in response to the visual stimulus. The great thing about SSVEP is that these signals are much easier to detect than standard EEG signals. Dry contacts work fine here – no gel required!
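To get a feel for why SSVEP detection is so tractable, here’s a minimal sketch (ours, not part of [Kay]’s project) that simulates a signal phase-locked to a 50 Hz flicker and measures the energy at the stimulus frequency, its second harmonic, and an off-target frequency using a single-bin DFT. The sample rate and amplitudes are made up for illustration.

```python
import math

FS = 512          # sample rate in Hz (assumed for this demo)
N = 512           # one second of samples
F_STIM = 50       # flicker frequency in Hz

def bin_power(signal, freq, fs):
    """Normalized magnitude of the DFT bin at `freq` (Goertzel-style single bin)."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    return math.hypot(re, im) / n

# Fake "occipital" signal: a strong fundamental plus a weaker second harmonic,
# standing in for the frequency-locked response to a 50 Hz flicker.
signal = [math.sin(2 * math.pi * F_STIM * i / FS)
          + 0.4 * math.sin(2 * math.pi * 2 * F_STIM * i / FS)
          for i in range(N)]

for f in (F_STIM, 2 * F_STIM, 37):   # 37 Hz as an off-target control
    print(f"{f:3d} Hz: {bin_power(signal, f, FS):.3f}")
```

The energy stands out sharply at 50 Hz and 100 Hz and is essentially zero everywhere else, which is exactly what makes SSVEP easier to pull out of noisy electrode data than a general EEG feature.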
[Kay’s] circuit is a classic setup for amplifying the low-power signals generated by the human body. She uses an AD620 instrumentation amplifier to bring the signals up to a reasonable level. After that, a couple of active filter stages clean things up. Finally, the brainwave signals are sent into the ADC of an Arduino.
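The instrumentation-amp stage lends itself to a quick back-of-the-envelope calculation. The AD620’s gain is set by a single resistor R_G (G = 1 + 49.4 kΩ / R_G, per the datasheet); the component values below are illustrative, not taken from [Kay]’s schematic.

```python
def ad620_gain(rg_ohms):
    """AD620 gain for a given gain-setting resistor (datasheet formula)."""
    return 49.4e3 / rg_ohms + 1

def ad620_rg_for_gain(gain):
    """Gain-setting resistor needed for a target gain."""
    return 49.4e3 / (gain - 1)

# Scalp-level SSVEP signals are on the order of microvolts, so a lot of
# gain is needed up front before the active filters and the Arduino ADC.
print(ad620_gain(100))          # R_G = 100 ohms gives a gain of 495
print(ad620_rg_for_gain(1000))  # resistor needed for a gain of 1000
```

The remaining gain and band-limiting then falls to the active filter stages, which also keep mains hum and movement artifacts from swamping the ADC.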
The Arduino digitizes the data and sends it on to a computer. [Kay] used Processing to analyze the signal and display output. In this case, she’s performing a Fast Fourier Transform (FFT), then analyzing the frequencies of the brain signal. Finally, the output is displayed in the form of a game.
The video game [Kay] designed allows the user to move a character around the screen. This is done by looking at one of two blinking lights. One light causes the player to run to the right, while the other causes the player to move upwards.
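Putting the pieces together, the decision step can be sketched as comparing spectral power at the two flicker frequencies and mapping the winner to a movement. This is an assumed reconstruction in Python, not [Kay]’s actual Processing code; the frequencies, sample rate, and threshold are all illustrative.

```python
import math

FS = 256                  # assumed sample rate in Hz
F_RIGHT, F_UP = 50, 60    # illustrative flicker frequencies for the two lights

def bin_power(signal, freq, fs):
    """Normalized magnitude of the DFT bin at `freq`."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    return math.hypot(re, im) / n

def classify(window):
    """Return 'right', 'up', or None if neither target dominates."""
    p_right = bin_power(window, F_RIGHT, FS)
    p_up = bin_power(window, F_UP, FS)
    if max(p_right, p_up) < 0.1:   # rejection threshold, tunable
        return None
    return "right" if p_right > p_up else "up"

# Simulated user staring at the "up" light:
window = [math.sin(2 * math.pi * F_UP * i / FS) for i in range(FS)]
print(classify(window))  # prints "up"
```

Adding more targets is just a matter of adding more flicker frequencies and more bins to compare, which is why SSVEP scales naturally from a two-button game toward wheelchair or drone control.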
[Kay] has a lot planned for Control iT, everything from controlling wheelchairs to drones. We hope she has time to get it all done between her graduate classes at Columbia!