In Star Trek IV: The Voyage Home, the usually unflappable Spock found himself stumped by one question: How do you feel? If researchers at the University of Memphis and IBM are correct, computers by Spock’s era might not have to ask. They’d know.
[Pouya Bashivan] and his colleagues used a relatively inexpensive EEG headset and machine learning techniques to determine if, with limited hardware, the computer could derive a subject’s mental state. This has several potential applications including adapting virtual reality avatars to match the user’s mood. A more practical application might be an alarm that alerts a drowsy driver.
One challenge is that inexpensive consumer-grade EEG headbands (like the Muse headband the researchers used) don’t monitor a complete EEG like a medical-grade device does. Also, the headbands have limits on their data quality since they don’t use a gel interface. They also get noise from sweat, hair, and even facial muscle movements. You can read more about how the Muse works on its website or in the video below.
Despite these problems, the researchers were able to distinguish between users watching a Khan Academy video (an instructional mental state) and those watching a YouTube cat video (recreational mental state) at least some of the time. We won’t make a value judgement about how recreational cat videos really are; that’s the subject of another scholarly paper.
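To get a feel for how this kind of classification can work, here’s a minimal sketch. It is not the researchers’ actual method — the band-power features, the class means, and the nearest-centroid rule are all illustrative assumptions, and the data is synthesized. The idea is just that each short EEG window is reduced to a few band-power numbers (delta, theta, alpha, beta) and labeled by whichever class it most resembles:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical band-power features (delta, theta, alpha, beta) per window,
# synthesized for illustration: "instructional" windows get more beta power,
# "recreational" windows more alpha power. Real headsets would compute these
# from the EEG spectrum.
def make_windows(n, alpha_mean, beta_mean):
    delta = rng.normal(1.0, 0.1, n)
    theta = rng.normal(0.8, 0.1, n)
    alpha = rng.normal(alpha_mean, 0.1, n)
    beta = rng.normal(beta_mean, 0.1, n)
    return np.column_stack([delta, theta, alpha, beta])

instructional = make_windows(50, alpha_mean=0.5, beta_mean=1.2)
recreational = make_windows(50, alpha_mean=1.2, beta_mean=0.5)

# Nearest-centroid classifier: a new window gets the label of whichever
# class mean it is closest to in feature space.
centroids = np.stack([instructional.mean(axis=0), recreational.mean(axis=0)])

def classify(window):
    dists = np.linalg.norm(centroids - window, axis=1)
    return ["instructional", "recreational"][int(np.argmin(dists))]
```

A beta-heavy window like `[1.0, 0.8, 0.5, 1.2]` lands near the instructional centroid; an alpha-heavy one near the recreational centroid. The noise sources mentioned above (sweat, hair, facial muscles) are exactly what makes the real problem much harder than this toy version.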
Last year, we saw a teardown of the Muse. If you want something more open and capable, there’s always OpenHardwareExG.
The whole domain is wide open for amateur experimentation.
EEG headsets are getting inexpensive and this is very cool. I’d like to see how far a DIY one (like from http://homes.cs.washington.edu/~vlee2/06271861.pdf) could go.
Maybe using neurofeedback to teach yourself how to provide a good signal to the machine learning system would be an option to overcome classification misses.
Getting good conductivity requires not only gel but something like a Scotch pad to pierce the dead, flaky skin layer. It’s a rather unpleasant experience having the sensors fitted, but once they’re in place it’s alright.
You sure aren’t kidding. During the last EEG I had (every one of them, actually), it felt like the woman performing the test was scrubbing down to bare skull.
How do you feel? With the nerve endings in my skin.
Wrong, the answer we were looking for was “Squishy”. Thank you for playing.
I always thought he’d simply provide an answer not related to emotional state. “In normal physical and mental condition.”
I have the EPOC+. I’ve been looking for a good project for it, but I can’t come up with one. Any ideas?
https://emotiv.com/epoc.php
These things have been around for ages, yet nothing has come of them. The potential of EEGs to be used for BMIs is severely limited and vastly overestimated, especially due to the muscle artifacts. The emotional state difference is easily visible in the facial muscle movements; that wouldn’t even need EEG. And for drowsy drivers, there are already Chinese gadgets around that react to head tilt.
Soon coming to a TSA check point near you …
How do you feel?
How do you feel?
How do you feel?
After being asked that same question a few times, I would have grabbed an axe, chopped the annoying computer up, and responded, “Much better, thank you.”
Just wanted to make a correction regarding the affiliation info: note that the second and third authors are at IBM T.J. Watson Research Center (not U of M), while the first author is from the University of Memphis (see affiliations in our paper on Mental Recognition via Wearable EEG).