Brain Waves can Answer Spock’s (and VR’s) Toughest Question

In Star Trek IV: The Voyage Home, the usually unflappable Spock found himself stumped by one question: How do you feel? If researchers at the University of Memphis and IBM are correct, computers by Spock’s era might not have to ask. They’d know.

[Pouya Bashivan] and his colleagues used a relatively inexpensive EEG headset and machine learning techniques to determine if, with limited hardware, the computer could derive a subject’s mental state. This has several potential applications, including adapting virtual reality avatars to match the user’s mood. A more practical application might be an alarm that alerts a drowsy driver.
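To give a flavor of how this kind of classification typically works, here’s a minimal Python sketch. It is only an illustration, not the researchers’ actual pipeline: the sample rate, frequency bands, and choice of classifier are all our own assumptions. The idea is to turn each short EEG recording into per-band power features and let an off-the-shelf classifier do the rest.

```python
# Illustrative sketch only -- not the paper's actual method.
# Assumes each trial is a (channels x samples) EEG array at an assumed 256 Hz.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 256  # assumed sample rate in Hz

# Classic EEG frequency bands (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(trial):
    """Mean power per band per channel for one (channels x samples) trial."""
    freqs, psd = welch(trial, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(feats)  # shape: (n_bands * n_channels,)

def train(trials, labels):
    """trials: list of EEG arrays; labels: 0 = instructional, 1 = recreational."""
    X = np.vstack([band_powers(t) for t in trials])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf
```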

One challenge is that inexpensive consumer-grade EEG headbands (like the Muse headband the researchers used) don’t monitor a complete EEG the way a medical-grade device does. The headbands also have limits on their data quality since they don’t use a gel interface, and they pick up noise from sweat, hair, and even facial muscle movements. You can read more about how the Muse works on its website or in the video below.
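A common first step for taming that kind of noise is a simple band-pass filter to strip out slow drift and much of the muscle-band energy before extracting features. The sketch below is just a rough illustration with an assumed 256 Hz sample rate and hand-picked cutoffs; a real pipeline would layer artifact rejection on top of it.

```python
# Minimal noise-reduction sketch under assumed parameters (256 Hz, 1-40 Hz band).
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # assumed sample rate in Hz

def bandpass(raw, lo=1.0, hi=40.0, order=4):
    """Keep roughly 1-40 Hz: removes slow drift and much of the EMG-band noise."""
    b, a = butter(order, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    return filtfilt(b, a, raw)

# Example: clean a 10-second single-channel recording before computing features.
raw = np.random.randn(FS * 10)   # stand-in for real headband samples
clean = bandpass(raw)
```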

Despite these problems, the researchers were able to distinguish between users watching a Khan Academy video (an instructional mental state) and those watching a YouTube cat video (recreational mental state) at least some of the time. We won’t make a value judgement about how recreational cat videos really are–that’s the subject of another scholarly paper.

Last year, we saw a teardown of the Muse. If you want something more open and capable, there’s always OpenHardwareExG.

12 thoughts on “Brain Waves can Answer Spock’s (and VR’s) Toughest Question”

  1. Getting good conductivity requires not only gel but also something like a Scotch pad to pierce the dead, flaky skin layer – it’s rather an unpleasant experience having the sensors fitted, but once they’re in place it’s alright.

  2. These things have been around for ages, yet nothing has come of them. The potential of EEGs to be used for BMIs is severely limited and vastly overestimated, especially due to the muscle artifacts. The emotional state difference is easily visible in the facial muscle movements, which wouldn’t even need EEG. And for drowsy drivers, there are already Chinese gadgets around that react to head tilt.

  3. How do you feel?

    How do you feel?

    How do you feel?

    After being asked that same question a few times, I would have grabbed an axe, chopped the annoying computer up, and responded, “Much better, thank you.”

  4. Just wanted to make a correction regarding the affiliation info – note that the second and third authors are at IBM T.J. Watson Research Center (not U of M), while the first author is from U of Memphis (see affiliations in our paper on Mental Recognition via Wearable EEG).
