In Star Trek IV: The Voyage Home, the usually unflappable Spock found himself stumped by one question: How do you feel? If researchers at the University of Memphis and IBM are correct, computers by Spock’s era might not have to ask. They’d know.
[Pouya Bashivan] and his colleagues used a relatively inexpensive EEG headset and machine learning techniques to determine whether, with limited hardware, a computer could infer a subject's mental state. This has several potential applications, including adapting virtual reality avatars to match the user's mood. A more practical application might be an alarm that alerts a drowsy driver.
One challenge is that inexpensive consumer-grade EEG headbands (like the Muse headband the researchers used) don't capture a complete EEG the way a medical-grade device does. The headbands also have limited data quality since they don't use a gel interface, and they pick up noise from sweat, hair, and even facial muscle movements. You can read more about how the Muse works on its website or in the video below.
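Classifiers like the one described typically start from band power: how much of the signal's energy falls into the traditional theta, alpha, and beta frequency ranges. Below is a minimal sketch of that idea, assuming a Muse-like 256 Hz sample rate; the signal here is synthetic (a 10 Hz sine plus noise), not real EEG, and the pipeline is a simplification, not the paper's actual method.

```python
import numpy as np

fs = 256  # Muse-like sample rate in Hz (illustrative assumption)
t = np.arange(fs * 2) / fs  # two seconds of samples

# Synthetic "EEG": a 10 Hz alpha-range component plus noise (not real data)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)

# Power spectrum via the real FFT
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

# Classic EEG bands (Hz); boundaries vary slightly across the literature
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

# Relative power: each band's energy as a fraction of the 4-30 Hz total
total = spectrum[(freqs >= 4) & (freqs < 30)].sum()
powers = {name: spectrum[(freqs >= lo) & (freqs < hi)].sum() / total
          for name, (lo, hi) in bands.items()}

print(max(powers, key=powers.get))  # alpha dominates for this synthetic signal
```

A real pipeline would window the stream, average across channels, and filter out the muscle and motion artifacts mentioned above before trusting numbers like these.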
Despite these problems, the researchers were able to distinguish between users watching a Khan Academy video (an instructional mental state) and those watching a YouTube cat video (a recreational mental state) at least some of the time. We won't make a value judgment about how recreational cat videos really are; that's the subject of another scholarly paper.
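To give a flavor of the classification step, here is a toy nearest-centroid classifier over band-power feature vectors. Everything in it is invented for illustration: the feature values, the channel-averaged (alpha, beta, theta, gamma) layout, and the two labels are assumptions, not the features or model from the actual paper.

```python
import math

# Hypothetical training data: each vector is (alpha, beta, theta, gamma)
# relative band power averaged over the headband's channels.
# Values are made up purely for illustration.
training = {
    "instructional": [(0.20, 0.45, 0.25, 0.10),
                      (0.22, 0.42, 0.26, 0.10)],
    "recreational":  [(0.40, 0.20, 0.30, 0.10),
                      (0.38, 0.22, 0.31, 0.09)],
}

def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n
                 for i in range(len(vectors[0])))

centroids = {label: centroid(vecs) for label, vecs in training.items()}

def classify(features):
    """Assign the label whose training centroid is closest
    (Euclidean distance) to the new sample."""
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

print(classify((0.21, 0.44, 0.24, 0.11)))  # prints "instructional"
```

The published work uses far more sophisticated machine learning than this, but the shape of the problem is the same: turn noisy EEG into a feature vector, then map that vector to a mental-state label.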