MindFlex Watches As You Judge Others By Sight Alone

[Image: MindFlex headset]

[Paul] really wanted to know what his brain was thinking.

No, really. He is aware of all the thoughts that come and go, but he wanted to know what was going on in his brain below his conscious thought stream. Armed with a MindFlex headset and a Teensy, he set out to decode what really was going on inside his head.

He spent a month crawling 35 million Google profiles, downloading each user’s pictures into a MySQL database. The Teensy was attached to the MindFlex’s sensor board and collected all of the headset’s output over a serial connection.
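
If you want to try something similar, a minimal host-side logger could look like the sketch below. It assumes the Teensy simply echoes each reading as a comma-separated “attention,meditation” line over USB serial; the port name, baud rate, and line format here are guesses for illustration, not details from [Paul]’s build.

```python
# Minimal host-side logger for a Teensy that forwards MindFlex readings as
# "attention,meditation" text lines (assumed format, not [Paul]'s code).
import csv
import serial  # pyserial: pip install pyserial

PORT = "/dev/ttyACM0"  # typical Teensy USB-serial device on Linux (assumption)
BAUD = 115200          # assumed; match whatever the Teensy sketch uses

with serial.Serial(PORT, BAUD, timeout=2) as link, \
     open("eeg_log.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["attention", "meditation"])
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # serial timeout or blank line
        try:
            attention, meditation = (int(v) for v in line.split(","))
        except ValueError:
            continue  # skip anything that isn't a clean two-value line
        writer.writerow([attention, meditation])
```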

His experiment consisted of flashing each of the profile pictures on his monitor for one second, recording 2 of the 11 available brainwave channels from the MindFlex. These values were then plotted out so that he could visualize the “Attention” and “Meditation” values captured by the headset. At the end of the day he discovered, interestingly enough, that looking at dogs relaxed him the most!
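
Once each reading is tagged with what was on screen, pulling out per-category averages is straightforward. Here is a rough analysis sketch, assuming a hypothetical flash_log.csv with image_tag, attention, and meditation columns (an illustration, not [Paul]’s actual pipeline):

```python
# Average the MindFlex "Attention" and "Meditation" values by image category
# and plot them. The log file and its column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

# One row per one-second flash: the category of the displayed picture plus
# the two values read from the headset during that second.
log = pd.read_csv("flash_log.csv")  # columns: image_tag, attention, meditation

per_tag = log.groupby("image_tag")[["attention", "meditation"]].mean()
per_tag.sort_values("meditation", ascending=False).plot.bar()
plt.ylabel("Mean reported value")
plt.title("MindFlex Attention / Meditation by image category")
plt.tight_layout()
plt.show()
```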

We would love to see what correlations could be drawn from his collected data, and what sorts of “hidden” thoughts are coursing through others’ brains. It could certainly end up being a double-edged sword, uncovering subconscious biases and other such things, but it’s an incredibly intriguing experiment to say the least.

Be sure to check out the video below of the experiment in progress.

[youtube=http://www.youtube.com/watch?v=xBxb5eR7nbs&w=470]

11 thoughts on “MindFlex Watches As You Judge Others By Sight Alone”

    1. The first thing I thought of when I read this post was to crawl the Google profiles, but use a Kinect and a small screen under an eyepatch, and use these things to build a facial recognition system where info about the person you’re looking at is displayed on the eyepatch screen. It’s just like Iron Man!
      All powered by a small laptop in a backpack with a 1-2 TB hard drive installed.
      Anyway, I thought what he actually did was quite interesting too :)

      1. The Google Profile database without images is about 60 GB, and about 7 GB compressed. I’m thinking about releasing the database on BitTorrent, but am unsure about the legalities of doing so. The images add another 800 GB to the database. I do want to feed these images into a facial recognizer program’s trainer, and then see which Googlers I can pick out of a crowd with it.
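
For the curious, one way to train such a recognizer is OpenCV’s LBPH implementation; the sketch below is only an illustration of the idea (and assumes a profiles/ directory with one sub-folder of photos per person), not the commenter’s actual tooling.

```python
# Train an LBPH face recognizer on crawled profile photos using OpenCV
# (my choice of tool, not necessarily the commenter's).
# Assumes a "profiles/<person>/<photo>.jpg" layout; needs opencv-contrib-python.
import os
import cv2
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
recognizer = cv2.face.LBPHFaceRecognizer_create()

faces, labels = [], []
for label, person in enumerate(sorted(os.listdir("profiles"))):
    folder = os.path.join("profiles", person)
    for name in os.listdir(folder):
        img = cv2.imread(os.path.join(folder, name), cv2.IMREAD_GRAYSCALE)
        if img is None:
            continue  # not a readable image
        for (x, y, w, h) in detector.detectMultiScale(img, 1.3, 5):
            faces.append(cv2.resize(img[y:y + h, x:x + w], (100, 100)))
            labels.append(label)

recognizer.train(faces, np.array(labels, dtype=np.int32))
recognizer.write("googlers.yml")  # later: recognizer.predict(face) on a crowd shot
```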

  1. Very cool! My other half is a Neurologist, and I am working on becoming a programmer; this is very much something we have brainstormed about before!

    Glad to see someone finding applications for this technology; keep up the great work.

  2. Has anyone figured out what the ‘attention’ and ‘meditation’ values from the Mindflex actually measure? I’m guessing it’s probably based on alpha band power along with something else. Also, are these two measures simply opposite poles of the same measure, or are they different values?

    Also, has anyone compared the recordings from the Mindflex to proper EEG recordings (e.g. from side by side electrodes from a professional system and Mindflex)?

    Without a bit more information on what the Mindflex is recording, and how well, I’d be skeptical about whether results like the above are driven by changes in usefully interpretable brain activity vs psychophysical processes (e.g. differences in average luminance between image types), or even differences in contamination of the data with muscular artefacts between image types.

    (I think I should qualify the above by saying that I love the idea of home experimentation/citizen science as in this example, but it would be useful to establish the limitations of the equipment/encourage experimental designs that control for extraneous factors – then genuinely interesting results can be produced).

    1. I don’t know what the cutoffs are for the toy, but the usual “Attention” / “Relaxed” cutoff has to do with the frequency of brain activity. Low-frequency waves (e.g., delta, 0-4 Hz) are associated with calmer, more meditative states, while higher frequencies (e.g., gamma, 30-100 Hz) are associated with more active and creative mental states. The requisite wiki link: http://en.wikipedia.org/wiki/Electroencephalography#Wave_patterns

      As a data point it is not the most informative, but it looks like a bit of fun from a toy.
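
For reference, here is a rough band-power sketch along the lines described above: estimate the power in the classic EEG bands from a raw trace using Welch’s method. It is an illustration of the band split, not how the toy derives its numbers, and the 512 Hz sample rate is an assumption.

```python
# Estimate average power in the standard EEG bands from a raw 1-D trace.
# The band edges follow the rough ranges mentioned above; the 512 Hz sample
# rate is assumed, not something the MindFlex exposes by default.
import numpy as np
from scipy.signal import welch

FS = 512  # assumed sample rate in Hz
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}

def band_powers(signal, fs=FS):
    """Return mean spectral power per band for a raw EEG trace."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Quick self-test on synthetic data: noise plus a 10 Hz (alpha) component.
t = np.arange(0, 10, 1 / FS)
fake_eeg = np.random.randn(t.size) + 2 * np.sin(2 * np.pi * 10 * t)
print(band_powers(fake_eeg))
```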

      1. Cheers for the reply! I actually work with EEG, so I’m familiar with this type of band limited division of EEG data. Thing is, there are a number of different aspects of state-related brain activity that could be used as a metric of ‘attention’ and ‘meditation’.

        Last I checked, the exact method by which these values are calculated was unknown. I wonder if the Mindflex’s ‘big brother’ systems (the Mindwave, etc.) give any details, if they output the same data?

        If no-one has compared Mindflex to real EEG data I might be tempted to try myself, although not being able to get to raw wave data will make the comparison limited.
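
If someone does log the toy’s output next to a research-grade system, one simple first-pass comparison is to correlate the Mindflex “Attention” stream against band power from the reference EEG, second by second. The file names and the crude index-based alignment below are hypothetical:

```python
# Correlate a hypothetical per-second Mindflex "Attention" log against a
# per-second band-power series from a reference EEG (both file names made up).
import numpy as np
from scipy.stats import pearsonr

mindflex_attention = np.loadtxt("mindflex_attention.txt")  # one value per second
lab_alpha_power = np.loadtxt("lab_alpha_power.txt")        # one value per second

n = min(mindflex_attention.size, lab_alpha_power.size)  # crude alignment by index
r, p = pearsonr(mindflex_attention[:n], lab_alpha_power[:n])
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```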

  3. This sort of thing is done all the time by cogsci types, but it’s good that it’s getting some exposure here — and it’s good that it’s gotten feasible for hobbyists to fuck with their brains using electronics and measure the results quantitatively. I was planning to get an NIA for a similar experiment with priming.
