Anouk Wipprecht: Robotic Dresses and Human Interfaces

Anouk Wipprecht’s hackerly interests are hard to summarize, so bear with us. She works primarily on technological dresses, making fashion with themes inspired by nature, but making it interactive. If that sounds a little bit vague, consider that she’s made over 40 pieces of clothing, from a spider dress that attacks when someone enters your personal space too quickly to a suit with plasma balls that lets her get hit by ArcAttack’s giant musical Tesla coils in style. She gave an inspiring talk at the 2017 Hackaday Superconference, embedded below, that you should really go watch.

Anouk has some neat insights about how the worlds of fashion and technology interact. Technology, and her series of spider dresses in particular, tends to evolve through related versions, while fashion tends to seek the brand-new and the now. Managing these two impulses can’t be easy.

For instance, her first spider dress was made with servos and laser-cut acrylic, in a construction that probably seems familiar to most Hackaday readers. But hard edges, brittle plastic, and screws that slowly work themselves loose are a poor match for designs worn on the human body. Her most recent version is stunningly beautiful, made of 3D-printed nylon for flexibility, and really nails the “bones of a human-spider hybrid” aesthetic that she’s going for.

The multiple iterations of her drink-dispensing “cocktail dress” (get it?!) show the same progression. We appreciate the simple, press-button-get-drink version that she designed for a fancy restaurant in Ibiza, but we really love the idea of being a human ice-breaker at parties that another version brings to the mix: to get a drink, you have to play “truth or dare” with questions randomly chosen and displayed on a screen on the wearer’s arm.

Playfulness runs through nearly everything that Anouk creates. She starts out with a “what if?” and runs with it. But she’s not just playing around. She’s also a very dedicated documenter of her projects, because she believes in paying the inspiration forward to the next generation. And her latest project does something really brilliant: merging fashion, technology, and medical diagnostics.

It’s a stripped-down EEG that kids with ADHD can wear in their daily lives, and it triggers a camera when their brains are stimulated in particular ways. A full EEG requires a child to have 30 gel electrodes installed and can only be run in a medical lab; stripping down the system lets the child go about their normal life. This approach may collect limited data compared to the full setup, but since that data is gathered under less intimidating circumstances, the little it does collect may be more “real”. The project is still in progress, so we’ll just have to wait and see what comes out. We’re excited.
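While we wait, the trigger logic is easy to imagine. Here’s a minimal sketch assuming a single-channel headset and a simple band-power threshold – the band, sample rate, and threshold below are our guesses, not details from Anouk’s project:

```python
# Hypothetical sketch only: fire a camera when single-channel EEG
# band power crosses a threshold. The band, sample rate, and
# threshold are our assumptions, not details from Anouk's project.
import numpy as np
from scipy.signal import welch

FS = 256              # sample rate in Hz (assumed)
WINDOW = 2 * FS       # analyze two-second windows
THRESHOLD = 5.0       # trigger level; a real build would calibrate this

def eeg_stream():
    """Stand-in for the headset: endless random noise for testing."""
    rng = np.random.default_rng(0)
    while True:
        yield rng.normal()

def beta_power(window):
    """Mean spectral power in the 13-30 Hz beta band."""
    freqs, psd = welch(window, fs=FS, nperseg=len(window))
    return psd[(freqs >= 13) & (freqs <= 30)].mean()

def monitor(stream, on_trigger):
    buffer = []
    for sample in stream:
        buffer.append(sample)
        if len(buffer) >= WINDOW:
            if beta_power(np.array(buffer)) > THRESHOLD:
                on_trigger()          # snap the photo here
            buffer.clear()

monitor(eeg_stream(), lambda: print("stimulated - camera fires"))  # runs until interrupted
```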

There’s so much more going on in Anouk’s presentation, but don’t take our word for it. Go watch Anouk’s talk right now and you’ll find she inspires you to add a little bit more of the human element into your projects. Be playful, awkward, or experimental. But above all, be awesome!

Hackaday Prize Entry: Seizure Detection by EEG

For those who suffer from them, seizures are a dangerous thing. Beyond the neurological effects, there is always the possibility of injury from the surrounding environment as well – consider the dangers of having a seizure near a busy road, or even simply next to a glass table. Some detection methods exist for seizure sufferers, but they are primarily based on detecting the jerking motion of the patient. [akhil2001us] thinks it’s possible to do better – by measuring brainwaves to detect the onset of seizures.

The build is centered around the NeuroSky MindWave headset. This is an off-the-shelf product designed specifically for capturing EEG data, and it outputs raw brainwave data, which is key for doing proper analysis. The project then uses an Arduino Mega to tie everything together, along with some SparkFun Bluetooth modules to talk to a cell phone, which sends an SMS for help in the event of a seizure.
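The MindWave speaks NeuroSky’s ThinkGear serial protocol – sync bytes, a length-prefixed payload, and a checksum – and the parsing is the same whether it runs on the Arduino or a PC. Here’s a Python sketch of it (the serial port name is an assumption; EXCODE prefixes and error handling are skipped for brevity):

```python
# Sketch of a NeuroSky ThinkGear packet parser. Packets look like:
# 0xAA 0xAA | payload length | payload | checksum. EXCODE prefixes
# and error handling are omitted for brevity.
import serial  # pyserial

def packets(port="/dev/rfcomm0", baud=57600):   # port name is an assumption
    ser = serial.Serial(port, baud)
    while True:
        if ser.read(1) != b"\xaa" or ser.read(1) != b"\xaa":
            continue                             # hunt for the sync bytes
        length = ser.read(1)[0]
        if length > 169:                         # invalid payload length
            continue
        payload = ser.read(length)
        checksum = ser.read(1)[0]
        if (~sum(payload)) & 0xFF == checksum:   # inverted low byte of sum
            yield payload

def raw_samples(payloads):
    """Pull signed 16-bit raw wave values (code 0x80) out of payloads."""
    for payload in payloads:
        i = 0
        while i < len(payload):
            code = payload[i]
            if code == 0x80:                     # raw wave row
                vlen = payload[i + 1]
                yield int.from_bytes(
                    payload[i + 2:i + 2 + vlen], "big", signed=True)
                i += 2 + vlen
            elif code >= 0x80:                   # other multi-byte rows
                i += 2 + payload[i + 1]
            else:                                # single-byte rows
                i += 2
```

Feeding `raw_samples(packets())` into whatever analysis runs next gives a plain stream of samples; the same state machine fits comfortably in C on the Mega.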

The real difficulty in a project like this comes from developing an algorithm that can reliably detect seizures, and from building a unit robust enough to work in the real world. It’s no use if your headset detects a seizure in progress but the help message is never sent because a wire fell out of your breadboard. Considerations like these, combined with the threat of litigation, are why medical devices are so rigorously engineered and certified. For a proof of concept, however, such concerns are not as important.
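To make the algorithm side concrete, here’s a toy illustration – not [akhil2001us]’s actual method – using the classic “line length” feature, which balloons when the EEG turns large and fast-moving, as it tends to during a seizure:

```python
# Toy seizure detector using the "line length" feature: the summed
# absolute difference between consecutive samples. The threshold is
# illustrative and would need careful per-patient tuning.
import numpy as np

FS = 512                  # MindWave raw sample rate in Hz
WINDOW = 2 * FS           # two-second analysis windows
THRESHOLD = 50_000        # illustrative only

def line_length(window):
    return np.abs(np.diff(window)).sum()

def alarms(samples):
    """Yield True for every window that looks seizure-like."""
    samples = np.asarray(samples)
    for start in range(0, len(samples) - WINDOW + 1, WINDOW):
        yield line_length(samples[start:start + WINDOW]) > THRESHOLD
```

A real detector would add artifact rejection and demand several consecutive hot windows before texting anyone.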

We’ve seen MindWave builds before – brainwave research is an exciting field!

How To Telepathically Tell A Robot It Screwed Up

Training machines to effectively complete tasks is an ongoing area of research. This can be done in a variety of ways, from complex programming interfaces to systems that understand commands in natural language. A team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) wanted to see if it was possible for humans to communicate more directly when training a robot. Their system allows a user to correct a robot’s actions using only their brain.

The concept is simple – using an EEG cap to detect brainwaves, the system measures a special type of brain signal called an “error-related potential”. The human simply noticing the robot making a mistake is enough for the robot to correct itself – and, for a nice extra touch, blush in embarrassment.

This interface allows for a very intuitive way of working with a robot – upon noticing a mistake, the robot is able to automatically stop or correct its behaviour. Currently the system is only capable of being used for very simple tasks – the video shows the robot sorting objects of two types into corresponding bins. The robot knows that if the human has detected an error, it must simply place the object in the other bin. Further research seeks to expand the possibilities of using this automatic brainwave feedback to train robots for more complex tasks. You can read the research paper here.
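In outline, the feedback loop is simple enough to sketch: epoch the EEG right after the robot commits to a choice, score the epoch for an error-related potential, and flip the decision if one shows up. The fixed threshold below is a stand-in for the per-user classifier CSAIL actually trains:

```python
# Sketch of ErrP-driven correction. The fixed threshold is a stand-in
# for the trained per-user classifier used in the real system.
import numpy as np

FS = 256                     # EEG sample rate (assumed)
ERRP_WINDOW = (0.2, 0.4)     # ErrPs typically peak ~200-400 ms after the event

def errp_score(epoch):
    """Mean amplitude of a single-channel epoch in the ErrP window."""
    lo, hi = (int(t * FS) for t in ERRP_WINDOW)
    return float(np.mean(epoch[lo:hi]))

def choose_bin(initial_bin, epoch, threshold=5.0):
    """Two bins, so 'correct yourself' just means 'the other one'."""
    return 1 - initial_bin if errp_score(epoch) > threshold else initial_bin
```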

MIT’s CSAIL works on lots of exciting projects – their video microphone technology is truly astounding.

[Thanks to Adam Connor-Simmons for the tip!]

TOBE: Tangible Out-of-Body Experience with Biosignals

TOBE is a toolkit for creating Tangible Out-of-Body Experiences, built by [Renaud Gervais] and others and presented at TEI ’16: the Tenth International Conference on Tangible, Embedded, and Embodied Interaction. The goal is to expose the inner states of users through physiological signals such as heart rate or brain activity. The toolkit covers the creation of a 3D-printed avatar that displays visual representations of physiological sensors (ECG, EDA, EEG, EOG, and a breathing monitor), the construction of those sensors on open hardware platforms such as Bitalino or OpenBCI, and signal processing software using OpenViBE.
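For a taste of what those signal chains look like, here’s a standalone sketch of one possible mapping – ECG in, heart rate out, avatar “heart” LED blinking in time. The real toolkit routes this through OpenViBE; the sample rate and peak-finding numbers below are our own assumptions:

```python
# Illustrative TOBE-style mapping: ECG trace -> heart rate -> LED period.
import numpy as np
from scipy.signal import find_peaks

FS = 100  # ECG sample rate in Hz (assumed)

def heart_rate_bpm(ecg):
    """Estimate heart rate from R-peaks in a short ECG window."""
    peaks, _ = find_peaks(np.asarray(ecg), distance=0.4 * FS,
                          height=np.percentile(ecg, 95))
    if len(peaks) < 2:
        return None
    rr = np.diff(peaks) / FS        # R-R intervals in seconds
    return 60.0 / rr.mean()

def led_period(bpm):
    """Blink the avatar's heart LED once per real heartbeat."""
    return 60.0 / bpm
```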

In their research paper, the team identified the relevant signals and mental states and organized them into three types:

  • States perceived by both self and others, e.g. eye blinks. Even if these signals can seem redundant – one could simply look at the person to see them – they are crucial for tying the feedback to its wearer.
  • States perceived only by the self, e.g. heart rate or breathing. Mirroring these signals lends the feedback a sense of presence.
  • States hidden from both self and others, e.g. mental states such as cognitive workload. These metrics hold the most promising applications, since they are mostly unexplored.

By visualizing their own inner states, and by being able to share them, users can develop a better understanding of themselves as well as of others. Analyzing their avatar in different contexts lets users see how they react in different scenarios, such as stress, work, or play. When several users join together, they can see how each responds to the same stimuli, for example.

Think Your Way to Work in a Mind-Controlled Tesla

A normal person who owns an $80,000 car might be inclined to never take it out of the garage. But normal often isn’t what we do around here, so seeing a Tesla S driven by mind control is only slightly shocking.

[Casey_S] appears to be the owner of the Tesla S in question, but if he’s not, he’ll have some ‘splaining to do. He took the luxury car – essentially a gigantic battery and computer in a car-shaped case – to a hackathon in Berkeley last week and promptly fitted it with the gear needed to drive the car remotely. Yes, the Model S has steering motors built in, but Tesla hasn’t been forthcoming with an API to access such functions. So [Casey_S] and his team had to cobble together a steering servo from a windshield wiper motor and a potentiometer mounted to a frame made of 2x4s. Linear actuators attach to the brake and accelerator pedals, and everything talks to an Arduino.

The really interesting part is that the whole thing is controlled by an electroencephalography helmet and a machine learning algorithm that detects when the driver thinks “forward” or “turn right.” It translates those thoughts to variables that drive the actuators. Unfortunately, space constraints kept [Casey_S] from really putting the rig through its paces, but the video after the break shows that the system worked well enough to move the car forward and steer a little.
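The glue between classifier and car is conceptually thin: the host reads a predicted label and pushes a command to the Arduino driving the actuators, something like the sketch below. The one-byte serial protocol here is entirely invented for illustration, since the write-up doesn’t document the team’s actual one:

```python
# Hypothetical host-side glue: classifier labels -> serial commands to
# the Arduino running the steering servo and pedal actuators. The
# one-byte command protocol is invented for illustration.
import serial  # pyserial

COMMANDS = {
    "forward":    b"F",   # extend the accelerator actuator
    "turn_right": b"R",   # wind the wiper-motor steering servo clockwise
    "idle":       b"S",   # release pedals, center the steering
}

def drive(predictions, port="/dev/ttyACM0"):
    """Stream predicted thoughts ('forward', 'turn_right', ...) to the car."""
    ser = serial.Serial(port, 115200)
    for label in predictions:
        ser.write(COMMANDS.get(label, COMMANDS["idle"]))
```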

There haven’t been too many thought-controlled cars featured here before, but we have covered a wheelchair with an EEG interface.

Hackaday Prize Entry: Lucid Dreaming Research

Lucid dreaming is one of the rare psychological phenomena terrible sci-fi frequently gets right. Yes, lucid dreaming does exist, and one of the best ways to turn a normal dream into a lucid one is to fixate on a particular object, sound, or smell. For their Hackaday Prize entry, [Jae] is building a device to turn the electronics enthusiast community on to lucid dreaming. It’s a research platform that allows anyone to study their own dreams and access a world where you can do anything.

The core of this project is an 8-channel EEG used to measure the electrical activity of the brain during sleep. The EEG electrodes feed a 24-bit ADC, which is sampled 250 times per second by an ARM Cortex-M4F microcontroller. The captured data is recorded, or sent over Bluetooth to a PC or smartphone, where a familiar sound (think of the briefcase in Inception) or some other cue can be played to tell the dreamer they’re dreaming.
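On the PC side, the interesting part is deciding when to play the cue. Proper sleep staging uses EOG channels and trained scoring rules, but a crude stand-in – flagging epochs where theta activity dominates slow delta – shows the shape of it (the sample rate matches the project; the thresholds are our assumptions):

```python
# Crude REM-ish detector: flag 30-second epochs with a high
# theta-to-delta power ratio, then fire the dream cue. Real sleep
# staging is far more involved; thresholds here are assumptions.
import numpy as np
from scipy.signal import welch

FS = 250                        # matches the project's 250 samples/second

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def looks_like_rem(epoch, ratio=1.5):
    freqs, psd = welch(epoch, fs=FS, nperseg=4 * FS)
    theta = band_power(freqs, psd, 4, 8)
    delta = band_power(freqs, psd, 0.5, 4)
    return theta / delta > ratio

def monitor(epochs, play_cue):
    """epochs: iterable of 30-second, single-channel EEG arrays."""
    for epoch in epochs:
        if looks_like_rem(np.asarray(epoch)):
            play_cue()              # e.g. that familiar Inception sound
```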

We’ve seen a few similar builds in the past, most famously a NeuroSky MindWave headset turned into a comfortable single-channel EEG-type device. The NeuroSky hardware is limited, though, and a setup with proper amplifiers and ADCs will be significantly more helpful in debugging the meatspace between [Jae]’s ears.

Hacklet 105 – More Mind and Brain Hacks

A mind is a terrible thing to waste – but an awesome thing to hack. We last visited brain hacks back in July of 2015. Things happen fast on Hackaday.io. Miss a couple of days and you’ll miss a bunch of great new projects, including some awesome new biotech hacks. This week, we’re checking out some of the best new mind and brain hacks on Hackaday.io.

We start with [Daniel Felipe Valencia V] and Brainmotic. Brainmotic is [Daniel’s] entry in the 2016 Hackaday Prize. Smart homes and the Internet of Things are huge buzzwords these days. [Daniel’s] project aims to meld this technology with electroencephalography (EEG): your mind will be able to control your home. This would be great for anyone, but it’s especially important for the handicapped. Brainmotic uses the open hardware OpenBCI as its brain interface; [Daniel’s] software and hardware will create a bridge between this interface and the user’s home.

Next we have [Angeliki Beyko] with Serial / Wireless Brainwave Biofeedback. EEG used to be very expensive to implement, but things have gotten cheap enough that we now have brain-controlled toys on the market. [Angeliki] is hacking these toys into useful biofeedback tools that can be used to visualize, and even control, the user’s state of mind. Her weapon of choice is the MindFlex series of toys. With the help of a Punch Through LightBlue Bean, she was able to get the EEG headsets talking over Bluetooth. A bit of fancy software on the PC side allows the brainwave signals received from the MindFlex to be interpreted as simple graphs. [Angeliki] even went on to create a Mind-Controlled Robotic Xylophone based on this project.
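Once the Bean is forwarding packets, the PC side mostly boils down to mapping the parsed values onto something you can see or hear. A toy version of that mapping – the eight-note layout is invented, but it’s the same idea that drives the robotic xylophone:

```python
# Toy mapping from a 0-100 MindFlex "attention" value to a display bar
# and a xylophone note. The eight-note layout is invented.
NOTES = ["C", "D", "E", "F", "G", "A", "B", "C'"]

def note_for(attention):
    """Quantize a 0-100 attention reading onto eight xylophone bars."""
    return NOTES[min(attention * len(NOTES) // 101, len(NOTES) - 1)]

def bar_for(attention):
    """One-line ASCII 'graph' of the current reading."""
    return "#" * (attention // 2)

for reading in (12, 55, 87):        # pretend parsed headset values
    print(f"{reading:3d} {note_for(reading):2s} {bar_for(reading)}")
```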

Next is [Stuart Longland], who hopes to protect brains with Improved Helmets. Traumatic Brain Injury (TBI) is in the spotlight of medical technology these days. As bad as it may be, TBI is just one of several types of head and neck injuries one may sustain in a bicycle or motorcycle accident. Technology exists to reduce injury and is included with some new helmets, but many of these technologies, such as MIPS, are patented. [Stuart] is working to create a more accurate model of the head within the helmet, and the brain within the skull. From this data he intends to create a license-free protection system which can be used with new helmets as well as retrofitted to existing hardware.

Finally we have [Tom Meehan], whose entry in the 2016 Hackaday Prize is Train Your Brain with Neurofeedback. [Tom] is hoping to improve quality of life for people suffering from epilepsy, autism, ADHD, and other conditions through the use of neurofeedback. Like [Angeliki] above, [Tom] is hacking hardware from NeuroSky – in this case the MindWave headset. [Tom’s] current goal is to pull data from the TGAM1 board inside the MindWave. Once he obtains EEG data, a Java application on the PC side will let him display users’ EEG information. This is a brand new project with updates coming quickly, so it’s definitely one to watch!

If you want more mind-hacking goodness, check out our freshly updated brain hacking project list! Did I miss your project? Don’t be shy, just drop me a message on Hackaday.io. That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!