This must be an example of when worlds collide. Who would have thought the geekery of Mindflex and Arduino could make its way into high fashion? But sure enough, this dress transforms based on the mental concentration of the model (must resist urge to crack joke here).
Details are a bit sparse, but you can get a look at the prototype in the video after the break. There’s no nudity; a larger skirt covers a more plain version. That over-skirt is connected to some type of motor system which is driven by an Arduino. When the EEG sensor in the hat detects a certain level of brain wave activity, the outer skirt is lifted and pulled to the back of the outfit, exposing the tighter version beneath.
[Lorenzo] wrote in to share the link to this garment hack. He mentions that a Lilypad and Mindflex are at work here. Looking more into the artist’s website we find this isn’t the only tech-wear produced. There’s a maternity outfit which can sense the baby’s beating heart, and harvest other data about both mother and baby, as well as a few others.
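The control loop described above boils down to a threshold on the EEG reading. Here's a minimal sketch of that logic; the attention scale, thresholds, and function names are our assumptions, not the artist's code (NeuroSky hardware like the Mindflex reports attention as a 0-100 value):

```python
# Hypothetical sketch of the skirt's control logic: the Mindflex reports an
# attention value (0-100 on NeuroSky hardware); a hysteresis band keeps the
# over-skirt motor from flapping when the reading hovers near the threshold.

LIFT_THRESHOLD = 70   # concentration level that lifts the over-skirt (assumed)
DROP_THRESHOLD = 40   # level below which it is lowered again (assumed)

def skirt_lifted(attention: int, lifted: bool) -> bool:
    """Return whether the over-skirt should be lifted this cycle."""
    if not lifted and attention >= LIFT_THRESHOLD:
        return True
    if lifted and attention <= DROP_THRESHOLD:
        return False
    return lifted  # inside the hysteresis band, hold the current state
```

The gap between the two thresholds is the important design choice: without it, noisy EEG readings near a single cutoff would cycle the motor constantly.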
We can’t imagine this has much of a future as an everyday outfit, but more utilitarian versions are out there, so we think the sky’s the limit on wearable tech.
Continue reading “Fashion leads to mind-controlled skirt-lifting contraption”
Because switching apps to change a song is such a taxing ordeal, [Oscar Celma] and [Ching-Wei Chen] decided to use their collective brainpower to change Last.FM playlists with their minds. They call their project Buddhafy, and it works by taking off-the-shelf EEG hardware and tying it into music streaming APIs.
For the build, the guys used a NeuroSky MindWave to read alpha waves inside [Oscar]’s head. The data from the MindWave was passed into a Python script that sends requests to the Last.FM and Spotify APIs. High alpha-wave activity corresponds to concentration or a deep meditative state. If [Oscar] concentrates very hard, he’ll be rewarded with calm and relaxing tunes. If [Oscar] loses focus, the music changes to the best song ever written.
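The core of a system like this is just a mapping from the headset's reading to a music tag that can be fed to a streaming API. A minimal sketch, assuming the MindWave's 0-100 eSense "meditation" scale; the thresholds and tag names are invented for illustration, not taken from the Buddhafy code:

```python
# Map a MindWave eSense meditation reading (0-100) to a mood tag, which a
# script could then pass to a Last.fm tag query. Thresholds are assumptions.

def pick_tag(meditation: int) -> str:
    """Choose a playlist tag from the current meditation level."""
    if meditation >= 70:
        return "ambient"      # deep focus -> calm and relaxing tunes
    if meditation >= 40:
        return "chillout"     # middling focus -> something in between
    return "power ballad"     # focus lost -> the best song ever written
```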
The guys put up the slides from the presentation they gave at MusicHackDay in San Francisco last week. There’s also a video of their build in action; you can check that out after the break.
Continue reading “Control a playlist with your mind”
Whether you believe in it or not, the science behind brainwave entrainment is incredibly intriguing. [Rich Decibels] became interested in the subject, and after doing some research, decided to build an entrainment device of his own.
If you are not familiar with the concept, brainwave entrainment theory suggests that low-frequency light and sound can be used to alter brain states, based on the assumption that the human brain will shift its dominant frequency to match a dominant external stimulus. [Rich’s] device is very similar to [Mitch Altman’s] “Brain Machine”, and uses both of these methods in an attempt to place the user in an altered state of mind.
[Rich] installed a trio of LEDs into a set of goggles, wiring them along with a set of headphones to his laser-cut enclosure. Inside, the Brainwave Disruptor contains an Arduino, which is tasked with both generating light patterns as well as bit-banged audio streams.
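The signals involved are simple: a square wave pulsing at the target brainwave frequency, which drives the LEDs and (bit-banged) the audio. Here's a minimal sketch of generating one; the sample rate and frequency are assumed values, and this is illustrative Python rather than [Rich]'s Arduino code:

```python
# Generate a 0/1 square wave at a target entrainment frequency, the way an
# Arduino might bit-bang it to LEDs or headphones. Rates are assumptions.

SAMPLE_RATE = 8000  # samples per second, assumed

def square_wave(freq_hz: float, n_samples: int) -> list:
    """Return n_samples of a square wave (1 = on, 0 = off) at freq_hz."""
    period = SAMPLE_RATE / freq_hz          # samples per full cycle
    return [1 if (i % period) < period / 2 else 0 for i in range(n_samples)]

# 10 Hz sits in the alpha band; one second of drive signal:
alpha_pulse = square_wave(10.0, SAMPLE_RATE)
```

An entrainment session would typically sweep this frequency slowly downward, say from alert beta (~20 Hz) into alpha and theta ranges, rather than holding one value.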
Well, how does it work? [Rich] reports that it performs quite nicely, causing both visual and auditory hallucinations along with the complete loss of a sense of time. Sounds interesting enough to give it a try!
[Sam Fok], an engineering student at the Washington University School of Engineering wrote in to share a project he and his classmates [Raphael Schwartz, Mark Wronkiewicz, Charles Holmes, Jessica Zhang, Nathan Brodell, and Thane Somers] have been working on as their entry in the 2011 RESNA Student Design Competition. Their project, IpsiHand, is designed to help rehabilitate those who have suffered a stroke or other Traumatic Brain Injury (TBI).
Most motor functions in the body are controlled by the opposite hemisphere of the brain, a process called contralateral motor control. When a patient suffers from TBI, they often lose control over some portion of the body opposite the injury. Recent studies have shown, however, that while most motor control is contralateral, hand movements also create ipsilateral brain activity. This means that the uninjured side of the brain can effectively control both hands, with a bit of mechanical assistance.
Their process uses an Emotiv EPOC EEG headset, which we have discussed before, to monitor the patient’s brain for activity. The data is sent wirelessly to a computer which processes it, singling out ipsilateral brain waves. The computer then actuates a modified hand orthosis to control grasping in real time.
We think their work is fantastic, and the team’s creation has a wide array of applications in the field of therapy and assisted living. We wish them luck in their competition, and hope to see this technology put to good use in the future.
Amyotrophic lateral sclerosis (ALS) is a debilitating disease that eventually causes the afflicted individual to lose all control of their motor functions, while leaving their mental faculties intact. Those suffering from the illness typically live for only a handful of years before succumbing to the disease. On some occasions however, patients can live for long periods after their original diagnosis, and in those cases assistive technology becomes a key component in their lives.
[Alon Bukai and Ofir Benyamin], students at Ort Hermalin College in Israel, have been working hard on creating an EEG-controlled smart house for ALS patients under the guidance of their advisor [Amnon Demri]. The core of their project focuses on controlling everyday household items using brainwaves. They use an Emotiv EPOC EEG headset which monitors the user’s brainwaves when focusing on several large buttons displayed on a computer screen. These buttons are mapped to different functions, ranging from turning lights on and off to changing channels on a cable box. When the user focuses on a particular task, the computer analyzes the headset’s output and relays the command to the proper device.
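The mapping layer they describe is essentially a lookup from the recognized on-screen button to a device command. A minimal sketch, assuming hypothetical device and command names (not taken from their project):

```python
# Each on-screen button the user can focus on is bound to a household action;
# a recognized selection is dispatched as a (device, command) pair.
# Names here are assumptions for illustration.

ACTIONS = {
    "lights_on":  ("lights", "on"),
    "lights_off": ("lights", "off"),
    "channel_up": ("cable_box", "channel+"),
}

def dispatch(selection: str):
    """Translate a recognized button selection into a device command."""
    if selection not in ACTIONS:
        return None  # ignore unrecognized selections rather than act on them
    return ACTIONS[selection]
```

Returning nothing on an unrecognized selection matters here: with a noisy EEG classifier, a system that defaults to doing *something* would be worse than one that waits for a confident match.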
As of right now, the EEG-controlled home is only a project for their degree program, but we hope that their efforts help spur on further advancements in this field of research.
Continue reading to see a pair of videos demonstrating their EEG-controlled smart house in action.
Continue reading “Brainwave-based assistive technology in the home”
The AutoNOMOS Labs project has found a new way to maneuver its vehicles: your brain. We have looked at a previous version that uses a mostly computerized van under remote control from an iPhone. This one, however, named “Brain Driver”, places the operator in the driver’s seat with an EEG strapped to their head.
Going for a more sporty look, the current vehicle is a drive-by-wire Volkswagen Passat wagon filled to the brim with fun toys like LIDAR/RADAR sensors, cameras, and a specialized GPS. The EEG interface is a commercially available Emotiv model, and after a few rounds of training on safe ground, the driver is placed in control of the car.
In one demonstration the car approaches a four-way intersection; the driver only has to think left or right, and the car (intelligently) navigates the turn after coming to a proper stop and checking for obstacles. In the second demo, car and driver are let loose on an unused airport to test responsiveness.
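The behavior in that first demo amounts to gating the thought command behind the car's own safety checks. A hedged sketch of that decision step; this mirrors the described behavior, not AutoNOMOS code, and every name and value is an assumption:

```python
# Gate a mental "left"/"right" intent behind the car's own checks: stop
# first, confirm the intersection is clear, then execute the turn.

def turn_decision(intent: str, speed: float, obstacle: bool) -> str:
    """Return the action to take given the driver's intent and sensor state."""
    if intent not in ("left", "right"):
        return "hold"       # no confident command yet
    if speed > 0.0:
        return "brake"      # come to a proper stop before turning
    if obstacle:
        return "wait"       # sensors still report something in the way
    return intent           # safe to execute the thought command
```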
If you like brains, cars, robots, and spinning lasers join us after the break for a video.
Continue reading “Brain Car Interface”
The use of brainwaves as control parameters for electronic systems is becoming quite widespread. The types of signals that we have access to are still quite primitive compared to what we might aspire to in our cyberpunk fantasies, but they’re a step in the right direction.
A very tempting aspect of accessing brain signals is that it can be used to circumvent physical limitations. [Jerkey] demonstrates this with his DIY brain-controlled electric wheelchair that can move people who wouldn’t otherwise have the capacity to operate joystick controls. The approach is direct, using a laptop to marshal EEG data which is passed to an Arduino that simulates joystick operations for the control board of the wheelchair. From experience we know that it can be difficult to control EEGs off the bat, and [Jerkey]’s warnings at the beginning of the Instructable about having a spotter with their finger on the “off” switch should well be followed. Maybe some automated collision avoidance would be useful to include.
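The laptop-to-Arduino handoff described above reduces to mapping a classified EEG command onto fake joystick axis values. A minimal sketch under stated assumptions; the mid-scale value, axis numbers, and the spotter's kill switch handling are ours, not [Jerkey]'s:

```python
# Map a classified EEG command to the (x, y) joystick values the Arduino
# would present to the wheelchair's control board. Values are assumptions,
# using a 10-bit axis where 512 is the neutral center position.

NEUTRAL = (512, 512)  # both axes centered: wheelchair stops

def joystick_axes(direction: str, kill_switch: bool):
    """Return (x, y) axis values for a classified EEG direction."""
    if kill_switch:
        return NEUTRAL          # the spotter's override always wins
    return {
        "forward": (512, 900),
        "left":    (200, 512),
        "right":   (824, 512),
    }.get(direction, NEUTRAL)   # anything unclassified means stop
```

Note that both failure modes (kill switch pressed, classifier unsure) fall through to the neutral position, which is exactly the fail-safe behavior a spotter-supervised build needs.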
We’ve covered voice-operated wheelchairs before, and we’d like to know how the two types of control would stack up against one another. EEGs are more immediate than speech, but we imagine that they’re harder to control.
It would be interesting, albeit somewhat trivial, to see an extension of [Jerkey]’s technique as a way to control an ROV like Oberon, although depending on the faculties of the operator, the speech control could be difficult (would that make it more convincing as an alien robot diplomat?).