Have you ever wanted to turn your TV on or off just by thinking about it? We love this hack mainly because it uses an old Star Wars Force Trainer game, which you can still buy for about $40-$80 USD online. This cool little toy was introduced in 2009 and uses a headset with electrodes and an electroencephalography (EEG) chip. It transmits the EEG data to control a fan that blows air into a tube to “levitate” a ball, all while the voice of Yoda coaches you along. (Geesh! Kids these days have the best toys!)
[Tinkernut] started by cracking open the headset, where he found the EEG chip made by a company called NeuroSky (talk about a frightening-sounding company name). The PCB designer was kind enough to label the Tx/Rx pins on the board, so hooking it up to an Arduino was a snap. After scavenging an IR LED and receiver from an old VCR, the hardware was just about done. After a bit of coding, you can now control your TV by using the force! (OK, by ‘force’ I mean brainwaves.) Video after the break.
Note: [Tinkernut’s] blog page should have more information available soon. In the meantime, you can find his Arduino Brain Library on GitHub.
This isn’t the first EEG to TV interface we’ve featured. Way back in 2010 we featured a project that used an Emotiv EPOC EEG headset to turn on and off a TV. But at $400 for the headset, it was a little too expensive for the average Jedi.
Continue reading “Use the Force, Luke…to Turn Off Your TV”
If there’s one downside to digital storage, it’s the short lifespan. Despite technology’s best efforts, keeping digital data readable beyond 50 years is extremely difficult. [Robert Grass, et al.], researchers from the Swiss Federal Institute of Technology in Zurich, decided to address the issue with DNA. The same stuff that makes you “You” can also be used to store your entire library, and then some.
As the existence of cancer shows, DNA is not always replicated perfectly. A single mismatch, addition, or omission of a base pair can wreak havoc on an organism. [Grass, et al.] realized that error correction was necessary for long-term storage. They chose Reed-Solomon codes, which have been used for error correction in everything from CDs to QR codes to satellite communication. Starting with uncompressed digital text files of the Swiss Federal Charter of 1291 and the English translation of the Archimedes Palimpsest, they mapped every two bytes to three elements of a Galois field. Each element was then encoded as a specific codon, a triplet of nucleotides. In addition, two levels of redundancy were employed, creating outer and inner codes for error recovery. Since long DNA strands are very difficult (and pricey) to synthesize, the final product was 4991 DNA segments of 158 nucleotides each (39 codons plus primers).
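As a rough sketch of that mapping step: a field with 47 elements works because 47³ = 103,823 covers all 65,536 two-byte values. The codon table below is a made-up placeholder (the paper’s actual table and its Reed-Solomon redundancy layers are omitted), but it shows the bytes-to-digits-to-codons idea:

```python
from itertools import product

GF_SIZE = 47  # 47**3 > 65536, so three base-47 digits cover two bytes

# Placeholder codon table: 47 nucleotide triplets, skipping homopolymer
# runs like AAA (a constraint similar in spirit to the real scheme).
CODONS = [c for c in (''.join(t) for t in product('ACGT', repeat=3))
          if not (c[0] == c[1] == c[2])][:GF_SIZE]
INDEX = {c: i for i, c in enumerate(CODONS)}

def encode(data: bytes) -> str:
    """Map each two-byte pair to three base-47 digits, then each
    digit to a codon (three nucleotides)."""
    if len(data) % 2:
        data += b'\x00'  # pad to an even number of bytes
    dna = []
    for i in range(0, len(data), 2):
        value = (data[i] << 8) | data[i + 1]
        digits = [(value // GF_SIZE**2) % GF_SIZE,
                  (value // GF_SIZE) % GF_SIZE,
                  value % GF_SIZE]
        dna.extend(CODONS[d] for d in digits)
    return ''.join(dna)

def decode(dna: str) -> bytes:
    """Invert the mapping: nine nucleotides back to two bytes."""
    out = bytearray()
    for i in range(0, len(dna), 9):
        value = 0
        for j in (0, 3, 6):
            value = value * GF_SIZE + INDEX[dna[i + j:i + j + 3]]
        out += bytes([value >> 8, value & 0xFF])
    return bytes(out)
```

Each pair of bytes becomes nine nucleotides, so a 158-nucleotide segment has room for a few dozen codons of payload plus primers, consistent with the 39-codon figure above.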
Continue reading “Store Digital Files for Eons in Silica-Encased DNA”
Bryan is a computer neophyte (he needs help turning his computer on), but he has a basketball story. His team was playing a crucial basketball playoff game at the club. They were down by two late in the game, and he just couldn’t get one of his players to play defense. This player was a great shooter and not much else, and he buried a three that put the team up for the first time. After sinking it he just stood there admiring his masterpiece while Bryan screamed at him to get back on defense (he rarely played D, and that game was no exception). Instead, he flatlined and went down on his face: heart attack!
Of course, that player was me, and that was an awful day. But I’m still around to tell the story... years earlier, as a hardware designer, I had no idea I’d end up betting everything on one particular project.
Continue reading “Developer Saved Years Later by His Own Hardware”
[Michael Balzer] shows us that you are your own best advocate when it comes to medical care – and that being able to print models of your own tumors is a bonus. [Michael’s] wife, Pamela, had been recovering from a thyroidectomy when she started getting headaches. She sought a second opinion after the first radiologist dismissed the MRI scans of her head – and learned she had a 3 cm tumor, a meningioma, behind her left eye. [Michael], host of All Things 3D, asked for the DICOM files (the standard medical image format) from her MRI. When Pamela went for a follow-up, it looked like the tumor had grown aggressively, but this turned out to be a false alarm. When [Michael] compared the two sets of DICOM images in Photoshop, the second MRI did not actually show that the tumor had grown. It only looked that way because the radiologist had taken the scan at a different angle! Needless to say, the couple was not pleased with the misdiagnosis.
However, the meningioma was still causing serious problems for Pamela. She was at risk of losing her sight, so she started researching the surgery required to remove the tumor. The most common surgery is a craniotomy: the skull is sawed open and the brain physically lifted in order to access the tumor below it. Not surprisingly, this carries a high risk of permanent damage to any nerves leading to loss of smell, taste, or sight if the brain is moved the wrong way. Pamela decided to look for an alternative surgery that was less invasive. [Michael] created a 3D print of her skull and meningioma from her MRIs. He used InVesalius, free software designed to convert the 2D DICOM files into 3D images. He then uploaded the 3D rendered skull to Sketchfab, sharing it with potential neurologists. Once a neurologist was found that was willing to consider an alternative surgery, [Michael] printed the skull and sent it to the doctor. The print was integral in planning out the novel procedure, in which a micro drill was inserted through the left eyelid to access the tumor. In the end, 95% of the tumor was removed with minimal scarring, and her eyesight was spared.
If you want to print your own MRI or CT scans, whether for medical use or to make a cool mug with your own mug, there are quite a few programs out there that can help. Besides the aforementioned InVesalius, there are DeVIDE, Seg3D, ImageVis3D, MeshLab, and MeshMixer.
Have you ever taken a First Aid & CPR training course? Don’t you just love the realism of the dummy mannequins you get to practice on? [Park, Qurashi, & Chen], students at Cornell University, thought the dummies could use an intelligent upgrade.
It’s the final project for their electrical and computer engineering course, ECE 4760, and they’ve successfully created a budget-friendly, not-so-dumb CPR dummy built around the venerable ATmega1284 microcontroller.
The dummy can sense when chest compressions are given, whether the nose is plugged properly when breaths are given, and whether the head is tilted back properly to open the airway, and it even uses a microphone to detect whether breaths are given properly! While doing all this, it provides training feedback to the student via LED eyes and an LCD screen. Once students are sufficiently practiced, they can switch to a “real” mode that withholds feedback to make sure they have truly learned the technique. Continue reading “Smart CPR Dummy Makes Sure you Do it Right”
Ever heard of the summer camp called Superhero Cyborgs? It’s where [Coby Unger] met nine-year-old [Aidan Robinson] and helped him design his very own custom prosthetic arm.
The camp is put on by KIDmob for kids with various limb disabilities, and it gives them the tools and guidance to make their very own prosthetics. Some of the designs the children come up with are cool, useful, and pretty, though not always terribly functional. [Aidan’s] designs, however, really intrigued [Coby], a designer and staff member at Pier 9, a world-class fabrication facility (and makerspace) run by Autodesk.
There are a lot of problems with prosthetics for children: they’re very expensive, kids don’t stay the same size, and even at a high price they don’t necessarily work that well. [Aidan] had a few commercial options but didn’t like any of them, so much so that he preferred not to wear them at all. But when he attended the camp, he realized he could design a prosthetic he’d actually want to wear.
Continue reading “Kid Designs His Own Prosthetic Arm at a Summer Camp”
The Thalmic Myo is an electronic arm band with an IMU and myoelectric sensors, able to measure the orientation and muscle movements of an arm. This device has uses ranging from prosthetics to Minority Report-style user interfaces. Thalmic is also a Y Combinator company, with $15 million in funding and tech press gushing over the possible uses of this futuristic device. Truly, a remarkable story for the future of user interfaces and pseudo-medical devices that can get around most FDA regulations.
A few months ago, Thalmic released a firmware update to the Myo that blocks raw access to the myoelectric sensors. Anyone wanting to develop for the Myo now needs to submit an application and pay Thalmic and their investors a pound of flesh – up to $5000 for academic institutions. The current version of the firmware only provides access to IMU data and ‘gestures’ – not the raw muscle data that would be invaluable when researching RSI detection, amputee prosthetics, or a hundred other ideas floating around the Thalmic forums.
Thalmic started their company with the idea that an open SDK would be best for the community, and access to the raw sensor data was available in all but the latest version of the firmware. A few firmware revisions ago, Thalmic removed access to this raw data, breaking a number of open source projects used by researchers and anyone else experimenting with the Thalmic Myo. Luckily, someone smart enough to look at version numbers has come up with an open library to read the raw sensor data. It works well, and the official position of Thalmic is that raw sensor data will be unavailable in the future. If you want to develop something with the Myo, this library just saved your butt.
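For a sense of what “raw sensor data” means here, a minimal sketch of parsing one EMG frame follows. This is an illustration only, not the open library’s API: the packet layout (eight signed 8-bit samples, one per sensor pod) is an assumption for the example, and the real Myo protocol may differ.

```python
import struct

def parse_emg_frame(packet: bytes) -> list[int]:
    """Parse one hypothetical raw-EMG frame: eight signed 8-bit
    channel samples, one per sensor pod. (Assumed layout for
    illustration; not the actual Myo wire format.)"""
    if len(packet) != 8:
        raise ValueError("expected 8 bytes, one per sensor pod")
    # '<8b' = little-endian, eight signed chars
    return list(struct.unpack('<8b', packet))
```

Access at this level is what lets researchers run their own signal processing (for RSI detection, prosthetics control, and so on) instead of being limited to the canned gestures.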
Thalmic will have an official statement on access to raw sensor data soon.
Quick aside, but if you want to see how nearly every form of media is crooked, try submitting this to Hacker News and look at the Thalmic investors. Edit: don’t bother, we’re blacklisted or something.
Update: Thalmic has updated their policy, and will be releasing a firmware version that gives access to the raw EMG sensor data later on. The reasons for getting rid of the raw sensor data are twofold:
- Battery life. Streaming raw data out of the armband takes a lot of power. Apparently, computing ‘gestures’ on the microcontroller and sending only those saves power.
- User experience. EMG data differs from person to person and is hard to interpret.