[Reza Naima] has just released the designs for his Berkeley Tricorder for the public to use. He’s been designing it since 2007 as part of his PhD thesis work, and now that he’s done (congrats!), he has decided to let it grow by making it open source!
We covered it almost seven years ago when it was in its first prototype form, and it has come a long way since then. The latest version features an electromyograph (EMG), an electrocardiograph (ECG), a bioimpedance spectrometer, a pulse oximeter, and an accelerometer, and all the data is recorded to a microSD card or sent via Bluetooth to a tablet or smartphone for data visualization.
He’s released it in the hope that other researchers can use the hardware in their own work, hopefully sparking a community of people interested in non-invasive health monitoring. With any luck, development of the Berkeley Tricorder will continue, and maybe someday it can truly live up to its name!
Unfortunately there’s no new video showing off the latest iteration, but we’ve attached the original video after the break, which gives a good narrative on the device by [Reza] himself.
Continue reading “The Berkeley Tricorder is now Open Source!”
The applications of eye-tracking devices are endless, which is why we always get excited to see new techniques for measuring the absolute position of the human eye. Cornell students [Michael and John] took an interesting approach for their final project and designed a phototransistor-based eye-tracking system.
We can definitely see the potential of this project, but for their first prototype, the system relies on both eye tracking and head movement to fully control a mouse pointer. With an end product in mind, the system consists of a pair of custom 3D-printed glasses and a wireless receiver, avoiding the need to be tethered to the computer under control. The horizontal position of the mouse pointer is controlled via an infrared eye-tracking mechanism: an infrared LED positioned above the eye and a phototransistor on each side of it. The analog readings from the two phototransistors determine the eye’s horizontal position. The vertical movement of the mouse pointer is controlled with the help of a 3-axis gyroscope mounted to the glasses. The effectiveness of a simple infrared LED/phototransistor pair at detecting eye movement is impressive, because similar projects we’ve seen have been camera-based. We understand how final project deadlines can be, so we hope [Michael and John] continue past the deadline with this one. It would be great to see whether the absolute position (horizontal and vertical) of the eye can be tracked entirely with the phototransistor technique.
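The horizontal-tracking idea is simple enough to sketch in a few lines. The snippet below is our own illustration, not [Michael and John]’s actual firmware: it assumes two 10-bit ADC readings (0–1023), one per phototransistor, and maps their normalized difference onto a horizontal cursor position.

```python
def eye_to_cursor_x(left_adc, right_adc, screen_width=1920):
    """Map the differential phototransistor signal to a cursor X position.

    left_adc / right_adc: 10-bit ADC readings (0-1023) from the two
    phototransistors flanking the eye. As the eye rotates, more IR is
    reflected toward one sensor than the other.
    """
    total = left_adc + right_adc
    if total == 0:
        return screen_width // 2          # no signal: park in the center
    # Normalized difference in [-1, 1]; 0 means the eye is centered.
    diff = (right_adc - left_adc) / total
    # Map [-1, 1] onto [0, screen_width) and clamp to the screen.
    x = int((diff + 1) / 2 * screen_width)
    return max(0, min(screen_width - 1, x))
```

In practice the raw readings would need per-user calibration and smoothing, but the core of the technique really is this small.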
Continue reading “PhotoTransistor Based Eye-Tracking”
Ideally, technology is supposed to enhance our lives. [Shane and Eileen], two seniors at Cornell, have found a great way to enhance the lives of visually impaired individuals with their acoustic wayfinding device. While brainstorming their final project, [Shane and Eileen] were inspired by this Hackaday post about robots as viable replacements for guide dogs. They sought to provide wearable, hands-free guidance and detection of (primarily) indoor obstacles—namely chairs, benches, and other inanimate objects below eye level.
The wayfinder comprises two systems working in tandem: a head-mounted navigation unit and a tactile sensor worn on the user’s finger. Both systems use MaxBotix LV-MaxSonar-EZ0 ultrasonic rangefinder modules to detect obstacles and vibrating mini-disc motors to provide haptic feedback at intensities proportional to the user’s distance from an obstacle.
The head unit uses two rangefinders and two vibrating motors. Together, the rangefinders have a field of view of about 120 degrees and are capable of detecting obstacles up to 6.45 meters away. The tactile sensor comprises one rangefinder and motor and is used in a manner similar to a Hoover cane. The user sweeps their hand to detect objects that would likely be out of the range of the head unit. Both parts are ergonomic and size-adjustable.
At power-up, [Shane and Eileen]’s software calibrates the tactile sensor to determine the distance threshold in conjunction with the user’s height. They’ve used an ATmega1284 to drive the system and handled real-time task scheduling between the two subsystems with the TinyRealTime kernel. A full demonstration video is embedded after the break.
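The distance-proportional haptic feedback boils down to mapping a rangefinder reading onto a motor duty cycle, gated by the calibrated threshold. Here’s a minimal sketch of that mapping (our own illustration, not the team’s TinyRealTime code); distances are in meters and the output is an 8-bit PWM value:

```python
def vibration_duty(distance_m, threshold_m=1.5):
    """Map obstacle distance to an 8-bit PWM duty cycle (0-255).

    Obstacles beyond threshold_m (set at calibration based on the
    user's height) are ignored; closer obstacles vibrate harder,
    up to full duty at contact.
    """
    if distance_m >= threshold_m:
        return 0                          # nothing close enough to warn about
    # Linear ramp: full vibration at 0 m, none at the threshold.
    ratio = 1.0 - distance_m / threshold_m
    return int(round(255 * ratio))
```

A real implementation would also low-pass filter the sonar readings, since the EZ0 can glitch on soft or angled surfaces, but the proportional-feedback idea is just this linear ramp.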
Continue reading “Acoustic Wayfinder for the Visually Impaired”
When we think of machine learning, it’s usually in the context of robotics—giving an algorithm a large set of input data in order to train it for a certain task like navigation or understanding your handwriting. But it turns out you can also train a nasty virus to go to sleep and never wake up again. That’s exactly what the Immunity Project has been doing. They believe they have a viable HIV vaccine and are trying to raise about $25 million to begin human testing.
The vaccine hacks the Human Immunodeficiency Virus itself, forcing it to mutate into a dormant form that will not attack its human carrier. It sounds simple, but a lot of existing knowledge and procedures, as well as new technology, went into getting this far. Last week we spoke with [Reid Rubsamen, M.D.] about the process, which began by collecting blood samples from a wide range of “Controllers”. Controllers are people who carry HIV but manage to suppress the virus’s progression to AIDS. How do you find these people? That’s another story, which Scientific American covered (PDF); the short answer is that thanks to the work of [Bruce D. Walker, M.D.], there was already a database of Controllers available.
The information accumulated by [Walker] then underwent a data-crunching exercise. The data set was so enormous that a novel approach was adopted. For the layman, this is described as a spam filter: using computers to look at large sets of email to develop a complex process for sifting real messages out of the noise. The task at hand is to look at the genotype of a Controller and compare it with the epitopes—short chains of amino acids—in the virus they carry. The power of machine learning managed to whittle down all the data to a list of the first six epitopes that have the desired dormant-mutation property. The vaccine consists of a cocktail of these epitopes. It does, however, require some clever delivery tactics to reach the parts of the world where it’s most needed: the vaccine must not require refrigeration nor any special skills to administer.
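The spam-filter framing can be illustrated with a toy example—entirely our own, with made-up sequences, and nothing like the project’s real pipeline: count how often each short peptide substring shows up in Controller samples versus progressor samples, and rank candidates by how strongly they separate the two groups, much as a spam filter ranks words by how well they separate spam from ham.

```python
from collections import Counter

def rank_epitopes(controller_seqs, progressor_seqs, epitope_len=3):
    """Toy ranking of candidate epitopes: substrings that occur more
    often in controller sequences than in progressor sequences score
    higher. A stand-in illustration, not the real ML pipeline."""
    def count(seqs):
        c = Counter()
        for seq in seqs:
            for i in range(len(seq) - epitope_len + 1):
                c[seq[i:i + epitope_len]] += 1
        return c

    ctrl, prog = count(controller_seqs), count(progressor_seqs)
    # Score = excess frequency in controllers over progressors.
    scores = {e: ctrl[e] - prog.get(e, 0) for e in ctrl}
    return sorted(scores, key=scores.get, reverse=True)
```

The real analysis is vastly more sophisticated, but this captures the shape of the problem: find the short sequences whose presence best predicts membership in the Controller group.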
The vaccine’s production uses existing methods to synthesize the amino acid peptides, which are the epitopes themselves. The packaging, however, is a new concept. [Dr. Rubsamen’s] company, Flow Pharma, Inc., is using microspheres to encapsulate the vaccine, which render it shelf-stable and allow it to be administered through a nasal spray. Learn more about the technology behind the production of microspheres from this white paper (PDF).
If the vaccine (which will be produced without profit) passes clinical trials, it could see mass distribution as early as 2017.
The $25M we mentioned earlier is a tall hill to climb, but think of the reward if the vaccine is successful. You can donate directly to help reach this goal. If you’re planning on giving gift cards this year, you can purchase them for many different retailers through Gyft, which is donating 100% of December proceeds to the project.
[Rahul] works at a startup that produces cutting edge diagnostic test cards. These simple cards can test for enzymes, antibodies, and diseases quickly and easily. For one test, this greatly speeds up the process of testing and diagnosis, but since these tests can now be administered en masse, health services the world over now have the problem of reading, categorizing, and logging thousands of these diagnostic test cards.
The normal solution to this problem is a dedicated card scanner, but these cost tens of thousands of dollars. At a 24-hour hackathon, [Rahul] decided to bring down the cost of the card scanners by whipping up his own, built from a CD drive and an Arduino.
The card [Rahul] used, an A1c card that tests for glucose bound to hemoglobin, has a few lines on the card that fluoresce with different intensities depending on the test results. This can be easily read with a photodiode connected to an Arduino. The mechanical part of the build consisted of an old CD drive with a 3D printed test strip adapter. Operation is very simple – just put the test strip in the test strip holder, press a button, and the results of the test are transmitted over Bluetooth.
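Reading the strip reduces to sampling the photodiode as the CD sled steps across the card and picking out the fluorescent line peaks. The sketch below is our guess at the general approach, not [Rahul]’s actual firmware: given a list of ADC samples, it reports the position and intensity of each peak above a background threshold.

```python
def find_line_peaks(samples, background=50):
    """Return (index, value) for each local intensity peak above the
    background level -- ideally one peak per fluorescent line on the
    test card."""
    peaks = []
    for i in range(1, len(samples) - 1):
        s = samples[i]
        # A peak: above background and higher than both neighbors.
        if s > background and s > samples[i - 1] and s >= samples[i + 1]:
            peaks.append((i, s))
    return peaks
```

The relative peak intensities would then be mapped to a test result using the card’s calibration curve before being sent over Bluetooth.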
Not only is [Rahul]’s build extremely simple, it’s also genuinely useful and was enough to net him the ‘Most Innovative Project’ prize at the hackathon in his native Singapore.
Let’s face it, most of the time we’re hacking for no other reason than sheer enjoyment. So we love to see hacks come about that can really make a difference in people’s lives. This time around it’s a video game designed to exercise your eyes. [James Blaha] has an eye condition called Strabismus which is commonly known as crossed-eye. The issue is that the muscles for each eye don’t coordinate with each other in the way they need to in order to produce three-dimensional vision.
Recent research (linked in the reference section of [James’] post) suggests that special exercises may be able to train the eyes to work correctly. He’s been working on developing a video game to promote this type of training. As you can see above, the user (patient?) wears an Oculus Rift headset, which makes it possible to show each eye slightly different images, while using a Leap Motion controller for VR interaction. If designed correctly, and paired with the addictive qualities of games, this may be just what the doctor ordered. You know what they say, practice makes perfect!
Continue reading “Video Gaming to Fix Eye Ailments”
You’re probably wondering why [Eddy], pictured above, decided to clamp two CPU cooling blocks to his torso. We were a bit concerned ourselves. As it turns out, [Eddy] has managed to construct his own Cryolipolysis device, capable of delivering targeted sub-zero temperatures to different parts of the body using a technique more popularly known as “Coolsculpting.”
Cryolipolysis is a non-surgical method of controlled cooling that exposes fat cells to cold temperatures while also creating a vacuum to limit blood flow to the treated area. [Eddy’s] challenge was to discover exactly how cold to make the treatment surfaces—a secret closely guarded by the original inventors. After digging through the original patent and deciding on a range between -3C and 0C, [Eddy] began cobbling together this medical masterpiece and designing a system capable of controlling it.
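Holding a cold plate inside a narrow band like -3C to 0C is a classic bang-bang control problem with hysteresis. Here’s a minimal sketch of the control decision (our own illustration; [Eddy]’s Arduino code may well differ): the cooler switches on above the upper bound, off below the lower bound, and otherwise holds its previous state so the relay doesn’t chatter.

```python
def cooler_state(temp_c, was_on, low=-3.0, high=0.0):
    """Bang-bang control with hysteresis for the cold plate.

    Returns True if the cooler should run. Turning on only above
    `high` and off only below `low` prevents rapid switching while
    the temperature sits inside the target band.
    """
    if temp_c > high:
        return True        # too warm: cool harder
    if temp_c < low:
        return False       # too cold: back off before frostbite
    return was_on          # inside the band: keep doing what we were doing
```

Called once per control loop with the latest thermistor reading, this keeps the plate oscillating gently between the two bounds.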
His finished build consists of a simple three-button interface and accompanying LCD screen, both wired to an Arduino, allowing the user to adjust temperatures and keep tabs on a session’s time. Unfortunately, results can take several months to appear, so [Eddy] has no idea whether his creation works (despite having suffered a brush with frostbite and some skin discoloration—yikes!). You can pick through a gigantic collection of photos and detailed information over at [Eddy’s] project blog, then stick around for a video from an Australian news program that explains the Coolsculpting process. Need some additional encouragement to experiment on yourself? You can always strap some electrodes to your head and run current through them. You know, for science.
Continue reading “DIY Coolsculptor Freezes Fat with Cryolipolysis”