Husband Uses MRI Images to 3D Print Wife’s Skull and Tumor

[Michael Balzer] shows us that you are your own best advocate when it comes to medical care – and that being able to print models of your own tumors is a bonus. [Michael’s] wife, Pamela, had been recovering from a thyroidectomy when she started getting headaches. She sought a second opinion after the first radiologist dismissed the MRI scans of her head – and learned she had a 3 cm tumor, a meningioma, behind her left eye. [Michael], host of All Things 3D, asked for the DICOM files (the standard medical imaging format) from her MRI. At a follow-up scan, the tumor appeared to have grown aggressively, but it was a false alarm: when [Michael] compared the two sets of DICOM images in Photoshop, he found the tumor had not actually grown at all. It only looked that way because the radiologist had taken the second scan at a different angle! Needless to say, the couple was not pleased with this misdiagnosis.

However, the meningioma was still causing serious problems for Pamela. She was at risk of losing her sight, so she started researching the surgery required to remove the tumor. The most common approach is a craniotomy: the skull is sawed open and the brain physically lifted in order to access the tumor beneath it. Not surprisingly, moving the brain the wrong way carries a high risk of permanent nerve damage and, with it, loss of smell, taste, or sight. Pamela decided to look for a less invasive alternative. [Michael] created a 3D print of her skull and meningioma from her MRIs. He used InVesalius, free software designed to convert 2D DICOM slices into a 3D model, then uploaded the 3D-rendered skull to Sketchfab to share it with potential neurologists. Once a neurologist was found who was willing to consider an alternative surgery, [Michael] printed the skull and sent it to the doctor. The print was integral in planning out the novel procedure, in which a micro drill was inserted through the left eyelid to access the tumor. In the end, 95% of the tumor was removed with minimal scarring, and Pamela’s eyesight was spared.

If you want to print your own MRI or CT scans, whether for medical use or to make a cool mug with your own mug, there are quite a few programs out there that can help. Besides the aforementioned InVesalius, there are DeVIDE, Seg3D, and ImageVis3D, plus MeshLab or MeshMixer for cleaning up the resulting mesh.
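The basic pipeline those tools implement is simple enough to sketch in a few lines of Python. The snippet below is a minimal, hypothetical example (the folder name, the threshold, and the choice of libraries are our assumptions, not anything from [Michael’s] workflow): it stacks a DICOM series into a volume with pydicom, pulls an isosurface out with scikit-image’s marching cubes, and writes a printable STL with numpy-stl.

```python
# A minimal sketch of the DICOM-to-printable-mesh pipeline, assuming a folder of
# slices and the pydicom, scikit-image, and numpy-stl packages; the path and the
# threshold are placeholders to tune for your own scan.
import glob

import numpy as np
import pydicom
from skimage import measure
from stl import mesh

# Read every slice in the series and sort them along the scan axis
slices = [pydicom.dcmread(path) for path in glob.glob("scan/*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))

# Stack the 2D pixel arrays into a single 3D volume
volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)

# Extract an isosurface; 300 is a rough bone threshold for CT and will need
# adjusting for MRI, which isn't calibrated in Hounsfield units
verts, faces, _normals, _values = measure.marching_cubes(volume, level=300)

# Pack the triangles into an STL file ready for a slicer
skull = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
for i, face in enumerate(faces):
    skull.vectors[i] = verts[face]
skull.save("skull.stl")
```

Feed skull.stl to MeshLab or MeshMixer for cleanup, then to your slicer, and you are most of the way to a model like Pamela’s.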

[via Make]

Smart CPR Dummy Makes Sure you Do it Right

Have you ever taken a First Aid & CPR training course? Don’t you just love the realism of the mannequins you get to practice on? [Park, Qurashi, & Chen], students at Cornell University, thought the dummies could use an intelligent upgrade.

The build is the final project for their electrical and computer engineering course, ECE 4760: a budget-friendly, not-so-dumb CPR dummy based on the venerable ATmega1284 microcontroller.

The dummy can sense when chest compressions are given, whether the nose is plugged properly when breaths are given, and whether the head is tilted back far enough to open the airway; it even uses a microphone to detect whether breaths are delivered properly! While it does this, LED eyes and an LCD screen provide training feedback to the student. Once students are sufficiently practiced, a “real” mode withholds all feedback to make sure they have truly learned the technique. Continue reading “Smart CPR Dummy Makes Sure you Do it Right”
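To give a feel for the kind of feedback logic involved, here is a toy sketch in Python rather than the students’ AVR firmware, with all names and numbers as illustrative assumptions: it estimates compression rate from the timestamps of detected compressions and compares it against the commonly taught 100–120 compressions per minute.

```python
# Toy CPR-rate feedback (not the students' firmware): estimate the compression
# rate from recent compression timestamps and grade it against 100-120 per minute.
from collections import deque

class CompressionMonitor:
    def __init__(self, window=5):
        # keep timestamps (in seconds) of the most recent compressions
        self.times = deque(maxlen=window)

    def compression_detected(self, t):
        self.times.append(t)

    def rate_bpm(self):
        if len(self.times) < 2:
            return None
        span = self.times[-1] - self.times[0]
        return 60.0 * (len(self.times) - 1) / span if span > 0 else None

    def feedback(self):
        rate = self.rate_bpm()
        if rate is None:
            return "keep going"
        if rate < 100:
            return "push faster"
        if rate > 120:
            return "slow down slightly"
        return "good rate"

# Example: one compression every 0.5 s works out to 120 per minute
monitor = CompressionMonitor()
for i in range(6):
    monitor.compression_detected(i * 0.5)
print(monitor.rate_bpm(), monitor.feedback())  # ~120.0, "good rate"
```

On the real dummy the same sort of check would run on the ATmega1284, with the verdict driving the LCD and the LED eyes.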

Kid Designs His Own Prosthetic Arm at a Summer Camp

Ever heard of the summer camp called Superhero Cyborgs? It’s where [Coby Unger] met nine-year-old [Aidan Robinson] and helped him design his very own custom prosthetic arm.

The camp is put on by KIDmob for kids who have various limb disabilities, and gives them the tools and guidance to make their very own prosthetics. Some of the designs the children come up with are cool, useful, and pretty, if not always terribly functional. [Aidan’s] designs, though, really intrigued [Coby], who is a designer and part of the staff at Pier 9, a world-class fabrication facility (and makerspace) run by Autodesk.

There are a lot of problems with prosthetics for children: they’re very expensive, kids don’t stay the same size, and despite the cost they don’t necessarily work that well. [Aidan] had a few commercial options but didn’t like any of them, so much so that he preferred not to wear them at all. When he attended the camp, though, he realized he could design a prosthetic he’d actually want to wear.

Continue reading “Kid Designs His Own Prosthetic Arm at a Summer Camp”

Thalmic Labs Shuts Down Free Developer Access. Update: It’s Back Again

The Thalmic Myo is an electronic arm band with an IMU and myoelectric sensors, able to measure the orientation and muscle movements of an arm. This device has uses ranging from prosthetics to Minority Report-style user interfaces. Thalmic is also a Y Combinator company, with $15 million in funding and tech press gushing over the possible uses of this futuristic device. Truly, a remarkable story for the future of user interfaces and pseudo-medical devices that can get around most FDA regulations.

A few months ago, Thalmic released a firmware update to the Myo that blocks raw access to the myoelectric sensors. Anyone wanting to develop for the Myo now needs to submit an application and pay Thalmic and their investors a pound of flesh – up to $5000 for academic institutions. The current version of the firmware only provides access to IMU data and ‘gestures’ – not the raw muscle data that would be invaluable when researching RSI detection, amputee prosthetics, or a hundred other ideas floating around the Thalmic forums.

Thalmic started their company with the idea that an open SDK would be best for the community, with access to the raw sensor data available in all but the latest version of the firmware. A few firmware revisions ago, Thalmic removed access to this raw data, breaking a number of open source projects used by researchers and anyone else experimenting with the Thalmic Myo. Luckily, someone smart enough to look at version numbers has come up with an open library to read the raw sensor data. It works well, and the official position of Thalmic is that raw sensor data will be unavailable in the future. If you want to develop something with the Myo, this library just saved your butt.

Thalmic will have an official statement on access to raw sensor data soon.

Quick aside, but if you want to see how nearly every form of media is crooked, try submitting this to Hacker News and look at the Thalmic investors. Edit: don’t bother, we’re blacklisted or something.

Update: Thalmic has updated their policy and will release a firmware version later on that restores access to the raw EMG sensor data. Their reasons for removing it in the first place were twofold:

  • Battery life. Streaming raw data out of the armband takes a lot of power; apparently, figuring out ‘gestures’ on the uC and sending only those saves a good deal of it (a toy sketch of that sort of on-device classification follows this list).
  • User experience. EMG data differs from person to person and is hard to interpret.
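For the curious, here is what “figuring out gestures on the uC” can look like in its simplest form. This is a hypothetical Python illustration, not Thalmic’s classifier: it reduces a window of multi-channel EMG samples to an RMS activity level and applies a threshold, the kind of tiny computation that is cheap to do on the armband compared with streaming every raw sample over Bluetooth.

```python
# Toy on-device gesture detection, standing in for what the armband might do
# internally; channel count, window size, and threshold are all assumptions.
import numpy as np

def rms_envelope(window):
    """Root-mean-square activity per channel for one window of raw EMG samples."""
    return np.sqrt(np.mean(np.square(window.astype(float)), axis=0))

def classify(window, threshold=30.0):
    """Crude detector: call it a 'fist' if overall muscle activity is high."""
    return "fist" if rms_envelope(window).mean() > threshold else "rest"

# Simulated data: 50 samples x 8 channels of quiet noise, then strong activity
rng = np.random.default_rng(0)
print(classify(rng.normal(0, 5, size=(50, 8))))   # "rest"
print(classify(rng.normal(0, 60, size=(50, 8))))  # "fist"
```

Sending the single word “fist” a few times a second is obviously far lighter on the battery and the radio link than shipping every raw sample off the armband, though, as the second bullet notes, a threshold like this would need per-user calibration.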


Nanobots Swim like Scallops in Non-Newtonian Fluids

The idea of using nanobots to treat diseases has been around for years, though it has yet to be realized in any significant manner. Inspired by Purcell’s Scallop theorem, scientists from the Max Planck Institute for Intelligent Systems have created their own version of the swimmer. They designed a “micro-scallop” that can propel itself through non-Newtonian fluids, which is what most biological fluids happen to be.

The scientists decided on constructing a relatively simple robot: two rigid “shells” and a flexible connecting hinge. They 3D-printed a negative mold of the structure and filled it with a polydimethylsiloxane (PDMS) solution mixed with fluorescent powder to enable detection. Once cured, the nanobot measured 800 microns wide by 300 microns thick. It’s worth noting that it did not have a motor. Instead, two neodymium magnets were glued onto the outside of each shell, and when a gradient-free external magnetic field was applied, the magnets made the shells open and close. These reciprocal movements resulted in net propulsion through non-Newtonian media. The scientists also tested it in glycerol, an example of a Newtonian fluid, and, confirming Purcell’s Scallop theorem, the nanobot did not move: at these scales a time-reversible open-and-close stroke gets you nowhere in a Newtonian fluid, but a viscosity that changes with shear rate breaks that symmetry. They took videos of the nanobot in motion using a stereoscope, a digital camera with a colored-glass filter, and an ultraviolet LED to make the fluorescent nanobot detectable.
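The scallop-theorem argument can be made concrete with a toy model. In the sketch below (purely an illustrative assumption, not the hydrodynamics from the Max Planck paper), the body’s speed is taken to scale as a power n of the shell’s opening or closing rate: n = 1 stands in for a Newtonian fluid, anything else for a shear-dependent one. A fast-open, slow-close stroke then cancels out exactly only in the Newtonian case.

```python
# Toy model of Purcell's scallop theorem: assume body speed scales as a signed
# power of the flapping rate, v = k * sign(w) * |w|**n. For n = 1 (Newtonian)
# a reciprocal fast-open / slow-close stroke cancels exactly; for n != 1
# (shear-thinning or shear-thickening) it does not, so the swimmer drifts.
def signed_power(w, n):
    return (abs(w) ** n) if w >= 0 else -(abs(w) ** n)

def net_displacement_per_cycle(n, sweep=1.0, t_open=0.2, t_close=0.8, k=1.0):
    """Displacement over one cycle: open the shells quickly, close them slowly."""
    w_open = sweep / t_open      # opening rate
    w_close = -sweep / t_close   # closing rate (opposite sign)
    return k * (signed_power(w_open, n) * t_open + signed_power(w_close, n) * t_close)

print(net_displacement_per_cycle(n=1.0))  # ~0: Newtonian, no net motion
print(net_displacement_per_cycle(n=0.5))  # ~ -0.45: shear-dependent, net drift
```

The direction and size of the drift depend on the details of the fluid and the stroke, but the point is the symmetry argument: once the fluid’s response depends on how fast the shells move, the two halves of the stroke no longer cancel.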

The scientists did not indicate any further studies regarding this design. Instead, they hope it will aid future researchers in designing nanobots that can swim through blood vessels and body fluids.  We don’t know how many years it will be before this becomes mainstream medical science, but we know this much: we will never look at scallops the same way again!

Continue reading “Nanobots Swim like Scallops in Non-Newtonian Fluids”

Medical Tricorder Mark I

A handheld tricorder is as good a reason as any to start a project. The science-fiction-derived form factor provides an opportunity to work on a lot of different areas of hardware development: portable power, charging, and communications between sensors and microcontroller. And of course you need a user interface so that the values being returned have some meaning for the user.

[Marcus B] has done a great job with all of this in his first version of a medical tricorder. The current design hosts two sensors: one measures skin temperature using infrared, the other is a pulse sensor.

For us it’s not the number of sensors that makes something a “tricorder” but the ability of the device to use those sensors to make a diagnosis (or to give the user enough hints to come to their own conclusion). [Marcus] shares similar views, and with that in mind has designed in a real-time clock and an SD card slot. These can be used to log sensor data over time, which can then be checked against a known set of common diagnostic parameters to suggest possible ailments.
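As a rough illustration of that log-then-check idea, here is a short Python sketch (our assumption of how it could work, not [Marcus B]’s Arduino firmware): it appends timestamped readings to a CSV file standing in for the SD card and flags anything outside a couple of widely known resting ranges.

```python
# Hypothetical log-and-flag sketch: timestamped readings go to a CSV file, and
# simple thresholds (38 C for fever, 60-100 bpm typical resting pulse) hint at
# possible issues. The file name and example values are placeholders.
import csv
from datetime import datetime

LOG_FILE = "tricorder_log.csv"  # stands in for the SD card

def log_reading(skin_temp_c, pulse_bpm):
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), skin_temp_c, pulse_bpm])

def hints(skin_temp_c, pulse_bpm):
    notes = []
    if skin_temp_c >= 38.0:   # commonly used fever threshold
        notes.append("possible fever")
    if pulse_bpm > 100:       # above the typical 60-100 bpm resting range
        notes.append("elevated resting pulse")
    elif pulse_bpm < 60:
        notes.append("low resting pulse")
    return notes or ["nothing obviously out of range"]

reading = (38.4, 104)         # example values
log_reading(*reading)
print(hints(*reading))        # ['possible fever', 'elevated resting pulse']
```

An infrared skin reading runs cooler than core body temperature, so in practice the thresholds (and any real diagnostic claims) would need far more care than this.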

Looking at the image above you may be wondering which chip is the microcontroller. This build is actually a shield for an Arduino hiding underneath.

There’s a demonstration video after the break. And if you find this impressive you won’t want to miss the Open Source Science Tricorder which is one of the finalists for the 2014 Hackaday Prize.

Continue reading “Medical Tricorder Mark I”

UC Davis Researchers Use Light to Erase Memories in Genetically Altered Mice

Much like using UV light to erase data from an EPROM, researchers from UC Davis have used light to erase specific memories in mice. [Kazumasa Tanaka, Brian Wiltgen and colleagues] used optogenetic techniques to test current ideas about memory retrieval. Optogenetics, which has been featured on Hackaday before, is the use of light to control specific neurons (nerve cells) that have been genetically sensitized to light, allowing the effects to be observed in real time.

For their research, [Kazumasa Tanaka, Brian Wiltgen and colleagues] created genetically altered mice whose activated neurons expressed GFP, a protein that fluoresces green. This allowed the researchers to easily locate neurons and track which ones responded to learning and memory stimuli. The neurons also produced an additional protein that made it possible to “switch them off” in response to light. This let the researchers determine which specific neurons were involved in the learning and memory pathways, as well as study the behavior of the mice when certain neurons were active or not.

Animal lovers may want to skip the rest of this paragraph. The mice were subjected to mild electric shocks after being placed in a cage. They were trained so that when they were put in the cage again, they remembered the previous shock and would freeze in fear. However, when specific neurons in the hippocampus (a structure in the brain) were exposed to light transmitted through fiber optics (likely through a hole in each mouse’s skull), the mice happily scampered around the cage, with no memory of the earlier shock to terrify them. The neurons that stored the memory of the shock had been “turned off” by the light exposure.

Continue reading “UC Davis Researchers Use Light to Erase Memories in Genetically Altered Mice”