One of the modern marvels in our medical toolkit is ultrasound imaging. One of its drawbacks, however, is that it displays 2D images. How expensive do you think it would be to retrofit an ultrasound machine to produce 3D images? Try a $10 chip and pennies worth of plastic.
While — of all things — playing the Wii with his son, [Joshua Broder, M.D.], an emergency physician and associate professor of surgery at [Duke Health], realized he could port the Wii’s gyroscopic sensor to ultrasound technology. He did just that with the help of [Matt Morgan, Carl Herickhoff and Jeremy Dahl] from [Duke’s Pratt School of Engineering] and [Stanford University]. The team mounted the sensor onto the side of the probe with a 3D-printed collar. The sensor relays its orientation data to a computer running software that sutures the images together into a complete 3D image in near real-time, turning a $50,000 ultrasound machine into its $250,000 equivalent.
[Dr. Broder] is eager to point out that it compares to MRI and CT imaging in quality, but with fewer issues: it reduces error in interpreting the images, and makes advanced imaging available in rural or developing areas. This is also useful when MRIs and CTs are risky due to medical history or for newborn children, and in critical situations where prep for an MRI or CT would take too much time.
It is entirely possible to hack together your own ultrasound machine, and even add some augmented reality components sure to cause a double-take.
[Thanks for the tip, Kevin Qes Huang!]
Where is the link to the $10 chip?
Looks like a standard COTS (commercial off-the-shelf) IMU (inertial measurement unit), so some type of gyroscope and accelerometer module, something like an “MPU-6050 Accelerometer + Gyro”.
The photo above is a STEVAL-MKI121V1 (which has a 6/9-axis module mounted in the middle), but I suspect that the $10 reference is just to the use of IMU units in general rather than any specific kit of parts.
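If anyone wants to play with the idea, reading raw data off an MPU-6050-class part is only a few lines of Python on a Pi. A rough sketch (register addresses from the MPU-6050 datasheet; assumes the default 0x68 address and the power-on ±2 g / ±250 °/s ranges):

```python
from smbus2 import SMBus   # pip install smbus2

MPU_ADDR   = 0x68   # default I2C address (AD0 pin low)
PWR_MGMT_1 = 0x6B
ACCEL_XOUT = 0x3B
GYRO_XOUT  = 0x43

def to_int16(hi, lo):
    """Combine two register bytes into a signed 16-bit value."""
    val = (hi << 8) | lo
    return val - 65536 if val & 0x8000 else val

with SMBus(1) as bus:
    bus.write_byte_data(MPU_ADDR, PWR_MGMT_1, 0)              # wake the chip from sleep
    a = bus.read_i2c_block_data(MPU_ADDR, ACCEL_XOUT, 6)
    g = bus.read_i2c_block_data(MPU_ADDR, GYRO_XOUT, 6)
    accel_g  = [to_int16(a[i], a[i + 1]) / 16384.0 for i in (0, 2, 4)]  # +/-2 g scale
    gyro_dps = [to_int16(g[i], g[i + 1]) / 131.0 for i in (0, 2, 4)]    # +/-250 deg/s scale
    print("accel (g):", accel_g, "gyro (deg/s):", gyro_dps)
```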
Nice, though I imagine the $250k machine updates in 3D in real time, not just a 2D slice.
I don’t think the GE and Toshiba machines normally found in pre-natal departments are quite at the $250k mark.
They ‘do’ 3D, though not real-time.
So unless this ‘does’ real-time pseudo-3D, I’d guess it just adds the feature to systems where ‘3D’ is an expensive add-on or not available.
In compensation for all those “not a hack” comments seen here:
THIS IS A REAL FRIGGIN’ HACK.
Sorry for shouting ;-)
I could have done it with a 555,
well, maybe several thousand…
B^)
I can name that tune in 2 Arduino libraries –
Shout approved! +1
I think the really interesting/secret-sauce stuff would be the software that takes the images and the gyroscope feedback and builds a 3D representation in near real-time.
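My guess is the geometric core is something like: rotate each frame’s pixel grid by the IMU orientation and splat the intensities into a voxel grid. A toy sketch of that step only (made-up pixel/voxel sizes, nearest-neighbour insertion, random data standing in for real frames; the actual software surely does far more filtering, interpolation and drift correction):

```python
import numpy as np
from scipy.spatial.transform import Rotation

VOXEL_MM = 0.5                                      # assumed voxel size
volume = np.zeros((200, 200, 200), dtype=np.uint8)  # accumulated 3D image

def insert_slice(vol, frame, quat_xyzw, pixel_mm=0.3):
    """Paint one 2D frame into the volume at the orientation reported by the IMU."""
    rows, cols = frame.shape
    # Pixel coordinates in the probe's own image plane (x = lateral, y = depth, z = 0),
    # with the probe pivot at the top-centre of the frame.
    c, r = np.meshgrid(np.arange(cols), np.arange(rows))
    plane = np.stack([(c - cols / 2) * pixel_mm,
                      r * pixel_mm,
                      np.zeros_like(c, dtype=float)], axis=-1).reshape(-1, 3)
    world = Rotation.from_quat(quat_xyzw).apply(plane)          # rotate into world space
    centre = np.array(vol.shape) / 2                            # keep the pivot mid-volume
    idx = np.rint(world / VOXEL_MM + centre).astype(int)        # nearest voxel
    ok = np.all((idx >= 0) & (idx < np.array(vol.shape)), axis=1)
    vol[idx[ok, 0], idx[ok, 1], idx[ok, 2]] = frame.reshape(-1)[ok]

# Toy usage: sweep a synthetic frame through a fan of orientations.
for angle in range(0, 90, 5):
    fake_frame = (np.random.rand(128, 128) * 255).astype(np.uint8)
    q = Rotation.from_euler('x', angle, degrees=True).as_quat()  # stand-in for the IMU reading
    insert_slice(volume, fake_frame, q)
```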
Agree, is there any mention of software availability?
Now add that to a Signos handheld ultrasound.
Clever marketing piece
^ this.
$10!!! makes great headlines, but the cost is never in the components – it’s in the validation of the hardware and software, testing and certification. Oh, and liability insurance.
ISO 13485 is your friend.
Having worked on software in two organizations under ISO 13485, I respectfully disagree that it is my friend.
This is just an add-on to an existing kit which already has testing, certification, and liability insurance. No modification is made to the active components of the machine which is where any worry about danger to the patient would be involved. That’s what makes it interesting and why it could be transformative. You don’t have to be so cynical about an actually clever hack.
I’ve thought of the exact same thing, using cheap bedside ultrasound (which does a 2D slice) to construct 3D images. Glad someone has done it, and I hope it becomes standard everywhere.
No reason 3D ultrasound imaging couldn’t become as standard and cheap as a stethoscope. It really ought to be, and would remove a bunch of guesswork from normal, everyday procedures.
It’d be REALLY cool if you could pair this with augmented reality (either using a display on the portable ultrasound or perhaps on the doctor’s glasses) allowing the doctor or nurse to easily see into the patient in real time without having to stare at a separate display. The technology should be seamless and transparent, not require a separate technician.
I’m not being cynical. It’s simply true. See the ISO comments and others.
And this isn’t a simple add-on when it comes to certification. I haven’t looked too closely at the output 3D, but what if the error on the additional sensors is too high and the algorithm gets things wrong, or otherwise drifts and a tumour is rendered smaller because of it? Whose fault is that?
Also, I’ve seen this hack with laser depth scanners (and thermal cameras) and the reconstruction wasn’t something I’d bet a person’s life on.
Incorrect. Would still require FDA / National Body assessment and approval where used as a medical diagnostic tool. Read the scope of the EU’s medical device directive. Read the scopes of your national version of the general standard IEC60601-1 and the particular standard IEC60601-2-37.
At least I am smart enough to know that I am stupid. And there are those too stupid to know that they are stupid.
This. The product affects the diagnostic images, therefore is a medical device and would need to be verified and validated under 60601 or equivalent and designed and manufactured under 13485, or equivalent and the software would need to be developed and verified under 62304, or equivalent.
It’s a fantastic hack and could be really useful in the developing world where budgets are limited and regulations take a back seat to on the ground pragmatism. Don’t expect to see one in a western hospital any time soon though!
No disagreement there, but if you can now sell a $250,000 piece of equipment for $200,000 it’s still saving a lot.
Two thoughts:
1) How easy is the 3D-printed collar and its electronics to sterilize, or to cover with a suitable protective sleeve, between patients?
2) How easy is it to explain to the coroner (and ambulance-chasing personal injury lawyers, if in the US): “Sorry, the diagnosis of the abdominal aortic aneurysm with an already poor prognosis was delayed; there was a cold solder joint in our lead-free add-on board, which has no scheduled maintenance regimen or diagnostic mode.”
Question 1 is silly. Create a collar from injection-molded plastic and it will survive an autoclave. Polypropylene (PP) and polypropylene copolymer (PPCO) can be autoclaved without any issues. You can make a hundred collars and keep them around.
I think you didn’t watch the video before asking question 2… It’s an add-on, so if there is a cold solder joint, your enhanced ultrasound machine just turns back into a regular one. Nobody will die. Add-on board dead? Take it off and proceed as if it never existed at all.
I believe they slip a plastic bag-type thing over the handheld before anything touches skin.
Only $10?
Yeah, only the HW but what about price of writing the software?
The software is left as an exercise to the reader.
I suspect that existing point-cloud processing solutions would be able to work with this data.
The awesome project strangled by FDA rules:
https://www.slicer.org/
A plugin would be trivial, given that its API is designed for 3D reconstruction.
A point cloud would be one of the worst ways to represent and process this kind of 2D source image data.
This is a thoroughly solved problem, and has been for well over a decade. A decent review is here: http://www.sciencedirect.com/science/article/pii/S0301562907001081
Ever notice how pay-walled articles don’t get cited as often….
I am not saying ambitious people are less intelligent, but the data seems to correlate with this negative trend.
Is there no way to access the raw data coming out of the probe? The ultrasound software has to be doing some processing to turn the ultrasound information into an image, and that’s going to destroy some information. There has to be a better way to get from an ultrasound measurement+orientation to a 3D model than going through an intermediate 2D stage, and point clouds have worked well in other areas of computer vision.
Oh blast. My apologies: I didn’t notice it was paywalled to most people. Damned Elsevier. They’re the worst of the lot.
dahud says: “Is there no way to access the raw data coming out of the probe?”
The raw data coming out of modern probes is baseband RF, typically around 40 MHz, 128 channels, 16 bits. It’s possible to catch it, but not easy, and almost impossible without the explicit cooperation of the manufacturer to gain access to the internal data busses. You can imagine the NDA involved.
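Back-of-the-envelope from those numbers: 40 MHz × 128 channels × 2 bytes per sample is on the order of 10 GB/s of raw data, so “catching it” means serious dedicated hardware, not a USB dongle.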
There is also an enormous amount of very esoteric and proprietary processing done to modern ultrasound data before it gets to the screen. It’s extremely difficult to come even close to the OEM image quality if you try to process the data yourself.
The ultrasound image data is intrinsically 2D (in most probes), so it’s far simpler to grab the image data (literally, with a frame grabber) and build the 3D volume out of the 2D slices. These are finite-thickness slabs of intensity data, not surfaces, so pointcloud representation is not appropriate, and inefficient to boot.
Ah, gotcha. I was thinking that the ultrasound probe gave a linear slice of response times along various vectors, similar to how ultrasonic sensors are often used in robotics.
It’s ironic that in the segment around 0:30-0:35 he is scanning a phantom (a dummy) with a dummy probe, using a teaching instrument. *That* probe is non-functional, and contains a 6-DOF position and orientation sensor that tells the display where the probe is, relative to the body. The display then presents a synthetic image in real time, drawn from a 3-D volume image.
That 3-D real-time technology is from around 1990. Back then we needed an Ascension Technologies Flock of Birds sensor and a dedicated $20k workstation to do it, though.
I vote this best hack of the year. I’ve wanted a 3D home medical scanner for the past few years to do self-diagnosis and to show my kids scans of themselves. That’d be amazing! We also make experimental lightfield interfaces in the startup I’m part of, and it’d be incredible to get near-real-time volumetric scans into those displays. But it’s always been such a pain to get the data. I can’t wait to try this out!
One could use the modern phone in their pocket, with its accelerometer and orientation sensors. It may be bulky, but it would remove the need for custom hardware.
The output of the sensors on most phones is atrocious. Trying to get precise data from a phone is an exercise in futility.
So the software to do the conversion is free in this $10 scenario, as is the filament and use of the 3D printer.
Not sure why the exaggeration is necessary.
The software is indeed free, or near enough. The problem is simple trigonometry, with a bit of noise filtering thrown in. Throw in some point-cloud processing techniques from projects like CloudCompare and you’re golden.
I suppose we could add $0.35 to the end of the price tag for the filament, if that’d make you feel better.
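And the “bit of noise filtering” can be as dumb as a complementary filter blending the gyro and accelerometer readings. A sketch (single axis shown for brevity; a real device needs full 3D orientation and would probably use a Madgwick or Kalman filter):

```python
import math

def complementary_filter(angle_deg, gyro_dps, accel_y_g, accel_z_g, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth, but drifts) with the accelerometer
    tilt (noisy, but drift-free) into a single angle estimate."""
    gyro_angle = angle_deg + gyro_dps * dt                         # integrate the gyro
    accel_angle = math.degrees(math.atan2(accel_y_g, accel_z_g))   # tilt from gravity
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Toy usage: 100 steps of made-up sensor readings at 100 Hz.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_dps=1.0, accel_y_g=0.02, accel_z_g=0.98, dt=0.01)
print(round(angle, 2))
```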
Pretty freakin sweet. Well done, Doc!
Can we just stop for a moment to appreciate the elegance of just hijacking the VGA stream to get the images, too? So many people would be invested in trying to gain access to the software, or do a screen grab on the host PC, and these guys are all “Nah, man, just snarf the VGA.” Also, his eyebrows DGAF.
It’s not really “hacking” the VGA stream; they are using an off-the-shelf video capture device. I think it is this one: https://www.epiphan.com/products/dvi2usb-3-0/
It is an excellent application though; capture the video output + sensor for the position/orientation + software to assemble
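If the capture box enumerates as a standard UVC video source (an assumption on my part; the Epiphan units also ship with their own SDK), grabbing the frames is only a few lines:

```python
import cv2

cap = cv2.VideoCapture(0)          # assumes the capture box shows up as video device 0
if not cap.isOpened():
    raise RuntimeError("capture device not found")

while True:
    ok, frame = cap.read()         # one BGR frame of the ultrasound machine's display
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # B-mode images are greyscale anyway
    # crop to the image region and hand `gray` to the reconstruction step here
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
```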
Very creative hacking! But I fear that a commercial implementation will also incur a generous patent licensing fee to Duke and/or Stanford.
Damned unlikely, since similar implementations have existed in the literature since the nineties, and exactly this approach was published in 2001-2002, and some current “3D enabled” commercial probes have accelerometers already built into them.
Summary: In an alternate Universe Johnny Knoxville became a doctor and created an inexpensive technique for generating pseudo 3D data from a 2D ultrasound machine.
I will not be the least bit surprised if this idea turns up in those $3,000 portable ultrasounds that you can buy in China.
They’ve existed for some time… https://www.alibaba.com/showroom/3d-portable-ultrasound-machine.html
It’s not like it’s a new idea.
The idea was patented by Hopkins (USPTO 8,914,245 B2) in 2014, but this is a neat implementation: https://patentimages.storage.googleapis.com/6c/3c/90/d0bbdf723226a5/US8914245.pdf