This hack is a strange mixture of awesome and ghoulish. [Andrew Sink] created a 3D-printed version of his brain. He received a CD from an MRI session containing the data obtained by the scan and, not knowing what else to do with it, turned it into a model of his brain.
Out of a number of images, some missing various parts of his head, he selected the one that was most complete. He brought this image into OsiriX, a Mac program for handling DICOM files, and spent an hour dissecting away his own eyes, skull, and skin. An STL file containing his brain was brought over to Netfabb to see how it looked. There was still more dissection needed, so [Andrew] turned to Blender, where more bits and pieces of his skull’s anatomy were pared away until just the brain remained. But there were some lesions at the base of the brain that needed to be filled. With the help of [Cindy Raggio], these were filled in to complete the 3D image.
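The core of that workflow is segmentation: deciding which voxels of the scan volume belong to the brain and which are skull, skin, or background. As a rough illustration (not [Andrew]’s actual process, which was done by hand in OsiriX and Blender), here is a minimal intensity-thresholding sketch using only NumPy, with a synthetic volume standing in for the DICOM data:

```python
import numpy as np

# Synthetic stand-in for a scan volume: in practice you would load the
# DICOM slices (e.g. with a library such as pydicom) and stack them
# into a 3D array of intensity values.
rng = np.random.default_rng(0)
volume = rng.integers(0, 100, size=(64, 64, 64)).astype(float)
volume[16:48, 16:48, 16:48] += 200  # bright "brain" region in the center

# Crude segmentation: keep only voxels above an intensity threshold,
# analogous to dissecting away skin and skull before exporting an STL.
threshold = 150
mask = volume > threshold
brain_voxels = np.argwhere(mask)  # (N, 3) coordinates of kept voxels

print(mask.sum(), brain_voxels.shape)
```

Real tools do far more than this (region growing, manual brushes, surface meshing), but a threshold pass like the above is the usual first cut at separating tissue types by intensity.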
The usual workflow sent it off to the 3D printer to be produced at 0.2 mm resolution. Printing at full size took only 49 hours. This brain was printed for fun, but we’ve seen other 3D-printed brain hacks that were used to save lives. How many people do you know who have a spare brain sitting around?
Think of Virtual Reality and it’s mostly fun and games that come to mind. But there are a lot of useful, real-world applications that will soon open up exciting possibilities in areas such as medicine. [Victor] from the Shackspace hackerspace in Stuttgart built an Augmented Reality ultrasound scanning application to demonstrate one such possibility.
But first off, we cannot get over the fact that it’s possible to go dumpster diving and return with a functional ultrasound machine! That’s what member [Alf] turned up with one day. After some initial excitement at its novelty, it was relegated to a corner to gather dust. When [Victor] spotted it, he asked to borrow it for a project. Shackspace was happy to donate it to him and free up some space. Some time later, [Victor] showed off what he had done with the ultrasound machine.
As soon as the ultrasound scanner registers with the AR app (possibly using the image taped to the scan sensor as a tracking marker), the scan data is projected virtually under the echo sensor. There isn’t much detail on how he did it, but it was built using the Vuforia SDK, which helps build applications for mobile devices and digital eyewear, in conjunction with the Unity 5 cross-platform game engine. Check out the video to see it in action.
When [Cassidy and Chad Lexcen]’s twin daughters were born in August, smaller twin [Teegan] was clearly in trouble. Diagnostics at the Minnesota hospital confirmed that she had been born with only one lung and half a heart. [Teegan]’s parents went home and prepared for the inevitable, but after two months, she was still alive. [Cassidy and Chad] started looking for second opinions, and after a few false starts, [Teegan]’s scans ended up at Miami’s Nicklaus Children’s Hospital, where the cardiac team looked them over. They ordered a 3D print of the scans to help visualize possible surgical fixes, but the 3D printer broke.
Not giving up, they threw [Teegan]’s scans into Sketchfab, slapped an iPhone into a Google Cardboard that one of the docs had been playing with in his office, and were able to see a surgical solution to [Teegan]’s problem. Not only did Cardboard make up for the wonky 3D printer, it surpassed it: the 3D print would only have shown the heart itself, while the VR images showed the heart in the context of the rest of the thoracic cavity. [Dr. Redmond Burke] and his team were able to fix [Teegan]’s heart in early December, and she should be able to go home in a few weeks to join her sister [Riley] and make a complete recovery.
We love the effect that creative use of technology can have on our lives. We’ve already seen a husband use the same Sketchfab tool to find a neurologist who could remove his wife’s brain tumor. This is a great example of doctors doing whatever it takes to better leverage the data at their disposal when making important decisions.
There is a fascinating brain reaction known as the McCollough Effect, which is like side-loading malicious code through your eyeballs. Although this looks and smells like an optical illusion, the science would argue otherwise. What Celeste McCollough observed in 1965 can be described as a contingent aftereffect, although we call it “The McCollough Effect” because McCollough was the first to recognize the phenomenon. It’s something that can’t be unseen… sometimes affecting your vision for months!
I am not suggesting that you experience the McCollough Effect yourself. We’ll look at the phenomenon behind the McCollough Effect, and it can be understood without subjecting yourself to it. If you must experience the McCollough Effect, you do so at your own risk (here it is presented as a video). But read on to understand what is happening before you take the plunge.
Measuring the body’s electrical signals is a neat trick… if you can get your equipment dialed in well enough to establish dependable measurements. The technique is called Surface ElectroMyography (SEMG), though you’ll hear many call it ECG. They’re essentially the same technology: ElectroCardioGraph instruments monitor the activity of the heart, while SEMG instruments monitor the electrical signals used to control other muscles. Both types of hardware amount to an instrumentation-style amplifier and some form of I/O or display.
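For a sense of the numbers involved, the classic three-op-amp instrumentation amplifier at the heart of these front ends has a differential gain of G = (1 + 2·Rf/Rg)·(R3/R2). A quick sketch with assumed resistor values (illustrative only, not taken from any particular instrument):

```python
def in_amp_gain(r_feedback, r_gain, r3_over_r2=1.0):
    """Differential gain of the classic three-op-amp instrumentation
    amplifier: G = (1 + 2*Rf/Rg) * (R3/R2)."""
    return (1.0 + 2.0 * r_feedback / r_gain) * r3_over_r2

# Assumed example values: 49.5 kΩ feedback resistors and a 1 kΩ
# gain-set resistor give a gain of 100, so a ~1 mV surface EMG
# signal comes out at roughly 100 mV, ready for an ADC.
gain = in_amp_gain(49.5e3, 1e3)
print(gain)  # 100.0
```

One nice property of this topology is that the gain is set by a single resistor (Rg), so a design can be retuned for cardiac versus muscle signals without touching the matched resistor pairs.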
This topic has been in my back pocket for many months now. Back in May we Hackaday’ites descended on New York City for the Disrupt NY Hackathon. We arrived a day or so early so that we could better peruse the Korean BBQ joints and check out the other electronics that NY has to offer. On Saturday we gathered around, each shouting out his or her t-shirt size as we covered up our black Hackaday logo tees with maroon ones (sporting the Hackaday logo, of course) for a 24-hour craze of hardware hacking.
There were two individuals at our tables who were both hacking away on hardware to measure, in some form or another, the electrical fields produced by the body’s muscles. The electrical signals measured at the skin are small and need careful consideration to recover despite the noise. This is a fun experiment that lets you work with both instrumentation amplifiers and op-amps to achieve a usable signal from the movement of your body.
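Once the amplified signal has been digitized, a common software step is envelope detection: rectify the signal and smooth it so that a muscle contraction stands out cleanly from the resting baseline. Here is a sketch on synthetic data (the sample rate, amplitudes, and smoothing window are all assumptions for illustration, not values from either hacker’s project):

```python
import numpy as np

fs = 1000  # sample rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic "EMG": a burst of broadband activity during a contraction
# (0.5 s to 1.5 s) sitting on top of low-level baseline noise.
rng = np.random.default_rng(1)
signal = 0.05 * rng.standard_normal(t.size)
burst = (t > 0.5) & (t < 1.5)
signal[burst] += 1.0 * rng.standard_normal(burst.sum())

# Classic envelope detection: full-wave rectify, then smooth with a
# moving-average window (here 100 ms).
rectified = np.abs(signal)
n = int(0.1 * fs)
envelope = np.convolve(rectified, np.ones(n) / n, mode="same")

# Resting level vs. contraction level of the envelope:
print(envelope[int(0.1 * fs)], envelope[int(1.0 * fs)])
```

The envelope during the contraction comes out many times larger than at rest, which is what makes simple threshold-based gesture detection workable even with a noisy front end.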
Normally, strain sensors are limited in their flexibility by the underlying substrate. This led researchers at the University of Manitoba to an off-the-wall solution: mixing carbon nanotubes into a chewing-gum base. You can watch their demo video below the break.
The procedure, documented with good scientific rigor, is to have a graduate student chew a couple sticks of Doublemint for half an hour, and then wash the gum in ethanol and dry it out overnight. Carbon nanotubes are then added, and the gum is repeatedly stretched and folded, like you would with pizza dough, to align the ‘tubes. After that, just hook up electrodes and measure the resistance as you bend it.
The obvious advantage of a gum sensor is that it’s slightly sticky and very stretchy. The team says it works when stretched up to five times its resting length. Try that with your Power Glove.
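A standard way to characterize a resistive strain sensor like this is its gauge factor: the fractional change in resistance divided by the applied strain. A tiny sketch with made-up numbers (illustrative only, not measurements from the Manitoba team’s paper):

```python
def gauge_factor(r0, r_stretched, strain):
    """Gauge factor of a resistive strain sensor: (ΔR/R0) / strain."""
    return ((r_stretched - r0) / r0) / strain

# Assumed example: resistance rises from 10 kΩ to 14 kΩ when the gum
# is stretched to twice its resting length (strain = 1.0).
gf = gauge_factor(10e3, 14e3, 1.0)
print(gf)  # 0.4
```

Hooking the sensor into one arm of a Wheatstone bridge then turns that resistance change into a voltage a microcontroller can read directly.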
We had some incredible speakers at the Hackaday SuperConference. One of the final talks was given by [Kay Igwe], a graduate electrical engineering student at Columbia University. [Kay] has worked in nanotechnology as well as semiconductor manufacturing for Intel. These days, she’s spending her time playing games – but not with her hands.
Many of us love gaming, and probably spend way too much time on our computers, consoles, or phones playing games. But what about people who don’t have the use of their hands, such as ALS patients? Bringing gaming to the disabled is what prompted [Kay] to work on Control iT, a brain interface for controlling games. Brain-computer interfaces evoke images of electroencephalography (EEG) machines; usually that means tons of electrodes, gel in your hair, and data buried in the noise.
[Kay Igwe] is exploring a very interesting phenomenon that uses flashing lights to elicit very specific, easy-to-detect brain waves. This type of interface is very promising and is the topic of the talk she gave at this year’s Hackaday SuperConference. Check out the video of her presentation, then join us after the break as we dive into the details of her work.
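The reason flashing lights help is that a flickering target drives a measurable oscillation in the visual cortex at the flicker frequency, so instead of digging a faint signal out of broadband noise, the software only has to find a spectral peak. A minimal sketch of that detection step on synthetic data (the sample rate, flicker frequencies, and amplitudes here are all assumptions, not details from [Kay]’s system):

```python
import numpy as np

fs = 250                       # EEG sample rate in Hz (assumed)
t = np.arange(0, 4.0, 1 / fs)
flicker_hz = 15                # flash rate of the target being watched (assumed)

# Synthetic EEG: a small oscillation at the flicker frequency buried in noise.
rng = np.random.default_rng(2)
eeg = 0.5 * np.sin(2 * np.pi * flicker_hz * t) + rng.standard_normal(t.size)

# Decide which on-screen target the "user" is looking at by comparing
# spectral power at each target's flicker frequency.
spectrum = np.abs(np.fft.rfft(eeg)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
candidates = [10, 12, 15, 20]
powers = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in candidates}
detected = max(powers, key=powers.get)
print(detected)  # 15
```

With one flicker frequency assigned per game control, picking the strongest peak becomes a hands-free button press.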