How A Quadriplegic Patient Was Able To Eat By Himself Again Using Artificial Limbs

Thirty years ago, [Robert “Buz” Chmielewski] suffered a surfing accident as a teenager, leaving him quadriplegic due to a C6 spinal cord injury. After becoming a participant in a brain-computer interface study at Johns Hopkins, he was recently able to feed himself using a pair of prosthetic arms. The most remarkable thing about these prosthetic arms is the neural link with [Buz’s] brain, which allows him not only to control the artificial arms, but also to feel what they are touching, thanks to a closed-loop system that transfers the limbs’ sensory input back to his brain.

The prosthetic limb in question is the Modular Prosthetic Limb (MPL) from the Johns Hopkins Applied Physics Laboratory (APL). The Johns Hopkins Medicine Brain-Computer Interface study began a year ago, when [Buz] had six microelectrode arrays (MEAs) implanted into his brain: half in the motor cortex and half in the sensory cortex. During the following months, the study focused on using signals from the first set of arrays to control actuators such as the MPL. The second set of arrays was used to study how the sensory cortex has to be stimulated for a patient to feel the artificial limb much as one feels a biological limb.

What makes this study so interesting is not only the closed-loop approach, which provides the patient with feedback on the position of the prosthetic and the pressure it exerts, but also that it involves both hemispheres of the brain. As a result, after only a year of the study, [Buz] was able to use two MPLs simultaneously to feed himself, a delicate and complicated task.
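To make the closed loop concrete, here is a minimal sketch of a single control tick: decode a movement command from motor-cortex firing rates, drive the limb, then map fingertip force back into a stimulation amplitude for the sensory array. Everything in it is an illustrative assumption — the linear decoder, the channel count, and the force-to-amplitude mapping are hypothetical stand-ins, not the encoding actually used in the Johns Hopkins study.

```python
import numpy as np

# Hypothetical dimensions: 96 recording channels in, a 3-DoF velocity command out.
N_CHANNELS = 96
N_DOF = 3

rng = np.random.default_rng(0)
# Stand-in for a decoder that would be trained on recorded motor-cortex data.
W_decode = rng.normal(scale=0.1, size=(N_DOF, N_CHANNELS))

def decode_motor_intent(firing_rates):
    """Map motor-cortex firing rates (Hz) to a limb velocity command."""
    return W_decode @ firing_rates

def encode_touch(force_newtons, max_force=10.0, max_amp_ua=80.0):
    """Map fingertip force to a stimulation amplitude (microamps) for the
    sensory-cortex array. The linear mapping is an illustrative assumption."""
    return max_amp_ua * min(force_newtons, max_force) / max_force

# One tick of the closed loop: brain -> limb -> sensors -> brain.
firing_rates = rng.poisson(20, N_CHANNELS).astype(float)  # simulated spike rates
velocity_cmd = decode_motor_intent(firing_rates)          # would drive the MPL joints
measured_force = 2.5                                      # newtons, from a fingertip sensor
stim_amplitude = encode_touch(measured_force)             # would drive the sensory MEA
print(velocity_cmd, stim_amplitude)
```

In the real system this loop runs continuously, so the patient feels contact as the fingers close rather than after the fact.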

In the video embedded after the break, one can see a comparison of [Buz] at the beginning of the study and today, as he manages to handle cutlery and eat cake without assistance.

Continue reading “How A Quadriplegic Patient Was Able To Eat By Himself Again Using Artificial Limbs”

Karting Hands-Free

Some of us have computer mice with more buttons than we have fingers, resolution tracking finer than the naked eye can discern, and forced-air vents. All these features presuppose one thing: the user has a functioning hand. [Federico Runco] knows that amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease, will rob a person of the ability to use standard computer inputs or the joystick on a motorized wheelchair. He is building EyesDrive for the 2020 Hackaday Prize to restore that mobility to ALS patients. There are already some solutions out there, but this one focuses on a short bill of materials.

Existing systems are expensive and often track pupil location, which returns precise data, but EyesDrive only discerns left, right, and resting. For this, it needs just three non-invasive electrodes and a custom circuit board carrying amplifiers, signal-processing circuitry, and a microcontroller. The custom PCB also includes a Bluetooth socket, which is the primary communication method. In the video below, he steers a virtual kart around a knotty course to prove that his system is up to the task of steering an urban wheelchair.
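Since only three states need to be distinguished, the signal processing can stay simple. Below is a minimal sketch of how a single horizontal electrooculography channel might be classified into left, right, or rest by thresholding saccade deflections; the sample rate, threshold value, and polarity convention are illustrative assumptions, not figures from [Federico]’s design.

```python
import numpy as np

FS = 250            # sample rate in Hz (assumption)
THRESHOLD = 150.0   # saccade deflection threshold in microvolts (assumption)

def classify_gaze(window_uv):
    """Classify a short EOG window (microvolts) as 'left', 'right', or 'rest'.

    A rightward saccade is taken to produce a positive deflection on this
    channel and a leftward one a negative deflection; the actual polarity
    depends on electrode placement.
    """
    deflection = window_uv - np.median(window_uv)  # remove slow baseline offset
    if deflection.max() > THRESHOLD:
        return "right"
    if deflection.min() < -THRESHOLD:
        return "left"
    return "rest"

# Example: half a second of noisy signal with a simulated rightward saccade.
rng = np.random.default_rng(1)
window = 20.0 * rng.standard_normal(FS // 2)
window[60:90] += 400.0  # saccade-sized deflection
print(classify_gaze(window))  # -> "right"
```

A microcontroller implementation would also need debouncing and drift compensation, since electrode offsets wander over time, but the core decision can stay this small.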

EyesDrive by [Federico Runco] should not be confused with the 2015 Hackaday Prize winner, Eyedrivomatic, led by two remarkable hackers, [Steve Evans] and [Patrick Joyce].

Continue reading “Karting Hands-Free”