Today, prostheses and exoskeletons are typically controlled using electromyography, that is, by recording the electrical activity in muscles as they contract. It's neither intuitive nor particularly human-like, and it really only captures the brain's intent, not what the muscle is actually doing.
After embedding pairs of 3 mm diameter ball magnets into the calves of turkeys, the researchers were able to detect muscle movement within three milliseconds, and to a precision of 37 microns, which is about the width of a human hair. They hope to try magnetomicrometry (MM) on humans within the next couple of years. It would be a great solution overall if it works out, because compared with electromyography, MM is cheaper, less invasive, and potentially permanent. Couple MM with a new type of amputation surgery called AMI (agonist-antagonist myoneural interface) that provides a fuller range of motion, less pain overall, and finer control of prosthetics, and the future of prostheses and rehabilitation looks really exciting. Be sure to check out the video after the break.
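The core trick behind MM is that a magnet's field falls off predictably with distance, so a calibrated magnetic sensor reading can be inverted back into a position. Here's a minimal toy sketch of that idea (our own illustration using a simple on-axis dipole model, not the team's actual algorithm; the dipole moment constant is made up):

```python
# Toy sketch of the magnetomicrometry idea: a dipole field falls off as
# 1/r^3 on-axis, so a field-magnitude reading can be inverted to a distance.
# MU is an assumed dipole moment scale for this illustration only.
MU = 1.0e-7  # T*m^3, made-up constant

def field_at(r_m):
    """On-axis dipole field magnitude (tesla) at distance r_m (metres)."""
    return 2 * MU / r_m**3

def distance_from(b_t):
    """Invert the on-axis model: recover distance from field magnitude."""
    return (2 * MU / b_t) ** (1 / 3)

# Two magnets ~30 mm from the sensor; a contraction moves one by 37 microns.
r_before, r_after = 0.030, 0.030 + 37e-6
delta = distance_from(field_at(r_after)) - distance_from(field_at(r_before))
print(f"recovered displacement: {delta * 1e6:.1f} microns")
```

The real system tracks magnet pairs in 3D with an array of magnetometers, but the principle is the same: tissue motion becomes a geometry problem rather than an electrical one.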
For all the work done since the dawn of robotics, there is still no match for the human hand in terms of its dexterity and adaptability. Researchers at the IRIM Lab at Koreatech are a step closer with their ingenious BLT gripper, which can pinch with precision or grasp a larger object with evenly distributed force. (Video embedded below.)
The three-fingered gripper is technically called a “belt and link actuated transformable adaptive gripper with active transition capability”. Each finger is an interesting combination of a rigid “fingertip”, an actuation link, and a belt that serves as a grasping surface. The actuation link has a small gearbox at its base to open and close the hand, and the hinge at the “fingertip” is spring-loaded to the open position. A flexible belt stretches between the fingertip and the base of the gripper, and can be tensioned to actuate the fingertip for pinching, or left slack to provide even force across the inside of the gripper for grasping. Two of the fingers can also rotate at the base to give various gripper configurations. This allows the gripper to be used in various ways, including smoothly shifting between pinching and grasping without dropping an object.
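The pinch/grasp behaviour above boils down to one variable: belt tension. Here's a small toy model of that (our sketch of the idea, not the IRIM Lab's control code; the threshold and ramp values are invented):

```python
# Toy model of the BLT gripper's two operating modes. High belt tension
# pulls the spring-loaded fingertip in for a precision pinch; low tension
# lets the belt lie flat for an evenly loaded power grasp. The 0.7
# threshold and 0.1 ramp step are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Finger:
    link_angle: float    # degrees, 0 = fully open
    belt_tension: float  # normalised 0..1

    def mode(self, pinch_threshold=0.7):
        """Classify the current grip mode from belt tension."""
        return "pinch" if self.belt_tension >= pinch_threshold else "grasp"

def transition(finger, target_tension, step=0.1):
    """Shift smoothly between modes by ramping belt tension."""
    while abs(finger.belt_tension - target_tension) > step:
        finger.belt_tension += step if target_tension > finger.belt_tension else -step
    finger.belt_tension = target_tension
    return finger

f = Finger(link_angle=45.0, belt_tension=0.2)
print(f.mode())   # low tension: grasp
transition(f, 0.9)
print(f.mode())   # high tension: pinch
```

The gradual ramp is what lets the gripper shift between modes without dropping the object, rather than snapping from one configuration to the other.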
We love the relative simplicity of the mechanism, and can see it being used for general robotics and prosthetic hands, especially if force sensing is integrated. The mechanism should be fairly easy to replicate using 3D printed components, a piece of toothed belt, and two cheap servos, so get cracking!
Traditionally, sockets for prostheses are created by making a plaster cast of the limb being fitted, and are then sculpted in carbon fiber. It’s an expensive and time-consuming process, and what is supposed to be a customized socket often turns out to be an uncomfortable disappointment. Though prosthetists design these sockets specifically to take pressure off the more rigid areas of tissue, this usually ends up putting more pressure on the softer areas, causing pain and discomfort.
An MIT team led by [Arthur Preton] wants to make prosthesis sockets more comfortable and better customized. They created FitSocket, a machine that assesses the rigidity of limb tissue. You can see it in motion after the break.
FitSocket is essentially a ring of 14 actuators that gently prod the limb and test how much pressure it takes to push in the tissue. By repeating this process over the entire limb, [Preton] can create a map that shows the varying degrees of stiffness or softness in the tissue.
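Turning those probe readings into a map is straightforward in principle: each site's stiffness is just the slope of force versus indentation depth. A minimal sketch of that step, with entirely made-up numbers (this is our illustration of the idea, not [Preton]'s code):

```python
# Sketch of how FitSocket-style readings could become a stiffness map.
# Each actuator pushes the tissue and records force vs indentation depth;
# the slope of that relationship is a local stiffness estimate.

def stiffness(forces_n, depths_m):
    """Least-squares slope of force vs indentation through the origin (N/m)."""
    num = sum(f * d for f, d in zip(forces_n, depths_m))
    den = sum(d * d for d in depths_m)
    return num / den

# Simulated probe sweeps at two sites: soft tissue yields more per newton.
depths = [0.001, 0.002, 0.003, 0.004]          # metres
soft   = [0.5, 1.0, 1.5, 2.0]                  # newtons, ~500 N/m
bony   = [5.0, 10.0, 15.0, 20.0]               # newtons, ~5000 N/m

site_map = {"soft_tissue": stiffness(soft, depths),
            "near_bone": stiffness(bony, depths)}
print(site_map)
```

Repeat that over all 14 actuators and many positions along the limb, and you have the rigidity map that a better-fitting socket can be designed around.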
We love to see advancements in prostheses. Here’s an electronic skin that brings feeling to artificial fingertips.
When we lose a limb, the brain is really none the wiser. It continues to send signals out, but since they no longer have a destination, the person is stuck with one-way communication and a phantom-limb feeling. The fact that the brain carries on has always been promising as far as prostheses are concerned, because it means the electrical signals could potentially be used to control new limbs and digits the natural way.
Like real skin, the e-dermis has an outer, epidermal layer and an inner, dermal layer. Both layers use conductive and piezoresistive textiles to transmit information about tangible objects back to the peripheral nerves in the limb. E-dermis does this non-invasively through the skin using transcutaneous electrical nerve stimulation, better known as TENS. Here’s a link to the full article published in Science Robotics.
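The signal chain as described is: pressure deforms the piezoresistive textile, its resistance drops, and that reading is mapped to a TENS pulse delivered through the skin. Here's a hedged sketch of that mapping (the resistance values, current window, and function names are all our own illustration, not the published implementation):

```python
# Illustrative e-dermis signal chain: piezoresistive reading -> normalised
# pressure -> TENS stimulation amplitude. All constants are assumptions
# made up for this sketch, not values from the Science Robotics paper.

def pressure_from_resistance(r_ohm, r_rest=10_000.0):
    """Piezoresistive textiles drop in resistance under load; normalise
    that drop into a 0..1 pressure estimate, clamped to valid range."""
    return max(0.0, min(1.0, 1.0 - r_ohm / r_rest))

def tens_amplitude(pressure, i_min_ma=0.5, i_max_ma=4.0):
    """Scale pressure into an assumed safe stimulation current window (mA)."""
    return i_min_ma + pressure * (i_max_ma - i_min_ma)

print(tens_amplitude(pressure_from_resistance(10_000)))  # no touch: baseline
print(tens_amplitude(pressure_from_resistance(2_500)))   # firm grip: stronger pulse
```

The non-invasive part is the key selling point: the nerves are stimulated through intact skin rather than via implanted electrodes.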
First, the researchers made a neuromorphic model of all the nerves and receptors that relay signals to the nervous system. To test the e-dermis, they used 3D printed objects designed to be grasped between thumb and forefinger, and monitored the subject’s brain activity via EEG.
For now, the e-dermis is confined to the fingertips. Ideally, it would cover the entire prosthesis and be able to detect temperature as well as curvature. Stay tuned, because it’s next on their list.
[Ondřej Vocílka] is a 23-year-old student at the Brno University of Technology in the Czech Republic who has lost the vision in his left eye. While attending a lecture on 3D printing, he wondered if he could 3D print an ophthalmic prosthesis, that is, an artificial eye. Turns out, he could. If you don’t speak Czech, you’ll need to call on a translation service like we did.
Unlike with conventional glass or plastic eyes, it is trivial to change parameters like color when 3D printing the prosthetic, which is especially important for the iris. The finished product takes about 90 minutes to print, plus additional time to coat it with an acrylic layer that mimics the gloss of a natural eye.
Prostheses are a great help to those who have lost limbs, or who never had them in the first place. Over the past few decades there has been a great deal of research done to make these essential devices more useful, creating prostheses that are capable of movement and more accurately recreating the functions of human body parts. At Georgia Tech, they’re working on just that, with the help of AI.
[Jason Barnes] lost his arm in a work accident, which prevented him from playing the piano the way he used to. The researchers at Georgia Tech worked with him, eventually producing a prosthetic arm that, unlike most, actually has individual finger control. This is achieved through the use of an ultrasound probe, which detects muscle movements elsewhere on his body with enough detail to allow the control of individual fingers. A TensorFlow-based neural network analyses the ultrasound data to determine which finger the user is trying to move. The use of ultrasound was the major breakthrough which made this possible; previous projects have often relied on electromyogram sensors to read muscle impulses, but these lack the resolution required.
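To get a feel for the classification step, here's a toy stand-in: the real system feeds ultrasound images into a TensorFlow network, but the same idea can be shown with a tiny softmax classifier trained on made-up "muscle deformation" feature vectors, one synthetic cluster per intended finger (everything here is our own illustration, not Georgia Tech's model):

```python
import numpy as np

# Toy stand-in for ultrasound-based finger classification: train a
# one-layer softmax classifier on synthetic feature vectors, where each
# finger produces a distinct (made-up) deformation pattern.
rng = np.random.default_rng(0)
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
DIM = 8  # pretend features extracted from an ultrasound frame

# Synthetic training data: one cluster of noisy samples per finger.
centers = rng.normal(size=(5, DIM))
X = np.vstack([c + 0.1 * rng.normal(size=(40, DIM)) for c in centers])
y = np.repeat(np.arange(5), 40)

# One-layer softmax regression trained by batch gradient descent.
W = np.zeros((DIM, 5))
onehot = np.eye(5)[y]
for _ in range(500):
    p = np.exp(X @ W)
    p /= p.sum(axis=1, keepdims=True)
    W -= 0.1 * X.T @ (p - onehot) / len(X)

def predict_finger(frame_features):
    """Map one frame's feature vector to the most likely intended finger."""
    return FINGERS[int(np.argmax(frame_features @ W))]

# A fresh "frame" near the index-finger pattern should classify correctly.
print(predict_finger(centers[1] + 0.05 * rng.normal(size=DIM)))
```

The real problem is much harder, of course: actual ultrasound frames are high-dimensional images and a deep network does the feature extraction, but the output side (pick one of a handful of finger intents per frame) is the same.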
The prosthesis is nicknamed the “Skywalker arm”, after its similarities to the prostheses seen in the Star Wars films. It’s not [Jason]’s first advanced prosthetic, either – Georgia Tech has also equipped him with an advanced drumming prosthesis. This allows him to use two sticks with a single arm, the second stick using advanced AI routines to drum along with the music in the room.
It’s great to see music being used as a driver to create high-performance prosthetics and push the state of the art forward. We’re sure [Jason] enjoys performing with the new hardware, too. But perhaps you’d like to try something similar, even though you’ve got two hands already? Try this on for size.