Building a robotic arm and hand that matches human dexterity is tougher than it looks. We can create aesthetically pleasing ones, very functional ones, but the perfect mix of both? Still a work in progress. Just ask [Sarah de Lagarde], who in 2022 literally lost an arm and a leg in a life-changing accident. In this BBC interview, she shares her experiences openly – highlighting both the promise and the limits of today’s prosthetics.
The problem is that our hands aren’t just grabby bits. They’re intricate systems of nerves, tendons, and ridiculously precise motor control. Even the best AI-powered prosthetics rely on crude muscle signals, while dexterous robots struggle with the simplest things — like tying shoelaces or flipping a pancake without launching it into orbit.
That doesn’t mean progress isn’t happening. Researchers are training robotic fingers with real-world data, moving from ‘oops’ to actual precision. Embodied AI, i.e. machines that learn by physically interacting with their environment, is bridging the gap. Soft robotics with AI-driven feedback loops mimic how our fingers instinctively adjust grip pressure. If haptics are your point of interest, we have posted about them before.
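As a toy illustration of that kind of feedback loop, here’s a minimal Python sketch; the sensor model, thresholds and gains are all invented for illustration and not taken from any real device:

```python
# Minimal sketch of a grip feedback loop: tighten when the object slips,
# relax toward a light hold otherwise. All sensor values are simulated here;
# a real soft-robotic hand would read tactile/vibration sensors instead.

def grip_correction(pressure, slip, target=2.0, kp=0.5, ks=2.0):
    """Proportional pull toward a target pressure, plus a slip term that tightens."""
    return kp * (target - pressure) + ks * slip

grip_force = 0.0
for step in range(10):
    pressure = 0.8 * grip_force              # pretend sensor: pressure tracks force
    slip = 1.0 if grip_force < 3.0 else 0.0  # pretend slip detector
    grip_force = max(0.0, grip_force + grip_correction(pressure, slip))
    print(f"step {step}: force={grip_force:.2f} slip={slip}")
```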
The future isn’t just robots copying our movements; it’s about them understanding touch. Instead of machine learning, we might want to shift focus to human learning. If AI cracks that, we’re one step closer.
How many of you pervs had the same question come to mind?
Can they fap with them?
rolls eyes…
and immediately clicks another tab open for the search
Can’t wait to see the lawyer commercials for this new category of “personal injury”
I’ll probably live long enough to see these AI androids be sued.
You won’t; they are not entities, legal or otherwise. You will instead see regular liability-based suits against doctors, because they are an easier target than the manufacturers, and some class-action suits if there’s a major screw-up.
1.) Did not come to mind, thank you for your contribution. I need to up my game.
2.) What makes you think people don’t already do it with the older models? Heck, I bet Captain Redbeard did it with his rusty old hook… And don’t even get me started on the peg leg
This is the sort of thing AI is actually great for.
It’s like games using AI to generate extra frames because of the PCIe bottleneck leaving the GPU idle for milliseconds between frames. Since the CPU can’t communicate with the GPU fast enough, the GPU uses AI to infer what the CPU would be saying if it could.
Or using AI in media decompression, intelligently reconstructing details crushed in lossy compression that’s required by low bandwidth connections.
If we can’t match organic nerves for signal propagation and decoding, AI can fill the gaps.
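A toy version of that gap-filling, with a two-sample extrapolation standing in for a learned predictor (the data and the dropout pattern here are made up):

```python
# Toy gap-filler: when samples from a sensor/nerve stream are dropped (None),
# predict them from the recent trend instead of stalling the control loop.
# A real system might use a small trained model instead of extrapolation.

def fill_gaps(samples):
    out = []
    for s in samples:
        if s is None:
            if len(out) >= 2:
                s = out[-1] + (out[-1] - out[-2])  # continue the recent trend
            else:
                s = out[-1] if out else 0.0        # nothing to go on yet
        out.append(s)
    return out

stream = [0.1, 0.2, 0.35, None, None, 0.7, 0.72, None, 0.8]
print(fill_gaps(stream))  # gaps replaced by plausible guesses
```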
To improve prosthetics, they need only be able to accurately communicate information to and from the brain; the brain will work out how to control the prosthetic better from there. What they suggest is a shortcut: using AI that doesn’t actually improve the communication. To me this feels like a cop-out instead of doing the hard work (R&D) needed to improve man-machine communication.
(complete amateur feedback here)
Agreed. Isn’t this an ongoing process in all our lives with all our sensors and “actuators”? Constantly learning, re-learning and adjusting “what is where” (spatial orientation/position of our own extremities), “how we feel what”, “what signals need to be sent to make a specific move” and so forth …
I’d assume no one’s brain signals to the hand, and the sensory information coming back, are identical to anyone else’s when doing the same task.
They probably even change during a single individual’s lifetime.
-> using ML algos to “train” chips for delicate fine control in closed loop with the sensors in a prosthetic hand may be a good idea (roughly sketched below).
Even when “fitting”(?) one to/for a patient, as a limited initial learning stage between human and machine.
But after that I’d think input/output should be “static”.
AI should, at most, compensate internally for the prosthesis’s own wear & tear.
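Roughly what that fitting stage could look like, in toy Python; ordinary least squares stands in for the ML step, and all the data is synthetic:

```python
# Toy fitting session: learn a patient-specific mapping from muscle-signal
# features to grip commands, then freeze it so I/O stays "static" afterwards.
import numpy as np

rng = np.random.default_rng(0)
emg = rng.normal(size=(200, 8))            # 200 calibration samples, 8 channels
true_map = rng.normal(size=(8, 3))         # unknown patient-specific mapping
targets = emg @ true_map + 0.05 * rng.normal(size=(200, 3))

W, *_ = np.linalg.lstsq(emg, targets, rcond=None)  # the "learning" step
W.setflags(write=False)                    # freeze: no adaptation after fitting

def grip_command(features):
    """Static inference: the same input always gives the same output."""
    return features @ W

print(grip_command(rng.normal(size=8)))
```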
@limroh The static aspects of future on-device ASIC-based models have pros and cons; unfortunately, legal aspects will get in the way of flexibility too.
Still, this is a good thing, and the training you mention has been done by some device makers for quite some time.
This is two things, but is not a “cop-out”. Research on prosthetics is rather poorly managed, so think of the obvious “AI” marketing angle as what it is, and be glad someone is investing in useful areas. As for the technology itself, it’s a compromise. What we are really talking about is smoothing out and regularising motion and filling in gaps in whatever communication is available (a toy example below). This will still be a desirable function as sensory i/o improves.
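For the smoothing part, a toy example: an exponential moving average over noisy decoded intent. Real devices would use far more sophisticated filters; this just shows the role that kind of regularisation plays between intent decoding and the motors:

```python
# Toy "regularising" filter: low-pass the jittery decoded intent so the
# motors see a steadier trajectory. Smaller alpha = smoother but laggier.

def smooth(commands, alpha=0.3):
    out, prev = [], commands[0]
    for c in commands:
        prev = alpha * c + (1 - alpha) * prev
        out.append(round(prev, 3))
    return out

noisy = [0.0, 0.9, 0.1, 1.0, 0.95, 1.05, 1.0]  # jittery decoded intent
print(smooth(noisy))
```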
It’s poorly managed because it’s a grant factory. It’s one of those things that makes people go “aww” and pull out their pocketbook. This is not to belittle the need for such research, or people suffering from certain conditions…
Cancer research is also very similar. It is rare for any new technology or technique to come out without a million half-baked articles on how it could be used for “cancer drug delivery” or similar, and nothing comes of it… But trust me, hundreds of millions of dollars disappear.
Certain highly sympathetic subjects are routinely taken advantage of, and also just as often used as alibis.
AI enables a new option for prosthetics: let the arm/hand/leg/foot prosthesis learn from the wearer and think for itself to perform and assist. It ‘connects’ to your brain by watching and feeling you.
To grab a cup, just lean/move towards the object while watching the cup, and the arm should automatically reach out and grab it. A smart AI arm should learn and know enough about the context, social cues, habits and other limbs of the wearer to perform operations autonomously, but for the wearer. If it sees a handrail and you are slowing down, it probably grabs it automatically, then releases and grabs again along with your steps (see the sketch below). A simple glance away by the wearer provides context to cancel the assist. Your prosthesis will respond like a partner that knows you. Most things can be communicated without voice. But of course talking and providing detailed instructions, preferences and corrections to the prosthesis is possible; that’s actually the easy part now.
An arm or hand assists with grabbing, holding and even performing multi-step tasks, whatever the prosthesis hardware is capable of. A leg/foot is a very strong balancing robot, sensing, observing and predicting what’s needed to keep you balanced and moving around.
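That watch-the-wearer behaviour, as a toy state machine; the gaze and motion inputs are stand-ins for real tracking, and the states are invented for illustration:

```python
# Toy assist state machine: reach when gaze and approach agree, cancel on a
# glance away, close the hand when the wearer stops at the object.

def assist_state(state, gazing, approaching, glanced_away):
    if state == "idle" and gazing and approaching:
        return "reaching"
    if state == "reaching" and glanced_away:
        return "idle"      # glance away cancels the assist
    if state == "reaching" and not approaching:
        return "grasping"  # wearer stopped at the object
    return state

state = "idle"
for gazing, approaching, away in [(True, True, False), (True, True, False),
                                  (True, False, False)]:
    state = assist_state(state, gazing, approaching, away)
    print(state)  # prints: reaching, reaching, grasping
```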
Reminds me of this video by the French humorist “Monsieur Poulpe”: https://www.youtube.com/watch?v=FZ6kR_XKlwE
Cut to a prosthetic arm writing out “I have no mouth and I must scream” over and over again on the nightstand while its owner is sleeping