As difficult as it is for a human to learn ambidexterity, it’s quite easy to program into a humanoid robot. After all, a robot doesn’t need to overcome years of muscle memory. Giving a one-handed robot ambidexterity, however, takes some more creativity. [Kelvin Gonzales Amador] managed to do this with his ambidextrous robot hand, capable of signing in either left- or right-handed American Sign Language (ASL).
The essential ingredient is a separate servo motor for each joint in the hand, which allows every joint to bend equally well backward and forward. Nothing physically marks one side as the palm or the back of the hand. To switch between left- and right-handedness, a servo in the wrist simply turns the hand 180 degrees, the fingers flex in the opposite direction, and the transformation is complete. [Kelvin] demonstrates this in the video below by having the hand sign out the full ASL alphabet in both the right- and left-handed configurations.
The tradeoff of fully direct drive is that it takes 23 servo motors in the hand itself, plus a much larger servo for the wrist joint. Twenty small servos articulate the fingers, and three larger ones control joints within the hand. An Arduino Mega runs the show with the aid of two PCA9685 PWM drivers. The physical hand is 3D printed in PLA and nylon, painted gold for a more striking appearance.
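For a sense of how little software the trick actually needs, here is a minimal sketch in Python (not [Kelvin]'s actual firmware, which runs on the Arduino): if every finger joint is stored as an angle relative to the flat, neutral hand, mirroring a pose is just a reflection about neutral plus a half-turn of the wrist. The channel numbers and angle conventions below are assumptions for illustration.

```python
NEUTRAL = 90          # flat-hand angle for every finger joint, in degrees (assumed)
WRIST = 23            # assumed channel of the large wrist servo

def mirror(pose):
    """Reflect each joint angle about neutral so the fingers flex the other way."""
    return {channel: 2 * NEUTRAL - angle for channel, angle in pose.items()}

def sign_pose(pose, left_handed, write_angle):
    """Drive the hand to one ASL pose in either handedness.
    write_angle(channel, degrees) stands in for whatever the Arduino/PCA9685
    firmware does to move a single servo."""
    if left_handed:
        pose = mirror(pose)
    for channel, angle in pose.items():
        write_angle(channel, angle)
    # Spinning the wrist 180 degrees presents the other face of the hand.
    write_angle(WRIST, 180 if left_handed else 0)

# Example: a made-up, fist-like pose covering the 20 finger-joint channels
letter_a = {channel: 150 for channel in range(20)}
```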
This isn’t the first language-signing robot hand we’ve seen, though it does forgo the second hand. To make this perhaps one of the least efficient machine-to-machine communication protocols, you could also equip it with a sign language translation glove.
LeRobot Brings Autonomy To Hobby Robots
Robotic arms have a lot in common with CNC machines in that they are usually driven by a fixed script of specific positions to move to, and actions to perform. Autonomous behavior isn’t the norm, especially not for hobby-level robotics. That’s changing rapidly with LeRobot, an open-source machine learning framework from the Hugging Face community.

If a quick browse of the project page still leaves you with questions, you’re not alone. Thankfully, [Ilia] has a fantastic video that explains and demonstrates the fundamentals wonderfully. In it, he shows how LeRobot allows one to train an economical 3D-printed robotic arm by example, teaching it to perform a task autonomously. In this case, the task is picking up a ball and putting it into a cup.
[Ilia] first builds a dataset by manually operating the arm through the task. Then, with only about fifty such examples, he trains a machine learning model capable of driving the arm to pick up the ball and place it in the cup on its own, regardless of where the ball and cup actually are. It even gracefully handles things like color changes and [Ilia] moving the cup and ball around mid-task. You can skip directly to 34:16 to see this autonomous behavior in action, but we do recommend watching the whole video for a highly accessible yet deeply technical overview.
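For readers curious about what is going on under the hood, here is a deliberately stripped-down sketch of the same idea in plain PyTorch, not LeRobot's actual API: behavior cloning regresses from an observation (camera frame plus joint angles, flattened here into one vector) to the action the human operator took, so fifty-odd demonstrations become a policy. All names and dimensions below are illustrative.

```python
import torch
import torch.nn as nn

class Policy(nn.Module):
    """Maps a flattened observation vector to predicted joint targets."""
    def __init__(self, obs_dim, act_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, act_dim),
        )

    def forward(self, obs):
        return self.net(obs)

def train(policy, demos, epochs=100, lr=1e-4):
    """demos yields (observation, expert_action) tensor pairs recorded while
    manually guiding the arm through the task."""
    opt = torch.optim.Adam(policy.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for obs, act in demos:
            opt.zero_grad()
            loss_fn(policy(obs), act).backward()
            opt.step()
```

LeRobot's value, as the video shows, is in wrapping this kind of loop, along with the dataset recording, camera handling, and more capable policy architectures, behind ready-made tooling.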
Building A Robotic Arm Without Breaking The Bank
There are probably at least as many ways to construct a robotic arm as there are uses for them. In the case of [Thomas Sanladerer], the primary requirement for his robotic arm was to support a digital camera, which apparently has to be capable of looking vaguely menacing in a completely casual manner. Meet Caroline, whose styling and color scheme are completely coincidental and do not promise yummy moist cake for anyone who is still alive after all experiments have been run.
Unlike typical robotic arms where each joint is directly driven by a stepper motor or similar, [Thomas] opted for a linear rail that pushes or pulls the next section of the arm, in a manner reminiscent of the opposing muscles in our mammalian appendages. This 3D-printer-inspired design is pretty sturdy, but the steppers like to skip steps, so he is considering replacing them with brushless motors.
Beyond this, the rest of the arm uses hollow aluminium stock, a lot of 3D-printed sections, and, for the head, a bunch of Waveshare ST3215 servos with internal magnetic encoders for angle control. One of these ~€35 ST3215s did cook itself during testing, which is somewhat worrying. Overall, the total cost was a few hundred Euro, which for a nine-degree-of-freedom robotic arm like this isn't too terrible.
Talking Robot Uses Typewriter Tech For Mouth
Many decades ago, IBM engineers developed the typeball. This semi-spherical hunk of metal would become the heart of the Selectric typewriter line. [James Brown] has now leveraged that very concept to create a pivoting mouth mechanism for a robot that appears to talk.
What you’re looking at is a plastic ball with lots of different mouth shapes on it. By pivoting the ball to different angles inside the head of a robot, it’s possible to display different mouth shapes on the face. By swapping mouth shapes rapidly in concert with recorded speech, it’s possible to make the robot appear to be speaking. We don’t get a great look at the mechanism that operates the ball, but Selectric typeball operation is well documented elsewhere if you seek to recreate the idea yourself.
The real benefit of this mechanism is speed. It might not look as fluid as some robots with manually-articulated flexible mouths, but the rapid mouth transitions really help sell the effect because they match the pace of speech. [James] demonstrated the finished product on Mastodon, and it looks great in action.
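As a rough illustration of how simple the control side can be (this is a guess at the approach, not [James]'s code), the whole act boils down to a lookup table from mouth shapes to ball orientations, stepped through in time with the audio. The orientations and shape names below are invented.

```python
import time

# (tilt_deg, rotate_deg) for a few mouth shapes on the ball -- assumed values
MOUTH_ORIENTATIONS = {
    "rest": (0, 0),
    "ah":   (20, 45),
    "ee":   (20, 90),
    "oh":   (40, 45),
    "mm":   (40, 90),
}

def play(mouth_track, set_orientation):
    """mouth_track: list of (start_seconds, shape) pairs synced to the recording.
    set_orientation(tilt, rotate) pivots the ball, typeball style."""
    t0 = time.monotonic()
    for start, shape in mouth_track:
        # Wait until this mouth shape is due, then snap the ball to it.
        while time.monotonic() - t0 < start:
            time.sleep(0.001)
        set_orientation(*MOUTH_ORIENTATIONS.get(shape, MOUTH_ORIENTATIONS["rest"]))
```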
This isn’t the first time we’ve featured [James Brown]’s work. You may recall he got DOOM running on a tiny LEGO brick a few years back.
Thanks to [J. Peterson] for the tip!
Automated Rubbish Removal System
The hackers over at [HTX Studio] built a set of twenty trash cans which can automatically catch and remove rubbish.
In order to catch trash, a bin needs to do two things: detect where the trash will land, and then get there, fast. The second part is easy: three big motors with wheels under the bin. But how does a bin know where the trash will land? For that, it uses a camera installed in the bin itself.
[HTX Studio] iteratively trained a model to process visual information from the camera and identify common types of trash. When the bin sees a trained object flying through the air, it rushes to catch it where it will land. After many rounds of fine-tuning, it finally started to work reliably.
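The prediction step is worth a closer look. Here is a minimal sketch of the math (our illustration, not [HTX Studio]'s code): once the camera produces two 3D fixes on the flying object, plain projectile motion says where it will cross the rim height, and the wheeled bin just has to beat it there. Air drag and spin are ignored.

```python
G = 9.81  # gravitational acceleration, m/s^2

def predict_landing(p1, p2, dt, rim_height):
    """p1, p2: (x, y, z) positions of the object observed dt seconds apart,
    with z as height. Returns the (x, y) point where it falls to rim_height."""
    vx = (p2[0] - p1[0]) / dt
    vy = (p2[1] - p1[1]) / dt
    vz = (p2[2] - p1[2]) / dt
    # Solve rim_height = z + vz*t - 0.5*G*t^2 for the later (positive) root.
    a, b, c = -0.5 * G, vz, p2[2] - rim_height
    t = (-b - (b * b - 4 * a * c) ** 0.5) / (2 * a)
    return (p2[0] + vx * t, p2[1] + vy * t)

# Example: a bottle tossed from 2 m up, caught at a 0.5 m rim height
print(predict_landing((0.0, 0.0, 2.0), (0.1, 0.05, 2.05), 0.05, 0.5))
```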
Robots Are Coming For Your Berry Good Job
We don’t know if picking blackberries at scale is something people enjoy doing. But if you do, we have bad news. The University of Arkansas wants to put you out of a job in favor of your new robot overlord. It turns out that blackberries in Arkansas alone are a $24 million business. The delicate berries are typically hand-picked.
The robot hand that can do the same job has three soft fingers and tendons made from guitar strings. Each finger has a force sensor at the tip so it can squeeze the berries just right. How much force does it take to grab a blackberry? To find out, researchers placed sensors on the fingers of experienced pickers and used the data to guide their design. Researchers claim they were inspired by the motion of a tulip opening and closing each day.
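To make the force feedback concrete, here is a minimal sketch of the kind of loop fingertip sensors enable. This is an illustration of the concept, not the Arkansas team's controller: each tendon-driven finger closes in small steps until its tip force reaches a target taken from the human-picker data. The thresholds and step sizes below are made up.

```python
TARGET_FORCE = 0.8    # assumed grip force per fingertip, newtons
STEP_DEG = 1          # tendon servo increment per control step
MAX_ANGLE = 120       # assumed fully-closed finger angle

def grip(read_force, move_finger, fingers=(0, 1, 2)):
    """read_force(i) -> newtons at fingertip i; move_finger(i, degrees) pulls
    that finger's tendon. Each finger stops as soon as it presses on the
    berry firmly enough, so the grip conforms to the fruit."""
    angle = {i: 0 for i in fingers}
    done = set()
    while len(done) < len(fingers):
        for i in fingers:
            if i in done:
                continue
            if read_force(i) >= TARGET_FORCE or angle[i] >= MAX_ANGLE:
                done.add(i)
                continue
            angle[i] += STEP_DEG
            move_finger(i, angle[i])
    return angle
```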
Your berry picking job is safe for now, though. They don’t have the vision system to actually find the berries. Not yet, anyway. Of course in the meantime, the gripper could be used for anything that needs a delicate touch.
Oddly, everyone seems to want to develop robots to pick agricultural items. We are usually more interested in a different kind of picking.
Reachy The Robot Gets A Mini (Kit) Version
Reachy Mini is a kit for a compact, open-source robot designed explicitly for AI experimentation and human interaction. The kit is available from Hugging Face, which is itself a repository and hosting service for machine learning models. Reachy seems to be one of their efforts at branching out from pure software.

Reachy Mini is intended as a development platform, allowing people to make and share models for different behaviors, hence the Hugging Face integration to make that easier. On the inside of the full version is a Raspberry Pi, and we suspect some form of Stewart Platform is responsible for the movement of the head. There’s also a cheaper (299 USD) “lite” version intended for tethered use, and a planned simulator to allow development and testing without access to a physical Reachy at all.
Reachy has a distinctive head and face, so if you're thinking it looks familiar, that's probably because we first covered Reachy the humanoid robot as a project from Pollen Robotics (Hugging Face acquired Pollen Robotics in April 2025).
The idea behind the smaller Reachy Mini seems to be to provide a platform to experiment with expressive human communication via cameras and audio, rather than to be the kind of robot that moves around and manipulates objects.
It's still early days for the project; you can find a bit more information about Reachy Mini at Pollen's site, and you can see it move in the short video embedded just below.






