LeRobot Brings Autonomy To Hobby Robots

Robotic arms have a lot in common with CNC machines in that they are usually driven by a fixed script of specific positions to move to, and actions to perform. Autonomous behavior isn’t the norm, especially not for hobby-level robotics. That’s changing rapidly with LeRobot, an open-source machine learning framework from the Hugging Face community.

The SO-101 arm is an economical way to get started.

If a quick browse of the project page still leaves you with questions, you’re not alone. Thankfully, [Ilia] has a fantastic video that explains and demonstrates the fundamentals wonderfully. In it, he shows how LeRobot allows one to train an economical 3D-printed robotic arm by example, teaching it to perform a task autonomously. In this case, the task is picking up a ball and putting it into a cup.

[Ilia] first builds a dataset by manually operating the arm to pick up a ball and place it in a cup. Then, with a dataset consisting of only about fifty such examples, he creates a machine learning model capable of driving the arm to autonomously pick up a ball and place it in a cup, regardless of where the ball and cup actually are. It even gracefully handles things like color changes and [Ilia] moving the cup and ball around mid-task. You can skip directly to 34:16 to see this autonomous behavior in action, but we do recommend watching the whole video for a highly accessible yet deeply technical overview.
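To give a rough idea of what training by example means here, below is a minimal behavior-cloning sketch in PyTorch. It is purely illustrative and not LeRobot's actual API: the tiny network, the observation format, and the training step are our own simplification of the general idea of mapping a camera frame plus the current joint angles to the joint command the human demonstrator gave next.

# Minimal behavior-cloning sketch (illustrative only, not LeRobot's actual API).
# Assumes a hypothetical dataset of (camera frame, joint angles) pairs recorded
# while a human teleoperates the arm; the network learns to map observations
# to the next joint command.
import torch
import torch.nn as nn

class PickPolicy(nn.Module):
    def __init__(self, num_joints=6):
        super().__init__()
        # Tiny CNN encoder for the camera image
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Combine image features with the current joint angles
        self.head = nn.Sequential(
            nn.Linear(32 + num_joints, 128), nn.ReLU(),
            nn.Linear(128, num_joints),  # predicted next joint command
        )

    def forward(self, image, joints):
        feats = self.encoder(image)
        return self.head(torch.cat([feats, joints], dim=-1))

# One supervised training step: imitate what the demonstrator did next.
policy = PickPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(image, joints, demo_next_joints):
    optimizer.zero_grad()
    loss = loss_fn(policy(image, joints), demo_next_joints)
    loss.backward()
    optimizer.step()
    return loss.item()

Run enough of these steps over the fifty or so recorded demonstrations and the policy starts reproducing the pick-and-place behavior on its own, which is the core trick LeRobot wraps up in a much more capable package.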

Continue reading “LeRobot Brings Autonomy To Hobby Robots”

Building A Robotic Arm Without Breaking The Bank

There are probably at least as many ways to construct a robotic arm as there are uses for them. In the case of [Thomas Sanladerer], the primary requirement for the robotic arm was to support a digital camera, which apparently also has to be capable of looking vaguely menacing in a completely casual manner. Meet Caroline, whose styling and color scheme is completely coincidental and does not promise yummy moist cake for anyone who is still alive after all experiments have been run.

Unlike typical robotic arms, where each joint is directly driven by a stepper motor or similar, [Thomas] opted to use a linear rail that pushes or pulls the next section of the arm, in a manner reminiscent of the opposing muscles in our mammalian appendages. This 3D-printer-inspired design is pretty sturdy, but the steppers like to skip steps, so he is considering replacing them with brushless motors.

Beyond this, the rest of the robotic arm uses hollow aluminium stock, a lot of 3D-printed sections, and, for the head, a bunch of Waveshare ST3215 servos with internal magnetic encoders for angle control. One of these ~€35 ST3215s did cook itself during testing, which is somewhat worrying. Overall, the total cost was a few hundred Euros, which for a nine-degree-of-freedom robotic arm like this isn’t too terrible.
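As a back-of-the-envelope illustration of how a linear rail can drive a rotary joint, the short sketch below treats the rail, the joint pivot, and the attachment point on the next arm segment as a triangle and applies the law of cosines. The dimensions are made up for the example and are not taken from [Thomas]’s actual build.

# Rough geometry sketch for a joint driven by a linear rail (illustrative only;
# the link dimensions here are made up, not taken from the actual build).
# The rail, the joint pivot, and the attachment point on the next arm segment
# form a triangle, so the joint angle follows from the law of cosines.
import math

A = 60.0   # mm, pivot to rail anchor on the fixed segment (hypothetical)
B = 80.0   # mm, pivot to attachment point on the moving segment (hypothetical)

def joint_angle_deg(rail_length):
    """Joint angle (degrees) for a given rail length between the two anchors."""
    cos_theta = (A**2 + B**2 - rail_length**2) / (2 * A * B)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

def rail_length_for_angle(angle_deg):
    """Inverse: rail length needed to reach a target joint angle."""
    return math.sqrt(A**2 + B**2 - 2 * A * B * math.cos(math.radians(angle_deg)))

print(joint_angle_deg(100.0))         # 90.0 degrees with these dimensions
print(rail_length_for_angle(45.0))    # roughly 56.7 mm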

Continue reading “Building A Robotic Arm Without Breaking The Bank”

Talking Robot Uses Typewriter Tech For Mouth

Many decades ago, IBM engineers developed the typeball. This semi-spherical hunk of metal would become the heart of the Selectric typewriter line. [James Brown] has now leveraged that very concept to create a pivoting mouth mechanism for a robot that appears to talk.

What you’re looking at is a plastic ball with lots of different mouth shapes on it. By pivoting the ball to different angles inside the head of a robot, it’s possible to display different mouth shapes on the face. By swapping mouth shapes rapidly in concert with recorded speech, it’s possible to make the robot appear to be speaking. We don’t get a great look at the mechanism that operates the ball, but Selectric typeball operation is well documented elsewhere if you seek to recreate the idea yourself.
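To make the idea concrete, here is a hypothetical sketch of how one might drive such a mouth ball from lip-sync data: two servos tilt and rotate the ball so the right mouth shape faces forward for each viseme. The viseme table, servo angles, and set_servo placeholder are all invented for illustration and are not [James Brown]’s actual mechanism or code.

# Sketch of driving a Selectric-style mouth ball (illustrative only; the viseme
# layout and servo angles below are invented, not the actual mechanism's values).
# Two servos tilt and rotate the ball so the mouth shape for the current sound
# faces outward, and the shapes are swapped in time with recorded speech.
import time

# Hypothetical (tilt, rotate) servo angles in degrees for a few mouth shapes.
VISEME_ANGLES = {
    "rest": (0, 0),
    "AA":   (0, 45),    # open mouth
    "EE":   (0, 90),    # wide mouth
    "OO":   (30, 45),   # rounded mouth
    "MM":   (30, 90),   # closed lips
}

def set_servo(channel, angle):
    # Placeholder for whatever servo driver is actually in use.
    print(f"servo {channel} -> {angle} deg")

def play_visemes(timed_visemes):
    """timed_visemes: list of (viseme_name, duration_seconds) from lip-sync data."""
    for name, duration in timed_visemes:
        tilt, rotate = VISEME_ANGLES.get(name, VISEME_ANGLES["rest"])
        set_servo(0, tilt)
        set_servo(1, rotate)
        time.sleep(duration)

# Example: say "moo" - closed lips, then a rounded vowel, then back to rest.
play_visemes([("MM", 0.08), ("OO", 0.25), ("rest", 0.1)])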

The real benefit of this mechanism is speed. It might not look as fluid as some robots with manually-articulated flexible mouths, but the rapid mouth transitions really help sell the effect because they match the pace of speech. [James] demonstrated the finished product on Mastodon, and it looks great in action.

This isn’t the first time we’ve featured [James Brown]’s work. You may recall he got DOOM running on a tiny LEGO brick a few years back.

Thanks to [J. Peterson] for the tip!

A man standing next to a host of small automatic trash cans

Automated Rubbish Removal System

The hackers over at [HTX Studio] built a set of twenty trash cans which can automatically catch and remove rubbish.

In order to catch trash, a bin needs to do two things: detect where the trash will land, and then get there fast. The second part is easy: three big motors with wheels under the bin. But how does a bin know where the trash will land? For that, it uses a camera installed in the bin itself.

[HTX Studio] iteratively trained a model to process visual information from the camera and identify common types of trash. When it sees a trained object flying through the air, the bin rushes to where the object will land and catches it. After many rounds of fine-tuning, it finally started to work reliably.
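One common way to answer the where-will-it-land question is to fit a ballistic trajectory to a handful of tracked positions and solve for where it reaches the floor. The sketch below does exactly that; it assumes the vision side already produces timestamped 3D positions, and it is not necessarily how [HTX Studio]’s bins do it.

# Sketch of predicting where a thrown object will land (illustrative only; not
# necessarily how the bins in the video do it). Assumes the vision system
# already gives a few timestamped 3D positions of the flying object in metres.
import numpy as np

G = 9.81  # m/s^2

def predict_landing(times, positions, floor_z=0.0):
    """Fit a ballistic trajectory to observations and return the (x, y) landing point.

    times:     array of shape (N,)
    positions: array of shape (N, 3) with columns x, y, z
    """
    t = np.asarray(times)
    p = np.asarray(positions, dtype=float)

    # x and y are roughly linear in time; z follows z0 + vz*t - 0.5*g*t^2.
    x_coef = np.polyfit(t, p[:, 0], 1)      # [vx, x0]
    y_coef = np.polyfit(t, p[:, 1], 1)      # [vy, y0]
    z_grav = p[:, 2] + 0.5 * G * t**2       # remove gravity term, then fit a line
    vz, z0 = np.polyfit(t, z_grav, 1)       # [vz, z0]

    # Solve z0 + vz*t - 0.5*g*t^2 = floor_z for the later (positive) root.
    a, b, c = -0.5 * G, vz, z0 - floor_z
    t_land = (-b - np.sqrt(b**2 - 4 * a * c)) / (2 * a)

    return np.polyval(x_coef, t_land), np.polyval(y_coef, t_land)

# Example with three early observations of a tossed ball:
print(predict_landing([0.0, 0.05, 0.10],
                      [[0.0, 0.0, 1.5], [0.10, 0.05, 1.55], [0.20, 0.10, 1.58]]))

Feed the predicted (x, y) to the drive motors and the hard part that remains is doing all of this fast enough for the bin to arrive before the trash does.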

Continue reading “Automated Rubbish Removal System”

Robots Are Coming For Your Berry Good Job

We don’t know if picking blackberries at scale is something people enjoy doing. But if you do, we have bad news. The University of Arkansas wants to put you out of a job in favor of your new robot overlord. It turns out that blackberries in Arkansas alone are a $24 million business. The delicate berries are typically hand-picked.

The robot hand that can do the same job has three soft fingers and tendons made from guitar strings. Each finger has a force sensor at the tip so it can squeeze the berries just right. How much force does it take to grab a blackberry? To find out, researchers placed sensors on the fingers of experienced pickers and used the data to guide their design. Researchers claim they were inspired by the motion of a tulip opening and closing each day.
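As a rough illustration of squeezing “just right”, the sketch below runs a simple proportional force loop: each finger’s tendon tightens until the tip sensor reads a target force. The target force, gain, and toy finger model are all invented for the example and are not the values from the Arkansas gripper.

# Simple force-feedback gripping loop (illustrative only; the target force,
# gain, and the toy finger model below are invented, not the values from the
# Arkansas gripper). Each soft finger closes until its tip sensor reads the
# target squeeze, so the berry is held firmly but not crushed.

TARGET_FORCE_N = 0.5          # hypothetical "just right" squeeze for a blackberry
KP = 5.0                      # proportional gain: tendon travel (mm) per newton of error
STIFFNESS_N_PER_MM = 0.08     # toy model: tip force rises with travel after contact
CONTACT_MM = 10.0             # toy model: travel before the finger touches the berry

tendon_mm = [0.0, 0.0, 0.0]   # tendon travel for the three fingers

def read_tip_force(finger):
    """Stand-in for the fingertip force sensor (simulated here)."""
    return max(0.0, (tendon_mm[finger] - CONTACT_MM) * STIFFNESS_N_PER_MM)

def grip(steps=100):
    """Close all three fingers until each holds roughly the target force."""
    for _ in range(steps):
        for finger in range(3):
            error = TARGET_FORCE_N - read_tip_force(finger)
            tendon_mm[finger] += KP * error   # tighten if too soft, relax if too hard
    return [read_tip_force(f) for f in range(3)]

print(grip())   # settles near 0.5 N per finger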

Your berry-picking job is safe for now, though. They don’t have the vision system to actually find the berries. Not yet, anyway. Of course, in the meantime, the gripper could be used for anything that needs a delicate touch.

Oddly, everyone seems to want to develop robots to pick agricultural items. We are usually more interested in a different kind of picking.

Reachy The Robot Gets A Mini (Kit) Version

Reachy Mini is a kit for a compact, open-source robot designed explicitly for AI experimentation and human interaction. The kit is available from Hugging Face, which is itself a repository and hosting service for machine learning models. Reachy seems to be one of their efforts at branching out from pure software.

Our guess is that some form of Stewart Platform handles the head movement.

Reachy Mini is intended as a development platform, allowing people to make and share models for different behaviors, hence the Hugging Face integration to make that easier. On the inside of the full version is a Raspberry Pi, and we suspect some form of Stewart Platform is responsible for the movement of the head. There’s also a cheaper (299 USD) “lite” version intended for tethered use, and a planned simulator to allow development and testing without access to a physical Reachy at all.
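If the head really is on a Stewart platform, the core math is inverse kinematics: for a desired head pose, work out how long each of the six legs must be. The sketch below shows that calculation with a made-up geometry; it is only our guess at how Reachy Mini might work, not anything from the actual design.

# Sketch of Stewart-platform inverse kinematics (illustrative only; the geometry
# below is made up, and the Stewart platform itself is just our guess at
# Reachy Mini's internals). Given a desired head pose, compute leg lengths.
import numpy as np

# Hypothetical attachment points (metres): base_pts on the fixed base,
# head_pts on the moving head plate, each in its own local frame.
angles = np.radians([0, 60, 120, 180, 240, 300])
base_pts = np.c_[0.05 * np.cos(angles), 0.05 * np.sin(angles), np.zeros(6)]
head_pts = np.c_[0.03 * np.cos(angles + 0.3), 0.03 * np.sin(angles + 0.3), np.zeros(6)]

def rotation(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw in radians (Z-Y-X convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def leg_lengths(translation, roll, pitch, yaw):
    """Required length of each of the six legs for the given head pose."""
    R = rotation(roll, pitch, yaw)
    moved = (R @ head_pts.T).T + np.asarray(translation)
    return np.linalg.norm(moved - base_pts, axis=1)

# Example: head raised 8 cm, nodding forward 10 degrees and turned 15 degrees.
print(leg_lengths([0, 0, 0.08], 0.0, np.radians(10), np.radians(15)))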

Reachy has a distinctive head and face, so if you’re thinking it looks familiar, that’s probably because we first covered Reachy the humanoid robot as a project from Pollen Robotics (Hugging Face acquired Pollen Robotics in April 2025).

The idea behind the smaller Reachy Mini seems to be to provide a platform to experiment with expressive human communication via cameras and audio, rather than to be the kind of robot that moves around and manipulates objects.

It’s still early in the project, so if you want to know more you can find a bit more information about Reachy Mini at Pollen’s site and you can see Reachy Mini move in a short video, embedded just below.

Continue reading “Reachy The Robot Gets A Mini (Kit) Version”

A photo of the circuitry along with an oscilloscope

Eight Artificial Neurons Control Fully Autonomous Toy Truck

Recently the [Global Science Network] released a video of using an artificial brain to control an RC truck.

The video shows a neural network composed of eight artificial neurons, assembled on breadboards, used to control a fully autonomous toy truck. The truck is equipped with four proximity sensors: one front, one front-left, one front-right, and one rear. The sensor readings from the truck are transmitted to the artificial brain, which determines which way to turn and whether to go forward or backward. The inputs to each neuron, its “synapses”, can be excitatory, increasing the firing rate, or inhibitory, decreasing it. The output commands are then returned wirelessly to the truck via a hacked remote control.

This particular type of neural network is called a Spiking Neural Network (SNN), which uses discrete events, called “spikes”, instead of continuous real-valued activations. In these networks, when a neuron fires matters as well as the strength of the signal. There are other videos on this channel which go into more depth on these topics.
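For a feel of how a single spiking neuron behaves, below is a software sketch of a leaky integrate-and-fire neuron with excitatory and inhibitory inputs. The real build uses analog circuitry on breadboards and all the constants here are invented, so treat it as an illustration of the concept rather than a reconstruction of the GSN design.

# Software sketch of a leaky integrate-and-fire neuron (illustrative only; the
# real build uses analog circuitry and these constants are invented).
# Excitatory spikes push the membrane potential toward threshold, inhibitory
# spikes push it away; crossing the threshold emits a spike and resets.

THRESHOLD = 1.0
LEAK = 0.95            # membrane potential decays toward rest each timestep
EXC_WEIGHT = 0.3       # contribution of one excitatory input spike
INH_WEIGHT = -0.4      # contribution of one inhibitory input spike

class LIFNeuron:
    def __init__(self):
        self.potential = 0.0

    def step(self, excitatory_spikes, inhibitory_spikes):
        """Advance one timestep; returns True if the neuron fires."""
        self.potential *= LEAK
        self.potential += EXC_WEIGHT * excitatory_spikes + INH_WEIGHT * inhibitory_spikes
        if self.potential >= THRESHOLD:
            self.potential = 0.0   # reset after firing
            return True
        return False

# A nearby obstacle (steady excitatory input) makes the neuron fire repeatedly;
# the resulting spike rate could gate a "turn away" motor command.
neuron = LIFNeuron()
spikes = [neuron.step(excitatory_spikes=1, inhibitory_spikes=0) for _ in range(20)]
print(spikes.count(True), "spikes in 20 timesteps")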

The name of this experimental vehicle is the GSN SNN 4-8-24-2 Autonomous Vehicle, which is short for: Global Science Network Spiking Neural Network 4 Inputs 8 Neurons 24 Synapses 2 Degrees of Freedom Output. The circuitry on both the vehicle and the breadboards is littered with LEDs which give some insight into how it all functions.

If you’re interested in how neural networks can control behavior you might like to see a digital squid’s behavior shaped by a neural network.

Continue reading “Eight Artificial Neurons Control Fully Autonomous Toy Truck”