Building A Robotic Arm Without Breaking The Bank

There are probably at least as many ways to construct a robotic arm as there are uses for them. In the case of [Thomas Sanladerer], his primary requirement for the robotic arm was to support a digital camera, which apparently has to be capable of looking vaguely menacing in a completely casual manner. Meet Caroline, whose styling and color scheme is completely coincidental and does not promise yummy moist cake for anyone who is still alive after all experiments have been run.

Unlike typical robotic arms where each joint in the arm is directly driven by a stepper motor or similar, [Thomas] opted to use a linear rail that pushes or pulls the next section of the arm in a manner that’s reminiscent of the action of opposing muscles in our mammalian appendages. This 3D printer-inspired design is pretty sturdy, but the steppers like to skip steps, so he is considering replacing them with brushless motors.

Beyond this, the rest of the robotic arm uses hollow aluminium stock, a lot of 3D-printed sections, and, for the head, a bunch of Waveshare ST3215 servos with internal magnetic encoders for angle control. One of these ~€35 ST3215s did cook itself during testing, which is somewhat worrying. Overall, the total cost was a few hundred Euros, which for a nine-degree-of-freedom robotic arm like this isn’t too terrible.

Continue reading “Building A Robotic Arm Without Breaking The Bank”

Talking Robot Uses Typewriter Tech For Mouth

Many decades ago, IBM engineers developed the typeball. This semi-spherical hunk of metal would become the heart of the Selectric typewriter line. [James Brown] has now leveraged that very concept to create a pivoting mouth mechanism for a robot that appears to talk.

What you’re looking at is a plastic ball with lots of different mouth shapes on it. By pivoting the ball to different angles inside the head of a robot, it’s possible to display different mouth shapes on the face. By swapping mouth shapes rapidly in concert with recorded speech, it’s possible to make the robot appear to be speaking. We don’t get a great look at the mechanism that operates the ball, but Selectric typeball operation is well documented elsewhere if you seek to recreate the idea yourself.
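If you wanted to experiment with the control side of the idea, a minimal sketch might look something like the following: a table mapping visemes (mouth shapes) to ball rotation angles, stepped through in time with the recorded speech. The shape names, angles, and servo interface here are purely hypothetical placeholders, not details from [James Brown]’s build.

```python
import time

# Hypothetical mapping of visemes (mouth shapes) to (pan, tilt) angles on the
# typeball-style mouth sphere -- the values are made up for illustration.
VISEME_ANGLES = {
    "rest": (0, 0),
    "ah":   (30, 0),
    "ee":   (60, 0),
    "oh":   (30, 45),
    "mm":   (0, 45),
}

def play_mouth_track(track, set_ball_angles):
    """Step through (viseme, duration_s) pairs in sync with recorded speech.

    set_ball_angles is whatever function drives the two pivot servos.
    """
    for viseme, duration in track:
        pan, tilt = VISEME_ANGLES.get(viseme, VISEME_ANGLES["rest"])
        set_ball_angles(pan, tilt)  # pivot the ball to show this mouth shape
        time.sleep(duration)        # hold until the next phoneme

# Example: "hello" roughly broken down into visemes
play_mouth_track(
    [("mm", 0.08), ("ee", 0.12), ("ah", 0.10), ("oh", 0.15), ("rest", 0.20)],
    set_ball_angles=lambda pan, tilt: print(f"pan={pan:>3} tilt={tilt:>3}"),
)
```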

The real benefit of this mechanism is speed. It might not look as fluid as some robots with manually-articulated flexible mouths, but the rapid mouth transitions really help sell the effect because they match the pace of speech. [James] demonstrated the finished product on Mastodon, and it looks great in action.

This isn’t the first time we’ve featured [James Brown]’s work. You may recall he got DOOM running on a tiny LEGO brick a few years back.

Thanks to [J. Peterson] for the tip!

A man standing next to a host of small automatic trash cans

Automated Rubbish Removal System

The hackers over at [HTX Studio] built a set of twenty trash cans which can automatically catch and remove rubbish.

In order to catch trash, a bin needs to do two things: detect where the trash will land, and then get there fast. The second part is easy: three big motors with wheels under the bin. But how does a bin know where the trash will land? It uses a camera installed in the bin itself for that.

[HTX Studio] iteratively trained a model to process visual information from the camera and identify common types of trash. When it sees a trained object flying through the air, it rushes to catch it where it will land. After many rounds of fine-tuning, it finally started to work reliably.
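For a feel of the “where will it land” half of the problem, here is a minimal sketch that fits a ballistic arc to a few tracked positions of the thrown object and predicts where it comes back down to the height of the bin’s rim. The coordinates, units, and the assumption that the camera pipeline already delivers world-space positions are ours, not details from [HTX Studio]’s implementation.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def predict_landing(times, xs, ys, zs, rim_height=0.4):
    """Fit a ballistic arc to tracked (x, y, z) positions and return the
    (x, y) point where the object falls back to rim_height.

    x and y are horizontal coordinates in metres, z is height, and the
    camera pipeline is assumed to have already converted pixels to world
    coordinates.
    """
    t = np.asarray(times, dtype=float)
    # Horizontal motion: constant velocity, so fit a straight line per axis.
    vx, x0 = np.polyfit(t, xs, 1)
    vy, y0 = np.polyfit(t, ys, 1)
    # Vertical motion: fix the quadratic term at -g/2 and fit what remains.
    vz, z0 = np.polyfit(t, np.asarray(zs) + 0.5 * G * t**2, 1)
    # Solve -g/2 * t^2 + vz * t + z0 = rim_height for the later root.
    a, b, c = -0.5 * G, vz, z0 - rim_height
    t_land = (-b - np.sqrt(b**2 - 4 * a * c)) / (2 * a)
    return x0 + vx * t_land, y0 + vy * t_land

# Three tracked points from a short toss; the bin would drive toward this spot.
print(predict_landing([0.00, 0.05, 0.10],
                      xs=[0.0, 0.10, 0.20],
                      ys=[0.0, 0.02, 0.04],
                      zs=[1.2, 1.28, 1.34]))
```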

Continue reading “Automated Rubbish Removal System”

Robots Are Coming For Your Berry Good Job

We don’t know if picking blackberries at scale is something people enjoy doing. But if you do, we have bad news. The University of Arkansas wants to put you out of a job in favor of your new robot overlord. It turns out that blackberries in Arkansas alone are a $24 million business. The delicate berries are typically hand-picked.

The robot hand that can do the same job has three soft fingers and tendons made from guitar strings. Each finger has a force sensor at the tip so it can squeeze the berries just right. How much force does it take to grab a blackberry? To find out, researchers placed sensors on the fingers of experienced pickers and used the data to guide their design. Researchers claim they were inspired by the motion of a tulip opening and closing each day.
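For illustration, here is a heavily simplified sketch of what a force-limited grip loop could look like: curl each finger until its tip sensor reports a target squeeze force, then stop. The threshold value and the sensor/actuator interfaces are placeholders of ours, not the University of Arkansas design.

```python
TARGET_FORCE_N = 0.5  # made-up squeeze target; the real value came from
                      # sensors worn by experienced human pickers
STEP_DEG = 1          # how far to curl each finger per control tick

def grip_berry(fingers):
    """Close each soft finger until its tip force sensor hits the target.

    Each finger is assumed to expose read_tip_force() in newtons and
    curl(degrees) to tension its guitar-string tendon slightly.
    """
    moving = True
    while moving:
        moving = False
        for finger in fingers:
            if finger.read_tip_force() < TARGET_FORCE_N:
                finger.curl(STEP_DEG)  # keep closing this finger
                moving = True          # at least one finger is still moving
    # All fingertips now press the berry at roughly the target force.

# Minimal stand-in finger so the loop can be run without hardware.
class FakeFinger:
    def __init__(self):
        self.angle = 0
    def read_tip_force(self):
        return max(0.0, (self.angle - 20) * 0.05)  # no contact until 20 degrees
    def curl(self, degrees):
        self.angle += degrees

fingers = [FakeFinger() for _ in range(3)]
grip_berry(fingers)
print([f.angle for f in fingers])  # the angle at which each finger stopped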

Your berry-picking job is safe for now, though. They don’t have the vision system to actually find the berries. Not yet, anyway. Of course, in the meantime, the gripper could be used for anything that needs a delicate touch.

Oddly, everyone seems to want to develop robots to pick agricultural items. We are usually more interested in a different kind of picking.

Reachy The Robot Gets A Mini (Kit) Version

Reachy Mini is a kit for a compact, open-source robot designed explicitly for AI experimentation and human interaction. The kit is available from Hugging Face, which is itself a repository and hosting service for machine learning models. Reachy seems to be one of their efforts at branching out from pure software.

Reachy Mini is intended as a development platform, allowing people to make and share models for different behaviors, hence the Hugging Face integration to make that easier. On the inside of the full version is a Raspberry Pi, and we suspect some form of Stewart Platform is responsible for the movement of the head. There’s also a cheaper (299 USD) “lite” version intended for tethered use, and a planned simulator to allow development and testing without access to a physical Reachy at all.

Reachy has a distinctive head and face, so if you’re thinking it looks familiar, that’s probably because we first covered Reachy the humanoid robot as a project from Pollen Robotics (Hugging Face acquired Pollen Robotics in April 2025).

The idea behind the smaller Reachy Mini seems to be to provide a platform to experiment with expressive human communication via cameras and audio, rather than to be the kind of robot that moves around and manipulates objects.

It’s still early in the project, so if you want to know more, you can find a bit more information about Reachy Mini at Pollen’s site, and you can see Reachy Mini move in the short video embedded just below.

Continue reading “Reachy The Robot Gets A Mini (Kit) Version”

A photo of the circuitry along with an oscilloscope

Eight Artificial Neurons Control Fully Autonomous Toy Truck

Recently, [Global Science Network] released a video showing an artificial brain controlling an RC truck.

The video shows a neural network made up of eight artificial neurons assembled on breadboards, used to control a fully autonomous toy truck. The truck is equipped with four proximity sensors: one front, one front-left, one front-right, and one rear. The sensor readings from the truck are transmitted to the artificial brain, which determines which way to turn and whether to go forward or backward. The inputs to each neuron, the “synapses”, can be excitatory to increase the firing rate or inhibitory to decrease the firing rate. The output commands are then returned wirelessly to the truck via a hacked remote control.

This particular type of neural network is called a Spiking Neural Network (SNN), which uses discrete events, called “spikes”, instead of continuous real-valued activations. In these networks, the timing of when a neuron fires matters as well as the strength of the signal. There are other videos on this channel which go into more depth on these topics.
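To make the “when a neuron fires matters” point concrete, here is a minimal leaky integrate-and-fire neuron in software: excitatory input pushes the membrane potential toward a threshold, inhibitory input pulls it back, and the neuron emits a spike and resets when the threshold is crossed. The constants are illustrative, not measured from the GSN hardware.

```python
class LIFNeuron:
    """Minimal leaky integrate-and-fire neuron with illustrative constants."""

    def __init__(self, threshold=1.0, leak=0.95):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak  # fraction of potential retained each time step

    def step(self, excitatory, inhibitory):
        """Integrate weighted inputs for one time step; return True on a spike."""
        self.potential = self.potential * self.leak + excitatory - inhibitory
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True
        return False

# Example: steady excitatory drive makes the neuron spike at a regular rate;
# adding more inhibition would lower that rate.
neuron = LIFNeuron()
spikes = [neuron.step(excitatory=0.3, inhibitory=0.1) for _ in range(50)]
print(sum(spikes), "spikes in 50 steps")
```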

The name of this experimental vehicle is the GSN SNN 4-8-24-2 Autonomous Vehicle, which is short for: Global Science Network Spiking Neural Network 4 Inputs 8 Neurons 24 Synapses 2 Degrees of Freedom Output. The circuitry on both the vehicle and the breadboards is littered with LEDs which give some insight into how it all functions.

If you’re interested in how neural networks can control behavior you might like to see a digital squid’s behavior shaped by a neural network.

Continue reading “Eight Artificial Neurons Control Fully Autonomous Toy Truck”

Diffuse glow of red, green, and blue LEDs embedded in silicone

Embedded LEDs For Soft Robots Made From Silicone

Over on their YouTube channel, [Science Buddies] shows us how to embed LEDs in soft robots. Soft robots can be made entirely or partially from silicone. In the video you see an example of a claw-like gripper made entirely from silicone. You can also use silicone to make “skin”. The skin can stretch, and the degree of stretch can be measured by means of an embedded sensor made from stretchy conductive fabric.

As silicone is translucent, LEDs embedded within it will emit a diffuse glow when illuminated. Stranded wire is best for flexibility, and the video demonstrates how to loop the wires back and forth into a spring-like shape so they can expand and contract along the axis that will stretch. Alternatively, you can wire in the LEDs without bending the wires if you run them along an axis that won’t stretch.

The video shows how to make silicone skin by layering a two-part mixture into a mold. A base layer of silicone is followed by a strip of conductive fabric and the LED with its wires. Then another layer of silicone is applied to completely cover and seal the fabric and LED in place. Tape is used to hold the fabric and LED in position while the final layer of silicone is applied.

When the LEDs are embedded in silicone, there is little airflow to cool them, so be sure to use a large series resistor to limit the current through the LED as much as possible and prevent overheating. A 1 kΩ series resistor would be a good value to try first. If you need the LED to be brighter, you will need to decrease the resistance, but make sure you’re not generating too much heat when you do so.
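As a quick sanity check on that resistor choice, the arithmetic is plain Ohm’s law; the supply and forward voltages below are typical assumed values, not figures from the video.

```python
def led_current_ma(v_supply, v_forward, r_ohms):
    """Current through an LED with a simple series resistor (Ohm's law)."""
    return (v_supply - v_forward) / r_ohms * 1000

# Assumed 5 V supply and a ~2 V red LED: a 1 kOhm resistor gives about 3 mA,
# gentle enough for an LED sealed in silicone with no airflow.
print(round(led_current_ma(5.0, 2.0, 1000), 1), "mA")
```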

If you’re interested in stretchy circuits you might also like to read about flexible circuits built on polyimide film.

Continue reading “Embedded LEDs For Soft Robots Made From Silicone”