A man next to a robot with animatronic eyes and a CRT display showing an audio waveform

Animatronic Alexa Gives Amazon’s Echo A Face

Today, we’re surrounded by talking computers and clever AI systems that a few decades ago only existed in science fiction. But they definitely looked different back then: instead of a disembodied voice like ChatGPT, classic sci-fi movies typically featured robots that had something resembling a human body, with an actual face you could talk to. [Thomas] over at Workshop Nation thought this human touch was missing from his Amazon Echo, and therefore set out to give Alexa a face in a project he christened Alexatron.

The basic idea was to design a device that would somehow make the Echo's voice visible and, at the same time, provide a pair of eyes that move in a lifelike manner. For the voice, [Thomas] decided to use the CRT from a small black-and-white TV. By hooking up the Echo's audio signal to the TV's vertical deflection circuitry, he turned it into a rudimentary oscilloscope that shows Alexa's waveform in real time. An acrylic enclosure keeps the CRT's high voltage safely out of reach while leaving everything inside clearly visible.

To complete the face, [Thomas] made a pair of animatronic eyes according to a design by [Will Cogley]. Built from just a handful of 3D-printed parts and six servos, the mechanism lets the eyes move in all directions and blink just like a real person's. Thanks to a "person sensor", which is basically a smart camera that detects faces, the eyes automatically follow anyone standing in front of the system. They stay closed while the system is dormant, but open and start looking for nearby faces when the Echo hears its wake word, much as a human or animal responds to its name.
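The face-following loop is conceptually simple. Here's a minimal Python sketch of how it might look on a small board driving the eye servos, assuming a person sensor that reports the largest detected face as normalized coordinates; read_largest_face() and the pin numbers are our own placeholders, not [Thomas]'s actual code:

```python
# Minimal eye-tracking sketch, assuming a person sensor that reports the
# largest detected face as normalized (x, y) image coordinates.
import time
from gpiozero import AngularServo

# Two of the six servos: one pans both eyes, one tilts them (pins assumed).
pan = AngularServo(17, min_angle=-45, max_angle=45)
tilt = AngularServo(18, min_angle=-30, max_angle=30)

def read_largest_face():
    """Hypothetical helper: poll the person sensor over I2C and return the
    largest face's center as (x, y) in [0, 1], or None when nobody is seen."""
    return None  # placeholder: substitute the sensor's actual protocol here

while True:
    face = read_largest_face()
    if face is not None:
        x, y = face
        # Map normalized image coordinates to servo deflection angles.
        pan.angle = (x - 0.5) * 90    # -45..+45 degrees
        tilt.angle = (0.5 - y) * 60   # -30..+30 degrees
    time.sleep(0.05)  # ~20 Hz update keeps the motion looking smooth
```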

The end result absolutely looks the part: we especially like the eye tracking feature, which gives it that human-like appearance that [Thomas] was aiming for. He isn’t the first person trying to give Alexa a face, though: there are already cute Furbys and creepy bunnies powered by Amazon’s AI, and we’ve even seen Alexa hooked up to an animatronic fish.

Continue reading “Animatronic Alexa Gives Amazon’s Echo A Face”

Pill Bugs And Chitons Get Jobs As Tiny Grippers

A research paper titled Biological Organisms as End Effectors explores the oddball approach of giving small animals jobs as grippers at the end of a robotic arm. The researchers show that pill bugs and chitons, small creatures with exoskeletons and reflexive movements, exhibit behaviors that make them useful as grippers, with no harm done to the creatures in the process. The prototypes are really just proofs of concept, but it's a novel idea that does work, at least in a simple way.

Pill bugs reflexively close up, and in the process can grasp and hold lightweight objects. Release is simply a matter of time: the researchers report that after about 115 seconds the pill bug's shell opens and the held object is let go naturally. Better control over release would be welcome, but the tests show the basic functionality is there.

The chiton — a small mollusk — can grip underwater.

Another test involves the chiton, a small mollusk that attaches to things with suction and can act as an underwater end effector in a similar way. Interestingly, a chiton can secure itself to wood and cork, materials on which typical suction cups simply do not work.

A chiton can also manipulate a gripped object's orientation. Chitons seek out dark areas, so by shining a light the researchers could control the direction in which the creature attempts to "walk", thereby steering the held object. A chiton's grip is strong, but its release proved less predictable than the pill bugs': a chiton seems to let go of an object more or less when it feels like it.

This concept may remind readers, somewhat grimly, of grippers made from dead spiders, but the researchers emphasize an imperative not to mistreat these living creatures, instead treating them carefully as we temporarily employ them, much as dog sleds and horses have been used for transportation, or carrier pigeons for messages. Short videos of both the pill bug and chiton grippers are embedded below, just under the page break.

Continue reading “Pill Bugs And Chitons Get Jobs As Tiny Grippers”

DIY Robotic Actuator Built For Walking Robots

[Aaed Musa] has built a variety of robots over the years, but found off-the-shelf servos underwhelming for his work. Thus, he set out to build a better actuator to support his goal of building a high-performance walking robot in the future.

[Aaed] decided to try and build a quasi-direct drive actuator, similar to those used in MIT’s agile mini Cheetah robot. It consists of a powerful brushless DC motor driving a 9:1 planetary gear reduction built with 3D printed parts, which provides high torque output. It’s designed to be run with an ODrive S1 motor controller with encoder feedback for precise control.
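To get a feel for what driving such an actuator looks like in practice, here's a minimal sketch using the ODrive Python tool. Enum and attribute names vary between ODrive firmware versions, and the gear-ratio bookkeeping is our own illustration rather than [Aaed]'s code, so treat it as a sketch rather than a drop-in script:

```python
# Illustrative position control of a quasi-direct drive actuator through
# an ODrive controller; names may differ across ODrive firmware versions.
import odrive
from odrive.enums import AXIS_STATE_CLOSED_LOOP_CONTROL

GEAR_RATIO = 9  # 9:1 planetary reduction: 9 motor turns per output turn

odrv = odrive.find_any()  # connect to the first ODrive found over USB
axis = odrv.axis0
axis.requested_state = AXIS_STATE_CLOSED_LOOP_CONTROL

def move_output_to(output_turns):
    """Command a position in output-shaft turns, converted to motor turns."""
    axis.controller.input_pos = output_turns * GEAR_RATIO

move_output_to(0.25)  # rotate the output shaft a quarter turn
```

Note that the encoder feedback mentioned above is what closes the loop here: the motor torque also gets multiplied by the 9:1 reduction, which is where the 16 Nm holding figure comes from.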

The actuator weighs in at a total of 935 grams. It's not cheap, with the bill of materials totaling just under $250. For your money, though, you get a responsive robotic actuator with a hefty holding torque of over 16 Nm, which [Aaed] demonstrates by having the actuator shake some dumbbells around on a long lever arm.

Walking robots have exploded in popularity ever since Spot hit the scene. We’ve seen everything from complex builds to super-simple single-servo designs.

Continue reading “DIY Robotic Actuator Built For Walking Robots”

A wooden robot with a large fresnel lens in a sunny garden

Gardening Robot Uses Sunlight To Incinerate Weeds

Removing weeds is a chore few gardeners enjoy, as it typically involves long sessions of kneeling in the dirt and digging around for anything you don’t remember planting. Herbicides also work, but spraying poison all over your garden comes with its own problems. Luckily, there’s now a third option: [NathanBuildsDIY] designed and built a robot to help him get rid of unwanted plants without getting his hands dirty.

Constructed mostly from scrap pieces of wood and riding on a pair of old bicycle wheels, the robot has a pretty low-tech look to it. But it is in fact a very advanced piece of engineering, using multiple sensors and actuators and running on a sophisticated software platform. The heart of the system is a Raspberry Pi, which drives a pair of DC motors to move the whole system through [Nathan]'s garden while scanning the ground below with a camera.

Machine vision software identifying a weed in a picture of garden soil.

The Pi runs the camera's pictures through a TensorFlow Lite model that can identify weeds. [Nathan] built this model himself by taking hundreds of pictures of his garden and manually sorting them into categories like "soil", "plant" and "weed". Once a weed has been detected, the robot proceeds to destroy it by concentrating sunlight onto it through a large Fresnel lens. The lens is mounted in a frame that can be moved in three dimensions through a set of servos. A movable lens cover turns the incinerator beam on or off.
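On the Pi, that classification step boils down to a few lines of TensorFlow Lite boilerplate. The sketch below shows the general shape of it; the model filename, labels and input size are placeholders, not [Nathan]'s actual files:

```python
# Minimal TensorFlow Lite classification sketch for a three-class
# soil/plant/weed model; filename and input handling are assumptions.
import numpy as np
from tflite_runtime.interpreter import Interpreter  # or tensorflow.lite

LABELS = ["soil", "plant", "weed"]

interpreter = Interpreter(model_path="weed_classifier.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def classify(frame):
    """frame: HxWx3 uint8 image, already resized to the model's input shape."""
    interpreter.set_tensor(inp["index"], np.expand_dims(frame, axis=0))
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return LABELS[int(np.argmax(scores))]
```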

Sunlight is focused onto the weed through a simple but clever two-step procedure. First, the rough position of the lens relative to the sun is adjusted with the help of a sun tracker made from four light sensors arranged around a cross-shaped cardboard structure. Then, the shadow cast by the lens cover onto the ground is observed by the Pi’s camera and the lens is focused by adjusting its position in such a way that the image formed by four holes in the lens cover ends up right on top of the target.
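The coarse tracking step amounts to balancing the opposite sensor pairs. The following Python sketch shows the idea, with hypothetical servo and ADC stand-ins rather than anything from [Nathan]'s build:

```python
# Four-sensor sun tracker sketch: nudge the aiming servos until opposite
# light sensors read roughly equal. read_ldr() and the servo objects are
# hypothetical stand-ins for the robot's actual ADC and actuators.
DEADBAND = 0.05  # ignore small differences so the servos don't hunt
STEP = 1.0       # degrees of correction per control tick

def track_sun_step(pan_servo, tilt_servo, read_ldr):
    """One control tick; call repeatedly until both axes settle."""
    top, bottom, left, right = (read_ldr(ch) for ch in range(4))
    if abs(top - bottom) > DEADBAND:
        tilt_servo.angle += STEP if top > bottom else -STEP
    if abs(left - right) > DEADBAND:
        pan_servo.angle += STEP if left > right else -STEP
```

The fine-focusing stage then takes over, using the camera to walk the image of the four lens-cover holes onto the weed before the cover opens.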

Once the focus is correct, the lens cover is removed and the weed is burned to a crisp by the concentrated sunlight. It’s pretty neat to see how well this works, although [Nathan] recommends you keep an eye on the robot while it’s working and don’t let it near any flammable materials. He describes the build process in full detail in his video (embedded below), hopefully enabling other gardeners to make their own, improved weed burner robots. Agricultural engineers have long been working on automatic weed removal, often using similar machine vision systems with various extermination methods like lasers or flamethrowers.

Continue reading “Gardening Robot Uses Sunlight To Incinerate Weeds”

MeArm 3.0: The Pocket-Sized Robot Arm

We all might dream of having an industrial robot arm at our disposal, complete with a working controller that doesn't need constant maintenance and replacement parts, and able to help with other projects with only a minimum of coding or instruction. That's a pipe dream for most of us: without a large space, sufficient funding, or unlimited troubleshooting time, we'll almost always have to look for something smaller and simpler. Perhaps something even as small as this pocket-sized robotic arm.

This isn’t actually the first time we’ve seen the MeArm; the small robot has been around since 2014 and has undergone a number of revisions and upgrades. Even this revision has been out for a little while now but this latest in the series is now available with a number of improvements over the older models. The assembly time required has been reduced from two hours to about 30 minutes and the hardware has even been fully open-sourced as well which allows virtually anyone with the prerequisite tools to build this tiny robot for whatever they happen to need it for, due to its very permissive licensing.

The linked Instructable goes into every detail needed to build the robot and documents all of the required parts, although you will need access to some specialty tools to make a lot of them. We also featured a Friday Hack Chat about the MeArm back in 2018 with some interesting background on the project. And although this is a relatively small robot in the grand scheme of things, it's always possible to upgrade to something larger in the future.

Continue reading “MeArm 3.0: The Pocket-Sized Robot Arm”

Robodog Goes Free Thanks To Unofficial SDK

What's better than a pretty nice legged robot? One with an alternative SDK that opens up expensive features, of course. The author didn't like that the original SDK only came as pre-compiled binaries restricted to the most expensive models, so they rolled up their sleeves and started writing a new one.

The manufacturer’s SDK limits access to programmatic functions, but that needn’t stop you.

There are a number of commercially-available robotic quadrupeds that can trace their heritage back to the MIT Mini Cheetah design, and one of them is the Unitree Go1 series which sports a distinctive X-shaped sensor cluster on its “face”. The basic models are affordable (as far as robots go, anyway) but Unitree claims only the high-priced EDU model can be controlled via the SDK. Happily, the Free Dog SDK provides a way to do exactly that.

The SDK is a work in progress, but fully usable and allows the user to send various high level and low level commands to the Go1 robots. High level examples include things like telling the robot to perform pushups, turn 90 degrees, or walk. Low level commands are things like specifying exact positions or torque levels for individual limbs. With the new SDK, doing those things programmatically is only a Python script away.
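We haven't dug into the exact interface, so the class and method names in the snippet below are purely illustrative stand-ins for whatever the Free Dog SDK actually exposes; check the project's own examples for the real thing. Still, it shows the flavor of the high-level/low-level split:

```python
# Illustrative only: connection class, command names and parameters here are
# assumptions sketched from the description above, NOT the Free Dog SDK's
# verified API. Consult the SDK's own examples for real usage.
from freedog import Go1Connection  # hypothetical import

robot = Go1Connection("192.168.12.1")  # robot's address: an assumption

# High-level: named behaviors handled by the robot's built-in controller.
robot.high_command("pushup")
robot.high_command("walk", vx=0.2, yaw_rate=0.0)  # amble forward slowly

# Low-level: direct per-joint targets, bypassing the built-in gaits.
robot.low_command(joint="FR_knee", position=-1.1, torque_limit=5.0)
```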

Know any other robots that might be based on the same system? This SDK might work on them, too.

The Many Robots That Ventured Into The Chernobyl NPP #4 Reactor

Before the Chernobyl Nuclear Power Plant (ChNPP, spelled ‘Chornobyl’ in Ukrainian) disaster in 1986, there had been little need for radiation-resistant robots to venture into high-risk zones.

The MF-2 Joker, also used for clearing debris at the Chernobyl NPP #4 disaster site.

Yet in the aftermath of the massive steam explosion at the #4 reactor, which ripped the building apart and spread radioactive material across the USSR and Europe, such robots were badly needed to explore the site and assist with the clean-up. The robots that were hastily developed and deployed are the subject of a recent video by [The Chornobyl Family].

Some robots were more successful than others (the MF-2 remote mine-handling robot suffered electronic breakdowns), but the machines gradually became more refined. As the tasks shifted over the years from disaster management to clean-up and maintenance of the now-entombed #4 reactor, so too did the robots. The TR-4 and TR-5 were two of the later designs, developed to take samples of material from within the stricken reactor, with many more generations to follow.

The video also reveals the fate of many of these robots. Some are buried in a radioactive disposal site; others can still be found around Pripyat, whether set up as tourist pieces or swallowed by shrubbery. What's beyond doubt is that these robots provided invaluable help and saved countless lives, thanks to the engineers behind them.

Continue reading “The Many Robots That Ventured Into The Chernobyl NPP #4 Reactor”