Reverse Engineering A Robot Mower’s Fence

There are a variety of robot mower systems on the market employing different navigation methods, and [Eelco] has the story of how one of these was reverse engineered. Second-hand Roomba lawnmowers kept appearing for very low prices, but without the electronics driving the buried-wire fence that keeps them from going astray. Getting them running again provides a handy insight into their operation.

The wire fence is a loop of wire in the ground, so it was modeled with a few-ohm resistor, and the waveform a working driver produced across it was captured with an oscilloscope. The resulting 3 kHz waveform, surprisingly to us at least, doesn’t appear to encode any information, so it could be replicated easily enough with an ESP32 microcontroller. An LM386 audio amplifier drives the loop, and with a bit of amplitude adjustment the mower is quite happy inside its fake fence.
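Since the waveform carries no data, generating it is about as simple as embedded jobs get. Here’s a minimal sketch of the idea, assuming MicroPython on the ESP32; the GPIO pin and duty cycle are our assumptions, not details from the project:

```python
# 3 kHz square wave out of an ESP32, destined for an LM386 driving the wire loop.
# GPIO 25 and the 50% duty cycle are assumptions, not details from the article.
from machine import Pin, PWM

loop_drive = PWM(Pin(25))
loop_drive.freq(3000)        # match the 3 kHz waveform seen on the scope
loop_drive.duty_u16(32768)   # 50% duty; trim the loop amplitude at the LM386 instead
```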

Robot mower hacking has become quite the thing around here.

A camera-based microscope is on a stand, looking down towards a slide which is held on a plastic stage. The stage is held in place by three pairs of brass rods, which run to red plastic cranks mounted to three stepper motors. On the opposite side of each crank from the connecting rod is a semicircular array of magnets.

Designing An Open Source Micro-Manipulator

When you think about highly-precise actuators, stepper motors probably aren’t the first device that comes to mind. However, as [Diffraction Limited]’s sub-micron capable micro-manipulator shows, they can reach extremely fine precision when paired with external feedback.

The micro-manipulator is made of a mobile platform supported by three pairs of parallel linkages, each linkage actuated by a crank mounted on a stepper motor. Rather than attaching to the structure with the more common flexures, these linkages swivel on ball joints. To minimize the effects of friction, the linkage bars are very long compared to the balls, and the wide range of allowed angles lets the manipulator’s stage move 23 mm in each direction.

To have precision as well as range, the stepper motors needed closed-loop control, which a magnetic rotary encoder provides. The encoder can divide a single rotation of a magnet into 100,000 steps, but this wasn’t enough for [Diffraction Limited]; to increase its resolution, he attached an array of alternating-polarity magnets to the rotor and positioned the magnetic encoder near these. As the rotor turns, the encoder’s local magnetic field rotates rapidly, creating a kind of magnetic gear.
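The arithmetic behind the trick is straightforward: with N alternating pole pairs on the rotor, the encoder sees N electrical revolutions for every mechanical one, multiplying the effective resolution by N. The pole-pair count below is purely illustrative; check the video for the real figure.

```python
# Back-of-the-envelope resolution gain from the magnetic "gear".
encoder_counts = 100_000   # counts per magnet revolution, per the encoder spec
pole_pairs = 10            # ASSUMED pole-pair count, for illustration only

counts_per_rev = encoder_counts * pole_pairs
print(f"{counts_per_rev} counts/rev = {360 / counts_per_rev * 3600:.2f} arc-seconds/count")
```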

A Raspberry Pi Pico 2 and three motor drivers control this creation; even here, the attention to detail is impressive. The motor drivers couldn’t have internal charge pumps or clocked logic units, since these introduce tiny timing errors and motion jitter. The carrier circuit board is double-sided and uses through-hole components for ease of replication; in a nice touch, the lower silkscreen displays pin numbers.
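The closed-loop idea itself fits in a few lines: read the encoder, compare to the target, and step until the error falls within tolerance. This is only a sketch of that loop in MicroPython; the pins, microstepping, and the read_angle() encoder helper are all placeholders, and the real firmware is considerably more sophisticated.

```python
# Skeleton of a closed-loop stepper move (MicroPython, Raspberry Pi Pico).
from machine import Pin
import time

step_pin = Pin(2, Pin.OUT)        # assumed wiring
dir_pin = Pin(3, Pin.OUT)

def read_angle():
    """Placeholder for the magnetic encoder readout (e.g. over SPI or I2C)."""
    raise NotImplementedError

def move_to(target_deg, tolerance=0.001):
    while True:
        err = target_deg - read_angle()
        if abs(err) <= tolerance:
            break                 # inside the dead band: stop stepping
        dir_pin.value(1 if err > 0 else 0)
        step_pin.value(1)
        time.sleep_us(5)          # step pulse width
        step_pin.value(0)
        time.sleep_us(200)        # crude fixed step rate
```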

To test the manipulator’s capabilities, [Diffraction Limited] used it to position a chip die under a microscope. To gauge its accuracy and repeatability, he had it trace the first-layer path a slicer generated for a vastly scaled-down Benchy. When run slowly to reduce thermal drift, it could trace the Benchy within a 20-micrometer square, with a resolution of about 50 nanometers.

He’s already used the micro-manipulator to couple an optical fiber with a laser, but [Diffraction Limited] has some other uses in mind, including maskless lithography (perhaps putting the stepper in “wafer stepper”), electrochemical 3D printing, focus stacking, and micromachining. For another promising take on small-scale manufacturing, check out the RepRapMicron.

Continue reading “Designing An Open Source Micro-Manipulator”

Robotic Canoe Puts Robot Arms To Work

Most robots get around with tracks or wheels, but [Dave] had something different in mind. Sufficiently unbothered by the prospect of mixing electronics and water, [Dave] augmented a canoe with twin, paddle-bearing robotic arms to bring to life a concept he had: the RowboBoat. The result? A canoe that can paddle itself with robotic arms, leaving the operator free to take a deep breath, sit back, and concentrate on not capsizing.

There are a couple of things we really like about this build, one of which is the tidiness of the robotic platform, which attaches non-destructively to the canoe with custom brackets. Built from aluminum extrusion and those brackets, it was designed with the help of a 3D scan of the canoe. A canoe, after all, has nary a straight edge nor a right angle in sight. Being able to pull a 3D model into CAD helps immensely in such cases; we have also seen this technique used in refitting a van into an off-grid camper.

The other thing we like is the way that [Dave] drives the arms. The two PiPER robotic arms are driven with ROS, the Robot Operating System, running on a nearby Jetson Orin Nano SBC. The clever part is [Dave]’s observation that paddling and steering a canoe has a lot in common with a differential drive, akin to how a tank works. And so, for propulsion, ROS simply treats the paddle-bearing arms as though they were wheels in a differential drive, as sketched below. The arms don’t seem to mind a little water, and the rest of the electronics are protected by a pair of firmly-crossed fingers.
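For the curious, the mixing ROS does here is the textbook differential-drive relation: forward speed moves both “wheels” together, rotation moves them in opposition. A minimal stand-in, with the function name and track width invented rather than taken from [Dave]’s actual configuration:

```python
# Textbook differential-drive mixing, paddles standing in for wheels.
def twist_to_paddles(linear_x, angular_z, track=0.8):
    """Map a ROS Twist (m/s, rad/s) to left/right paddle rates."""
    left = linear_x - angular_z * track / 2
    right = linear_x + angular_z * track / 2
    return left, right

print(twist_to_paddles(1.0, 0.0))  # straight ahead: (1.0, 1.0)
print(twist_to_paddles(0.0, 1.0))  # spin in place: (-0.4, 0.4)
```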

The canoe steers by joystick, but being driven by ROS it could be made autonomous with a little more work. [Dave] has his configuration and code for RowboBoat up on GitHub should anyone wish to take a closer look. Watch it in action in the video, embedded below.

Continue reading “Robotic Canoe Puts Robot Arms To Work”

Animatronic Eyes Are Watching You

If you haven’t been following [Will Cogley]’s animatronic adventures on YouTube, you’re missing out. He’s got a good thing going, and the latest step is an adorable robot that tracks you with its own eyes.

Yes, the cameras are embedded inside the animatronic eyes. That was a lot easier than expected; rather than the redesign he was afraid of, [Will] was able to route the camera cable through his existing animatronic mechanism, and only needed to hollow out the eyeball. The tiny camera’s aperture sits nigh-undetectable within the pupil.

On the software side, face tracking is provided by MediaPipe. It’s currently running on a laptop, but the plan is to embed a Raspberry Pi inside the robot at a later date. MediaPipe tracks any visible face and calculates the X and Y offset to direct the servos. With a dead zone at the center of the image and a little smoothing, the eye motion becomes uncannily natural. [Will] doesn’t say how he’s got it set up to handle more than one face; likely it will just stick with the first object identified.
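The whole loop is pleasantly small. Here’s a minimal sketch of it, assuming MediaPipe’s stock face-detection model, with the dead zone, smoothing constant, and servo interface all invented rather than lifted from [Will]’s code:

```python
# Face tracking loop: detect, offset from centre, dead zone, smooth, then servos.
import cv2
import mediapipe as mp

DEAD_ZONE, ALPHA = 0.05, 0.3   # assumed tuning values
cap = cv2.VideoCapture(0)
face = mp.solutions.face_detection.FaceDetection(min_detection_confidence=0.5)
sx = sy = 0.0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = face.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.detections:
        box = results.detections[0].location_data.relative_bounding_box
        dx = box.xmin + box.width / 2 - 0.5    # offset from image centre
        dy = box.ymin + box.height / 2 - 0.5
        dx = 0.0 if abs(dx) < DEAD_ZONE else dx
        dy = 0.0 if abs(dy) < DEAD_ZONE else dy
        sx = (1 - ALPHA) * sx + ALPHA * dx     # exponential smoothing
        sy = (1 - ALPHA) * sy + ALPHA * dy
        # drive_servos(sx, sy)  # placeholder for the servo interface
```

Note the `[0]`: grabbing the first detection matches the stick-with-the-first-face behavior we’d guess at above.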

Eyes aren’t much by themselves, so [Will] goes further and builds a little robot around them. The adorable head sits on a 3D-printed tapered roller bearing atop a very simple body. Another printed mechanism lets the head pivot, and both axes are servo-controlled, bringing the total number of motors up to six. Tracking prefers eye motion, with the head pivoting to follow for a more naturalistic effect. Judge for yourself how well it works in the video below. (Jump to 7:15 for the finished product.)

We’ve featured [Will]’s animatronic anatomy adventures before: everything from beating hearts and full-motion bionic hands to an earlier, camera-less iteration of the eyes in this project.

Don’t forget, if you ever find yourself wading into the Uncanny Valley, that you can tip us off to make sure everyone can share in the discomfort.

Continue reading “Animatronic Eyes Are Watching You”

A golden robotic hand is shown in the main picture performing the sign for the letter "g": pointing to the left, with all fingers except for the index finger curled. In the top left of the image, a human hand is shown imitating this position.

Ambidextrous Robot Hand Speaks In Signs

As difficult as it is for a human to learn ambidexterity, it’s quite easy to program into a humanoid robot. After all, a robot doesn’t need to overcome years of muscle memory. Giving a one-handed robot ambidexterity, however, takes some more creativity. [Kelvin Gonzales Amador] managed to do this with his ambidextrous robot hand, capable of signing in either left- or right-handed American Sign Language (ASL).

The essential ingredient is a separate servo motor for each joint in the hand, which allows each joint to bend equally well backward and forward. Nothing physically marks one side as the palm or the back of the hand. To change between left- and right-handedness, a servo in the wrist simply turns the hand 180 degrees, the fingers flex in the other direction, and the transformation is complete. [Kelvin] demonstrates this in the video below by having the hand sign out the full ASL alphabet in both the right- and left-handed configurations.
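Conceptually the flip is just a mirror: rotate the wrist half a turn and negate every finger joint about its neutral. The sketch below illustrates the idea only; the joint model and angles are made up, not [Kelvin]’s code.

```python
# Conceptual handedness flip: wrist + 180 degrees, fingers bend the other way.
NEUTRAL = 90  # assumed servo centre position, in degrees

def mirror_pose(wrist_deg, finger_joints):
    """Return the same hand shape in the opposite handedness."""
    flipped_wrist = (wrist_deg + 180) % 360
    flipped_fingers = [NEUTRAL - (a - NEUTRAL) for a in finger_joints]
    return flipped_wrist, flipped_fingers

# A right-handed letter shape, re-expressed for the left hand:
print(mirror_pose(0, [120, 120, 60, 150, 150]))
```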

The tradeoff of fully direct drive is that it takes 23 servo motors in the hand itself, plus a much larger servo for the wrist joint. Twenty small servos articulate the fingers, and three larger ones control joints within the hand. An Arduino Mega controls the hand with the aid of two PCA9685 PWM drivers. The physical hand itself is 3D-printed in PLA and nylon, painted gold for a more striking appearance.
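Two PCA9685 boards on one I2C bus is a standard arrangement: each board gets its own address and sixteen channels. The project does this from an Arduino Mega; purely for consistency with the other sketches here, this shows the equivalent setup via the CircuitPython ServoKit library, with the 0x40/0x41 addresses assumed rather than confirmed from the build.

```python
# Two PCA9685 16-channel drivers on one bus: enough for 23 joints plus the wrist.
import board
from adafruit_servokit import ServoKit

i2c = board.I2C()
bank_a = ServoKit(channels=16, i2c=i2c, address=0x40)  # factory default address
bank_b = ServoKit(channels=16, i2c=i2c, address=0x41)  # one address jumper set

bank_a.servo[0].angle = 90  # e.g. send one finger joint to neutral
```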

This isn’t the first language-signing robot hand we’ve seen, though it does forgo the second hand. To make this perhaps one of the least efficient machine-to-machine communication protocols, you could also equip it with a sign language translation glove.

LeRobot Brings Autonomy To Hobby Robots

Robotic arms have a lot in common with CNC machines in that they are usually driven by a fixed script of specific positions to move to and actions to perform. Autonomous behavior isn’t the norm, especially not for hobby-level robotics. That’s changing rapidly with LeRobot, an open-source machine learning framework from the Hugging Face community.

The SO-101 arm is an economical way to get started.

If a quick browse of the project page still leaves you with questions, you’re not alone. Thankfully, [Ilia] has a fantastic video that explains and demonstrates the fundamentals wonderfully. In it, he shows how LeRobot allows one to train an economical 3D-printed robotic arm by example, teaching it to perform a task autonomously. In this case, the task is picking up a ball and putting it into a cup.

[Ilia] first builds a dataset by manually operating the arm to pick up a ball and place it in a cup. Then, with a dataset consisting of only about fifty such examples, he creates a machine learning model capable of driving the arm to autonomously pick up a ball and place it in a cup, regardless of where the ball and cup actually are. It even gracefully handles things like color changes and [Ilia] moving the cup and ball around mid-task. You can skip directly to 34:16 to see this autonomous behavior in action, but we do recommend watching the whole video for a highly accessible yet deeply technical overview.
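Under the hood this is imitation learning: fit a policy that maps what the robot sees to what the operator did. As a toy illustration of that core idea only (LeRobot’s real pipeline, with its dataset format, camera encoders, and ACT/diffusion policies, is far richer, and these shapes are invented):

```python
# Toy behaviour cloning: learn observation -> action from demonstration pairs.
import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 12, 6   # assumed: joint states in, joint targets out
policy = nn.Sequential(nn.Linear(OBS_DIM, 64), nn.ReLU(), nn.Linear(64, ACT_DIM))
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

obs = torch.randn(5000, OBS_DIM)   # stand-in for recorded observations
act = torch.randn(5000, ACT_DIM)   # stand-in for the operator's recorded actions

for epoch in range(100):           # minimise mean-squared action error
    loss = nn.functional.mse_loss(policy(obs), act)
    opt.zero_grad()
    loss.backward()
    opt.step()
```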

Continue reading “LeRobot Brings Autonomy To Hobby Robots”