Open Source Rover Gets An Update For Easier Building

Once upon a time, NASA-JPL put out a design for an open-source rocker-bogie rover. It was an impressive and capable thing, albeit a little expensive and difficult to build. Now, the open source community has dived in and refreshed the design, making it cheaper and more accessible than ever before.

Many parts of the original design have either become prohibitively expensive, gone out of stock, or been discontinued entirely. The new version, developed by the community that formed around the project, focuses on using off-the-shelf parts to bring costs down. Where the original design could cost as much as $3000 to build, the new model slashes that bill almost in half. It also eliminates the need for custom fabrication entirely, with no machined or 3D printed parts required.

Other optimizations include cutting the rover’s head from the basic model, as it isn’t necessary for many applications. There’s also better fluid and dust ingress protection, along with improved serviceability. The entire rover model can be loaded in OnShape for anyone who wants to inspect it or make their own modifications.

Parts lists are on GitHub for anyone looking to build their own. Alternatively, check out the original design to learn more. Video after the break.

Continue reading “Open Source Rover Gets An Update For Easier Building”

Machine Learning Robot Runs Arduino Uno

When we think about machine learning, our minds often jump to datacenters full of sweating, overheating GPUs. However, lighter-weight hardware can also be used to these ends, as demonstrated by [Nikodem Bartnik] and his latest robot.

The robot is charged with autonomously navigating a simple racetrack delineated by cardboard barriers. It’s a two-wheeled design with tank-style steering, controlled by an Arduino Uno that uses a Slamtec RPLIDAR sensor to map out its surroundings. The microcontroller is also armed with a Bluetooth link and an SD card for storage.

The robot was first driven around the racetrack multiple times under manual control, all the while collecting LIDAR data. That data was combined with the control inputs to build a data set for training a machine learning model. Feature selection techniques were then used to pare the collected data points down to those most relevant to the driving task. [Nikodem] explains how the model was created and refined until it could drive the robot by itself around a variety of race track designs.
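As a rough illustration of that offline training step (not [Nikodem]’s actual pipeline), here’s a minimal sketch in Python with scikit-learn. The log file name, column layout, and model choice are all assumptions; a shallow decision tree is just one plausible pick, since it ports easily to a microcontroller as nested if/else logic.

```python
# Minimal sketch: train a steering classifier from logged LIDAR data.
# Assumes a CSV with one row per sample: LIDAR ranges first, the recorded
# steering command (as an integer class) in the last column.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

data = np.loadtxt("laps.csv", delimiter=",")     # hypothetical log file
X, y = data[:, :-1], data[:, -1].astype(int)

# Feature selection: keep only the beams most correlated with steering
selector = SelectKBest(score_func=f_classif, k=16)
X_sel = selector.fit_transform(X, y)

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.2)
model = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```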

It’s a great primer on machine learning techniques applied to a small embedded platform.

Continue reading “Machine Learning Robot Runs Arduino Uno”

Big 3D Printed Hand Uses Big Servos, Naturally

[Ivan Miranda] isn’t afraid to dream big, and hopes to soon build a 3D printed giant robot he can ride around on. As the first step towards that goal, he’s built a giant printed hand big enough to hold a basketball.

The hand has fingers with several jointed segments, inspired by the wooden hand models sold as home decor at IKEA. The fingers are controlled via a toothed belt system, with two beefy 11 kg·cm servos responsible for flexing each individual finger joint and a third 25 kg·cm servo flexing the finger as a whole. [Ivan] does a good job of hiding the mechanics and wiring inside the structure of the hand itself, making for an attractive robot appendage.

As with many such projects, control is where things get really difficult. It’s one thing to make a robot hand flex its fingers in and out, and quite another to make it move in a useful, coordinated fashion. Regardless, [Ivan] is able to have the hand grip various objects, thanks in part to its opposable thumb. Future plans involve adding positional feedback to improve the finesse of the control system.
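To get a feel for why coordination is the hard part, here’s a toy sketch of mapping a single “curl” command onto per-joint servo pulses. The joint names, angle limits, and lag factors below are invented for illustration and aren’t taken from [Ivan]’s build.

```python
# Toy grip coordination: one curl value drives three joints per finger,
# with the distal joints lagging so the fingertip wraps around an object
# instead of slamming shut on it.
JOINTS = ["base", "mid", "tip"]                      # hypothetical joints
MAX_FLEX_DEG = {"base": 90, "mid": 100, "tip": 70}   # assumed limits
LAG = {"base": 1.0, "mid": 0.85, "tip": 0.7}         # distal joints trail

def servo_pulse_us(angle_deg, max_deg):
    """Map a joint angle onto a standard 1000-2000 microsecond servo pulse."""
    angle_deg = max(0.0, min(angle_deg, max_deg))
    return int(1000 + 1000 * angle_deg / max_deg)

def grip_pose(curl):
    """curl in [0, 1]: 0 = open hand, 1 = fully closed fist."""
    return {j: servo_pulse_us(curl * LAG[j] * MAX_FLEX_DEG[j], MAX_FLEX_DEG[j])
            for j in JOINTS}

print(grip_pose(0.5))   # {'base': 1500, 'mid': 1425, 'tip': 1350}
```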

Building a good robot hand is no mean feat, and it remains one of the challenges behind building capable humanoid robots. Video after the break.

Continue reading “Big 3D Printed Hand Uses Big Servos, Naturally”

Next-Gen Autopilot Puts A Robot At The Controls

While the concept of the automotive "autopilot" is still in its infancy, pretty much any aircraft larger than an ultralight will have some mechanism to at least hold a fixed course and altitude. Typically the autopilot system is built into the airplane's controls, but this new system replaces the pilot themselves, in a manner reminiscent of the movie Airplane.

The robot pilot, known as PIBOT, uses both AI and robotics technology to fly the airplane without altering the aircraft. Unlike a normal autopilot system, this one can be fed the aircraft’s manuals in natural language, understand them, and use that information to fly the airplane. That includes operating any of the aircraft’s cockpit controls, not just the control column and pedal assembly. Supposedly, the autopilot can handle everything from takeoff to landing, and operate capably during heavy turbulence.
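To get a feel for the manual-reading idea, here’s a deliberately oversimplified sketch that looks up the most relevant manual passage for a given situation with plain TF-IDF similarity, standing in for the far more capable language model the researchers describe. The passages are invented, not from any real aircraft manual.

```python
# Toy stand-in for "understanding the manual": retrieve the passage most
# similar to a description of the current situation. A real system would
# use a large language model, not bag-of-words similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

manual = [
    "Before takeoff, set flaps to ten degrees and verify trim.",
    "In heavy turbulence, slow to maneuvering speed.",
    "On final approach, confirm landing gear down and locked.",
]

vectorizer = TfidfVectorizer()
passages = vectorizer.fit_transform(manual)

def lookup(situation):
    query = vectorizer.transform([situation])
    return manual[cosine_similarity(query, passages)[0].argmax()]

print(lookup("encountering turbulence"))  # -> the maneuvering-speed passage
```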

The research team at the Korea Advanced Institute of Science and Technology (KAIST) that built the machine hopes it will pave the way for more advanced autopilot systems. Although PIBOT has only been tested in simulators so far, it shows enormous promise, and even has certain capabilities beyond those of human pilots, including the ability to memorize a much wider variety of charts. The team also hopes to eventually migrate the technology to land vehicles, especially military ones, although we’ve seen how challenging that can be already.

[Image: several video clips of a robot arm manipulating objects in a kitchen environment, demonstrating some of the 12 generalized skills]

RoboAgent Gets Its MT-ACT Together

Researchers at Carnegie Mellon University have shared a pre-print paper on generalized robot training within a small “practical data budget.” The team developed MT-ACT (Multi-Task Action Chunking Transformer), a system that breaks movement tasks into 12 “skills” (e.g., pick, place, slide, wipe) that can be combined to create new and complex trajectories in at least somewhat novel scenarios. The authors write:

Trained merely on 7500 trajectories, we are demonstrating a universal RoboAgent that can exhibit a diverse set of 12 non-trivial manipulation skills (beyond picking/pushing, including articulated object manipulation and object re-orientation) across 38 tasks and can generalize them to 100s of diverse unseen scenarios (involving unseen objects, unseen tasks, and to completely unseen kitchens). RoboAgent can also evolve its capabilities with new experiences.
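The “action chunking” in the name is the key structural trick: instead of predicting one action per timestep, the policy emits a short sequence of future actions in a single forward pass, then executes the chunk before re-planning. Here’s a minimal sketch of that output structure, with toy dimensions and a plain MLP standing in for MT-ACT’s transformer over camera images and language-specified tasks:

```python
# Minimal action-chunking sketch in PyTorch (toy dimensions throughout).
import torch
import torch.nn as nn

OBS_DIM, ACT_DIM, CHUNK = 64, 7, 10   # assumed sizes, not MT-ACT's

class ChunkingPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM, 256), nn.ReLU(),
            nn.Linear(256, ACT_DIM * CHUNK),
        )

    def forward(self, obs):                   # obs: (batch, OBS_DIM)
        return self.net(obs).view(-1, CHUNK, ACT_DIM)

policy = ChunkingPolicy()
chunk = policy(torch.randn(1, OBS_DIM))[0]   # ten 7-DoF actions at once
for action in chunk:                          # run the chunk open-loop,
    pass                                      # then observe and re-plan
```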

Continue reading “RoboAgent Gets Its MT-ACT Together”

Angry Robot Face Is Less Than Friendly

Sometimes you just need to create a creepy robot head and give it an intimidating personality. [Jens] has done just that, and ably so, with his latest eerie creation.

The robot face is introduced to us with a soundtrack befitting Stranger Things, or maybe Luke Million. The build was inspired by The Doorman, a creepy art piece with animatronic eyes. [Jens] started with a 3D model of a mask, modifying the eyes and mouth with rectangular cutouts for LED displays. The displays are run by a Raspberry Pi Pico, which generates a variety of eye and mouth animations, while a camera handles face tracking so the robot’s evil eyes seem to follow the viewer as they move around. In good form, the face has a simple switch from good to evil, happy to angry. Or, as [Jens] designates the modes: “Fren” and “Not Fren.”
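As a rough idea of how the mode switch might be handled in firmware, here’s a sketch assuming MicroPython on the Pico; the GPIO pin, animation names, and switch polarity are guesses, not details from [Jens]’ code.

```python
# Hypothetical mode-switch loop for the face (MicroPython on the Pico).
from machine import Pin
import time

mode_switch = Pin(15, Pin.IN, Pin.PULL_UP)   # assumed GPIO and pull-up wiring

ANIMATIONS = {
    "fren":     {"eyes": "soft_blink",   "mouth": "smile"},
    "not_fren": {"eyes": "narrow_glare", "mouth": "snarl"},
}

while True:
    # Assumed wiring: switch closed (low) = friendly, open (high) = angry
    mode = "fren" if mode_switch.value() == 0 else "not_fren"
    frames = ANIMATIONS[mode]
    # draw_eyes(frames["eyes"]); draw_mouth(frames["mouth"])  # display code elided
    time.sleep_ms(50)
```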

[Jens] does a great job explaining the build, and his acting at the end of the video is absolutely worth a chuckle. Given Halloween is around the corner, why not build five to eight of these, and hide them in your roommate’s bedroom?

Video after the break.
Continue reading “Angry Robot Face Is Less Than Friendly”

Hackaday Prize 2023: PAROL6 – A GPL Desktop Robotic Arm

Parol 6 is a 3D-printed six-axis robot arm created by [Petar Crnjak] as a combination of the principles from a few previous projects. Aside from a pneumatic gripper, each axis is driven by a stepper motor, with at least a few of these axes being driven through a metal planetary gearbox for extra precision and torque.

From what we can glean from the work-in-progress documentation, there are belt drives on four of the relevant axes and a mix of NEMA17-format steppers driving either 20:1 or 10:1 reduction boxes. There appears to be a mix of inductive sensors and traditional microswitches in use, but it’s not easy to work out where these are placed.
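Those reductions buy a lot of angular resolution. As a back-of-the-envelope sketch, assuming common 1.8° steppers and 16× microstepping (the actual motors and driver settings may well differ):

```python
# Rough joint resolution estimate. The 1.8-degree step angle and 16x
# microstepping are assumptions, not specs from the PAROL6 documentation.
FULL_STEPS_PER_REV = 200   # 1.8 degrees per full step
MICROSTEPS = 16

for reduction in (20, 10):
    microsteps_per_joint_deg = FULL_STEPS_PER_REV * MICROSTEPS * reduction / 360
    print(f"{reduction}:1 gearbox -> {microsteps_per_joint_deg:.0f} microsteps "
          f"per output degree ({1 / microsteps_per_joint_deg:.4f} deg/microstep)")
```

Continue reading “Hackaday Prize 2023: PAROL6 – A GPL Desktop Robotic Arm”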