Several video clips of a robot arm manipulating objects in a kitchen environment, demonstrating some of the 12 generalized skills

RoboAgent Gets Its MT-ACT Together

Researchers at Carnegie Mellon University have shared a pre-print paper on generalized robot training within a small “practical data budget.” The team developed MT-ACT (Multi-Task Action Chunking Transformer), a system that breaks manipulation tasks into 12 “skills” (e.g., pick, place, slide, wipe) that can be combined to create new and complex trajectories in at least somewhat novel scenarios. The authors write:

Trained merely on 7500 trajectories, we are demonstrating a universal RoboAgent that can exhibit a diverse set of 12 non-trivial manipulation skills (beyond picking/pushing, including articulated object manipulation and object re-orientation) across 38 tasks and can generalize them to 100s of diverse unseen scenarios (involving unseen objects, unseen tasks, and to completely unseen kitchens). RoboAgent can also evolve its capabilities with new experiences.
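
The name hints at the core trick: an action-chunking policy predicts a short sequence of actions per query and executes the whole chunk before re-planning, rather than predicting one action per timestep. Here’s a minimal sketch of that control loop in Python; the policy stub, chunk size, and arm model are our own stand-ins, not the paper’s code.

```python
import numpy as np

CHUNK_SIZE = 8  # actions predicted per policy query (hypothetical value)

def policy(observation):
    """Stand-in for the MT-ACT transformer: returns a chunk of actions.

    A real policy would condition on camera images and a skill/task
    embedding; here we just emit small random joint deltas for a 7-DoF arm.
    """
    return np.random.uniform(-0.05, 0.05, size=(CHUNK_SIZE, 7))

class ToyArm:
    """Trivial 'environment': the state is just seven joint angles."""
    def __init__(self):
        self.joints = np.zeros(7)

    def step(self, action):
        self.joints += action
        return self.joints.copy()

arm = ToyArm()
obs = arm.joints.copy()
for _ in range(64 // CHUNK_SIZE):   # re-plan once per chunk, not per step...
    for action in policy(obs):      # ...then execute the chunk open-loop
        obs = arm.step(action)
print("final joint angles:", obs)
```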

Continue reading “RoboAgent Gets Its MT-ACT Together”

Angry Robot Face Is Less Than Friendly

Sometimes you just need to create a creepy robot head and give it an intimidating personality. [Jens] has done just that, and ably so, with his latest eerie creation.

The robot face is introduced to us with a soundtrack befitting Stranger Things, or maybe Luke Million. The build was inspired by The Doorman, a creepy art piece with animatronic eyes. [Jens]’s build started with a 3D model of a mask, with the eyes and mouth modified to have rectangular cutouts for LED displays. The displays are run by a Raspberry Pi Pico, which generates a variety of eye and mouth animations. The build uses a camera for face tracking, so the robot’s evil eyes seem to follow the viewer as they move around. In good form, the face has a simple switch from good to evil, happy to angry. Or, as [Jens] designates the modes: “Fren” and “Not Fren.”
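
The write-up doesn’t spell out the tracking pipeline (a bare Pico wouldn’t run OpenCV, so presumably a camera module or companion board handles detection), but the core idea is mapping a detected face’s offset from the frame center to a pupil offset on the eye displays. A rough desktop-Python illustration, with all names and values our own:

```python
import cv2

# Classic Haar-cascade face detector that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)
EYE_RANGE = 8  # max pupil offset in display pixels (hypothetical)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces):
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
        # Normalize the face center to [-1, 1] around the frame center...
        nx = (x + w / 2) / frame.shape[1] * 2 - 1
        ny = (y + h / 2) / frame.shape[0] * 2 - 1
        # ...then scale it into pupil offsets for the eye animation.
        print("pupil offset:", (round(nx * EYE_RANGE), round(ny * EYE_RANGE)))
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
```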

[Jens] does a great job explaining the build, and his acting at the end of the video is absolutely worth a chuckle. With Halloween around the corner, why not build five to eight of these and hide them in your roommate’s bedroom?

Video after the break.
Continue reading “Angry Robot Face Is Less Than Friendly”

Hackaday Prize 2023: PAROL6 – A GPL Desktop Robotic Arm

PAROL6 is a 3D-printed six-axis robot arm created by [Petar Crnjak] as a combination of the principles from a few previous projects. Aside from a pneumatic gripper, each axis is driven by a stepper motor, with at least a few of these axes being driven through a metal planetary gearbox for extra precision and torque.
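
To get a feel for what a planetary reduction buys, here’s some back-of-envelope math for the 20:1 and 10:1 boxes mentioned in the documentation (detailed below), assuming a typical 1.8° NEMA17 and 16× microstepping; both figures are our assumptions:

```python
FULL_STEPS_PER_REV = 200   # a standard 1.8-degree stepper
MICROSTEPS = 16            # typical driver setting (our assumption)

for reduction in (20, 10):  # the 20:1 and 10:1 planetary boxes
    steps_per_joint_rev = FULL_STEPS_PER_REV * MICROSTEPS * reduction
    print(f"{reduction}:1 gearbox -> {steps_per_joint_rev / 360:.1f} "
          f"microsteps/degree, "
          f"{360 / steps_per_joint_rev * 3600:.1f} arcsec/microstep")
```

The same ratio that multiplies angular resolution also multiplies available torque, at the cost of joint speed.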

From what we can glean from the work-in-progress documentation, there are some belt drives on four of the relevant axes and a mix of NEMA17 format steppers driving either 20:1 or 10:1 reduction boxes. There appears to be a mix of inductive sensors and traditional microswitches used, but it’s not so easy to work out where these are placed.

Continue reading “Hackaday Prize 2023: PAROL6 – A GPL Desktop Robotic Arm”

When The Sojourner Mars Rover Nearly Ran LISP

During the late 1980s, NASA’s Jet Propulsion Laboratory (JPL) was busy developing the first-ever wheeled robot that would roam the surface of Mars. Due to the long round-trip times of any signals between Mars and Earth, the firmware that would control the rover was a major concern, with the two teams working on the task each picking a different level of autonomy for the rover. In a retrospective, [Ron Garret], who worked at JPL on the ‘more autonomy’ team, describes his recollections.

Whereas [Ron]’s team focused on creating a rover that could be given high-level instructions, which the sophisticated LISP-based firmware would use as guidelines to navigate and operate by, the other team pursued a more limited autonomy approach, whereby a human driver would explicitly plan out the route the rover would follow before it awaited new instructions.

Perhaps unsurprisingly, the system requirements for running LISP, the additional uncertainties and complexities of the autonomous approach, and the burden of testing and validating the firmware resulted in the Sojourner Mars rover featuring the latter approach, with straightforward C-based firmware. Most of Sojourner’s autonomy was limited to a home-return function if communication with the lander was lost, which limited both its range and its operations during its 85-day extended mission.

As [Ron] covers with examples from later missions, one advantage of LISP is that it allows you to send instructions that can be interpreted on the fly (e.g. to debug the system) without having to program in such functionality explicitly. Later Mars rover missions implemented much more of the autonomy that [Ron]’s team pioneered, although C remained the language of choice for these later rovers.
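
The classic demonstration of that advantage: with an interpreter onboard, ground control can uplink an arbitrary expression and have it evaluated, even one nobody anticipated needing. A toy Python equivalent of that Lisp-style read-eval loop (our sketch, not JPL’s code):

```python
# Toy "onboard" command loop: evaluate whatever expression ground control
# uplinks. Lisp's read/eval made this nearly free; Python's eval() gives
# the same flavor (with none of the flight-safety caveats handled!).
telemetry = {"battery_v": 27.4, "wheel_current_ma": [180, 175, 190]}

def handle_uplink(expression: str) -> str:
    try:
        return repr(eval(expression, {"telemetry": telemetry}))
    except Exception as exc:        # report faults instead of crashing
        return f"error: {exc}"

# Ground control never planned to ask these, but the interpreter obliges:
print(handle_uplink("max(telemetry['wheel_current_ma'])"))   # -> 190
print(handle_uplink("telemetry['battery_v'] < 25"))          # -> False
```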

Heading image: Ron Garret standing in front of the Robbie prototype. Rocky III can be seen in the lower left, and above him are Rajiv Desai and Robert Ivlev, two other members of the team. (Credit: Ron Garret)

Star Wars Pit Droid Has A Jetson Brain

In the Star Wars universe, pit droids are little foldable robots that perform automated repairs on spacecraft and the like. They were introduced in 1999’s The Phantom Menace, and beyond the podracing scenes, are probably the only good thing to come out of that particular film.

[Goran Vuksic] wanted a pit droid of his own, and reasoned that if he was going to go through the trouble of sanding and painting all the 3D printed components so they look like the real bot, he might as well add some smarts to it. While this droid won’t be fixing podracers anytime soon, its onboard Jetson Orin Nano Developer Kit does pack a considerable amount of processing under that dome.

A webcam mounted in the bot’s eye socket is connected to the Jetson, which is running an image detection and identification routine based on the example code provided by NVIDIA. The single-board computer uses a relay to blink some LEDs on and off when a human is detected, and a pair of servos pans and tilts the bot’s head towards whoever has caught its gaze.
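
Since the routine is based on NVIDIA’s example code, the detection side plausibly resembles the Hello AI World detectNet examples. A hedged sketch using the jetson-inference Python bindings (it requires a Jetson to run; the relay and servo functions are placeholders for whatever hardware drivers the build actually uses):

```python
from jetson_inference import detectNet
from jetson_utils import videoSource

net = detectNet("ssd-mobilenet-v2", threshold=0.5)  # stock NVIDIA model
camera = videoSource("/dev/video0")                 # the eye-socket webcam

def set_relay(on: bool):
    pass  # placeholder: drive the LED relay via GPIO

def point_head(cx, cy, width, height):
    # Placeholder pan/tilt mapping: image position -> servo angles.
    pan = (cx / width - 0.5) * 90    # +/-45 degrees of pan (made-up range)
    tilt = (cy / height - 0.5) * 45  # +/-22.5 degrees of tilt
    print(f"pan {pan:+.1f} deg, tilt {tilt:+.1f} deg")

while camera.IsStreaming():
    img = camera.Capture()
    people = [d for d in net.Detect(img)
              if net.GetClassDesc(d.ClassID) == "person"]
    set_relay(bool(people))          # blink the LEDs when a human is seen
    if people:
        target = max(people, key=lambda d: d.Area)  # track the largest face
        point_head(target.Center[0], target.Center[1],
                   img.width, img.height)
```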

It’s no surprise that [Goran] picked the Jetson Orin over competing SBCs for this task — in our review of the Orin Nano Developer Kit a few months ago, we found it was able to hit nearly 200 frames per second while performing this sort of real-time image analysis. So there’s plenty of room to grow should he want to integrate more complex image recognition tasks.

For example, he could follow in the footsteps of [Kris Kersey], and put a functional data overlay on top of the video to give his bot Iron Man vision.

Continue reading “Star Wars Pit Droid Has A Jetson Brain”

Browser-Based Robot Dog Simulator In ~800 Lines Of Code

[Sergii] has been learning about robot simulation and wrote up a basic simulator for a robodog platform: the Unitree A1. It only took about 800 lines of code to do so, which probably makes it a good place to start if one is headed in a similar direction.

Right now, [Sergii]’s simulator is an interactive physics model that runs in the browser. Software-wise, once the model of the robot exists, the Rapier JavaScript physics engine takes care of the physics simulation. The robot’s physical layout comes from the manufacturer’s repository, so it doesn’t need to be created from scratch.

To make the tool useful, the application shows two models of the robot side by side. The one on the left is the control model, with interactive sliders for limb positions. All movements on the control model are transmitted to the simulation model on the right, setting its pose; the simulation model is the one that actually models the physics and gravity of the desired motions and positions. [Sergii]’s next step is to use the simulator to design and implement a simple walking gait controller, and we look forward to seeing how that turns out.
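
The essence of that two-model split is small: each frame, the slider values become motor targets that the physics side chases under gravity. A one-joint Python sketch of the idea (the real project does this in JavaScript with Rapier’s joint motors; the gains and values here are made up):

```python
# Control model -> simulation model, reduced to a single joint: a slider
# sets a target angle; the "physics" side tracks it with a PD motor while
# a constant gravity load pulls against it.
DT = 1.0 / 60                  # 60 Hz physics step
GRAVITY_TORQUE = -2.0          # constant load on the joint (made up)
KP, KD = 40.0, 4.0             # PD motor gains (made up)

slider_target = 0.8            # radians, as if set on the control model
angle, velocity = 0.0, 0.0     # simulated joint state

for step in range(120):        # two simulated seconds
    torque = KP * (slider_target - angle) - KD * velocity + GRAVITY_TORQUE
    velocity += torque * DT    # integrate unit-inertia joint dynamics
    angle += velocity * DT
    if step % 30 == 0:
        print(f"t={step * DT:.2f}s  angle={angle:.3f} rad")
print(f"settled near target {slider_target} rad: angle={angle:.3f}")
```

Note the joint settles slightly below the target: the gravity load produces a steady-state error, which is exactly the kind of behavior a gait controller has to account for.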

If Unitree sounds familiar to you, it might be because we recently covered how an unofficial SDK was able to open up some otherwise-unavailable features on the robodogs, so check that out if you want to get a little more out of what you paid for.

Ask Hackaday: What’s The Deal With Humanoid Robots?

When the term ‘robot’ gets tossed around, our minds usually race to the image of a humanoid machine. These robots are a fixture in pop culture and are often held up as some sort of ideal form.

Yet, one might ask, why the fixation? While we are naturally obsessed with creating robots in our own image, are these bipedal machines the perfect solution we imagine them to be?

Continue reading “Ask Hackaday: What’s The Deal With Humanoid Robots?”