Next-Gen Autopilot Puts A Robot At The Controls

While automotive “autopilots” are still in their infancy, pretty much any aircraft larger than an ultralight will have some mechanism to at least hold a fixed course and altitude. Typically the autopilot system is built into the airplane’s controls, but this new system replaces the pilot themselves, in a manner reminiscent of the movie Airplane!

The robot pilot, known as PIBOT, uses both AI and robotics technology to fly the airplane without requiring any modification to the aircraft itself. Unlike a normal autopilot system, this one can be fed the aircraft’s manuals in natural language, understand them, and use that information to fly the plane. That includes operating any of the aircraft’s cockpit controls, not just the control column and pedal assembly. Supposedly, it can handle everything from takeoff to landing and operate capably in heavy turbulence.

The Korea Advanced Institute of Science and Technology (KAIST) research team that built the machine hopes it will pave the way for more advanced autopilot systems. Although it has only been tested in simulators so far, it shows enormous promise and even has certain capabilities that go far beyond a human pilot’s, including the ability to remember a much wider variety of charts. The team also hopes to eventually migrate the technology to ground vehicles, especially military ones, although we’ve seen how challenging that can be already.

RoboAgent Gets Its MT-ACT Together

Researchers at Carnegie Mellon University have shared a pre-print paper on generalized robot training within a small “practical data budget.” The team developed a system, called MT-ACT (Multi-Task Action Chunking Transformer), that breaks movement tasks down into 12 “skills” (e.g., pick, place, slide, wipe) which can be combined to create new and complex trajectories within at least somewhat novel scenarios. The authors write:

Trained merely on 7500 trajectories, we are demonstrating a universal RoboAgent that can exhibit a diverse set of 12 non-trivial manipulation skills (beyond picking/pushing, including articulated object manipulation and object re-orientation) across 38 tasks and can generalize them to 100s of diverse unseen scenarios (involving unseen objects, unseen tasks, and to completely unseen kitchens). RoboAgent can also evolve its capabilities with new experiences.
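
The full MT-ACT policy conditions on camera images and language instructions, but the core “action chunking” trick, predicting a short sequence of future actions at each inference step instead of a single action, is simple enough to sketch. The toy PyTorch model below is only an illustration of that idea under assumed dimensions; it is not the authors’ architecture.

```python
import torch
import torch.nn as nn

class ChunkedPolicy(nn.Module):
    """Toy action-chunking policy: one observation in, a chunk of H actions out."""

    def __init__(self, obs_dim=512, act_dim=8, chunk_len=20, d_model=256):
        super().__init__()
        self.encode = nn.Linear(obs_dim, d_model)
        # One learned query token per future timestep in the chunk
        self.queries = nn.Parameter(torch.randn(chunk_len, d_model))
        self.mixer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.head = nn.Linear(d_model, act_dim)

    def forward(self, obs):
        # obs: (batch, obs_dim) -> actions: (batch, chunk_len, act_dim)
        ctx = self.encode(obs).unsqueeze(1)            # (B, 1, d_model)
        q = self.queries.expand(obs.shape[0], -1, -1)  # (B, H, d_model)
        tokens = torch.cat([ctx, q], dim=1)            # (B, 1 + H, d_model)
        mixed = self.mixer(tokens)[:, 1:]              # drop the context token
        return self.head(mixed)

policy = ChunkedPolicy()
chunk = policy(torch.randn(2, 512))  # two fake observations
print(chunk.shape)                   # torch.Size([2, 20, 8])
```

Roughly speaking, executing a whole chunk before re-querying the policy shortens the effective decision horizon, which is part of how an approach like this stays within a small data budget.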

Continue reading “RoboAgent Gets Its MT-ACT Together”

Angry Robot Face Is Less Than Friendly

Sometimes you just need to create a creepy robot head and give it an intimidating personality. [Jens] has done just that, and ably so, with his latest eerie creation.

The robot face is introduced to us with a soundtrack befitting Stranger Things, or maybe Luke Million. The build was inspired by The Doorman, a creepy art piece with animatronic eyes. [Jens]’s build started with a 3D model of a mask, with the eyes and mouth modified to have rectangular cutouts for LED displays. The displays are run by a Raspberry Pi Pico, which generates a variety of eye and mouth animations. It uses a camera for face tracking, so the robot’s evil eyes seem to follow the viewer as they move around. In good form, the face has a simple switch between modes: from good to evil, happy to angry. Or, as [Jens] designates them: “Fren” and “Not Fren.”
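
[Jens]’s exact tracking code isn’t spelled out, but the basic idea, detect a face in the camera frame and map its horizontal position to an eye offset, is easy to sketch. The minimal OpenCV example below is an assumption on our part (Haar cascade detector, default camera, offsets printed rather than sent anywhere), not his implementation.

```python
import cv2

# Haar cascade face detector that ships with OpenCV
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default camera; adjust the index for your setup

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces):
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
        # Map the face's horizontal centre to an eye offset in -1.0 .. +1.0
        offset = (x + w / 2) / frame.shape[1] * 2 - 1
        # Hypothetical next step: send this offset to whatever draws the eyes,
        # e.g. over a serial link to the Pico driving the LED matrices.
        print(f"eye offset: {offset:+.2f}")
    cv2.imshow("tracking", frame)
    if cv2.waitKey(30) == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```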

[Jens] does a great job explaining the build, and his acting at the end of the video is absolutely worth a chuckle. Given that Halloween is just around the corner, why not build five to eight of these and hide them in your roommate’s bedroom?

Video after the break.
Continue reading “Angry Robot Face Is Less Than Friendly”

Ask Hackaday: What’s The Deal With Humanoid Robots?

When the term ‘robot’ gets tossed around, our minds usually race to the image of a humanoid machine. These robots are a fixture in pop culture, and often held up as some sort of ideal form.

Yet, one might ask, why the fixation? While we are naturally drawn to creating robots in our own image, are these bipedal machines the perfect solution we imagine them to be?

Continue reading “Ask Hackaday: What’s The Deal With Humanoid Robots?”

A New Educational Robotics Platform

When looking for electronics projects to use in educational settings, there is no shortage of simple, lightweight, and easily accessible systems to choose from. From robotic arms and drones to walking and wheeled robots, there is a vast array of options. But as technology marches on, robotics platforms need to keep up as well. This turtle-style wheeled robot, called the Trundlebot, uses the latest in affordable microcontrollers on a relatively simple, expandable platform for the most up-to-date educational experience.

The robot is built around a Raspberry Pi Pico, with two low-cost stepper motors driving the wheeled platform. The chassis can be built out of any material that fits in a laser cutter, but for anyone without that sort of tool, the shapes are also fairly easy to cut out by hand. The robot is programmed through Python code, and it is compatible with the WizFi360-EVB-Pico, which allows it to be controlled remotely through a web application. The web interface makes it easy to queue up commands for the Trundlebot, including a drag-and-drop feature for controlling the robot.
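
The Trundlebot’s own Python library isn’t reproduced here, but the turtle-style arithmetic behind this kind of robot, turning a distance or an angle into a step count for each wheel, is easy to sketch. Every number and function name below is an assumption for illustration, not the actual Trundlebot API.

```python
import math

# All numbers below are assumptions for illustration, not Trundlebot specs.
STEPS_PER_REV = 2048     # e.g. a 28BYJ-48 geared stepper
WHEEL_DIAMETER_MM = 60
WHEEL_BASE_MM = 110      # distance between the two drive wheels

STEPS_PER_MM = STEPS_PER_REV / (math.pi * WHEEL_DIAMETER_MM)

def forward(distance_mm):
    """Both wheels step the same amount to drive in a straight line."""
    steps = round(distance_mm * STEPS_PER_MM)
    return steps, steps                      # (left_steps, right_steps)

def turn(angle_deg):
    """Spin in place: each wheel travels an arc of the wheel-base circle."""
    arc_mm = math.pi * WHEEL_BASE_MM * angle_deg / 360
    steps = round(arc_mm * STEPS_PER_MM)
    return -steps, steps                     # wheels move in opposite directions

# A drag-and-drop front end could compile its blocks into a list like this
# before the steps are fed to the stepper drivers:
program = [forward(200), turn(90), forward(100)]
for left, right in program:
    print(f"left: {left:+d} steps, right: {right:+d} steps")
```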

With all of these features, wireless connectivity, and a modern microcontroller at its core, it is an excellent platform for educational robotics. From here it wouldn’t be too hard to develop line-following, obstacle-avoiding, or maze-solving robots, and other components can easily be added to support those designs. If you’re looking for a different style of robot, this robotic arm can be built for under $60, although it isn’t expressly meant for educational purposes.

MeArm 3.0: The Pocket-Sized Robot Arm

We all might dream of having an industrial robot arm at our disposal, complete with a working controller that doesn’t need constant maintenance and replacement parts, and which is able to help with other projects with only a minimum of coding or instruction. That’s a pipe dream for most of us: without a large space, sufficient funding, or unlimited troubleshooting time, we’ll almost always have to look for something smaller and simpler. Perhaps something even as small as this pocket-sized robotic arm.

This isn’t actually the first time we’ve seen the MeArm; the small robot has been around since 2014 and has undergone a number of revisions and upgrades. The latest in the series has been out for a little while now and brings a number of improvements over the older models. Assembly time has dropped from two hours to about 30 minutes, and the hardware is now fully open-sourced under a very permissive license, so virtually anyone with the prerequisite tools can build this tiny robot for whatever they happen to need it for.

The linked Instructable goes into every detail needed to build the robot and documents all of the parts, although you will need access to some specialty tools to make many of them. We also featured a Friday Hack Chat about the MeArm back in 2018 with some interesting details, and although this is a relatively small robot in the grand scheme of things, it’s always possible to upgrade to something larger in the future.

Continue reading “MeArm 3.0: The Pocket-Sized Robot Arm”

Robodog Goes Free Thanks To Unofficial SDK

What’s better than a pretty nice legged robot? One with an alternate SDK that opens up expensive features, of course. The author didn’t like that the original SDK only came as pre-compiled binaries restricted to the most expensive models, so they rolled up their sleeves and started writing a new one.

The manufacturer’s SDK limits access to programmatic functions, but that needn’t stop you.

There are a number of commercially available robotic quadrupeds that can trace their heritage back to the MIT Mini Cheetah design, and one of them is the Unitree Go1 series, which sports a distinctive X-shaped sensor cluster on its “face”. The basic models are affordable (as far as robots go, anyway), but Unitree claims only the high-priced EDU model can be controlled via the SDK. Happily, the Free Dog SDK provides a way to do exactly that.

The SDK is a work in progress but fully usable, and it allows the user to send various high-level and low-level commands to the Go1 robots. High-level examples include things like telling the robot to perform pushups, turn 90 degrees, or walk. Low-level commands specify things like exact positions or torque levels for individual limbs. With the new SDK, doing those things programmatically is only a Python script away.
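
To make the high-level / low-level split concrete, here is a purely illustrative Python sketch. The class and field names are hypothetical stand-ins, not the Free Dog SDK’s actual API, which you should consult for the real command format.

```python
import math
from dataclasses import dataclass

# Hypothetical command containers illustrating the split described above;
# these are NOT the Free Dog SDK's real classes or field names.

@dataclass
class HighLevelCmd:
    mode: str                        # e.g. "walk", "stand", "damping"
    forward_speed: float = 0.0       # m/s
    sideways_speed: float = 0.0      # m/s
    yaw_speed: float = 0.0           # rad/s

@dataclass
class LowLevelCmd:
    joint_positions: tuple           # target angle per joint, radians
    joint_torques: tuple             # feed-forward torque per joint, N·m

def quarter_turn() -> HighLevelCmd:
    """Turn in place at 45 deg/s; streaming this for two seconds gives ~90 deg."""
    return HighLevelCmd(mode="walk", yaw_speed=math.pi / 4)

# An SDK's job is to frame commands like these into the packets the robot
# expects and stream them at the control rate; with that in place, a pushup
# routine or a scripted patrol really is just a Python script away.
print(quarter_turn())
```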

Know any other robots that might be based on the same system? This SDK might work on them, too.