Teaching A Robot To Hallucinate

Training robots to execute tasks in the real world requires data — the more, the better. The problem is that creating these datasets takes a lot of time and effort, and methods don’t scale well. That’s where Robot Learning with Semantically Imagined Experience (ROSIE) comes in.

The basic concept is straightforward: enhance training data with hallucinated elements to change details, add variations, or introduce novel distractions. The team's experiments show that a robot additionally trained on this augmented data performs tasks better than one trained on the original data alone.

This robot is able to deposit an object into a metal sink it has never seen before, thanks to hallucinating a sink in place of an open drawer in its original training data.

Suppose one has a dataset consisting of a robot arm picking up a Coke can and placing it into an orange lunchbox. That training data is used to teach the arm how to do the task. But in the real world, maybe there is distracting clutter on the countertop. Or the lunchbox in the training data was empty, but the one on the counter right now already has a sandwich inside it. The further a real-world task differs from the training dataset, the less capable and accurate the robot becomes.

ROSIE aims to alleviate this problem by using image diffusion models (such as Imagen) to enhance the training data in targeted and direct ways. In one example, a robot has been trained to deposit an object into a drawer. ROSIE augments this training by inpainting the drawer in the training data, replacing it with a metal sink. A robot trained on both datasets competently performs the task of placing an object into a metal sink, despite the fact that a sink never actually appears in the original training data, nor has the robot ever seen this particular real-world sink. A robot without the benefit of ROSIE fails the task.
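The core augmentation step can be sketched in a few lines: segment the region to replace, then composite inpainted content into that region while leaving the rest of the frame untouched. This is only a minimal illustration, not the team's code; the function names are hypothetical, and the flat-grey "model" below stands in for the text-conditioned diffusion inpainting (Imagen) that ROSIE actually uses.

```python
import numpy as np

def augment_with_inpainting(frame, mask, generate_patch):
    """Replace the masked region of a training frame with imagined content.

    frame: H x W x 3 uint8 image from the original dataset
    mask:  H x W boolean array, True where the object to replace sits
    generate_patch: callable standing in for a text-conditioned diffusion
        inpainting model; takes (frame, mask) and returns a full-size image.
    """
    patch = generate_patch(frame, mask)
    augmented = frame.copy()
    # Composite: only pixels inside the mask change, so the robot arm,
    # lighting, and background stay pixel-identical to the original data.
    augmented[mask] = patch[mask]
    return augmented

# Toy stand-in "model": paint the masked region flat grey, as if a metal
# sink had been imagined where the drawer used to be.
def fake_sink_model(frame, mask):
    patch = np.full_like(frame, 128)
    return patch

frame = np.zeros((64, 64, 3), dtype=np.uint8)   # dummy training frame
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True                        # region where the drawer sits
augmented = augment_with_inpainting(frame, mask, fake_sink_model)
```

In the real system the mask comes from an open-vocabulary segmentation of the scene and the patch from a text prompt (e.g. "a metal sink"), so the same original episode can be re-imagined many different ways.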

Here is a link to the team’s paper, and embedded below is a video demonstrating ROSIE both in concept and in action. This is also in a way a bit reminiscent of a plug-in we recently saw for Blender, which uses an AI image generator to texture entire 3D scenes with a simple text prompt.

Continue reading “Teaching A Robot To Hallucinate”

Tiny Robots That Bring Targeted Drug Delivery And Treatment A Little Bit Closer

Within the world of medical science fiction they are found everywhere: tiny robots that can zip through blood vessels and intestines, where they can deliver medication, diagnose medical conditions, and even directly provide treatment. Although much of this is still firmly in the realm of science fiction, researchers at Stanford published work last year on an origami-based type of robot, controlled using an external magnetic field. Details can be found in the Nature Communications paper. Continue reading “Tiny Robots That Bring Targeted Drug Delivery And Treatment A Little Bit Closer”

Your Next Airport Meal May Be Delivered By Robot

Robot delivery has long been touted as a game-changing technology of the future. However, it still hasn’t cracked the big time. Drones still aren’t airdropping packages into our gutters by accident, nor are our pizzas brought to us via self-driving cars.

That’s not to say that able minds aren’t working on the problem. In one case, a group of engineers is working on a robot that will handle the crucial duty of delivering food to hungry flyers at the airport.

Continue reading “Your Next Airport Meal May Be Delivered By Robot”

The Robots Of Fukushima: Going Where No Human Has Gone Before (And Lived)

The idea of sending robots into conditions that humans would not survive is a very old concept. Robots don’t need oxygen, food, or any of the myriad other human requirements. They can be treated as disposable, they can be radiation hardened, and they can physically fit into small spaces. And if you just happen to be the owner of a nuclear power plant that’s had multiple meltdowns, you need robots. A lot of them. [Asianometry] has provided an excellent synopsis of the robots of Fukushima in the video below the break.

Starting with robots developed for the Three Mile Island incident and then Chernobyl, [Asianometry] goes into the technology and even the politics behind getting robots on the scene, and the crossover between robots destined for space and war, and those destined for cleaning up after a meltdown.

The video goes further into the challenges of putting a robot into a high radiation environment. Also interesting is the state of readiness, or rather the lack thereof, that prompted further domestic innovation.

Obviously, cleaning up a melted down reactor requires highly specialized robots. What’s more, robots that worked on one reactor didn’t work on others, creating the need for yet more custom built machines. The video discusses each, and even touches on future robots that will be needed to fully decommission the Fukushima facility.

For another look at some of the early robots put to work, check out the post “The Fukushima Robot Diaries” which we published over a decade ago.

Continue reading “The Robots Of Fukushima: Going Where No Human Has Gone Before (And Lived)”

Robots Are Folding Laundry, But They Suck At It

Robots are used in all sorts of industries on a wide variety of tasks. Typically, it’s because they’re far faster, more accurate, and more capable than we are. Expert humans could not compete with the consistent, speedy output of a robotic welder on an automotive production line, nor could they coat the chocolate on the back of a KitKat as delicately.

However, there are some tasks in which humans still have the edge. Those include driving, witty repartee, and yes, folding laundry. That’s not to say the robots aren’t trying, though, so let’s take a look at the state of the art.

Continue reading “Robots Are Folding Laundry, But They Suck At It”

Real Robot One Is… Real

Most of the robot arms we see are cool but little more than toys. Usually, they use RC servos, which are fine for basic motion, but if you want something more industrial and capable, check out [Pavel’s] RR1 — Real Robot One. The beefy arm has six degrees of freedom powered by stepper motors and custom planetary gearboxes. Each joint has an encoder for precise position feedback. The first prototype is already working, as you can see in the video below. Version two is forthcoming.

When you see the thing in action, you can immediately tell it isn’t a toy. There are four NEMA23 steppers and three smaller NEMA17 motors, and while there are 3D printed parts, there is a lot of metal in the build as well. In the video, the arm lifts a 1 kilogram barbell and picks up a refreshing soft drink.

Continue reading “Real Robot One Is… Real”

Dead Spider Becomes Robot Gripper: It’s Necrobotics!

Robot arms and grippers do important work every hour of every day. They’re used in production lines around the world, toiling virtually ceaselessly outside of their designated maintenance windows.

They’re typically built out of steel, and powered by brawny hydraulic systems. However, some scientists have gone for a smaller scale approach that may horrify the squeamish. They’ve figured out how to turn a dead spider into a useful robotic gripper.

The name of this new Frankensteinian field? Why, it’s necrobotics, of course!

Continue reading “Dead Spider Becomes Robot Gripper: It’s Necrobotics!”