Pill Bugs And Chitons Get Jobs As Tiny Grippers

A research paper titled Biological Organisms as End Effectors explores the oddball approach of giving small animals jobs as grippers at the end of a robotic arm. Researchers show that pill bugs and chitons — small creatures with exoskeletons and reflexive movements — have behaviors making them useful as grippers, with no harm done to the creatures in the process. The prototypes are really just proofs of concept, but it’s a novel idea that does work in at least a simple way.

Pill bugs reflexively close, and in the process can grasp and hold lightweight objects. The release is simply a matter of time; researchers say that after about 115 seconds a held object is released naturally when the pill bug’s shell opens. While better control over release would be good, the tests show basic functionality is present.


Another test involves the chiton, a small mollusk that attaches to things with suction and can act as an underwater end effector in a similar way. Interestingly, a chiton is able to secure itself to wood and cork, materials that typical suction cups do not work on.

A chiton can also change a gripped object’s orientation. Chitons seek out dark areas, so by shining a light, researchers could control the direction in which the creature attempts to “walk”, which in turn reorients the held object. A chiton’s grip is strong, but its release was less predictable than the pill bugs’. It seems chitons let go of an object more or less when they feel like it.

This concept may remind readers, somewhat grimly, of grippers made from dead spiders, but the researchers emphasize that there is an imperative not to mistreat these living creatures: they are to be treated carefully and employed only temporarily, much as dog sleds and horses have been used for transportation, or carrier pigeons for messages. Short videos of both the pill bug and chiton grippers are embedded below, just under the page break.

Continue reading “Pill Bugs And Chitons Get Jobs As Tiny Grippers”

DingoQuadruped Is A Cheap Canine-Like Robot

Humanoid robots are cool, but a bit hard to make work, since they have only two legs to stand on. Four-legged robots are rather more approachable, and the DingoQuadruped aims to be an open-source quadruped platform for exactly that kind of teaching and experimentation.

The robot is based on the Stanford Pupper, a robot platform we’ve discussed previously. It bears a design not dissimilar to the popular Spot robot from Boston Dynamics. Where Spot costs tens of thousands of dollars, though, the Dingo is designed so that students and researchers can build one for less than $1,500.

The robot weighs around 3 kg, and is approximately the size of a shoebox. Control over the robot is via a wireless game controller. Each leg uses three high-torque servo motors, which are elegantly placed to reduce the inertia of the leg itself. A Raspberry Pi runs the show, with an Arduino Nano also onboard for interfacing analog sensors or additional hardware. The chassis itself has a highly modular design, with a focus on making it easy to add additional hardware.

If you want to get started experimenting with quadruped robots, the Dingo might just be the perfect platform for you. Video after the break.

Continue reading “DingoQuadruped Is A Cheap Canine-Like Robot”

Hackaday Prize 2023: Bolt Bot Micro Servo Droids

This Hackaday prize entry from [saul] is the beginning of a reconfigurable kit of 3D printed parts and servo motors for robotics learning. With just access to a printer, a few cheap-as-chips servo motors, an Arduino, and some nuts and bolts, you could be hacking together robot walkers within a few hours of starting!

Bolt Bots is very simple to understand, with all the mechanics and wiring out there in the breeze, but strictly for indoor use we reckon. If you want to add remote control to your application, then drop in one of the ubiquitous nRF24L01 boards and build yourself a copy of the remote control [saul] handily provides in this other project.

There really isn’t a great deal we can say about this, as it’s essentially a build kit with quite a few configuration options, and you just have to build with it and see what’s possible. We expect the range of parts to grow over time, giving even more options. So far [saul] demonstrates a few flavors of ‘walkers’, a rudimentary ‘robot arm’, and even a hanging drawbot.

The bolt hardware can be found in this GitHub repo, and the remote control code in this second one.

Servo-based designs are sometimes sneered at due to their dubious accuracy and repeatability, but with a little effort, this can be vastly improved upon. Also, multi-legged walkers need multiple servos and controllers to drive ’em. Or do they?

Continue reading “Hackaday Prize 2023: Bolt Bot Micro Servo Droids”

$60 Robot Arm Is Compact

Thanks to 3D printing and inexpensive controllers, a robot arm doesn’t need to break the bank anymore. Case in point? [Build Some Stuff] built a good-looking, compact arm with servos for under $60. The arm uses an interesting control mechanism, too.

Instead of a traditional joystick, the controller is a miniature replica of the arm with a potentiometer at each joint in place of a motor. Move the model arm to a position, and the main arm mimics your motions. It’s similar to old control systems built around synchros (sometimes called selsyns), but implemented with potentiometers and servo motors.
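The video doesn’t walk through the firmware, but the heart of such a controller is just a linear map from each potentiometer’s ADC reading to a servo angle. Here’s a rough sketch of the idea in Python; the 10-bit ADC range and 0–180 degree servo range are our assumptions, not details from the build:

```python
def pot_to_angle(reading, adc_min=0, adc_max=1023, ang_min=0, ang_max=180):
    """Linearly map a potentiometer ADC reading to a servo angle in degrees."""
    reading = max(adc_min, min(adc_max, reading))  # clamp noisy readings
    span_in = adc_max - adc_min
    span_out = ang_max - ang_min
    return ang_min + (reading - adc_min) * span_out / span_in

def mirror_joints(adc_readings):
    """One control-loop tick: each model-arm joint drives the matching servo."""
    return [pot_to_angle(r) for r in adc_readings]
```

On an Arduino, the same mapping collapses to `map(analogRead(pin), 0, 1023, 0, 180)` fed into `Servo.write()`, looped over each joint.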

Continue reading “$60 Robot Arm Is Compact”

Design Files Released For The PR2 Robot

It’s always great fun to build your own robot. Sometimes, though, if you’re doing various projects or research, it’s easier to buy an existing robot and then use it to get down to business. That was very much the role of the Willow Garage PR2, but unfortunately, it’s no longer in production. However, as covered by The Robot Report, the design files have now been released for others to use.

The PR2 was built as an advanced platform with wide-ranging capabilities. It was able to manipulate objects with its two 7-degree-of-freedom arms, as well as visualize the real world with a variety of complex sensor packages. Researchers put it to work on a variety of tasks, from playing pool to fetching beers and even folding laundry. The latter one is still considered an unsolved problem that challenges even the best robots.

Rights to the PR2 robot landed in the hands of Clearpath Robotics, after Willow Garage was shut down in 2014. Clearpath is now providing access to the robot’s design files on its website. This includes everything from wiring diagrams and schematics, to assembly drawings, cable specs, and other background details. You’ll have to provide some personal information to get access, but the documentation you desire is all there.

We actually got our first look at the PR2 robot many years ago, way back in 2009. If you decide to build your own from scratch, be sure to hit us up on the tips line.

Continue reading “Design Files Released For The PR2 Robot”

Teaching A Robot To Hallucinate

Training robots to execute tasks in the real world requires data — the more, the better. The problem is that creating these datasets takes a lot of time and effort, and gathering real-world demonstrations doesn’t scale well. That’s where Robot Learning with Semantically Imagined Experience (ROSIE) comes in.

The basic concept is straightforward: enhance training data with hallucinated elements to change details, add variations, or introduce novel distractions. Studies show a robot additionally trained on this data performs tasks better than one without.

This robot is able to deposit an object into a metal sink it has never seen before, thanks to hallucinating a sink in place of an open drawer in its original training data.

Suppose one has a dataset consisting of a robot arm picking up a Coke can and placing it into an orange lunchbox. That training data is used to teach the arm how to do the task. But in the real world, maybe there is distracting clutter on the countertop. Or, the lunchbox in the training data was empty, but the one on the counter right now already has a sandwich inside it. The further a real-world task differs from the training dataset, the less capable and accurate the robot becomes.

ROSIE aims to alleviate this problem by using image diffusion models (such as Imagen) to enhance the training data in targeted and direct ways. In one example, a robot has been trained to deposit an object into a drawer. ROSIE augments this training by inpainting the drawer in the training data, replacing it with a metal sink. A robot trained on both datasets competently performs the task of placing an object into a metal sink, despite the fact that a sink never actually appears in the original training data, nor has the robot ever seen this particular real-world sink. A robot without the benefit of ROSIE fails the task.
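The heavy lifting in ROSIE is done by large diffusion models, but the data flow itself is simple. Here’s a minimal Python outline of the augmentation step; the `mask_fn` and `inpaint_fn` hooks stand in for the real region-detection and Imagen-based inpainting models and are purely illustrative:

```python
import random

def augment_episode(frames, mask_fn, inpaint_fn, prompt):
    """ROSIE-style augmentation sketch: replace a masked region in each
    camera frame with inpainted content matching a text prompt
    (e.g. "a metal sink"). Only the pixels change."""
    return [inpaint_fn(frame, mask_fn(frame), prompt) for frame in frames]

def build_training_set(episodes, mask_fn, inpaint_fn, prompts):
    """Train on the union of real and semantically imagined episodes.
    Each episode is (frames, actions); the action labels are reused
    unchanged for the imagined version."""
    augmented = [
        (augment_episode(frames, mask_fn, inpaint_fn, random.choice(prompts)),
         actions)
        for frames, actions in episodes
    ]
    return episodes + augmented
```

The key design point is that the robot’s recorded actions are still valid for the imagined scenes, since the inpainting only swaps out visual context, so the dataset doubles for free.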

Here is a link to the team’s paper, and embedded below is a video demonstrating ROSIE both in concept and in action. This is also somewhat reminiscent of a plug-in we recently saw for Blender, which uses an AI image generator to texture entire 3D scenes with a simple text prompt.

Continue reading “Teaching A Robot To Hallucinate”

Tiny Robots That Bring Targeted Drug Delivery And Treatment A Little Bit Closer

Within the world of medical science fiction they are found everywhere: tiny robots that can zip through blood vessels and intestines, where they can deliver medication, diagnose medical conditions and even directly provide treatment. Although much of this is still firmly in the realm of science fiction, researchers at Stanford published work last year on a type of origami-based robot, controlled using an external magnetic field. Details can be found in the Nature Communications paper. Continue reading “Tiny Robots That Bring Targeted Drug Delivery And Treatment A Little Bit Closer”