[Image: Atlas robot jumps over a gap]

Boston Dynamics Atlas Dynamic Duo Tackles Obstacle Course

Historically, the capabilities of real-world humanoid robots have trailed far behind their TV and movie counterparts. But roboticists keep pushing the state of the art forward, and Boston Dynamics just shared a progress report: their research platform Atlas can now complete a two-robot parkour routine.

Watching the minute-long routine on YouTube (embedded after the break) shows movements more demanding than their dance to the song “Do You Love Me?” And according to Boston Dynamics, this new capability is even more impressive than it looks. Unlike the preprogrammed motions that made up earlier dance performances, this routine has Atlas making far more use of its onboard sensors to perceive its environment, and of its onboard computing power to decide how best to move through the world on a case-by-case basis. It also needed to string individual actions together into a continuous sequence, something it had trouble doing earlier.

Such advances are hard to judge from a robot demonstration video, since these are frequently edited and curated to show the highlighted successes and skip all the (many, many) failures along the way. Certainly Boston Dynamics has done so themselves before, but this time the routine is accompanied by almost six minutes’ worth of behind-the-scenes footage. (Also after the break.) We see the robots stumbling as they learned, and the humans working to put them back on their feet.

Humanoid robot evolution has not always gone smoothly (sometimes entertainingly so), but Atlas is leaps and bounds ahead of predecessors like Honda’s ASIMO. Such research finds its way into less humanoid-looking robots like Stretch. And who knows, maybe one day real robots will be like their TV and movie counterparts that have, for so long, been played by humans inside costumes.

Continue reading “Boston Dynamics Atlas Dynamic Duo Tackles Obstacle Course”

Boston Dynamics Stretch Robot Trades Lab Coat For Work Uniform

Boston Dynamics has always built robots with agility few others could match. That agility makes for great attention-getting demos, but from outside the company it hasn’t been clear how those acrobatic skills would translate into revenue. Now we’re getting a peek at the plan, thanks to an IEEE Spectrum interview about their new robot Stretch.

Most Boston Dynamics robots have been research projects, too expensive and not designed for mass production. The closest they’ve come to a product so far is Spot, which was offered for sale and picked up a few high-profile jobs like inspecting SpaceX test sites. But Spot was still fairly experimental, without an explicit application. In contrast, Stretch has a laser-sharp focus made clear by its official product page: this robot will be looking for warehouse jobs. Specifically, Stretch is designed to handle boxes up to 50 lbs (23 kg), loading and unloading them to and from pallets, conveyor belts, trucks, and shipping containers. These jobs are repetitive, tedious, back-breaking work with a high injury rate: a perfect opportunity for robots.

But warehouse logistics aren’t as tightly structured as factory automation, demanding more adaptability than typical industrial robots can offer. It’s a niche Boston Dynamics learned it could fill after releasing an earlier demo video showing their research robot Atlas moving some boxes around: they started receiving inquiries about how much that would cost. Atlas is not a product, but wheels were set in motion, leading to their Handle robot. Learning from what Handle did well (and not so well) in a warehouse environment, the design evolved into today’s Stretch. The ostrich-like Handle prototype is now relegated to further research into wheeled-legged robots and the occasional fun dance video.

The Stretch preproduction prototypes visible in these videos lack the acrobatic flair of their predecessors, but they still have the perception and planning smarts that made those robots possible; those skills are just being applied to a narrower problem scope. Once production models are on the job, we look forward to reading some work performance reviews.

Continue reading “Boston Dynamics Stretch Robot Trades Lab Coat For Work Uniform”

Manipulators Get A 1000x FPGA-based Speed Bump

For humans, moving our arms and hands onto an object to pick it up is pretty easy, but for robotic manipulators, it’s a different story. Once we’ve found the object we want our robot to pick up, we still need to plan a path from the robot’s hand to the object, all the while lugging the remaining limbs along for the ride without snagging them on any nearby obstacles. The space of all possible joint configurations is called the “joint configuration space,” and finding a collision-free path through it is called path planning, a problem that’s tricky to solve quickly in the world of robotics.

These days, roboticists have nailed down a few algorithms, but executing them takes hundreds of milliseconds to compute. The result? Robots spend most of their time “thinking” about moving rather than executing the actual move.

Robots had been lurching along pretty slowly until recently, when researchers at Duke University [PDF] pushed much of the computation into hardware on an FPGA. The result? Path planning for a 6-degree-of-freedom arm now takes under a millisecond to compute!

It’s worth asking: why is this problem so hard, and how did hardware make it faster? There are a few layers here, but it’s worth investigating the big ones. Planning a path from point A to point B usually happens probabilistically: the planner samples random configurations and iterates toward the finishing point, and if a path exists, it will eventually find one (a minimal sketch of this idea follows below). The issue, however, arises when we need to lug our remaining limbs through the space to reach that object. The shape in question is called the swept volume: the entire region that our ’bot’s limbs pass through while getting from A to B. We need a collision-free path not just for the hand, but for the entire set of joints.
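To make the probabilistic planning idea concrete, here’s a minimal RRT-style sketch in Python. It is purely illustrative: the joint limits, step size, goal bias, and stubbed-out collision check are our own assumptions, not details from the Duke implementation. Note that edge_is_free is exactly the expensive swept-volume test that the FPGA work accelerates.

```python
import math
import random

JOINT_LIMITS = [(-math.pi, math.pi)] * 6  # hypothetical 6-DOF arm

def sample_config():
    """Draw a random point in joint configuration space."""
    return tuple(random.uniform(lo, hi) for lo, hi in JOINT_LIMITS)

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def steer(start_q, target_q, step=0.1):
    """Move from start_q toward target_q by at most `step` radians."""
    d = distance(start_q, target_q)
    if d <= step:
        return target_q
    return tuple(s + (t - s) * step / d for s, t in zip(start_q, target_q))

def edge_is_free(q1, q2):
    """Stub for the expensive part: does the swept volume of the whole
    arm moving from q1 to q2 hit any obstacle? Always free here."""
    return True

def rrt(start, goal, iterations=5000, goal_tol=0.2):
    """Grow a random tree from start; return a joint-space path if found."""
    parents = {start: None}
    for _ in range(iterations):
        # A small goal bias keeps the tree growing toward the target.
        target = goal if random.random() < 0.1 else sample_config()
        nearest = min(parents, key=lambda q: distance(q, target))
        new_q = steer(nearest, target)
        if edge_is_free(nearest, new_q):
            parents[new_q] = nearest
            if distance(new_q, goal) < goal_tol:
                path = [new_q]  # walk parent links back to start
                while parents[path[-1]] is not None:
                    path.append(parents[path[-1]])
                return path[::-1]
    return None  # probabilistically complete: more iterations help

start = tuple(0.0 for _ in JOINT_LIMITS)
goal = tuple(0.5 for _ in JOINT_LIMITS)
path = rrt(start, goal)
print(f"path with {len(path)} waypoints" if path else "no path found")
```

In a real planner, edge_is_free dominates the runtime because it must be evaluated for every candidate edge, which is why it’s the piece worth pushing into hardware.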

[Image: swept volume of a robot arm. Image Credit: Robot Motion Planning on a Chip]

Encoding a map on a computer is done by discretizing the space into 3D voxels at a sufficient resolution. If a voxel is occupied by an obstacle, it gets one state; if not, it gets another. To check whether a path is OK, the set of voxels representing the swept volume is compared against the voxels representing the environment. Here’s where the FPGA delivers the speed bump: in the hardware implementation, voxel occupancy is encoded in bits, and the entire volume comparison is done in parallel. Nifty to have custom hardware for this, right? A toy software version of the same trick is sketched below.
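Here’s that toy version in Python. The grid size and the specific voxel coordinates are made up for illustration; where this code packs and compares bits sequentially in software, the FPGA evaluates all voxels at once in dedicated logic.

```python
GRID = 16  # voxels per side; real resolution is a design trade-off

def voxel_bit(x, y, z):
    """Index of the bit representing voxel (x, y, z)."""
    return (z * GRID + y) * GRID + x

def pack(voxels):
    """Pack a set of occupied (x, y, z) voxels into one big bitmask."""
    mask = 0
    for x, y, z in voxels:
        mask |= 1 << voxel_bit(x, y, z)
    return mask

# Hypothetical data: the swept volume of one candidate motion, and the
# obstacles currently in the environment.
swept = pack({(3, 4, 5), (3, 4, 6), (4, 4, 6)})
environment = pack({(8, 8, 8), (4, 4, 6)})

# The collision test collapses to a single AND: any shared set bit
# means the motion would sweep through an obstacle.
collides = (swept & environment) != 0
print(collides)  # True: voxel (4, 4, 6) appears in both masks
```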

We applaud the folks at Duke University for getting this up and running, and we can’t wait to see custom “robot path-planning chips” hit the market some day. For now, though, if you’d like to sink your teeth into seeing how FPGAs can parallelize conventional algorithms, check out our linear-time sorting feature from a few months back.

Continue reading “Manipulators Get A 1000x FPGA-based Speed Bump”