Nvidia Jetson Robots Get A Head Start With Isaac Software Tools

We live in an exciting time of machine intelligence. Over the past few months, several products have been launched offering neural network processors at prices within hobbyist reach. But as exciting as the hardware might be, it still needs software to be useful. Nvidia was not content to rest on their impressive Jetson hardware and has created a software framework to accelerate building robots around it. Anyone willing to create an Nvidia developer account may now play with the Isaac Robot Engine framework.

Isaac initially launched about a year ago as part of a bundle with Jetson Xavier hardware, but the $1,299 developer kit price tag pushed it out of reach for many of us. Now we can buy a Jetson Nano for about a hundred bucks. For those familiar with Robot Operating System (ROS), Isaac will look very familiar: both aim to make robotic software as easy as connecting common modules together. Many of these modules, called GEMs in Isaac, are tailored to the strengths of Nvidia Jetson hardware. In addition to those modules and the ways for them to work together, Isaac also includes a simulator for testing robot code in a virtual world, similar to Gazebo for ROS.
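To make the module-graph idea concrete, here is a toy publish/subscribe sketch in Python. To be clear, this is not the Isaac API (nor ROS); the Graph class, channel names, and modules are all invented for illustration of the shared concept: independent modules wired together by message passing.

```python
# Illustrative sketch only: NOT the Isaac API, just a toy
# publish/subscribe graph showing the module-wiring concept that
# Isaac's GEMs and ROS nodes both share.

from collections import defaultdict

class Graph:
    """A minimal message bus: modules publish to named channels."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def connect(self, channel, callback):
        self.subscribers[channel].append(callback)

    def publish(self, channel, message):
        for callback in self.subscribers[channel]:
            callback(message)

def camera_module(graph):
    # Stand-in for a sensor module: emits fake "frames".
    for frame_id in range(3):
        graph.publish("camera/frames", {"id": frame_id, "pixels": "..."})

def detector_module(message):
    # Stand-in for a perception module: consumes frames downstream.
    print(f"detector got frame {message['id']}")

graph = Graph()
graph.connect("camera/frames", detector_module)
camera_module(graph)
```

The appeal of this architecture, in both Isaac and ROS, is that the camera module knows nothing about the detector: swap in a different perception module and the rest of the graph is untouched.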

While Isaac can run on any robot with an Nvidia Jetson brain, there are two reference robot designs. Carter is the more expensive and powerful commercially built machine, rolling on Segway motors with LIDAR environmental sensors and a Jetson Xavier. More interesting to us is the Kaya (pictured), a 3D-printed DIY robot rolling on Dynamixel serial bus servos. Kaya senses the environment with an Intel RealSense D435 depth camera and has a Jetson Nano for a brain. Taken together, the hardware and software offerings make a capable and functional package for exploring intelligent autonomous robots.

It is somewhat disappointing that Nvidia decided to create their own proprietary software framework, reinventing many wheels, instead of contributing to ROS. While there are some very appealing features, like WebSight (a browser-based inspection and debugging tool), at first glance Isaac doesn’t seem fundamentally different from ROS. The open source community has already started creating ROS nodes for Jetson hardware, but people who work exclusively in the Nvidia ecosystem or face a time-to-market deadline will appreciate having the option of a pre-packaged solution like Isaac.

Robotic Cheetah Teaches A Motors Class

It seems like modern roboticists have decided to hold a competition to see which group can develop the most terrifying robot ever invented. As of this writing, the leading candidate seems to be the robot that can fuel itself by “eating” organic matter. We can only hope that the engineers involved will decide not to flesh that one out completely. Anyway, if we can get past the horrifying and/or uncanny-valley situations we find ourselves in when looking at these robots, it turns out they have a lot to teach us about the theory behind some complicated electric motors.

This research paper (gigantic PDF warning) focuses on the construction methods behind MIT’s cheetah robot. It has twelve degrees of freedom and uses a number of exceptionally low-cost modular actuators to drive its four legs. Compared to other robots of this type, this design clears the major hurdle of cost while still retaining an impressive amount of mobility and control. The team integrated a brushless motor, a smart ESC system with feedback, and a planetary gearbox into a single compact actuator. That alone is worth the price of admission!
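The back-of-the-envelope math on this kind of quasi-direct-drive actuator is worth a moment. The sketch below walks through it in Python; the torque constant, current, and gear ratio here are illustrative assumptions, not values pulled from the paper.

```python
# Rough torque math for a quasi-direct-drive actuator like the
# cheetah's. All numbers below are illustrative assumptions, not
# the exact values from the MIT paper.

k_t = 0.068          # motor torque constant, Nm per amp (assumed)
peak_current = 40.0  # amps the ESC can deliver briefly (assumed)
gear_ratio = 6.0     # single-stage planetary reduction (assumed)
gear_efficiency = 0.9

motor_torque = k_t * peak_current                       # torque at the rotor
output_torque = motor_torque * gear_ratio * gear_efficiency

print(f"rotor torque:  {motor_torque:.2f} Nm")
print(f"output torque: {output_torque:.2f} Nm")
# A low gear ratio keeps reflected inertia small, so the leg stays
# backdrivable and motor current remains a usable proxy for torque.
```

That last point is why the "smart ESC with feedback" matters: with a low reduction, sensing motor current gets you a decent estimate of leg force without a separate force sensor.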

The details on how they did it are well documented in the 102-page academic document, and the source code is available on GitHub if you need a motor like this for any other sort of project. If you’re here just for the cheetah doing backflips, you can keep up with the build progress at the project’s blog page. We also featured this build earlier in its history.

Use Movie Tools To Make Your Robot Move Like Movie Robots

Robots of the entertainment industry are given life by character animation, where the goal is to emotionally connect with the audience to tell a story. In comparison, real-world robot movement design focuses more on managing physical limitations like sensor accuracy and power management. Tools for robot control are thus more likely to resemble engineering control consoles than artistic character animation tools. When the goal is to build expressive physical robots, we’ll need tools like the ROBiTS project to bridge the two worlds.

As an exhibitor at Maker Faire Bay Area 2019, this group showed off their first demo: a plugin for Autodesk Maya that translates joint movements into the digital pulses controlling standard RC servos. Maya can import the same STL files fed to 3D printers, easily creating a digital representation of a robot. Animators skilled in Maya can then use all the tools they are familiar with, working in the full context of a robot’s structure in the digital world. This is a far more productive workflow for animation artists than manipulating a long, flat list of unintuitive slider controls or writing code by hand.
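The core conversion such a plugin has to perform is simple enough to sketch. Below is a minimal Python version mapping a joint angle to an RC servo pulse width; the 1000–2000 µs range and ±90° limits are typical RC servo conventions, not values taken from the ROBiTS code.

```python
# Minimal sketch of the joint-angle-to-servo-pulse conversion a
# Maya-to-RC-servo plugin needs. Pulse range and angle limits are
# generic RC servo assumptions, not the actual ROBiTS values.

def angle_to_pulse_us(angle_deg, min_us=1000, max_us=2000,
                      min_deg=-90.0, max_deg=90.0):
    """Map a joint angle from the animation rig to a pulse width."""
    angle_deg = max(min_deg, min(max_deg, angle_deg))  # clamp to limits
    span = (angle_deg - min_deg) / (max_deg - min_deg)
    return min_us + span * (max_us - min_us)

# Sampling an animated joint, one value per servo frame (50 Hz is
# the standard RC servo refresh rate):
for frame, angle in enumerate([-45.0, 0.0, 30.0, 90.0]):
    pulse = angle_to_pulse_us(angle)
    print(f"frame {frame}: {angle:6.1f} deg -> {pulse:.0f} us")
```

Everything else, like keyframing, easing, and inverse kinematics, stays in Maya, which is exactly the point: the plugin only has to sample the rig and stream out pulse widths.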

Of course, a virtual world offers some freedoms that are not available in the physical world. Real parts are not allowed to intersect, for one, and then there are other pesky physical limitations like momentum and center of gravity. Forgetting to account for them results in a robot that falls over! One of the follow-up projects on their to-do list is a bridge in the other direction: bringing physical-world sensors like an IMU into digital representations in Maya.

We look forward to seeing more results on their YouTube channel. They join the ranks of other animated robots at Maker Faire and make a promising addition to the toolbox for robot animation, from Disney Research’s kinetic wires to Billy Whiskers, which linked servos to Adobe Animate.


Teardown Video: What’s Inside The Self-Solving Rubik’s Cube Robot

You can find all kinds of robots at Bay Area Maker Faire, but far and away the most interesting bot this year is the Self-Solving Rubik’s Cube built by [Takashi Kaburagi]. Gently mix up the colored sides of the cube, set it down for just a moment, and it will spring to life, sorting itself out again.

I arrived at [Takashi’s] booth at just the right moment: as the battery died. You can see the video I recorded of the battery swap process embedded below. The center tile on the white face of the cube is held on magnetically. Once it is removed, a single captive screw (nice touch!) is loosened to lift off the top side. From there, a couple of the lower corner pieces are lifted out to expose the tiny lithium cell and the wire connector that links it to the robot.

Regular readers will remember seeing this robot when we featured it in September. We had trouble learning details about the project at the time, but since then [Takashi] has shared much more about what went into it. Going back to 2017, the build started with a much larger 3D-printed version of the cube. With proof of concept in hand, the design was modeled in CAD to ensure everything had a carefully planned place. The result is a hand-wired robotic core that feels like science fiction but is very, very real.

I love seeing all of the amazing robots on the grounds of the San Mateo County Event Center this weekend. There is a giant mech wandering the parking lot at the Faire. There’s a whole booth of heavy-metal quadruped bots the size of dogs. And if you’re not careful where you walk, you’ll step on a scaled-down Mars rover. These are all incredible, out-of-this-world builds, and I love them. But the mental leap of moving the traditional cube-solver inside the cube itself, and the craftsmanship necessary to pull it off, make this the most under-appreciated engineering at this year’s Maker Faire Bay Area. I feel lucky to have caught it during a teardown phase! Let’s take a look.


Automate The Freight: Shipping Containers Sorted By Robot Stevedores

Towering behemoths are prowling the docks of Auckland, New Zealand, in a never-ending shuffle of shipping containers, stacking and unstacking them like so many out-sized LEGO bricks. And they’re doing it all without human guidance.

It’s hard to overstate the impact containerized cargo has had on the modern world. The ability to rapidly load and unload ships laden with standardized containers using cranes, and then to plunk those boxes down onto a truck chassis or railcar for land transport, has been a boon to the world’s economy, and it’s one of the main reasons we can order electronic doo-dads from China and have them show up at our doors essentially for free. At least eventually.

As with anything, solving one problem often creates others, and containerization is no different. The advantages of handling one container rather than separately handling the dozen or more pallets that fit inside it are obvious. But what does one then do with a dozen enormous containers? Or hundreds of them?

That’s where these giant self-driving cranes come in, and as we’ll see in this installment of “Automate the Freight”, these autonomous stevedores are helping ports milk as much value as possible out of containerization.


Nvidia Teaching Robots To Master IKEA Kitchens

The current wave of excitement around machine learning kicked off when graphics processors were repurposed to make training deep neural networks practical. Nvidia found themselves the engine of a new revolution and seized the opportunity to help push the frontiers of research. Their research lab in Seattle will focus on one such frontier: making robots smart enough to work alongside humans in an IKEA kitchen.

Today’s robots are mostly industrial machines that require workspaces designed for robots. They run day and night, performing repetitive tasks, usually inside cages to keep squishy humans out of harm’s way. Robots will need to be a lot smarter about their surroundings before we can safely dismantle those cages. While there are some industrial robots making a start in this arena, they have a hard time justifying their price premium. (See, for example, the financial difficulties of Rethink Robotics, maker of the Baxter and Sawyer robots.)

So there’s a lot of room for improvement in this field, and this evolution will need a training environment offering tasks of varying difficulty levels for robots, anywhere from the rigorously structured environments where robots work well today to the dynamic, unstructured environments where robots are hopelessly lost. Lab lead Dr. Dieter Fox explained how a kitchen is ideal. A meticulously cleaned and organized kitchen is very similar to an industrial setting. From there, we can gradually make a kitchen more challenging for a robot. For example, today’s robots can easily pick up a can thanks to its rigid, regular shape, but what about a half-full bag of flour? And from there, they can learn to pick up a piece of fresh fruit without bruising it. These tasks share challenges with many other tasks outside of a kitchen.

This isn’t about building a must-have home cooking robot; it’s about working through the range of challenges shared with common kitchen tasks. The lab has a lot of neat hardware, but its success will be measured by the software, and like all research, published results should be reproducible by other labs. You don’t have a high-end robotics lab in your house, but you do have a kitchen. That’s why it’s not just any kitchen but an IKEA kitchen: they are standardized, affordable, and available around the world for other robotics researchers to benchmark against.

Most of us can experiment in a kitchen, IKEA or not. We have access to all the other tools we need: affordable AI hardware from Google, from BeagleBone, and from Nvidia. And we certainly have no shortage of robot arms and manipulators on these pages, ranging from a small laser-cut MeArm to our 2018 Hackaday Prize winner Dexter.

Robot Hummingbird Imitates Nature

Purdue’s Bio-Robotics lab has been working on a robotic hummingbird and, as you can see in the videos below, has had a lot of success. What’s more, they’ve shared that success on GitHub. If you want to make a flapping-wing robot, this is definitely where to start.

If you’ve ever watched a hummingbird, you know their flight capability is nothing short of spectacular. The Purdue robot flies in a similar fashion (although on a tether to get both power and control information) and relies on each wing having its own motor. The motors not only propel the wings but also act as sensors. For example, they can detect if a wing is damaged, has made contact with something, or has changed performance due to atmospheric conditions.
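The idea of a motor doubling as a sensor boils down to watching for current the flapping command doesn’t explain. Here is a hedged sketch of that concept in Python; the linear current model and threshold are invented for illustration, and the real implementation lives in the Purdue code on GitHub.

```python
# Hedged sketch of using a wing motor as a sensor: compare measured
# current against what the flapping command should draw and flag
# anomalies. The model and threshold are invented for illustration;
# this is not the Purdue implementation.

def expected_current(command_amplitude):
    # Toy linear model of current draw vs. flapping amplitude (assumed).
    return 0.2 + 0.5 * command_amplitude

def wing_anomaly(command_amplitude, measured_current, threshold=0.15):
    """Return True if the wing draws abnormal current for its command."""
    residual = measured_current - expected_current(command_amplitude)
    return abs(residual) > threshold

# A damaged or obstructed wing shows up as a persistent residual:
print(wing_anomaly(0.8, 0.62))  # near the expected 0.6 A -> False
print(wing_anomaly(0.8, 0.95))  # drawing far too much   -> True
```

In practice the residual could point either way: extra drag from contact pushes current up, while a torn wing that moves less air lets it drop, which is why the sketch checks the magnitude of the deviation rather than its sign.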

In addition to the tethered control system, the hummingbird requires a motion capture sensor external to itself and some machine learning. The researchers note that there is sufficient payload capacity to put batteries onboard, though they would also need additional sensors to accomplish totally free flight. It is amazing when you realize that a real hummingbird manages all this with a little bitty brain.

The published code is in Python and is part of three presentations later this month at a technical conference (the IEEE International Conference on Robotics and Automation). If you don’t want to wait on the papers, there’s a post on IEEE Spectrum about the robotic beast available now, and that article contains preprint versions of the papers. The Python code takes a fair bit of horsepower to run, so expect to need a significant flight computer.

The last hummingbird bot we saw was a spy. We’ve also seen robots that were like bees — sort of.
