Nvidia Jetson Robots Get A Head Start With Isaac Software Tools

We live in an exciting time of machine intelligence. Over the past few months, several products have launched offering neural network processors at prices within hobbyist reach. But as exciting as the hardware might be, it still needs software to be useful. Nvidia was not content to rest on its impressive Jetson hardware and has created a software framework to accelerate building robots around it. Anyone willing to create an Nvidia developer account may now play with the Isaac Robot Engine framework.

Isaac initially launched about a year ago as part of a bundle with Jetson Xavier hardware, but the $1,299 developer kit price tag pushed it out of reach for many of us. Now we can buy a Jetson Nano for about a hundred bucks. For those familiar with Robot Operating System (ROS), Isaac will look very familiar: both aim to make robotic software as easy as connecting common modules together. Many of these modules, called GEMs in Isaac, are tailored to the strengths of Nvidia Jetson hardware. In addition to those modules and the plumbing that lets them work together, Isaac also includes a simulator for testing robot code in a virtual world, similar to Gazebo for ROS.

While Isaac can run on any robot with an Nvidia Jetson brain, there are two reference robot designs. Carter is the more expensive and powerful commercially built machine, rolling on Segway motors with LIDAR environmental sensors and a Jetson Xavier. More interesting to us is the Kaya (pictured), a 3D-printed DIY robot rolling on Dynamixel serial bus servos. Kaya senses the environment with an Intel RealSense D435 depth camera and has a Jetson Nano for a brain. Taken together, the hardware and software offerings are a capable and functional package for exploring intelligent autonomous robots.

It is somewhat disappointing that Nvidia decided to create its own proprietary software framework, reinventing many wheels instead of contributing to ROS. While there are some very appealing features like WebSight (a browser-based inspect and debug tool), at first glance Isaac doesn’t seem fundamentally different from ROS. The open source community has already started creating ROS nodes for Jetson hardware, but people who work exclusively in the Nvidia ecosystem or face a time-to-market deadline would appreciate having the option of a pre-packaged solution like Isaac.

6 thoughts on “Nvidia Jetson Robots Get A Head Start With Isaac Software Tools”

  1. What, no comments at all? Seriously?

    My opinion, which you should all feel free to ridicule, is that these ‘operating systems’ are fine for research or toy robots where you want a quick and easy result, but not appropriate for serious machines with safety issues. So much complexity, with node creation and log swapping all on a single device, would invite ever more frequent crashes as more nodes are added.

    So what happens when it crashes? Sure, it’s a pain in the butt to reboot, but what happens if, for some reason, it jams one of the motors ‘on’ rather than just switching everything off?

    Surely we’d want the simplest overall system possible, with other systems checking it continuously for anomalies? For a start, the extremely complicated object recognition should run on its own system, so that the machine can still function and reboot a crashed subsystem without driving randomly into a crowded mall.

    1. Indeed; they are not safe without redundancy and fail-safes, plus lots of testing and lots of iterations. But it seems Nvidia is pushing forward with their autonomous driving systems. If I recall correctly, they passed the most recent certification testing using dual Xavier units…(?)

      But learning on a $100 machine, a $500 machine, or a $1,300 machine doesn’t matter; it all suits Nvidia. If people have products and projects that encourage the use of CUDA and proprietary CUDA cores, it works to their benefit. Any way to get Nvidia products out there on a massive scale will only serve them. Because if a truly marketable driving solution is five years out, some high-school senior might have just purchased a Nano and gotten CUDA going; after four years of college with more and more CUDA, perhaps that’s the person who will write that critical component, that unique string of code, that sells a few million units.

      It’s a small investment where grandiose promises haven’t really been made, but at the same time it cultivates specifically qualified workers for future developments.

      That aside, I’ve got my Nano, my Kinect 2, some USB cameras, and an SSD. I’ve been working on getting the Kinect 2 working; once it does, all that tech is getting stapled onto my vehicle, and I’ll have an OBD-II interface going. Once it’s all rigged up, it’s hands-free driving on the Interstate for me!

    2. I know it’s a bit late to reply, but let me just leave my opinion here.

      You don’t just plug your motors into the computer running such an OS. Nowadays, the “simplest overall system possible” is simple and cheap enough that every robot platform already has one inside. You just send a velocity command via LAN, serial over USB, or any other means the platform provides, and the platform will move. Or not, if it detects an “anomaly”.
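      For illustration, here is a minimal Python sketch of what “just send a velocity command” can look like at the byte level. The packet format (a header byte, two little-endian floats, and an XOR checksum) and the serial device path in the comment are invented for this example; every real platform defines its own framing:

```python
import struct

def make_velocity_packet(linear_mps: float, angular_rps: float) -> bytes:
    """Pack a hypothetical velocity command: header byte, two little-endian
    32-bit floats (linear m/s, angular rad/s), and a one-byte XOR checksum."""
    payload = struct.pack("<ff", linear_mps, angular_rps)
    checksum = 0
    for b in payload:
        checksum ^= b
    return b"\xaa" + payload + bytes([checksum])

# The resulting bytes would then be written to the platform, e.g. with pyserial:
#   serial.Serial("/dev/ttyUSB0", 115200).write(make_velocity_packet(0.2, 0.0))
packet = make_velocity_packet(0.2, 0.0)
```

      The host computer only ever emits commands like this; watchdogs, motor-jam detection, and the decision to stop on a bad checksum live in the platform’s own firmware.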

      State-of-the-art methods for robots are not easily computed on a microcontroller. You can easily find “our method takes 0.x seconds on an i7 computer, thus it is safe to say it runs in real time” or something like that in a thesis. As such, what happens on the microcontroller is usually trivial enough to be explicit about.

      What ROS and Isaac worry about is standardizing messages among the tremendous variety of sensors and actuators, so that you don’t need to rewrite your algorithm every time you put a new sensor on your robot.
