This Week In Security: Nvidia, Ransomware Retirement, And A TOCTOU Bug In Docker

Nvidia’s GeForce Experience (GFE) is the companion application for the Nvidia drivers, keeping said drivers up to date as well as adding features for live streaming and media capture. The application runs in two parts, a GUI and a system service, which communicate over an HTTP API. [David Yesland] from Rhino Security Labs decided to look into this API, searching for interesting, undocumented behavior, and shared the results on Sunday the 2nd.

The first interesting finding was that the service was written in JavaScript and run using Node.js. Since JavaScript is an interpreted scripting language rather than a compiled one, the source code of the service was open for study. That study led to the revelation that API requests would be accepted from any origin, so long as the request included the proper security token. The application includes an update mechanism, which allows an authorized API call to execute an arbitrary system command. So long as the authentication token isn’t leaked to an attacker, this still isn’t a problem, right? Continue reading “This Week In Security: Nvidia, Ransomware Retirement, And A TOCTOU Bug In Docker”
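To make the risk concrete, here’s a minimal sketch of the kind of request the write-up describes, assuming an attacker who has already obtained the token. The port, endpoint path, header name, and payload below are hypothetical stand-ins for illustration, not the actual GFE API:

```python
import requests

# Hypothetical values for illustration only; the real service's port,
# endpoint, and token header are detailed in the Rhino Security Labs
# write-up, not reproduced here.
GFE_PORT = 11000                    # assumed local service port
TOKEN = "leaked-security-token"     # the per-session security token

# The service accepts requests from any origin, so the token is the
# only thing standing between an attacker and command execution.
resp = requests.post(
    f"http://localhost:{GFE_PORT}/update/start",  # hypothetical endpoint
    headers={"X-Security-Token": TOKEN},          # hypothetical header
    json={"command": "calc.exe"},                 # arbitrary command
    timeout=5,
)
print(resp.status_code, resp.text)
```

The design makes the token the single gate: anything that can read it, whether another local process or a leak into logs or web pages, gets command execution for free.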

Nvidia Jetson Robots Get A Head Start With Isaac Software Tools

We live in an exciting time of machine intelligence. Over the past few months, several products have been launched offering neural network processors at a price within hobbyist reach. But as exciting as the hardware might be, it still needs software to be useful. Nvidia was not content to rest on their impressive Jetson hardware and has created a software framework to accelerate building robots around it. Anyone willing to create a Nvidia developer account may now play with the Isaac Robot Engine framework.

Isaac initially launched about a year ago as part of a bundle with Jetson Xavier hardware, but the $1,299 developer kit price tag pushed it out of reach for many of us. Now we can buy a Jetson Nano for about a hundred bucks. For those familiar with the Robot Operating System (ROS), Isaac will look very familiar: both aim to make robotic software as easy as connecting common modules together. Many of these modules, called GEMs in Isaac, are tailored to the strengths of Nvidia’s Jetson hardware. In addition to the modules and the plumbing that lets them work together, Isaac includes a simulator for testing robot code in a virtual world, similar to Gazebo for ROS.

While Isaac can run on any robot with an Nvidia Jetson brain, there are two reference robot designs. Carter is the more expensive and powerful of the two: a commercially built machine rolling on Segway motors, sensing its environment with LIDAR, and thinking with a Jetson Xavier. More interesting to us is the Kaya (pictured), a 3D-printed DIY robot rolling on Dynamixel serial bus servos. Kaya senses the environment with an Intel RealSense D435 depth camera and has a Jetson Nano for a brain. Taken together, the hardware and software offerings make a capable and functional package for exploring intelligent autonomous robots.

It is somewhat disappointing that Nvidia decided to create their own proprietary software framework, reinventing many wheels instead of contributing to ROS. While there are some very appealing features, like WebSight (a browser-based inspection and debugging tool), at first glance Isaac doesn’t seem fundamentally different from ROS. The open source community has already started creating ROS nodes for Jetson hardware, but people who work exclusively in the Nvidia ecosystem, or who face a time-to-market deadline, would appreciate having the option of a pre-packaged solution like Isaac.
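For readers who haven’t used ROS, the connect-the-modules model both frameworks share looks roughly like the sketch below. This is a plain ROS 1 Python node, not Isaac code (whose GEM APIs differ); it publishes velocity commands to a topic that another module, such as a motor driver node, would subscribe to:

```python
#!/usr/bin/env python
# A standard ROS 1 node, shown for comparison; Isaac's GEMs play an
# analogous role but use their own APIs and message passing.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node("simple_driver")
    # Another module (e.g. a motor-driver node) subscribes to cmd_vel.
    pub = rospy.Publisher("cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    msg = Twist()
    msg.linear.x = 0.2     # creep forward at 0.2 m/s
    while not rospy.is_shutdown():
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    try:
        drive_forward()
    except rospy.ROSInterruptException:
        pass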

Nvidia Teaching Robots To Master IKEA Kitchens

The current wave of excitement around machine learning kicked off when graphics processors were repurposed to make training deep neural networks practical. Nvidia found themselves the engine of a new revolution and seized the opportunity to help push the frontiers of research. Their research lab in Seattle will focus on one such field: making robots smart enough to work alongside humans in an IKEA kitchen.

Today’s robots are mostly industrial machines that require workspaces designed for robots. They run day and night, performing repetitive tasks, usually inside cages to keep squishy humans out of harm’s way. Robots will need to be a lot smarter about their surroundings before we can safely dismantle those cages. While some industrial robots are making a start in this arena, they have a hard time justifying their price premium; witness the financial difficulties of Rethink Robotics, maker of the Baxter and Sawyer robots.

So there’s a lot of room for improvement in this field, and this evolution will need a training environment offering tasks of varying difficulty, anywhere from the rigidly structured settings where robots work well today to the dynamic, unstructured settings where robots are hopelessly lost. Lab lead Dr. Dieter Fox explained how a kitchen is ideal. A meticulously cleaned and organized kitchen is very similar to an industrial setting. From there, we can gradually make the kitchen more challenging for a robot. For example, today’s robots can easily pick up a can with its rigid, regular shape, but what about a half-full bag of flour? And from there, they can learn to pick up a piece of fresh fruit without bruising it. These challenges are shared with many other tasks outside of a kitchen.

This isn’t about building a must-have home cooking robot; it’s about working through the range of challenges shared with common kitchen tasks. The lab has a lot of neat hardware, but its success will be measured by the software, and like all research, published results should be reproducible by other labs. You don’t have a high-end robotics lab in your house, but you do have a kitchen. That’s why it’s not just any kitchen but an IKEA kitchen: they are standardized, affordable, and available around the world, giving other robotics researchers a common environment to benchmark against.

Most of us can experiment in a kitchen, IKEA or not. We have access to all the other tools we need: affordable AI hardware from Google, from Beaglebone, and from Nvidia. And we certainly have no shortage of robot arms and manipulators on these pages, ranging from a small laser-cut MeArm to our 2018 Hackaday Prize winner Dexter.

AI At The Edge Hack Chat

Join us Wednesday at noon Pacific time for the AI at the Edge Hack Chat with John Welsh from NVIDIA!

Machine learning was once the business of big iron like IBM’s Watson or the nearly limitless computing power of the cloud. But the power in AI is moving away from data centers to the edge, where IoT devices are doing things once unheard of. Embedded systems capable of running modern AI workloads are now cheap enough for almost any hacker to afford, opening the door to applications and capabilities that were once only science fiction dreams.

John Welsh is a Developer Technology Engineer with NVIDIA, a leading company in the Edge computing space. He’ll be dropping by the Hack Chat to discuss NVIDIA’s Edge offerings, like the Jetson Nano we recently reviewed. Join us as we discuss NVIDIA’s complete Jetson embedded AI product lineup, getting started with Edge AI, and where Edge AI is headed.


Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, May 1 at noon Pacific time. If time zones have got you down, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.

But Can Your AI Recognize Slugs?

The common garden slug is a mystery. Observing these creatures as they slowly emerge from their slimy lairs each evening, it’s hard to imagine how much damage they can do. With paradoxical speed, they can mow down row after row of tender seedlings, leaving nothing but misery in their mucusy wake.

To combat this slug menace, [Tegwyn☠Twmffat] (the [☠] is silent) is developing this AI-powered slug-busting system. The squeamish, or those troubled by the ethics of slug eradication, can relax: no slugs have been harmed yet. So far [Tegwyn] has concentrated on detecting slugs, a decidedly non-trivial problem, since few AI models come pre-trained on slugs.

So far, [Tegwyn] has acquired 5,712 images of slugs in their natural environment – no mean feat, as they only come out at night, they blend into their background, and their slimy surface makes for challenging reflections. The video below shows the trained model having moderate success with a static image of a slug; it also gives a glimpse of the hardware used, which includes an Nvidia Jetson TX2. [Tegwyn] plans to capture even more images to refine the model and boost it from its current 50 to 60% confidence level to something that will allow for the remediation phase of the project, which apparently involves lasers. He’s willing to entertain other methods of disposal, though; perhaps a salt-shooting turret gun?
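We don’t know the details of [Tegwyn]’s software stack, but the confidence problem is generic to object detectors: the model emits a score per detection, and downstream logic has to pick a threshold. A minimal sketch with made-up numbers shows why 50 to 60% isn’t enough to gate a laser:

```python
# Generic detector post-processing; these detections are made-up data.
# A real pipeline (TensorFlow, PyTorch, etc.) would produce a similar
# list of (label, confidence, bounding box) records per camera frame.
detections = [
    {"label": "slug", "confidence": 0.58, "box": (120, 340, 40, 18)},
    {"label": "slug", "confidence": 0.31, "box": (400, 102, 35, 16)},
    {"label": "leaf", "confidence": 0.77, "box": (222, 215, 60, 42)},
]

FIRE_THRESHOLD = 0.90  # don't actuate anything on a 60% guess

targets = [
    d for d in detections
    if d["label"] == "slug" and d["confidence"] >= FIRE_THRESHOLD
]

if targets:
    print(f"engage {len(targets)} target(s)")
else:
    # At 50-60% confidence the model never clears the bar, which is
    # why more training images come before the remediation phase.
    print("no confident detections; hold fire")
```

Set the threshold too low and the system lases the lawn; too high and every slug walks (slides) free, which is why growing the training set comes first.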

This isn’t the first garden-tending project [Tegwyn] has tackled. You may recall The Weedinator, his 2018 Hackaday Prize entry. This slug buster is one of his entries for the 2019 Hackaday Prize, which was just announced. We’re looking forward to seeing the onslaught of cool new projects everyone will be coming up with.

Continue reading “But Can Your AI Recognize Slugs?”

Hands-On: New Nvidia Jetson Nano Is More Power In A Smaller Form Factor

Today, Nvidia released their next generation of small but powerful modules for embedded AI. It’s the Nvidia Jetson Nano, and it’s smaller, cheaper, and more maker-friendly than anything they’ve put out before.

The Jetson Nano follows the Jetson TX1, the TX2, and the Jetson AGX Xavier: all very capable platforms, but their physical size, price, and cost of implementation put them just out of reach for many product designers and nearly all hobbyist embedded enthusiasts.

The Nvidia Jetson Nano Developer Kit clocks in at $99 USD, available right now, while the production-ready module will be available in June for $129. It’s the size of a stick of laptop RAM, and it only needs five watts. Let’s take a closer look with a hands-on review of the hardware.

Continue reading “Hands-On: New Nvidia Jetson Nano Is More Power In A Smaller Form Factor”

Uncertain Future Of Orphaned Jibo Robots Presents Opportunities

In our modern connected age, our devices have become far more powerful and useful by drawing upon the resources of a global data network. The downside of a cloud-connected device is the risk of becoming over-reliant on computers outside of our own control. The people who brought a Jibo into their homes got a stark reminder of this fact when some (but not all) Jibo robots gave their owners a farewell message as their servers were shut down, leaving behind little more than a piece of desktop sculpture.

Jibo launched their Indiegogo crowdfunding campaign with the tagline “The World’s First Social Robot For The Home.” Full of promises of how Jibo would be an intelligent addition to a high-tech household, it always struggled to justify its price tag. It cost as much as a high-end robot vacuum, but without the house-cleaning utility. Many demonstrations of Jibo’s capabilities centered around its voice control, which an Amazon Echo or Google Home could match at a fraction of the price.

By the end of 2018, all assets and intellectual property had been sold to SQN Venture Partners, who have said little about what they plan to do with the acquisition. Some Jibo owners still hold out hope for a bright future, both on the official forums (for however long those stay running) and on unofficial channels like Reddit. Other owners have given up and unplugged their participation in this social home robotics experiment.

If you see one of these orphans in your local thrift store for a few bucks, consider adopting it. You could join the group hoping for something interesting down the line, but you’re probably more interested in its hacking potential: there is an Nvidia Jetson inside, good for running neural networks. It’s probably a Tegra K1 variant, since Jibo used the Jetson TK1 to develop the robot before launch. Jibo always promised a developer SDK for the rest of us to extend the robot’s capabilities, but it never really materialized. The inactive Github repo consists mainly of code talking to servers that are now offline, with little that deals directly with the hardware.

Jibo claimed thousands were sold, and if they start becoming widely and inexpensively available, we look forward to a community working to give new purpose to these poor abandoned robots. If you know of anyone who has done a teardown to see exactly what’s inside, or if someone has examined the upgrade files to create custom Jibo firmware, feel free to put a link in the comments and help keep these robots out of the e-waste stream.

If you want to experiment with power-efficient neural network accelerators but would rather work with an officially supported development platform, we’ve looked at the Jetson TK1’s successors, the TX1 and TX2. More recently, Google has launched an accelerator of their own, as have our friends at Beaglebone.