LEGO-Based Robot Arm With Motion Planning

Robotic arms have found all manner of applications in industry. Whether it's welding cars, painting cars, or installing dashboards in cars, robotic arms can definitely do the job. However, you don't need to be a major automaker to experiment with the technology. You can build your own, complete with proper motion planning, thanks to Arduino and ROS.

Motion planning is important because it makes working with a robotic arm much easier. Rather than manually specifying the rotation of each and every joint for every desired movement, the math is left to software: you tell it where the end effector should go, and it figures out the joint motions required to get there. This functionality is baked into the Robot Operating System (ROS) and proves useful to this project.
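In ROS this kind of planning is usually handled by the MoveIt framework. As a rough illustration (not necessarily how this particular arm is wired up), here is a minimal Python sketch that asks MoveIt to move an end effector to a pose, assuming an existing MoveIt configuration with a planning group named "arm"; the pose values are illustrative:

```python
# A minimal sketch of commanding an end effector pose through MoveIt's
# Python interface (ROS 1). Assumes a MoveIt config with a planning
# group named "arm"; group name and target pose are illustrative.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("lego_arm_demo", anonymous=True)

group = moveit_commander.MoveGroupCommander("arm")

target = Pose()
target.position.x = 0.2      # desired end effector position, in metres
target.position.y = 0.0
target.position.z = 0.15
target.orientation.w = 1.0   # neutral orientation

group.set_pose_target(target)
group.go(wait=True)          # MoveIt plans the joint motions and executes them
group.stop()
group.clear_pose_targets()
```

The point is that the caller never touches individual joint angles; the planner works those out from the pose target.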

The construction of this particular arm is impressive in its simplicity, too. It has 7 degrees of freedom, which is plenty to play with. The arm is built out of LEGO Technic components, which are attached to the servos with the addition of some 3D printed components. It’s a smart and simple way to integrate the servos into the LEGO world, and we’re surprised we don’t see this more often.

Robotic arms remain an area of active research; there are even efforts to allow them to self-correct in the event of damage. Video after the break.


Dashing Diademata Delivers Second Generation ROS

A simple robot that performs line-following or obstacle avoidance can fit all of its logic inside a single Arduino sketch. But as a robot's autonomy increases, its corresponding software gets complicated very quickly. It won't be long before diagnostic monitoring and logging come in handy, or before you want to encapsulate feature areas and orchestrate how they work together. This is where tools like the Robot Operating System (ROS) come in, so we don't have to keep reinventing these same wheels. And Open Robotics just released ROS 2 Dashing Diademata for all of us to use.

ROS is an open source project that's been underway since 2007, with regular releases each named after a turtle species. What makes this one worthy of extra attention? Dashing marks the first long term support (LTS) release of ROS 2, a refreshed second generation of ROS. All high level concepts stayed the same, meaning almost everything in our ROS orientation guide is still applicable in ROS 2. But there were big changes under the hood reflecting technical advances over the past decade.
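To see how familiar things remain at the application level, here is a minimal ROS 2 publisher node in Python using the new rclpy client library; the node and topic names are purely illustrative:

```python
# Minimal ROS 2 (rclpy) publisher: the node/topic/message concepts carry
# straight over from ROS 1, even though the plumbing underneath changed.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class HelloPublisher(Node):
    def __init__(self):
        super().__init__("hello_publisher")
        self.pub = self.create_publisher(String, "chatter", 10)
        self.timer = self.create_timer(1.0, self.tick)   # publish once a second

    def tick(self):
        msg = String()
        msg.data = "hello from ROS 2 Dashing"
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = HelloPublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```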

ROS was built in an age where a Unix workstation cost thousands of dollars, XML was going to be how we communicate all data online, and an autonomous robot cost more than a high-end luxury car. Now we have the $35 Raspberry Pi running Linux, XML has fallen out of favor due to processing overhead, and some autonomous robots are high-end luxury cars. For these and many other reasons, the people of Open Robotics decided it was time to make a clean break from legacy code.

The break has its detractors, as it meant leaving behind the vast library of freely available robot intelligence modules released by researchers over the years. Popular ones were (or will be) ported to ROS 2, and there is a translation bridge sufficient to work with some, but the rest will be left behind. However, this update also resolved many of the deal-breakers preventing adoption outside of research, making ROS more attractive for commercial investment, which should bring more robots into the mainstream.

Judging by responses to the release announcement, there are plenty of people eager to put ROS 2 to work, but it is not the only freshly baked robotics framework around. We just saw Nvidia release their Isaac Robot Engine tailored to make the most of their Jetson hardware.

Nvidia Jetson Robots Get A Head Start With Isaac Software Tools

We live in an exciting time of machine intelligence. Over the past few months, several products have been launched offering neural network processors at a price within hobbyist reach. But as exciting as the hardware might be, they still need software to be useful. Nvidia was not content to rest on their impressive Jetson hardware and has created a software framework to accelerate building robots around them. Anyone willing to create a Nvidia developer account may now play with the Isaac Robot Engine framework.

Isaac initially launched about a year ago as part of a bundle with Jetson Xavier hardware. But the $1,299 developer kit price tag pushed it out of reach for many of us. Now we can buy a Jetson Nano for about a hundred bucks. For those familiar with Robot Operating System (ROS), Isaac will look very familiar. They both aim to make robotic software as easy as connecting common modules together. Many of these modules, called GEMs in Isaac, were tailored to the strengths of Nvidia Jetson hardware. In addition to those modules and ways for them to work together, Isaac also includes a simulator for testing robot code in a virtual world, similar to Gazebo for ROS.

While Isaac can run on any robot with an Nvidia Jetson brain, there are two reference robot designs. Carter is the more expensive and powerful of the two: a commercially built machine rolling on Segway motors, with LIDAR environmental sensors and a Jetson Xavier for a brain. More interesting to us is the Kaya (pictured), a 3D-printed DIY robot rolling on Dynamixel serial bus servos. Kaya senses the environment with an Intel RealSense D435 depth camera and has a Jetson Nano for a brain. Taken together, the hardware and software offerings are a capable and functional package for exploring intelligent autonomous robots.

It is somewhat disappointing Nvidia decided to create their own proprietary software framework reinventing many wheels, instead of contributing to ROS. While there are some very appealing features like WebSight (a browser-based inspect and debug tool), at first glance Isaac doesn't seem fundamentally different from ROS. The open source community has already started creating ROS nodes for Jetson hardware, but people who work exclusively in the Nvidia ecosystem or face a time-to-market deadline would appreciate having the option of a pre-packaged solution like Isaac.

Little Lamp To Learn Longer Leaps

Reinforcement learning is a subset of machine learning where the machine is scored on its performance by an evaluation function. Over the course of a training session, behavior that improves the final score is positively reinforced, gradually building towards an optimal solution. [Dheera Venkatraman] thought it would be fun to use reinforcement learning to make a little robot lamp move. But before that can happen, he had to build the hardware and prove its basic functionality with a manual test script.
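The core idea is simple enough to sketch in a few lines. The snippet below is a generic tabular Q-learning update, not [Dheera]'s eventual training setup, just to show what "behavior that improved the score gets reinforced" looks like in code:

```python
# Generic tabular Q-learning sketch: actions that led to a better score get
# their value estimates nudged upward, so they are chosen more often next time.
import random

q = {}                                   # (state, action) -> estimated value
alpha, gamma, epsilon = 0.1, 0.95, 0.1   # learning rate, discount, exploration

def choose(state, actions):
    if random.random() < epsilon:        # occasionally explore something new
        return random.choice(actions)
    return max(actions, key=lambda a: q.get((state, a), 0.0))

def learn(state, action, reward, next_state, actions):
    best_next = max(q.get((next_state, a), 0.0) for a in actions)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
```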

Inspired by the hopping logo of Pixar Animation Studios, this particular form of locomotion has a few counterparts in the natural world. But hoppers of the natural world don't take the shape of a Luxo lamp, making this project an interesting challenge. [Dheera] published all of his OpenSCAD files for this 3D-printed lamp so others could join in the fun. Inside the lamp head is an LED ring to illuminate where we expect a light bulb, while also leaving room in the center for a camera. The articulation servos are driven by a PCA9685 I2C PWM driver board, and he has written and released code to interface such boards with the Robot Operating System (ROS), which orchestrates the lamp's features. This completes the underlying hardware components and associated software foundations for this robot lamp.
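For reference, driving hobby servos from a PCA9685 takes only a few lines of Python. The sketch below uses Adafruit's ServoKit library rather than [Dheera]'s ROS driver, and the channel assignment is made up:

```python
# Sweeping one joint via a PCA9685 16-channel PWM board.
# Uses the Adafruit ServoKit library (not [Dheera]'s ROS node);
# the channel number is illustrative.
import time
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)      # PCA9685 on the default I2C address

BASE_JOINT = 0                   # hypothetical channel for the base servo
for angle in (60, 90, 120, 90):
    kit.servo[BASE_JOINT].angle = angle
    time.sleep(0.5)
```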

Once all the parts have been printed, electronics wired, and everything assembled, [Dheera] hacked together a simple "Hello World" script to verify his mechanical design is good enough to get started. The video embedded after the break was taken at OSH Park's Bring-A-Hack afterparty following Maker Faire Bay Area 2019. This motion sequence was frantically hand-coded in 15 minutes, but these tentative baby hops will serve as a great baseline. Future hopping performance of control algorithms trained by reinforcement learning will show how far this lamp has grown from this humble "Hello World" hop.

[Dheera] had previously created the shadow clock and is no stranger to ROS, having created the ROS topic text visualization tool for debugging. We will be watching to see how robot Luxo evolves; hopefully it doesn't find a way to cheat! Want to play with reinforcement learning, but prefer wheeled robots? Here are a few options.


ROS Gets Quick Sensor Debugging In The Terminal

Sensors are critical in robotics. A robot relies on its sensor package to perform its programmed duties. If sensors are damaged or non-functional, the robot can perform unpredictably, or even fail entirely. [Dheera Venkatraman] has been working to make debugging sensor issues easier with the rosshow package for Robot Operating System.

Normally, if you want to be certain a camera feed is working on a robot, you'd have to connect a monitor and other peripherals, check manually, then put everything away again when you're finished. [Dheera] considered this altogether too much of a pain for basic sensor checks.

Instead, rosshow uses the power of SSH to speed things along. Log in to the robot, fire off a few command line instructions, and rosshow will start displaying sensor data in the terminal on your remote machine. It’s achieved through the use of Unicode Braille art in the terminal.  Sure, you won’t get a full-resolution feed from your high-definition camera, and the display from the laser scanner isn’t exactly perfect. But it’s enough to provide an instant verification that sensors are connected and working, and will speed up those routine is-it-connected checks by an order of magnitude.
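For comparison, the do-it-by-hand alternative is usually a throwaway script along these lines (ROS 1, with an illustrative topic name), which only tells you data is flowing rather than letting you eyeball it the way rosshow does:

```python
# The kind of throwaway "is the sensor alive?" check rosshow replaces:
# wait briefly for one message on a topic and report whether it arrived.
import rospy
from sensor_msgs.msg import Image

rospy.init_node("sensor_check", anonymous=True)
try:
    msg = rospy.wait_for_message("/camera/image_raw", Image, timeout=5.0)
    rospy.loginfo("camera OK: got a %dx%d frame" % (msg.width, msg.height))
except rospy.ROSException:
    rospy.logwarn("no camera data within 5 seconds")
```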

Robot Operating System is a particularly useful platform if you’re thinking about the software platform for your next build. If you do put something together, be sure to let us know.

Buy Or Build An Autonomous Race Car To Take The Checkered Flag

Putting autonomous vehicles on public roads takes major resources beyond most of our means. But we can explore all the same general concepts at a smaller scale by modifying remote-control toy cars, limited only by our individual budgets and skill levels. For those of us whose interest and expertise lie in software, Amazon Web Services just launched AWS DeepRacer: a complete package for exploring machine learning on autonomous vehicles.

At a hardware level, the spec sheet makes it sound like they've bolted their AWS DeepLens machine vision computer on an 1/18th scale monster truck chassis. But the hardware is only the tip of the iceberg. The software behind DeepRacer is AWS RoboMaker, a set of tools for applying AWS to robot development, covering everything from running digital simulations to training neural networks in the cloud. Don't know enough about machine learning? No problem! Amazon has also just opened up their internal training curriculum to the world. And to encourage participation, Amazon is running a DeepRacer League with races taking place both digitally online and physically at AWS Summit events around the world. They've certainly offered us a full plate at their re:Invent conference this week.

But maybe someone prefers not to use Amazon, or prefers to build their own hardware, or to run their own competitions. Fortunately, Amazon is not the only game in town, merely the latest entry in an existing field. The DeepRacer League's predecessor was the Robocar Rally, and the DeepRacer itself follows the Donkey Car. A do-it-yourself autonomous racing platform we first saw at Bay Area Maker Faire 2017, Donkey Car has since built up its documentation and software tools, including a simulator. The default Donkey Car code is fairly specific to the car, but builders are certainly free to use something more general like the open source Robot Operating System and Gazebo robot simulator. (Which is what AWS RoboMaker builds on.)

So if the goal is to start racing little autonomous cars, we have options to buy pre-built hardware or enjoy the flexibility of building our own. Either way, it’s just another example of why this is a great time to get into neural networks, with or without companies like Amazon devising ways to earn our money. Of course, this isn’t the only Amazon project trying to build a business around an idea explored by an existing open source project. We had just talked about their AWS Ground Station offering which covers similar ground (sky?) as our 2014 Hackaday Prize winner SatNOGS.

Is 2018 Finally The Year Of Windows On The Robot?

Microsoft is bringing ROS to Windows 10. ROS stands for Robot Operating System, a software framework and large collection of libraries for developing robots, which we recently wrote an introductory article about. It's long been primarily supported under Linux and Mac OS X, and even then, best under Ubuntu. My own efforts to get it working under the Raspbian distribution on the Raspberry Pi led me to instead download a Pi Ubuntu image. So having it running with the support of Microsoft on Windows will add some welcome variety.

TurtleBot 3 at ROSCon 2018. Photo: Evan Ackerman/IEEE Spectrum

To announce it to the world, they had a small booth at the recent ROSCon 2018 in Madrid. There they showed a Robotis TurtleBot 3 robot running the Melodic Morenia release of ROS under Windows 10 IoT Enterprise on an Intel Coffee Lake NUC and with a ROS node incorporating hardware-accelerated Windows Machine Learning.

Why are they doing this? It may be to help promote their own machine learning products to roboticists and manufacturers. In their recent blog entry they say:

We’re looking forward to bringing the intelligent edge to robotics by bringing advanced features like hardware-accelerated Windows Machine Learning, computer vision, Azure Cognitive Services, Azure IoT cloud services, and other Microsoft technologies to home, education, commercial, and industrial robots.

Initially, they'll support ROS1, the version most people will have used, but they also have plans for ROS2. Developers will use Microsoft's Visual Studio toolset. Thus far it's an experimental release, but you can give it a try by starting with the details here.

[Main Image Credit: Microsoft]