Dashing Diademata Delivers Second Generation ROS

A simple robot that performs line-following or obstacle avoidance can fit all of its logic inside a single Arduino sketch. But as a robot’s autonomy increases, its software gets complicated very quickly. It won’t be long before diagnostic monitoring and logging come in handy, or before you want to encapsulate feature areas and orchestrate how they work together. This is where tools like the Robot Operating System (ROS) come in, so we don’t have to keep reinventing the same wheels. And Open Robotics just released ROS 2 Dashing Diademata for all of us to use.

ROS is an open source project that’s been underway since 2007, with regular releases each named after a turtle species. What makes this one worthy of extra attention? Dashing marks the first long term support (LTS) release of ROS 2, a refreshed second generation of ROS. All the high-level concepts stayed the same, meaning almost everything in our ROS orientation guide still applies in ROS 2. But there were big changes under the hood, reflecting technical advances over the past decade.
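
To show how familiar things remain, here is a minimal publisher node written against rclpy, ROS 2’s Python client library. The node and topic names are our own invention; the point is that the node/topic/message pattern carries straight over from ROS 1.

    # Minimal ROS 2 publisher sketch using rclpy. The node and topic
    # names are invented for illustration; the structure is standard.
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import String

    class Heartbeat(Node):
        def __init__(self):
            super().__init__('heartbeat')
            # Queue depth of 10, as in the ROS 2 tutorials
            self.pub = self.create_publisher(String, 'status', 10)
            self.timer = self.create_timer(1.0, self.tick)

        def tick(self):
            msg = String()
            msg.data = 'robot alive'
            self.pub.publish(msg)

    def main():
        rclpy.init()
        node = Heartbeat()
        rclpy.spin(node)   # hand control to the ROS 2 executor
        node.destroy_node()
        rclpy.shutdown()

    if __name__ == '__main__':
        main()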

ROS was built in an age when a Unix workstation cost thousands of dollars, XML was going to be how we communicated all data online, and an autonomous robot cost more than a high-end luxury car. Now we have the $35 Raspberry Pi running Linux, XML has fallen out of favor due to its processing overhead, and some autonomous robots are high-end luxury cars. For these and many other reasons, the people of Open Robotics decided it was time to make a clean break from legacy code.

The break has its detractors, as it meant leaving behind the vast library of freely available robot intelligence modules released by researchers over the years. Popular ones were (or will be) ported to ROS 2, and a translation bridge is sufficient to work with some of the others, but the rest will be left behind. However, this update also resolved many of the deal-breakers that kept ROS from being adopted outside of research, making it more attractive for commercial investment, which should help bring more robots into the mainstream.

Judging by responses to the release announcement, there are plenty of people eager to put ROS 2 to work, but it is not the only freshly baked robotics framework around. We just saw Nvidia release their Isaac Robot Engine tailored to make the most of their Jetson hardware.

DARPA Goes Underground For Next Challenge

We all love reading about the creative problem-solving work done by competitors in past DARPA robotic challenges. Some of us even have ambitions to join the fray and compete first-hand instead of just reading about the events after the fact. If this describes you, step on up to the DARPA Subterranean Challenge.

Following up on past challenges to build autonomous vehicles and humanoid robots, DARPA now wants to focus collective brainpower on solving problems encountered by robots working underground. There will be two competition tracks. The Systems Track is what we’ve come to expect: teams build both the hardware and software of robots that tackle a physical competition course. But there will also be a Virtual Track, which opens the challenge up to those without the machine shops or big budgets needed to build large, expensive physical robots. Competitors on the Virtual Track will run the competition course in the Gazebo robot simulation environment. This is similar to the NASA Space Robotics Challenge, where algorithms competed to run a virtual robot through tasks in a simulated Mars base; the winner of NASA SRC was, in fact, a one-person team.

Back on the topic of the upcoming DARPA challenge: each track will involve three sub-domains, each of which has civilian applications in exploration, infrastructure maintenance, and disaster relief as well as the obvious military ones.

  • Man-made tunnel systems
  • Urban underground
  • Natural cave networks

There will be a preliminary circuit competition for each, spaced roughly six months apart, to help teams get warmed up one environment at a time. But for the final event in Fall of 2021, the challenge course will integrate all three types.

More details will be released on Competitor’s Day, taking place September 27th, 2018. Registration for the event just opened on August 15th. Best of luck to all the teams! And just like we did for past challenges, we will excitedly follow progress. (And have a good-natured laugh at fails.)

Taking First Place At IMAV 2016 Drone Competition

The IMAV (International Micro Air Vehicle) conference and competition is a flying robotics event hosted by a different university each year. AKAMAV – a university student group at TU Braunschweig in Germany – have written up a fascinating and detailed account of what it was like to compete in (and take first place at) 2016’s eleven-mission event hosted by the Beijing Institute of Technology.

AKAMAV’s debrief of IMAV 2016 is well-written and insightful. It covers not only the five outdoor and six indoor missions, but also details what it was like to prepare for and compete in such an intensive event. In their words, “If you share even a remote interest in flying robots and don’t mind the occasional spectacular crash, this place was Disney Land on steroids.”


Robotic Farming, Aussie Style

Australian roboticists from the Queensland University of Technology have developed a prototype agricultural robot that uses machine vision to identify both weed and crop plants, then either uproots or poisons the weeds and applies fertiliser to the crop.

The machine is a wide platform designed to straddle a strip of the field upon which it is working, with electric wheel motors for propulsion. It is solar-powered, and it is envisaged that a farm could have several of them continuously at work.

At a superficial level there is nothing new in the robot, its propulsion, or even the plant husbandry and weeding equipment. The really clever technology lies in the identification and classification of the plants it will encounter. It is on the success or failure of this in real farm environments that the robot’s future will hinge. The university’s next step will be to take it on-farm, and the ABC report linked above has a wonderfully pithy quote from a farmer on the subject. You can see the machine in action in the video below the break.

Farming robots have a significant following among the hardware hacker community, but it is possible that the machine-vision and plant-identifying abilities of this one would be beyond most hackers. However, it is still an interesting project to watch, marking as it does a determined attempt to take the robot out of the lab and into real farm settings.
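
That said, the first stage of such a vision pipeline, separating plants from soil, is quite approachable. Below is a toy sketch using the classic excess-green (ExG) vegetation index with OpenCV. The threshold, filenames, and function name are placeholders of ours, and none of this reflects QUT’s actual algorithm; telling weed from crop once the plants are found is the genuinely hard part.

    # Toy vegetation segmentation via the excess-green (ExG) index.
    # This only separates plants from soil; distinguishing weed from
    # crop is the hard problem the QUT researchers are tackling.
    import cv2
    import numpy as np

    def vegetation_mask(bgr_image, threshold=0.1):
        # Normalize channels so overall brightness matters less
        b, g, r = cv2.split(bgr_image.astype(np.float32) / 255.0)
        total = b + g + r + 1e-6
        exg = 2 * (g / total) - (r / total) - (b / total)
        # Pixels markedly greener than red/blue count as "plant"
        return (exg > threshold).astype(np.uint8) * 255

    image = cv2.imread('field.jpg')   # hypothetical camera frame
    mask = vegetation_mask(image)
    cv2.imwrite('plants.png', mask)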


Soft Robot With Microfluidic Logic Circuit

Perhaps our future overlords won’t be made up of electrical circuits after all but will instead be soft-bodied like ourselves. However, their design will have its origins in electrical analogues, as with the Octobot.

The Octobot is the brainchild of a team of Harvard University researchers who recently published an article about it in Nature. Its body is modeled on the octopus and is composed entirely of soft parts, made using a combination of 3D printing, molding, and soft lithography. Two sets of arms on either side of the Octobot take turns moving under the control of a soft oscillator circuit. You can see it in action in the video below.
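
The oscillator is made of channels and valves rather than transistors, but its control behavior maps onto a very simple software analogue: a two-phase toggle that pressurizes one set of actuators while venting the other. Here is a toy sketch of that idea in Python; the timing and names are entirely our own invention.

    # Toy software analogue of the Octobot's alternating actuation.
    # The real robot uses a microfluidic oscillator, not code; this
    # only illustrates the two-phase, take-turns control pattern.
    import time

    def run_oscillator(period_s=1.0, cycles=4):
        active, idle = 'arm set A', 'arm set B'
        for _ in range(2 * cycles):
            # In the robot, "pressurize" means routing gas from the
            # decomposing fuel into one group of arm actuators.
            print(f'pressurize {active}, vent {idle}')
            active, idle = idle, active   # swap phases
            time.sleep(period_s)

    run_oscillator()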


Tissue-Engineered Soft Robot Swims Like A Stingray

We’re about to enter a new age in robotics. Forget the servos, the microcontrollers, the H-bridges and the steppers. Start thinking in terms of optogenetically engineered myocytes, microfabricated gold endoskeletons, and hydrodynamically optimized elastomeric skins, because all of these have now come together in a tissue-engineered swimming robotic stingray that pushes the boundary between machine and life.

In a paper in Science, [Kevin Kit Parker] and his team at the fantastically named Wyss Institute for Biologically Inspired Engineering describe the achievement. It turns out that the batoid fishes like skates and rays have a pretty good handle on how to propel themselves in water with minimal musculoskeletal and neurological requirements, and so they’re great model organisms for a tissue engineered robot.

The body is a laminate of silicone rubber and a collection of 200,000 rat heart muscle cells. The cardiomyocytes provide the contractile force, and the pattern in which they are applied to the 1/2″ (1.25cm) body allows for the familiar undulating motion of a stingray’s wings. A gold endoskeleton with enough stiffness to act as a spring is used to counter the contraction of the muscle fibers and reset the system for another wave. Very clever stuff, but perhaps the coolest bit is that the muscle cells are genetically engineered to be photosensitive, making the robofish controllable with pulses of light. Check out the video below to see the robot swimming through an obstacle course.
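
To make the spring-reset idea concrete, here is a toy one-dimensional simulation: a muscle force flexes the fin while a light pulse is on, and the skeleton’s spring restores it when the pulse ends. Every constant is invented for illustration and has nothing to do with the paper’s measurements.

    # Toy 1-D model of the antagonist-spring idea: light-gated muscle
    # force bends the fin; the gold skeleton's stiffness restores it.
    # All constants here are invented purely for illustration.
    K_SPRING = 5.0    # restoring stiffness (arbitrary units)
    DAMPING = 1.0     # keeps the toy model from ringing forever
    F_MUSCLE = 4.0    # force while a light pulse is on
    DT = 0.01         # integration time step, seconds

    def simulate(duration=2.0, pulse_period=0.5, duty=0.5):
        x, v = 0.0, 0.0                # fin deflection and velocity
        for i in range(int(duration / DT)):
            t = i * DT
            light_on = (t % pulse_period) < duty * pulse_period
            force = (F_MUSCLE if light_on else 0.0) - K_SPRING * x - DAMPING * v
            v += force * DT            # Euler integration, unit mass
            x += v * DT
            if i % 25 == 0:
                print(f't={t:.2f}s light={"on" if light_on else "off"} x={x:+.3f}')

    simulate()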

This is obviously far from a finished product, but the possibilities are limitless with this level of engineering, especially with a system that draws energy from its environment like this one does. Just think about what could be accomplished if a microcontroller could be included in that gold skeleton.


Autonomous Truck Teaches Itself To Powerslide

When you’re a teenager new to the sensations of driving, it seems counterintuitive to “turn into the skid”, but once you’ve got a few winters of driving under your belt, you’re drifting like a pro. We learn by experience, and as it turns out, so does this fully autonomous power-sliding rally truck.

Figuring out how to handle friction-optional roadways is entirely the point of the AutoRally project at Georgia Tech, which puts a seriously teched-up 1/5 scale rally truck through its paces on an outdoor dirt track. Equipped with a high-precision IMU, high-resolution GPS, dual front-facing cameras, and Hall-effect sensors on each wheel sampled at 70 Hz, the on-board quad-core i7 knows exactly where the vehicle is and how it relates to the track at all times. There’s no external sensing or computing – everything needed to run the track is in the 21 kg truck. The video below shows how the truck navigates the oval track on its own with one simple goal – keep its speed as close to 8 meters per second as possible. The truck handles the red Georgia clay like a boss, dealing not only with differing surface conditions but also with bright-to-dark lighting transitions. So far the truck appears to handle only an oval track, but our bet is that a more complex track is the next step for the platform.
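
AutoRally’s real control stack is far more sophisticated than a single feedback loop, but the speed-holding goal is easy to picture as a toy proportional controller fed by wheel-speed estimates from those Hall-effect sensors. Every constant below is an illustrative guess of ours, not a number from the project.

    # Toy speed-hold loop: estimate speed from Hall-effect pulse counts,
    # then nudge the throttle toward the 8 m/s target. All constants are
    # illustrative; AutoRally's actual controller is much more advanced.
    PULSES_PER_REV = 6         # hypothetical magnets per wheel revolution
    WHEEL_CIRCUMFERENCE = 0.8  # meters, a guess for a 1/5-scale truck
    TARGET_SPEED = 8.0         # m/s, the project's stated goal
    KP = 0.05                  # hypothetical proportional gain

    def wheel_speed(pulses_last_second):
        """Convert pulses counted over one second into meters per second."""
        revolutions = pulses_last_second / PULSES_PER_REV
        return revolutions * WHEEL_CIRCUMFERENCE

    def throttle_update(throttle, pulses_last_second):
        """One proportional step; throttle stays clamped to [0, 1]."""
        error = TARGET_SPEED - wheel_speed(pulses_last_second)
        return min(1.0, max(0.0, throttle + KP * error))

    # Example: from half throttle, 55 pulses/s reads as ~7.3 m/s,
    # so the controller opens the throttle slightly.
    print(throttle_update(0.5, 55))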

We really liked this ride-on scale autonomous chase vehicle, but other than that there haven’t been many non-corporate self-driving vehicle hacks around here lately. Let’s hope AutoRally is a sign that hackers haven’t ceded the field to Google entirely. Why let them have all the fun?
