Currently underway is the urban circuit of the DARPA Subterranean Challenge (SubT) systems competition, streaming live on YouTube through Wednesday, February 26th.
The DARPA Grand Challenge of 2004 kicked research and development of autonomous vehicles into high gear. Many components on today’s self-driving vehicles can be traced back to systems developed for that competition. Hoping to spur further development, DARPA has since held several more challenges focused on moving the state of the art in autonomous robotics ahead.
To succeed in this challenge, robots must handle terrain that would confuse today’s self-driving cars. Cluttered environments, uneven surfaces of different materials, even the occasional flooded section are all fair game. These robots also lose access to some of the tools previously available, such as GPS. The “systems track” denotes teams building physical robots, as opposed to a separate “virtual track” for robots that exist only in simulation. The “urban circuit” is the second of four phases in this competition; its environments focus on man-made underground structures. (Think subway station.) For more details on the competition and a description of its phases, see our introductory post or the competition site.
We all love reading about the creative problem-solving done by competitors in past DARPA robotic challenges. Some of us even harbor ambitions of joining the fray and competing first-hand instead of just reading about it after the fact. If this describes you, step on up to the DARPA Subterranean Challenge.
Following up on past challenges to build autonomous vehicles and humanoid robots, DARPA now wants to focus our collective brainpower on solving problems encountered by robots working underground. There will be two competition tracks: the Systems Track is what we’ve come to expect, where teams build both the hardware and software of robots tackling the competition course. But there will also be a Virtual Track, opening up the challenge to those without the resources to build big, expensive physical robots. Competitors on the virtual track will run the competition course in the Gazebo robot simulation environment. This is similar to the NASA Space Robotics Challenge, where algorithms competed to run a virtual robot through tasks in a simulated Mars base. The virtual environment makes the competition accessible to people without machine shops or big budgets. The winner of the NASA SRC was, in fact, a one-person team.
Back on the topic of the upcoming DARPA challenge: each track will involve three sub-domains. Each of these has civilian applications in exploration, infrastructure maintenance, and disaster relief, as well as the obvious military applications.
Man-made tunnel systems
Urban underground infrastructure
Natural cave networks
There will be a preliminary circuit competition for each, spaced roughly six months apart, to help teams get warmed up one environment at a time. But for the final event in Fall of 2021, the challenge course will integrate all three types.
[adam] is a caver, meaning that he likes to explore caves and map their inner structure. This is still commonly done using traditional tools such as notebooks (the paper kind), tape measures, compasses, and inclinometers. [adam] wanted to upgrade his equipment but found that industrial LiDAR 3D scanners are quite expensive. His Hackaday Prize entry, the Open LIDAR, is an affordable alternative to the expensive industrial 3D scanning solutions out there.
LiDAR — Light Detection And Ranging — senses the distance between a sensor and an object by measuring the time of flight of a reflected light beam: since light travels at a known speed, the round-trip time translates directly into distance. Acquiring a two-dimensional array of such distance readings turns this into a 3D scanning technique. Looking at how industrial LiDAR scanners capture their environment using fast-spinning mirrors, [adam] realized that he could achieve basically the same result with a cheap laser rangefinder strapped to a pan and tilt gimbal.
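To make the geometry concrete, here is a minimal sketch of how one range reading plus the two gimbal angles maps to a point in 3D space. The axis conventions and the constant placeholder range are our own assumptions, not anything taken from the project's firmware:

```cpp
#include <cmath>
#include <cstdio>

struct Point3 { double x, y, z; };

// One range reading plus the gimbal's two angles defines a point in
// spherical coordinates; convert it to Cartesian. Here pan rotates about
// the vertical (z) axis and tilt is elevation above the horizontal plane,
// with angles in radians and range in meters.
Point3 rangeToPoint(double range_m, double pan_rad, double tilt_rad) {
    double horiz = range_m * std::cos(tilt_rad);  // projection onto the floor plane
    return { horiz * std::cos(pan_rad),           // x
             horiz * std::sin(pan_rad),           // y
             range_m * std::sin(tilt_rad) };      // z (height)
}

int main() {
    const double PI = 3.14159265358979;
    // Sweep a coarse pan/tilt grid and print one "x y z" line per sample.
    // The constant 3 m range is a stand-in for real sensor readings.
    for (double pan = 0.0; pan < 2.0 * PI; pan += 0.1) {
        for (double tilt = -0.5; tilt <= 0.5; tilt += 0.1) {
            Point3 p = rangeToPoint(3.0, pan, tilt);
            std::printf("%.3f %.3f %.3f\n", p.x, p.y, p.z);
        }
    }
    return 0;
}
```

Every pan/tilt pose paired with a range reading yields one point; sweep both axes and the result is a point cloud.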
The gimbal he designed for this task uses stepper motors to aim an SF30-B laser rangefinder. An Arduino controls the movement and lets the eye of the sensor scan an object or an entire environment. Sampling the distance readings returned by the sensor produces a point cloud, which can then be converted into a 3D model. [adam] plans to drive the stepper motors in microstepping mode to increase the resolution of his scanner. We’re looking forward to seeing the first renderings of 3D cave maps captured with the Open LIDAR.
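For flavor, here is a hedged sketch of what such a scan loop might look like on the Arduino side, using the stock Stepper library. The pin assignments, step counts, and the readRangeMeters() helper are all hypothetical (the SF30's actual serial protocol is not reproduced), and true microstepping would require a step/dir driver rather than the plain Stepper library:

```cpp
#include <Arduino.h>
#include <Stepper.h>

// Hypothetical wiring: two steppers (200 full steps per revolution,
// i.e. 1.8 degrees per step) driven through pins 2-5 and 6-9.
const int STEPS_PER_REV = 200;
Stepper pan(STEPS_PER_REV, 2, 3, 4, 5);
Stepper tilt(STEPS_PER_REV, 6, 7, 8, 9);

// Placeholder: pull one distance reading (meters) off the rangefinder's
// serial stream. The SF30's real output format is not shown here.
float readRangeMeters() {
    return Serial1.parseFloat();
}

void setup() {
    Serial.begin(115200);   // point cloud out to the host PC
    Serial1.begin(115200);  // rangefinder in (needs a board with Serial1, e.g. a Mega)
    pan.setSpeed(30);
    tilt.setSpeed(30);
}

void loop() {
    // Raster scan: one full pan sweep per tilt step, emitting
    // "panStep tiltStep range" lines for the host to convert to XYZ.
    static int tiltStep = 0;
    for (int panStep = 0; panStep < STEPS_PER_REV; panStep++) {
        pan.step(1);
        Serial.print(panStep);
        Serial.print(' ');
        Serial.print(tiltStep);
        Serial.print(' ');
        Serial.println(readRangeMeters(), 3);
    }
    tilt.step(1);
    tiltStep++;
}
```

Streaming raw step indices and ranges to the host keeps the microcontroller simple; the heavier spherical-to-Cartesian conversion and model building can then happen on a PC.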