Sending teams of tiny drones to explore areas and structures is a staple of both sci-fi and robotics research, but the weight and size of sensors and the processing power they demand have long been limiting factors. In the video below, a research team from [ETH Zurich] pushes past these limits, demonstrating indoor mapping with a swarm of tiny drones that doesn't depend on any external systems.
The drone is the modular Crazyflie platform, which uses stackable PCBs (decks) to expand its capabilities. The team added a Flow deck for altitude control and motion tracking, and a Loco positioning deck whose UWB module measures the relative distances between drones. On top of these, the team added two custom decks. The first mounts four VL53L5CX 8×8 zone TOF sensors for omnidirectional LIDAR scanning. The final deck handles all the required processing with a GAP9 System-on-Chip, which features ten RISC-V cores running on just 200 mW of power.
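The custom decks obviously aren't something you can drive from stock tooling, but the standard Flow and Loco decks feed the Crazyflie's onboard state estimator, which you can read out over the radio with Bitcraze's cflib Python library. Here's a minimal sketch of that: the radio URI is a placeholder, and none of this reflects the team's firmware or their custom decks.

```python
# Sketch: stream the fused position estimate from a Crazyflie fitted with a
# Flow deck (optical flow + height) and a Loco/UWB deck, using cflib.
# The URI below is an assumption (the default Crazyradio address).
import cflib.crtp
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M/E7E7E7E7E7'  # placeholder radio address

def stream_state_estimate(uri=URI, samples=50):
    """Print the onboard position estimate a SLAM front end would consume."""
    cflib.crtp.init_drivers()
    log_conf = LogConfig(name='stateEstimate', period_in_ms=100)
    log_conf.add_variable('stateEstimate.x', 'float')
    log_conf.add_variable('stateEstimate.y', 'float')
    log_conf.add_variable('stateEstimate.z', 'float')

    with SyncCrazyflie(uri) as scf:
        with SyncLogger(scf, log_conf) as logger:
            for i, (timestamp, data, _) in enumerate(logger):
                print(timestamp, data['stateEstimate.x'],
                      data['stateEstimate.y'], data['stateEstimate.z'])
                if i >= samples:
                    break

if __name__ == '__main__':
    stream_state_estimate()
```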
Of course, the special sauce of this project lies in the software. The team developed a lightweight, collaborative Simultaneous Localization and Mapping (SLAM) algorithm that can be distributed across all the drones in the swarm. It combines each drone's LIDAR scan data with its estimated position at the time of the scan, then overlays scans of the same locations taken by different drones, compensating for errors in the odometry data. The team also implemented inter-drone collision avoidance, packet collision avoidance, and path optimization for the drones. The code is supposed to be available on GitHub, but the link was broken at the time of writing.
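To get a feel for the core idea of fusing each drone's estimated pose with its TOF ranges into one shared map, here is a toy 2D occupancy-grid sketch. This is not the team's algorithm (their SLAM also corrects the odometry itself and runs distributed across the swarm); the grid size, resolution, and scan layout are all assumptions for illustration.

```python
import numpy as np

RESOLUTION = 0.05   # metres per cell (assumption)
GRID_SIZE = 200     # 10 m x 10 m map (assumption)

def integrate_scan(grid, pose, ranges, bearings, max_range=4.0):
    """Mark occupied cells from one TOF scan taken at an estimated pose.

    grid     -- 2D array of hit counts, shared by all drones
    pose     -- (x, y, yaw) estimate from the drone's odometry/UWB
    ranges   -- distances reported by the TOF zones (flattened)
    bearings -- bearing of each zone relative to the drone's heading
    """
    x, y, yaw = pose
    for r, b in zip(ranges, bearings):
        if not np.isfinite(r) or r >= max_range:
            continue  # no return within range: skip this zone
        # Project the hit into the world frame using the pose estimate.
        wx = x + r * np.cos(yaw + b)
        wy = y + r * np.sin(yaw + b)
        i = int(wx / RESOLUTION) + GRID_SIZE // 2
        j = int(wy / RESOLUTION) + GRID_SIZE // 2
        if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
            grid[i, j] += 1  # accumulate evidence of an obstacle

# Scans from different drones simply accumulate into the same grid;
# in the real system the pose estimates are first corrected against each other.
grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=int)
bearings = np.deg2rad(np.arange(0, 360, 360 / 64))  # toy omnidirectional layout
integrate_scan(grid, pose=(1.0, 0.5, 0.0), ranges=np.full(64, 2.0), bearings=bearings)
print(grid.sum(), "cells marked")
```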
The Crazyflie platform has been around for more than a decade now, and we’ve seen it used in several research projects, especially related to autonomous navigation.
Are TOF cameras really considered to be LIDAR devices?
Yes
More like the other way around
RADAR but with a different wavelength of electromagnetic radiation, so yes.
What is the problem they are trying to solve here?
Mapping derelict spaceships.
Or terrorist organisations’ tunnels.
Could see it in use for Fukushima-style relief, where it's too hazardous for humans but too involved for floor-crawling robots, which I hear was their problem
Hard radiation has effects on a bot's electronics, much like it would on humans. So the electronics have to be radiation hardened and heavily shielded. That's extra weight for a drone to carry, and likely why I haven't seen any mention of flying drones being used there. May still be feasible tho…
Props may also kick up more radioactive dust than a tracked device would
30 years ago (!) I did a high school project on automated robotic mapping (aka maze solving). I didn't have much luck, of course, and the literature I reviewed wasn't very impressive either. I also knew a grad student trying to build a robotic indoor helicopter, and I don't think he had much luck either. And I knew a professor who was into swarms, but IIRC his swarms didn't do anything other than 'demonstrate surprising emergent behavior' (Stiquito).
Crazy that putting all of this together with frickin' lidar isn't even particularly impressive today
I guess there’s a certain appeal to “egalitarian” swarm behavior, but I’m not sure it makes that much practical sense. I think I’d rather collect data in fewer-smarter devices (which might be static). I don’t really see the point of ending up with N drones that all have the map, since presumably you’re going to download the map (elsewhere) and use it somehow.
The main use I can imagine for this is not just getting the map, but using drones to collect “field” data. I’d love to sample the wifi strength field of my house. Mapping heat or CO2 or radiation or noise or nearness to metal would be interesting.