When you put a human driver behind the wheel, they will navigate primarily with their eyes, both to stay on the road and to follow navigation aids such as maps and digital navigation assistants. For self-driving cars, tackling the latter is relatively easy, as the system can use the same information in much the same way: when to change lanes, and when to take a left or right. The former task is a lot harder, with situational awareness a challenge even for human drivers.
To maintain this awareness, self-driving and driver-assistance systems use a combination of cameras, LIDAR, and other sensors. These can track stationary and moving objects as well as the lines and edges of the road, allowing the car to precisely follow its lane and, at least in theory, avoid obstacles and other vehicles. But if the weather gets bad enough, such as when the road is covered with snow, these systems can have trouble coping.
Looking for ways to improve the performance of autonomous driving systems in poor visibility, engineers are currently experimenting with ground-penetrating radar. While it’s likely to be a while before we start to see this hardware on production vehicles, the concept already shows promise. It turns out that if you can’t see what’s on the road ahead of you, looking underneath it might be the next best thing. Continue reading “Navigating Self-Driving Cars By Looking At What’s Underneath The Road”