Self Driving Cars Learn From Our Eyes

[Michelle Hampson] reports in IEEE Spectrum that Chinese researchers may have found a way to improve self-driving cars by mimicking how the human eye works. In some autonomous cars, two cameras use polarizing filters to help the car extract details about what it sees. However, these filters can degrade the car’s vision in low-light conditions.

Humans, however, have excellent vision in low-light conditions. The Retinex theory (based on the Land Effect discovered by [Edwin Land]) attributes this to the fact that our eyes sense both the reflectance of surfaces and the illumination falling on them. The new approach processes polarized light from the car’s cameras in the same way.
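To get a feel for the idea, here is a minimal sketch of the classic single-scale Retinex decomposition, which splits an image into an illumination estimate and a reflectance estimate. This is only an illustration of the underlying theory, not the researchers' actual polarization pipeline; the function name, parameters, and synthetic test image are all assumptions for the example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(image, sigma=30.0, eps=1e-6):
    """Split an image into illumination and reflectance estimates.

    Retinex assumes observed intensity = reflectance * illumination.
    A heavy Gaussian blur approximates the slowly varying illumination;
    subtracting it in log space leaves the reflectance detail, which
    survives even when the overall light level is low.
    """
    img = image.astype(np.float64) + eps
    illumination = gaussian_filter(img, sigma=sigma) + eps
    log_reflectance = np.log(img) - np.log(illumination)
    return illumination, log_reflectance

# Example: recover detail from a dim, unevenly lit synthetic scene.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.random((128, 128))             # fine detail (reflectance)
    dim_light = np.linspace(0.05, 0.2, 128)    # weak, uneven illumination
    observed = scene * dim_light[None, :]
    illum, refl = single_scale_retinex(observed)
    print(refl.std(), observed.std())          # reflectance keeps its contrast
```

The point of working in log space is that the multiplicative illumination term becomes additive and can be subtracted away, which is why the detail remains usable even when the overall light level drops.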


Spoofing LIDAR Could Blind Autonomous Vehicles To Obstacles

Humans manage to drive in an acceptable fashion using just two eyes and two ears to sense the world around them. Autonomous vehicles are kitted out with sensor packages that are altogether more complex. They typically rely on radar, lidar, ultrasonic sensors, and cameras, all working in concert, to detect the road conditions ahead.

While humans are pretty wily and difficult to fool, our robot driving friends are less robust. Some researchers are concerned that lidar sensors could be spoofed, hiding obstacles and tricking driverless cars into crashes, or worse.
