[Michelle Hampson] reports in IEEE Spectrum that Chinese researchers may improve self-driving cars by mimicking how the human eye works. Some autonomous cars use two cameras with polarizing filters to capture extra detail about what the car sees. However, these filters can degrade the car’s vision in low-light conditions.
Humans, however, have excellent vision in low-lighting conditions. The Retinex theory (based on the Land Effect discovered by [Edwin Land]) attributes this to the fact that our eyes sense both the reflectance and the illumination of light. The new approach processes polarized light from the car’s cameras in the same way.
The images pass through two algorithms: one compensates for brightness levels, while the other processes the reflective properties of the incoming light. The researchers mounted cameras on real cars and drove them in dim real-world environments to test the system.
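The paper’s exact algorithms aren’t spelled out here, but classic single-scale Retinex gives the flavor of that split: treat each pixel as reflectance times illumination, estimate the illumination with a heavy blur, then divide it out in log space. A minimal sketch in Python with OpenCV — the function name, sigma value, and filename are illustrative assumptions, not the researchers’ implementation:

```python
import cv2
import numpy as np

def single_scale_retinex(image, sigma=80):
    """Split an image into log-reflectance and illumination (Retinex model: S = R * L)."""
    img = image.astype(np.float64) + 1.0          # avoid log(0)
    # Assume illumination varies slowly across the scene:
    # estimate it with a wide Gaussian blur
    illumination = cv2.GaussianBlur(img, (0, 0), sigma)
    # Divide illumination out in log space: log R = log S - log L
    log_reflectance = np.log(img) - np.log(illumination)
    return log_reflectance, illumination

# Example: recover detail in a dim frame by normalizing the reflectance
frame = cv2.imread("dim_road.png", cv2.IMREAD_GRAYSCALE)
log_r, _ = single_scale_retinex(frame)
enhanced = cv2.normalize(log_r, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
```

Real pipelines (multi-scale Retinex, learned variants, polarization-aware versions like the one in the paper) are fancier, but the decomposition idea is the same.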
The result? In the researchers’ tests, the approach improved driving accuracy by approximately 10%. However, the algorithms require extensive training on difficult-to-obtain data sets, which is one challenge to adoption.
Self-driving cars certainly need improving. Oddly enough, navigation can be done with polarizing filter cameras and a clear view of the sky. Or, you can look under the road.
The real strength of human intelligence is our ability to shift modes and adapt in real time to the information sources available, i.e., we can shift focus from one sensory cue to another, or integrate a constantly varying pattern of sensory channels. This is called fluid intelligence, whereas a well-trained AI is using crystallized intelligence.
Humans are also great at operating on no data: if we’re driving in poor light and hit a cat, we think, gosh, it was so dark the cat was impossible to see.
If an autonomous vehicle did the same, we’d be angry.
Cool research. I wonder if it’ll be more useful outside of autonomous vehicles, because the utility of this is to operate the way a human would in a world designed for humans… important, but shortsighted if we do some blue-sky thinking.
There is a need to start making markers for AI interpretation; currently everything is optimized for humans, and we’re surprised that it’s proving challenging.
Why not have UV/IR ArUco-style markers for speed limits, give-way signs, road names? They would not be visible to humans but would be easy for cameras in the appropriate spectrum to detect.
Maybe it’s a totally useless idea, but I’d be surprised if that’s the case.
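The detection side, at least, is well-trodden ground. A minimal sketch with OpenCV’s ArUco module (4.7+), assuming a hypothetical IR-sensitive camera frame and a made-up marker-ID-to-sign mapping:

```python
import cv2

# Assumption: this frame comes from an IR/UV-sensitive camera, so the
# markers are invisible to drivers but appear here as ordinary ArUco tags.
frame = cv2.imread("ir_camera_frame.png", cv2.IMREAD_GRAYSCALE)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
corners, ids, rejected = detector.detectMarkers(frame)

# Hypothetical lookup table: marker ID -> road sign meaning
SIGN_MEANINGS = {7: "SPEED LIMIT 30", 12: "GIVE WAY"}
if ids is not None:
    for marker_id in ids.flatten():
        print(SIGN_MEANINGS.get(int(marker_id), f"unknown marker {marker_id}"))
```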
A lot of the markers would either need to be dynamic (e.g., when the speed limit changes) or you end up relying on the GPS and map being up to date anyway, thus meaning you don’t really need markers for the AI.
I suspect it would actually end up being more confusing for them, as it’s inevitable the patterns will get worn down, shot up, spray-painted, covered in bird crap, etc., and the more malicious humans will be swapping or altering the AI signs so cars get lost and make poor turns, probably everywhere in the GPS signal dead spots…
All good points.
I was picturing a coating on standard signs.
I.e., a physical sign that also shows an ArUco pattern in UV/IR.
So it’d be fairly large and could be pretty robust (a couple of feet wide for a simple ArUco).
Certainly it’d be open to malice, but you can also just put up malicious conventional signs. The bigger risk is that the malice might not be obvious if you only look with your eyes.
“Humans, however, have excellent vision in low-lighting conditions.”
Speak for yourself, [Al Williams]!
B^)
Thanks for assuming I’m human! ;-)
“…are you suggesting the Duke’s son is an animal? …no, I am suggesting he might be human…” (Reverend Mother Gaius Helen Mohiam, Dune)
Who! Or whom, say owls. I wish we could turn our heads like they do; it’d be great in traffic.
“SELF DRIVING CARS LEARN FROM OUR EYES”
I’m trying to figure out how a self-driving car AI can possibly improve if it’s busy reading texts, checking Facebook, and watching TikTok… that is, of course, when it’s not busy looking in the rear-view mirror to apply makeup.
The things I witness other drivers do during my morning commute sometimes scare the h#?! out of me.
“Self driving car crashes after seeing someone with too sexy an ass walking on the sidewalk”
Will it be able to restore indicator function in BMWs?
That’s a subscription item. /snark
And while it’s neat that we are using vision, why are we not using all the other fantastic technologies like LIDAR, IR sensing, and other systems to figure out where we are, how fast we are going, and what’s in our way?
Just saying…
Self driving cars, eh? The very acme of a solution in search of a problem.
I recall that, back in 2016, “self driving” was just one year out. Too bad they had not considered putting the camera where the driver is; so close…. ;)