Currently, if you want to use Autopilot or Full Self-Driving on a Tesla vehicle, you need to keep your hands on the wheel at all times. That’s because, ultimately, the human driver is still the responsible party. Tesla is adamant that functions which allow the car to steer itself within a lane, avoid obstacles, and intelligently adjust its speed to match traffic all constitute a driver assistance system. If somebody figures out how to fool the wheel sensor and take a nap while their shiny new electric car is hurtling down the freeway, Tesla wants no part of it.
So it makes sense that the company’s official line regarding the driver-facing camera in the Model 3 and Model Y is that it’s there to record what the driver was doing in the seconds leading up to an impact. As explained in the release notes of the June 2020 firmware update, Tesla owners can opt in to providing this data:
Help Tesla continue to develop safer vehicles by sharing camera data from your vehicle. This update will allow you to enable the built-in cabin camera above the rearview mirror. If enabled, Tesla will automatically capture images and a short video clip just prior to a collision or safety event to help engineers develop safety features and enhancements in the future.
But [green], who’s spent the last several years poking and prodding at Tesla’s firmware and self-driving capabilities, recently found some compelling hints that there’s more to the story. As part of the vehicle’s image recognition system, which is usually tasked with picking up other vehicles or pedestrians, they found several interesting classes that don’t seem necessary given the official explanation of what the cabin camera is doing.
If all Tesla wanted was a few seconds of video uploaded to their offices each time one of their vehicles got into an accident, they wouldn’t need to be running image recognition configured to detect distracted drivers against that video in real time. While you could argue that this data would be useful to them, there would still be no reason to run it in the vehicle when it could be analyzed as part of the crash investigation. It seems far more likely that Tesla is laying the groundwork for a system that could give the vehicle another way of determining whether the driver is paying attention.
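To make the distinction concrete, here’s a minimal sketch of what in-vehicle, real-time driver monitoring could look like. Everything in it is an assumption for illustration: the class names, confidence threshold, and frame-count logic are invented, not taken from Tesla’s firmware.

```python
# Hypothetical sketch of real-time driver monitoring. The class names and
# thresholds below are illustrative assumptions, NOT Tesla's actual firmware
# classes: they show why running recognition in the vehicle (per frame, live)
# differs from uploading a clip for later analysis.

DISTRACTION_CLASSES = ("eyes_closed", "looking_down", "phone_in_hand")
THRESHOLD = 0.6         # assumed confidence cutoff for a detection
CONSECUTIVE_FRAMES = 3  # require a sustained detection to avoid flicker

def is_distracted(frame_scores):
    """frame_scores: list of dicts mapping class name -> confidence,
    one dict per cabin-camera frame, newest last."""
    recent = frame_scores[-CONSECUTIVE_FRAMES:]
    if len(recent) < CONSECUTIVE_FRAMES:
        return False
    # Flag the driver only if some distraction class stays above the
    # threshold in every one of the last few frames.
    return all(
        any(scores.get(c, 0.0) >= THRESHOLD for c in DISTRACTION_CLASSES)
        for scores in recent
    )

frames = [
    {"eyes_closed": 0.1},
    {"phone_in_hand": 0.8},
    {"phone_in_hand": 0.9},
    {"phone_in_hand": 0.7},
]
print(is_distracted(frames))  # sustained phone detection -> True
```

The point of the sketch is the loop itself: a decision like this has to be made frame by frame while the car is moving, which is exactly the capability a post-crash upload would never require.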
While Tesla certainly has the public’s eye and the Internet’s attention, they aren’t the only automaker experimenting with self-driving technology. General Motors offers a feature called Super Cruise on their high-end Cadillac luxury cars and SUVs that provides a number of very similar functions. While Tesla’s vehicles undoubtedly know a few tricks that no Cadillac is capable of, Super Cruise does have one clear advantage over the competition: hands-free driving.
To pull it off, Super Cruise uses a driver-facing camera that’s there specifically to determine where the driver is looking. If the aptly named “Driver Attention Camera” notices the operator doesn’t have their eyes on the road, it will flash a green and then red light embedded in the top of the steering wheel in the hopes of getting their attention.
If that doesn’t work, the car will then play a voice prompt telling the driver Super Cruise is going to disengage. Finally, if none of that gets their attention, the car will come to a stop and contact an OnStar representative; at that point it’s assumed the driver is asleep, inebriated, or suffering some kind of medical episode.
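The escalation sequence described above is easy to picture as a tiny state machine. The sketch below is purely illustrative; the step names and the one-step-per-missed-check pacing are assumptions, not GM’s actual Super Cruise logic.

```python
# Illustrative state machine for the escalation sequence described in the
# article. Step names and pacing are assumptions for the sketch, not GM's
# real Super Cruise implementation.

ESCALATION_STEPS = [
    "flash_green_light",                 # first nudge in the steering wheel
    "flash_red_light",                   # stronger visual warning
    "voice_prompt_disengage",            # audible warning of disengagement
    "stop_vehicle_and_contact_onstar",   # last resort: assume an emergency
]

def next_action(inattentive_checks):
    """Return the warning step for the Nth consecutive failed attention
    check, capping at the final emergency step."""
    if inattentive_checks <= 0:
        return None  # driver is paying attention; no action needed
    index = min(inattentive_checks - 1, len(ESCALATION_STEPS) - 1)
    return ESCALATION_STEPS[index]

print(next_action(1))  # first missed check -> flash_green_light
```

The design choice worth noting is the cap: once the sequence reaches the final step, further failed checks keep the car in its safe-stop behavior rather than cycling back to gentle warnings.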
With Super Cruise, GM has shown that a driver-facing camera is socially acceptable among customers interested in self-driving technology. More importantly, it demonstrates considerable real-world benefits. Physical steering wheel sensors offer a valuable data point, but by looking at the driver and studying their behavior, the system becomes far more reliable.
Given the number of high profile cases in which users have fooled Tesla’s wheel sensors, it’s clear the company needs to step up their efforts. When police are pulling over speeding vehicles only to discover their “drivers” are sound asleep, something has obviously gone very wrong. Even if these situations are statistical anomalies in the grand scheme of things, there’s no denying the system is exploitable. For self-driving vehicles to become mainstream, automakers will need to demonstrate that they are nigh infallible; embarrassing missteps like this only serve to hold the entire industry back.
Tesla’s failure to keep their drivers engaged certainly isn’t going unnoticed. In the European New Car Assessment Programme’s recently released ratings for several vehicles equipped with driver assistance systems, the Tesla Model 3 was given just a 36% on Assistance Competence. The report explained that while the Model 3 offered an impressive array of functions, it did a poor job of working collaboratively with the human driver. It goes on to say that the situation is made worse by Tesla’s Autopilot marketing, as it fosters a belief that the vehicle can drive itself without user intervention. The fact that the Model 3 has an internal camera but isn’t currently using it to monitor the driver was also specifically mentioned as a shortcoming of the system.
While Tesla was an early pioneer in the field, traditional automakers are now stepping up their efforts to develop their own driver assistance systems. With this new competition comes increased regulatory oversight, greater media attention, and of course, ever higher customer expectations. While it seems Tesla has been reluctant to turn a camera on their own users thus far, the time will soon come when pressure from the rest of the industry means they no longer have a choice.