A Million Zombie Taxis By 2020? It’s Not Going To Happen

The tech world has a love for Messianic figures, usually high-profile CEOs of darling companies whose words are hung upon and combed through for hidden meaning, as though they had arrived from above to our venture-capital-backed prophet on tablets of stone. In the past it was Steve Jobs or Bill Gates; now it seems to be Elon Musk who receives this treatment. Whether his companies are launching a used car into space, shooting things down tubes in the desert, or performing synchronised landings of used booster rockets, everybody’s talking about him. He’s a showman whose many pronouncements are always soon eclipsed by bigger ones to keep his public on the edge of their seats, and now we’ve been suckered in too, which puts us on the spot, doesn’t it?

Your Johnny Cab is almost here

The latest pearl of Muskology came in a late April presentation: that by 2020 there would be a million Tesla electric self-driving taxis on the road. It involves a little sleight of hand in assuming that a fleet of existing Teslas will be software-upgraded to be autonomous-capable and that some of them will somehow be abandoned by their current owners and end up as taxis, but it’s still a bold claim by any standard.

Here at Hackaday, we want to believe, but we’re not so sure. It’s time to have a little think about it all. It’s the start of May, so the start of 2020 is about eight months away, and the end of December 2020 is about twenty months away, so let’s give Tesla that longer timescale. Twenty months to put a million self-driving taxis on the road. Can the company do it? Let’s find out.
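As a quick sanity check on the scale involved, here’s a back-of-the-envelope sketch. The window length and the daily-rate framing are our own rough assumptions, not Tesla’s numbers:

```python
# Rough back-of-the-envelope check of the robotaxi claim.
# Assumptions (ours, for illustration): a window of roughly 20 months
# and a target of one million autonomous-capable taxis on the road.
WINDOW_DAYS = 20 * 30          # ~600 days
TARGET_VEHICLES = 1_000_000

per_day = TARGET_VEHICLES / WINDOW_DAYS
print(f"Roughly {per_day:,.0f} vehicles per day would have to be built "
      f"or converted and placed into taxi service.")   # ~1,667 per day
```

Even allowing that most of these would be software conversions of cars already on the road, the daily rate gives a sense of the scale of the claim.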

Continue reading “A Million Zombie Taxis By 2020? It’s Not Going To Happen”

Was The Self Driving Car Invented In The 1980s?

The news is full of self-driving cars and while there is some bad news, most of it is pretty positive. It seems a foregone conclusion that it is just a matter of time before calling for an Uber doesn’t involve another person. But according to a recent article, [Ernst Dickmanns] — a German aerospace engineer — built three autonomous vehicles starting in 1986, culminating in on-the-road demonstrations for Daimler in 1994.

It is hard to imagine what had to take place to get a self-driving car in 1986. The article asserts that you need computer analysis of video at a minimum of 10 frames a second. In the 1980s, processing a single frame in 10 minutes was considered an accomplishment. [Dickmanns’] vehicles borrowed tricks from how humans drive: they focused on a small area at any one moment and tried to ignore things that were not relevant.
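To make that trick concrete, here’s a minimal illustrative sketch (ours, not Dickmanns’ code) of how much cheaper it is to examine a small, predicted attention window than to process an entire frame:

```python
import numpy as np

def edge_energy(patch: np.ndarray) -> float:
    """Toy 'feature extraction': sum of horizontal gradient magnitudes."""
    return float(np.abs(np.diff(patch.astype(np.int32), axis=1)).sum())

# A fake 576x768 grayscale frame standing in for one video field.
frame = np.random.randint(0, 256, (576, 768), dtype=np.uint8)

# Naive approach: touch every pixel, every frame.
full_cost = frame.size

# Attention-window approach: only examine a small region where the lane
# marking is expected to be, predicted from the previous frame.
y, x, h, w = 300, 350, 64, 64               # hypothetical predicted window
roi = frame[y:y + h, x:x + w]
roi_cost = roi.size

print(f"Pixels processed, full frame: {full_cost}")
print(f"Pixels processed, attention window: {roi_cost} "
      f"({full_cost / roi_cost:.0f}x fewer)")
print(f"Window edge energy: {edge_energy(roi):.0f}")
```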

Continue reading “Was The Self Driving Car Invented In The 1980s?”

Self-Driven: Uber And Tesla

Self-driving cars have been in the news a lot in the past two weeks. Uber’s self-driving taxi hit and killed a pedestrian on March 18, and just a few days later a Tesla running in “autopilot” mode slammed into a road barrier at full speed, killing the driver. In both cases, there was a human driver who was supposed to be watching over the shoulder of the machine, but in the Uber case the driver appears to have been distracted and in the Tesla case, the driver had hands off the steering wheel for six seconds prior to the crash. How safe are self-driving cars?

Trick question! Neither of these cars was “self-driving” in at least one sense: both had a person behind the wheel who was ultimately responsible for piloting the vehicle. The Uber and Tesla driving systems aren’t even comparable. The Uber taxi does routing and planning, knows the speed limit, and should be able to see red traffic lights and stop at them (more on this below!). The Tesla “Autopilot” system is really just the combination of adaptive cruise control and lane-holding subsystems, which isn’t even enough to get it classified as autonomous in the state of California. Indeed, it’s the failure of the people behind the wheel, and the failure to properly train those people, that makes the pilot-plus-self-driving-car combination more dangerous than a human driver alone would be.
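For a sense of how modest that feature set is, here’s a heavily simplified sketch of the two control loops in question. These are toy proportional controllers of our own invention, not Tesla’s implementation:

```python
def adaptive_cruise(speed, lead_distance, lead_speed,
                    set_speed=30.0, time_gap=2.0, kp=0.4, kg=0.2):
    """Acceleration command (m/s^2): hold set_speed, but if the gap to the
    lead car drops below the desired time gap, track the lead instead."""
    desired_gap = time_gap * speed
    if lead_distance < desired_gap:
        # Too close: match the lead car's speed and reopen the gap.
        return kp * (lead_speed - speed) + kg * (lead_distance - desired_gap)
    return kp * (set_speed - speed)

def lane_keep(lateral_offset, heading_error, kp=0.8, kh=1.5):
    """Steering command (rad): pull the car back toward the lane center."""
    return -(kp * lateral_offset + kh * heading_error)

# One control step: slightly right of center, closing on a slower lead car.
accel = adaptive_cruise(speed=28.0, lead_distance=40.0, lead_speed=25.0)
steer = lane_keep(lateral_offset=0.3, heading_error=0.02)
print(f"accel command: {accel:+.2f} m/s^2, steer command: {steer:+.3f} rad")
```

Notice there is no routing, no traffic-light handling, and no planning here at all; that is the whole point of the comparison.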

A self-driving Uber Volvo XC90, San Francisco.

You could still imagine wanting to dig into the numbers for self-driving cars’ safety records, even though they’re heterogeneous and have people playing the mechanical turk. If you did, you’d be sorely disappointed. None of the manufacturers publish any of their data publicly when they don’t have to. Indeed, our glimpses into data on autonomous vehicles from these companies come from two sources: internal documents that get leaked to the press and carefully selected statistics from the firms’ PR departments. The state of California, which requires the most rigorous documentation of autonomous vehicles anywhere, is another source, but because Tesla’s car isn’t autonomous, and because Uber refused to admit that its car is autonomous to the California DMV, we have no extra insight into these two vehicle platforms.

Nonetheless, Tesla’s Autopilot has three fatalities now, and all have one thing in common: all three drivers trusted the lane-holding feature enough not to take control of the wheel in the last few seconds of their lives. With Uber, there’s very little autonomous vehicle performance history, but there are leaked documents and a pattern that make Uber look like a risk-taking scofflaw with sub-par technology and a vested interest in making it look better than it is. That these vehicles are being let loose on public roads, without extra oversight and with other traffic participants as safety guinea pigs, is giving both the self-driving car industry and the ideal itself a black eye.

While Tesla’s and Uber’s car technologies are very dissimilar, the companies have something in common: they are both “disruptive” companies with mavericks at the helm who see their fates hinging on widespread deployment of self-driving technology. But what differentiates Uber and Tesla from Google and GM most is, ironically, their use of essentially untrained test pilots in their vehicles: Tesla’s in the form of consumers, and Uber’s in the form of taxi drivers with very little specific autonomous-vehicle training. What caused the Tesla and Uber accidents may have a lot more to do with human factors than with self-driving technology per se.

You can see we’ve got a lot of ground to cover. Read on!

Continue reading “Self-Driven: Uber And Tesla”

Uber Has An Autonomous Fatality

You have doubtlessly heard the news. A robotic Uber car in Arizona struck and killed [Elaine Herzberg] as she crossed the street. Details are sketchy, but preliminary reports indicate that the accident was unavoidable as the woman crossed the street suddenly from the shadows at night.

If and when more technical details emerge, we’ll cover them. But you can bet this is going to spark a lot of conversation about autonomous vehicles. Given that Hackaday readers are at the top of the technical ladder, it is likely that your thoughts on the matter will influence your friends, coworkers, and even your politicians. So what do you think?

Continue reading “Uber Has An Autonomous Fatality”

Autopilots Don’t Kill Drivers, Humans Do

The US National Highway Traffic Safety Administration (NHTSA) report on the May 2016 fatal accident in Florida involving a Tesla Model S in Autopilot mode just came out (PDF). The verdict? “the Automatic Emergency Braking (AEB) system did not provide any warning or automated braking for the collision event, and the driver took no braking, steering, or other actions to avoid the collision.” The accident was a result of the driver’s misuse of the technology.

This places no blame on Tesla because the system was simply not designed to handle obstacles travelling at 90 degrees to the car. Because the truck that the Tesla plowed into was sideways to the car, “the target image (side of a tractor trailer) … would not be a “true” target in the EyeQ3 vision system dataset.” Other situations that are outside of the scope of the current state of technology include cut-ins, cut-outs, and crossing-path collisions. In short, the Tesla helps prevent rear-end collisions with the car in front of it, but has limited side vision. The driver should have known this.
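One crude way to picture the limitation (our illustration, not the actual EyeQ3 logic): a system tuned for rear-end collisions effectively keeps only targets that sit near the car’s own axis, are closing on it, and are oriented the same way it is. A trailer broadside across the road fails that last test.

```python
import math

def is_rear_end_target(bearing_deg, closing_speed, target_heading_deg,
                       max_bearing=10.0, min_closing=0.5):
    """Toy filter for a rear-end-collision system (illustrative only).

    Keeps detections near the ego vehicle's axis that are closing on us
    and oriented roughly the same way we are. A trailer crossing our
    path fails the heading test and is dropped."""
    in_lane = abs(bearing_deg) < max_bearing
    closing = closing_speed > min_closing
    same_direction = abs(math.sin(math.radians(target_heading_deg))) < 0.3
    return in_lane and closing and same_direction

# Car ahead, same direction, slowing down: a "true" target.
print(is_rear_end_target(bearing_deg=2, closing_speed=5, target_heading_deg=0))    # True
# Tractor trailer crossing the road at 90 degrees: ignored.
print(is_rear_end_target(bearing_deg=1, closing_speed=20, target_heading_deg=90))  # False
```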

The NHTSA report concludes that “Advanced Driver Assistance Systems … require the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes.” The report also mentions the recent (post-Florida) additions to Tesla’s Autopilot that help make sure that the driver is in the loop.

The takeaway is that humans are still responsible for their own safety, and that “Autopilot” is more like anti-lock brakes than it is like Skynet. Our favorite footnote, in carefully couched legalese: “NHTSA recognizes that other jurisdictions have raised concerns about Tesla’s use of the name “Autopilot”. This issue is outside the scope of this investigation.” (The banner image is from this German YouTube video where a Tesla rep in the back seat tells the reporter that he can take his hands off the wheel. There may be mixed signals here.)

There are other details that make the report worth reading if, like us, you would like to see some more data about how self-driving cars actually perform on the road. On one hand, Tesla’s Autosteer function seems to have reduced the rate at which their cars got into crashes. On the other, increasing use of the driving assistance functions comes with an increase in driver inattention for durations of three seconds or longer.

People simply think that the Autopilot should do more than it actually does. Per the report, this problem of “driver misuse in the context of semi-autonomous vehicles is an emerging issue.” Whether technology will improve fast enough to protect us from ourselves is an open question.

[via Popular Science].

Self-Driving Cars Are Not (Yet) Safe

Three things have happened in the last month that have made me think about the safety of self-driving cars a lot more. The US Department of Transportation (DOT) has issued its guidance on the safety of semi-autonomous and autonomous cars. At the same time, [Geohot]’s hacker self-driving car company bailed out of the business, citing regulatory hassles. And finally, Tesla’s Autopilot has claimed its second fatality, this time in China.

At a time when [Elon Musk], [President Obama], and Google are all touting self-driving cars to be the solution to human error behind the wheel, it’s more than a little bold to be arguing the opposite case in public, but the numbers just don’t add up. Self-driving cars are probably not as safe as a good sober driver yet, but there just isn’t the required amount of data available to say this with much confidence. However, one certainly cannot say that they’re demonstrably safer.
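To see just how thin the data is, a rough Poisson back-of-the-envelope helps. The baseline below is the commonly cited US figure of roughly one fatality per 100 million vehicle miles; the fleet mileage is an illustrative placeholder, not a published number:

```python
import math

def poisson_pmf(k, lam):
    """Probability of exactly k events given an expected count lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

HUMAN_FATALITY_RATE = 1 / 100e6     # ~1 fatality per 100 million miles (US)
fleet_miles = 200e6                 # illustrative autonomous fleet mileage

expected = HUMAN_FATALITY_RATE * fleet_miles   # fatalities expected of humans
# Probability of seeing 2 or fewer fatalities even at the human rate:
p_two_or_fewer = sum(poisson_pmf(k, expected) for k in range(3))
print(f"Expected fatalities at the human rate: {expected:.1f}")
print(f"P(<=2 fatalities | human-level safety): {p_two_or_fewer:.2f}")
# With so few expected events, a couple of deaths can't statistically
# separate "safer than a human" from "worse than a human".
```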

Continue reading “Self-Driving Cars Are Not (Yet) Safe”

Geohot’s Comma.ai Self-Driving Code On GitHub

First there was [Geohot]’s lofty goal to build a hacker’s version of the self-driving car. Then came comma.ai and a whole bunch of venture capital. After that, a letter from the Feds and a hasty retreat from the business end of things. The latest development? comma.ai’s openpilot project shows up on GitHub!

If you’ve got either an Acura ILX or a 2016 Honda Civic Touring edition, you can start to play around with this technology on your own. Is this a good idea? Are you willing to buy some time on a closed track?

A quick browse through the code gives some clues as to what’s going on here. The board files show just how easy it is to interface with these cars’ driving controls: there’s a bunch of CAN commands and that’s it. There’s some unintentional black comedy, like a (software) crash-handler routine named crash.py.
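If you’re curious what “a bunch of CAN commands” looks like in practice, here’s a minimal sketch using the python-can library. The arbitration ID and payload below are placeholders for illustration; the real message IDs and payload formats are defined in openpilot’s own per-car code:

```python
import can  # pip install python-can

# Placeholder values for illustration only -- openpilot's board files
# define the actual message IDs and payloads for each supported car.
STEERING_CMD_ID = 0x1FA
payload = bytes([0x00, 0x10, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00])

bus = can.interface.Bus(channel="can0", bustype="socketcan")
msg = can.Message(arbitration_id=STEERING_CMD_ID,
                  data=payload,
                  is_extended_id=False)
bus.send(msg)
print(f"Sent {len(payload)}-byte frame to 0x{STEERING_CMD_ID:X}")
```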

What’s shocking is that there’s nothing shocking going on. It’s all pretty much straightforward Python with sprinklings of C. Honestly, it looks like something you could get into and start hacking away at pretty quickly. Anyone want to send us an Acura ILX for testing purposes? No promises you’ll get it back in one piece.

If you missed it, read up on our coverage of the rapid rise and faster retreat of comma.ai. But we don’t think the game is over yet: comma.ai is still hiring. Are open source self-driving cars in our future? That would be fantastic!

Via Engadget. Thanks for the tip, [FaultyWarrior]!