
Hackaday Links: August 18, 2024

They’re back! The San Francisco autonomous vehicle hijinks, that is, as Waymo’s fleet of driverless cars recently took up the fun new hobby of honking their horns in the wee hours of the morning. Meat-based neighbors of a Waymo parking lot in the South of Market neighborhood took offense at the fleet of autonomous vehicles sounding off at 4:00 AM as they shuffled themselves around the lot in a slow-motion ballet of undetermined purpose. The horn-honking is apparently by design, as the cars are programmed to sound off if they detect another vehicle backing into them. That’s understandable; we’ve honked under those conditions ourselves, with vigor, even. But when the parking lot is full of cars that (presumably) can’t hear the honking and (also presumably) know where the other driverless vehicles are, as well as their intent, what’s the point? Luckily, Waymo is on the case, having issued a fix to keep the peace. Unfortunately, the fix appears to be nothing more than geofencing the lot and inhibiting honking inside it, which strikes us as a band-aid.
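If you’re wondering what that sort of band-aid might look like, here’s a minimal, hypothetical sketch of geofenced honk suppression. The depot polygon coordinates, the function names, and the ray-casting test are all our own assumptions for illustration; Waymo hasn’t published how its fix actually works.

```python
# Hypothetical corners of a depot lot, as (lat, lon) pairs.
DEPOT_LOT = [(37.7745, -122.4105), (37.7745, -122.4095),
             (37.7738, -122.4095), (37.7738, -122.4105)]

def inside_polygon(lat: float, lon: float,
                   poly: list[tuple[float, float]]) -> bool:
    """Standard ray-casting point-in-polygon test over (lat, lon) pairs."""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        lat_i, lon_i = poly[i]
        lat_j, lon_j = poly[j]
        crosses = (lat_i > lat) != (lat_j > lat)
        if crosses and lon < (lon_j - lon_i) * (lat - lat_i) / (lat_j - lat_i) + lon_i:
            inside = not inside
        j = i
    return inside

def should_honk(lat: float, lon: float, reversing_toward_us: bool) -> bool:
    """Honk at a vehicle backing into us, unless we're inside the quiet zone."""
    if inside_polygon(lat, lon, DEPOT_LOT):
        return False  # the geofenced "fix": silence inside the lot only
    return reversing_toward_us

print(should_honk(37.7741, -122.4100, True))  # False: inside the depot
print(should_honk(37.7800, -122.4000, True))  # True: normal behavior elsewhere
```

Note that nothing about the honk-at-reversing-vehicles behavior itself changes; the lot is simply carved out as an exception, which is exactly why it reads as a patch rather than a fix.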

Continue reading “Hackaday Links: August 18, 2024”

San Francisco Sues To Keep Autonomous Cars Out Of The City

Although the arrival of self-driving cars, and self-driving taxis in particular, seems eternally ‘just around the corner’ for most of us, in an increasing number of places around the world they’re already operational, with Waymo quite prevalent in the US. Yet despite approval by the relevant authorities, the city of San Francisco has opted to sue the state commission that approved Google’s Waymo and GM’s Cruise. Their goal? To banish these services from the streets of SF, ideally forever.

Whether they will succeed in this seems highly doubtful. Although Cruise has lost its license to operate in California after a serious accident in which one of its robotaxis dragged a pedestrian, Waymo’s track record is actually quite good. Based on public information sources, there’s a case to be made that Waymo cars are significantly safer to be in or around than cars driven by humans. When contrasted with Cruise’s troubled performance, it would seem that the problem with self-driving cars isn’t so much the technology as it is the safety culture of the company around it.

Yet despite Waymo’s better-than-human safety record, it is regarded as a ‘nuisance’, leading some people to sabotage the cars. The more reasonable take would seem to be that although the technology is not yet mature, it has an overwhelming advantage over human drivers: it never drives distracted or intoxicated, and it can be deterministically improved and tweaked across the whole fleet based on collective experience.

These considerations were taken into account by the state commission that approved Waymo’s operations in SF, which is why legal experts note that the SF case’s chances are very slim based on the available evidence.

Ask Hackaday: Why Do Self Driving Cars Keep Causing Traffic Jams?

Despite what some people might tell you, self-driving cars aren’t really on the market yet. Instead, a small handful of startups and big tech companies are rapidly developing prototypes of this technology, and these vehicles are being furiously tested in various cities around the world.

In fact, depending on where you live, you might have noticed them out and about. Not least because many of them keep causing traffic jams, much to the frustration of their fellow road users. Let’s dive in and look at what’s going wrong.

Continue reading “Ask Hackaday: Why Do Self Driving Cars Keep Causing Traffic Jams?”

Self-Driven: Uber And Tesla

Self-driving cars have been in the news a lot in the past two weeks. Uber’s self-driving taxi hit and killed a pedestrian on March 18, and just a few days later a Tesla running in “Autopilot” mode slammed into a road barrier at full speed, killing the driver. In both cases, there was a human driver who was supposed to be watching over the shoulder of the machine, but in the Uber case the driver appears to have been distracted, and in the Tesla case the driver had their hands off the steering wheel for the six seconds before the crash. How safe are self-driving cars?

Trick question! Neither of these cars was “self-driving” in at least one sense: both had a person behind the wheel who was ultimately responsible for piloting the vehicle. And the Uber and Tesla driving systems aren’t even comparable. The Uber taxi does routing and planning, knows the speed limit, and should be able to see red traffic lights and stop at them (more on this below!). The Tesla “Autopilot” system is really just the combination of adaptive cruise control and a lane-holding subsystem, which isn’t even enough to get it classified as autonomous in the state of California. Indeed, it’s the failure of the people behind the wheel, and the failure to properly train those people, that make the pilot-plus-self-driving-car combination more dangerous than a human driver alone would be.
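To make that distinction concrete, here’s a deliberately tiny sketch of what a driver-assist stack of this kind amounts to: two independent feedback controllers, one longitudinal and one lateral. Everything below (the class names, the gains, the simple proportional control laws) is our own illustrative assumption, not Tesla’s code.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    lead_gap_m: float      # distance to the car ahead, from radar
    lead_speed_mps: float  # speed of the car ahead
    ego_speed_mps: float   # our own speed
    lane_offset_m: float   # lateral offset from lane center, from camera

def acc_command(f: SensorFrame, set_speed_mps: float = 30.0,
                time_gap_s: float = 2.0, k_speed: float = 0.5,
                k_gap: float = 0.1) -> float:
    """Acceleration command: hold the set speed, but keep a safe time gap."""
    desired_gap_m = time_gap_s * f.ego_speed_mps
    gap_error = f.lead_gap_m - desired_gap_m  # negative means too close
    speed_error = min(set_speed_mps, f.lead_speed_mps) - f.ego_speed_mps
    return k_speed * speed_error + k_gap * min(gap_error, 0.0)

def lane_keep_command(f: SensorFrame, k_lat: float = 0.2) -> float:
    """Steering command that nudges the car back toward lane center."""
    return -k_lat * f.lane_offset_m

# Note what's missing: no map, no route, no traffic lights, no barrier
# detection. Each controller reacts only to its own narrow signal.
frame = SensorFrame(lead_gap_m=40.0, lead_speed_mps=25.0,
                    ego_speed_mps=28.0, lane_offset_m=0.3)
print(acc_command(frame), lane_keep_command(frame))
```

The point of the sketch is the gap between the two systems: a full autonomy stack like Uber’s wraps perception, mapping, and planning around controllers like these, while a lane-holding cruise control is essentially nothing but these.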

A self-driving Uber Volvo XC90, San Francisco.

You could still imagine wanting to dig into the numbers behind self-driving cars’ safety records, even though the fleets are heterogeneous and have humans playing mechanical Turk. If you did, you’d be sorely disappointed. None of the manufacturers publish any data they don’t have to. Indeed, our glimpses into these companies’ autonomous vehicle data come mainly from two sources: internal documents leaked to the press and carefully selected statistics from the firms’ PR departments. The state of California, which requires the most rigorous documentation of autonomous vehicles anywhere, is a third source, but because Tesla’s car isn’t autonomous, and because Uber refused to admit to the California DMV that its car is autonomous, we get no extra insight into these two platforms.

Nonetheless, Tesla’s Autopilot now has three fatalities, and all have one thing in common: all three drivers trusted the lane-holding feature enough not to take control of the wheel in the last few seconds of their lives. With Uber, there’s very little autonomous vehicle performance history, but there are leaked documents and a pattern of behavior that make Uber look like a risk-taking scofflaw with sub-par technology and a vested interest in making it look better than it is. That these vehicles are being let loose on public roads, without extra oversight and with other traffic participants as safety guinea pigs, is giving both the self-driving car industry and the ideal itself a black eye.

While Tesla’s and Uber’s car technologies are very dissimilar, the companies have something in common: they are both “disruptive” companies with mavericks at the helm who see their fates hinging on widespread deployment of self-driving technology. But what differentiates Uber and Tesla from Google and GM most is, ironically, their use of essentially untrained test pilots in their vehicles: Tesla’s in the form of consumers, and Uber’s in the form of taxi drivers with very little specific autonomous-vehicle training. What caused the Tesla and Uber accidents may have a lot more to do with human factors than with self-driving technology per se.

You can see we’ve got a lot of ground to cover. Read on!

Continue reading “Self-Driven: Uber And Tesla”

Make Cars Safer By Making Them Softer

Would making autonomous vehicles softer make them safer?

Alphabet’s self-driving car offshoot, Waymo, seems to think so, as they were recently granted a patent for vehicles that soften on impact. Sensors would identify an impending collision and slacken ‘tension members’ on the vehicle’s exterior to cushion the blow. These members would be corrugated sections or movable panels that absorb the impact alongside the vehicle’s normal crumpling, with adjustments made based on the type of obstacle the vehicle is about to strike.
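As a thought experiment, the decision logic might boil down to a lookup from predicted obstacle type to panel tension. Everything below (the obstacle categories, the one-second threshold, the tension fractions) is invented for illustration; the patent describes hardware, not published code.

```python
from enum import Enum, auto

class Obstacle(Enum):
    PEDESTRIAN = auto()
    CYCLIST = auto()
    VEHICLE = auto()
    FIXED_OBJECT = auto()

# Fraction of normal panel tension to retain on impact. Values are
# assumptions: softer for vulnerable road users, stiffer where the
# body's normal crumple behavior matters more.
TENSION_PROFILE = {
    Obstacle.PEDESTRIAN: 0.2,
    Obstacle.CYCLIST: 0.3,
    Obstacle.VEHICLE: 0.6,
    Obstacle.FIXED_OBJECT: 0.8,
}

def exterior_tension(obstacle: Obstacle, time_to_collision_s: float) -> float:
    """Tension fraction to command on the exterior members before impact."""
    if time_to_collision_s > 1.0:
        return 1.0  # collision not imminent; keep the body fully rigid
    return TENSION_PROFILE[obstacle]

print(exterior_tension(Obstacle.PEDESTRIAN, 0.4))  # 0.2: slacken almost fully
```

The interesting design question is the same one the patent has to answer: the body must stay rigid in everyday driving and only give way when a collision is truly unavoidable, so everything hinges on how reliably the perception system can classify the obstacle and call the impact in time.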

Continue reading “Make Cars Safer By Making Them Softer”