Fatal Arizona Crash: Uber Car Saw Woman, Called It a False Positive
The Uber self-driving car that struck and killed a woman crossing a roadway in March appears to have seen the victim and her bicycle with its multiple sensors. But the car's software apparently determined that she was not in the car's path, or that she was not a danger to the car. In other words, the sensors generated a detection that the software dismissed as a false positive.
The car has to ignore some of the objects it detects because they are not actually hazardous. One example would be a newspaper kicked up from the road by the preceding car: it unfolds in the air, creating a larger target for the sensors, but slips past without causing any damage. That is a false positive: there is an object there, but it is not a danger to the car.
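In the sense used here, a "false positive" is a real sensor return that the software judges not to be an actual hazard. A minimal illustration of that vocabulary (purely hypothetical; this is not Uber's code, and the function name is invented):

```python
def classify_detection(object_detected: bool, is_hazard: bool) -> str:
    """Label a detection outcome in the article's sense: a 'false
    positive' is a detected object judged not to be a danger."""
    if object_detected and is_hazard:
        return "true positive"    # object detected, genuinely dangerous
    if object_detected and not is_hazard:
        return "false positive"   # e.g. the windblown newspaper
    if not object_detected and is_hazard:
        return "false negative"   # a dangerous miss
    return "true negative"        # nothing there, nothing dangerous

# The windblown newspaper: detected, but harmless.
print(classify_detection(True, False))   # false positive
```

The tragedy here is the inverse error: a genuine hazard was detected but filed in the harmless bucket.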
The false-positive conclusion was first reported by The Information, citing two people briefed on the incident.
If you view the video clip of the incident, it is hard to believe that neither the car's multiple sensors nor the driver picked up the victim in time. The driver, apparently not fully attentive to the road, might at least have slowed the car.
Investigators worked up several theories and discarded them:
- Failure of the hardware. That's almost impossible, because the sensors were working well enough for the car to drive autonomously right up to the accident. The lidar maker quickly issued a statement that the lidar could not have failed in a way that let it detect the roadway and other hazards but not the woman and her bicycle.
- Failure to see at night. That's all but impossible, too. Lidar provides its own illumination (it is built around a laser), and the forward-facing cameras work well with headlamps.
- Failure of the recognition system. This would mean the software did not recognize a pedestrian pushing a bicycle. There are plenty of algorithms that recognize pedestrians walking, bicycles on their own, people riding bicycles (the circular pumping motion of the legs), and pedestrians walking their bicycles, which happens often at crosswalks and, in this case, across a multi-lane road. So that was ruled out.
- Failure of the algorithms. These are the decision rules that handle common as well as rare situations. A car ignores a pedestrian walking along the side of the road, and a bicyclist too, unless the latter is swerving into the travel lane (typically about 12 feet wide) used by cars and trucks. It also needs to ignore debris: the newspaper, a plastic shopping bag blowing in the wind, but probably not a mattress falling off a car roof.
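The kind of rule-based filtering described in this last theory can be sketched in a few lines. To be clear, none of this reflects Uber's actual software; the object classes, fields, and thresholds below are invented for illustration only:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str                 # e.g. "pedestrian", "bicyclist", "debris"
    in_travel_lane: bool      # is the object inside the car's lane?
    heading_into_lane: bool   # is its trajectory crossing into the lane?
    mass_estimate_kg: float   # rough size/mass estimate from sensor returns

def is_hazard(d: Detection) -> bool:
    """Decide whether a detection should trigger braking or avoidance."""
    if d.kind == "debris":
        # Lightweight debris (newspaper, plastic bag) can be ignored;
        # heavy debris (a falling mattress) cannot.
        return d.mass_estimate_kg > 2.0 and (d.in_travel_lane or d.heading_into_lane)
    # A pedestrian or bicyclist on the shoulder is ignored, but one who
    # is in the lane, or moving into it, is a hazard.
    return d.in_travel_lane or d.heading_into_lane

# The crash scenario: a pedestrian walking a bicycle across the lane.
victim = Detection(kind="pedestrian", in_travel_lane=True,
                   heading_into_lane=True, mass_estimate_kg=80.0)
print(is_hazard(victim))  # True: correct rules would have flagged her

# The benign scenario: a windblown plastic bag in the lane.
bag = Detection(kind="debris", in_travel_lane=True,
                heading_into_lane=False, mass_estimate_kg=0.05)
print(is_hazard(bag))     # False: a false positive, safely ignored
```

The investigators' focus, as reported, is that rules of roughly this shape were somehow tuned so that a real pedestrian fell into the ignore bucket.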
According to The Information, its sources said this last possibility is what investigators are focusing on.
That conclusion is problematic for several reasons. It effectively says that the rules Uber's software engineers set for the Volvo test car were not good enough to cope with a not-uncommon situation: a person walking a bike across the road. Yes, it was at night, and it wasn't at a crosswalk, but the car needs to cope with those situations.
Meanwhile, Uber's ability to test self-driving cars in Arizona remains suspended by Gov. Doug Ducey. Ducey had been seen as an advocate of autonomous cars, or at least of testing them in his state, and issued an executive order to that effect in 2015. In March of this year, he updated the order to allow testing of self-driving cars without a human behind the wheel. Some have said the governor has been too cozy with the self-driving-testing business.
In a letter to Uber CEO Dara Khosrowshahi, Ducey wrote:
As governor, my top priority is public safety. Improving public safety has always been an emphasis of Arizona’s approach to autonomous vehicle testing, and my expectation is that public safety is also the top priority for all who operate this technology in the state of Arizona. The incident that took place on March 18 is an unquestionable failure to comply with this expectation.
Uber has taken its cars off the road in every city where it had been testing: Tempe (a Phoenix suburb), Pittsburgh, San Francisco, and Toronto. The challenge for Uber is that the path to cars licensed to drive themselves runs through millions of miles of on-road driving; there's only so much you can learn from testing on closed courses.
On the coziness point, as The Guardian, a UK newspaper and site, reported:
[Ducey] repeatedly encouraged Uber’s controversial experiment with autonomous cars in the state, enabling a secret testing program for self-driving vehicles with limited oversight from experts, according to hundreds of emails obtained by the Guardian…. Uber began quietly testing self-driving cars in Phoenix in August 2016 without informing the public.
As for on-road testing, it's also clear that test-driver inattention is a big issue. If the car is mostly in charge, how do you keep the driver constantly alert? That is one reason some automakers may skip Level 3 autonomy entirely and go straight from Level 2 to Level 4, where the car gives the driver plenty of warning before handing back control, such as when leaving the highway for local roads.