The software that decides what objects the car can ignore was incorrectly tuned, according to reports 

Uber's internal investigation has reportedly concluded that the software within the autonomous vehicle, responsible for determining how the car reacts to objects it detects, was most likely at fault for the fatal test crash that killed a pedestrian in Arizona.

In March, an Uber vehicle testing its autonomous driving mode struck a pedestrian, who later died from her injuries. Since then, Uber has been investigating the cause of the crash and why the car did not stop.

According to The Information, two sources who were briefed about the matter said the software was incorrectly tuned.

According to reports, the car's lidar and radar sensors were all working properly and did their jobs.

But the software that determines which objects around the car can be ignored was tuned in a way that caused it to disregard the pedestrian.

The software was meant to be tuned so that leaves, floating rubbish and other small objects would not set off the automatic braking feature.
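To illustrate the idea in the abstract, a perception filter of this kind can be thought of as a set of tunable thresholds applied to each detected object. The sketch below is purely hypothetical: the class names, fields and threshold values are illustrative assumptions and bear no relation to Uber's actual system. It simply shows how thresholds set to suppress false positives (leaves, litter) can also cause a low-confidence detection of a real hazard to be filtered out.

```python
# Hypothetical sketch of threshold-based object filtering for automatic braking.
# All names and numbers here are illustrative assumptions, not any real system.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # classifier's best guess, e.g. "pedestrian", "plastic_bag"
    confidence: float   # classifier confidence in the range 0.0-1.0
    size_m: float       # estimated object size in metres

# Tunable thresholds: raising them suppresses false positives,
# but also raises the risk of ignoring a genuine hazard.
MIN_CONFIDENCE = 0.7
MIN_SIZE_M = 0.5

def should_brake(obj: DetectedObject) -> bool:
    """Return True if the object is significant enough to trigger braking."""
    return obj.confidence >= MIN_CONFIDENCE and obj.size_m >= MIN_SIZE_M

# A low-confidence detection is filtered out even if it is person-sized,
# while a high-confidence detection of the same object triggers braking.
print(should_brake(DetectedObject("unknown", confidence=0.4, size_m=1.7)))     # False
print(should_brake(DetectedObject("pedestrian", confidence=0.9, size_m=1.7)))  # True
```

The trade-off Brook describes below is exactly the choice of those threshold values: set them too low and harmless debris triggers constant braking; set them too high and a real threat can slip through.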

Uber provided a statement:

“We’re actively cooperating with the National Transportation Safety Board (NTSB) in their investigation. Out of respect for that process and the trust we’ve built with NTSB, we can’t comment on the specifics of the incident.

“In the meantime, we have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB chair Christopher Hart to advise us on our overall safety culture. Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”

False positives

Nigel Brook, a partner at Clyde & Co, commented on the reports, saying that a vehicle's systems need to strike the right balance when determining which objects can be ignored.

He said: “An autonomous vehicle has to interpret its changing surroundings so as to navigate roads while avoiding collisions. But can the AV’s systems rapidly distinguish a ‘false positive’ such as plastic bags blown by the wind from a real threat? If the systems are too ‘neurotic’, the ride will be jerky and uncomfortable; too relaxed, and the car could fail to react to real danger. This is one of the fundamental challenges faced by all AV developers, not just Uber.


“Training the algorithms currently requires millions of miles of driving – and there are commercial pressures on the various companies competing to be first to market.

“This incident could also prove to be a test for regulators, and for local authorities hosting trials. For example, should they insist that two people are always present in the vehicle rather than one? Would this make it any safer in practice?”

Commenting on the challenge of interpreting data, Brook added: “There are lots of things humans find very easy and machines find very hard – such as picking one type of object out of a bucket containing multiple objects.

“Through years of experience, our brains develop an impressive ability to process images and work out what is happening without any conscious thought. Machines struggle in this regard – and this is where the real challenge lies.”