Problems persist around the public’s understanding of when a vehicle can be deemed to need no human supervision

One of the insurance market’s leading experts on autonomous vehicles has said the industry has to work to ensure that the public understand when they can truly switch off behind the wheel.

David Williams, managing director, underwriting & technical services at AXA, leads the insurer’s work on connected and autonomous vehicles and gave the warning as part of his work with the International Underwriting Association’s (IUA) Developing Technology Monitoring Group (DTMG).

The SAE levels of driving automation run from Level 0 (no automation) to Level 5 (full automation), and Williams said the issue lies in the public’s understanding of when a vehicle can be deemed to need no human supervision.

“If you speak to the UK government now and ask which of the SAE levels the Automated and Electric Vehicles Act will relate to, they will tell you Levels 4 and 5,” he explained. “We welcome that because we think Level 3 is unsafe. Level 3 gives all the appearance of being autonomous but requires the human being to act as the backstop, the safety mechanism in the whole operation.

“The problem is that giving the impression that the vehicle is autonomous will result in people reading books, having a nap, being completely disengaged from the process.

“We have carried out tests in various trials and we know that it takes quite a considerable period to get back up to speed to be as capable a driver as if you had been engaged. Therefore, we are quite anti-Level 3, and Jaguar Land Rover, Ford and Tata Motors, which we have worked with as part of the government consortia, have been similarly persuaded and have said they are not going to build Level 3 vehicles.”

’Fundamentally dangerous’

If a human is the safety mechanism, a vehicle is not autonomous, says AXA’s David Williams, pictured.

He said there have been reports of pressure from some motor manufacturers to include what could be described as high-end Level 3 vehicles in the definition of autonomous.

“This possibility worries me tremendously, because if it can perform a safe harbour safety manoeuvre by itself then it is a Level 4 vehicle. If it cannot, it is a Level 3. Therefore, my assumption is that what they are trying to do is push vehicles that will still require humans to be the safety mechanism into this definition of autonomous. This is something that we absolutely have to push back on because it is fundamentally dangerous.

“It has the potential to cause deaths on UK roads and, if we are not careful, it could derail the whole autonomous vehicle programme because people will not know the difference between levels.”

Williams said it has to be made clear that if a human is the safety mechanism, the vehicle is not autonomous.

“You are responsible, do not disengage. If it can perform a safe harbour safety stop manoeuvre, not just slow down in the fast lane of the motorway, it’s Level 4, it’s autonomous, you can disengage.”

The IUA’s DTMG was set up to monitor developing technologies, with a particular focus on Autonomous Vehicles, Autonomous Vessels, Unmanned Aerial Vehicles and the Internet of Things, and to act as a focal point for such risks at the IUA.