One day we’ll be able to let go of the wheel and truly let a computer do the driving. It’s less clear whether we’ll be able to wash our hands of car accidents — and car insurance.
Some automakers, such as Audi and BMW, hope to offer fully autonomous vehicles by the early 2020s. As autonomous cars begin hitting the streets in greater numbers, drivers should brace for confusion over exactly who should shoulder the financial burden in case of a wreck.
Levels of autonomy and driver participation
Today, most drivers rely on liability insurance to pay for damage if they cause an accident. But when computers in our cars are calling the shots, you may wonder if you still need to worry about getting blamed for crashes.
Carmakers and tech companies may accept liability when the car’s system is at fault. But exactly who or what is responsible at the time of a crash might be tough to determine, largely because cars may have autonomous capabilities of varying sophistication — and varying degrees of driver involvement.
The National Highway Traffic Safety Administration has outlined five levels of autonomy for current and future vehicles.
| Level | Description |
| --- | --- |
| Level 0 | Most cars on the road today fall into this category. They might have automated alerts — such as blind-spot monitoring or lane-departure warnings — but it’s up to the driver to control the vehicle. |
| Level 1 | These vehicles have one or more automated features that can ease the impact of a crash. Brake assist and stability control are two examples. |
| Level 2 | These cars have interconnected automated features that take over the vehicle in certain situations. Drivers should stay focused but can give up control periodically. For instance, a car with adaptive cruise control and lane-centering technology could assume the tasks of steering and maintaining speed on the highway. |
| Level 3 | These cars can take over all aspects of driving, but they alert the driver to resume control if a dangerous situation arises. |
| Level 4 | These are fully autonomous cars that handle all driving functions for the whole trip. Drivers become passengers. Level 4 vehicles are not yet available to the public. |
Whether the car’s computer or the driver had control at the time of the crash could determine where the blame lies. Sorting out liability claims after an autonomous car crashes will cause confusion, says Kathryn Haun, an account administrator at the insurance brokerage company Lockton. Figuring out who was in control gets especially complicated with Level 2 and Level 3 vehicles, which can take over driving but require humans to regain control in emergencies.
Haun highlights a fatal 2016 crash involving a Tesla in autopilot mode — meaning it was automatically braking, steering and changing lanes. Initially, it seemed the crash resulted from the car mistaking a tractor trailer for bright sky. But an NHTSA investigation into the autopilot system cleared Tesla of fault and “did not identify any defects in design or performance,” according to the agency’s report.
NHTSA’s evaluation asserts that the autopilot technology is “not designed to reliably perform in all crash modes” and instead “requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes.”
“As long as humans have the ability to intervene, there’s liability risk” for the car owner, says Robert Passmore, assistant vice president of personal lines policy at the Property Casualty Insurers Association of America, a trade group. In other words, the need for car owners to have liability insurance isn’t going to disappear anytime soon.
Research shows it may be unreasonable to expect people to react quickly when alerted by their cars to take over. A 2015 study by NHTSA found that drivers needed an average of 17 seconds to respond to an alert and regain control of a vehicle driving itself at Level 3 automation.
Level 4 vehicles would eliminate the need for people to ever take control. But Haun estimates that such cars may not become standard until 2050.
People may get blamed when technology fails
If the past is any indication, lawmakers may have a tough time getting the companies behind autonomous vehicles to take sole blame for accidents. Even if carmakers and software companies have vowed to accept liability when their machines mess up, they may not admit that the computer was entirely at fault.
“There’s a longstanding precedent of blaming humans for crashes,” says David Ratcliff, a researcher at the American Association for Justice, an advocacy group that promotes fair trials for injury victims.
The association’s February report, “Driven to Safety: Robot Cars and the Future of Liability,” points to autopilot in aviation as a cautionary example. Blame still falls to pilots when the plane’s autopilot technology malfunctions, and airlines have been the defendants in trials, rather than autopilot manufacturers.
Similar patterns may be forming in the world of autonomous vehicles, with some companies already deflecting blame for crashes, according to the association. Among other examples, it cites a 2016 crash in which an automated Google vehicle cut off a bus as it was trying to merge. Google admitted it had “some responsibility,” but the company indicated that the bus driver was also to blame for not letting the car in.
Situations like this may create less confusion once the number of human drivers dwindles and more accidents involve only computers. But even with human error removed, drivers could share in the blame for other hard-to-predict reasons. For example, you might forget to download the latest software update for your car, and the technology could malfunction as a result, Passmore says.
Given the difficulty of pinning fault entirely on a machine, he believes personal liability insurance will play an important role as we adjust to the evolution of autonomy.
Lack of data could create liability limbo
Insurance companies will need access to a vehicle’s data to reconstruct accidents and handle claims, Passmore says.
The Association of California Insurance Companies, a trade group, also cited the key role of data in an April letter to the state’s Department of Motor Vehicles.
“In today’s world, an exchange of information or a statement is made to law enforcement. If the automobile is driverless, then the data in that automobile is the statement necessary to determine liability,” the letter states.
Getting this crucial accident data isn’t guaranteed. A lot depends on cooperation from vehicle manufacturers, which Ratcliff notes is far from a sure thing.
Accident data might reveal only so much anyway, Ratcliff says. “The algorithms driving these cars will look like an ocean of math; pinpointing an exact cause of the vehicle’s behavior may be impossible.”