On the night of March 18, 2018, an autonomous Uber car in Arizona, with an emergency backup driver behind the wheel, struck and killed a woman as she walked her bike across the street. It was one of the first pedestrian deaths ever caused by self-driving technology, and in response Uber paused its autonomous testing operations in Arizona, Pittsburgh, and Toronto.
In a normal car accident, fault is usually assigned to the driver, the victim, or some combination of the two. Depending on the specifics of the applicable negligence and comparative fault laws, responsibility for the accident and damages will be divided among those parties accordingly. However, in accidents involving autonomous cars, it is much more difficult to determine who gets the blame. Should it be the company that built the vehicle? The software developers? The car owner?
Since this is a relatively new issue facing insurance companies, attorneys, and consumers alike, many of these questions remain unanswered. Moreover, the laws governing these types of vehicle accidents are being formulated as we speak, and what applies today may be completely different a year from now.
As is the case with most laws here in the United States, there is never going to be one set of rules. Individual states have passed, and will continue to pass, their own laws governing self-driving cars. However, the general consensus seems to be that there will be a shift away from fault on the part of the driver and toward product liability claims.
The New Face of Product Liability
Until now, automotive product liability claims have been fairly straightforward. If a manufacturing or design flaw, such as a steering or engine malfunction, caused a vehicle to injure someone, then the car manufacturer would be held liable. However, the conversation today has become much more complex with the combination of human input and autonomous design. For example, in the May 2016 crash in which a Tesla on Autopilot drove into a tractor-trailer and killed its driver, a federal investigation revealed that the system had warned the driver of the situation and alerted him to put his hands on the wheel. Similarly, in the Uber example above, one could argue that it was more the fault of the emergency backup driver for not slamming on the brakes than of the car manufacturer.
As stated above, these types of cases are new ground for the legal community. Most of the claims associated with non-fatal accidents involving autonomous vehicles have been confidentially settled out of court and away from the public spotlight, meaning that there is no real legal precedent for these types of cases yet.
Certain states have taken steps to pass legislation aimed at clearing up this potential confusion over liability. The California DMV, for example, passed a regulation stating that car companies cannot avoid liability simply because a car has not been maintained to factory standards. This closed a legal loophole that would have absolved them of responsibility if, among other things, a car’s sensor was dirty or the tires were not fully inflated. On the other hand, Michigan has a law in place that limits a vehicle manufacturer’s liability if a third party makes certain modifications that in turn cause an accident or damage.
A New Era of Litigation
As can be seen from the information above, we are on the cusp of a new era with self-driving cars. The laws regarding liability are still in the early stages of formation, and no precedent has been set for how these cases will be handled going forward. Because of this, it will be very interesting to see whether the family of the most recent victim in Arizona decides to file a lawsuit and, if so, what the ramifications will be.
According to the National Highway Traffic Safety Administration (NHTSA), 94% of crashes are caused by driver error, so self-driving cars have a real opportunity to reduce the tens of thousands of deaths caused by car accidents each year. However, car accidents will never be eliminated completely, and it will be an important task for the legal community in the coming years to determine where liability falls and whether car manufacturers are even willing to shoulder that level of responsibility moving forward.
Update 5/24/2018
A federal investigation found that the vehicle’s guidance system had spotted the woman six seconds before hitting her, but the emergency braking system on the Uber was disabled. Instead, the system relied on the human backup driver to intervene, yet it was not designed to alert the driver.