Self-driving vehicles are a glimpse into the future of the automotive industry. Major car manufacturers and new start-ups alike have leapt into the race to be the first to produce efficient and safe autonomous vehicles. With more than 33,000 people killed in auto accidents in 2017 alone, it is hoped that this new technology can significantly reduce the number of serious accidents, and make the roads safer for everyone. Unfortunately, the concept faces many challenges due to technological and legal limitations.
A self-driving car being tested by Uber struck and killed a woman in Tempe, Arizona in March 2018, the first instance of a self-driving vehicle killing a pedestrian in the US. The company immediately pulled its autonomous vehicles off the road in the face of investigations and plummeting public trust, and the incident spurred a still-ongoing debate over who is ultimately liable for self-driving vehicle accidents.
One year after the accident, the Yavapai County Attorney’s Office declared that there was “no basis for criminal liability for the Uber corporation.” In the meantime, the ride-sharing giant settled with the victim’s family for an undisclosed amount, but still faces scrutiny from the National Transportation Safety Board (NTSB) and the National Highway Traffic Safety Administration (NHTSA).
As the first accident of its kind, it serves as a reminder that the technology is imperfect. However, regulations for self-driving vehicles are still in flux around the country, and the question of who could ultimately be held liable for future incidents remains unclear.
The state of Arizona potentially opened itself up to liability due to its government’s role in limiting self-driving vehicle regulations
The accident came at a time when states around the country were grappling with the idea of allowing self-driving vehicles more access to public roads. California has historically been less than enthusiastic about driverless vehicle testing. Arizona legislators saw an opportunity to lure businesses away from their neighbor, announcing in 2015 that the state was a regulation-free zone.
Just weeks before the pedestrian was killed in Tempe, Governor Doug Ducey updated a 2015 executive order, further pushing legislators to “eliminate unnecessary regulations and hurdles” in the testing of self-driving vehicles in Arizona.
The dangers posed by autonomous vehicles on public roads are not unknown. Near misses are regularly reported by testers of the technology, and there have been accidents. Most notably, the driver of a Tesla Model S was killed in 2016 after both he and the vehicle’s sensors failed to notice a tractor-trailer cutting across their lane.
Despite this, and other accidents reported by other manufacturers, Arizona continued to downplay the need for stricter regulation.
The city of Tempe, Governor Ducey, and Department of Transportation Director John Halikowski have all been named as defendants in a $5 million lawsuit. The attorneys representing the victim’s family claim that the state’s laws and its rush to attract tech companies seeking new testing grounds have made the state’s roads potentially more dangerous.
Other governments that take a lax approach to regulating this emerging technology may well run the risk of being targeted with similar lawsuits.
Uber may have drawn further scrutiny to the testing of self-driving vehicles due to actions that compromised safety measures.
Uber may have dodged liability for the Tempe accident, but the NTSB and NHTSA have yet to release their final reports, and certain facts could potentially put the company back on the hook.
According to the initial report from the NTSB, the vehicle detected the pedestrian 6 seconds before impact, marking her as an “unknown obstruction.” However, it only classified the obstacle as a person with 1.3 seconds to spare. The car’s software determined that it needed to apply emergency braking to avoid impact, but the system did not engage.
The problem is that Uber had deliberately disabled the emergency braking systems on all of its vehicles being tested in Arizona. The stated reason was to facilitate smoother road tests and reduce the potential for erratic vehicle behavior. If a similar incident occurs in the future, in a state less accommodating than Arizona, a vehicle operator or manufacturer could potentially face charges for disabling safety systems on a vehicle.
Manufacturers could also be liable for self-driving vehicle accidents that occur due to flaws in object detection.
A Georgia Tech study released in 2019 determined that many of the commonly employed sensor systems in self-driving vehicles struggle to detect pedestrians with darker skin tones. This means that some people could be in greater danger than others of becoming a victim of an accident like the one in Tempe.
Complicating matters, self-driving vehicles do not rely on a single, uniform sensor solution. Tesla, for example, is developing a system entirely reliant on cameras, while some sensor manufacturers are pushing systems that exclusively use light detection and ranging (LiDAR) sensors. Meanwhile, many vehicle manufacturers are using systems that incorporate multiple sensor types, including infrared, short-range radar, and motion tracking.
In cases where it is provable that the sensors in an autonomous vehicle were inadequate, the component manufacturer could face a lawsuit for those design flaws. On the other hand, if the vehicle itself has design flaws, such as poor sensor placement or an inadequate number of sensors, the vehicle manufacturer could be the one facing litigation.
Technological progress further complicates the legal picture. Current models of sensors and self-driving vehicles will be eclipsed by future iterations. The law, and any lawsuits for self-driving accidents, will have to keep pace with these advancements.
How much responsibility do autonomous vehicle operators have in an accident? Existing laws may not be fully prepared to answer that question.
While Uber was cleared of any criminal liability for the fatal accident in Tempe, the Yavapai County Attorney’s Office stated that the driver’s conduct should still be investigated.
The Uber driver initially claimed she was looking at a work-related display, but investigators learned that she was most likely watching a Hulu stream. The car was in autonomous mode at night, and the street where the accident occurred was dark enough that the victim seemingly appeared out of nowhere when they came within range of the vehicle’s headlights.
This presents a unique quandary for the operators of self-driving vehicles. How much responsibility does the “driver” in this case have, especially in vehicles that are fully autonomous? How much attention do they have to pay to the road if the car is supposed to do everything?
Operator liability is a concern not just for individual drivers (or observers, in this case), but for companies that rely on large vehicle fleets. Trucking companies, for example, are weighing the costs and benefits of transitioning to autonomous vehicles. However, adoption of the new technology may be slowed by the potential for expensive lawsuits should safety systems fail, or if improper maintenance compromises a self-driving system.
Future laws regulating autonomous vehicles will likely affect litigation around self-driving vehicle accidents
In 2018, under the Trump administration, the NHTSA proposed plans to revise existing safety rules for autonomous vehicles, which currently must adhere to safety standards originally written for standard passenger vehicles. The proposal is an effort to streamline the patchwork of self-driving vehicle regulations around the country. But even if these changes are implemented, the new guidelines could be reversed by future administrations.
For now, only laws at the state level govern autonomous vehicles, and there is little uniformity from one state to the next. California is considered one of the stricter states with regard to self-driving vehicles, while still allowing the testing of completely autonomous vehicles on public roads. Arizona, on the other hand, initially did away with regulations on self-driving vehicle testing, but banned Uber’s vehicles from the state after the Tempe incident.
In the future, the question of liability for self-driving vehicle accidents will depend on where those accidents occur. As Arizona has shown, lawmakers and regulators alike will likely try to straddle the line between holding operators liable and avoiding the perception of interfering with lucrative technological innovation.