Self-Driving Car Accidents in California: Who Is Liable When There’s No Human Driver?

Self-driving and partially automated vehicles are now a familiar sight on California roads. Companies continue testing and deploying autonomous systems in cities like San Francisco, Los Angeles, and Sacramento, and many drivers use advanced driver-assist features in their personal vehicles.

Their growing presence on the roads has, unfortunately, also resulted in accidents. As of Jan. 16, 2026, the California Department of Motor Vehicles has received 918 autonomous vehicle collision reports.

But who’s liable when a self-driving or semi-autonomous vehicle is involved in an accident?

California regulators, federal safety agencies, and manufacturers each play a role in determining what happened, but none of them assign civil fault. A self-driving car accident lawyer can help you understand how liability is evaluated and what steps may be appropriate for your situation.

Below, we explain how the issue of liability generally works, followed by real examples that show how these cases unfold in practice.

Key Takeaways

  • Crashes involving autonomous or driver-assist systems are evaluated differently because the vehicle’s level of automation affects who may be responsible.
  • What appears to be a “system failure” or “driver mistake” often depends on digital evidence that isn’t visible at the scene.
  • Even when a vehicle is operating autonomously, liability may involve several parties, including fleet operators and component manufacturers.
  • Publicly reported crashes involving Tesla, Waymo, and Cruise show that automated-system behavior is now a routine part of collision investigations in California.
  • Understanding these distinctions early can help you take steps that preserve information needed to assess a potential claim.

Understanding Liability When a Vehicle Is Driving Itself

Determining liability in a crash tends to depend on how the vehicle was operating at the time. California recognizes different levels of automation, and each level comes with different implications.

Human Driver Responsibility Still Applies in Many Cases

Not every “self-driving” system is capable of fully autonomous operation. Tesla’s Autopilot and Full Self-Driving (FSD) systems are considered Level 2 driver-assist technology. That means the vehicle can steer, accelerate, and brake, but the human driver must remain attentive and ready to take control at all times. If a crash occurs because the driver failed to supervise the system, the human operator may be held responsible.

Manufacturer or Software Liability in True Autonomous Mode

In contrast, vehicles operated by companies like Waymo and Cruise are designed to drive themselves in certain areas without human intervention. These Level 4 systems rely on sensors, mapping, and software to make driving decisions.

If a crash occurs while the vehicle is operating autonomously, liability questions shift toward:

  • defective vehicle components
  • software or sensor errors
  • system decision-making problems
  • operational oversight by the fleet operator

Liability does not automatically fall on the manufacturer, but the analysis is different from a traditional driver-involved crash.

Why These Crashes Are More Complex

Autonomous and driver-assist collisions often involve:

  • multiple potentially responsible parties
  • digital evidence that must be preserved quickly
  • state and federal investigations
  • unfamiliar insurance structures

Real Examples of Self-Driving and Driver-Assist Crashes

Several recent, widely reported incidents show that autonomous technology, while advanced, can still be involved in collisions that cause injury, and they illustrate why determining liability can be complex.

Tesla Autopilot and Full Self-Driving Crashes

Analyses of National Highway Traffic Safety Administration (NHTSA) crash reports found that Tesla’s Autopilot systems had been involved in at least 736 reported crashes and 17 fatalities from 2019 through mid-2023. In 2025, a jury found Tesla partly liable for a deadly 2019 crash involving Autopilot and awarded the victims more than $240 million in damages.

PBS NewsHour also reported that regulators opened an investigation into nearly 2.9 million Teslas equipped with Full Self-Driving (FSD) technology after dozens of incidents in which vehicles reportedly ran red lights or committed other traffic violations while using FSD features.

Separate reporting in The Washington Post noted 58 incidents tied to FSD, including crashes, fires, and approximately 23 injuries.

Why this matters legally: Tesla’s systems (Autopilot, FSD) are, as we noted earlier, not truly “self-driving” but Level 2 driver-assist, meaning a human driver must remain attentive. That distinction often affects liability since regulators and courts consider whether a human was properly supervising at the time of a crash.

Waymo Autonomous Vehicle Collisions

NHTSA data shows Waymo has reported 1,500+ crashes involving its autonomous vehicles as of December 2025. One notable case occurred in January 2025, when an unoccupied Waymo robotaxi that had stopped in traffic in San Francisco was struck by a speeding human driver during a multi-car chain-reaction crash. At least one person died in the broader collision sequence. Waymo has also recalled over 1,200 vehicles due to a software issue linked to collisions with barriers.

Why this matters legally: Waymo uses Level 4 autonomous technology — meaning vehicles can drive without human steering in approved areas — so liability may focus more on product/system defects rather than driver oversight.

Cruise Collision and Obstruction Incidents

California DMV data shows Cruise vehicles were involved in 68 collisions in San Francisco between 2022 and 2023. Reported incidents have included vehicles stopping unexpectedly or obstructing emergency response vehicles.

Why this matters legally: Even when an autonomous vehicle is not the primary cause of a collision, inappropriate stopping behavior or obstruction can contribute to the outcome, making liability multi-layered.

What These Cases Show About Autonomous Vehicle Liability

Liability May Be Shared

Depending on how the crash occurred, responsibility may fall on:

  • a human driver (in partially automated systems)
  • the vehicle manufacturer
  • the software or sensor developer
  • the fleet operator
  • other road users

It is common for multiple parties to be named in complex cases.

Digital Evidence Is Central

Autonomous vehicles store extensive operational data, including:

  • event data recorder information
  • system logs
  • sensor readings
  • camera footage
  • mapping decisions
  • fleet operator records

This information can clarify whether the system behaved as expected, whether a human driver intervened, and what decisions the vehicle made before the crash.

Investigations May Involve Several Agencies

NHTSA, NTSB, and the California DMV may review these collisions. Their findings influence the understanding of what happened but do not determine civil liability. Injury claims are evaluated independently based on the facts of the crash.

How a Self-Driving Car Accident Lawyer Helps You

Identifying Whether the Vehicle Was in Autonomous Mode

Determining how the vehicle was operating is critical. This can involve:

  • requesting logs from the manufacturer
  • reviewing fleet operator reports
  • analyzing digital evidence

Identifying All Possible Liable Parties

An attorney will look at whether responsibility may lie with:

  • a human driver
  • the manufacturer
  • the software developer
  • the fleet operator
  • another party involved in the crash

Preserving Time-Sensitive Evidence

Digital records may be overwritten if not preserved early. Legal requests can secure:

  • event data recorder outputs
  • system logs
  • camera footage
  • mapping and routing files

Managing Insurance and Communication

Autonomous vehicle cases often involve multiple insurers and corporate entities. A self-driving car accident attorney helps organize communication and ensure that the details of the crash are presented accurately.

When to Contact a Self-Driving Car Accident Lawyer

You should reach out for legal guidance if:

  • you were injured in a crash involving an autonomous or semi-autonomous vehicle
  • you are unsure whether the vehicle was in self-driving mode
  • fault is being disputed
  • insurers are giving conflicting information
  • the other vehicle was a fleet-operated robotaxi

Understanding your options early can help protect your claim.

Free Consultation With Penney & Associates

Penney & Associates represents clients throughout California, including Sacramento, Roseville, Chico, Santa Clara, Irvine, and beyond. If you were hurt in a crash involving a self-driving or driver-assist vehicle, you can request a free consultation to understand your legal options. Our experienced trial attorneys are here to fight for your rights and ensure you get the compensation that you deserve.

FAQs

Can you file a lawsuit after being hit by a self-driving car?

Yes. Liability depends on how the vehicle was operating and which parties contributed to the crash.

Is the human driver still responsible if Autopilot or FSD was active?

In most cases, yes. Tesla’s systems require supervision, and drivers may still be held responsible.

What if the autonomous vehicle was empty?

Liability may focus on the manufacturer or fleet operator if a fully autonomous vehicle caused the crash.

Who investigates self-driving crashes?

Investigations may involve NHTSA, NTSB, or the California DMV. Their findings can provide useful information but do not assign civil fault.

What should you do after a crash with an autonomous vehicle?

Seek medical attention, document the scene, and contact a lawyer to help preserve any digital evidence from the vehicle.

Read More
How Penney & Associates Won a $14 Million Bus Accident Settlement
RV Accidents in California: What You Need to Know About Injury Risks
What to Do If You’re Hit by a Drunk Driver in California

* This blog is not meant to dispense legal advice and is not a comprehensive review of the facts, the law, this topic or cases related to the topic. For a full review of our disclaimer and policies, please click here.
