Federal safety regulators have opened a preliminary investigation after a Waymo autonomous vehicle struck a child near an elementary school in Santa Monica, California. The crash occurred on Jan. 23 during morning drop-off hours, an already busy and high-risk time around school campuses.
The incident has drawn renewed attention to how self-driving cars operate in school zones and how these systems are evaluated when children are involved.
Below is what is known so far and what the investigation may mean for families in California communities where autonomous vehicles are operating.
Key Takeaways
- Federal regulators are investigating a Waymo vehicle that struck a child near a Santa Monica elementary school during morning drop-off hours.
- The review centers on how self-driving cars operate in school zones, where reduced visibility and unpredictable pedestrian movement are common.
- Waymo vehicles operate at Level 4 autonomy, meaning no human driver is required within approved service areas.
- The company has faced prior federal scrutiny, including investigations involving traffic law compliance and school bus stopping procedures.
- Findings from NHTSA investigations can lead to software updates, recalls, operational changes, and may affect related injury claims.
What Happened in the Santa Monica School Zone?
According to information released by the National Highway Traffic Safety Administration (NHTSA), the self-driving car collision occurred within two blocks of an elementary school during normal drop-off hours.
Reports indicate:
- Multiple children were present
- A crossing guard was on duty
- Several vehicles were double-parked
- The child entered the roadway from behind a double-parked SUV
- The Waymo vehicle made contact with the child
- The child sustained minor injuries
- No safety operator was inside the vehicle
Waymo’s Statement on the Incident
Waymo has stated that its system detected the child as they emerged from behind the parked vehicle and applied hard braking. The company reported that the vehicle slowed from approximately 17 miles per hour to under 6 miles per hour before contact occurred. After the incident, the vehicle remained at the scene and emergency services were contacted.
Why Federal Regulators Opened an Investigation
On Jan. 29, NHTSA confirmed it had opened a preliminary investigation into Waymo’s automated driving system.
The agency’s Office of Defects Investigation is examining:
- How the vehicle’s system behaves in school zones
- Whether the vehicle followed posted speed limits
- How it responded to crossing guards and parked vehicles
- Whether the system exercised appropriate caution given the presence of children
- How the vehicle and company responded after the crash
A preliminary investigation does not mean wrongdoing has been determined. It allows regulators to collect data, review system design, and assess whether additional safety measures are necessary.
How Level 4 Autonomous Vehicles Work
Waymo operates vehicles classified as Level 4 under the SAE driving automation scale referenced by federal regulators.
At this level:
- The vehicle performs all driving tasks within defined service areas
- No human driver is required to intervene
- A safety operator is not required to be inside the vehicle
- The vehicles are typically deployed as ride-hailing services in select cities
Self-Driving vs. ADAS
Level 4 vehicles are different from common advanced driver-assistance systems (ADAS) such as automatic emergency braking or lane-keeping assistance found in consumer vehicles. Those systems support a human driver. Level 4 vehicles assume full control of driving within their operational limits.
This distinction matters when examining how self-driving cars function in school zones, where traffic patterns and pedestrian movement can change rapidly.
School Zones and Heightened Caution Under California Law
School zones carry added safety considerations. Under California law, drivers must reduce speed and exercise increased caution when children are present. Courts often expect drivers to anticipate sudden or unpredictable movement in areas where children gather.
Driving in School Zones
Common conditions in school zones include:
- Reduced speed limits during specified hours
- Crossing guards directing traffic
- Double-parked vehicles limiting visibility
- Children crossing mid-block
- Parents stopping and starting unexpectedly
Was the System Designed for Obstructed Visibility?
Human drivers are expected to account for these conditions. When self-driving cars operate in school zones, regulators evaluate whether automated systems are designed to respond to these same risks.
In the Santa Monica self-driving car accident, investigators are examining whether the vehicle’s programming accounted for obstructed views caused by double-parked cars and whether additional precautionary measures were warranted during peak drop-off times.
Obstructed visibility is common in school zones, particularly during morning and afternoon traffic.
Autonomous vehicles rely on a combination of:
- Cameras
- Radar
- Lidar sensors
- Onboard software models
These systems are designed to detect movement and classify objects. The question regulators are evaluating is how quickly the system identified the child and whether its response met safety expectations in that environment.
Prior Federal Scrutiny of Waymo
The Santa Monica investigation is not the first time Waymo’s automated driving system has faced federal review.
Earlier Collision Investigation
In July 2025, NHTSA closed a 14-month investigation into a series of minor collisions involving Waymo vehicles. Regulators reviewed 22 reports of robotaxis striking stationary objects and engaging in driving behavior that may have violated traffic safety laws.
That investigation concluded after two recalls and related software updates.
School Bus Compliance Probe
In October 2025, NHTSA also opened a separate preliminary investigation into approximately 2,000 Waymo vehicles following reports that one robotaxi may have failed to properly stop for a school bus displaying flashing red lights and an extended stop arm.
Video footage from an incident in Georgia reportedly showed the vehicle stopping initially, then maneuvering around the bus while students were exiting.
The vehicle involved was operating without a human safety driver and was equipped with Waymo’s fifth-generation automated driving system.
Waymo has stated that it has implemented improvements related to school bus stops and plans additional software updates.
What This Means for Communities Where Self-Driving Cars Operate
Autonomous vehicles are currently operating in select California cities, including Los Angeles and San Francisco. Their presence in residential areas and near schools raises practical questions for families and local officials.
Areas of concern often include:
- How self-driving cars adjust behavior in school zones
- Whether operating areas should exclude certain high-risk locations
- How quickly companies report incidents
- What data is preserved after a crash
- Whether additional local oversight is necessary
Federal investigations can lead to recommendations, recalls, software updates, or operational restrictions if regulators determine changes are required.
How Investigations Can Affect Injury Claims
When a crash involves an autonomous vehicle, several layers of evidence may be involved.
Investigating an Autonomous Vehicle Accident
Potential sources of information include:
- Sensor recordings
- Event data recorder logs
- Software performance data
- Communication between the vehicle and company servers
- Law enforcement reports
- Witness statements
A federal investigation may result in public findings or safety recommendations that become relevant in civil injury claims. In cases involving children, medical documentation and long-term monitoring are often important, even when initial injuries appear minor.
Seeking Answers After Personal Injury
Families may have questions about:
- Who may be responsible for damages
- Whether the vehicle’s system functioned properly
- How comparative fault laws apply
- What deadlines apply for filing a claim
Each case depends on specific facts and evidence.
Legal Considerations After a Pedestrian Injury
When a child is struck by any vehicle—whether operated by a human driver or an automated system—families may face medical expenses, follow-up care, and other concerns.
In California, injury claims may involve:
- Medical costs
- Future treatment needs
- Pain and suffering
- Parental claims for related damages
Autonomous vehicle cases can involve additional technical evidence. Consulting an attorney familiar with emerging vehicle technology can help ensure that relevant data is preserved and reviewed.
Contact Penney & Associates for a Free Consultation
The investigation into the Santa Monica school zone incident is ongoing. Federal regulators will determine whether Waymo’s automated system met safety expectations and whether further action is necessary.
If your child has been injured in a vehicle collision, whether involving a human driver or an autonomous vehicle, understanding your legal options can help you make informed decisions about next steps. Contact our experienced, trial-tested personal injury attorneys at Penney & Associates for a free consultation or call us directly at (800) 616-4529. With offices across California, including Santa Monica, we’re always within easy reach.
Frequently Asked Questions
Are self-driving cars allowed to operate in school zones in California?
Yes. Autonomous vehicles operating under approved programs may travel on public roads, including near schools, as long as they comply with state and federal regulations. Companies must meet specific testing and operational requirements set by California authorities.
What is Level 4 autonomy?
Level 4 autonomous vehicles can perform all driving tasks within defined service areas without requiring human intervention. These vehicles are typically deployed in limited geographic zones and are not available for consumer purchase.
What happens when NHTSA opens a preliminary investigation?
A preliminary investigation allows federal regulators to gather information, review system performance data, and determine whether additional action is necessary. The agency may close the investigation, request software updates, recommend recalls, or expand the review depending on its findings.
Can a family pursue a claim if a child is injured by an autonomous vehicle?
In California, injury claims may be available if negligence or product defects contributed to the incident. These cases can involve technical evidence, including system data and vehicle logs. Each situation depends on the specific facts involved.
Do self-driving cars have to stop for school buses?
Yes. Autonomous vehicles are required to follow traffic safety laws, including those related to stopped school buses displaying flashing red lights and extended stop arms. Regulators may review system performance if compliance concerns are reported.
Read More
Self-Driving Car Accidents in California: Who Is Liable When There’s No Human Driver?
How Penney & Associates Won a $14 Million Bus Accident Settlement
RV Accidents in California: What You Need to Know About Injury Risks