Waymo robotaxi hits child near school as feds open probe

A child walking near a Santa Monica elementary school was hit by a driverless Waymo vehicle during the morning rush, an incident that has jolted parents, regulators, and the autonomous driving industry. Federal safety officials have now opened a formal probe into how the robotaxi behaved in a school zone, turning a single collision with minor injuries into a major test of whether self-driving cars can safely share the road with children.

If you live in a city where robotaxis are starting to appear, this crash is not an abstract tech story. It is a real-world stress test of the promises companies like Waymo have made about safety, accountability, and how their systems handle the messy reality of school drop-off traffic.

What happened outside the Santa Monica school

Earlier this year, a Waymo autonomous vehicle struck a child near an elementary school in Santa Monica, Calif., during normal school drop-off hours, in an area crowded with parents, students, and parked vehicles. According to multiple accounts, the child was walking toward the school when they stepped into the street from behind a double-parked SUV and were hit by the Waymo AV, which was operating in driverless mode as part of the company's robotaxi service in California. The collision left the child with minor injuries, but the fact that a self-driving car hit a student near a campus has raised sharp questions about how these systems perceive and react to children in complex school environments, especially when visibility is blocked by parked vehicles, as described in early reports.

Officials said the child ran across the street from behind the double-parked SUV toward the school, a classic school-zone hazard that can challenge even attentive human drivers, and that the Waymo AV did not avoid contact despite its array of sensors and software designed to detect pedestrians. Those same officials noted that the child’s injuries were minor, which spared the community a far worse outcome but did not lessen the alarm among parents watching a driverless car fail in a moment when children are most vulnerable. For families who walk their kids to school, the idea that a robotaxi could misjudge a child darting from behind an SUV cuts directly against the industry’s core safety pitch.

How regulators are responding to the crash

The collision quickly triggered a formal response from the National Highway Traffic Safety Administration (NHTSA), which opened what it calls a preliminary evaluation into Waymo’s automated driving system. The agency is focusing on how the company’s protocols handle school zones and other areas with vulnerable road users, and local coverage in Santa Monica has emphasized that the federal review is examining whether the system behaved as designed when the child appeared from behind another vehicle, as described in a detailed summary.

At the national level, the agency has framed the probe as part of a broader effort to understand how autonomous vehicles behave in chaotic school zones, where children, crossing guards, and double-parked cars create a constantly shifting puzzle. A federal notice described the review as an evaluation of system performance around schools and other locations with vulnerable road users, and a separate description of the case on a public docket noted that Waymo provided its required report under SGO 2021-01 and that NHTSA is aware the incident occurred within two blocks of a Santa Monica school, details that appear in the Waymo and NHTSA filing.

Waymo’s account of what its robotaxi did

Waymo has said its technology reacted as soon as the child became visible, arguing that the system “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle” and then braked. In the company’s telling, the robotaxi’s sensors and software recognized the child and responded faster than a typical human driver might have in the same situation, a claim that is central to Waymo’s defense of its automated driving system and that appears in its public statement.

At the same time, the company has acknowledged that the collision did occur and that the child was struck by a Waymo vehicle during normal school drop-off hours outside a school, a fact that federal filings and multiple accounts repeat. Waymo has said it is fully cooperating with investigators and has shared new details about the incident, including how its system classified the child and what braking maneuvers it executed, as part of a broader effort to reassure riders and regulators that it takes school-zone safety seriously, a message echoed in coverage of the company’s response in California.

Inside the federal probe and what is at stake

For you as a road user, the most important part of this story may be what the federal probe actually looks at and what it could change. The National Highway Traffic Safety Administration has launched what it calls a preliminary evaluation of Waymo’s system performance in school zones, a research step that lets engineers pull data from the company, reconstruct the crash, and compare the robotaxi’s behavior to both its own design specifications and to human driving norms, as described in a technical evaluation.

Regulators are not only interested in this one collision but also in whether it reveals a pattern or design gap in how Waymo handles children, crossing guards, and parked vehicles around schools. A federal summary of the case notes that the child was struck by a Waymo self-driving vehicle near a school, causing minor injuries, and that the United States has opened a probe into the company’s self-driving operations in that context, language that appears in a Waymo-focused filing. If the evaluation finds systemic issues, it could lead to a broader engineering analysis or even a software recall, which would ripple across every city where you might see a Waymo robotaxi in service.

Parents, schools, and the trust problem around robotaxis

For parents and school staff in Santa Monica, the collision has turned a theoretical debate about autonomous vehicles into a very local safety conversation. One local education report, written by Emma Gallegos, described how a Waymo hit a student during drop-off at a Santa Monica school, and the story spread quickly through school communities that Friday morning. For families who walk or bike to class, the idea that a driverless car can appear at the curb without a human behind the wheel is already unsettling, and a single crash can harden that skepticism into outright opposition.

Beyond Santa Monica, the incident feeds into a national conversation about whether you should trust robotaxis around your own neighborhood schools. A federal account of the case framed it as part of a pattern of safety concerns around self-driving vehicles, with federal officials investigating after a Waymo self-driving vehicle struck a child near an elementary school in California and highlighting that the child sustained minor injuries, as noted in a federal summary. Combined with the fact that the child was struck outside a school during normal drop-off hours, as repeated in multiple US-focused descriptions, it is clear why school districts and parent groups are now pressing for stricter rules on where and when autonomous vehicles can operate.

Charisse Medrano