US probes Waymo robotaxis for blowing past stopped school buses

Federal safety officials are scrutinizing Waymo after its driverless taxis were recorded gliding past stopped school buses with red lights flashing and stop arms extended, maneuvers that would bring steep penalties for any human driver. The incidents, clustered around Austin, Texas, have triggered overlapping federal investigations and raised pointed questions about whether autonomous systems are ready to navigate the most sensitive moments on American roads. At stake is not only the safety of children stepping off buses but also public confidence in a technology that has long promised to be safer than people behind the wheel.

From local complaints to dual federal investigations

The current storm around Waymo began at the local level, when the Austin Independent School District documented a series of encounters between its buses and the company’s robotaxis. In a public letter, the district said that five of nineteen school bus-related incidents involved Waymo vehicles passing buses that were stopped with their stop arms out and lights flashing, a scenario in which state law typically requires all traffic to halt. Those reports prompted the National Highway Traffic Safety Administration (NHTSA) to open a defect investigation into Waymo’s behavior around school buses, focusing on whether the company’s automated driving system is failing to recognize or properly respond to one of the clearest signals on the road.

That initial probe has since expanded after regulators received additional information about the nineteen violations and sent Waymo a formal information request that referenced the Austin school district’s records. NHTSA is examining how the company’s software interprets bus signals, what internal safeguards exist to prevent illegal passes, and whether similar behavior is occurring in other cities where Waymo operates. The agency’s October decision to open the case, described in summaries of the investigation, set the stage for a broader federal response that now includes a second, independent review by the National Transportation Safety Board (NTSB).

NTSB steps in as incidents persist

As reports of illegal passes continued, the NTSB announced that it was opening its own investigation into Waymo’s conduct around school buses. The board, which typically focuses on systemic safety failures rather than individual traffic tickets, said it was responding to a pattern of robotaxis proceeding past stopped buses with extended stop arms and flashing red lights in Austin. According to local accounts, each of the documented scenarios involved buses that were clearly signaling that children could be crossing, a context in which no driver is allowed to pass under Texas law.

The NTSB has emphasized that it is not a regulatory agency like NHTSA and cannot issue fines or direct recalls. Rather, it investigates crashes and hazardous behavior, then issues safety recommendations that can shape future rules and industry practices. In this case, the board has requested detailed data from Waymo about the incidents, including video, sensor logs, and any internal analyses of why the vehicles continued moving when the buses were stopped. The NTSB has also indicated that it will examine whether Waymo’s claims that its technology is superior to human drivers hold up when the vehicles are confronted with school bus scenarios that most motorists are trained to treat with extreme caution.

Waymo’s defense, software fixes, and remaining gaps

Waymo has argued that, despite the violations, its vehicles have not caused crashes in the school bus encounters now under federal review. The company has said that in the incidents flagged by NHTSA and the NTSB, its Waymo Driver system safely navigated around the buses without colliding with children, pedestrians, or other vehicles. In public statements, Waymo has framed the problem as one of interpretation and nuance, suggesting that while its software did take the vehicles past stopped buses, it did so at low speeds and with careful monitoring of the surroundings.

In response to the mounting scrutiny, Waymo has said it rolled out a software update in November intended to improve how its robotaxis respond to stopped school buses. According to reporting on the company’s communications with regulators, that update was supposed to reduce or eliminate instances in which the vehicles would proceed when a bus had its stop arm extended and red lights flashing. Yet local authorities in Austin have reported at least four additional violations after the update, indicating that the fix did not fully resolve the issue. That gap between Waymo’s assurances and the continued citations has fueled skepticism among parents, school officials, and safety advocates who see school bus stops as a bright line that automated systems must not cross.

Why school buses are a red line for regulators

For federal investigators, the school bus incidents cut to the heart of the social contract around autonomous vehicles. Passing a stopped bus with its red lights on is one of the clearest and most widely understood traffic offenses in the United States, and it exists to protect children who may dart into the street with little warning. The NTSB has noted that its interest in Waymo’s behavior is tied to the broader question of whether automated driving systems can be trusted to handle the most vulnerable road users, especially in environments like school zones where the margin for error is effectively zero. Regulators are also aware that thousands of drivers illegally pass school buses each week across the country, which makes any suggestion that robotaxis might repeat that behavior particularly troubling.

The stakes are heightened by the way autonomous vehicle companies market their technology. Waymo has long promoted the Waymo Driver as safer than human drivers, pointing to millions of miles of operation and a relatively low crash rate. The NTSB’s decision to open a dedicated investigation into school bus behavior, alongside NHTSA’s defect probe, signals that federal officials want to test those safety claims against one of the most unforgiving benchmarks on the road. If the investigations conclude that the system systematically misreads or mishandles school bus cues, the findings could influence not only Waymo’s operations but also how other companies design and validate their own automated driving software.

What the probes could mean for autonomous driving’s future

The dual investigations place Waymo at a critical juncture in the broader rollout of robotaxis in cities like Austin and Atlanta. The NHTSA probe could lead to a formal recall or mandated software changes if regulators determine that the Waymo Driver contains a safety defect related to school bus recognition or response. At the same time, the NTSB’s eventual report, which could take a year or more, is likely to include detailed recommendations on how automated vehicles should be tested and certified for interactions with school buses and other high-risk scenarios. Those recommendations, while not binding, often carry significant weight with both regulators and industry engineers.

For the autonomous vehicle sector, the outcome will help define how much public trust these systems can realistically command. If Waymo is able to demonstrate that its updated software eliminates illegal passes and that its vehicles handle school bus stops more conservatively than human drivers, the company could turn a reputational crisis into evidence that the technology can learn and improve under pressure. If, however, the investigations uncover deeper flaws or a pattern of risky decision-making, the findings could slow deployments, invite stricter oversight, and harden public skepticism about sharing the road with driverless cars. Either way, the sight of a robotaxi sliding past a stopped school bus has already become a powerful symbol of the unresolved questions that still surround automated driving in the United States.
