Waymo’s long-running pitch has been that its robotaxis are safer than human drivers, yet its vehicles have now been caught repeatedly driving past stopped school buses with their stop arms extended. At the same time, one of the company’s founders is publicly attacking Tesla’s driver-assistance technology as fundamentally unsafe, sharpening a rivalry that is increasingly about ethics as much as engineering. The clash raises a blunt question for regulators and parents alike: whose safety narrative can be trusted when both companies are under scrutiny for how their vehicles behave around the most vulnerable road users?
Waymo’s school bus failures collide with its safety brand
For a company that has built its identity around meticulous safety, the image of a driverless Waymo vehicle gliding past a school bus with red lights flashing is particularly damaging. Austin ISD officials have documented at least 20 incidents in which Waymo vehicles failed to stop for buses that were loading or unloading children, a pattern that prompted the company to issue a software recall and update. In one December video captured by an Austin ISD bus camera, a Waymo vehicle can be seen driving through the stop despite the extended stop arm, violating a basic traffic rule that human drivers are taught in their first lessons.
Waymo has acknowledged that its software did not consistently recognize or respond correctly to stopped school buses, and it has told regulators it is updating its system to address the problem. Yet reports from Austin ISD and school transportation officials indicate that Waymo’s driverless vehicles continued to illegally pass school buses even after the initial fix, suggesting that the underlying perception and decision-making challenges are not yet fully resolved. Safety advocates note that this is not a subtle edge case but a core scenario that any system claiming to be “better than a human driver” should handle flawlessly, especially in neighborhoods where children cross in front of large vehicles.
The “better than human” promise meets a harsh real world test
Waymo and other robotaxi developers often argue that their vehicles should be judged against the performance of average human drivers, not an impossible standard of perfection. In theory, that benchmark should favor automated systems, which do not get distracted, drunk, or fatigued. However, the school bus incidents in Austin show how brittle that argument becomes when a system repeatedly fails at a rule that most human drivers internalize as non-negotiable. Analysts who have examined the footage describe Waymo robotaxis “blowing past” stopped buses in situations where a cautious human would have slowed well in advance, undercutting the company’s claim that its technology already exceeds human competence.
Critics also point out that the “better than human” framing can morph into a kind of moral shield, where companies imply that any residual risk is acceptable because the overall statistics look favorable. Commentators on automated driving have warned that some robotaxi operators behave as if they are “above the law because [they are] better than human,” treating traffic rules as flexible guidelines rather than hard constraints when their algorithms predict a low probability of harm. The repeated bus violations in Austin suggest that, at least in this context, the system’s understanding of both human behavior and legal requirements was incomplete, raising questions about how thoroughly Waymo validated its software before deploying it in neighborhoods filled with schoolchildren.
Waymo’s founder escalates his critique of Tesla’s safety
Against this backdrop, one of Waymo’s original architects has intensified his criticism of Tesla’s approach to automated driving. In a recent interview, the Waymo founder argued that Tesla’s Full Self-Driving (Supervised) system relies on too little data and too few sensors, asserting that “one clear first principle about autonomous driving is you want as much data as possible” and that “everyone in the space agrees on this.” He suggested that Tesla’s camera-only strategy would not even pass a basic DMV-style vision test, a pointed way of saying the system lacks the redundancy and environmental awareness needed for robust safety.
The same figure has previously declined to comment directly on Tesla, saying he did not know the details of its technology and preferred to focus on the ethos that guided Waymo’s development. That ethos, he has said, centered on designing a system at least as safe as a carefully driven human car, with multiple sensor modalities and conservative driving policies. The shift from that guarded stance to a blunt assertion that Tesla’s system would fail a vision test signals growing frustration within the Waymo camp about what it sees as Tesla’s willingness to push driver-assistance features to consumers before they are ready, and to market them in ways that blur the line between assistance and autonomy.
Data wars: Tesla’s crash numbers versus Waymo’s robotaxis
Tesla, for its part, has responded to industry criticism by publishing more granular safety statistics for its Full Self-Driving (Supervised) feature. According to Tesla, North American drivers using Full Self-Driving (Supervised) travel around 5 million miles between airbag deployments, a figure the company presents as evidence that its software materially reduces crash risk compared with conventional driving. The company has also emphasized that its current offering is a driver-assist system that requires human supervision, not a fully driverless service like Waymo’s robotaxis, and it has framed its data disclosures as part of a broader push to make roads safer through transparency.
Independent comparisons of crash data, however, paint a more complicated picture. Analyses of NHTSA incident reports for ADS vehicles have found that Tesla robotaxis operating with human safety monitors have been involved in more reported crashes than Waymo’s fully driverless fleet, even after accounting for differences in deployment scale. One review that drew on NHTSA filings and Electrek’s compilation of incidents concluded that, with both companies now operating ADS vehicles, Tesla and Waymo show distinct safety profiles, with Waymo generally experiencing fewer reportable crashes per mile in its limited service areas. These comparisons are imperfect, since reporting thresholds and operating domains differ, but they undercut any simple narrative that one company has definitively “won” the safety race.
Competing philosophies on autonomy, law, and public trust
Beyond raw crash numbers, the Waymo-Tesla rivalry reflects two starkly different philosophies about how automated driving should be introduced to the public. Waymo has focused on tightly geofenced robotaxi services in cities like Phoenix and Austin, with no human driver behind the wheel but extensive pre-mapping, conservative driving behavior, and a gradual expansion of service zones. Its executives, including Tekedra Mawakana, have argued that it is important for AV companies to be transparent about their safety performance and limitations, and they have framed their approach as a cautious, data-driven path toward removing human drivers entirely in specific, well-understood environments.
Tesla has taken the opposite route, embedding its Full Self-Driving (Supervised) software in consumer vehicles such as the Model 3 and Model Y and allowing owners to use it on a wide range of public roads, from freeways to city streets. Former Waymo CEO John Krafcik has dismissed Tesla’s Cybercab ambitions by saying that Tesla is “a car company with a driver assist system” and “still not” a direct competitor to a true robotaxi operator, arguing that a system that depends on constant human oversight should not be marketed as self-driving. Safety advocates have echoed that concern, pointing to incidents in which Tesla owners have treated Autopilot or Full Self-Driving as a substitute for attentive driving, including cases where drivers let the system carry them to the hospital rather than focusing on the road themselves.