Tesla is about to put a few alleged robotaxis on the streets of Austin. They won’t be the CyberCab robotaxis that Tesla “revealed” late last year, but regular ole Model Ys running some version of Tesla’s so-called “Full Self-Driving” software. And how are things going with that software? Well, if you look at the results of the latest test from the Dawn Project, a safety organization started by Caltech grad and software engineer Dan O’Dowd, the answer is a resounding “not great.” At least if you care about cars stopping for kids getting off school buses.
As you can see in the video below, O’Dowd drove a Model Y running FSD 13.2.9 past a stopped school bus with its stop sign deployed and its red lights flashing. His team then pulled a child-size dummy out into the Tesla’s path to see whether it would stop for the child. The car drove through the child like it wasn’t even there.
To make matters worse, the Tesla didn’t fail to detect the mannequin, which would have still been bad enough. The system actually detected the child-size stand-in and classified it as a pedestrian but didn’t bother to stop or even slow down. It just kept on going, completely ignoring the “kid” crossing the street. You’d think someone as baby-happy as Elon Musk would at least want his products to avoid running over children, right?
Tesla hits kid
As the video points out, the Dawn Project documented FSD’s issues with stopping for school buses and warned Tesla about the risk two and a half years ago. It also ran a Super Bowl commercial in 2023 pointing out the same thing. Not long after the Dawn Project’s ad ran, the exact thing it warned about supposedly happened in real life. As ABC 11 reports, the driver of a Tesla Model Y hit Tillman Mitchell, a high school student at the Haliwa-Saponi Tribal School in Hollister, North Carolina, as he got off a school bus. Investigators suspect that either Autopilot or Full Self-Driving may have been engaged during the crash. Mitchell suffered “life-threatening injuries” and was flown to a nearby hospital for treatment, where, thankfully, he survived. Still, a fractured neck and a broken leg aren’t exactly the kinds of injuries you recover from quickly. That incident is also far from the only time a Tesla has failed to stop when it should have.
If Tesla’s so-called “Full Self-Driving” software only worked on highways like some other driver-assistance systems, its inability to reliably stop for emergency vehicles and other stopped vehicles, such as school buses, would still be a problem. But with plans to deploy a bunch of Model Ys as robotaxis on city streets in a matter of weeks, it’s even more concerning.
The robotaxis are still coming
Granted, the Dawn Project and founder Dan O’Dowd are anti-Tesla and don’t try to hide that fact. As a result, you can expect Tesla’s army of nerds to show up in the comments, complaining that this was just a hit piece from an organization that hates their hero, Elon Musk. And yet, you can still watch the video yourself. You can still see the car run over the mannequin. If O’Dowd faked the test and is so obviously guilty of defaming Tesla and Musk, where are the lawsuits?
I asked the same question when YouTuber Mark Rober crashed a Tesla through a foam wall with a road painted on it, Wile E. Coyote-style, and that lawsuit has yet to materialize, too. But you can totally trust the guy who’s been promising self-driving cars for more than a decade. Totally.
Software bugs in a phone app are annoying, but no one’s life is at risk if you can’t get your McDonald’s order to go through. Self-driving software in cars is a completely different thing, though, and the Silicon Valley motto of “move fast and break things” really shouldn’t apply when those things are actually people. Especially children.