Self-driving cars are often marketed as safer than human drivers, but new data suggests that may not always be the case.
Citing data from the National Highway Traffic Safety Administration (NHTSA), Electrek reports that Tesla disclosed five new crashes involving its robotaxi fleet in Austin. The new data raises concerns about how safe Tesla’s systems really are compared to the average driver.
The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.

It’s important to draw the line between what Tesla is trying to do and what Waymo is actually doing. Tesla’s crash rate is reportedly about 4x the human average, while Waymo’s is lower.
Not just lower, a tiny fraction of the human rate of accidents:
https://waymo.com/safety/impact/
Also, AFAIK this includes cases when the Waymo car isn’t even slightly at fault. Like, there have been 2 deaths involving a Waymo car. In one case a motorcyclist hit the car from behind, flipped over it, then was hit by another car and killed. In the other case, ironically, the real car at fault was a Tesla being driven by a human who claims he experienced “sudden unintended acceleration”. It was driving at 98 miles per hour in downtown SF and hit a bunch of stopped cars at a red light, then spun into oncoming traffic and killed a man and his dog who were in another car.
Whether or not self-driving cars are a good thing is up for debate. But, it must suck to work at Waymo and to be making safety a major focus, only to have Tesla ruin the market by making people associate self-driving cars with major safety issues.
https://www.iihs.org/research-areas/fatality-statistics/detail/state-by-state
Well, no. Let’s talk fatality rate. According to the linked data, human drivers are at roughly 1.3 deaths per 100 million miles.
Vs Waymo 2 deaths per 127 million miles :)
Well, Waymo’s really at 0 deaths per 127 million miles.
The 2 deaths happened near Waymo cars, in collisions involving the Waymo car. Not only did the Waymos not cause the accidents, they weren’t even involved in the fatal part of either event. In one case a motorcyclist was hit by another car, and in the other a Tesla crashed into a second car after it had hit the Waymo (and a bunch of other cars).
The IIHS number takes the total number of deaths in a year, and divides it by the total distance driven in that year. It includes all vehicles, and all deaths. If you wanted the denominator to be “total distance driven by brand X in the year”, you wouldn’t keep the numerator as “all deaths” because that wouldn’t make sense, and “all deaths that happened in a collision where brand X was involved as part of the collision” would be of limited usefulness. If you’re after the safety of the passenger compartment you’d want “all deaths for occupants / drivers of a brand X vehicle” and if you were after the safety of the car to all road users you’d want something like “all deaths where the driver of a brand X vehicle was determined to be at fault”.
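To put numbers on the numerator point above, here’s a minimal sketch of the per-mile arithmetic using the figures quoted in this thread (127 million miles, two deaths). Which deaths you count in the numerator is the whole question:

```python
# Normalize a death count to deaths per 100 million miles, the unit the
# IIHS-style "all deaths / all miles driven" statistic is quoted in.
def deaths_per_100m_miles(deaths, miles):
    return deaths / miles * 100_000_000

WAYMO_MILES = 127_000_000  # miles quoted in this thread

# Numerator choice 1: any death in a collision where a Waymo was involved
involved = deaths_per_100m_miles(2, WAYMO_MILES)   # ~1.57

# Numerator choice 2: deaths the Waymo driver actually caused
at_fault = deaths_per_100m_miles(0, WAYMO_MILES)   # 0.0

print(round(involved, 2), at_fault)
```

The same mileage gives a rate near the human average or a rate of zero, depending purely on which numerator you pick, which is why the definition matters more than the raw count at these sample sizes.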
The IIHS does have statistics for driver death rates by make and model, but they use “per million registered vehicle years”, so you can’t directly compare with Waymo:
https://www.iihs.org/ratings/driver-death-rates-by-make-and-model
Also, in a Waymo it would never be the driver who died; it would be occupants of other vehicles, so I don’t know if that data is tracked for the other vehicle models involved.
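As a sketch of why those units don’t line up: converting “per million registered vehicle years” to a per-mile rate requires assuming an average annual mileage, and that assumption dominates the result (the 12,000 miles per year used here is a hypothetical placeholder, not a figure from the IIHS data):

```python
# Converting "deaths per million registered vehicle years" to
# "deaths per 100 million miles" needs an assumed annual mileage.
AVG_MILES_PER_VEHICLE_YEAR = 12_000  # hypothetical assumption

def per_100m_miles_from_vehicle_years(deaths_per_million_vehicle_years):
    miles_per_million_vehicle_years = 1_000_000 * AVG_MILES_PER_VEHICLE_YEAR
    return deaths_per_million_vehicle_years / miles_per_million_vehicle_years * 100_000_000

# e.g. a model with 30 driver deaths per million registered vehicle years
print(per_100m_miles_from_vehicle_years(30))  # ~0.25 under this assumption
```

Change the assumed annual mileage and the converted rate scales with it, which is why the two statistics can’t be directly compared.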
I seem to recall a homeless woman that got killed like right away when they released these monstrosities on the road, because why pay people to do jobs when machines can do them for you? I’m sure that will work out for everyone, with investment income.
That was Uber’s attempt at self driving after they had to give up the stolen Waymo data. Waymo is probably the best at self driving, but even they spend too much time blocking traffic when they can’t reach the Indian call center to fix the situation they’ve gotten themselves in.
Backed up by AI, Actually Indians.
You seem to recall wrongly.
Unless you found the video I will trust my memory.
The video of the thing that didn’t happen?
When there’s two deaths total it’s pretty obvious that there just isn’t enough data yet to consider the fatal accident rate. Also FWIW like was said neither of those was in any way the Waymo’s fault.
That’s the problem, you can’t trust these companies not to use corrupt influence to blame others for their mistakes. It’s you versus billion-dollar companies with everything at stake, companies that own (or at least hold senior leasing rights on) your politicians, locally, at the state level, and federally, and by extension the regulators up and down the line.
Do you not know how things work in this country? Given their outsized power, we don’t want them involved in determining blame for accidents, dash cam footage or no; we’ve seen that irrefutable evidence is no guarantee of justice, even if it’s provided to you.
Well Waymo isn’t assigning blame, it’s a third party assessment based on the information released about those accidents. The strongest point remains that fatal accidents are rare enough that there simply isn’t enough data to claim any statistical significance for these events. The overall accident rate for which data is sufficient remains significantly lower than the US average.
They have influence with the police and regulators, and insurance companies, to avoid blame.
They are on limited routes, at lower speeds, so they won’t have a higher fatality rate. If you compared human drivers on that same stretch of road, it would also be zero. You can’t compare human drivers on expressways during rush hour with Waymo’s trip between the airport and the hotels on a mapped-out route that doesn’t go on the expressway.
It is obviously false that fatal accidents would be “zero” on the roads Waymos are limited to; it’s ridiculous to even suggest such a thing. What is true is that such accidents are even rarer there, though. That’s another good reason why it makes no sense to focus solely on fatal accidents: Waymos are unlikely to be involved in them anyway because of these limits. That’s in addition to the fact that statistical analysis is simply impossible with current vehicle miles.
Now, I’m not saying we know for certain that Waymo is much safer than a human driver, as the current statistics imply; that is going to require more rigorous studies. I would say what we’ve got is good enough to say that nothing points to them being particularly unsafe.
What do you mean, you are comparing dangerous driving spots to safe driving spots. No shit the highway entry ramps have more fatal accidents than the leisure cruise in the 8 lane road from the airport to the hotel. And yes, human drivers on that leisure cruise would have a different rate of accidents than on the death ramps on the expressways.
Not acknowledging that point, and misrepresenting it, doesn’t speak well to your credibility here, it’s a simple and unarguable point.
The “fault” means nothing to “deaths per miles” statistic though?
I immediately formed a conspiracy theory that Teslas automatically accelerate when they see Waymo cars
And it’s not out of aggression. It’s just that their image recognition algorithms are so terrible that they match the Waymo car with all its sensors to a time-traveling DeLorean and try to hit 88 mph… or something.
They crash for the memes. Sounds about right considering who’s in charge.
Isn’t Waymo’s rate better because they are very particular about where they operate? When they’re asked to operate in slightly less than perfect conditions, it immediately goes downhill https://www.researchgate.net/publication/385936888_Identifying_Research_Gaps_through_Self-Driving_Car_Data_Analysis (page 7, Uncertainty)
Edit: googled it a bit, and apparently Waymo mostly drives in
Teslas do not.
We are talking about Tesla robotaxis. They certainly do drive in very limited geofenced areas too. While Waymo now goes on freeways, only in the Bay Area and with the option offered to only some passengers, Tesla Robotaxis currently do not go on any freeways at all. In fact they only have a handful of cars doing any unsupervised driving, and those are geofenced in Austin to a small area around a single stretch of road.
Tesla Robotaxis currently also cease operations in Austin when it rains so Waymo definitely is the more flexible one when it comes to less than perfect conditions.
That is certainly true, but they are also better than humans in those specific areas. Tesla is (shockingly) stupid about where they choose to operate. Waymo understands their limitations and choose to only operate where they can be better than humans. They are increasing their range, though, including driving on the 405 freeway in Los Angeles… which is usually less than 35mph!!
Because Waymo uses more humans?
Because Waymo doesn’t try and do FSD with only cameras.
Are they doing FSD if there are human overseers? Surely that is not “fully”.
So human overseers and not only cameras.
All these services have the ability for a human to solve issues if the FSD disengages. Doesn’t mean they’re not driving on their own most of the time including full journeys. The remote assistant team is just ready to jump in if there’s something unusual that causes the Waymo driver to disengage and even then they don’t usually directly control the car, they just give the driver instructions on how to resolve the situation.
I think Waymo are right to do what they do. I just wouldn’t call it “fully”. If Tesla are doing the same and still doing badly, or should be doing the same and aren’t, it still makes them worse than Waymo either way.
No, they don’t.
Searching for “Waymo human overseers” brings up results about it. Doing similar for Tesla isn’t finding anything. Also, I’ve not heard about it like I have with Waymo. I don’t think Waymo are wrong to do this at all. It not making a decision when unsure is safer.
Waymo has a capability for remote control of their cars in niche situations. They don’t do it all the time like Tesla has been doing. It is for when they get boxed in to a street by moving trucks and the only way to move past is to break a driving rule like passing in a no passing zone. They have one remote driver for every 40 cars. Tesla has one remote driver for every one car.
You got a link for those numbers? Very damning if true. 40x the humans and still worse stats?
https://www.forbes.com/sites/bradtempleton/2026/02/17/waymo-overseas-human-assist-wasnt-secret-but-is-it-secure/
“It turns out Waymo has about one remote assistance operator for every 40 vehicles.”
https://www.t3.com/auto/teslas-first-robotaxis-are-actually-manned-by-humans
https://www.motorbiscuit.com/tesla-robotaxis-crash-higher-humans/
“Making the numbers look even worse for Tesla, “virtually every single one of these miles was driven with a trained safety monitor in the vehicle who could intervene at any moment, which means they likely prevented more crashes that Tesla’s system wouldn’t have avoided.””