Because they are driving under near-ideal conditions: in areas that are completely mapped out, guided away from roadworks, and kept clear of “confusing” intersections and other traffic situations, like unmarked roads, that humans deal with routinely without problems.
And in a situation they can’t handle, they just stop, call for a human driver, and wait to get going again, regardless of whether they’re blocking traffic.

I’m not blaming Waymo for operating as safely as they can, that’s great IMO.
But don’t make it sound like they drive better than humans yet. There is still some way to go.

What’s really obnoxious is that Elon Musk claimed this would be 100% ready by 2017: full self driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
RoboTaxis will also have to “navigate” the Fashla hate. Not many will be eager to risk their lives with them.
I think “near ideal conditions” is a huge exaggeration. The situations Waymo avoids are a small fraction of the total mileage driven by Waymo vehicles or the humans they’re being compared with. It’s like you’re saying a football team’s stats are grossly wrong if they don’t include punt returns.
I am once again begging journalists to be more critical of tech companies.

But as this happens, it’s crucial to keep the denominator in mind. Since 2020, Waymo has reported roughly 60 crashes serious enough to trigger an airbag or cause an injury. But those crashes occurred over more than 50 million miles of driverless operations. If you randomly selected 50 million miles of human driving—that’s roughly 70 lifetimes behind the wheel—you would likely see far more serious crashes than Waymo has experienced to date.
[…] Waymo knows exactly how many times its vehicles have crashed. What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash. Waymo has tried to address this by estimating human crash rates in its two biggest markets—Phoenix and San Francisco. Waymo’s analysis focused on the 44 million miles Waymo had driven in these cities through December, ignoring its smaller operations in Los Angeles and Austin.
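If you want to sanity-check the article’s denominator math, it’s a quick calculation. The Waymo figures below come from the quote above; the lifetime-mileage number is a rough assumption of mine, not something from the article:

```python
# Back-of-the-envelope check of the crash-rate comparison above.
# Waymo figures are from the quoted article; lifetime mileage is a
# rough ASSUMPTION for illustration only.

waymo_crashes = 60          # airbag/injury crashes since 2020 (article)
waymo_miles = 50_000_000    # driverless miles (article)
lifetime_miles = 700_000    # ASSUMED miles driven in one human lifetime

rate_per_million = waymo_crashes / waymo_miles * 1_000_000
lifetimes = waymo_miles / lifetime_miles

print(f"Waymo: {rate_per_million:.1f} serious crashes per million miles")
print(f"50M miles is about {lifetimes:.0f} driving lifetimes")
```

That reproduces the “roughly 70 lifetimes” figure and works out to about 1.2 serious crashes per million miles. The contested part is what human rate to put next to that 1.2, which is exactly the baseline problem the rest of this thread argues about.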
This is the wrong comparison. These are taxis, which means they’re driving taxi miles. They should be compared to taxis, not normal people who drive almost exclusively during their commutes (which is probably the most dangerous time to drive since it’s precisely when they’re all driving).
We also need to know how often Waymo intervenes in its supposedly autonomous operations. The latest we have on this, which was leaked a while back, is that Cruise (a different company) cars are actually less autonomous than taxis, and require >1 employee per car.
edit: The leaked data on human interventions was from Cruise, not Waymo. I’m open to self-driving cars being safer than humans, but I don’t believe a fucking word from tech companies until there’s been an independent audit with full access to their facilities and data. So long as we rely on Waymo’s own published data without knowing how the sausage is made, they can spin it however they want.
edit2: Updated to say that journalists should be more critical in general, not just about tech companies.
I was going to say they should only be comparing them within the same driving areas, since I know they aren’t allowed in many areas.
But you’re right, it’s even tighter than that.
These articles frustrate the shit out of me. They accept both the company’s own framing and its selectively-released data at face value. If you get to pick your own framing and selectively release the data that suits you, you can justify anything.
Non-professional human drivers (yes, even you) are unbelievably bad at driving, so it’s only a matter of time. But call me when you can do it without just moving labor done by decently paid locals to labor done remotely in the third world.
Are you talking about remote controlling cars from India or something?
That last sentence makes very little sense to me.

How is that relevant? I’m pretty sure the latency would be too high, so it wouldn’t even work.
Ah OK, you are talking about the navigators that “help” the car when it can’t figure out what to do.
That’s a fair point.

But still, one navigator can probably handle many cars, so from the perspective of making a self-driving taxi, it makes sense.
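For what it’s worth, the economics of that ratio are easy to sketch. Every number here (wages, cars per navigator, average speed) is a made-up assumption purely to show the shape of the calculation, not anything Waymo or Cruise has published:

```python
# Rough sketch of remote-navigator labor cost per mile.
# Every number is an ASSUMPTION for illustration only.

driver_wage = 20.0       # $/hour, assumed local taxi driver cost
operator_wage = 20.0     # $/hour, assumed remote navigator cost
cars_per_operator = 10   # assumed: one navigator covering many cars
avg_speed_mph = 20       # assumed average city taxi speed

human_cost = driver_wage / avg_speed_mph
remote_cost = operator_wage / (cars_per_operator * avg_speed_mph)

print(f"Human driver:   ${human_cost:.2f}/mile")
print(f"Remote support: ${remote_cost:.2f}/mile")
```

At one employee per car (the leaked Cruise figure mentioned above), the labor saving disappears entirely; the business case hinges on pushing cars_per_operator well above 1.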
Thanks, but I am not. Others on the road, however: abysmal.
I find the scariest people on the road to be the arrogant ones that think they make no mistakes.
I wouldn’t consider anyone who hasn’t done at least a dozen track days and experienced several different extreme scenarios (over/understeer, looping, wet grass at speed, airtime (or at least one or more wheels off the ground), high-speed swerving, snap oversteer, losing systems like the brakes, the engine, or the steering wheel lock engaging, etc.) to be remotely prepared to handle a car going more than 25 or so mph. An extreme minority of drivers are actually prepared to handle an incoming collision in a way that fully mitigates the situation. And that only covers the mechanical skill of piloting the car; it doesn’t even touch on the theoretical and practical knowledge (rules of the road, including obscure and unenforced rules), and it definitely doesn’t broach the discipline that is required to actually put it all together.
If a driver has never been trained on, or doesn’t even have an understanding of, what will happen to a car in an extreme scenario, how could we consider them trained or sufficiently skilled?
We don’t let pilots fly without spending time in a simulator, going over emergency scenarios and being prepared for when things go sideways. You can’t become an airline pilot if you don’t know what happens when you lose power.
We let sub-par people drive because restricting it too much would be seen as discrimination, but the overwhelming majority of people are ill-equipped to actually drive.
I hope this is a copypasta, lmao. If you actually go to a training course where you learn to handle oversteer and understeer and they spin you out, they tell you that you have about a fuck-all chance of recovering. Even when you have warning, you know it’s coming, and you’re at a fairly low speed, you have very little chance of countersteering correctly.
Here is what you actually have to do to drive safely:
1. Don’t be a dumbass who thinks you need to go through 12 years of Formula 1 training to drive on the road. If anything, the belief that training can prepare you for extreme situations and that you can handle them is what’s arrogant and dangerous.
2. Don’t be a dumbass: adjust your speed to driving conditions.
3. Don’t be a dumbass: don’t push the limits of your car on public roads.
4. Defensive driving: assume people on the road are idiots who will fuck up, and drive accordingly.
5. Learn how your car works, e.g. even if you have an e-handbrake, you can still pull on it and it will stop the car.
6. Most important, because people don’t know how to do it: learn to emergency brake, hard enough that your hazard lights come on.
I completely disagree.
You are using the handbrake as an example. 95 percent of people (including you, evidently) don’t even understand that the handbrake is not an emergency brake. They don’t get how its behavior works, or the fact that it’s meant to be used as a parking brake; I consistently see people slam their parking pawls every time they get out of their car. (Not to mention that on most modern cars it doesn’t even work while you are driving and has no modulation, as it’s just a button.)
If not being an idiot was good enough to drive a car, then it wouldn’t be so deadly. It’s also possible to fly a plane with common sense, but you wouldn’t be happy if your pilot told you they don’t have training.
Driving isn’t easy, it’s just that we accept an absolutely catastrophic amount of accidents as a cost of doing business.
It is an emergency brake for when your brakes fail, you donut. Again, it’s part of safe-driving courses, which you clearly didn’t take.
I am also from Europe; drivers are much better here compared to the US. The fact that your country absolutely sucks at training its drivers, despite being entirely reliant on them, is not my fault.
We always knew good quality self-driving tech would vastly outperform human skill. It’s nice to see some decent metrics!
Indeed
I used to hate them for being slow and annoying. Now they drive like us and I hate them for being dicks. This morning, one of them made an insane move that only the worst Audi drivers in my area make: a massive left over a solid yellow, across an intersection with no stop sign, with me coming right at it before it even began accelerating into the intersection.
“After 6 miles, Teslas crash a lot more than human drivers.”
So only drive 5 miles. I guess that’s good advice in general.
And yet it’s still the least efficient mode of transport.
That doesn’t seem like a very high bar to achieve.
As a techno-optimist, I always expected self-driving to quickly become safer than human, at least in relatively controlled situations. However I’m at least as much a pessimist of human nature and the legal system.
Given self-driving vehicles that are demonstrably safer than humans, but not perfect, how can we get beyond humans taking advantage of them, and past the massive liability for the remaining accidents?
I believe it, but they also only drive specific routes.
What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash.
Also, I think it’s worth discussing whether to include in the baseline certain driver assistance technologies, like automated braking, blind spot warnings, other warnings/visualizations of surrounding objects, cars, bikes, or pedestrians, etc. Throw in other things like traction control, antilock brakes, etc.
There are ways to make human driving safer without fully automating the driving, so it may not be appropriate to compare fully automated driving with fully manual driving. Hybrid approaches might be safer today, but we don’t have the data to actually analyze that, as far as I can tell.
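One way to see why the missing data matters: with rare events, even tens of millions of miles leave wide uncertainty bands, so bare point estimates can’t settle a fully-automated vs. hybrid comparison. A minimal sketch of how you’d put an interval on a crash rate, using the exact Poisson (Garwood) interval; the Waymo numbers are from the article quoted earlier:

```python
# Sketch: put a confidence interval on a crash rate instead of
# comparing bare point estimates. Requires scipy.
from scipy.stats import chi2

def poisson_rate_ci(events, exposure_millions, alpha=0.05):
    """Exact (Garwood) CI for a rate per million miles."""
    lo = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    return lo / exposure_millions, hi / exposure_millions

# 60 serious crashes over 50 million miles (from the article):
print(poisson_rate_ci(60, 50))  # roughly (0.92, 1.54) per million miles
```

Any honest comparison would need intervals like these on both sides, and for the hybrid (human plus assist tech) baseline, that data simply hasn’t been published.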
There’s a limit to what assist systems can do. Having the car and driver fighting for control actually makes everything far less safe.
Focusing on airbag-deployments and injuries ignores the obvious problem: these things are unbelievably unsafe for pedestrians and bicyclists. I curse SF for allowing AVs and always give them a wide berth because there’s no way to know if they see you and they’ll often behave erratically and unpredictably in crosswalks. I don’t give a shit how often the passengers are injured, I care a lot more how much they disrupt life for all the people who aren’t paying Waymo for the privilege.
So the fact that after 50 million miles of driving there have been no pedestrian or cyclist deaths means they are unbelievably unsafe for pedestrians and cyclists? As far as I can tell, the only accident involving pedestrians or cyclists AT ALL in 50 million miles was when a Waymo struck a plastic crate that careened into another lane, where a scooter ran into it. And yet in your mind they are unbelievably unsafe?
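For scale on the fatality point specifically, here’s a rough expected-count check under an assumed human baseline. The per-mile pedestrian fatality rate below is a crude nationwide placeholder I’m assuming, not an urban-specific or article figure:

```python
# How many pedestrian deaths would human drivers be "expected"
# to cause over the same mileage? The rate is a crude ASSUMPTION
# (~2 pedestrian deaths per billion vehicle miles).

deaths_per_billion_miles = 2.0   # placeholder assumption
miles = 50_000_000

expected = deaths_per_billion_miles * miles / 1_000_000_000
print(f"Expected pedestrian deaths at human rates: {expected:.2f}")
```

Under that placeholder rate you’d only expect about 0.1 deaths over 50 million miles, so zero fatalities can’t settle the argument either way; injury and near-miss numbers are the more informative measure at this mileage.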
I live in Phoenix, Arizona, and these are all around. Honestly, I feel like in the future everyone will use Waymo-type services and no one will own cars or even need to learn how to drive one. Who needs to worry about car repairs, insurance, etc.?
*human drivers remotely controlling cars crash less than humans directly controlling cars