Because they are driving under near-ideal conditions, in areas that are completely mapped out, routed away from roadworks, "confusing" intersections, and other situations like unmarked roads that humans deal with routinely without problem.
And in a situation they can't handle, they just stop, call for help, and wait for a human driver to get them going again, regardless of whether they are blocking traffic.
I'm not blaming Waymo for operating as safely as they can; that's great, IMO.
But don't make it sound like they drive better than humans yet. There is still a ways to go.
What's really obnoxious is that Elon Musk claimed this would be 100% ready by 2017: full self-driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
This would be more impressive if Waymos were fully self-driving. They aren't. They depend on remote "navigators" to make many of their most critical decisions. Those "navigators" may or may not be directly controlling the car, but things do not work without them.
When we have automated cars that do not actually rely on human beings, we will have something to talk about.
It's also worth noting that the human "navigators" are almost always poorly paid workers in third-world countries. The system will only scale if there are enough desperate poor people; otherwise it quickly becomes too expensive.
Considering the sort of driving issues and code violations I see on a daily basis, the standards for human drivers need raising. The issue is lax humans more than it is amazing robots.
I am once again begging journalists to be more critical of tech companies.
But as this happens, it’s crucial to keep the denominator in mind. Since 2020, Waymo has reported roughly 60 crashes serious enough to trigger an airbag or cause an injury. But those crashes occurred over more than 50 million miles of driverless operations. If you randomly selected 50 million miles of human driving—that’s roughly 70 lifetimes behind the wheel—you would likely see far more serious crashes than Waymo has experienced to date.
[...] Waymo knows exactly how many times its vehicles have crashed. What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash. Waymo has tried to address this by estimating human crash rates in its two biggest markets—Phoenix and San Francisco. Waymo’s analysis focused on the 44 million miles Waymo had driven in these cities through December, ignoring its smaller operations in Los Angeles and Austin.
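To make the denominator math concrete, here's a back-of-envelope sketch using the article's own figures. The human serious-crash rate below is an assumed illustrative number, not a measured one (the underreporting problem the article mentions is exactly why it's hard to pin down):

    # Back-of-envelope crash-rate comparison using the article's figures.
    WAYMO_CRASHES = 60          # airbag/injury crashes reported since 2020
    WAYMO_MILES = 50_000_000    # driverless miles over the same period

    # ASSUMED human baseline, for illustration only -- real estimates
    # vary widely because humans underreport crashes.
    HUMAN_CRASHES_PER_MILLION = 4.0

    waymo_per_million = WAYMO_CRASHES / (WAYMO_MILES / 1_000_000)   # ~1.2
    human_equiv = HUMAN_CRASHES_PER_MILLION * (WAYMO_MILES / 1_000_000)

    print(f"Waymo: {waymo_per_million:.1f} serious crashes per million miles")
    print(f"Assumed human rate: {HUMAN_CRASHES_PER_MILLION} per million miles,"
          f" i.e. ~{human_equiv:.0f} crashes over the same 50M miles")

Even if the assumed human rate is off by 2x in either direction, the gap to ~1.2 per million miles stays large, which is why the choice of baseline (and who audits it) matters so much.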
This is the wrong comparison. These are taxis, which means they're driving taxi miles. They should be compared to taxis, not to normal people who drive almost exclusively during their commutes (which is probably the most dangerous time to drive, since it's precisely when everyone else is driving too).
We also need to know how often Waymo intervenes in its supposedly autonomous operations. The latest data we have on this, leaked a while back, showed that Cruise (a different company) cars actually required more human labor than regular taxis: more than one employee per car.
edit: The leaked data on human interventions was from Cruise, not Waymo. I'm open to self-driving cars being safer than humans, but I don't believe a fucking word from tech companies until there's been an independent audit with full access to their facilities and data. So long as we rely on Waymo's own published figures without knowing how the sausage is made, they can spin their data however they want.
edit2: Updated to say that journalists should be more critical in general, not just about tech companies.
No shit. The bar is low. Humans suck at driving. People love to throw FUD at automated driving, and it's far from perfect, but the more we delay adoption, the more lives are lost. Anti-automation on the roads is up there with anti-vaccine mentality in my mind. Fear and the incorrect "I'm not the problem, I'm a really good driver" mentality will inevitably delay automation unnecessarily for years.
I used to hate them for being slow and annoying. Now they drive like us and I hate them for being dicks. This morning, one of them made an insane move that only the worst Audi drivers in my area make: a massive left over a solid yellow, with no stop sign, and with me coming right at it before it had even begun accelerating into the intersection.
As a techno-optimist, I always expected self-driving to quickly become safer than humans, at least in relatively controlled situations. However, I'm at least as much a pessimist about human nature and the legal system.
Given self-driving vehicles that are demonstrably safer than humans, but not perfect, how do we get past humans taking advantage of them, and the massive liability for the remaining accidents?
I had a friend who worked for them in the past. They really aren't that impressive; they get stuck constantly. While the tech might eventually be revolutionary for people who can't drive for whatever reason, right now it still needs a LOT of work.
What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash.
Also, I think it's worth discussing whether to include certain driver-assistance technologies in the baseline: automated braking, blind-spot warnings, and other warnings/visualizations of surrounding objects, cars, bikes, or pedestrians. Throw in things like traction control and antilock brakes as well.
There are ways to make human driving safer without fully automating the driving, so it may not be appropriate to compare fully automated driving with fully manual driving. Hybrid approaches might be safer today, but we don't have the data to actually analyze that, as far as I can tell.
Focusing on airbag-deployments and injuries ignores the obvious problem: these things are unbelievably unsafe for pedestrians and bicyclists. I curse SF for allowing AVs and always give them a wide berth because there's no way to know if they see you and they'll often behave erratically and unpredictably in crosswalks. I don't give a shit how often the passengers are injured, I care a lot more how much they disrupt life for all the people who aren't paying Waymo for the privilege.
I live in Phoenix, Arizona, and these are all around. Honestly, I feel like in the future everyone will have Waymo-type services and no one will own cars or even need to learn how to drive one. Who needs to worry about car repairs, insurance, etc.?
Thing is, the end goal after sorting out all the bugs in the AI is no human-driven cars, since having both will only lead to crashes due to the AI being unable to predict a human. All the AI cars would be linked to a central system to communicate with each other and always know where the others are. Then all we have to do is make sure people only use the crosswalks, and traffic accidents will be solely due to idiots.
Driving regulations and enforcement should just be stricter on humans, and self-driving can stay as trains, separated from cars, bikes, and pedestrians, which should all be separated from each other as well.