Crashes have surged in the past four years, reflecting the hazards associated with increasingly widespread use of Tesla’s futuristic driver-assistance tech.
Tesla CEO Elon Musk has said that cars operating in Tesla’s Autopilot mode are safer than those piloted solely by human drivers, citing crash rates when the modes of driving are compared. He has pushed the carmaker to develop and deploy features programmed to maneuver the roads, arguing that the technology will usher in a safer, virtually accident-free future. While it’s impossible to say how many crashes may have been averted, the data shows clear flaws in the technology being tested in real time on America’s highways.
I dislike Elon, and I'm never buying a Tesla (I own a different EV already). However, until someone shows me the equivalent human-caused rates (for the same types of roads and distances), these numbers simply don't look out of line with what I would expect for any car.
IMO self-driving doesn't need to be perfectly safe; it just needs to be equivalent to, or safer than, the average human driver.
I think the bigger issue is the lack of transparency. Tesla reported only 3 fatalities involving Autopilot, while the real number is 17. Not a massive difference when dealing with low numbers like this, but still a big issue if Tesla is lying about safety data.
Yeah, this. Maybe I could agree that it's too soon to be testing these autopilots on the road, but I dislike how people miss the point with this tech. They set an impossible standard for a technology that could potentially be better than us on the road.
@BlameThePeacock @tango_octogono
Fair, and in principle I agree. But it's not (only) ordinary people who miss the point and set an impossible standard; it's foremost Elon Musk himself. He has been promoting Tesla's Autopilot and even its self-driving capability for years (although the folks at Tesla certainly know that the latter won't come anytime soon).
A 2016 video that Tesla (TSLA.O) used to promote its self-driving technology was staged to show capabilities like stopping at a red light and accelerating at a green light that the system did not have, according to testimony by a senior engineer.
One of the things we need for setting reasonable expectations about this tech is more reliable information, including from Tesla and its CEO. As long as the company itself keeps flooding the market with unrealistic "news" about this tech, it is good that there are independent investigations, imo.
Maybe that's the case but it's not what the article says. Experts say that "the surge in Tesla crashes is troubling" and that "the number of fatalities compared to overall crashes was also a concern". And they are critical of Tesla as the company is obviously beta-testing a car on the highway without disclosing its data as others have already said.
I don't see anyone setting impossible safety standards; at least that's how I read the article. Tesla appears to value its profits more than people's safety and lives.
Yeah, the article reads like a hit piece. Why didn't they even try to do an apples to apples comparison? Same thing when they compared Tesla to Subaru: just raw totals, not normalized to cars sold or miles driven or anything. The raw data is pretty meaningless.
Why didn't they even try to do an apples to apples comparison?
Maybe because the data released by Tesla is incomplete and biased as it appears to serve its sales rather than safety?
It's the company and Elon Musk himself that are frequently making bold statements while it seems that not even the authorities have the data to verify the claims. As the article says:
In a March presentation, Tesla claimed Full Self-Driving crashes at a rate at least five times lower than vehicles in normal driving, in a comparison of miles driven per collision. That claim, and Musk’s characterization of Autopilot as “unequivocally safer,” is impossible to test without access to the detailed data that Tesla possesses.
I agree with you that we need more data. Right now, the US average is 1.37 car deaths per 100 million miles driven.
From what little Tesla has said, Autopilot is below that average. BUT the raw data hasn't been released. We don't know how many miles have been driven on Autopilot, we don't know the road conditions it was used in (presumably Autopilot is used more often on freeways), and we don't know how the safety rating of a Tesla compares to other vehicles on the road (it's possible Teslas are getting into more accidents but the car is keeping occupants safe, or vice versa).
Too many unknowns. So while I dislike this article because it mostly comes off as hollow in my eyes, I do think Tesla needs to make more of its data public so users can make an informed choice.
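To make the normalization being asked for concrete, here's a rough Python sketch of the per-mile comparison. The 1.37 US average and the 17 fatalities come from this thread; the Autopilot mileage figure is a made-up placeholder, since Tesla has not published it, which is exactly the problem:

```python
def deaths_per_100m_miles(deaths, miles):
    """Fatality rate normalized to 100 million vehicle miles."""
    return deaths / (miles / 100_000_000)

# US average cited above: 1.37 deaths per 100 million miles.
us_average = 1.37

# 17 Autopilot-involved fatalities (from the reporting discussed here).
autopilot_deaths = 17

# PLACEHOLDER: Tesla has not released total Autopilot miles,
# so this number is purely illustrative.
assumed_autopilot_miles = 3_000_000_000

rate = deaths_per_100m_miles(autopilot_deaths, assumed_autopilot_miles)
print(f"Autopilot rate under these assumptions: {rate:.2f} per 100M miles")
print(f"US average: {us_average:.2f} per 100M miles")
```

The point of the sketch is that the conclusion flips entirely depending on the mileage denominator (and on road mix, since freeway miles are safer per mile than surface-street miles), which is why raw crash counts alone can't settle the comparison.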
Yeah, this article is ridiculous. Self-driving cars are FAR safer than human drivers. The number of accidents is minuscule compared to what would be expected if the automated features were absent.
We run the real risk of screwing this up if people insist on an automated car never causing a crash, or never hurting anyone. Even if they hurt someone, the point is that it harmed maybe 0.1% of the people that would have been hurt in traditional vehicles.
Self-driving cars are FAR safer than human drivers. The number of accidents is minuscule compared to what would be expected if the automated features were absent.
We run the real risk of screwing this up if people insist on an automated car never causing a crash, or never hurting anyone. Even if they hurt someone, the point is that it harmed maybe 0.1% of the people that would have been hurt in traditional vehicles.
i feel like you're illustrating the issue here but from the other direction. are they? is the number of accidents minuscule? self-driving technology is frequently hyped up in exactly this manner--particularly by Elon and Tesla apologists, who have a vested interest in it being correct--but i've seen nothing to suggest the technology is either widespread enough or reliable enough to draw a meaningful conclusion in either direction.[^1] i also don't think it's reasonable to conclude, in the absence of numbers, that these kinds of technologies are inherently safer than humans. we've already seen plenty of technological snafus with the potential to be far more harmful at scale than anything the human brain can muster.
[^1]: as far as reliability: Tesla self-driving technology struggles in a lot of cases, and even defenders of the cars will admit it has problems, in many circumstances, interpreting basic rules of the road and handling pedestrians in its path. this is obviously a problem now, and would be a much bigger one at scale.
Even if they hurt someone, the point is that it harmed maybe 0.1% of the people that would have been hurt in traditional vehicles.
Where did you get this number from?
This is exactly what I meant in my comment above. Anyone can throw out a number and claim it is true. I don't think we should let perfect be the enemy of good, but people's lives are on the line here. We need independent and reliable data.