Among motorcyclists, there is a persistent rumor that Teslas are dangerous to ride around in traffic. Whether it’s their silent electric drivetrain, extreme acceleration, or self-driving technology supposedly failing to see motorcycles, every biker seems to know someone who’s had a close call with a...
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:
The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
Tesla's self-driving is never going to work well enough without additional sensors; cameras alone are not enough. It's fundamentally dangerous and should not be driving unsupervised (or maybe at all).
Yep, that one was purely about hitting a certain KPI of "miles driven on Autopilot without incident". If it turns off right before the accident, technically the driver was in control and to blame, so it won't show up in the stats and probably also won't be investigated by the NTSB.
A legal system exists in which the people who build, sell, and drive cars are not meaningfully liable when the car hurts somebody.
That's a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.
Maybe, if that two-step determination of liability is really what the parent commenter had in mind.
I'm not so sure he'd agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and software running on the car in general) be Free Software, putting them squarely and completely within the control of the vehicle owner.
I mean, maybe, but previously when I've said that it's typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it's somehow suddenly too dangerous to allow owners to control their property just because software is involved.
The most frustrating thing is that, as far as I can tell, Tesla doesn't even use binocular vision, which makes all the claims about humans being able to drive with vision alone even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?
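For context on what the commenter means: binocular depth perception comes from triangulating the disparity between two views, which a single camera cannot do directly. A minimal sketch of the rectified-stereo depth formula (all numbers are illustrative assumptions, not Tesla camera specs):

```python
# Depth from stereo disparity for a rectified camera pair:
#   Z = f * B / d
# where f is focal length (pixels), B is the baseline between the
# two cameras (meters), and d is the disparity (pixels) of the same
# point in the two images. Illustrative values only.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Return depth in meters for a matched point in a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 12 cm baseline, 20 px disparity
print(depth_from_disparity(1000.0, 0.12, 20.0))  # 6.0 meters
```

With only one camera, depth has to be inferred indirectly (from motion, known object sizes, or a learned model) rather than measured geometrically like this.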
These fatalities are a Tesla business advantage. Every one is a data point they can use to train their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don't have to turn this into a bad thing just because they're killing people /s