Among motorcyclists, there is a persistent rumor that Teslas are dangerous to ride around in traffic. Whether it’s their silent electric drivetrain, extreme acceleration, or self-driving technology supposedly failing to see motorcycles, every biker seems to know someone who’s had a close call with a...
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:
The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
It's because the system has to rely on visual cues, since Teslas have no radar. In the dark, the system gauges distance to the vehicle ahead from its tail lights. And since some bikes have twin tail lights, the system thinks it's a car far ahead when in reality it's a bike up close. Also remember the AI is trained on human driving behavior, which Tesla records from its customers. And we all know how well the average human drives around two-wheeled vehicles.
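The ambiguity the commenter describes is basic pinhole-camera geometry: two lights of unknown real-world spacing subtend the same angle at very different distances. A toy sketch of that geometry, with entirely hypothetical spacings and distances (this is an illustration of the theory, not Tesla's actual perception code):

```python
import math

def angular_separation(light_spacing_m: float, distance_m: float) -> float:
    """Angle (radians) subtended by two tail lights at a given distance."""
    return 2 * math.atan((light_spacing_m / 2) / distance_m)

def estimated_distance(observed_angle_rad: float, assumed_spacing_m: float) -> float:
    """Invert the same geometry under an assumed real-world light spacing."""
    return (assumed_spacing_m / 2) / math.tan(observed_angle_rad / 2)

# Hypothetical: a motorcycle with twin tail lights 0.3 m apart, actually 10 m ahead.
angle = angular_separation(0.3, 10.0)

# If the vision system assumes it is seeing a car with tail lights 1.5 m apart,
# the identical angle implies a vehicle five times farther away.
print(estimated_distance(angle, 1.5))  # 50.0 m, for a bike that is 10 m away
```

The ratio of assumed to actual light spacing directly scales the distance error, which is why depth sensors (radar, lidar) sidestep the problem entirely.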
I imagine bicyclists must be affected as well if they're on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.
Tesla self driving is never going to work well enough without sensors - cameras are not enough. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).
Unless it's a higher rate than human drivers per mile or hour driven, I do not care. The article doesn't have those stats, so it's clickbait as far as I'm concerned.
Every captcha... can you see the motorcycle? I would be afraid if they wanted all the squares with small babies, or maybe just regular folk... can you pick all the hotties? Which of these are body parts?
On a quick read, I didn't see the struck motorcycles listed. Last I heard, a few years ago, was that this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe this is mostly true for Harleys.
The theory I recall was that this rear light configuration made the Tesla assume it was looking (remember: only cameras, without depth data) at a car that was further down the road, and that acceleration was therefore safe. It miscategorised the motorcycle so badly that it misjudged its position entirely.
Let's get this out of the way: Felon Musk is a nazi asshole.
Anyway, it should be criminal to do these comparisons without showing human-driver statistics for reference. I'm so sick of articles that leave out hard data. Show me deaths per billion miles driven for Tesla, competitors, and human drivers.
Then there's shit like the Boca Raton crash, where they mention the car going 100 in a 45 and killing a motorcyclist, then go on to say the only way to do that is to physically press the accelerator, which disables emergency braking. Is it really a self-driving car at that point, when a user must actively intervene to disable portions of the automation? If you take an action to override stopping, it's not self-driving; stopping is a key function of how self-driving tech self-drives. It's not like the car swerved into another lane and nailed someone. The driver literally did this.
Bottom line: I look at the media around self-driving tech as sensationalist. Danger drives clicks. Felon Musk is a nazi asshole, but self-driving tech isn't made by the guy; it's made by engineers. I wouldn't buy a Tesla unless he had no stake in the business, but I do believe people are far more dangerous behind the wheel in basically all typical driving scenarios.
For what it's worth, it really isn't clear whether this is FSD or AP, given the constant mention of "self driving" even for older collisions when it would definitely have been AP, and which are even listed as AP if you click through the links to the crash reports.
So these may all be AP, or one or two might be FSD; it's unclear.
Every Tesla has AP as well, so the likelihood of that being the case is higher.