Also fairly impressive interception of a moving small target, especially considering they're hidden behind the bus for most of their initial trajectory.
I'm reading the comments about this video, and I think people are missing the point.
It's not about the Tesla running into the kid. It's about the Tesla completely ignoring the FLASHING FUCKING STOP SIGN at the side of the bus, which resulted in it hitting the kid dummy.
This could have been a pedestrian crossing, railroad stop, intersection, etc.
These vehicles aren't "smart" and should not be allowed on the road. Any idiot can have greater awareness than a Tesla.
Oh, now I get it. Didn't know it's not allowed to pass the bus even when it's on the other side of the street.
In our country we teach the kids to not run across the street when they get out of the bus.
Kids will do stupid things sometimes, no avoiding that. In Germany you can pass a stopped bus on the other side of the road, but if it has its hazards on, you can't go faster than walking speed.
I'm a school bus driver and this year we finally got the automatic cameras that catch people going past our red flashers and stop signs. My camera has captured about two to three drivers per day doing this. I would have rather had the automatic machine guns but the camera is a fine second choice.
Edit: the funniest thing I've had happen with the camera so far is one person that came flying past my reds, noticed the lights and stop sign as they were passing me, slammed on their brakes and then backed up past me again while mouthing "I'm so sorry" to me. Yes, they received two tickets for this - and I had nothing to do with it as the cameras are completely automated.
The guy who wrote this is an idiot. Self-driving vehicles should always follow traffic law to the letter, regardless of what normal human behavior is. Human drivers do an immense amount of stupid shit in cars, so imitating them with a computer is extremely dangerous.
If anything, people should impose extra safety requirements on self-driving cars, such as "always 5 km/h below the speed limit", "a full 2 s stop at stop signs", and "slow to 10 km/h below the speed limit before an intersection".
I'll give you my uneducated findings: self driving cars are not ready.
I doubt they will ever be truly ready; they'll eventually just be considered "ready enough". No software will always work without flaws, and when that software controls a car, a minor flaw might mean 20 deaths.
Isn't Waymo in San Francisco completely self driving? And if their own recently released data is anything to go by, it would seem self driving cars are more ready than manually controlled cars. Because people are absolutely awful at driving.
Comparing self-driving cars to American driving standards is kind of a moot point, because American safety standards are so low that death and injury are considered the cost of doing business.
I'd be curious to see how well Waymo performs in a country with far safer road designs and drivers who are better trained and respect the rules of the road more consistently.
Nobody is disputing that a machine that is never distracted and has reaction times down to fractions of a second would make a better driver than even the most skilled human, but Tesla's FSD hardware and software aren't there yet and probably never will be.
There are 40,000 deaths from traffic accidents per year in the US. Only 20 deaths would be a major improvement. Obviously "cars" is a highly irrational discussion, though.
And it's not just the victims who could be spared their lives; there's also the mental toll on those who kill people by accident. Being able to blame a flaw in the software, one that can then be permanently fixed, is a real improvement.
Edit: They also did it in Austin and somewhere else, so same situation in 2 different spots, generating like 4-5x the stories as each one gets repeated in the news cycle
That's shrimply not true. The numbers Tesla releases are heavily cooked.
Had a quick look around, but I didn't manage to find any numbers that weren't either using Tesla's numbers or guessing.
But it's pretty well known that FSD sucks (I've been in a car using it... terrifying af) and that it'll turn itself off before an accident to pass accountability to the driver.
I love how this keeps getting repeated by everyone everywhere
it’ll turn itself off before an accident to pass accountability to the driver.
But both Tesla (5 seconds) and the NHTSA (30 seconds) count any incident where an L2 system was on shortly before the accident as having happened with the system active. So no, they do not use it for that purpose.
You know that video going around a few weeks ago where some dude with FSD on darted across the road into a tree? Well, he got the car's data, and it turns out FSD was disabled due to enough torque being applied to the wheel, which is one of the ways you disable it. He probably nudged the wheel too hard by mistake, or there was a mechanical failure that disabled it. Either way, in the report he got from Tesla the accident was counted as happening with FSD ON, even though it was OFF at the moment he started leaving his lane.
Despite doors blowing off, Boeing planes are safer than human drivers tbh. You'd think tech fans would understand the importance of logic in computers. Red means stop.