Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.
Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.
The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
All these years, I always thought all self-driving cars used LiDAR or something similar to see in 3D and through fog. How was this allowed on the roads for so long?
Honestly all the fails with the kid dummy were a way bigger deal than the wall test. The kid ones will happen a hundred times more than the wall scenario.
Some sort of radar or lidar should 100% be required on autonomous cars.
Anyone with half a brain could tell you plain cameras are a non-starter. This is nearly a Juicero-level blunder. Tesla is not a serious car company, nor a serious tech company. If markets were rational, this would have been the end for Tesla.
I love that one of the largest YouTubers is the one who did this. Surely somebody near our federal government will throw a hissy fit if he hears about this, but Mark's audience is ginormous.
The rain test was far more concerning because it's a much more realistic scenario. Both a normal person and the lidar would've seen the kid and stopped, but the cameras and image processing just aren't good enough to make out a person in the rain. That's bad. The test portrays it as a person in the middle of a straight road, but I don't see why the same thing wouldn't happen at a crosswalk or any other place where pedestrians are often in the path of a vehicle. If an autonomous system cannot reliably make out pedestrians in the rain, that alone should be enough to prevent these vehicles from being legal.
This is like the crash on a San Francisco bridge that happened when a Tesla went into a tunnel and wasn't sure what to do as it went from bright daylight to darkness. In that case, the Tesla suddenly merged lanes, then immediately stopped and caused a multi-car pileup.
There's a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:
Congress will pass a law that makes NOBODY liable -- as long as a human wasn't involved in the decision making process during the incident.
This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can't be held liable. 🤷🏻‍♂️
Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!
I saw the video pop up in my YouTube recommendations, but didn't bother watching because I just assumed that any cars tested would be using LiDAR and thus would ignore the fake road just fine. I had no idea Tesla a) was still using basic cameras for this, and b) actually had sophisticated enough "self-driving" capabilities that this could be tested on them safely.
Can this be solved with just cameras, or would it need additional hardware? I know they removed LiDAR, but I thought that would only be effective at short range and wouldn't be too helpful at 65 km/h.