But according to this book, that's not going to happen. The author says the real purpose is to get rid of the skilled drivers and replace them with underpaid button pushers.
Will they really do that? What will the situation be a few years from now?
I see at least four big problems with having drivers who sit around supervising the AI.
It's a mind-numbing boring task. How does one stay alert when most of the stimulus is gone? It's like a real-life version of Desert Bus, the worst video game ever.
Human skills will deteriorate with lack of practice. Drivers won't have an intuitive sense for how the truck behaves, and when called upon to intervene, they will probably respond late or overreact. Even worse, the AI will call on the human to intervene only for the most complex and dangerous situations. That was a major contributing factor to the crash of Air France 447: the junior pilots were so used to pushing buttons, they had no stick-handling skills for when the automation shut off, and no intuition to help them diagnose why they were losing altitude. We would like to have Captain Sullys everywhere, but AI will lead to the opposite.
The AI will shut off before an impending accident just to transfer the blame onto the human. The human is there to serve as the "moral crumple zone" to absolve the AI of liability. That sounds like a terrible thing for society.
With a fleet of inexperienced drivers, if an event such as a snowstorm deactivates AI on a lot of trucks, the chaos would be worse than it is today.
The AI will shut off before an impending accident just to transfer the blame onto the human.
I may be mistaken, but I thought a law was passed (or maybe it was just an NHTSA order?) requiring manufacturers to report any crash where a self-driving system was in use within 30 seconds of the impact. I believe this was done after word got out that Tesla's FSD was supposedly doing exactly this.
It’s a mind-numbing boring task. How does one stay alert when most of the stimulus is gone? It’s like a real-life version of Desert Bus, the worst video game ever.
Agreed. I don't see any chance humans will be continuously supervising trucks except as some sort of quality assurance system. And there's no reason for the driver to be in the truck for that - let them watch via a video feed so you can have multiple people supervising and give them regular breaks/etc.
Human skills will deteriorate with lack of practice. Drivers won’t have an intuitive sense for how the truck behaves, and when called upon to intervene, they will probably respond late or overreact. Even worse, the AI will call on the human to intervene only for the most complex and dangerous situations. That was a major contributing factor to the crash of Air France 447: the junior pilots were so used to pushing buttons, they had no stick-handling skills for when the automation shut off, and no intuition to help them diagnose why they were losing altitude. We would like to have Captain Sullys everywhere, but AI will lead to the opposite.
I don't see that happening at all. A passenger jet is a special case of nasty: if you slow down or stop, you die. With a truck, on the rare occasion you encounter something unexpected, just have the human go slow. Also, seriously, it's just not that difficult. Right pedal to go, left pedal to stop, steering wheel to turn, and if you screw up, well, maybe you'll damage some panels.
The AI will shut off before an impending accident just to transfer the blame onto the human. The human is there to serve as the “moral crumple zone” to absolve the AI of liability. That sounds like a terrible thing for society.
So you're thinking a truck sees that it's about to run a red light, and transfers control to a human who wasn't paying attention? Yeah I don't see that happening. The truck will just slam on the brakes. And it will do it with a faster reaction time than any human driver.
With a fleet of inexperienced drivers, if an event such as a snowstorm deactivates AI on a lot of trucks, the chaos would be worse than it is today.
Hard disagree. A snowstorm is a lot less problematic when there's no human in the truck who needs to get home somehow. An AI truck will just park until the road is safe. If that means two days stuck in the breakdown lane of a freeway, who cares.
Driving a truck is far more difficult than that.
I'm continually boggled by the fact that any jackass can walk into a U-Haul and drive out with a 30-foot box truck, because those are wildly different to handle than a regular car.
Massively longer stopping distance (something almost no one even leaves in their regular cars), massively wider turning radius, and enough weight that if you make a mistake or lose control, there's a whole lot more destructive capability than you seem to appreciate.
Going down a hill with a loaded box truck requires multiple braking methods beyond just pushing the left pedal. You engine-brake as much as possible and use what's called stab braking, to keep the pads and rotors cool enough that they don't fail.
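To put rough numbers on that braking point (a back-of-envelope sketch; the truck mass, grade, and brake thermal mass below are all illustrative assumptions, not measured specs):

```python
# Rough energy budget for a loaded box truck descending a long grade.
# Every figure here is an illustrative assumption, not a real spec.
G = 9.81                  # gravitational acceleration, m/s^2

mass_kg = 12_000          # loaded box truck (assumed)
descent_m = 300           # e.g. 5 km of road at a 6% grade

# Potential energy that must be dissipated somewhere on the way down.
energy_j = mass_kg * G * descent_m            # ~35 MJ

# Thermal mass of the friction brakes: assume 4 brakes, ~20 kg of
# iron each, specific heat of iron ~450 J/(kg*K).
brake_heat_capacity = 4 * 20 * 450            # J per kelvin

# If the service brakes alone absorb everything:
temp_rise_brakes_only = energy_j / brake_heat_capacity

# If engine braking absorbs, say, 60% of the load (assumed split):
temp_rise_with_engine_brake = 0.4 * energy_j / brake_heat_capacity

print(f"brakes only:         ~{temp_rise_brakes_only:.0f} K rise")
print(f"with engine braking: ~{temp_rise_with_engine_brake:.0f} K rise")
```

Roughly a 980 K rise if the pads take it all, which is far past brake fade, versus about 390 K with most of the load on the engine. The exact numbers don't matter; the point is that the physics forces a different technique than "push the left pedal."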
All of this is multiplied when you go from an automatic-transmission straight box truck to an actual semi, which weighs several times more, usually has a ten-speed manual transmission (and three pedals, not two), plus the whole articulated-trailer aspect.
And despite the extra weight, heavy winds can still blow the things over.
Frankly your cavalier attitude about how easy it is to drive anything is exactly why the roads are so dangerous.
Because nothing I said really touches on how people driving cars interact with trucks or buses on the road. It's a constant stream of getting cut off and having to slam on the brakes because the dipshits don't even know where the edges of their own vehicle are, let alone where mine begins. They don't account for my wildly longer stopping distance, or my extremely limited maneuvering capability, especially at speed, or the simple fact that the larger vehicle will absolutely crush their whole car and everyone in it completely fucking flat.
Driving is absolutely a skill, and like any other, it will atrophy without use.
You assume it will be either the self-driving software in charge or the button pusher taking the wheel. You're not considering that the button pusher might keep a foot on the brake but, instead of taking the wheel, enter some commands.
Take the case where there's a road block ahead: the button pusher has to evaluate whether it's safe to move forward, but instead of taking the wheel he would tell the driving software where to go. In similar cases he'd have to decide whether it's safe to pass around an obstacle or stop there. Even with a burglar trying to get on board, he'd call the police and then give some commands to the driving software.
The idea underlying the question is that in the future the AI, or whatever you want to call it, might always be in charge of the specialized functions, like calculating the right trajectory and turning the wheel, while the human checks the surrounding environment and evaluates the situation. So the AI is never supposed to be deactivated; if it fails, the truck stops until the maintenance team arrives.
As someone who worked there previously, I can confirm that both of your statements are correct. (This has already been publicly shared by Aurora)
There will be nobody in (most of) their trucks.
There will be button pushers remotely to help it in confusing situations or failures.
They've already been operating the trucks near-fully autonomously with safety drivers behind the wheel and copilots in the right seat monitoring the system. They plan to remove both operators from the vehicle completely, eventually.
(Now for some of my own speculation)
Someone else mentioned a mother-goose approach; they may do something similar, but the follow trucks wouldn't need to keep up with the lead truck. The lead truck would just serve as early warning for unexpected road conditions (new construction, for example): the safety driver handles it, and info is sent back quickly to the other trucks on how to handle it, or to pull over and wait for help (the default action if a truck gets confused). It's impossible to require that a convoy stay together in close formation; too many scenarios can split up the trucks even on the highway.
In the event of a mechanical failure, the truck would pull over and wait for a rescue team, which will probably include backup drivers in case it can't resume driving autonomously.
Also, always take timetables with a grain of salt regarding anything related to autonomous vehicles.
My guess is the situation a few years from now will be that an inconsequential percentage of the US trucking fleet will be autonomous, a smaller percentage will have no safety drivers, and the remote operators will still be 1:1 ratio, maybe 1:2 (one operator for 2 trucks), but not the desired 1:10. This tech advances very slowly.
Historically, anything that reduces cost of transporting goods has advanced extremely quickly. The best comparison, I think, is the shipping container.
It took about ten years for shipping containers to go from an invention nobody had heard of to one in use at every major seaport in the world, and about another ten years for virtually all shipping to use that method.
The New York docks, for example, dramatically increased activity (handling several times more cargo per day) while also reducing the workforce by two thirds. I think self-driving trucks will do the same thing: companies/cities/highways that adopt AI will grow rapidly, and any company/city/highway that doesn't support self-driving trucks will suddenly stop being used almost entirely.
Shipping containers were not a simple transition. New ships and new docks had to be built to take advantage of it. A lot of new trucks and trains were also built. Just 20 years to replace nearly all the infrastructure in one of the biggest and most important industries in the world.
I don't disagree with you. There will be a rapid rate of adoption.
But how long before it's capable enough to be adopted? We (as in anybody) don't know. We just know that it's been many many years and they're still not there yet, and just because a few driverless vehicles are operating (in extremely ideal scenarios with lots of help) doesn't mean it's ready for the kind of hockey stick curve that the industry is looking forward to.
It will happen eventually, sure. My prediction was in regards to the OP's question of what will things look like in a few years. I don't think the tech will be ready for mass adoption in just a few years, neither does the author of the article linked.
A serious self-driving vehicle must be able to see its surroundings with different sensors. But then it needs a lot of computing power on board to merge the data streams coming from those different sensors, on top of the computing power required to properly predict the trajectories of dozens of other objects moving around the vehicle. I don't know about the latest models, but I know the Google cars of a few years ago had the boot occupied by big computers with several CUDA cards.
That's not something you can put in a commercial car sold to the public. What you get instead is a car that relies on a single camera to look around and a sensor in the bumper that cuts the engine if triggered, but that doesn't create an additional data stream. Maybe there's a second camera looking down at the lines on the road, but its stream isn't merged with the other; it's just used to adjust the driving commands. I don't even know whether the little onboard computer can compute the trajectories of all the objects around the car. A few sensors and little processing power: that's not enough, and it's not a self-driving car.
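For a sense of what "merging streams" means at the simplest level, here's a toy inverse-variance fusion of two range estimates for one obstacle (a hedged sketch with made-up numbers; real stacks run full Kalman or particle filters over many tracked objects at once):

```python
def fuse(z_a: float, var_a: float, z_b: float, var_b: float):
    """Combine two noisy measurements of the same quantity,
    weighting each by the inverse of its variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * z_a + w_b * z_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Camera says the obstacle is ~52 m away but is noisy (var 4.0 m^2);
# lidar says 50 m and is much tighter (var 0.25 m^2).
dist, var = fuse(52.0, 4.0, 50.0, 0.25)
print(f"fused estimate: {dist:.2f} m (var {var:.3f})")
```

Even this trivial two-sensor, one-object case shows why the fused estimate is better than either sensor alone; doing it per object, per frame, for dozens of tracks is where the onboard compute goes.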
When Tesla sells a car with driving assistance, they tell the customer that it is not a self-driving car, but they fail to explain why, where the difference lies, or how big the gap is. That's one of the reasons we've had so many accidents.
Similar post earlier.
It starts from the same news, but taking the idea from the book in the link it asks something different.
the google cars few years ago had the boot occupied by big computers
But those were prototypes. These days you can get an NVIDIA H100: several inches long, a few inches wide, an inch thick, with 80GB of memory running at roughly 3.5TB/s and on the order of 26 teraflops of compute (for comparison, Tesla's Autopilot runs on a roughly 2-teraflop GPU).
The H100 is designed to be run in clusters, with eight GPUs on a single server, but I don't think you'd need that much compute. You'd have two or maybe three servers, with one GPU each, and they'd be doing the same workload (for redundancy).
They're not cheap... you couldn't afford to put one in a Tesla that only drives 1 or 2 hours a day. But a car/truck that drives 20 hours a day? Yeah that's affordable.
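The amortization argument is easy to sanity-check (the price and lifetime below are placeholder assumptions, not quotes):

```python
# Hypothetical numbers: a ~$30k accelerator amortized over 5 years.
GPU_PRICE = 30_000            # USD, assumed
LIFETIME_DAYS = 5 * 365

def cost_per_hour(hours_per_day: float) -> float:
    """Hardware cost amortized per hour of actual driving."""
    total_hours = LIFETIME_DAYS * hours_per_day
    return GPU_PRICE / total_hours

print(f"commuter car (1.5 h/day):   ${cost_per_hour(1.5):.2f}/h")
print(f"long-haul truck (20 h/day): ${cost_per_hour(20):.2f}/h")
```

Roughly $11 per hour versus well under a dollar: the same hardware that looks absurd in a commuter car is a rounding error against a truck's operating cost.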
It will likely be a mix. E.g. you might have 10 trucks on a particular run. You put a driver in the lead truck, as a human-in-the-loop safety. The rest play duckling to the mother duck.
What it will do is lower the skill level needed, and lower the stress. A driver having a nap isn't a problem anymore. They just need to be able to get involved either if the autopilot has issues and has to stop, or if they need to fill out paperwork at the destination.
The duck-duckling model would probably work okay on the highway, but not so well once you arrive in a town or city. You can't reliably get ten semis through a set of lights in traffic without getting split up. I guess they could have a depot outside of town where human drivers would meet the ducklings for the final leg of the journey.
I believe it's common to have separate long-haul trucks and last-leg trucks. If the depot is right next to the motorway/highway, it provides an obvious place for a handover. It also means drivers can stay in one area and go home each night.
Different companies have different plans. Arizona has had auto-driving trucks on freeways off and on for a couple years now as part of test programs. Always with a driver in the cab though.
A few years ago I would have thought robo-convoys would be where things landed, because three or four companies were working toward that. That's where the front truck has an operator and all the other trucks follow the leader driverlessly.
Now I feel like I have no idea where any of it is going. Step 1 in driverless should have always been to adopt an industry-wide mesh network for all vehicles with level 3 (or higher) autonomy. If I'm on the road with (or inside of) an autonomous vehicle, I want it to be able to get help from every other nearby car if its sensors suddenly die or start feeding it bad data, especially after they've been on the road, poorly maintained by their owners, for a decade or more. If there are autonomous cars, there will eventually be autonomous jalopies that drive like a drunk toddler because they see phantom lidar echoes.
Can't get a train track to every single depot and loading dock in the country that receives shipments (which is like, practically every big box store and warehouse there is). There has to be a handover at some point.
Edit: also not a big fan of the train system in the US, since the vast majority of rail is privately owned. The operators have too much control. They'll charge towns extra to put automated crossing guards on their rail and then keep charging them for its maintenance. The jurisdiction can't use their own third party workers to maintain it. The railroads are legally only required to put up a sign. It's extortion if you ask me.
Just like Mercedes' "full self driving," this sounds like it's limited to routes where there's been extensive testing. I don't expect truck driving to go full auto on arbitrary roads in the next few years. The tech isn't there yet.
I have a feeling funding to self driving truck tech may stall a bit if marijuana rescheduling can change the fact that a single positive piss test can get your CDL revoked for good.
They'll keep someone in the truck for maintenance purposes. A self-driving truck can't change a flat tire, for example, and it would be more efficient to have a human on board change it than to wait for someone to come out and do it.