The first statement is not even wholly true. While training does take more power than inference, executing the model (called "inference") still takes much, much more power than a non-AI search algorithm, or really any traditional computational algorithm besides bogosort.
Big Tech weren't doing the best they possibly could on the green energy transition, but they were making substantial progress before LLMs exploded onto the scene, because the value proposition was there: traditional algorithms were efficient enough that the PR gain from going green offset the cost.
Now Big Tech have for some reason decided to treat LLMs as the biggest gamble ever. The first to find the breakthrough to AGI will win it all and completely take over every IT market, so they need to consume as much power as they can get away with to maximize the probability that the breakthrough happens on their engineers' watch.
Yeah, I ran some image generators on my RTX 2070, and it took a solid minute at full power to produce one image. Sure, that's not a crazy amount, but it's not like it's running on your iPhone.
How much energy does AI really use? (zdnet) Seems like queries aren't that expensive, so I guess the enormous energy cost of AI must be mostly from training. I reckon this is why apologists try to minimize it.
Hmmm, is this true? The fact that AI corpos are talking about standing up brand new power plants just to keep their GPU farms running makes me a little dubious of this claim. I don't remember Ubisoft ever being that desperate for wattage.
If you do the math, which I've posted here, you can see it's true. This is a situation where hundreds of SUV drivers are sitting in traffic, see a large bus belch out a big puff of diesel smoke, and think, "Wow, buses are bad for the environment."
People are bad at math. They don't see their individual contributions add up to really bad things.
GPT-4 had a training cost of $78 million. GTA5 cost $300 million: 4,000 developers, each with the latest GPU, burning hundreds of watts per employee to create the assets. A rough estimate: a 750-watt PC x 4,000 developers x 8 hours a day x 300 days a year x 5 years = 36 gigawatt-hours.
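The back-of-envelope estimate above can be checked in a few lines. All the inputs are the commenter's assumptions (workstation wattage, headcount, hours, days, years), not measured data:

```python
# Sanity-check of the GTA5 development energy estimate.
# Every figure below is an assumed round number from the comment above.
watts_per_pc = 750        # assumed draw of a high-end dev workstation
developers = 4_000
hours_per_day = 8
days_per_year = 300
years = 5

total_watt_hours = (watts_per_pc * developers * hours_per_day
                    * days_per_year * years)
gigawatt_hours = total_watt_hours / 1e9
print(f"{gigawatt_hours} GWh")  # -> 36.0 GWh
```

Multiplying through confirms the 36 GWh figure; the point being made is that the total scales linearly with each assumption, so even halving the wattage or the years still leaves a number in the same ballpark as a large training run.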
Off topic: while trying to find my post where I did the math I discovered that Blue Morpho is the name of an AI company. God damn it. AI ruined my Venture Bros reference.
Seems more like "putting things in scale" than "whataboutism." I'm not sure I agree with the premise, but I don't think it's whataboutism at all. Whataboutism would be "it's fine, because something else is worse," whereas I think the commenter is trying to say "it's not much, since it's less than something else that isn't much either."
People do bad things for the environment all the time. It's hypocritical to drive around in a big SUV (a 4080-class gaming GPU) and mock people driving a minivan on the grounds that cars are bad for the environment.