While the mass adoption of AI has transformed digital life seemingly overnight, regulators have fallen asleep on the job in curtailing AI data centers’ drain on energy and water resources.
The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn't necessary and I'm not convinced it results in a "better search result" for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.
AI is just what crypto bros moved onto after people realized that was a scam. It's immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it's being backed by major corporations because it means fewer employees they have to pay.
I skimmed the article, but it seems to assume that Google's LLMs run on the same hardware as everyone else's. I'm pretty sure Google uses its own TPU chips instead of regular GPUs like everyone else, and those are generally pretty energy efficient.
That, and they don't seem to be considering how many responses are simply cached for identical questions. A lot of Google queries are going to be word-for-word identical just because the search suggestions funnel people into the same phrasing of a question.
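The caching point above can be sketched like this: key a cache on a normalized form of the query so that identical questions (the ones autocomplete funnels everyone toward) never hit the model twice. This is a toy illustration, not Google's actual pipeline; `run_llm` and the normalization rules are assumptions.

```python
# Toy sketch of response caching for identical queries.
# run_llm is a hypothetical stand-in for an expensive model call;
# the normalization rules are illustrative assumptions.

def normalize(query: str) -> str:
    """Collapse trivial variations so equivalent questions share a cache key."""
    return " ".join(query.lower().split())

class CachedAnswerer:
    def __init__(self, run_llm):
        self.run_llm = run_llm
        self.cache = {}
        self.calls = 0  # how many times the expensive model actually ran

    def answer(self, query: str) -> str:
        key = normalize(query)
        if key not in self.cache:
            self.calls += 1
            self.cache[key] = self.run_llm(key)
        return self.cache[key]

# Three phrasings of the same question cost only one model call.
bot = CachedAnswerer(lambda q: f"answer to: {q}")
for q in ["How tall is Everest?", "how tall is everest?", "HOW  TALL IS EVEREST?"]:
    bot.answer(q)
print(bot.calls)  # -> 1
```

The point being: if suggestions push most people into the same few phrasings, the marginal energy cost of a repeated question is close to zero.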
The confounding part is that when I do get offered an "AI result", it's basically identical to the excerpt in the top "traditional search" result. It burns noticeably more time and energy just to repeat what the top of the results already said. I've never seen the AI overview be more useful than the top snippet.
If only they did what DuckDuckGo did: have it pop up only in very specific circumstances, drawing primarily from current summarized Wikipedia information in addition to its existing context, and let the user turn it off completely with one click of a settings toggle.
I find it useful in DuckDuckGo because it's out of the way, unobtrusive, and only pops up when necessary. I've tried using Google with its search AI enabled, and it was the most unusable search engine I've used in years.
To be fair, it was never "hidden", since all of the top five tech companies decided GPUs were the way to go with this monetization.
Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed and a similar reduction in energy, at a fraction of the cost and hassle for cloud providers.
I'm genuinely curious where their penny-pinching went. All these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won't bring them any more income. How do they justify that?
This is terrible. Why don't we build nuclear power plants, roll out a carbon tax, and create incentives for companies to generate their own energy via renewables?
You know the shit that we should have been doing before I was born.
I'm surprised it's only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.
I wonder what the total power consumption of getting to the same information via a regular search is: clicking on multiple links, finding the right page, and extracting the relevant parts, including the energy expended by the human performing the task and everything that surrounds the activity.
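A back-of-envelope version of that comparison might look like the following. Every number here is an illustrative assumption, not a measurement from the article or from Google: suppose an LLM query costs a few watt-hours, a traditional search a fraction of one, and the human spends a few extra minutes clicking links on a ~50 W laptop.

```python
# Back-of-envelope comparison. ALL figures are illustrative assumptions,
# not measured values; swap in real numbers if you have them.

WH_PER_LLM_QUERY = 3.0       # assumed energy for one LLM-generated answer
WH_PER_SEARCH = 0.3          # assumed energy for one traditional search
LAPTOP_WATTS = 50            # assumed client device power draw
EXTRA_MINUTES_BROWSING = 4   # assumed extra time spent clicking through links

# Traditional route: one search plus the device time spent reading pages.
traditional_wh = WH_PER_SEARCH + LAPTOP_WATTS * (EXTRA_MINUTES_BROWSING / 60)

# LLM route: one query, answer read directly from the summary.
llm_wh = WH_PER_LLM_QUERY

print(f"traditional: {traditional_wh:.2f} Wh, llm: {llm_wh:.2f} Wh")
```

Under these (made-up) assumptions the full traditional workflow comes out slightly ahead of the LLM query in energy, which is exactly why the "whole activity" framing matters: the answer flips depending on how long the human spends browsing.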
There are real concerns surrounding AI, I wonder if this is truly one of them or if it’s just poorly researched ragebait.
I would point out that Google has been "carbon neutral" with its data centers for quite some time, unlike others who still rape the environment, ahem, AWS.