The Extreme Cost of Training AI Models.
I don't care how they estimate their cost in dollars. I think the cost to all of us in environmental impact would be more interesting.
Unless they're finding exciting new and efficient ways to generate electricity, I imagine it's a linear comparison. Maybe some are worse than others. I know Grok's datacenter in Memphis is relying exclusively on portable gas-powered electric generators that are wrecking havoc on the local environment.
I didn't know that; thanks for sharing.
(BTW, I think you meant wreaking havoc.)
Gas like natural gas? Or gas like gasoline? I'm sure it's the former, but I take nothing for granted anymore.
Maybe this is the push we need to switch to nuclear. The tech is good; it just needs somebody with deeper pockets than coal/gas to lobby for it.
Honestly you can thank decades of anti-nuclear lobbying
I want to see what the long-term economic cost will be after they fired tens of thousands of tech workers hoping to replace us with AI. It feels like workers are always the ones who suffer the most under capitalism.
They'll fire more than that when the AI bubble bursts and they stop pushing so hard into development as it stagnates.
How in the hell is Gemini both two and a half times more expensive and vastly inferior to GPT?
Some claim it's because it was trained on too much data with too little intervention.
Maybe we don't understand what its objective function actually wants?
Maybe it is impeding its users intentionally.
Google sucks
It's obvious that Google didn't pay the crazy AWS prices to train Gemini, seeing how many servers they have in GCP.
You mean they used creative accounting to pay themselves crazy GCP usage bills to deduct from taxes?
Geez, you’d think Gemini would be better than it is if they spent that much on it…
Base model =/= Corpo fine tune
and gemini is still hot ass
because this entire model of AI as an idea is garbage to begin with
trueee
Now imagine if they had to pay for the content they're training the models off of.
Only 80 million dollars for GPT-4? Cheaper than expected.
How is Inflection-2 cheaper to train in the cloud than own hardware?
That probably indicates a problem with the estimates.
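For intuition on how the two estimates can diverge, here's a rough sketch of the two costing approaches. Every number in it is a made-up placeholder for illustration, not a figure from the Epoch post:

```python
# Rough sketch of the two costing approaches, with made-up numbers.
# Cloud estimate: GPU-hours times a rental rate.
# Own-hardware estimate: amortized purchase price over the hardware's
# useful life, plus energy, for the duration of the training run.

gpu_count = 5_000            # hypothetical cluster size
training_days = 60           # hypothetical run length
gpu_hours = gpu_count * training_days * 24

cloud_rate = 2.00            # $/GPU-hour, assumed rental price
cloud_cost = gpu_hours * cloud_rate

gpu_price = 30_000           # $ per accelerator, assumed
useful_life_years = 4        # amortization window, assumed
amortized_per_hour = gpu_price / (useful_life_years * 365 * 24)
energy_per_hour = 0.10       # $/GPU-hour for power and cooling, assumed
hardware_cost = gpu_hours * (amortized_per_hour + energy_per_hour)

print(f"cloud estimate:    ${cloud_cost:,.0f}")
print(f"hardware estimate: ${hardware_cost:,.0f}")
```

With typical assumptions the amortized-hardware figure comes out lower, so a cloud estimate landing below it usually points at the inputs (discounted rates, a short assumed hardware lifetime, or mismatched GPU-hour counts) rather than at cloud renting being genuinely cheaper.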
Humanity: develops nuclear fusion
AI:
It's like the South Park "Now we can finally play the game" bit, but for AI. First we get infinite energy, and then we can train an AI to calculate how we can create infinite energy.
We must consider the benefits of AI and how it can contribute to our lives; I can assure you the prices are worth it. While AI may seem like a game or a useless thing to some, it's actually a useful tool that helps people understand complex concepts that most have a hard time explaining, or won't. Many more things too.
All that shit just needs to be shut down and not revisited again.
“It cost a lot, so it absolutely should be allowed!” is an even dumber excuse to keep it going.
The source didn’t have this detail - Google training Gemini, “cloud” vs “own hardware”. Does Google Cloud not count as “own hardware” for Google?
That's why the bars are so different. The "cloud" price is MSRP
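To make the MSRP point concrete (the rates below are assumptions for illustration, not figures from the source):

```python
# Same GPU-hours, three different price tags, purely illustrative.
gpu_hours = 7_200_000

list_rate = 2.00        # $/GPU-hour at public on-demand (MSRP) pricing, assumed
discount_rate = 1.20    # $/GPU-hour with a committed-use discount, assumed
internal_rate = 0.95    # $/GPU-hour amortized cost to the hardware owner, assumed

for label, rate in [("list (MSRP)", list_rate),
                    ("discounted", discount_rate),
                    ("internal", internal_rate)]:
    print(f"{label:12s} ${gpu_hours * rate:,.0f}")
```

Same compute, very different bar heights depending on which rate you plug in.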
This is an accounting trick as well: a way to shed profit and maximize deductions by having different units within a parent company purchase services from each other.
I realize that my sentence long explainer doesn't shed any light on how it gets done, but funnily enough, you can ask an LLM for an explainer and I bet it'd give a mostly accurate response.
Edit: Fuck it, I asked an LLM myself and just converted my first sentence into a prompt, by asking what that was called, and how it's done. Here's the reply:
From the source:
https://epochai.org/blog/how-much-does-it-cost-to-train-frontier-ai-models