This is the best summary I could come up with:
All those AI models powering email summaries, regicidal chatbots, and videos of Homer Simpson singing nu-metal are racking up a hefty server bill, measured in megawatt-hours.
(Judy Priest, CTO for cloud operations and innovations at Microsoft, said in an email that the company is currently “investing in developing methodologies to quantify the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application.” OpenAI and Meta did not respond to requests for comment.)
Sasha Luccioni, an AI researcher at Hugging Face who’s authored several papers examining AI energy usage, suggests this secrecy around power consumption is partly due to competition between companies but is also an attempt to deflect criticism.
After a model is trained, it’s rolled out to consumers, who use it to generate output, a process known as “inference.” Last December, Luccioni and colleagues from Hugging Face and Carnegie Mellon University published a paper (currently awaiting peer review) that contained the first estimates of inference energy usage of various AI models.
Luccioni and her colleagues tested ten different systems, from small models producing tiny 64 × 64 pixel pictures to larger ones generating 4K images, and found a huge spread in energy used per task.
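For anyone curious what measuring per-inference energy looks like in practice, here's a minimal sketch using the open-source CodeCarbon tracker, the kind of tooling used in studies like this, wrapped around a Hugging Face pipeline. The model, prompt, and token count are illustrative assumptions on my part, not details from the paper:

```python
# Minimal sketch: meter one inference call with CodeCarbon.
# Assumes `pip install codecarbon transformers torch`; the model and
# prompt below are placeholders, not what the study actually tested.
from codecarbon import EmissionsTracker
from transformers import pipeline

# A small text-generation model, chosen only so this runs on modest hardware.
generator = pipeline("text-generation", model="gpt2")

tracker = EmissionsTracker()
tracker.start()
generator("The energy cost of AI is", max_new_tokens=50)
emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent for the run

print(f"Estimated emissions for one inference: {emissions_kg:.6f} kg CO2eq")
```

In practice, researchers average over many calls, since a single inference is noisy and dominated by hardware idle draw.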
A recent report by the International Energy Agency offered similar estimates, suggesting that electricity usage by data centers will increase significantly in the near future thanks to the demands of AI and cryptocurrency.
The original article contains 1,603 words; the summary contains 232 words. Saved 86%. I'm a bot and I'm open source!