A new study reveals significant energy disparities among generative AI models, particularly chatbots, with larger models consuming far more energy without delivering substantial gains in accuracy. According to a Department of Energy report, U.S. data centers could account for up to 12% of national electricity consumption by 2028, up from 4.4%, driven by the proliferation of AI and potentially increasing reliance on fossil fuels. An analysis of 14 large language models found that models with advanced reasoning used more energy per response yet did not significantly outperform smaller models. Because energy sources vary by region, emissions from the same workload can differ dramatically. Maximilian Dauner, the study’s lead author, emphasized that simpler tasks may not require the largest models. As AI becomes increasingly integrated into everyday life, understanding its energy requirements and carbon footprint is crucial amid ongoing climate concerns, especially as larger models yield diminishing performance returns relative to their environmental impact.
Assessing the Environmental Impact of AI Tools Amid Growing Energy Demands
