Unveiling AI’s Energy Consumption: Why Key Insights Remain Under Wraps

Sam Altman, CEO of OpenAI, recently stated that an average ChatGPT query consumes 0.34 watt-hours of energy, roughly what an oven uses in just over one second. With 800 million weekly active users, concerns about the service’s overall energy consumption are rising. However, experts question the significance of Altman’s figure because OpenAI has not disclosed how it was calculated. Sasha Luccioni of Hugging Face critiques its reliability, noting that essential data on the energy impact of AI tools is largely missing. An analysis she co-authored found that 84% of large language model (LLM) usage involves models with no environmental disclosure, leaving consumers unaware of the carbon emissions their queries generate. The information gap is striking: consumers can look up a vehicle’s fuel efficiency, yet they have no comparable insight into the environmental cost of using AI. This situation calls for regulatory attention, as public misconceptions persist about the energy demands of AI relative to traditional tools such as Google search.
