Google’s Gemini AI shows impressive energy efficiency, using only 0.24 watt-hours (Wh) per text prompt, significantly lower than earlier estimates, though still well above a traditional Google search, which consumes about 0.03 Wh. Research highlights that 58% of this energy is consumed by AI chips, with the remainder drawn by server CPUs, memory, cooling systems, and backup equipment. These figures mark a substantial improvement, but they do not account for the energy-intensive training phase of AI models, which a more comprehensive sustainability assessment would need to include. Google has reduced per-prompt energy use by enhancing hardware, such as its latest TPU, Ironwood, and by optimizing algorithms. Even so, the growing global adoption of AI tools could strain power grids, prompting calls for industry-wide transparency and sustainability measures. As demand rises, shifts in consumer behavior and more data from AI providers will be vital for minimizing the environmental impact of AI technologies.
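For scale, the per-prompt figures above can be combined in a quick back-of-the-envelope calculation. This sketch uses only the numbers cited in the summary (0.24 Wh per prompt, a 58% AI-chip share, 0.03 Wh per search); the breakdown of the remaining 42% across CPUs, memory, cooling, and backup is not specified, so it is lumped together here.

```python
# Back-of-the-envelope check of the per-prompt energy figures cited above.
GEMINI_WH_PER_PROMPT = 0.24   # Wh per Gemini text prompt (as reported)
SEARCH_WH_PER_QUERY = 0.03    # Wh per traditional Google search (as reported)
AI_CHIP_SHARE = 0.58          # fraction of prompt energy used by AI chips

chip_wh = GEMINI_WH_PER_PROMPT * AI_CHIP_SHARE
other_wh = GEMINI_WH_PER_PROMPT - chip_wh  # CPUs, memory, cooling, backup (lumped)
ratio = GEMINI_WH_PER_PROMPT / SEARCH_WH_PER_QUERY

print(f"AI chips:        {chip_wh:.3f} Wh")
print(f"Everything else: {other_wh:.3f} Wh")
print(f"One prompt is roughly {ratio:.0f}x one search")
```

On these figures, one Gemini prompt draws roughly eight times the energy of a single search, with a bit under 0.14 Wh attributable to the AI chips themselves.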