At the India AI Impact summit, OpenAI CEO Sam Altman addressed concerns about AI's resource consumption, particularly the alleged water usage of ChatGPT. He called claims that the chatbot consumes gallons of water per query "completely untrue," noting that data centers have moved away from water-intensive evaporative cooling methods.

Altman acknowledged AI's significant electricity demands and advocated a shift toward sustainable energy sources such as nuclear, wind, and solar power. He argued that comparing AI's energy needs to human consumption is misleading, humorously noting that training a human requires 20 years of energy, and suggested that, by certain comparative metrics, AI may already match human energy efficiency.

Experts nonetheless foresee a surge in AI's cumulative resource use, projecting a 130% increase in water consumption and a corresponding rise in electricity demand by 2050. OpenAI plans to use a more efficient closed-loop water system at its Texas data center.
