OpenAI has signed a multi-year agreement with AI chipmaker Cerebras to expand its computing infrastructure. The deal, valued at over $10 billion, will provide OpenAI with 750 megawatts of high-speed compute capacity by 2028.

The partnership centers on inference, the stage at which AI models generate responses to user requests. The goal is faster, lower-latency interactions, which matter most for applications like code generation and image creation. Cerebras' wafer-scale chip architecture addresses bottlenecks common in traditional GPU systems, with the aim of cutting response times significantly.

OpenAI has described its infrastructure strategy as building a resilient portfolio of compute tailored to specific workloads, and both companies frame the agreement in those terms. Leaders at both firms argue that faster real-time inference could transform how people interact with AI, much as broadband transformed the internet. Founded in 2015, Cerebras competes with established AI hardware makers such as NVIDIA; OpenAI CEO Sam Altman was an early investor in the company.