OpenAI has signed a multi-year agreement with chip startup Cerebras Systems that will provide 750 megawatts of computing power for high-speed AI inference, a deal valued at over $10 billion. Rolling out in phases through 2028, the agreement is intended to make AI model responses faster and more efficient, directly benefiting how users interact with ChatGPT. Cerebras' wafer-scale AI processors can handle inference requests up to 15 times faster than traditional GPU systems.
The deal is part of OpenAI's effort to diversify its chip supply and meet demand from more than 900 million weekly users amid a shortage of computing resources. Alongside Cerebras, OpenAI is working with other chip providers, including Broadcom and AMD. The company generated approximately $13 billion in revenue last year and is preparing for an IPO, seeking to raise funds to support its rapid growth and operational commitments. The partnership positions both companies at the forefront of AI technology innovation.