OpenAI has begun renting artificial intelligence chips from Google to power ChatGPT and its other products, reducing its heavy reliance on NVIDIA GPUs. The deal fits Google's strategy of offering its tensor processing units (TPUs) beyond internal use, a push that has attracted customers such as Apple, as well as startups founded by former OpenAI leaders. The arrangement also marks a significant shift for OpenAI, which has historically depended on Microsoft's data centers. While OpenAI hopes the move will lower its inference costs, Google is reportedly withholding its most powerful TPUs from the company. The details of the arrangement remain unconfirmed, and neither company provided immediate comment.