OpenAI has begun using Google's Tensor Processing Units (TPUs) to power ChatGPT and its other AI products, reducing its reliance on Nvidia's graphics processors. The arrangement with Google Cloud is aimed at lowering inference costs, though OpenAI is reportedly not being given access to Google's most powerful TPU models. The move also sends a strategic signal to Microsoft, OpenAI's largest investor: it diversifies OpenAI's infrastructure suppliers while leveraging competition among the tech giants. Meanwhile, OpenAI is reportedly in discussions with Microsoft about the terms of their partnership and has expanded its compute capacity through a separate deal with Oracle. Google's TPUs, historically reserved for internal use, are now offered to external customers, making them a core part of Google's AI strategy and a direct competitor to Nvidia's offerings. The development carries significant implications for both the AI and cloud-computing sectors.
OpenAI Leases Google TPUs: A Strategic Signal to Microsoft
