Cerebras to supply large-scale AI compute for OpenAI
New OpenAI–Cerebras partnership boosts AI infrastructure.
OpenAI has agreed to purchase up to 750 megawatts of computing power from AI chipmaker Cerebras over the next three years. The deal, announced on 14 January, is expected to be worth more than US$10 billion and will support ChatGPT and other AI services.
Cerebras will provide cloud services powered by its wafer-scale chips, which are designed to run large AI models more efficiently than traditional GPUs. OpenAI plans to use the capacity primarily for inference and for compute-intensive reasoning models.
Cerebras will build or lease data centres filled with its custom hardware, with computing capacity coming online in stages through 2028. OpenAI said the partnership would help improve the speed and responsiveness of its AI systems as user demand continues to grow.
The deal is also essential for Cerebras as it prepares for a second attempt at a public listing, following a 2025 IPO that was postponed. Diversifying its customer base beyond major backers such as UAE-based G42 could strengthen its financial position ahead of a potential 2026 flotation.
The agreement highlights the wider race among AI firms to secure vast computing resources, as investment in AI infrastructure accelerates. However, some analysts have warned that soaring valuations and heavy spending could resemble past technology bubbles.
