OpenAI signed a multiyear deal to use hardware from Cerebras Systems Inc. for 750 megawatts' worth of computing power, an alliance that will support the company's rapid build-out of AI infrastructure.
OpenAI will use Cerebras as a supplier of computing to get faster response times when running AI models, according to a joint statement Wednesday. The infrastructure will be built in multiple stages "through 2028" and hosted by Cerebras, the companies said. Though terms weren't disclosed, people familiar with the matter put the size of the deal at more than $10 billion.
"This partnership will make ChatGPT not just the most capable but also the fastest AI platform in the world," said Greg Brockman, OpenAI co-founder and president. This speed will help unlock "the next generation of use cases and onboard the next billion users to AI," he said.
Cerebras, a semiconductor startup, has pioneered a distinctive approach to AI computing built around wafer-scale chips far larger than conventional processors. The company is seeking widespread adoption of its technology in a bid to challenge market leader Nvidia Corp., and it also operates data centers to showcase the capabilities of its hardware and bring in recurring revenue.
High-profile wins like the OpenAI agreement take Cerebras closer to tapping into the tens of billions of dollars being poured into new infrastructure for artificial intelligence computing.
For OpenAI, the pact is just the latest massive data center deal aimed at expanding its computing capacity. It's part of an unprecedented bet by the technology industry that runaway demand for power-hungry AI tools will continue unabated.
In September, Nvidia announced it would invest as much as $100 billion in OpenAI to build AI infrastructure and new data centers with capacity of at least 10 gigawatts of power.
In October, Advanced Micro Devices Inc. said it would deploy 6 gigawatts' worth of graphics processing units over multiple years for OpenAI. A gigawatt is about the capacity of a conventional nuclear power plant.
OpenAI, the maker of ChatGPT and other AI tools, is also developing its own chip with Broadcom Inc.
Cerebras and OpenAI have been exploring the idea of collaborating since 2017, according to the statement. Recent work by Cerebras in support of OpenAI's GPT-OSS-120B model showed it running 15 times faster than "conventional hardware," the companies said.
Cerebras founder and Chief Executive Officer Andrew Feldman said that AI's inference stage -- the process of getting models to respond to queries -- is crucial to the advancement of the technology and that's where his products shine.
"They're choosing a new and different architecture because it's faster and drives value for them," Feldman said in an interview. "This transaction launches us into the big league and launches high-speed inference into the mainstream."
His company is also in talks to raise money ahead of a potential initial public offering.
Cerebras has discussed a new funding round of roughly $1 billion, a person familiar with the matter said earlier this week. The round would value the startup at $22 billion before the investment, said the person.
The Wall Street Journal previously reported on the $10 billion OpenAI-Cerebras deal.