Andrew Feldman rang the Nasdaq bell in New York on Thursday and left with a number neither his prospectus nor his bankers had priced. Cerebras sold 30 million shares at $185. The first trade came at $350; the close, after a brief run to $386, was $311.07. CNBC put the fully diluted valuation near $95 billion.

The SEC prospectus gives the other side of the trade: $510.0 million of 2025 revenue after $290.3 million in 2024. Cerebras asked public investors to finance an OpenAI-backed inference buildout before that new customer mix has shown up cleanly in revenue. Investors bought a May roadshow that started with a $115 to $125 range, priced at $185, and finished its first session far above the $56.4 billion IPO valuation.

What the pop priced

CNBC's comparison was harsh on a revenue basis. Alibaba ended its first day above $231 billion after producing $5.5 billion of annual revenue. Facebook ended near $104 billion after $3.7 billion. Cerebras ended near $95 billion after $510.0 million in 2025 sales.

PitchBook's private-market bridge was just as steep. Three months before the listing, Cerebras raised Series H money at $23 billion; at $185 a share, the IPO valued it at $56.4 billion. By the close, investors had added roughly another two-thirds. One PitchBook source said orders were "20x oversubscribed." Nicholas Smith of Renaissance Capital told Reuters that the valuation "looked reasonable" at the IPO price on 2028 sales and EBITDA metrics. At the trading price, he said, "it is quite high even out to 2028."

Where the customer risk moved

Cerebras used the IPO to show that its G42-specific problem had changed shape, even as customer concentration remained. In 2024, G42 accounted for 85% of revenue and helped stall the original listing under CFIUS scrutiny. In the refreshed filing, G42 was down to 24% of 2025 revenue, while Mohamed bin Zayed University of Artificial Intelligence accounted for 62%. Morningstar put the combined UAE share at 86%.

"There's some whales out there," Feldman told CNBC. "That is one of the characteristics of this market."

Feldman said the university work involved training "English-Arabic models" and called MBZUAI "the first university set up and dedicated to training AI practitioners." Cerebras calls its Wafer-Scale Engine 3 the "world's largest and fastest commercialized AI processor." Revenue still depends on a few buyers.

The OpenAI agreement gives Cerebras more than $20 billion of contracted compute, 750 megawatts of planned capacity, a $1.0 billion working-capital loan and warrants for up to 33.4 million Class N shares. Amazon Web Services adds a term sheet for inference infrastructure and a warrant commitment for up to 2.7 million shares. The OpenAI deal gave the IPO a large U.S. customer story tied to future capacity delivery.

Why the chip has to carry the story

Cerebras' pitch starts with an object roughly the size of a children's picture book. The company says the WSE-3 has 4 trillion transistors, 900,000 cores and 44 gigabytes of on-chip memory. It says the chip is 58 times larger than Nvidia's B200, with 2,625 times more memory bandwidth, and can deliver inference up to 15 times faster than leading GPU-based systems on benchmarked open-source models.

That is why the company keeps pulling OpenAI into the story. Business Insider reported that Sam Altman owned 89,373 Cerebras shares, worth around $30 million at the $350 opening price, while OpenAI received stock warrants tied to the partnership. OpenAI is a customer, lender, technical partner and potential shareholder. Greg Brockman wrote in 2017 that "exclusive access to Cerebras hardware" would give OpenAI an "overwhelming hardware advantage over Google."

Feldman told Fortune, "We're not in a situation like Field of Dreams," adding that Anthropic and OpenAI have more demand than compute. He told the Journal that "you can switch from a workload on Nvidia to a workload on Cerebras in about 10 keystrokes." For investors, that claim is where the quarterly filings begin.

What the first filing must show

The prospectus separates signed stories from recognized revenue. OpenAI selected Cerebras as a fast inference solution, AWS signed a binding term sheet, and the AWS definitive agreements still need to be negotiated. Cerebras reported a 2025 non-GAAP net loss of $75.7 million, compared with $21.8 million in 2024, even as GAAP net income reached $237.8 million after the prior year's $481.6 million loss.

On customer concentration, the 2025 filing still points back to the UAE. MBZUAI and G42 supplied 86% of revenue.

The quarterly report will have to show how much of the OpenAI and AWS story turns into recognized revenue, how much dilution the warrants create, and whether WSE-3 speed claims are winning inference work at scale. Morningstar's Brian Colello put the risk simply: "the OpenAI deal is critical."

The next answer will not come from the Nasdaq screen. It will come in a quarterly filing: revenue recognized, warrants counted, and the first public measure of the $95 billion promise.

Frequently Asked Questions

Why did Cerebras shares jump in the Nasdaq debut?

Investors bought a pure-play AI chip story tied to inference demand, OpenAI capacity and AWS distribution. The IPO priced at $185 after a $115 to $125 range, then closed at $311.07.

How large was the Cerebras IPO?

Cerebras sold 30 million shares at $185, raising $5.55 billion. The stock closed 68% above the offer price, giving the company a roughly $95 billion fully diluted valuation.
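The headline figures are simple multiplication. A minimal sketch of the arithmetic, using only the numbers cited in this article:

```python
# Back-of-envelope check on the IPO figures cited above.
shares_sold = 30_000_000
offer_price = 185.00   # IPO price per share
close_price = 311.07   # first-day closing price

gross_proceeds = shares_sold * offer_price       # dollars raised before fees
first_day_gain = close_price / offer_price - 1   # close vs. offer price

print(f"Gross proceeds: ${gross_proceeds / 1e9:.2f}B")
print(f"First-day gain: {first_day_gain:.0%}")
```

Gross proceeds here are before underwriting fees, and the fully diluted valuation depends on the total share count, including warrants, which this sketch does not model.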

Why does OpenAI matter so much to the story?

OpenAI signed a deal valued at more than $20 billion, funded a $1.0 billion working-capital loan and received warrants. That makes OpenAI central to the revenue and dilution questions.

What is the customer concentration risk?

G42 fell from 85% of 2024 revenue to 24% in 2025, but Mohamed bin Zayed University of Artificial Intelligence accounted for 62%. Morningstar put the combined UAE share at 86%.

What should investors watch after the IPO pop?

The next test is quarterly reporting: recognized revenue from OpenAI and AWS, warrant dilution, finalized AWS terms and evidence that Cerebras' speed claims translate into durable inference demand.

AI-generated summary, reviewed by an editor. More on our AI guidelines.

Editor-in-Chief and founder of Implicator.ai. Former ARD correspondent and senior broadcast journalist with 10+ years covering tech. Writes daily briefings on policy and market developments. Based in San Francisco. E-mail: [email protected]