Cerebras launches IPO roadshow, targeting $115-$125 per share

2026/05/04 19:17

Cerebras Systems will start pitching its stock to investors on Monday, with plans to sell shares at somewhere between $115 and $125 each, according to someone with knowledge of the plans who spoke to Reuters.

The artificial intelligence chip maker is trying to go public for the second time. The company pulled its first attempt in October last year.

Cerebras reported stronger financial results for the fiscal year ended December 31. Revenue rose to $510 million from $290.3 million the year before, and the company posted earnings of $1.38 per share, compared with a loss of $9.90 per share a year earlier.

Morgan Stanley, Citigroup, Barclays and UBS are handling the stock sale.

The industry is shifting

Cerebras’ strategy is not random. The AI industry is shifting from training new AI models to running them in production, a change that creates an opening for smaller companies competing with Nvidia’s (NASDAQ: NVDA) near-monopoly. As reported by Cryptopolitan, even OpenAI isn’t convinced by Nvidia’s inference hardware.

Running AI models, known as inference, requires different capabilities than training them, which creates openings for specialized chip makers to find their spot in the market. Processing large batches of information needs a different balance of computing power, memory and data-transfer speed than serving an AI chatbot or coding assistant.

This variety in requirements has made the inference market more diverse. Some tasks work better on traditional graphics chips, while others need more advanced equipment.

Nvidia’s purchase of Groq last December for $20 billion shows how this is playing out. Groq built chips packed with fast SRAM memory that could process AI responses faster than standard graphics chips. But the company struggled to scale up because its chips had limited computing power and were built on older technology.

Nvidia solved this problem by splitting the work. It uses its regular graphics chips for the compute-heavy part of generating AI responses, called prefill, while Groq’s chips handle the decode step, which requires less raw computing power but needs quick data access.
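The prefill/decode split described above can be illustrated with a deliberately simplified sketch (not any vendor’s actual API): prefill processes the whole prompt in one compute-heavy batch, while decode generates tokens one at a time and is dominated by how fast the accumulated cache can be re-read — which is why SRAM-heavy chips suit it.

```python
# Toy sketch of disaggregated inference: prefill vs. decode.
# The token values and "cache" are illustrative stand-ins, not a real model.

def prefill(prompt_tokens):
    """Process the full prompt in one large batch.
    Compute-bound: every token attends to every other token,
    so throughput-oriented chips (e.g. GPUs) fit this phase."""
    # Toy "KV cache": remember every token seen so far.
    return list(prompt_tokens)

def decode(kv_cache, steps):
    """Generate tokens one at a time.
    Memory-bandwidth-bound: each step does little math but must
    re-read the growing cache, favoring fast-memory (SRAM) chips."""
    output = []
    for _ in range(steps):
        # Toy next-token rule: last cached token + 1.
        next_token = kv_cache[-1] + 1
        kv_cache.append(next_token)
        output.append(next_token)
    return output

cache = prefill([1, 2, 3])   # one heavy batch pass over the prompt
tokens = decode(cache, 4)    # light, strictly sequential generation
# tokens == [4, 5, 6, 7]
```

In a split-chip deployment, the cache produced by prefill on one accelerator is handed to a different accelerator for decode; the sketch keeps both phases in one process purely for clarity.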

Other big companies are doing something similar. Amazon Web Services announced its own split system shortly after a major tech conference. It combines its custom Trainium chips for prefill work with Cerebras’ wafer-sized chips for decode operations.

Intel joined in too, revealing plans to pair graphics chips with processors from another startup called SambaNova. The graphics chips will handle prefill while SambaNova’s chips tackle decode.

Most of the smaller chip companies have found success with decode work. SRAM memory doesn’t hold much information, but it’s extremely fast. With enough chips, or one very large chip like Cerebras makes, these systems excel at decode tasks. But companies aren’t stopping there.

New technologies challenge split-chip approach

Lumai, another startup, announced this week it built a chip that uses light instead of electricity for the math operations at the core of AI work. This approach uses much less power than traditional chips.

The company expects its upcoming Iris Tetra systems to deliver an exaOPS of AI performance while using just 10 kilowatts of power by 2029.

The chips mix light-based and electrical components, but light handles most of the work during inference. Lumai plans to use these chips first as standalone replacements for graphics chips in batch processing jobs. Later, the company wants to use them for prefill work too.

Not everyone thinks splitting the work between different chips makes sense. Tenstorrent rolled out its Galaxy Blackhole systems this week, and CEO Jim Keller criticized the approach.

“Every company in the industry is pairing up to build the accelerator accelerator accelerator. CPUs run code. GPUs accelerate CPUs. TPUs accelerate GPUs. LPUs accelerate TPUs. And so on. This leads to complex solutions which are unlikely to be compatible with changes in AI models and uses. At Tenstorrent, we thought something more general and simpler would work,” Keller said.
