Cerebras Systems Raises $1.1 Billion Series G at $8.1 Billion Valuation

Cerebras Systems, founded by Andrew Feldman, Jean-Philippe Fricker, Gary Lauterbach, Michael James, and Sean Lie, has raised $1.1 billion in a Series G funding round, bringing its valuation to $8.1 billion. The round was led by Vision Fund 2, with participation from Benchmark, Eclipse and Altimeter Capital.

The funding arrives amid wider recognition that conventional computing architectures are straining under the scale of modern machine learning workloads. Cerebras was built around a simple conviction that computing power must evolve in step with the workloads it serves. That idea is now gaining momentum as industries search for more efficient ways to train increasingly complex models.

“From the beginning, we believed that training large models efficiently required rethinking the computer itself,” said Andrew Feldman, co-founder and CEO of Cerebras Systems. “This funding allows us to deepen that commitment and bring more capacity to customers who are shaping the future of science, language and medicine.”

Rethinking Hardware for Expanding Workloads

For many years, general-purpose graphics processors formed the foundation of AI training. They offered a workable solution to rising computational demands, but over time, their limits became clear. As data sets expanded, distributing workloads across multiple GPUs introduced greater complexity and inefficiency. Cerebras chose a different direction.

Its third-generation Wafer-Scale Engine (WSE-3), the largest single chip ever built, integrates 900,000 AI-optimized cores on one continuous piece of silicon. By constructing a single large processor rather than linking many smaller ones, Cerebras reduces the coordination overhead of multi-chip systems. The result is steadier throughput and simpler scaling for large models.

Feldman described the idea in straightforward terms: “We wanted to build a computer that lets researchers focus on their work instead of spending time managing hardware. Efficiency grows when the system fits the task, not the other way around.”

This design, used in the CS-3 system, supports workloads that demand consistent compute capacity, from government research laboratories to commercial enterprises working on new materials, pharmaceuticals and language systems. It reflects a wider recognition that meaningful progress in computing depends on thoughtful design rather than raw speed alone.

Expanding a Global Compute Network

Cerebras has extended its reach through Condor Galaxy, a network of interconnected AI supercomputers developed with G42. Each system is built around Cerebras hardware and connected through high-speed links, creating a distributed platform for large-scale training. Rather than selling hardware in isolation, the company offers access to full systems, giving organizations the ability to use high-performance computing without building data centers of their own.

This approach has made advanced computing more accessible to research groups and enterprises that need large-scale capacity but prefer flexibility. It also creates a framework where hardware, software and service evolve together. With the new funding, Cerebras plans to expand Condor Galaxy across additional regions that are investing in scientific computing and industrial AI.

Feldman emphasized the collaborative nature of this effort: “The growth of Condor Galaxy is about creating shared infrastructure. When institutions have access to the same high-quality compute, they can work faster, compare results and advance their fields together.”

Steady Growth in a Competitive Landscape

What began as a technical experiment in wafer-scale manufacturing has evolved into a commercially viable system, now used by institutions such as Lawrence Livermore National Laboratory and several biotechnology firms. These deployments involve workloads that demand large-scale computation while keeping system complexity manageable.

Cerebras’ trajectory also reflects broader trends in the computing industry. Organizations in fields such as energy, life sciences, and advanced manufacturing are increasingly using data-driven methods, creating demand for compute platforms capable of handling large-scale workloads.

Feldman described the company’s next phase in pragmatic terms: “We set out to build tools that make complex computation straightforward. Each stage of our growth gives us more opportunity to serve the communities using these systems for meaningful work.”


Mansi Mistri
Mansi Mistri is a Content Writer who enjoys breaking down complex topics into simple, readable stories. She is curious about how ideas move through people, platforms, and everyday conversations. You can reach out to her at mansi.mistri@aimmediahouse.com.
November 14, 2025
