Cerebras Systems Inc.

General Information
Business: (Incorporated in Delaware)

We believe AI is the most transformative technology of our generation.

Our mission is to accelerate AI by making it faster, easier to use, and more energy efficient, thereby making AI accessible around the world.

Cerebras is an AI company. We design processors for AI training and inference. We build AI systems to power, cool, and feed the processors data. We develop software to link these systems together into industry-leading supercomputers that are simple to use, even for the most complicated AI work, using familiar ML frameworks like PyTorch. Customers use our supercomputers to train industry-leading models. We use these supercomputers to run inference at speeds unobtainable on alternative commercial technologies. We deliver these AI capabilities to our customers on premises and via the cloud.

AI compute comprises training and inference. For training, many of our customers have achieved over 10 times faster training time-to-solution compared to leading 8-way GPU systems of the same generation and have produced their own state-of-the-art models. For inference, we deliver over 10 times faster output generation speeds than GPU-based solutions from top CSPs, as benchmarked on leading open-source models. This enables real-time interactivity for AI applications and the development of smarter, more capable AI agents. The Cerebras solution requires less infrastructure, is simpler to use, and consumes less power than leading GPU architectures. It enables faster development and eliminates the complex distributed compute work required when using thousands of GPUs. Cerebras democratizes AI, enabling organizations with less in-house AI or distributed computing expertise to leverage the full potential of AI.

The rise of AI presents a unique set of compute challenges. Unlike other computational workloads, both training and inference require a huge number of relatively simple calculations, the results of which necessitate constant movement to and from memory, and to and from millions or tens of millions of compute cores. This traditionally demands hundreds or thousands of chips, and puts tremendous pressure on memory, memory bandwidth, and the communication fabric linking them all together.

Cerebras started with a simple question: How can we design a processor purpose-built to meet these exact challenges? If we were to start with a clean sheet, how would we avoid carrying forward the tradeoffs made for graphics and other workloads, and ensure that every transistor is optimized for the specific challenges presented by AI?

Our answer is wafer-scale integration. Cerebras solved a problem that was open for the entire 75-year history of the computer industry: building chips the size of full silicon wafers. The third-generation Cerebras Wafer-Scale Engine (the “WSE-3”) is the largest chip ever sold. It is 57 times larger than the leading commercially available GPU. It has 52 times more compute cores, 880 times more on-chip memory (44 gigabytes), and 7,000 times more memory bandwidth (21 petabytes per second). The sheer size of the wafer-scale chip allows us to keep more work on-silicon and minimize the time-consuming, power-hungry movement of data. This enables Cerebras customers to solve problems in less time and using less power. Our AI compute platform combines processors, systems, software, and AI expert services to deliver massive acceleration on even the largest, most capable AI models. It substantially reduces training times and inference latencies, while reducing programming complexity.

Our business model is designed to meet the needs of our customers. Organizations seeking control over their data and AI compute infrastructure can purchase Cerebras AI supercomputers for on-premise deployment. Those that want the flexibility of a cloud-based platform can purchase Cerebras high-performance AI compute via a consumption-based model through the Cerebras Cloud, or via our partner’s cloud. We offer customers the flexibility to choose the solution that best aligns with their budgetary, security, and scalability requirements, and some customers choose to use both options simultaneously.
We have established a growing set of customer engagements spanning CSPs, leading enterprises, Sovereign AI programs, national laboratories, research institutions, and other innovators at the forefront of AI. While a substantial portion of our current business is supported by one primary customer, we are actively seeking to expand our reach and diversify our customer base. We collaborate with our customers to harness the power of AI to tackle their most significant challenges and drive breakthroughs across industries.
Bloomberg Intelligence estimates that the AI market will grow to $1.3 trillion by 2032. Consumer and enterprise models like Google’s Gemini, Meta’s Llama, and OpenAI’s ChatGPT have driven demand for AI infrastructure training and inference solutions, powering AI applications such as specialized assistants, agents, and services. We believe that our AI compute platform addresses a large and growing AI hardware and software opportunity across training and inference, as well as software and expert services. We believe that further adoption of AI, accelerated by the advent of GenAI, and the widespread integration of AI into business processes, will rapidly expand our total addressable market (“TAM”) from an estimated $131 billion in 2024 to $453 billion by 2027, a compounded annual growth rate (“CAGR”) of 51%.
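As a rough sanity check of the stated growth rate (this assumes three full years of compounding from 2024 to 2027; it is not a figure disclosed beyond what is quoted above):

\[
\text{CAGR} = \left(\frac{\$453\text{B}}{\$131\text{B}}\right)^{1/3} - 1 \approx 3.46^{1/3} - 1 \approx 0.51,
\]

which is consistent with the roughly 51% per year cited in the prospectus.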

(Note: Cerebras Systems filed its S-1 on Sept. 30, 2024, without disclosing the terms of its IPO. Some IPO pros estimate that Cerebras Systems’ IPO could raise $1 billion or more. Background: Cerebras Systems submitted confidential IPO documents to the SEC on June 17, 2024.)

Industry: SEMICONDUCTORS & RELATED DEVICES
Employees: 400
Founded: 2016
Contact Information
Address 1237 E. Arques Avenue, Sunnyvale, California 94085
Phone Number (650) 933-4980
Web Address http://cerebras.ai/
View Prospectus: Cerebras Systems Inc.
Financial Information
Market Cap
Revenues $206.4 mil (last 12 months)
Net Income $-116.0 mil (last 12 months)
IPO Profile
Symbol CBRS
Exchange NASDAQ
Shares (millions): 0.0
Price range $0.00 - $0.00
Est. $ Volume $1000.0 mil
Manager / Joint Managers Citigroup/Barclays/UBS Investment Bank/Wells Fargo Securities/Mizuho/TD Cowen
Co-Managers Needham & Co./Craig-Hallum Group/Wedbush Securities/Rosenblatt/Academy Securities
Expected To Trade:
Status: TBA
Quiet Period Expiration Date:
Lock-Up Period Expiration Date:
SCOOP Rating
Rating Change