Cerebras Packs 16 Wafer-Scale Chips Into Andromeda "AI" Supercomputer

Accepted submission by takyon at 2022-11-14 21:21:51
Hardware

Hungry for AI? New supercomputer contains 16 dinner-plate-size chips [arstechnica.com]

On Monday, Cerebras Systems unveiled its 13.5 million core Andromeda AI supercomputer for deep learning, reports [reuters.com] Reuters. According to Cerebras, Andromeda delivers more than 1 exaflop (1 quintillion operations per second) of AI computational power at 16-bit half precision.
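
Taken at face value, those headline figures work out to roughly 62.5 petaflops per CS-2 system and about 74 gigaflops per core. A quick back-of-the-envelope check in Python (the even split across systems and cores is an assumption for illustration, not a published Cerebras spec):

total_flops = 1e18          # 1 exaflop at FP16, per Cerebras
num_systems = 16            # CS-2 computers in the cluster
total_cores = 13_500_000    # cores across the cluster

flops_per_system = total_flops / num_systems   # 6.25e16, ~62.5 petaflops per CS-2
flops_per_core = total_flops / total_cores     # ~7.4e10, ~74 gigaflops per core

print(f"{flops_per_system:.3e} FLOPS per CS-2")
print(f"{flops_per_core:.3e} FLOPS per core")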

Andromeda is itself a cluster of 16 Cerebras CS-2 [cerebras.net] computers linked together. Each CS-2 contains one Wafer Scale Engine 2 chip [cerebras.net] (WSE-2), currently the largest silicon chip [computerhistory.org] ever made, at roughly 8.5 inches square and packed with 2.6 trillion transistors organized into 850,000 cores.

Cerebras built Andromeda at a data center in Santa Clara, California, for $35 million. It's tuned for applications like large language models [arstechnica.com] and is already in use for academic and commercial work. "Andromeda delivers near-perfect scaling via simple data parallelism across GPT-class large language models, including GPT-3, GPT-J and GPT-NeoX," writes [cerebras.net] Cerebras in a press release.
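
Data parallelism here means each CS-2 trains on its own slice of every batch and the resulting gradients are averaged across systems, which is why throughput can scale almost linearly with system count. A minimal NumPy sketch of the idea (a toy linear model, not Cerebras software; NUM_WORKERS stands in for the 16 CS-2 systems):

import numpy as np

rng = np.random.default_rng(0)

NUM_WORKERS = 16     # stand-in for the 16 CS-2 systems
BATCH, DIM = 64, 8   # toy batch and model sizes

# Toy linear model y = x @ w with squared-error loss against targets t.
w = rng.normal(size=(DIM,))
x = rng.normal(size=(BATCH, DIM))
t = rng.normal(size=(BATCH,))

def gradient(w, x, t):
    """Mean-squared-error gradient for the toy linear model."""
    err = x @ w - t
    return 2 * x.T @ err / len(t)

# Data parallelism: split the batch into equal shards, compute a gradient
# per shard, then average; with equal shard sizes this is mathematically
# identical to one big-batch gradient, so adding workers adds throughput.
shards_x = np.array_split(x, NUM_WORKERS)
shards_t = np.array_split(t, NUM_WORKERS)
grads = [gradient(w, xs, ts) for xs, ts in zip(shards_x, shards_t)]
avg_grad = np.mean(grads, axis=0)

assert np.allclose(avg_grad, gradient(w, x, t))

In practice the averaging step requires cross-system communication, and "near-perfect scaling" means that step stays cheap relative to the per-shard compute.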

Previously: Cerebras "Wafer Scale Engine" Has 1.2 Trillion Transistors, 400,000 Cores [soylentnews.org]
Cerebras Systems' Wafer Scale Engine Deployed at Argonne National Labs [soylentnews.org]
Cerebras More than Doubles Core and Transistor Count with 2nd-Generation Wafer Scale Engine [soylentnews.org]
The Trillion-Transistor Chip That Just Left a Supercomputer in the Dust [soylentnews.org]


Original Submission