342 Transistors for Every Person In the World: Cerebras 2nd Gen Wafer Scale Engine Teased
One of the highlights of Hot Chips from 2019 was the startup Cerebras showcasing its product – a large 'wafer-scale' AI chip that was literally the size of a wafer. The chip itself was rectangular, but it was cut from a single wafer, and contained 400,000 cores and 1.2 trillion transistors across 46,225 mm² of silicon, built on TSMC's 16 nm process.
[...] Obviously when doing wafer scale, you can't just add more die area, so the only way to scale is to optimize die area per core and take advantage of smaller process nodes. That means on TSMC 7nm, there are now 850,000 cores and 2.6 trillion transistors. Cerebras had to develop new technologies to deal with multi-reticle designs, but it succeeded with the first generation and carried those lessons over to the new chip. We're expecting more details about this new product later this year.
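The headline figure and the gen-over-gen scaling are easy to sanity-check. A minimal sketch, assuming a ~7.6 billion world population (a 2020 estimate not stated in the article); the chip figures are from the article itself:

```python
# Back-of-the-envelope check of the headline claim and the scaling
# from the 1st-gen to the 2nd-gen Wafer Scale Engine.

WSE1_TRANSISTORS = 1.2e12   # 1st gen, TSMC 16 nm (from the article)
WSE1_CORES = 400_000
WSE2_TRANSISTORS = 2.6e12   # 2nd gen, TSMC 7 nm (from the article)
WSE2_CORES = 850_000
WORLD_POPULATION = 7.6e9    # assumed ~2020 estimate, not from the article

per_person = WSE2_TRANSISTORS / WORLD_POPULATION
print(f"transistors per person: {per_person:.0f}")   # ~342, matching the headline

# Both cores and transistors roughly double moving from 16 nm to 7 nm
# on the same ~46,225 mm^2 of silicon.
print(f"transistor scaling: {WSE2_TRANSISTORS / WSE1_TRANSISTORS:.2f}x")
print(f"core scaling:       {WSE2_CORES / WSE1_CORES:.2f}x")
```

The transistor count grows ~2.17x while core count grows ~2.12x, consistent with the density gain expected from a 16 nm to 7 nm node shrink at fixed die area.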
Previously: Cerebras "Wafer Scale Engine" Has 1.2 Trillion Transistors, 400,000 Cores
Cerebras Systems' Wafer Scale Engine Deployed at Argonne National Labs
(Score: 2) by driverless on Wednesday August 19 2020, @02:58AM (1 child)
I was at the Hot Chips presentation when they introduced this... the response from industry professionals was a near-universal WTF. It just doesn't make sense to pile everything onto one monster device when you can avoid the near-insurmountable engineering problems just by going with many smaller ones. Even Cerebras admitted that every time someone's tried WSI it's failed, "but this time it's different". Surprised to see they're still around; presumably they're relying on a couple of near-infinite-budget national-lab customers to keep going.
(Score: 3, Informative) by takyon on Saturday August 22 2020, @02:59AM
Cerebras Wafer Scale Engine News: DoE Supercomputer Gets 400,000 AI Cores [anandtech.com]