
posted by martyb on Tuesday August 20 2019, @09:11PM
from the not-going-to-fit-in-a-cell-phone dept.

The five technical challenges Cerebras overcame in building the first trillion transistor chip

Superlatives abound at Cerebras, the until-today stealthy next-generation silicon chip company looking to make training a deep learning model as quick as buying toothpaste from Amazon. Launching after almost three years of quiet development, Cerebras introduced its new chip today — and it is a doozy. The "Wafer Scale Engine" is 1.2 trillion transistors (the most ever), 46,225 square millimeters (the largest ever), and includes 18 gigabytes of on-chip memory (the most of any chip on the market today) and 400,000 processing cores (guess the superlative).
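For a rough sense of scale, here is some back-of-envelope division over those figures (a sketch only; the even per-core split of memory is our own arithmetic, not a Cerebras spec):

    # Back-of-envelope figures from the specs above: 1.2 trillion transistors,
    # 46,225 mm^2 of silicon, 18 GB of on-chip memory, 400,000 cores.
    transistors = 1.2e12
    area_mm2 = 46_225
    memory_bytes = 18e9
    cores = 400_000

    print(f"transistor density: {transistors / area_mm2 / 1e6:.0f} M per mm^2")  # ~26 M/mm^2
    print(f"memory per core:    {memory_bytes / cores / 1e3:.0f} KB")            # ~45 KB
    print(f"die edge if square: {area_mm2 ** 0.5:.0f} mm")                       # ~215 mm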

It's made a big splash here at Stanford University at the Hot Chips conference, one of the silicon industry's big confabs for product introductions and roadmaps, with various levels of oohs and aahs among attendees. You can read more about the chip from Tiernan Ray at Fortune and read the white paper from Cerebras itself.

Also at BBC, VentureBeat, and PCWorld.


Original Submission

 
  • (Score: 2) by takyon on Tuesday August 20 2019, @09:27PM (1 child)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday August 20 2019, @09:27PM (#882791) Journal

    https://cdn.wccftech.com/wp-content/uploads/2019/08/cerebras-wse-nvidia-v100-featured-image.jpg [wccftech.com]

    I skimmed the white paper [cerebras.net] and there's nothing to answer your specific questions. But there's one bad typo:

    The Cerebras WSE has 18 Gigabytes of on chip memory and 9.6 bytes of memory bandwidth.

    I think that's supposed to be 9.6 petabytes per second.
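    A quick sanity check on that reading (assuming the intended figure really is 9.6 petabytes per second, which is a guess the paper doesn't confirm): divided across the 400,000 cores it comes out to roughly 24 GB/s per core, which is at least a plausible number for core-local SRAM.

        # Sanity check, assuming the corrected figure is 9.6 petabytes per second
        # (the white paper only says "9.6 bytes", so this is an assumption).
        bandwidth = 9.6e15      # bytes per second
        cores = 400_000
        print(f"per-core bandwidth: {bandwidth / cores / 1e9:.0f} GB/s")  # ~24 GB/s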

    The TechCrunch article does get into packaging:


    The third challenge Cerebras confronted was handling thermal expansion. Chips get extremely hot in operation, but different materials expand at different rates. That means the connectors tethering a chip to its motherboard also need to thermally expand at precisely the same rate, lest cracks develop between the two.

    As Feldman explained, “How do you get a connector that can withstand [that]? Nobody had ever done that before, [and so] we had to invent a material. So we have PhDs in material science, [and] we had to invent a material that could absorb some of that difference.”

    Once a chip is manufactured, it needs to be tested and packaged for shipment to original equipment manufacturers (OEMs) who add the chips into the products used by end customers (whether data centers or consumer laptops). There is a challenge though: Absolutely nothing on the market is designed to handle a whole-wafer chip.

    “How on earth do you package it? Well, the answer is you invent a lot of shit. That is the truth. Nobody had a printed circuit board this size. Nobody had connectors. Nobody had a cold plate. Nobody had tools. Nobody had tools to align them. Nobody had tools to handle them. Nobody had any software to test,” Feldman explained. “And so we have designed this whole manufacturing flow, because nobody has ever done it.” Cerebras’ technology is much more than just the chip it sells — it also includes all of the associated machinery required to actually manufacture and package those chips.

    [...] Cerebras has a demo chip (I saw one, and yes, it is roughly the size of my head), and it has started to deliver prototypes to customers, according to reports. The big challenge, though, as with all new chips, is scaling production to meet customer demand.

    For Cerebras, the situation is a bit unusual. Because it places so much computing power on one wafer, customers don’t necessarily need to buy dozens or hundreds of chips and stitch them together to create a compute cluster. Instead, they may only need a handful of Cerebras chips for their deep-learning needs. The company’s next major phase is to reach scale and ensure a steady delivery of its chips, which it packages as a whole system “appliance” that also includes its proprietary cooling technology.
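    To put a rough number on the thermal-expansion mismatch Feldman describes: using textbook expansion coefficients for silicon and FR-4 and an assumed 50 °C operating swing (both values picked for illustration, not taken from Cerebras), the die and a conventional board would drift apart by roughly a tenth of a millimetre across the wafer, which is why an off-the-shelf connector won't survive it.

        # Rough magnitude of the thermal-expansion mismatch described above.
        # CTE values are standard textbook figures; the 50 K swing is an assumption.
        ALPHA_SILICON = 2.6e-6   # per K, crystalline silicon
        ALPHA_FR4     = 14e-6    # per K, typical FR-4 board, in-plane
        edge_m  = 0.215          # ~215 mm die edge (sqrt of 46,225 mm^2)
        delta_T = 50             # K, assumed temperature rise under load

        mismatch = (ALPHA_FR4 - ALPHA_SILICON) * edge_m * delta_T
        print(f"differential expansion across the die: {mismatch * 1e3:.2f} mm")  # ~0.12 mm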

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by All Your Lawn Are Belong To Us on Tuesday August 20 2019, @10:26PM

    by All Your Lawn Are Belong To Us (6553) on Tuesday August 20 2019, @10:26PM (#882826) Journal

    to original equipment manufacturers (OEMs) who add the chips into the products used by end customers (whether data centers or consumer laptops)

    I'd hate to hold a laptop with a chip like this in my lap.... :O ;)

    --
    This sig for rent.