Intel has teased* plans to return to the discrete graphics market in 2020. Now, some of those plans have leaked. Intel's Xe branded GPUs will apparently use an architecture capable of scaling to "any number" of GPUs packaged together in a multi-chip module (MCM). The "e" in Xe is meant to represent the number of GPU dies, with one of the first products being called X²/X2:
Developers won't need to worry about optimizing their code for multi-GPU; OneAPI will take care of all that. This will also allow the company to beat the foundry's usual lithographic limit on die size, which is currently in the range of ~800mm². Why have one 800mm² die when you can have two 600mm² dies (the smaller the die, the higher the yield) or four 400mm² ones? Armed with OneAPI and the Xe macroarchitecture, Intel plans to ramp all the way up to octa (8-die) GPUs by 2024. From this roadmap, it seems like the first Xe class of GPUs will be X2.
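For a rough sense of why smaller dies yield better, here's a minimal sketch using the classic Poisson yield model, Y = e^(−D·A). The defect density below is an assumed illustrative figure, not anything from Intel or its foundry:

    /* Back-of-the-envelope die yield under the Poisson defect model. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double defects_per_cm2 = 0.1;        /* assumed defect density */
        const double die_mm2[] = { 800.0, 600.0, 400.0 };

        for (int i = 0; i < 3; i++) {
            double area_cm2 = die_mm2[i] / 100.0;  /* mm^2 -> cm^2 */
            double yield = exp(-defects_per_cm2 * area_cm2);
            printf("%4.0f mm^2 die: ~%2.0f%% yield\n", die_mm2[i], 100.0 * yield);
        }
        return 0;
    }

With those assumptions it prints roughly 45%, 55%, and 67% good dies for the 800, 600, and 400mm² cases respectively, which is the whole economic case for stitching several small dies together rather than printing one monster.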
The tentative timeline for the first X2 class of GPUs was also revealed: June 2020. This will be followed by the X4 class sometime in 2021. It looks like Intel plans to add two more cores [dies] every year, so we should have the X8 class by 2024. Assuming Intel has the scaling solution down pat, it should actually be very easy to scale these up. The only concern here would be packaging yield, which Intel should be more than capable of handling, and binning should take care of any wastage quite easily. Neither NVIDIA nor AMD has yet gone down the MCM path for GPUs, and if Intel can truly deliver on this design then the sky's the limit.
AMD has made extensive use of MCMs in its Zen CPUs, but will reportedly not use an MCM-based design for its upcoming Navi GPUs. Nvidia has published research into MCM GPUs but has yet to introduce products using such a design.
Intel will use an MCM for its upcoming 48-core "Cascade Lake" Xeon CPUs. It is also planning to use "chiplets" in other CPUs, mixing big and small CPU cores and/or cores made on different process nodes.
*Previously: Intel Planning a Return to the Discrete GPU Market, Nvidia CEO Responds
Intel Discrete GPU Planned to be Released in 2020
Intel Announces "Sunny Cove", Gen11 Graphics, Discrete Graphics Brand Name, 3D Packaging, and More
Related: Intel Integrates LTE Modem Into Custom Multi-Chip Module for New HP Laptop
Intel Promises "10nm" Chips by the End of 2019, and More
(Score: 0) by Anonymous Coward on Sunday March 31 2019, @11:06PM (6 children)
Not the day for units on Soylent, is it? Diamond-forming pressures given in quasi-elephants per square fingernail, and now we have chips almost three feet across.
800mm is 0.8 meters (0.91 meters is 3 feet). Either Intel have a truly astounding die process with 800m wafers from which they cut dozens of chips, or something has gone squirrelly. A single chip on an 800mm die would not quite fit inside my laptop, or any computer I've ever owned. And they named them "micro" processors... :)
(Score: 2) by takyon on Sunday March 31 2019, @11:18PM (1 child)
I fixed it before you commented.
(Score: 2) by EvilSS on Sunday March 31 2019, @11:40PM
(Score: 2) by PartTimeZombie on Monday April 01 2019, @12:07AM
I thought that was just the "Texas" option, because everything's bigger in Texas.
(Score: 2) by Snotnose on Monday April 01 2019, @12:26AM (2 children)
Not in my experience. Granted, I'm a software engineer, but I spent a lot of my life verifying newborn silicon. The closer to the cutting edge, i.e. the smaller the die, the more weird problems you run across and, in general, the lower the yield for any given die in its first incarnations.
You want weird? Had an SoC (system on a chip) that would trigger the dead-man timer every 1-5 days. Have fun troubleshooting that. Boss put me on it because I had the most hardware knowledge. I had the "golden" laptop that triggered the issue most, and a bunch of "do this, it dies" reports from assorted folks. Took me 2 weeks (mostly thumb twiddling), but I tracked it down to a write of a particular register. Nothing to do with the laptop, nothing to do with the "do this".
The pisser on that one was we were short of JPEG debuggers, so while waiting for days for the problem to hit I literally had nothing to do. There was a flash game about mining stuff that I got really good at. My boss knew, his boss knew, and I spent 8 hours a day playing some stupid flash game because without a debugger I was useless.
Best part? Commented out the offending line, then waited a week to see if the system crashed. It didn't (it was a debug register the hardware folks used, but did nothing critical). I felt good that I'd found the problem, but wouldn't have bet anything on it. When a crash happens anywhere from 1 hour to 1 week apart, it's hard to have confidence you've found the problem, even if you have rock solid evidence.
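To put a rough number on that intuition, a quick sketch assuming the crashes arrive as a Poisson process (an assumption for illustration, not anything measured at the time): if the bug fires on average once every τ days, the probability of a quiet stretch of t days even though the bug is still present is

    P(\text{no crash in } t \text{ days} \mid \text{bug present}) = e^{-t/\tau}

so with τ = 5 days (the slow end of that 1-5 day range), a crash-free week still leaves e^(−7/5) ≈ 0.25, a one-in-four chance the bug is simply lying low.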
How did I find it? A rolling array of where the code went. 256 bytes. In the code I put checkpoints that wrote to the array. When the system crashed I could bring up the memory controller and read my array. Narrowed things down to one "you have got to be kidding me" write instruction; commenting that out solved the problem.
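For anyone who hasn't used the technique, here's a minimal sketch of that kind of rolling checkpoint buffer in bare-metal-style C. All names are hypothetical; the original code was never posted:

    /* 256-byte rolling trace of the last checkpoints the code hit. */
    #include <stdint.h>

    #define TRACE_SIZE 256

    static volatile uint8_t trace_buf[TRACE_SIZE];
    static volatile uint8_t trace_idx;   /* a uint8_t index wraps at 256 for free */

    /* Drop a one-byte ID at interesting points in the code path. */
    static inline void checkpoint(uint8_t id)
    {
        trace_buf[trace_idx++] = id;
    }

    /* Example instrumentation of a suspect code path. */
    void suspect_path(void)
    {
        checkpoint(0x10);
        /* ... normal work ... */
        checkpoint(0x11);
        /* write_reg(DEBUG_REG, val);   <- a stand-in for the eventual culprit */
        checkpoint(0x12);
    }

After the watchdog fires, you read trace_buf out over the debug port; the entries just behind trace_idx are the last checkpoints the code hit before it died.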
Bad decisions, great stories
(Score: 2) by takyon on Monday April 01 2019, @12:47AM
They are comparing large GPU to smaller GPU, not Qualcomm SoC to whatever. From a linked older article:
https://wccftech.com/nvidia-future-gpu-mcm-package/ [wccftech.com]
Here's the Zen 2 chiplet + I/O die (estimated sizes):
https://www.anandtech.com/show/13829/amd-ryzen-3rd-generation-zen-2-pcie-4-eight-core [anandtech.com]
So for a 64-core Epyc, there should be eight of the chiplets plus an I/O die (the larger version, I think). CPU dies tend to be much smaller than GPU dies.
(Score: 0) by Anonymous Coward on Tuesday April 02 2019, @12:01PM
>we were short of JPEG debuggers
JTAG, hopefully
(Score: 0) by Anonymous Coward on Monday April 01 2019, @12:23AM (1 child)
Why settle for a single undocumented attack vector? Intel and partner NSA announce OneIME, which allows seamless scaling to 8 undocumented telemetry devices at once.
(Score: 2) by takyon on Monday April 01 2019, @12:48AM
It's possible that the Intel GPUs won't have any telemetry/backdoor features, and you could just connect one to some other company's CPU.
So, are you running on open hardware?
(Score: 2) by linkdude64 on Monday April 01 2019, @01:59AM (2 children)
Intel late to the game!
Also just in: We are late to the game, also!
Good luck winning back your consumer confidence after your years of stagnation, Intel.
(Score: 2) by driverless on Monday April 01 2019, @05:15AM (1 child)
They're not late to the game; they've been trying to get in since the i740 twenty years ago (the 82720 doesn't really count, since it was a rebadged NEC design), and they have failed to penetrate anything but the budget market every single time they've tried. This is another attempt that'll fail; they may be big in the CPU world, but they can't compete with nVidia/ATI-AMD, who have been doing this for their entire corporate lives.
(Score: 2) by takyon on Monday April 01 2019, @05:28AM
Intel Larrabee [wikipedia.org] was a failed Intel GPU effort that later became the basis of the "manycore" Xeon Phi [wikipedia.org] chips, which have seen use in supercomputers and machine learning.
https://www.nextplatform.com/2018/07/27/end-of-the-line-for-xeon-phi-its-all-xeon-from-here/ [nextplatform.com]
https://www.theregister.co.uk/2018/06/13/intel_gpus_2020/ [theregister.co.uk]
Xeon Phi was discontinued. In its place, Intel will sell Xeons with lots of cores (like 48-core Cascade Lake, and more cores are sure to be added as Intel expands its use of MCMs to try to compete with AMD's Epyc, Threadripper, and Ryzen) and these new discrete GPUs. Intel sees Nvidia making a lot of money selling GPUs for machine learning, driverless vehicles, etc. and wants a piece of that pie. Even the market for high-end gaming GPUs has been pretty strong, and could remain so if high-spec VR becomes the driver of upgrades. MCMs consist of multiple dies; Intel can pick and choose which ones go into the server/enterprise products, and leave the scrappier ones for the gamers.
(Score: 2) by shortscreen on Monday April 01 2019, @04:59AM (1 child)
A letter X with a number after it. What an original naming scheme. I'm so impressed.
Now that Intel is going to make fancy discrete GPUs, does that mean they can also go back to making CPUs uninfested by their redundant rubbish graphics?
(Score: 2) by takyon on Monday April 01 2019, @05:01AM
https://www.anandtech.com/show/13865/intels-graphics-free-chips-are-also-savings-free-same-price-fewer-features [anandtech.com]