Intel squeezed an AMD graphics chip, RAM and CPU into one module
The new processor integrates a "semi-custom" AMD graphics chip and second-generation High Bandwidth Memory (HBM2), which fills the role that GDDR5 plays in a traditional laptop with discrete graphics.
Intel CPU and AMD GPU, together at last
Summary of Intel's news:
The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD's Radeon Technologies Group – all in a single processor package.
[...] At the heart of this new design is EMIB (Embedded Multi-Die Interconnect Bridge), a small intelligent bridge that allows heterogeneous silicon to quickly pass information in extremely close proximity. EMIB eliminates height impact as well as manufacturing and design complexities, enabling faster, more powerful and more efficient products in smaller sizes. This is the first consumer product that takes advantage of EMIB.
[...] Additionally, this solution is the first mobile PC to use HBM2, which consumes much less power and takes up less space compared to traditional discrete graphics-based designs using dedicated graphics memory, like GDDR5 memory.
takyon: This is more like an "integrated discrete GPU" than standard integrated graphics. It also avoids the need for Intel to license AMD's IP. AMD also needs to make a lot of parts, since its wafer supply agreement with GlobalFoundries penalizes AMD if it buys fewer than a target number of wafers each year.
Also at AnandTech and Ars Technica.
Previously: AMD Stock Surges on Report of Intel Graphics Licensing Deal, 16-Core Ryzen Confirmed
Related: Samsung Increases Production of 8 GB High Bandwidth Memory 2.0 Stacks
Related Stories
Shares of AMD rose 11.6% on Tuesday as Fudzilla reported that Intel would license graphics technologies from AMD after a similar deal with Nvidia expired two months earlier. The deal has not been confirmed.
On the other hand, AMD's 16-core "Threadripper" enthusiast/HEDT CPUs have been confirmed:
With one of the gnarliest CPU codenames we've ever seen, the Threadripper multicore monsters will go head to head with Intel's Broadwell-E and upcoming Skylake-E High-End Desktop (HEDT) CPUs alongside a new motherboard platform that promises expanded memory support and I/O bandwidth. That's likely to take the form of quad-channel RAM and more PCIe lanes, similar to Intel's X99 platform, but AMD is saving further details for its press conference at Computex at the end of May.
AMD's 32-core "Naples" server chips are now known as... "Epyc".
You have seen the launch of 4, 6, and 8-core AMD Ryzen parts. How do you feel about 10, 12, 14, and 16 cores (prices unknown, likely $1,000 or more for 16 cores)?
Previously: CPU Rumor Mill: Intel Core i9, AMD Ryzen 9, and AMD "Starship"
In response to increased demand, Samsung is increasing production of the densest HBM2 DRAM available:
Samsung on Tuesday announced that it is increasing production volumes of its 8 GB, 8-Hi HBM2 DRAM stacks due to growing demand. In the coming months the company's 8 GB HBM2 chips will be used for several applications, including those for consumers, professionals, AI, as well as for parallel computing. Meanwhile, AMD's Radeon Vega graphics cards for professionals and gamers will likely be the largest consumers of HBM2 in terms of volume. And while AMD is traditionally an SK Hynix customer, the timing of this announcement with AMD's launches certainly suggests that AMD is likely a Samsung customer this round as well.
Samsung's 8 GB HBM Gen 2 memory KGSDs (known good stacked die) are based on eight 8-Gb DRAM devices in an 8-Hi stack configuration. The memory components are interconnected using TSVs and feature over 5,000 TSV interconnects each. Every KGSD has a 1024-bit bus and offers up to 2 Gbps data rate per pin, thus providing up to 256 GB/s of memory bandwidth per single 8-Hi stack. The company did not disclose power consumption and heat dissipation of its HBM memory components, but we have reached out [to] Samsung for additional details.
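For reference, the quoted 256 GB/s follows directly from the interface width and per-pin data rate; here is a minimal sanity check in Python, using only the figures from the quote above:

    # Peak bandwidth of one 8-Hi HBM2 stack, from the figures quoted above.
    bus_width_bits = 1024    # interface width per stack
    data_rate_gbps = 2       # data rate per pin, in Gbps

    bandwidth_gbps = bus_width_bits * data_rate_gbps  # gigabits per second
    bandwidth_gbytes = bandwidth_gbps / 8             # gigabytes per second

    print(f"Peak bandwidth per stack: {bandwidth_gbytes:.0f} GB/s")  # 256 GB/s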
Previously:
Samsung Announces Mass Production of HBM2 DRAM
CES 2017: AMD Vega GPUs and FreeSync 2
AMD Launches the Radeon Vega Frontier Edition
An Intel website leaked some details of the Intel Core i7-8809G, a "Kaby Lake" desktop CPU with on-package AMD Radeon graphics and High Bandwidth Memory 2.0. While it is listed as an 8th-generation part, 8th-generation "Coffee Lake" CPUs for desktop users have up to 6 cores (in other words, Intel has been releasing multiple microarchitectures as "8th-generation"). The i7-8809G may be officially announced at the Consumer Electronics Show next week.
The components are linked together using what Intel calls "embedded multi-die interconnect bridge technology" (EMIB). The thermal design power (TDP) of the entire package is around 100 Watts:
Intel at the original launch did state that they were using Core-H grade CPUs for the Intel with Radeon Graphics products, which would mean that the CPU portion is around 45W. This would lead to ~55W left for graphics, which would be at the RX 550 level: 8 CUs, 512 SPs, running at 1100 MHz. It is worth noting that AMD already puts up to 10 Vega CUs in its 15W processors, so with the Intel i7-8809G product Intel has likely gone wider and slower: judging by the size of the silicon in the mockup, this could be more of a 20-24 CU design built within that 55W-75W window, depending on how the power budget is moved around between CPU and GPU. We await more information, of course.
It is rumored to include 4 GB of HBM2 on-package, while the CPU also supports DDR4-2400 memory. Two cheaper EMIB CPUs have been mentioned:
According to some other media, the 8809G will turbo to 4.1 GHz, while the graphics will feature 24 [compute units (CUs)] (1536 [stream processors (SPs)]) running at 1190 MHz, and the HBM2 is 4 GB running at 800 MHz. The same media are also listing the Core i7-8705G (20 CUs, 1000 MHz on 'Vega M GL', 700 MHz on HBM2) and a Core i7-8706G. None of the information from those sources has yet been verified by AnandTech or found on an official Intel webpage.
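For scale, GCN/Vega designs have 64 stream processors per CU, which is where 24 CUs = 1536 SPs comes from. A rough peak-FP32 estimate in Python from those rumored (and unverified) figures, assuming the usual 2 FLOPs per SP per clock for fused multiply-add:

    # Back-of-the-envelope peak compute for the rumored i7-8809G graphics.
    # Assumes GCN/Vega's 64 stream processors per CU and 2 FLOPs per SP
    # per clock (FMA); the CU count and clock are rumors, not confirmed.
    cus = 24
    sps = cus * 64            # 1536 stream processors
    clock_ghz = 1.19          # rumored 1190 MHz clock

    peak_tflops = sps * 2 * clock_ghz / 1000
    print(f"{sps} SPs, ~{peak_tflops:.2f} TFLOPS peak FP32")  # ~3.66 TFLOPS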
Currently available AMD Ryzen Mobile APUs only include 8-10 Vega CUs. These are mobile chips with a maximum TDP of 25 W; no desktop Ryzen chips with integrated graphics have been announced yet.
Previously: Intel Announces Core H Laptop Chips With AMD Graphics and High Bandwidth Memory
Intel isn't just poaching a prominent AMD employee. Intel is planning a return to the discrete GPU market:
On Monday, Intel announced that it had penned a deal with AMD to have the latter provide a discrete GPU to be integrated onto a future Intel SoC. On Tuesday, AMD announced that their chief GPU architect, Raja Koduri, was leaving the company. Now today the saga continues, as Intel is announcing that they have hired Raja Koduri to serve as their own GPU chief architect. And Raja's task will not be a small one; with his hire, Intel will be developing their own high-end discrete GPUs.
[...] Perhaps the only news that can outshine the fact that Raja Koduri is joining Intel is what he will be doing for Intel. As part of today's revelation, Intel has announced that they are instituting a new top-to-bottom GPU strategy. At the bottom, the company wants to extend their existing iGPU market into new classes of edge devices, and while Intel doesn't go into much more detail than this, the fact that they use the term "edge" strongly implies that we're talking about IoT-class devices, where edge goes hand-in-hand with neural network inference. This is a field Intel already plays in to some extent with their Atom processors on the GPU side, and their Movidius neural compute engines on the dedicated silicon side.
However, in what's likely the most exciting part of this news for PC enthusiasts and the tech industry as a whole, aiming at the top of the market means Intel will once again be developing discrete GPUs. The company has tried this route twice before: once in the early days with the i740 in the late 90s, and again with the aborted Larrabee project in the late 2000s. However, even though these efforts never panned out quite like Intel had hoped, the company has continued to develop its GPU architecture and GPU-like devices, the latter embodied by the massively parallel, compute-focused Xeon Phi family.
Yet while Intel has GPU-like products for certain markets, the company doesn't have a proper GPU solution once you get beyond their existing GT4-class iGPUs, which are, roughly speaking, on par with $150 or so discrete GPUs. In other words, Intel has no presence in the midrange market or above with their iGPUs. With the hiring of Raja and Intel's new direction, the company is going to be expanding into full discrete GPUs for what the company calls "a broad range of computing segments."
The boss of AMD's Radeon Technologies Group is leaving the company:
Remember when we reported on the Radeon Technologies Group boss, Raja Koduri, taking a leave of absence with an intent to return to the fold in December? That isn't going to happen, according to a memo Raja has written to his team, because today is his last day in the job.
[...] Our sources tell us that Lisa Su, AMD CEO, will continue to oversee RTG for the foreseeable future. AMD appreciates that such an important role cannot be the sole domain of the CEO, and to this end is actively searching for a successor to Raja. We expect the appointment to be made within a few months.
The rumor mill suggests that Koduri will take a job at Intel, which would come at an interesting time now that Intel is including AMD graphics and High Bandwidth Memory in some of its products.
Update: Intel to Develop Discrete GPUs, Hires Raja Koduri as Chief Architect & Senior VP
Also at HotHardware and Fudzilla.
Previously: Interview With Raja Koduri, Head of the Radeon Technologies Group at AMD
(Score: 2) by MichaelDavidCrawford on Tuesday November 07 2017, @05:17AM (4 children)
... when I got paid, but instead I'll wait until Apple ships a model with this chip in it.
Right around that same time, Intel will announce an even better chip.
When carried to its logical conclusion, this corollary to Zeno's Paradox results in my new MacBook Pro being unobtainable.
While I have a Mac mini - which I truly enjoy, Linux will never be ready for the desktop while Windows is like pounding nails with my fists - I need two Macs to develop drivers, as that's what the two-machine debugger requires.
Yes I Have No Bananas. [gofundme.com]
(Score: 3, Insightful) by Anonymous Coward on Tuesday November 07 2017, @09:04AM (2 children)
For me, Linux is already ready for the desktop (I've been using it on all my computers since 2000).
(Score: 3, Interesting) by stormreaver on Tuesday November 07 2017, @01:56PM
I started using Linux as my exclusive desktop system in 1999. Windows has never been ready for the desktop. It was forced onto everybody for so long, though, that people just learned how to shoehorn it into doing the job.
I have transitioned a number of non-technical people to Kubuntu over the years, and all but one refused to return to Windows.
(Score: 0) by Anonymous Coward on Tuesday November 07 2017, @07:36PM
What's nice about this announcement is its timing. I think the approach in this presentation will work on the Intel CPU:
http://schd.ws/hosted_files/ossna2017/91/Linuxcon%202017%20NERF.pdf [schd.ws]
Get rid of most of UEFI and rewrite the firmware on the motherboard. If that doesn't keep out the spooks/whoever, I'd plan on disabling the onboard ethernet and adding an ethernet card of a different type, so the driver in firmware is useless. Which leaves me with one question: will the graphics chip have an open-source driver for Linux?
(Score: 1) by xhedit on Tuesday November 07 2017, @01:03PM
Linux is great on the desktop if you aren't a washed up hack.
(Score: 0) by Anonymous Coward on Tuesday November 07 2017, @09:08AM (10 children)
So Intel effectively admits that its own graphics is worse than AMD's?
BTW, the "Slow Down Cowboy" message should be placed somewhere you can actually see it, not just stumble across it by accident.
(Score: 4, Interesting) by TheRaven on Tuesday November 07 2017, @10:45AM (7 children)
sudo mod me up
(Score: 2) by Spamalope on Tuesday November 07 2017, @11:12AM (4 children)
The chips may be destined for a market AMD doesn't have any penetration in, so it won't cannibalize sales.
On the other hand, perhaps they're hoping Intel won't invest as heavily in its own solutions and that the product will be very successful. A few generations later, AMD would be in a much better negotiating position. (I.e., use this to cut off the air supply for any other cheap embedded solutions.)
(Score: 2) by tonyPick on Tuesday November 07 2017, @11:18AM (3 children)
It'll be interesting to see how it stacks up against Ryzen Mobile, though - this looks like a fairly direct competitor there.
https://www.amd.com/en/products/ryzen-processors-laptop [amd.com]
(Score: 4, Interesting) by TheRaven on Tuesday November 07 2017, @05:04PM (2 children)
sudo mod me up
(Score: 3, Informative) by takyon on Tuesday November 07 2017, @11:48PM (1 child)
DDR4 [slickdeals.net] vs. LPDDR4 [slickdeals.net]
LPDDR4 in a laptop seems like a goddamn unicorn to me. So it clearly does not matter.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Informative) by TheRaven on Wednesday November 08 2017, @10:14AM
sudo mod me up
(Score: 2) by richtopia on Tuesday November 07 2017, @04:33PM (1 child)
The AnandTech article speculates that these chips will still have the Intel graphics on-die for low-power applications. Intel has never sold its GPUs independent of a mobo or CPU, so they realize that they have a niche. But they do dominate that low-power space. In addition, Intel can leverage these cores for other tasks, such as video encoding (Intel Quick Sync Video; see the sketch below) or driving an additional monitor.
It will be interesting to see whether AMD ever brings something to this market segment (a mobile 45 W CPU with graphics) and how it would compete in low-power applications. However, I suspect AMD's marketing team realizes it can only compete with Intel directly in certain segments, and I doubt high-performance mobile will be one of them.
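As a concrete illustration of the Quick Sync encoding mentioned above, here is a minimal sketch in Python that drives ffmpeg's h264_qsv hardware encoder; it assumes an ffmpeg build with QSV support and an Intel iGPU visible to the OS, and the file names are placeholders:

    # Hardware H.264 transcode on the Intel iGPU via Quick Sync.
    # Requires ffmpeg compiled with QSV support; paths are placeholders.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-i", "input.mp4",    # source clip (placeholder)
        "-c:v", "h264_qsv",   # Quick Sync H.264 encoder
        "-b:v", "5M",         # target video bitrate
        "output.mp4",
    ], check=True)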
(Score: 2) by takyon on Tuesday November 07 2017, @11:50PM
The A6-3400M [notebookcheck.net] chip I use is a 35 W TDP chip. AMD is releasing 15 W first for Raven Ridge but I would be shocked if they did not put out something near 30-35 W later. Not sure about 45 W.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by Wootery on Tuesday November 07 2017, @11:14AM (1 child)
Well, sure. Intel is doing pretty well considering where they used to be (awful embedded graphics), but they're not seriously competing with the big boys.
(Score: 2) by bob_super on Tuesday November 07 2017, @05:42PM
Just when Intel Graphics were getting "good enough" for HD, the industry moved to 4K...
Intel has some catching up to do (in performance; they still have the lead in volume), and in the meantime, propping up AMD a bit keeps the regulators (not the US, obviously, but the rest of the world) at bay without making an impact on the bottom line.
(Score: 3, Insightful) by ledow on Tuesday November 07 2017, @01:34PM
Intel was always running last in graphics anyway.
Intel + nVidia is the gamer's combo.
AMD + AMD/ATI is the cheap gamer's combo / the "I have better numbers for 2.5 seconds until another product comes out" show-off's choice.
Intel does need better on-board graphics, there's no question. Licensing nVidia would pretty much cut AMD out of the market entirely overnight. Who would bother to buy AMD?
So to combat that I wouldn't be surprised if AMD approached Intel about putting their GPU on Intel's chips, to try to stay relevant and not get shut out entirely (and/or have to sue under anti-trust to prevent such a deal in the first place).
But... though I would like a better default GPU on all machines, so I don't have to explain to people that they need another card / a particular chipset to play even the most basic of games (e.g. The Sims series etc.), I can still see people buying Intel and putting nVidia in if they want a gamer's machine.
I reckon AMD's counter to that would be something like an SLI mechanism, so the on-board GPU can help an AMD PCIe card a little.
It seems like the only logical way forward that doesn't end in an Intel + nVidia monopoly, which could quickly turn on the consumer.
And I'll be quite happy to have a decent-enough GPU - even if it is AMD - in processors by default in 5-10 years' time. It would mean that things like OpenGL / Vulkan etc. would become de facto standards rather than bolt-ons or severely limited extras. And maybe we'd even get some decent drivers / abstraction layers out of it (but that's hoping for a lot!).
Roll on the days where you can just assume that playing a basic 3D game, running something in OpenCL or WebGL, or running something like a browser in accelerated mode won't kill a machine, even a business-class machine.
(Score: 3, Informative) by Rich on Tuesday November 07 2017, @02:35PM (3 children)
Look at the accumulated and current losses of AMD. That's not sustainable. But Intel fears antitrust regulators more than it fears AMD. So, every now and then, they throw them a little bone to chew on. :)
(Disclaimer: I hold a couple of AMD shares. Should've sold them at 44 in 2004 instead of Apple...)
(Score: 2) by RS3 on Tuesday November 07 2017, @04:17PM (2 children)
D'oh!
In all fairness, Apple wasn't looking strong in 2004.
(Score: 0) by Anonymous Coward on Tuesday November 07 2017, @04:33PM (1 child)
Bitcoin wasn't looking strong in 2010 (kill me).
(Score: 0) by Anonymous Coward on Tuesday November 07 2017, @05:28PM
Bitcoin wasn't worth my CPU cycles and $0.07/kWh electricity in March 2009... and I was one of the top 1000 participants in SETI@home and the public crypto factoring challenge (forgot what it was called now)... so yeah...
But you know, don't cry over spilled milk.