An Intel website leaked some details of the Intel Core i7-8809G, a "Kaby Lake" desktop CPU with on-package AMD Radeon graphics and High Bandwidth Memory 2.0. Although it is listed as an 8th-generation part, the 8th-generation "Coffee Lake" desktop CPUs use a different microarchitecture and have up to 6 cores; in other words, Intel has been releasing multiple microarchitectures under the "8th-generation" label. The i7-8809G may be officially announced at the Consumer Electronics Show next week.
The components are linked together using what Intel calls Embedded Multi-die Interconnect Bridge (EMIB) technology. The thermal design power (TDP) of the entire package is around 100 W:
Intel at the original launch did state that they were using Core-H grade CPUs for the Intel with Radeon Graphics products, which would mean that the CPU portion is around 45W. This would leave ~55W for graphics, which would be at the RX 550 level: 8 CUs, 512 SPs, running at 1100 MHz. It is worth noting that AMD already puts up to 10 Vega CUs in its 15W processors, so with the i7-8809G Intel has likely gone wider and slower: judging by the size of the silicon in the mockup, this could be more of a 20-24 CU design built within that 55W-75W window, depending on how the power budget is moved around between CPU and GPU. We await more information, of course.
It is rumored to include 4 GB of HBM2 on-package, while the CPU also supports DDR4-2400 memory. Two cheaper EMIB CPUs have been mentioned:
According to some other media, the 8809G will turbo to 4.1 GHz, while the graphics will feature 24 compute units (CUs) (1536 stream processors (SPs)) running at 1190 MHz, and the 4 GB of HBM2 will run at 800 MHz. The same media are also listing the Core i7-8705G (20 CUs, 1000 MHz on 'Vega M GL', 700 MHz on HBM2) and a Core i7-8706G. None of the information from those sources has yet been verified by AnandTech or found on an official Intel webpage.
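For rough context, here is a back-of-the-envelope sketch (in Python) of what those rumored numbers would imply, assuming the usual GCN rate of 2 FP32 FLOPs per stream processor per clock and a single 1024-bit HBM2 stack; neither assumption has been confirmed by Intel:

    # Theoretical peak figures implied by the rumored i7-8809G specs.
    # Assumptions: 2 FP32 FLOPs per SP per clock (standard for GCN/Vega),
    # and one 1024-bit HBM2 stack with DDR signaling.
    sps = 1536                 # 24 CUs x 64 SPs per CU
    gpu_clock_ghz = 1.19       # rumored 1190 MHz
    tflops = sps * 2 * gpu_clock_ghz / 1000
    print(f"Peak FP32: ~{tflops:.2f} TFLOPS")            # ~3.66 TFLOPS

    hbm2_clock_mhz = 800       # rumored HBM2 clock
    bus_width_bits = 1024      # one HBM2 stack (assumed)
    bandwidth_gbs = hbm2_clock_mhz * 2 * bus_width_bits / 8 / 1000
    print(f"HBM2 bandwidth: ~{bandwidth_gbs:.1f} GB/s")  # ~204.8 GB/s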
Currently available AMD Ryzen Mobile APUs only include 8-10 Vega CUs. These are mobile chips with a maximum TDP of 25 W; no desktop Ryzen chips with integrated graphics have been announced yet.
Previously: Intel Announces Core H Laptop Chips With AMD Graphics and High Bandwidth Memory
(Score: 2, Informative) by Anonymous Coward on Tuesday January 02 2018, @04:51PM (6 children)
The term has started to lose its meaning. Sites that suck use it to attract viewers.
I was going to blame Anandtech, but after going there to confirm things, the word "leak" isn't part of the content. The word appears when they reference information they had last year, but it does not relate to the Intel press release that the article is about. The info was posted on the other side of the world before being posted on this side of the world.
Perhaps geographical regions that have a differently timed day and night cycle due to, you know, actually not being in California, actually have scripts that publish authorized data on a schedule that doesn't follow Silicon Valley time? Strange and unusual, I know... Intel might even have cheap local resources posting it, too, who knows; it *IS* an Indian website where the info was first seen.
But it's in no way a leak.
(Score: 2) by LoRdTAW on Tuesday January 02 2018, @05:10PM (1 child)
Agreed. This is a "sneak peek". Not a leak.
(Score: 3, Funny) by DannyB on Tuesday January 02 2018, @05:46PM
If it were more info about Intel's Management Engine, it would be a whistleblower, not a "leak".
Paid for by Americans for Renewable Complaining and Sustainable Whining.
Fact: We get heavier as we age due to more information in our heads. When no more will fit it accumulates as fat.
(Score: 2) by takyon on Tuesday January 02 2018, @05:39PM (3 children)
The info was obviously published by accident before a regular press release:
And the very end of the article references leaks/rumors beyond what came from Intel's Indian website.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Tuesday January 02 2018, @06:22PM (2 children)
The "mistake" has saved Intel millions in advertisement costs.
(Score: 2) by takyon on Tuesday January 02 2018, @06:25PM (1 child)
All they have to do is issue a press release to get coverage on these sites. One drone typing for an hour, cross-checked with marketing and legal. Probably several hundred dollars of expenditure, not millions. And they will still issue one even if ALL of the relevant details have already leaked.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by bob_super on Tuesday January 02 2018, @06:55PM
Dang, they still have much to learn from Apple, which gets full front-page coverage on the potential that some rumor about the idea of a leak might be plausible, plus full-blown analysis of fanfiction photoshops.
(Score: 2, Interesting) by waximius on Tuesday January 02 2018, @08:31PM (6 children)
Ok, I've seen this going on for a while and will now ask the dumb question - why is Intel integrating GPUs onto the same package as their processors?
Don't misunderstand: I can see many benefits of on-package GPUs, but if we're going to disrupt motherboards and sockets again, why are we not making faster/larger processors instead of trying to integrate GPUs? If we are creating more real estate on-package, I don't see that putting a GPU on it is the best use of the additional space needed and heat produced.
As far as I know, full video cards can run circles around on-package GPUs. They are modular (I can use an old CPU and upgrade my video card independently), and the technology can still be made faster (we're not done innovating in that space). So, what benefits does an on-chip GPU offer that would ever make it an attractive option for a buyer? It will increase cost, be useful for only a certain market segment, and right now I completely discount this combined package because I think a video card gives me more performance bang for my buck than an on-chip GPU.
Lastly, why AMD? Their drivers are nowhere near as good as Nvidia's (IMHO), and they own less than half of the market share by comparison. I upgrade my rig infrequently, and I switched away from AMD video cards back in 2012 because I had such a bad 4-year experience with AMD.
I don't see anybody asking these questions, so I'm hoping the answer is obvious and I'm just missing it somewhere.
Market share comparison:
https://wccftech.com/nvidia-amd-discrete-gpu-market-share-report-q3-2017/ [wccftech.com]
(Score: 4, Interesting) by LoRdTAW on Tuesday January 02 2018, @09:25PM (5 children)
You're thinking small.
Have you looked at the Intel and AMD server/HPC offerings? Plenty of big CPUs with lots of cores there.
There's this thing called the internet, which is heavily driven by visual content delivered to screens. With all the new web technologies such as WebGL, streaming video, and all sorts of other stuff, why would you not include a GPU? It's a desktop necessity nowadays. Just because it can't play Crysis in 4K at 240 Hz doesn't mean it's useless.
Because Intel and AMD already cross-license technologies (hello, x86-64!). Business-wise, Nvidia doesn't need Intel, as they are doing quite well in the mobile, HPC, AI, deep learning, and autonomous automotive markets. And HPC/AI/deep learning is very profitable, as you can sell shitloads of chips at once to big customers with DEEP pockets. Intel is gearing up in some of those areas with the Xeon Phi and the FPGA tech they got from Altera. So Intel and Nvidia are going to compete head to head in markets from which AMD is pretty much absent. The enemy of my enemy is my friend, and AMD is more of a friend than Nvidia at this point. As for your driver complaint, do you think Intel would let that be a problem? I mean, who better than Intel to get those damn drivers into the Linux kernel? Intel CPU with an excellent GPU and mainlined kernel drivers: win-win in my book.
(Score: 0) by Anonymous Coward on Tuesday January 02 2018, @09:58PM (1 child)
Ever hear of GMA500? Poulsbo? Yeah, considering how Intel screwed me once with integrated graphics licensed from a third party, I absolutely believe they'd do it again.
(Score: 2) by LoRdTAW on Wednesday January 03 2018, @07:41PM
Yea, that was a hiccup. I had one of the Diamondvilles on an Intel ITX board with the GMA950. Terrible performance, but it was my first ITX/low-power system to play with. I had it hooked to a TV for a while as a media player, which it sucked at when it came to HD, but I didn't really care; then it was a small desktop before I shelved it.
(Score: 1) by waximius on Tuesday January 02 2018, @10:18PM (2 children)
Thank you, great information. One follow up to your point:
I didn't mean to imply that I thought this configuration was useless, but that it applies to a limited market segment. Based on what you say though, I can see that the segment is much broader than I initially thought. If I understand right, the on-package GPU could be used for rendering lighter weight things, and a full video card could still be added and utilized for heavier weight applications like gaming. It's not an "either-or" situation, but a "yes-and".
I like that, as long as an Intel+GPU combo doesn't speed up obsolescence. My current configuration of CPU + video card runs just fine 10 years after I built it. I've upgraded to an SSD and upgraded my video card multiple times, but am still using a Core i7-920 and only have 6GB of RAM. The longevity of that processor has been amazing, and I'm just now thinking I should upgrade the CPU. Having the GPU on-chip scares me only because I feel like I need to upgrade my video card somewhat regularly (based on need, but games require better hardware every year).
In any case, thanks for the response, very informative.
(Score: 3, Informative) by takyon on Tuesday January 02 2018, @10:45PM
The combination of an on-package GPU and the High Bandwidth Memory may have some advantages over discrete GPUs. Moving everything closer together helps overcome certain limits:
http://www.nersc.gov/users/computational-systems/cori/application-porting-and-performance/using-on-package-memory/ [nersc.gov]
Some users want smaller form factors, for Home Theater PCs (HTPCs) for example. This kind of on-package stuff might be cheaper than using a discrete GPU, with lower power consumption, but with better performance than integrated graphics. It might be worth it.
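To put rough numbers on that, here is a quick Python sketch comparing the rumored on-package HBM2 with the dual-channel DDR4-2400 the CPU supports; the single 1024-bit stack is an assumption, per the rumors above:

    # Rough memory bandwidth comparison.
    # Assumptions: dual-channel DDR4-2400 on the CPU side, and one
    # 1024-bit HBM2 stack at 800 MHz with DDR signaling (unconfirmed).
    ddr4_gbs = 2400e6 * 8 * 2 / 1e9        # MT/s x 8 bytes/channel x 2 channels
    hbm2_gbs = 800e6 * 2 * 1024 / 8 / 1e9  # clock x DDR x 1024 bits / 8
    print(f"DDR4-2400, dual channel: ~{ddr4_gbs:.1f} GB/s")  # ~38.4 GB/s
    print(f"HBM2, one stack:         ~{hbm2_gbs:.1f} GB/s")  # ~204.8 GB/s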
Also, I was not sure when writing the summary, but I think both the Intel integrated graphics and the AMD Radeon Vega graphics may be included on these chips. In that case you might have a setup well suited to newer graphics APIs like Vulkan, which can take advantage of such disparate assets.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Interesting) by LoRdTAW on Wednesday January 03 2018, @03:12AM
There are plenty of use cases where the on-die GPU is basically a necessity, even if it is low end. One I forgot to mention is that I think Chrome renders pages on the GPU if supported. Then there are GPU-accelerated desktop compositing window managers. All that is used on business desktops daily.
My Linux box is an AMD A10 APU, which has plenty of CPU and GPU for all the basic development stuff and web browsing. Though I now wish it would morph into a Ryzen, I'm not spending money on it just to have one, as I don't need all that CPU.
(Score: 2) by Azuma Hazuki on Tuesday January 02 2018, @10:27PM (3 children)
I don't like this. Intel essentially poached Koduri--we know *nothing* of how or why he joined Intel!--and now appears to be trying to create a mashup of Ryzen-level CPU performance with a Raven Ridge-level IGP. It's an interesting technical angle, but something about this smells as far as the corporate side goes. Intel fights fucking dirty, always did, and they seem to want to undermine AMD instead of competing with them.
I am "that girl" your mother warned you about...
(Score: 4, Interesting) by takyon on Tuesday January 02 2018, @10:39PM (2 children)
AMD has done well with Ryzen and GPUs recently, but their easy cryptomining cash boost will probably dry up soon.
This lets them tap into a new source of revenue. Although it probably doesn't help them with market share, they can leech off of Intel's.
This is also less of a devastating self-flagellation for AMD than it might have been before they launched Ryzen. Ryzen has partially closed a huge IPC gap with Intel's CPUs, and it has inserted AMD back into desktops, a market they had all but abandoned. Future iterations of their hardware might be somewhat more competitive with Intel, since Intel has struggled to move past 14nm; although GlobalFoundries, Samsung, et al. are said to have crappier process nodes than Intel, they are moving a little faster (for example, GlobalFoundries' 7nm might be comparable to Intel's 10nm, but 7nm will be around before Intel gets much 10nm stuff out).
AMD's next big move may be to muscle into the machine learning and automotive territory where Nvidia is riding high. There was talk of a GPU made for Tesla (the car company, not the Nvidia product).
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Tuesday January 02 2018, @10:57PM
With Ryzen and Vega, AMD has fully thrown itself into the "secured for government spying" arena, with backdoored CPUs, and GPUs with signed firmware and potential hypervisor-level backdoors, co-opting what little of the mainstream computer market was left after Intel was done with it.
If you consider this from the intelligence-technology complex mindset, throwing AMD a few bones to help provide access to 99 percent of non-cellular computer users *IN THE WORLD* is a bargain to keep them in business, while also ensuring the entire x86 market follows in lockstep the controls being put in place to either control or surveil the public.
Given that basically all major software requires x86 today, even the software that doesn't require Windows, this gives a dramatic amount of potential (even if unused) surveillance power to the gatekeepers who control it: in this case the NSA, Mossad, GCHQ, and possibly Japan through SoftBank's newfound ownership of ARM.
Without competitors coming onto the market, especially competitors from other regions and nationalities, ideally from countries that still believe in privacy, if not free speech, we are rapidly approaching the sort of technological tipping point that Continuum depicted, with one technology co-opting control of the world toward one group's ideological ends.
(Score: 2) by LoRdTAW on Wednesday January 03 2018, @03:24AM
And, let's just throw this in here: what if Intel worked with AMD to let AMD discrete GPU cards work in tandem with either platform in a laptop or desktop? You could have it either way and everything plays nicely: Ryzen APU + AMD GPU || Intel [AMD] APU + AMD GPU. Pick your CPU core of choice. Then the main GPU can be turned off when you're just browsing or watching Netflix, and throttle up for the latest mmor-fps-rpg-whatever or buttcoin mining.