
posted by martyb on Tuesday January 02 2018, @04:27PM   Printer-friendly
from the progress++ dept.

An Intel website leaked some details of the Intel Core i7-8809G, a "Kaby Lake" desktop CPU with on-package AMD Radeon graphics and High Bandwidth Memory 2.0. While it is listed as an 8th-generation part, 8th-generation "Coffee Lake" CPUs for desktop users have up to 6 cores (in other words, Intel has been releasing multiple microarchitectures as "8th-generation"). The i7-8809G may be officially announced at the Consumer Electronics Show next week.

The components are linked together using what Intel calls "embedded multi-die interconnect bridge technology" (EMIB). The thermal design power (TDP) of the entire package is around 100 watts:

Intel at the original launch did state that they were using Core-H grade CPUs for the Intel with Radeon Graphics products, which would mean that the CPU portion is around 45W. This would leave ~55W for graphics, which would be at the RX 550 level: 8 CUs, 512 SPs, running at 1100 MHz. It is worth noting that AMD already puts up to 10 Vega CUs in its 15W processors, so with the Intel i7-8809G product Intel has likely gone wider and slower: judging by the size of the silicon in the mockup, this could be more of a 20-24 CU design built within that 55W-75W window, depending on how the power budget is moved around between CPU and GPU. We await more information, of course.
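
As a quick sanity check on that power math, here is a back-of-the-envelope sketch; the 100 W package and 45 W CPU figures are the article's estimates, not confirmed specs:

    #include <stdio.h>

    /* Back-of-the-envelope TDP split for the rumored package.
     * All wattages are estimates from the quoted article. */
    int main(void) {
        const double package_w = 100.0;              /* estimated package TDP   */
        const double cpu_w     = 45.0;               /* typical Core-H CPU TDP  */
        const double gpu_w     = package_w - cpu_w;  /* ~55 W left for graphics */

        printf("GPU power budget: ~%.0f W\n", gpu_w);
        /* "Wider and slower": the same budget spread over more CUs
         * means fewer watts (and lower clocks) per CU. */
        printf("  8 CUs -> %.1f W/CU, 24 CUs -> %.1f W/CU\n",
               gpu_w / 8.0, gpu_w / 24.0);
        return 0;
    }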

It is rumored to include 4 GB of HBM2 on-package, while the CPU also supports DDR4-2400 memory. Two cheaper EMIB CPUs have been mentioned:

According to some other media, the 8809G will turbo to 4.1 GHz, while the graphics will feature 24 [compute units (CUs)] (1536 [stream processors (SPs)]) running at 1190 MHz, while the HBM2 is 4 GB and will run at 800 MHz. The same media are also listing the Core i7-8705G (20 CUs, 1000 MHz on 'Vega M GL', 700 MHz on HBM2) and a Core i7-8706G. None of the information from those sources has yet been verified by AnandTech or found on an official Intel webpage.
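
For reference, GCN/Vega designs carry 64 stream processors per compute unit, which is where the 1536-SP figure comes from. A small sketch tabulating the rumored (and unverified) SKUs:

    #include <stdio.h>

    /* Rumored EMIB SKUs from the summary (unverified). GCN/Vega
     * GPUs have 64 stream processors (SPs) per compute unit (CU). */
    struct sku { const char *name; int cus; int gpu_mhz; int hbm2_mhz; };

    int main(void) {
        const struct sku skus[] = {
            { "Core i7-8809G", 24, 1190, 800 },  /* 24 * 64 = 1536 SPs */
            { "Core i7-8705G", 20, 1000, 700 },  /* 20 * 64 = 1280 SPs */
        };
        for (unsigned i = 0; i < sizeof skus / sizeof skus[0]; i++)
            printf("%s: %d CUs (%d SPs), GPU %d MHz, HBM2 %d MHz\n",
                   skus[i].name, skus[i].cus, skus[i].cus * 64,
                   skus[i].gpu_mhz, skus[i].hbm2_mhz);
        return 0;
    }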

Currently available AMD Ryzen Mobile APUs only include 8-10 Vega CUs. These are mobile chips with a maximum TDP of 25 W; no desktop Ryzen chips with integrated graphics have been announced yet.

Previously: Intel Announces Core H Laptop Chips With AMD Graphics and High Bandwidth Memory


Original Submission

 
  • (Score: 4, Interesting) by LoRdTAW on Tuesday January 02 2018, @09:25PM (5 children)

    by LoRdTAW (3755) on Tuesday January 02 2018, @09:25PM (#616900) Journal

    You're thinking small.

    > Don't misunderstand, I can see many benefits of on-package GPUs, but if we're going to disrupt motherboards and sockets again, why are we not making faster/larger processors instead of trying to integrate GPUs? If we are creating more real estate on-package, I don't see that putting a GPU on it is the best decision, given the additional space needed and heat produced.

    Have you looked at the Intel and AMD server/HPC offerings? Plenty of big CPUs with lots of cores there.

    > As far as I know, full video cards can run circles around on-package GPUs. They are modular (I can use an old CPU and upgrade my video card independently), and the technology can still be made faster (we're not done innovating in that space). So what benefit does an on-chip GPU offer that would ever make it an attractive option for a buyer? It will increase cost, be useful for only a certain market segment, and right now I completely discount this combined package because I think a video card gives me more performance bang for my buck than an on-chip GPU.

    There's this thing called the internet, which is heavily driven by visual content delivered to screens. With all the new web technologies such as WebGL, streaming video, and all sorts of other stuff, why would you not include a GPU? It's a desktop necessity nowadays. Just because it can't play Crysis in 4K at 240 Hz doesn't mean it's useless.

    > Lastly, why AMD? Their drivers are nowhere near as good as Nvidia's (IMHO), and they own less than half the market share by comparison. I upgrade my rig infrequently, and I switched away from AMD video cards back in 2012 because I had such a bad 4-year experience with them.

    Because Intel and AMD already cross-license technologies (hello, x86-64!). Business-wise, Nvidia doesn't need Intel, as they are doing quite well in the mobile, HPC, AI, deep learning, and autonomous automotive markets. And HPC/AI/deep learning is very profitable, as you can sell shitloads of chips at once to big customers with DEEP pockets. Intel is gearing up in some of those areas with the Xeon Phi and the FPGA tech they got from Altera. So Intel and Nvidia are going to compete head to head in markets where AMD is pretty much absent. The enemy of my enemy is my friend, and AMD is more of a friend than Nvidia at this point. And as for your driver complaint, you think Intel would let that be a problem? I mean, who better than Intel to get those damn drivers into the Linux kernel? Intel CPU with an excellent GPU and mainlined kernel drivers: win-win in my book.

  • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @09:58PM (1 child)

    by Anonymous Coward on Tuesday January 02 2018, @09:58PM (#616916)

    > And as for your driver complaint, you think Intel would let that be a problem?

    Ever hear of GMA500? Poulsbo? Yeah, considering how Intel screwed me once with integrated graphics licensed from a third party, I absolutely believe they'd do it again.

    • (Score: 2) by LoRdTAW on Wednesday January 03 2018, @07:41PM

      by LoRdTAW (3755) on Wednesday January 03 2018, @07:41PM (#617308) Journal

      Yeah, that was a hiccup. I had one of the Diamondvilles on an Intel ITX board with the GMA950. Terrible performance, but it was my first ITX/low-power system to play with. I had it hooked to a TV for a while as a media player, which it sucked at when it came to HD, but I didn't really care; then it was a small desktop before I shelved it.

  • (Score: 1) by waximius on Tuesday January 02 2018, @10:18PM (2 children)

    by waximius (1136) on Tuesday January 02 2018, @10:18PM (#616925) Homepage

    Thank you, great information. One follow-up to your point:

    > There's this thing called the internet, which is heavily driven by visual content delivered to screens. With all the new web technologies such as WebGL, streaming video, and all sorts of other stuff, why would you not include a GPU? It's a desktop necessity nowadays. Just because it can't play Crysis in 4K at 240 Hz doesn't mean it's useless.

    I didn't mean to imply that I thought this configuration was useless, but that it applies to a limited market segment. Based on what you say, though, I can see that the segment is much broader than I initially thought. If I understand right, the on-package GPU could be used for rendering lighter-weight things, and a full video card could still be added and used for heavier-weight applications like gaming. It's not an "either-or" situation, but a "yes-and".

    I like that, as long as an Intel+GPU combo doesn't speed up obsolescence. My current configuration of CPU + video card runs just fine 10 years after I built it. I've upgraded to an SSD and upgraded my video card multiple times, but I'm still using a Core i7-920 and only have 6 GB of RAM. The longevity of that processor has been amazing, and I'm only now thinking I should upgrade the CPU. Having the GPU on-chip scares me only because I feel like I need to upgrade my video card somewhat regularly (based on need, but games require better hardware every year).

    In any case, thanks for the response, very informative.

    • (Score: 3, Informative) by takyon on Tuesday January 02 2018, @10:45PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday January 02 2018, @10:45PM (#616942) Journal

      The combination of an on-package GPU and the High Bandwidth Memory may have some advantages over discrete GPUs. Moving everything closer together helps overcome certain limits:

      http://www.nersc.gov/users/computational-systems/cori/application-porting-and-performance/using-on-package-memory/ [nersc.gov]
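
      To put rough numbers on the bandwidth side of that argument (a sketch only; the 800 MHz HBM2 clock is the rumored figure, and a single 1024-bit stack is an assumption):

      #include <stdio.h>

      /* Peak bandwidth = bus width (bits) * transfer rate (GT/s) / 8.
       * HBM2 assumes one 1024-bit stack at the rumored 800 MHz clock
       * (DDR signaling, so 1.6 GT/s); DDR4 assumes dual-channel DDR4-2400. */
      static double gb_per_s(double bus_bits, double gt_per_s) {
          return bus_bits * gt_per_s / 8.0;
      }

      int main(void) {
          printf("HBM2, 1024-bit @ 1.6 GT/s: %.1f GB/s\n", gb_per_s(1024, 1.6));
          printf("DDR4-2400, dual channel:   %.1f GB/s\n", gb_per_s(128, 2.4));
          return 0;  /* prints ~204.8 vs. ~38.4 GB/s */
      }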

      Some users want smaller form factors, for Home Theater PCs (HTPCs) for example. This kind of on-package stuff might be cheaper than using a discrete GPU, with lower power consumption, but with better performance than integrated graphics. It might be worth it.

      Also, I was not sure when writing the summary, but I think both the Intel integrated graphics and the AMD Radeon Vega graphics may be included on these chips, in which case you might have a setup well suited to newer graphics APIs like Vulkan, which can take advantage of such disparate assets.
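
      For what it's worth, an explicit API like Vulkan exposes each GPU as a separate physical device that an application can enumerate and hand work to. A minimal C sketch (assuming the Vulkan SDK headers and loader are installed):

      #include <stdio.h>
      #include <vulkan/vulkan.h>

      int main(void) {
          /* Create a bare-bones Vulkan instance. */
          VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
          VkInstance inst;
          if (vkCreateInstance(&ci, NULL, &inst) != VK_SUCCESS) return 1;

          /* List every physical device: an EMIB package exposing both Intel
           * and Radeon graphics would show up as two entries here. */
          uint32_t n = 0;
          vkEnumeratePhysicalDevices(inst, &n, NULL);
          VkPhysicalDevice devs[16];
          if (n > 16) n = 16;
          vkEnumeratePhysicalDevices(inst, &n, devs);

          for (uint32_t i = 0; i < n; i++) {
              VkPhysicalDeviceProperties p;
              vkGetPhysicalDeviceProperties(devs[i], &p);
              printf("GPU %u: %s\n", i, p.deviceName);
          }
          vkDestroyInstance(inst, NULL);
          return 0;
      }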

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 3, Interesting) by LoRdTAW on Wednesday January 03 2018, @03:12AM

      by LoRdTAW (3755) on Wednesday January 03 2018, @03:12AM (#617051) Journal

      There are plenty of use cases where the on-die GPU is basically a necessity, even if it is low-end. One I forgot to mention is that I think Chrome renders pages on the GPU if supported. Then there are GPU-accelerated 3D desktop compositing window managers. All of that is used on business desktops daily.

      My Linux box is an AMD A10 APU, which has plenty of CPU and GPU for all the basic development stuff and web browsing. I now wish it would morph into a Ryzen, but I'm not spending money on it just to have one, as I don't need all that CPU.