
posted by martyb on Tuesday January 02 2018, @04:27PM
from the progress++ dept.

An Intel website leaked some details of the Intel Core i7-8809G, a "Kaby Lake" desktop CPU with on-package AMD Radeon graphics and High Bandwidth Memory 2.0. While it is listed as an 8th-generation part, 8th-generation "Coffee Lake" CPUs for desktop users have up to 6 cores (in other words, Intel has been releasing multiple microarchitectures as "8th-generation"). The i7-8809G may be officially announced at the Consumer Electronics Show next week.

The components are linked together using what Intel calls "embedded multi-die interconnect bridge technology" (EMIB). The thermal design power (TDP) of the entire package is around 100 Watts:

Intel at the original launch did state that they were using Core-H grade CPUs for the Intel with Radeon Graphics products, which would mean that the CPU portion is around 45W. This would leave ~55W for graphics, which would be at the RX 550 level: 8 CUs, 512 SPs, running at 1100 MHz. It is worth noting that AMD already puts up to 10 Vega CUs in its 15W processors, so with the i7-8809G, Intel has likely gone wider and slower: judging by the size of the silicon in the mockup, this could be more of a 20-24 CU design built within that 55W-75W window, depending on how the power budget is moved around between CPU and GPU. We await more information, of course.

It is rumored to include 4 GB of HBM2 on-package, while the CPU also supports DDR4-2400 memory. Two cheaper EMIB CPUs have been mentioned:

According to some other media, the 8809G will turbo to 4.1 GHz, while the graphics will feature 24 [compute units (CUs)] (1536 [stream processors (SPs)]) running at 1190 MHz, and the 4 GB of HBM2 will run at 800 MHz. The same media are also listing the Core i7-8705G (20 CUs, 1000 MHz on 'Vega M GL', 700 MHz on HBM2) and a Core i7-8706G. None of the information from those sources has yet been verified by AnandTech or found on an official Intel webpage.
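The rumored CU and clock figures can be sanity-checked with a little arithmetic. The 64-stream-processors-per-CU ratio is standard for AMD's GCN/Vega designs, and the 2-FLOPs-per-clock (fused multiply-add) convention is the usual way peak FP32 throughput is quoted; neither number comes from the leak itself, so treat this as a back-of-envelope sketch:

```python
# 64 SPs per CU is the standard GCN/Vega ratio (an assumption here,
# not something the leaked listing states).
SP_PER_CU = 64

def vega_m_estimate(cus, gpu_clock_mhz):
    """Return (stream processors, peak FP32 TFLOPS) for a rumored part."""
    sps = cus * SP_PER_CU
    # Each SP can do one fused multiply-add (2 FLOPs) per clock.
    tflops = sps * 2 * gpu_clock_mhz * 1e6 / 1e12
    return sps, tflops

# i7-8809G: 24 CUs at 1190 MHz -> 1536 SPs (matches the rumor), ~3.66 TFLOPS
# i7-8705G: 20 CUs at 1000 MHz -> 1280 SPs, ~2.56 TFLOPS
for name, cus, mhz in [("i7-8809G", 24, 1190), ("i7-8705G", 20, 1000)]:
    sps, tflops = vega_m_estimate(cus, mhz)
    print(f"{name}: {sps} SPs, ~{tflops:.2f} TFLOPS FP32")
```

The 24 CU figure reproducing exactly 1536 SPs is at least internally consistent with a GCN-style part.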

Currently available AMD Ryzen Mobile APUs only include 8-10 Vega CUs. These are mobile chips with a maximum TDP of 25 W; no desktop Ryzen chips with integrated graphics have been announced yet.

Previously: Intel Announces Core H Laptop Chips With AMD Graphics and High Bandwidth Memory


Original Submission

Related Stories

AMD at CES 2018 10 comments

At the Consumer Electronics Show, AMD confirmed details about products coming out in 2018:

  1. Ryzen 3 Mobile APUs: January 9th
  2. Ryzen Desktop APUs: February 12th
  3. Second Generation Ryzen Desktop Processors: April
  4. Ryzen Pro Mobile APUs: Q2 2018
  5. Second Generation Threadripper Processors: 2H 2018
  6. Second Generation Ryzen Pro Desktop Processors: 2H 2018

The second generation "Zen+" products use a "12nm" process. Zen 2 and Zen 3 will use a "7nm" and "7nm+" process and will be out around 2019-2020.

Two cheaper Ryzen-based mobile APUs have been released. The Ryzen 3 2300U has 4 cores, 4 threads, and the Ryzen 3 2200U has 2 cores, 4 threads, making it the first dual-core part in the entire Ryzen product line. All of the Ryzen mobile parts have a 15 W TDP so far.

AMD has also lowered the suggested pricing for many of its Ryzen CPUs. For example, the Ryzen 7 1700 is down to $299 from $329, and the Ryzen Threadripper 1900X is down to $449 from $549.

Intel has officially launched five new Kaby Lake CPUs with AMD Radeon Vega graphics and 4 GB of High Bandwidth Memory. Each CPU also includes Intel's HD 630 GT2 integrated graphics, which is expected to be used for lower power video encode/decode tasks.

Previously: AMD Launches First Two Ryzen Mobile APUs With Vega Graphics
Intel Core i7-8809G with Radeon Graphics and High Bandwidth Memory: Details Leaked


Original Submission

Intel Announces Core H Laptop Chips With AMD Graphics and High Bandwidth Memory 21 comments

Intel squeezed an AMD graphics chip, RAM and CPU into one module

the new processor integrates a "semi-custom" AMD graphics chip and the second generation of Intel's "High Bandwidth Memory (HBM2)", which is comparable to GDDR5 in a traditional laptop.

Intel CPU and AMD GPU, together at last

Summary of Intel's news:

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD's Radeon Technologies Group* – all in a single processor package.

[...] At the heart of this new design is EMIB (Embedded Multi-Die Interconnect Bridge), a small intelligent bridge that allows heterogeneous silicon to quickly pass information in extremely close proximity. EMIB eliminates height impact as well as manufacturing and design complexities, enabling faster, more powerful and more efficient products in smaller sizes. This is the first consumer product that takes advantage of EMIB.

[...] Additionally, this solution is the first mobile PC to use HBM2, which consumes much less power and takes up less space compared to traditional discrete graphics-based designs using dedicated graphics memory, like GDDR5 memory.

takyon: This is more like an "integrated discrete GPU" than standard integrated graphics. It also avoids the need for Intel to license AMD's IP. AMD also needs to make a lot of parts since its wafer supply agreement with GlobalFoundries penalizes AMD if they buy less than a target number of wafers each year.

Also at AnandTech and Ars Technica.

Previously: AMD Stock Surges on Report of Intel Graphics Licensing Deal, 16-Core Ryzen Confirmed

Related: Samsung Increases Production of 8 GB High Bandwidth Memory 2.0 Stacks


Original Submission #1
Original Submission #2

  • (Score: 2, Informative) by Anonymous Coward on Tuesday January 02 2018, @04:51PM (6 children)

    by Anonymous Coward on Tuesday January 02 2018, @04:51PM (#616763)

    The term has started to lose its meaning. Sites that suck use it to attract viewers.

    I was going to blame Anandtech, but after going there to confirm things, the word "leak" isn't part of the content. The word appears where they referenced information they had last year, but it's not related to the Intel press release that the article is about. The info was posted on the other side of the world prior to being posted on this side of the world.

    Perhaps geographical regions that have differently timed day and night cycles due to, you know, actually not being in California, actually have scripts that publish authorized data on a schedule that doesn't follow Silicon Valley time? Strange and unusual, I know... Intel might even have cheap local resources posting it, too, who knows; it *IS* an Indian website that the info was first seen on.

    But it's in no way a leak.

    • (Score: 2) by LoRdTAW on Tuesday January 02 2018, @05:10PM (1 child)

      by LoRdTAW (3755) Subscriber Badge on Tuesday January 02 2018, @05:10PM (#616774) Journal

      Agreed. This is a "sneak peek". Not a leak.

      • (Score: 3, Funny) by DannyB on Tuesday January 02 2018, @05:46PM

        by DannyB (5839) Subscriber Badge on Tuesday January 02 2018, @05:46PM (#616797)

        If it were more info about Intel's Management Engine, it would be a whistleblower, not a "leak".

        Paid for by Americans for Renewable Complaining and Sustainable Whining.

    • (Score: 2) by takyon on Tuesday January 02 2018, @05:39PM (3 children)

      by takyon (881) Subscriber Badge <{takyon} {at} {soylentnews.org}> on Tuesday January 02 2018, @05:39PM (#616789) Journal

      The info was obviously published by accident before a regular press release:

      I imagine that this listing will come down fairly quickly. The product page that the link goes to for this chip gives a 404.

      And the very end of the article references leaks/rumors beyond what came from Intel's Indian website.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @06:22PM (2 children)

        by Anonymous Coward on Tuesday January 02 2018, @06:22PM (#616814)

        The "mistake" has saved Intel millions in advertisement costs.

        • (Score: 2) by takyon on Tuesday January 02 2018, @06:25PM (1 child)

          by takyon (881) Subscriber Badge <{takyon} {at} {soylentnews.org}> on Tuesday January 02 2018, @06:25PM (#616817) Journal

          All they have to do is issue a press release to get coverage on these sites. One drone typing for an hour, cross-checked with marketing and legal. Probably several hundred dollars of expenditure, not millions. And they will still issue one even if ALL of the relevant details have already leaked.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by bob_super on Tuesday January 02 2018, @06:55PM

            by bob_super (1357) on Tuesday January 02 2018, @06:55PM (#616825)

            Dang, they still have much to learn from Apple, which gets full front page coverage on the potential that some rumor about the idea of a leak might be plausible, plus full-blown analysis of fanfiction photoshops.

  • (Score: 2, Interesting) by waximius on Tuesday January 02 2018, @08:31PM (6 children)

    by waximius (1136) on Tuesday January 02 2018, @08:31PM (#616875) Homepage

    Ok, I've seen this going on for a while and will now ask the dumb question - why is Intel integrating GPUs onto the same package as their processors?

    Don't misunderstand, I can see many benefits of on-package GPUs, but if we're going to disrupt motherboards and sockets again why are we not making faster/larger processors instead of trying to integrate GPUs? If we are creating more real estate on-package, I don't see that putting a GPU on it is the best decision for the additional space needed and heat produced.

    As far as I know, full video cards can run circles around on-package GPUs. They are modular (I can use an old CPU and upgrade my video card independently), and the technology can still be made faster (we're not done innovating in that space). So, what benefits are there to an on-chip GPU that it would ever be an attractive option for a buyer? It will increase cost, be useful for only a certain market segment, and right now I completely discount this combined package because I think a video card gives me more performance bang for my buck than on-chip GPUs.

    Lastly, why AMD? Their drivers are nowhere near as good as NVidia's (IMHO), and they own less than half of the market share by comparison. I upgrade my rig infrequently, and I switched away from AMD video cards back in 2012 because I had such a bad 4-year experience with AMD.

    I don't see anybody asking these questions, so I'm hoping the answer is obvious and I'm just missing it somewhere.

    Market share comparison:
    https://wccftech.com/nvidia-amd-discrete-gpu-market-share-report-q3-2017/ [wccftech.com]

    • (Score: 4, Interesting) by LoRdTAW on Tuesday January 02 2018, @09:25PM (5 children)

      by LoRdTAW (3755) Subscriber Badge on Tuesday January 02 2018, @09:25PM (#616900) Journal

      You're thinking small.

      Don't misunderstand, I can see many benefits of on-package GPUs, but if we're going to disrupt motherboards and sockets again why are we not making faster/larger processors instead of trying to integrate GPUs? If we are creating more real estate on-package, I don't see that putting a GPU on it is the best decision for the additional space needed and heat produced.

      Have you looked at the Intel and AMD server/HPC offerings? Plenty of big CPU's with lots of cores there.

      As far as I know, full video cards can run circles around on-package GPUs. They are modular (I can use an old CPU and upgrade my video card independently), and the technology can still be made faster (we're not done innovating in that space). So, what benefits are there to an on-chip GPU that it would ever be an attractive option for a buyer? It will increase cost, be useful for only a certain market segment, and right now I completely discount this combined package because I think a video card gives me more performance bang for my buck than on-chip GPUs.

      There's this thing called the internet, which is heavily driven by visual content delivered to screens. With all the new web technologies such as WebGL, streaming video, and all sorts of other stuff, why would you not include a GPU? It's a desktop necessity nowadays. Just because it can't play Crysis in 4K at 240Hz doesn't mean it's useless.

      Lastly, why AMD? Their drivers are not near as good as NVidia's (IMHO), and they own less than half of the market share by comparison. I upgrade my rig infrequently, and I switched from AMD video cards back in 2012 because I had such a bad 4-year experience with AMD.

      Because Intel and AMD already cross-license technologies (hello, x86-64!). Business-wise, Nvidia doesn't need Intel, as they are doing quite well in the mobile, HPC, AI, deep learning, and autonomous automotive markets. And HPC/AI/deep learning is very profitable, as you can sell shitloads of chips at once to big customers with DEEP pockets. Intel is gearing up in some of those areas with the Xeon Phi and the FPGA tech they got from Altera. So Intel and Nvidia are going to compete head to head in markets from which AMD is pretty much absent. The enemy of my enemy is my friend, and AMD is more of a friend than Nvidia at this point. And as for your driver complaint, you think Intel would let that be a problem? I mean, who better than Intel to get those damn drivers into the Linux kernel? Intel CPU with excellent GPU with mainlined kernel drivers: win-win in my book.

      • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @09:58PM (1 child)

        by Anonymous Coward on Tuesday January 02 2018, @09:58PM (#616916)

        And as for your driver complaint, you think Intel would let that be a problem?

        Ever hear of GMA500? Poulsbo? Yeah, considering how Intel screwed me once with integrated graphics licensed from a third party, I absolutely believe they'd do it again.

        • (Score: 2) by LoRdTAW on Wednesday January 03 2018, @07:41PM

          by LoRdTAW (3755) Subscriber Badge on Wednesday January 03 2018, @07:41PM (#617308) Journal

          Yea, that was a hiccup. I had one of the Diamondvilles on an Intel ITX board with the GMA950. Terrible performance, but it was my first ITX/low-power system to play with. I had it hooked to a TV for a while as a media player, which it sucked at when it came to HD, but I didn't really care; then it was a small desktop before I shelved it.

      • (Score: 1) by waximius on Tuesday January 02 2018, @10:18PM (2 children)

        by waximius (1136) on Tuesday January 02 2018, @10:18PM (#616925) Homepage

        Thank you, great information. One follow up to your point:

        There's this thing called the internet which is heavily driven by visual content delivered to screens. With all the new web technologies such as WebGL, streaming video, and all sorts of other stuff, why would you not include a GPU? It's a desktop necessity nowadays. Just because it cant play Crysis in 4k at 240Hz doesn't mean it's useless.

        I didn't mean to imply that I thought this configuration was useless, but that it applies to a limited market segment. Based on what you say though, I can see that the segment is much broader than I initially thought. If I understand right, the on-package GPU could be used for rendering lighter weight things, and a full video card could still be added and utilized for heavier weight applications like gaming. It's not an "either-or" situation, but a "yes-and".

        I like that as long as an Intel+GPU combo doesn't speed up obsolescence. My current configuration of CPU + video card runs just fine 10 years after I built it. I've upgraded to an SSD, and upgraded my video card multiple times, but am still using a Core-i7 920 and only have 6GB RAM. The longevity of that processor has been amazing, and I'm just now thinking I should upgrade the CPU. Having the GPU on-chip scares me only because I feel like I need to upgrade my video card somewhat regularly (based on need, but games require better hardware every year).

        In any case, thanks for the response, very informative.

        • (Score: 3, Informative) by takyon on Tuesday January 02 2018, @10:45PM

          by takyon (881) Subscriber Badge <{takyon} {at} {soylentnews.org}> on Tuesday January 02 2018, @10:45PM (#616942) Journal

          The combination of an on-package GPU and the High Bandwidth Memory may have some advantages over discrete GPUs. Moving everything closer together helps overcome certain limits:

          http://www.nersc.gov/users/computational-systems/cori/application-porting-and-performance/using-on-package-memory/ [nersc.gov]
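          To put rough numbers on that bandwidth advantage, here is a back-of-envelope comparison. The single 1024-bit HBM2 stack and the dual-channel DDR4-2400 configuration are illustrative assumptions, not confirmed specs for these chips:

```python
def peak_bandwidth_gbs(bus_bits, transfers_per_sec):
    """Peak memory bandwidth in GB/s: bus width (bits) times transfer rate."""
    return bus_bits / 8 * transfers_per_sec / 1e9

# HBM2: one 4 GB stack with a 1024-bit bus at the rumored 800 MHz,
# double data rate -> 1.6 GT/s effective.
hbm2 = peak_bandwidth_gbs(1024, 800e6 * 2)   # ~204.8 GB/s

# DDR4-2400, dual channel: 2 x 64-bit at 2400 MT/s.
ddr4 = peak_bandwidth_gbs(2 * 64, 2400e6)    # ~38.4 GB/s

print(f"HBM2 stack: ~{hbm2:.1f} GB/s, dual-channel DDR4-2400: ~{ddr4:.1f} GB/s")
```

Even at a modest 800 MHz, the very wide HBM2 bus gives roughly 5x the peak bandwidth of the system DDR4, which is the point of putting it on-package next to the GPU.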

          Some users want smaller form factors, for Home Theater PCs (HTPCs) for example. This kind of on-package stuff might be cheaper than using a discrete GPU, with lower power consumption, but with better performance than integrated graphics. It might be worth it.

          Also, I was not sure when writing the summary, but I think both the Intel integrated graphics and the AMD Radeon Vega graphics may be included on these chips, in which case you might have a setup well suited to newer graphics APIs like Vulkan, which can take advantage of these disparate assets.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 3, Interesting) by LoRdTAW on Wednesday January 03 2018, @03:12AM

          by LoRdTAW (3755) Subscriber Badge on Wednesday January 03 2018, @03:12AM (#617051) Journal

          There are plenty of use cases where the on-die GPU is basically a necessity, even if it is low end. One I forgot to mention is that I think Chrome renders pages on the GPU if supported. Then there are GPU-accelerated 3D desktop compositing window managers. All of that is used on business desktops daily.

          My Linux box is an AMD A10 APU, which has plenty of CPU and GPU to do all the basics for development stuff and web browsing. I now wish it would morph into a Ryzen, but I'm not spending money on it just to have it, as I don't need all that CPU.

  • (Score: 2) by Azuma Hazuki on Tuesday January 02 2018, @10:27PM (3 children)

    by Azuma Hazuki (5086) Subscriber Badge on Tuesday January 02 2018, @10:27PM (#616932) Journal

    I don't like this. Intel essentially poached Koduri--we know *nothing* of how or why he joined Intel!--and now appears to be trying to create a mashup of Ryzen CPU performance with Raven Ridge-level IGP. It's an interesting technical angle, but something about this smells as far as the corporate side goes. Intel fights fucking dirty, always did, and they seem to want to undermine AMD instead of competing with them.

    --
    I am "that girl" your mother warned you about...
    • (Score: 4, Interesting) by takyon on Tuesday January 02 2018, @10:39PM (2 children)

      by takyon (881) Subscriber Badge <{takyon} {at} {soylentnews.org}> on Tuesday January 02 2018, @10:39PM (#616940) Journal

      AMD has done well with Ryzen and GPUs recently, but their easy cryptomining cash boost will probably dry up soon.

      This gets them to tap into a source of revenue. Although it probably doesn't help them with market share, they can leech off of Intel's.

      This is also less of a devastating self-flagellation for AMD than it might have been before they launched Ryzen. Ryzen has partially closed a huge IPC gap with Intel's CPUs, and it has inserted AMD back into desktops, which they had all but abandoned. Future iterations of their hardware might be somewhat more competitive, since Intel has struggled to move past 14nm; although GlobalFoundries, Samsung, et al. are said to have crappier process nodes than Intel, they are moving a little faster (for example, GlobalFoundries' 7nm might be comparable to Intel's 10nm, but 7nm will be around before Intel gets much 10nm stuff out).

      AMD's next big move may be to muscle into the machine learning and automotive territory where Nvidia is riding high. There was talk of a GPU made for Tesla (the car company, not the Nvidia product).

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Tuesday January 02 2018, @10:57PM

        by Anonymous Coward on Tuesday January 02 2018, @10:57PM (#616951)

        With Ryzen and Vega, AMD has fully thrown itself into the 'secured for government spying' arena, with backdoored CPUs and GPUs with signed firmware and potential hypervisor-level backdoors that co-opt what little of the mainstream computer market was left after Intel was done with it.

        If you consider this from the intelligence-technology complex mindset, throwing AMD a few bones to help provide access to 99 percent of non-cellular computer users *IN THE WORLD* is a bargain to keep them in business while also ensuring the entire x86 market follows in lock-step the controls being put in place to either control or surveil the public.

        Given that basically all major software requires x86 today, even the software that doesn't require Windows, this gives a dramatic amount of potential (even if unused) surveillance power to the gatekeepers who control it. In this case the NSA, Mossad, GCHQ, and possibly Japan through SoftBank's newfound ownership of ARM.

        Without competitors coming onto the market, especially competitors from other regions/nationalities, and ideally from other countries that still believe in privacy, if not free speech, we are rapidly approaching the sort of technological tipping point that Continuum depicted, with one technology co-opting control of the world toward one group's ideological ends.

      • (Score: 2) by LoRdTAW on Wednesday January 03 2018, @03:24AM

        by LoRdTAW (3755) Subscriber Badge on Wednesday January 03 2018, @03:24AM (#617053) Journal

        And, let's just throw this in here: what if Intel worked with AMD to enable AMD discrete GPU cards to work in tandem with either chip in a laptop or desktop? You can have it either way and everything plays nicely: Ryzen APU + AMD GPU || Intel [AMD] APU + AMD GPU. Pick your CPU core of choice. Then the main GPU can be turned off when just browsing or watching Netflix, and throttle up for the latest mmo-fps-rpg-whatever or buttcoin mining.
