
posted by takyon on Tuesday November 07 2017, @04:34AM   Printer-friendly
from the AMD-Inside™ dept.

Intel squeezed an AMD graphics chip, RAM and CPU into one module

The new processor integrates a "semi-custom" AMD graphics chip and second-generation High Bandwidth Memory (HBM2), which is comparable to the GDDR5 used in a traditional laptop.

Intel CPU and AMD GPU, together at last

Summary of Intel's news:

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD's Radeon Technologies Group* – all in a single processor package.

[...] At the heart of this new design is EMIB (Embedded Multi-Die Interconnect Bridge), a small intelligent bridge that allows heterogeneous silicon to quickly pass information in extremely close proximity. EMIB eliminates height impact as well as manufacturing and design complexities, enabling faster, more powerful and more efficient products in smaller sizes. This is the first consumer product that takes advantage of EMIB.

[...] Additionally, this solution is the first mobile PC to use HBM2, which consumes much less power and takes up less space compared to traditional discrete graphics-based designs using dedicated graphics memory, like GDDR5 memory.

takyon: This is more like an "integrated discrete GPU" than standard integrated graphics. It also avoids the need for Intel to license AMD's IP. AMD also needs to make a lot of parts, since its wafer supply agreement with GlobalFoundries penalizes AMD if it buys fewer than a target number of wafers each year.

Also at AnandTech and Ars Technica.

Previously: AMD Stock Surges on Report of Intel Graphics Licensing Deal, 16-Core Ryzen Confirmed

Related: Samsung Increases Production of 8 GB High Bandwidth Memory 2.0 Stacks


Original Submission #1 · Original Submission #2

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by MichaelDavidCrawford on Tuesday November 07 2017, @05:17AM (4 children)

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Tuesday November 07 2017, @05:17AM (#593493) Homepage Journal

    ... when I got paid, but instead I'll wait until Apple ships a model with this chip in it.

    Right around that same time, Intel will announce an even better chip.

    When carried to its logical conclusion, this corollary to Zeno's Paradox results in my new MacBook Pro being unobtainable.

    While I have a Mac mini - which I truly enjoy; Linux will never be ready for the desktop, while Windows is like pounding nails with my fists - I need two Macs to develop drivers, as that's what the two-machine debugger requires.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 3, Insightful) by Anonymous Coward on Tuesday November 07 2017, @09:04AM (2 children)

      by Anonymous Coward on Tuesday November 07 2017, @09:04AM (#593559)

      For me, Linux is already ready for the desktop (I've been using it on all my computers since 2000).

      • (Score: 3, Interesting) by stormreaver on Tuesday November 07 2017, @01:56PM

        by stormreaver (5101) on Tuesday November 07 2017, @01:56PM (#593633)

        I started using Linux as my exclusive desktop system in 1999. Windows has never been ready for the desktop. It was forced onto everybody for so long, though, that people just learned how to shoehorn it into doing the job.

        I have transitioned a number of non-technical people to Kubuntu over the years, and all but one refused to return to Windows.

      • (Score: 0) by Anonymous Coward on Tuesday November 07 2017, @07:36PM

        by Anonymous Coward on Tuesday November 07 2017, @07:36PM (#593782)

        What's nice about this announcement is its timing. I think the approach in this presentation will work on the Intel CPU:
        http://schd.ws/hosted_files/ossna2017/91/Linuxcon%202017%20NERF.pdf [schd.ws]
        Get rid of most of UEFI and rewrite the firmware on the motherboard. If that doesn't keep out the spooks/whoever, I'd plan on disabling the onboard ethernet and adding an ethernet card of a different type so the driver in firmware is useless. Which leaves me with one question: will the graphics chip have an open-source driver for Linux?
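
        One way to answer that last question on a running Linux box is to check which kernel driver binds to the GPU in sysfs. A minimal sketch (standard sysfs/PCI paths, nothing specific to this particular part, and assuming Python is available):

        import os

        PCI_ROOT = "/sys/bus/pci/devices"

        for dev in sorted(os.listdir(PCI_ROOT)):
            dev_path = os.path.join(PCI_ROOT, dev)
            # PCI class 0x03xxxx means "display controller" (VGA, 3D, etc.).
            with open(os.path.join(dev_path, "class")) as f:
                if not f.read().strip().startswith("0x03"):
                    continue
            # The "driver" symlink points at the bound kernel driver, e.g. amdgpu or i915.
            driver_link = os.path.join(dev_path, "driver")
            driver = os.path.basename(os.readlink(driver_link)) if os.path.islink(driver_link) else "(none)"
            print(f"{dev}: driver={driver}")

        If the Radeon part shows up bound to amdgpu, it's using the in-kernel open-source driver; if the link is missing, nothing has claimed it.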

    • (Score: 1) by xhedit on Tuesday November 07 2017, @01:03PM

      by xhedit (6669) on Tuesday November 07 2017, @01:03PM (#593621)

      Linux is great on the desktop if you aren't a washed up hack.

  • (Score: 0) by Anonymous Coward on Tuesday November 07 2017, @09:08AM (10 children)

    by Anonymous Coward on Tuesday November 07 2017, @09:08AM (#593562)

    So Intel effectively admits that its own graphics is worse than AMD's?

    BTW, the "Slow Down Cowboy" message should be placed somewhere you can actually see it, not just run into it by accident.

    • (Score: 4, Interesting) by TheRaven on Tuesday November 07 2017, @10:45AM (7 children)

      by TheRaven (270) on Tuesday November 07 2017, @10:45AM (#593591) Journal
      I don't think that's a shock to anyone. There were basically two high-end discrete GPU vendors left: nVidia and ATi. Then AMD bought ATi. If Intel wants to be in this space, then they have three choices: Invest a lot more in their GPU division, buy nVidia or license nVidia GPUs, or license AMD GPUs. Buying nVidia would likely hit a lot of antitrust issues (and there's also a lot of nVidia that Intel doesn't want). This was a pretty obvious move for Intel, if they could manage to persuade AMD to agree to it. From AMD's perspective, it has up and down sides: on the plus side, it's probably getting AMD GPUs to a lot more customers. On the down side, their on-die GPUs were one of their major competitive advantages over Intel. I guess they figured that the increase in GPU sales would offset the loss in CPU sales.
      --
      sudo mod me up
      • (Score: 2) by Spamalope on Tuesday November 07 2017, @11:12AM (4 children)

        by Spamalope (5233) on Tuesday November 07 2017, @11:12AM (#593595) Homepage

        The chips may be destined for a market AMD doesn't have any penetration into, so it won't cannibalize sales.
        On the other hand, perhaps they're hoping Intel won't invest as heavily in their own solutions and that the product will be very successful. A few generations later, AMD would be in a much better negotiating position. (aka, use this to cut off the air supply of any other cheap embedded solutions.)

        • (Score: 2) by tonyPick on Tuesday November 07 2017, @11:18AM (3 children)

          by tonyPick (1237) on Tuesday November 07 2017, @11:18AM (#593598) Homepage Journal

          It'll be interesting to see how it stacks up against Ryzen Mobile, though - this looks like a fairly direct competitor there.

          https://www.amd.com/en/products/ryzen-processors-laptop [amd.com]

          • (Score: 4, Interesting) by TheRaven on Tuesday November 07 2017, @05:04PM (2 children)

            by TheRaven (270) on Tuesday November 07 2017, @05:04PM (#593720) Journal
            That link indicates that not only is AMD not supporting LPDDR, they don't even realise that it's a requirement for a lot of the mobile market. A 15W TDP is nice for the CPU, but if it has to be coupled with 12W-at-idle RAM instead of 2W-at-idle RAM then it's suddenly much less interesting.
            --
            sudo mod me up
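
            To put those idle numbers in perspective, a back-of-envelope sketch (the 50 Wh battery and 3 W rest-of-system idle draw are assumed figures chosen only to show the scale of the gap, not data from either vendor):

            battery_wh = 50.0   # assumed battery capacity
            base_idle_w = 3.0   # assumed display + SoC + storage idle draw

            for ram_idle_w in (2.0, 12.0):
                total_w = base_idle_w + ram_idle_w
                print(f"RAM idle {ram_idle_w:4.1f} W -> system idle {total_w:4.1f} W "
                      f"-> about {battery_wh / total_w:.1f} h on battery")

            That works out to roughly 10 hours versus about 3.3 hours of idle runtime, which is the gap the comment above is pointing at.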
            • (Score: 3, Informative) by takyon on Tuesday November 07 2017, @11:48PM (1 child)

              by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday November 07 2017, @11:48PM (#593883) Journal

              DDR4 [slickdeals.net] vs. LPDDR4 [slickdeals.net]

              LPDDR4 in a laptop seems like a goddamn unicorn to me. So it clearly does not matter.

              --
              [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
              • (Score: 3, Informative) by TheRaven on Wednesday November 08 2017, @10:14AM

                by TheRaven (270) on Wednesday November 08 2017, @10:14AM (#594005) Journal
                LPDDR3 is common in laptops, LPDDR4 isn't because no laptop CPUs yet support it (Intel's Cannonlake will, and was originally due to be released 9 months ago, but is now delayed for another 3-6 months). When it's finally released, Intel's mobile chips will support 32GB of LPDDR. A lot of people are eagerly awaiting that, because laptops have been stuck with either 16GB of RAM or really crappy battery life for over four years, which is longer than most corporate upgrade cycles. If RAM is a limiting factor for your workloads, Cannonlake is going to be a big deal. AMD's offerings don't even support LPDDR3, so can't compete in the low-power space with Intel.
                --
                sudo mod me up
      • (Score: 2) by richtopia on Tuesday November 07 2017, @04:33PM (1 child)

        by richtopia (3160) on Tuesday November 07 2017, @04:33PM (#593704) Homepage Journal

        The Anandtech article speculates that these chips will still have the Intel graphics on-die for low-power applications. Intel has never sold their GPU independent of a mobo or CPU, so they realize that they have a niche. But they do dominate that low-power space. In addition, Intel can leverage these cores for tasks such as video encoding (Intel Quick Sync Video) or driving an additional monitor.

        It will be interesting to see whether AMD ever brings something to this market segment (a mobile 45W CPU with graphics) and how it would compete in low-power applications. However, I suspect AMD's marketing team realizes they can only compete with Intel directly in certain segments, and I doubt high-performance mobile will be one of them.

    • (Score: 2) by Wootery on Tuesday November 07 2017, @11:14AM (1 child)

      by Wootery (2341) on Tuesday November 07 2017, @11:14AM (#593596)

      So Intel effectively admits that its own graphics is worse than AMD's?

      Well, sure. Intel is doing pretty well considering where they used to be (awful embedded graphics), but they're not seriously competing with the big boys.

      • (Score: 2) by bob_super on Tuesday November 07 2017, @05:42PM

        by bob_super (1357) on Tuesday November 07 2017, @05:42PM (#593737)

        Just when Intel Graphics were getting "good enough" for HD, the industry moved to 4K...
        They've got some catching up to do (in performance; they still have the lead in volume), and in the meantime propping up AMD a bit keeps the regulators (not the US obviously, but the rest of the world) at bay without making an impact on the bottom line.

  • (Score: 3, Insightful) by ledow on Tuesday November 07 2017, @01:34PM

    by ledow (5567) on Tuesday November 07 2017, @01:34PM (#593627) Homepage

    Intel was always running last in graphics anyway.

    Intel + nVidia is the gamer's combo.

    AMD + AMD/ATI is the cheap gamer's combo / the "I have better numbers for 2.5 seconds until another product comes out" show-off.

    Intel does need better on-board graphics, there's no question. Licensing nVidia would pretty much cut AMD out of the market entirely overnight. Who would bother to buy AMD?

    So to combat that I wouldn't be surprised if AMD approached Intel about putting their GPU on Intel's chips, to try to stay relevant and not get shut out entirely (and/or have to sue under anti-trust to prevent such a deal in the first place).

    But... though I would like a better default GPU on all machines, so I don't have to explain to people that they must have another card / a particular chipset to play even the most basic of games (e.g. The Sims series etc.), I can still see people buying Intel and putting nVidia in if they want it to be a gamer's machine.

    I reckon AMD's counter to that would be something like an SLI mechanism, so the on-board GPU can help an AMD PCIe card a little.

    It seems the only logical way forward that doesn't end up in an Intel + nVidia monopoly that could quickly turn on the consumer.

    And I'll be quite happy to have a decent-enough GPU - even if it is AMD - in processors by default in 5-10 years' time. It would mean that things like OpenGL / Vulkan etc. would become de facto standards rather than a bolt-on or severely limited. And maybe we'd even get some decent drivers / abstraction layers out of it (but that's hoping for a lot!).

    Roll on the days where you can just assume that playing a basic 3D game, running something in OpenCL or WebGL, or running something like a browser in accelerated mode won't kill a machine, even a business-class machine.
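
    As a sanity check for that "just assume it works" future, here is a minimal sketch that probes whether any OpenCL device is exposed at all (it relies on the third-party pyopencl package and an installed OpenCL runtime, which is exactly the assumption you can't make on a typical business-class machine today):

    import pyopencl as cl   # third-party package: pip install pyopencl

    try:
        platforms = cl.get_platforms()
    except cl.LogicError:
        platforms = []      # no OpenCL ICD installed at all

    if not platforms:
        print("No OpenCL platforms found - fall back to a CPU code path.")
    else:
        for p in platforms:
            for d in p.get_devices():
                print(f"{p.name}: {d.name} ({cl.device_type.to_string(d.type)})")

    A default-decent GPU would make the "fall back to CPU" branch the rare case instead of the expected one.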

  • (Score: 3, Informative) by Rich on Tuesday November 07 2017, @02:35PM (3 children)

    by Rich (945) on Tuesday November 07 2017, @02:35PM (#593649) Journal

    Look at the accumulated and current losses of AMD. That's not sustainable. But Intel fears antitrust regulators more than it fears AMD. So, every now and then, they throw them a little bone to chew on. :)

    (Disclaimer: I hold a couple of AMD shares. Should've sold them at 44 in 2004 instead of Apple...)

    • (Score: 2) by RS3 on Tuesday November 07 2017, @04:17PM (2 children)

      by RS3 (6367) on Tuesday November 07 2017, @04:17PM (#593695)

      (Disclaimer: I hold a couple of AMD shares. Should've sold them at 44 in 2004 instead of Apple...)

      D'oh!

      In all fairness, Apple wasn't looking strong in 2004.

      • (Score: 0) by Anonymous Coward on Tuesday November 07 2017, @04:33PM (1 child)

        by Anonymous Coward on Tuesday November 07 2017, @04:33PM (#593705)

        Bitcoin wasn't looking strong in 2010 (kill me).

        • (Score: 0) by Anonymous Coward on Tuesday November 07 2017, @05:28PM

          by Anonymous Coward on Tuesday November 07 2017, @05:28PM (#593731)

          Bitcoin wasn't looking strong in 2010 (kill me).

          Bitcoin wasn't worth my CPU cycles and $0.07/kWh electricity in March 2009 .... and I was one of the top-1000 participants in SETI@home and the public crypto factoring challenge (forgot what it was called now) ... so yeah...

          But you know, don't cry over spilled milk.
