AMD Stock Surges on Report of Intel Graphics Licensing Deal, 16-Core Ryzen Confirmed

posted by on Thursday May 18 2017, @08:34AM
from the threadripper?-really? dept.

Shares of AMD rose 11.6% on Tuesday as Fudzilla reported that Intel would license graphics technologies from AMD after a similar deal with Nvidia expired two months earlier. The deal has not been confirmed.

On the other hand, AMD's 16-core "Threadripper" enthusiast/HEDT CPUs have been confirmed:

With one of the gnarliest CPU codenames we've ever seen, the Threadripper multicore monsters will go head to head with Intel's Broadwell-E and upcoming Skylake-E High-End Desktop (HEDT) CPUs alongside a new motherboard platform that promises expanded memory support and I/O bandwidth. That's likely to take the form of quad-channel RAM and more PCIe lanes, similar to Intel's X99 platform, but AMD is saving further details for its press conference at Computex at the end of May.

AMD's 32-core "Naples" server chips are now known as... "Epyc".

You have seen the launch of 4-, 6-, and 8-core AMD Ryzen parts. How do you feel about 10, 12, 14, and 16 cores (prices unknown, likely $1,000 or more for 16 cores)?

Previously: CPU Rumor Mill: Intel Core i9, AMD Ryzen 9, and AMD "Starship"


Original Submission

Related Stories

CPU Rumor Mill: Intel Core i9, AMD Ryzen 9, and AMD "Starship" 9 comments

AMD is rumored to be releasing a line of Ryzen 9 "Threadripper" enthusiast CPUs that include 10, 12, 14, or 16 cores. This is in contrast to the Ryzen lines of AMD CPUs that topped out at the 8-core Ryzen 7 1800X with a base clock of 3.6 GHz.

Meanwhile, Intel is supposedly planning to release 6, 8, 10, and 12 core Skylake-X processors under an "Intel Core i9" designation. Two Kaby Lake-X chips are also mentioned: a quad-core, and another quad-core with hyper-threading disabled.

Finally, AMD's 32-core "Naples" server chips could be succeeded in late 2018 or 2019 by a 48-core 7nm part nicknamed "Starship". GlobalFoundries plans to skip the 10nm node, and where GF goes, AMD follows. Of course, according to Intel, what really matters are transistors per square millimeter.

All of the processors mentioned could be officially announced at Computex 2017, running from May 30 to June 3. Expect the high end desktop (HEDT) CPUs to be in excess of $500 and as high as $1,500. Intel may also announce Coffee Lake CPUs later this year including a "mainstream" priced 6-core chip.


Original Submission

Intel Announces Core H Laptop Chips With AMD Graphics and High Bandwidth Memory 21 comments

Intel squeezed an AMD graphics chip, RAM and CPU into one module

the new processor integrates a "semi-custom" AMD graphics chip and the second generation of Intel's "High Bandwidth Memory (HBM2)", which is comparable to GDDR5 in a traditional laptop.

Intel CPU and AMD GPU, together at last

Summary of Intel's news:

The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD's Radeon Technologies Group* – all in a single processor package.

[...] At the heart of this new design is EMIB (Embedded Multi-Die Interconnect Bridge), a small intelligent bridge that allows heterogeneous silicon to quickly pass information in extremely close proximity. EMIB eliminates height impact as well as manufacturing and design complexities, enabling faster, more powerful and more efficient products in smaller sizes. This is the first consumer product that takes advantage of EMIB.

[...] Additionally, this solution is the first mobile PC to use HBM2, which consumes much less power and takes up less space compared to traditional discrete graphics-based designs using dedicated graphics memory, like GDDR5 memory.

takyon: This is more like an "integrated discrete GPU" than standard integrated graphics. It also avoids the need for Intel to license AMD's IP. AMD also needs to make a lot of parts, since its wafer supply agreement with GlobalFoundries penalizes AMD if it buys fewer than a target number of wafers each year.

Also at AnandTech and Ars Technica.

Previously: AMD Stock Surges on Report of Intel Graphics Licensing Deal, 16-Core Ryzen Confirmed

Related: Samsung Increases Production of 8 GB High Bandwidth Memory 2.0 Stacks


Original Submission #1 | Original Submission #2

  • (Score: 4, Funny) by qzm on Thursday May 18 2017, @08:49AM

    by qzm (3260) on Thursday May 18 2017, @08:49AM (#511599)

    And by the time this news hit Soylent, the stock was back below the start of the 'surge'
    Blink and you missed it, folks :)

On a side note, whoever thought the name EPYC was a good idea needs to be tarred, feathered, hanged, drawn, quartered, and made to eat lutefisk as their last dinner.

    EPIC was such a resoundingly successful piece of server technology, after all.....

I am guessing the kids who made that decision weren't around back then, though; after all, it was YEARS ago.

  • (Score: 2) by kaszz on Thursday May 18 2017, @09:36AM (1 child)

    by kaszz (4211) on Thursday May 18 2017, @09:36AM (#511607) Journal

So will this mean fewer graphics drivers for free open source systems? I.e., Intel policy vs. AMD/ATI.

    Btw, cores are good.. if software can and will make use of them....

    • (Score: 0) by Anonymous Coward on Thursday May 18 2017, @12:33PM

      by Anonymous Coward on Thursday May 18 2017, @12:33PM (#511662)

So will this mean fewer graphics drivers

      With self-driving cars, who needs drivers anymore?

  • (Score: 3, Insightful) by Hairyfeet on Thursday May 18 2017, @09:38AM (19 children)

    by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Thursday May 18 2017, @09:38AM (#511608) Journal

Most folks simply cannot come up with enough work to keep a 10-year-old CPU well fed, much less a monster like this. Sure, some guys will buy it for bragging rights, and I'm sure there are a few with niche applications that will be able to keep that monster chomping data, but I'd say a good 97% of the population? Could have the latest Intel or Ryzen replaced with a Q6600 or Phenom II X4 and they would never know the difference. Hell, it's even become true for gamers: I have several gamer customers who are a generation or two behind, and all their games are smooth as butter with a $180-$250 GPU. That includes myself; even after gaming for hours, half my cores are parked because there just isn't enough for them to do, and that is with an FX-8320E, which although released in 2015 is just a gold-binned FX-8350 from 2012.

    The simple fact is when we went from the MHz wars to core wars? The software just didn't keep up like it did when all they had to worry about was a single core.

    --
    ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
    • (Score: 3, Informative) by kaszz on Thursday May 18 2017, @10:08AM

      by kaszz (4211) on Thursday May 18 2017, @10:08AM (#511618) Journal

      It's way easier to exploit more Hz than it is to do the same with multiple cores. That requires re-thinking.. shudder computations. ;-)
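To make that re-think concrete, here is a minimal Python sketch (the workload is a made-up stand-in): the serial habit is one loop on one core, while the multi-core version has to split the data, farm it out, and merge the results.

```python
# Minimal sketch of the "re-think": a serial loop becomes split/map/merge.
# The workload (sum of squares) is a made-up stand-in for real computation.
from multiprocessing import Pool

def work(chunk):
    return sum(x * x for x in chunk)

def serial(data):
    # The old habit: one core, however many Hz it has
    return work(data)

def parallel(data, ncores=4):
    # The re-think: split the data, farm chunks out to cores, merge results
    size = len(data) // ncores
    chunks = [data[i * size:(i + 1) * size] for i in range(ncores)]
    chunks[-1].extend(data[ncores * size:])  # don't drop the remainder
    with Pool(ncores) as pool:
        return sum(pool.map(work, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    assert serial(data) == parallel(data)
```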

But see it on the positive side. People who need to brag and buy these things will pay the development costs for those users that actually need them. If they suck up Microsoft then DRAM makers also become happy and GDP goes to the top! ;-)

      If computing becomes really cheap, then maybe there will also be chips without Backdoor Management Spying?

    • (Score: 1) by garrulus on Thursday May 18 2017, @10:22AM (5 children)

      by garrulus (6051) on Thursday May 18 2017, @10:22AM (#511623)

But single-core performance still benefits games the most, and that's what these new CPUs provide.
Ryzen has roughly 2x the single-core performance of my Phenom II X4; you need it at peak load moments.

      • (Score: 2) by takyon on Thursday May 18 2017, @12:27PM (3 children)

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday May 18 2017, @12:27PM (#511656) Journal

        Ok, but the story is about Ryzen 9 which probably has lower clock speeds than Ryzen 7 while costing twice as much. For example, Ryzen 7 1800X at 3.6 GHz (base) for $500, Ryzen 9 1998X at 3.5 GHz for $1000+. So if you can't make use of the additional 8 cores/16 threads, it is less than half the performance per dollar.
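Back-of-the-envelope, using those hypothetical clocks and prices:

```python
# Rough performance-per-dollar with the hypothetical figures above:
# Ryzen 7 1800X (8 cores @ 3.6 GHz, $500) vs. a
# "Ryzen 9 1998X" (16 cores @ 3.5 GHz, $1000).
def perf_per_dollar(cores, ghz, price, usable=None):
    used = min(cores, usable or cores)
    return used * ghz / price

r7      = perf_per_dollar(8, 3.6, 500)             # 0.0576
r9_all  = perf_per_dollar(16, 3.5, 1000)           # 0.0560, all cores busy
r9_half = perf_per_dollar(16, 3.5, 1000, usable=8) # 0.0280, 8 cores idle

print(r9_half / r7)  # ~0.49: less than half the performance per dollar
```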

        All of the Phenom II X4 chips have 4 cores, so the comparable Ryzens would be Ryzen 5 1500X ($189), Ryzen 5 1400 ($169), some other unreleased Ryzen 5 chips, or the upcoming Ryzen 3 1200X, Ryzen 3 Pro 1200, Ryzen 3 1100, and Ryzen 3 Pro 1100 [wccftech.com]. The Ryzen 3 chips will have only one thread per core, same as the Phenom II chips.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by bob_super on Thursday May 18 2017, @07:31PM (2 children)

          by bob_super (1357) on Thursday May 18 2017, @07:31PM (#511798)

          My workloads stretch to 8 threads each, with single-threaded performance and memory bandwidth being critical. Hyperthreading isn't bad, but it's not perfect.
          I need >9 cores (one for the OS) and a metric ton of cache.

          If running the $1000 chip saves me 15 minutes every time a job is queued, it pays for itself in less than a quarter (I wish my boss would finally understand that).
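Back-of-the-envelope on that claim (the hourly rate and job frequency are invented for illustration):

```python
# ROI sketch: how many 15-minute savings pay off a $1000 chip?
chip_cost   = 1000.0   # dollars
saved_hours = 0.25     # 15 minutes per queued job
rate        = 100.0    # hypothetical loaded cost of an engineer, $/hour
jobs_needed = chip_cost / (saved_hours * rate)  # 40 jobs
print(jobs_needed)     # at ~1 job per working day, well under a quarter
```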

Xeon 10+ core chips, with their insane price tags, ECC I don't care about, and slower clocks, don't have the same ROI. The "Enthusiast" market is the sweet spot for a lot of people.

          • (Score: 2) by takyon on Thursday May 18 2017, @08:15PM

            by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday May 18 2017, @08:15PM (#511809) Journal

            All of the Ryzen CPUs have ECC support. Even the cheapest one I can find, Ryzen 3 1100. Some here would not touch it if it did not have ECC support.

            Ryzen 9 1955X and Ryzen 9 1955 are supposedly the chips with 10 cores. 1955 supposedly runs at 3.1/3.7 GHz and 1955X at 3.6/4.0 GHz. The price should be well under $1000.

            --
            [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by Hairyfeet on Sunday May 21 2017, @06:09AM

            by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Sunday May 21 2017, @06:09AM (#512906) Journal

It sounds like what you want is "Threadripper", the new AMD chip: 16 cores and 32 threads with a reported MSRP of $1000. If you can really slam that many cores? Sounds like the chip for you.

            --
            ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
      • (Score: 2) by Hairyfeet on Thursday May 18 2017, @11:24PM

        by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Thursday May 18 2017, @11:24PM (#511891) Journal

But if you aren't even maxing out the cores that you have, what does it matter? Your cores can twiddle their thumbs much faster now? I only have two applications where half my cores aren't permanently parked, Audacity and Handbrake, and even on those I'm not maxing my cores; a few spikes here and there. But the difference between the chip I have and Ryzen when it comes to doing the actual jobs I have to do? I'd have to keep a Ryzen for a decade to come out ahead, because the price of electricity is low in my area.

        And that is why I've gotten into HTPCs and home theater installs, because the same is true for a huge chunk of the population. Hell if all you are doing is surfing, watching vids, light photo editing and using FB like a lot of folks? Well I have customers with C2D laptops that are completely happy with what they have after I upgraded it to an SSD. Even my gamer customers see more benefit these days from a GPU or PCIe SSD upgrade than they do CPU because so few of the games are CPU bound these days. Maybe it will change in the future but I'm sure both AMD and Intel have sunk a ton into finding ways to use more cores effectively and so far no joy, we have had multicores since 2006 and after 11 years we still seem to be no better at using cores than we were a decade ago.

        --
        ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
    • (Score: 3, Interesting) by takyon on Thursday May 18 2017, @12:07PM (3 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday May 18 2017, @12:07PM (#511646) Journal

      Gaming is becoming more multithreaded. Now you have two major consoles with "eight" cores (6-7 usable). A lot of existing computers have quad cores, with HyperThreading in many cases, and now 6+ cores is becoming more common or at least cheap enough to be accessible. Some games could certainly use more cores if they were available. This is apparent for games where many different AI are "thinking" in parallel and adding each additional AI to a map uses more threads. But there must be a baseline and if consoles are raising the bar, that means more core/thread utilization for PC gamers.
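As a rough sketch of that fan-out (the Agent class and its think() method are invented for illustration):

```python
# Shape of the pattern: each AI agent "thinks" on a worker, so adding
# agents adds parallel work. A real engine would use native threads or
# processes, since CPython's GIL limits pure-Python threads to
# concurrency rather than true CPU parallelism.
import random
from concurrent.futures import ThreadPoolExecutor

class Agent:
    def __init__(self, name):
        self.name = name

    def think(self):
        # Stand-in for pathfinding/planning work
        return (self.name, max(random.random() for _ in range(100_000)))

agents = [Agent(f"ai-{i}") for i in range(32)]

# One AI tick: fan out, gather every agent's decision
with ThreadPoolExecutor(max_workers=8) as pool:
    decisions = list(pool.map(Agent.think, agents))
```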

Due to physical limits, you will get more and more cores over time as we continue to shake out the last easy scaling improvements over the next decade. Some customers/applications can already make use of as many cores as can be thrown at the problem. They are the ones who MIGHT pay $1000 or more for a Ryzen 9. And as more cores per node become available, other applications will focus on multithreading (where possible). The CPU manufacturers' only other options (if not shrinking the die) are to devote more die space to integrated graphics, fixed-function decoders (8K AV1 [wikipedia.org] 12-bit support, anyone?), or perhaps on-chip memory. None of the Ryzen chips announced yet even come with integrated graphics. Maybe that will be addressed in future generations (whatever happened to CrossFire with integrated graphics, or Vulkan/Mantle combining integrated graphics with a discrete GPU?).

      The market will let us know if putting 16 or more cores on a consumer chip was a good idea. Even if some of the users are just throwing money away.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by c0lo on Thursday May 18 2017, @12:31PM (2 children)

        by c0lo (156) Subscriber Badge on Thursday May 18 2017, @12:31PM (#511659) Journal

The CPU manufacturers' only other options (if not shrinking the die) are to devote more die space to integrated graphics, fixed-function decoders (8K AV1 [wikipedia.org] 12-bit support, anyone?), or perhaps on-chip memory.

        You missed an item on this list: the NSA backdoor.
This will require a boost in performance and accessibility: the NSA budget is plateauing, and using contractors to keep costs down leads to leaks.
        It will need to be supported by hardware, hacking Windows only gets them so far.

        (grin)

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 2) by takyon on Thursday May 18 2017, @12:37PM (1 child)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday May 18 2017, @12:37PM (#511663) Journal

          The backdoors are already in there, they don't need more die space. (groan)

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by c0lo on Thursday May 18 2017, @01:52PM

            by c0lo (156) Subscriber Badge on Thursday May 18 2017, @01:52PM (#511678) Journal

            The backdoors are already in there, they don't need more die space.

            They may be already, but they are puny and weakly represented.

We need more die space and perhaps more of them backdoors on a single chip - the internationalized version, if you like. At least one for the NSA, another one for the CIA, one more for GCHQ, perhaps one for Mossad, and let's not forget the FSB as well (no, the Chinese don't need one special for them, they'll be consuming their own [wikipedia.org] shit [wikipedia.org] anyway).
Of course, depending on the market, not all of them need to be active at the same time, but they must be able to be activated at any moment - because most of the computers will eventually reach Africa (that's where Afghanistan is, right? - grin -) as ewaste, and ISIS will surely recycle and use some of them. See terrorists and children, think of them.

And, err... yes... grin... should've gone to sleep earlier.

            --
            https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 2) by c0lo on Thursday May 18 2017, @02:01PM (3 children)

      by c0lo (156) Subscriber Badge on Thursday May 18 2017, @02:01PM (#511680) Journal

even after gaming for hours, half my cores are parked because there just isn't enough for them to do

Huh!?!?? Here's the solution [stanford.edu]; your cores will never get bored.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by Hairyfeet on Sunday May 21 2017, @06:13AM (2 children)

        by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Sunday May 21 2017, @06:13AM (#512908) Journal

Uhhh... why would I want to waste more power by forcing my cores to stay running when they have nothing to do? Tests have shown that the OS can go from parked to unparked in milliseconds, no programs I run would be affected by a millisecond wait to switch it on, and my games never use more than 4 cores, so what would be the point?

        --
        ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
        • (Score: 2) by c0lo on Sunday May 21 2017, @10:16AM (1 child)

          by c0lo (156) Subscriber Badge on Sunday May 21 2017, @10:16AM (#512959) Journal

          so what would be the point?

          For the glory of S/N's F@H team?

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
          • (Score: 2) by Hairyfeet on Wednesday May 24 2017, @07:24AM

            by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Wednesday May 24 2017, @07:24AM (#514706) Journal

            Sorry but I try to save power and thus the environment (since my state still has 5 coal fired power plants [sourcewatch.org]) so cranking up those cores plus the extra heat that would require extra cooling? Wouldn't be a great idea in my area, especially with the summer heat waves coming up.

            --
            ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
    • (Score: 5, Insightful) by tonyPick on Thursday May 18 2017, @02:04PM (1 child)

      by tonyPick (1237) on Thursday May 18 2017, @02:04PM (#511682) Homepage Journal

      The simple fact is when we went from the MHz wars to core wars? The software just didn't keep up like it did when all they had to worry about was a single core.

Counterexamples - Compiling software, Video Editing & Encoding, Image Manipulation, Audio processing, Raytracing & compositing, VMs, anybody who does more than one thing at a time.....

      Gamers, not so much, but as a serious workstation CPU then the performance per dollar for Ryzen looks pretty good, and if the cores can be fed without becoming IO bound there's a definite win for a number of serious workloads.

      • (Score: 3, Informative) by fyngyrz on Thursday May 18 2017, @04:14PM

        by fyngyrz (6567) on Thursday May 18 2017, @04:14PM (#511725) Journal

        Also SDR (Software Defined Radio) software - of which there are a fair number of users out there, BTW.

        I write both SDR software and image manipulation software. I make extensive use of multiple cores in both.

        I would welcome a 16-core CPU. I run a 12/24 core machine now. 16/32 would be just fine with me, assuming equal or better capacity per core.

        Having said that, main memory speed is extremely important, particularly with lower numbers of cores. Caches aren't large enough -- not even close -- to do many things that deal with large data and tables. For instance, certain types of image processing access many regions of the image more-or-less simultaneously-ish and repeatedly, while moving through those regions. All that does to cache is make it consistently miss. At that point, the CPU instruction cycle time and the main memory speed are the limiting factors. However, as the core count goes up, in many cases, the slices or regions assigned to each core become smaller, and the odds of the cache missing drop given similar workloads. Likewise, large tables of pre-calculated values don't get effectively cached, so that reduces the time savings of the precalculation, and in the smaller computation-avoided cases renders the technique moot.
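A sketch of that region-splitting idea, assuming numpy and a trivial stand-in operation:

```python
# Give each core a band of rows so its working set is smaller and
# cache-friendlier. The per-pixel op is a stand-in for real filtering.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def process_band(band):
    return np.sqrt(band)  # stand-in for a real image operation

def process_image(img, ncores=4):
    bands = np.array_split(img, ncores, axis=0)  # one band of rows per core
    with ProcessPoolExecutor(max_workers=ncores) as pool:
        return np.vstack(list(pool.map(process_band, bands)))

if __name__ == "__main__":
    image = np.random.rand(4096, 4096).astype(np.float32)
    out = process_image(image)
```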

        Given a choice, I'd like a 100 GHz single core with 100 GHz memory, we can time slice it and call it all good. :) But inasmuch as that's not going to happen with today's silicon tech, many more cores on bigger and bigger chips and/or smaller and/or more 3d-ish geometries are very welcome. It all eventually chokes on memory access though, and that's where I wish we'd see the most improvements. Also very tough to do.

        Another thing... operating system (and translation layer, like Qt) GUI code is lagging way behind; it's fairly typical to not allow anything but a main thread to update the GUI, and that can create a bottleneck in graphics-heavy operations; you end up passing messages around and heavily loading the main thread, which is the only one that can address them. My SDR software, for instance, while spread out over the 12/24 cores in my machine, always has one thread/core working much harder than the others. That's the display thread, and it's because those tasks can't be spread out over multiple cores.
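A minimal PyQt5 sketch of that message-passing pattern, with a stand-in workload:

```python
# The worker runs off the GUI thread and emits a signal; the cross-thread
# connection queues the result back to the main thread, which is the only
# thread allowed to touch the widget.
import sys
from PyQt5.QtCore import QObject, QThread, pyqtSignal
from PyQt5.QtWidgets import QApplication, QLabel

class Worker(QObject):
    result = pyqtSignal(str)

    def run(self):
        total = sum(i * i for i in range(1_000_000))  # heavy lifting here
        self.result.emit(f"done: {total}")

app = QApplication(sys.argv)
label = QLabel("working...")
label.show()

thread = QThread()
worker = Worker()
worker.moveToThread(thread)
thread.started.connect(worker.run)
worker.result.connect(label.setText)  # delivered on the main thread
worker.result.connect(lambda _: thread.quit())
thread.start()

sys.exit(app.exec_())
```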

        TL/DR: Better OS code, faster memory, faster cores, more cores. And give every core an FPU, please. I want it all, chip and OS architects!

    • (Score: 0) by Anonymous Coward on Thursday May 18 2017, @05:53PM (1 child)

      by Anonymous Coward on Thursday May 18 2017, @05:53PM (#511760)

      hard to come up with a purpose for more than two

      • (Score: 2) by bob_super on Thursday May 18 2017, @06:28PM

        by bob_super (1357) on Thursday May 18 2017, @06:28PM (#511775)

        Someone never had twins, a threesome, or a threesome with twins.
        I have only had one of those three experiences, and I can vouch for the benefits of more than two boobs.

  • (Score: 1, Informative) by Anonymous Coward on Thursday May 18 2017, @10:53AM (3 children)

    by Anonymous Coward on Thursday May 18 2017, @10:53AM (#511628)

    The back door will be in the graphics part of the chip

  • (Score: 2) by requerdanos on Thursday May 18 2017, @10:40PM (1 child)

    by requerdanos (5997) Subscriber Badge on Thursday May 18 2017, @10:40PM (#511872) Journal

    by kaszz (4211)

    Btw, cores are good.. if software can and will make use of them....

    by Hairyfeet (75)

Most folks simply cannot come up with enough work to keep a 10-year-old CPU well fed, much less a monster like this. Sure, some guys will buy it for bragging rights

    by Anonymous Coward

    cores are like boobs / hard to come up with a purpose for more than two

    And, an insightful person...

    by tonyPick (1237)

Counterexamples - Compiling software, Video Editing & Encoding, Image Manipulation, Audio processing, Raytracing & compositing, VMs, anybody who does more than one thing at a time.....

    Some things scale right up almost linearly with more cores, *especially* things like video and audio encoding or re-encoding, and like compiling. I recently re-encoded six seasons of episodes of a television show to a lower bitrate to let it fit on a smaller USB stick for someone who was traveling; this is something I would probably not even have thought of if I didn't have a many-core chip. It wasn't for bragging rights; rather, something mundane and helpful. Certainly had no problem keeping all those cores and threads (8/16; Ryzen R7 1700X) fed.
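For illustration, such a batch re-encode might look something like this (ffmpeg is assumed to be installed; directories and bitrate are made up):

```python
# Queue every episode, run a few ffmpeg jobs at once. x264 is itself
# multithreaded, so even one job spreads across cores; parallel jobs
# just keep the queue full.
import subprocess
from pathlib import Path
from concurrent.futures import ThreadPoolExecutor

SRC = Path("show")        # hypothetical input directory
DST = Path("show_small")  # hypothetical output directory
DST.mkdir(exist_ok=True)

def reencode(episode):
    out = DST / episode.name
    subprocess.run(["ffmpeg", "-y", "-i", str(episode),
                    "-c:v", "libx264", "-b:v", "1000k",  # lower the bitrate
                    "-c:a", "copy", str(out)], check=True)
    return out

episodes = sorted(SRC.glob("*.mkv"))
with ThreadPoolExecutor(max_workers=4) as pool:  # a few jobs at a time
    for done in pool.map(reencode, episodes):
        print("finished", done)
```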

    The cases of VMs and "anybody who does more than one thing at a time" aren't linear but are still good everyday things that many cores make practical and fast.

The takeaway here is that if someone says that more cores don't make a difference, and doesn't quote a specific workload with version numbers and benchmarks, they are simply wrong, probably because they want to be.

    • (Score: 2) by Hairyfeet on Sunday May 21 2017, @06:20AM

      by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Sunday May 21 2017, @06:20AM (#512910) Journal

Uhhh, how exactly is that "insightful" when I had already said "and there are some niche applications that can use all those threads"? Again, the key word is NICHE. How many people do you know that regularly re-encode entire seasons of TV shows? Hell, how many people do you know that are actually capable of performing that task if you told them to do so?

I've been working with the public WRT their PCs for nearly 40 years and I stand by my statement: the vast majority simply cannot come up with enough useful work to keep what they have now fed, much less these monsters. I find when someone says "my PC is too slow"? It nearly always translates to "I have a shitty HDD for my OS drive and it's holding the system down". Replace that shitty drive with a $60 SSD for an OS drive? Suddenly you are Scotty on the Enterprise, a miracle worker who can "supercharge" a PC, as one customer put it.

      --
      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.