posted by martyb on Sunday March 08 2020, @04:50AM   Printer-friendly

AMD revealed a number of details about its upcoming CPUs and GPUs at its Financial Analyst Day 2020:

AMD Shipped 260 Million Zen Cores by 2020
AMD Discusses 'X3D' Die Stacking and Packaging for Future Products: Hybrid 2.5D and 3D
AMD Moves From Infinity Fabric to Infinity Architecture: Connecting Everything to Everything
AMD Unveils CDNA GPU Architecture: A Dedicated GPU Architecture for Data Centers
AMD's 2020-2022 Client GPU Roadmap: RDNA 3 & Navi 3X On the Horizon With More Perf & Efficiency
AMD's RDNA 2 Gets A Codename: "Navi 2X" Comes This Year With 50% Improved Perf-Per-Watt
Updated AMD Ryzen and EPYC CPU Roadmaps March 2020: Milan, Genoa, and Vermeer
AMD Clarifies Comments on 7nm / 7nm+ for Future Products: EUV Not Specified

[...] The big focus here (though far from sole) is on the data center market. Long the breadbasket of Intel and increasingly NVIDIA as well, it's a highly profitable market that continues to grow. And it's a market that slipped away from AMD, and which they're now clawing back on the strength of their EPYC processors. Over the next 5 years AMD wants to take a much bigger piece of the total data center pie, and in fact the company expects to cross 10% market share of data center CPUs this next quarter. Which, by our reckoning, would be the first time they've hit that kind of market share in a decade (if not more), showing just how much things have changed for AMD.

[...] Along with great GPU performance, the other big upgrade for the CDNA family is incorporating AMD's Infinity Architecture (née Infinity Fabric). Already extensively used in AMD's EPYC CPUs, the interconnect technology is coming to AMD's GPUs, where it will play a part both in AMD's multi-GPU efforts, as well as AMD's grander plans for heterogeneous computing. With the third generation of the technology scheduled to offer full CPU/GPU coherency, allowing for a single unified memory space, the Infinity Architecture will be how AMD leverages both their CPU and GPU architectures to secure even bigger wins by using them together.

[...] After playing second-fiddle to NVIDIA for the past few years in terms of the performance of their top GPUs, AMD is planning to offer video cards with top-tier performance, capable of delivering "uncompromising" 4K gaming. AMD's rivals won't be standing still, of course, but AMD believes they have the technology and the energy efficiency needed to deliver the extreme performance that enthusiasts are looking for.

AMD will use an improved TSMC "7nm" process node for Zen 3 CPUs, but is unlikely to use the "N7+" node which relies on extreme ultraviolet lithography (EUV). Zen 4 CPUs will be made on a TSMC "5nm" process.

Upcoming RDNA 2 GPUs are confirmed to include features such as hardware-accelerated ray tracing and variable rate shading.

Related: U.S. Department of Energy's "El Capitan" Supercomputer Will Reach 2 Exaflops Using AMD CPUs and GPUs


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Interesting) by Runaway1956 on Sunday March 08 2020, @07:20AM (10 children)

    by Runaway1956 (2926) Subscriber Badge on Sunday March 08 2020, @07:20AM (#968120) Journal

    After playing second-fiddle to NVIDIA

    It's been quite a long while since I've owned an ATI/AMD video card. Fair, or not fair, I've decided that Nvidia is the easier card to install. Early on, in my Linux life, we didn't *really* need video cards. When I decided that maybe I do need a discrete video card, they were real bitches to get installed correctly. I more or less figured out Nvidia. ATI worked fine on a Windows machine, but Nvidia worked on Linux.

    In recent years, there has been little reason to even think of looking back. Nvidia rules, ATI/AMD almost catches up, and Nvidia comes out with something twice as nice.

    Looking at performance stats for folding at home, Nvidia still squashes everything that AMD throws at them - in this spreadsheet, the best ATI places number four, behind Nvidia, then we have 4 more Nvidia, including an older generation GPU, before ATI makes another appearance.

    https://docs.google.com/spreadsheets/d/1vcVoSVtamcoGj5sFfvKF_XlvuviWWveJIg_iZ8U2bf0/pub?output=html# [google.com]

    • (Score: 4, Insightful) by Booga1 on Sunday March 08 2020, @08:06AM (3 children)

      by Booga1 (6333) on Sunday March 08 2020, @08:06AM (#968131)

      You have to remember that the cards are often separated by consumer/professional markets and release dates. When I got my Vega 56 card I was the top contributor to the SoylentNews Folding@home team for a while. Then, like you said, someone with an Nvidia card came along and did better (kudos to them for their contributions, btw).
      Thing is, I only had to pay $400 for mine when the Nvidia 1080 Ti card cost $700. Sure, the performance was in line with the cost increase, but there's the rub. I got it for gaming and it's still serving me well today, for about half the cost of the Nvidia card.

      Unfortunately, once you get to those top tier cards the cost gets a bit wonky and not always in line with the resulting performance improvements. If you look at the current top AMD card, the 5700 XT [newegg.com], you'll find it sells for about $400 or so, and the only Nvidia cards above it cost $1000-$1200 (2080 Ti [newegg.com]), $3600 (Titan V [newegg.com]), and $1600 (Titan XP [newegg.com]).

      Nvidia may have higher performance, but they make you pay for it. If you want decent performance on a budget, AMD is still your best value.

      • (Score: 2) by Runaway1956 on Sunday March 08 2020, @10:35AM (2 children)

        by Runaway1956 (2926) Subscriber Badge on Sunday March 08 2020, @10:35AM (#968143) Journal

        Agreed - I can't afford the premium prices for the top producing Nvidia. Or, if I can, I can't justify the cost. The top tier 2080 Supers are running right up around $2000, and a pair, or "titan" would be 4 grand. And, someone would have to pay for my electricity for me to run those things!

        • (Score: 2) by takyon on Sunday March 08 2020, @04:00PM (1 child)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday March 08 2020, @04:00PM (#968189) Journal

          2080 Super is about $700, not $2K. The stronger 2080 Ti is about $1100-$1150. Titan RTX is $2500. Where are you getting your prices from (not that these prices are good)?

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by Runaway1956 on Sunday March 08 2020, @04:25PM

            by Runaway1956 (2926) Subscriber Badge on Sunday March 08 2020, @04:25PM (#968202) Journal

            The Titan RTX is what I was referring to - it's the top of the line premium offering today, in consumer goods. I was too tired to bother looking up the exact naming.

            Not terribly long ago, the top 780 and then the 1080 were pulling significant portions of that price. Not sure that they were quite that high, though, if memory serves correctly, $1200 and $1500.

    • (Score: 3, Informative) by takyon on Sunday March 08 2020, @11:19AM (2 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday March 08 2020, @11:19AM (#968147) Journal

      AMD has been competing favorably in performance/dollar with Nvidia, sort of. It's more like they priced their cards to match Nvidia's price-performance almost exactly (remember this? [soylentnews.org]), and then held back anything that would have been more powerful than the 5700 XT and actually competed with Nvidia's high end GPUs. AMD's 2019 GPUs are also relatively cheap to make. 251 mm2 on "7nm" for 5700 XT vs. 445 mm2 for RTX 2060/2070, 545 mm2 for RTX 2080, and an absolutely massive 754 mm2 for RTX 2080 Ti on "12nm".

      So AMD offered little significant downward pressure with their initial Navi cards. But it was a successful product launch and AMD gained some market share anyway, in theory because of its "mindshare". When AMD's CPUs are selling well, they also sell more GPUs.

      This year, AMD promises to launch a GPU faster than the RTX 2080 Ti (referred to as "Big Navi"). They might hold the performance crown briefly before Nvidia strikes back with RTX 3000-series. But the bigger deal for AMD is that they will set the course for the entire gaming industry, since both the next-gen Xbox and PS5 will use AMD APUs. For example, Nvidia hyped up real-time raytracing in their overpriced RTX 2000-series, but AMD's should become the best supported implementation (assuming the consoles don't come with custom solutions).

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Runaway1956 on Sunday March 08 2020, @04:33PM (1 child)

        by Runaway1956 (2926) Subscriber Badge on Sunday March 08 2020, @04:33PM (#968204) Journal

        Personally, I wish they would all conform to some naming scheme. I rather like how Nvidia is doing it. Starting with the 730, you work your way up through the 740, 750, 760, 770, and 780, each being bigger and faster than its little brother.

        They didn't actually distribute any of the 8xx family, but it kinda existed on paper.

        The 9xx family was the same, but it wasn't widely distributed either. The naming concept remained the same, though.

        The 10xx series was distributed widely, and the naming scheme remained.

        16 series was quietly dwarfed before it got much traction, but the naming scheme stayed.

        Now, the 20 family. I haven't noticed a 2030, but if it's out there, it's the bargain basement card, and the 2080 is the premium card.

        Maintaining consistency in the naming scheme helps a poor old man like myself to understand why in the hell one offering is outrageously expensive, while another only costs a pittance.

        • (Score: 3, Informative) by takyon on Sunday March 08 2020, @05:31PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday March 08 2020, @05:31PM (#968223) Journal

          I don't think the naming is that important. Nvidia's is a bit easier to understand (although 16XX and Super editions muddled it), but you should end up reading some article/hierarchy anyway. Like this one [tomshardware.com]. Only problem is that it doesn't include prices in the table, so you might have to cross-reference it with something else.

          16 series was quietly dwarfed before it got much traction, but the naming scheme stayed.

          16 series came out after 20 series. Basically, the 20 series was expensive, so they ditched the dedicated raytracing cores (RTX) and some shaders, making the cheaper 16 series cards with about half the die size.

          Now, the 20 family. I haven't noticed a 2030, but if it's out there, it's the bargain basement card, and the 2080 is the premium card.

          The weakest 20 series GPU to date is the RTX 2060, which debuted at $350 [extremetech.com]. So you can see why the 16 series exists.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 3, Interesting) by dwilson on Sunday March 08 2020, @04:13PM (1 child)

      by dwilson (2599) Subscriber Badge on Sunday March 08 2020, @04:13PM (#968199) Journal

      I've always been an nvidia man as well. Ever since I got started with linux back in '04, nvidia cards have 'just worked' with an absolute minimum of fucking around - and that on rare occurrences few and far between. ATI was a shitshow for years and I saw nothing but horror stories and pleas for support as far as they were concerned.

      I guess that's changed in recent years, though. I picked up a budget notebook last year with an AMD/ATI chipset in it, promptly installed Gentoo and was completely floored by how easy the graphics were to get going. The drivers are open source, in-kernel and performance seemed just as good as it was on the Win10 install the machine came with. The biggest pain was identifying the correct firmware for the card and getting it loaded at boot, but that's more a Gentoo pain than the card's fault. Ubuntu or Mint wouldn't have had that step.

      When the nvidia gpu in my desktop machine dies, I may just drop an AMD/ATI card in if the price/performance ratio still looks as good as you report.

      And then I'll have to dig out the sheet of copper and the pipe fittings to build a waterblock, but I have to do that every time anyway.

      --
      - D
      • (Score: 2) by takyon on Sunday March 08 2020, @06:08PM

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday March 08 2020, @06:08PM (#968230) Journal

        It will be interesting to see where APUs go in the future. While the Ryzen 4000 mobile chips will have pretty strong graphics performance, possibly beating Nvidia's MX350 and MX330 discrete laptop graphics chips, the custom APU in the Xbox Series X will have 12 teraflops of graphics performance.

        Moving CPU cores, GPU cores, and memory closer together looks like the direction we're heading in.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Sunday March 08 2020, @07:21PM

      by Anonymous Coward on Sunday March 08 2020, @07:21PM (#968258)

      no self respecting gnu+linux user would buy anything from nvidia. you are providing material support to the enemy.
