

posted by martyb on Monday August 14 2017, @09:19PM   Printer-friendly
from the moah-powah dept.

AMD's new Vega 64 GPU offers comparable performance at a similar price to Nvidia's GTX 1080, which was released over a year ago. But it does so while consuming a lot more power under load (over 100 Watts more). Vega 56, however, runs faster than the GTX 1070 at a slightly lower price:

So how does AMD fare? The answer to that is ultimately going to hinge on your opinion of power efficiency. But before we get too far, let's start with the Radeon RX Vega 64, AMD's flagship card. Previously we've been told that it would trade blows with NVIDIA's GeForce GTX 1080, and indeed it does just that. At 3840x2160, the Vega 64 is on average neck-and-neck with the GeForce GTX 1080 in gaming performance, with the two cards routinely trading the lead, and AMD holding it more often. Of course the "anything but identical" principle applies here, as while the cards are equal on average, they can sometimes be quite far apart on individual games.

Unfortunately for AMD, their GTX 1080-like performance doesn't come cheap from a power perspective. The Vega 64 has a board power rating of 295W, and it lives up to that rating. Relative to the GeForce GTX 1080, we've seen power measurements at the wall anywhere between 110W and 150W higher, all for the same performance. Thankfully for AMD, buyers are focused on price and performance first and foremost (and in that order), so if all you're looking for is a fast AMD card at a reasonable price, the Vega 64 delivers where it needs to: it is a solid AMD counterpart to the GeForce GTX 1080. However if you care about the power consumption and the heat generated by your GPU, the Vega 64 is in a very rough spot.

On the other hand, the Radeon RX Vega 56 looks better for AMD, so it's easy to see why in recent days they have shifted their promotional efforts to the cheaper member of the RX Vega family. Though a step down from the RX Vega 64, the Vega 56 delivers around 90% of Vega 64's performance for 80% of the price. Furthermore, when compared head-to-head with the GeForce GTX 1070, its closest competition, the Vega 56 enjoys a small but nonetheless significant 8% performance advantage over its NVIDIA counterpart. Whereas the Vega 64 could only draw to a tie, the Vega 56 can win in its market segment.

[...] The one wildcard here with the RX Vega 56 is going to be where retail prices actually end up. AMD's $399 MSRP is rather aggressive, especially when GTX 1070 cards are retailing for closer to $449 due to cryptocurrency miner demand. If they can sustain that price, then Vega 56 is going to be real hot stuff, besting GTX 1070 in price and performance. Otherwise at GTX 1070-like prices it still has the performance advantage, but not the initiative on pricing. At any rate, this is a question we can't answer today; the Vega 56 won't be launching for another two weeks.

Both the Vega 64 and Vega 56 include 8 GB of HBM2 memory.
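The figures quoted above lend themselves to a quick back-of-envelope comparison. A minimal sketch, assuming equal average gaming performance for the Vega 64 and GTX 1080, and using vendor board-power ratings (295W for Vega 64, and the GTX 1080's 180W TDP, which is not stated in the article) rather than measured wall draw:

```python
# Back-of-envelope comparison from the numbers quoted in the summary.
# Board powers are vendor ratings, not measured wall-socket draw.

def ratio(a: float, b: float) -> float:
    return a / b

# Performance per watt: equal performance is assumed, so efficiency
# scales inversely with board power (180 W TDP vs 295 W rating).
vega64_efficiency = ratio(180, 295)
print(f"Vega 64 perf/watt vs GTX 1080: {vega64_efficiency:.2f}")  # ~0.61

# Performance per dollar: Vega 56 is quoted as 8% faster at a $399 MSRP,
# against GTX 1070 cards retailing near $449.
vega56_value = ratio(1.08 / 399, 1.00 / 449)
print(f"Vega 56 perf/dollar vs GTX 1070: {vega56_value:.2f}")  # ~1.22
```

The second number is the one the article flags as fragile: it assumes Vega 56 actually sells at MSRP while mining demand keeps GTX 1070 street prices inflated, and it shrinks if either price moves.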

Also at Tom's Hardware.

Previously: AMD Unveils the Radeon Vega Frontier Edition
AMD Launches the Radeon Vega Frontier Edition
AMD Radeon RX Vega 64 and 56 Announced


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Interesting) by richtopia on Monday August 14 2017, @09:59PM (11 children)

    by richtopia (3160) on Monday August 14 2017, @09:59PM (#553854) Homepage Journal

I thought it was all about mining coin these days. In which case high power consumption is a showstopper. Not covered in the summary: the Vega 56 is also more of a power hog than the Nvidia 1070, although not as pronounced (somewhere around 45 to 75W higher power consumption for AMD).

    What am I buying you ask? I'm still rocking my 4 year old card and MIGHT buy something in the sub-200 dollar range if prices come down.

  • (Score: 2) by EvilSS on Monday August 14 2017, @10:07PM (6 children)

    by EvilSS (1456) Subscriber Badge on Monday August 14 2017, @10:07PM (#553858)
    I almost wonder if the power consumption isn't a ploy by AMD to get the cards in the hands of gamers by making them less appealing to miners, along with those bundled hardware discount packages you have to buy (and pay $100 for) for the FE cards. I wouldn't be surprised at all if a more power efficient version shows up in their next round of dedicated miner cards.
    • (Score: 2) by takyon on Monday August 14 2017, @10:32PM (1 child)

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday August 14 2017, @10:32PM (#553868) Journal

      Nice 4D chess scenario, and maybe that will help keep them in stock for gamers, but Occam's razor suggests AMD just sucks at making GPUs compared to Nvidia.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by EvilSS on Tuesday August 15 2017, @12:37AM

        by EvilSS (1456) Subscriber Badge on Tuesday August 15 2017, @12:37AM (#553949)
Well, I mean they are known for having made cards in the past that could substitute for a furnace during a Siberian winter, so it's not like they don't have a history of it. But either way, it should put off some of the mining crazies, so gamers might get some of their cards this time around. I'm sure AMD enjoys the sales, but they have to know it's not going to last, and if people can't buy their cards they are going to go to NV eventually, realize how much better the cards are, and stay there.
    • (Score: 2) by looorg on Monday August 14 2017, @10:47PM (3 children)

      by looorg (578) on Monday August 14 2017, @10:47PM (#553873)

Seems like it would be a silly ploy. While power consumption might matter more to miners, it's not like it's an uninteresting factor for gamers either: more power => more heat => more fans => more noise. I like my machines to be as quiet as possible, really. I'd prefer passive cooling, but if that isn't an option for a viable card then I want it to be as quiet as it possibly can be. That said, the noise levels of these cards might change when the other card makers start to make their versions, though they probably won't be able to do much about the power consumption unless they want to run down-clocked cards, and I don't think anyone will want to do that.

That said, I did buy a few of the RX cards that came before this new batch, so I don't see a personal reason to upgrade. I don't play any state-of-the-art games or mine coins. I just needed something new, it seemed nice enough, and the then-new Radeon cards had really low power consumption compared to other cards.

      • (Score: 2) by EvilSS on Tuesday August 15 2017, @12:42AM (2 children)

        by EvilSS (1456) Subscriber Badge on Tuesday August 15 2017, @12:42AM (#553956)
Yeah, but power consumption means a lot less to a gamer who may be pushing his card full tilt for, on average, a couple hours a day at best, vs a miner running it maxed out 24/7/365. As for fan noise, it depends a lot on the design of the cooler. Haven't seen a noise comparison for these yet.

        Not that I plan to get one either. I'm happy with my 1070 which I lucked out and bought in the price dip between the 1080ti coming out and miners suddenly discovering it and driving the prices back up.
        • (Score: 2) by Snow on Tuesday August 15 2017, @01:39AM (1 child)

          by Snow (1601) on Tuesday August 15 2017, @01:39AM (#554010) Journal

          Agreed.

I also had some time to look into AMD vs NVidia for hashing ('cause I'm anal...). Looks like NVidia really closed the gap, to the point where they are offering similar performance/watt. The AMD cards still outperform by a small amount, but also suck more power.

          http://wccftech.com/ethereum-mining-gpu-performance-roundup/ [wccftech.com]

When I was mining LTC in 2013, AMD had a clear lead. So much so that NVidia wasn't worth mining on at all. Seems things have changed.

          • (Score: 2) by EvilSS on Tuesday August 15 2017, @01:27PM

            by EvilSS (1456) Subscriber Badge on Tuesday August 15 2017, @01:27PM (#554252)
Unfortunately yes, NV has caught up in hash performance. Dammit. The ether miners caught on this year and drove the 1060 and 1070 prices up drastically. Glad I bought my 1070 when I did. They are eating up the good mid-tier cards and making it harder for budget gamers to build new rigs.
  • (Score: 2) by bob_super on Monday August 14 2017, @10:09PM

    by bob_super (1357) on Monday August 14 2017, @10:09PM (#553859)

    What's surprising is that in theory HBM2 should help AMD save power and provide massive bandwidth advantages...
    I want to support AMD, but I'm not buying a new power supply, and I like my PC to not be too noisy.

    Anyway, I just upgraded my ancestral AMD5850 with a free gift from a friend (who works at the right place): a second-hand Titan X (GP102). First time I have a top-of-the-line setup since the Voodoo2. Surprisingly quiet, but then again TF2@1080 isn't exactly pushing it.
Poisoned gift: it's gonna cost me a lot of cash in updated games....

  • (Score: 4, Informative) by Snow on Monday August 14 2017, @10:20PM

    by Snow (1601) on Monday August 14 2017, @10:20PM (#553864) Journal

    I haven't been in the mining game for a while, but to the best of my knowledge, AMD is the gold standard for mining crypto.

    AMD and NVidia took different approaches in their video cards. This is all from memory, so there is a good chance I screw it up, but AMD has more shaders, while NVidia has fewer, but faster ones (or maybe more full featured...). Anyways, AMD offered better parallelism because of the increased number of shaders.

    There is a table on this link if you scroll down a bit...:
    https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/ [techpowerup.com]
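That more-shaders-vs-faster-shaders tradeoff shows up directly in the theoretical throughput math. A rough sketch using the published shader counts and boost clocks for the cards in the article (illustrative only; real hashrates depend heavily on memory bandwidth and kernel tuning, not just shader math):

```python
# Theoretical peak FP32 throughput: 2 FLOPs per shader per clock
# (fused multiply-add), times shader count, times boost clock.

def fp32_tflops(shaders: int, boost_clock_ghz: float) -> float:
    return 2 * shaders * boost_clock_ghz / 1000.0

vega_64  = fp32_tflops(4096, 1.546)   # 4096 stream processors, lower clock
gtx_1080 = fp32_tflops(2560, 1.733)   # 2560 CUDA cores, higher clock

print(f"Vega 64:  {vega_64:.1f} TFLOPS")   # ~12.7
print(f"GTX 1080: {gtx_1080:.1f} TFLOPS")  # ~8.9
```

By this raw count AMD's wider design wins on paper; that the cards tie in actual games (and nearly tie in hashing) shows how much the rest of the pipeline matters.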

  • (Score: 2) by takyon on Monday August 14 2017, @10:30PM

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday August 14 2017, @10:30PM (#553867) Journal

    Tom's Hardware has a section about the mining performance. Uh oh!

    http://www.tomshardware.com/reviews/amd-radeon-rx-vega-64,5173-16.html [tomshardware.com]

    Performance is a little under the 1080 Ti, but Vega is about $200 cheaper.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by LoRdTAW on Tuesday August 15 2017, @12:15PM

    by LoRdTAW (3755) on Tuesday August 15 2017, @12:15PM (#554217) Journal

I have a GTX 580, which was an upgrade from an older mid-range AMD Radeon of the same era. The Radeon died after its fan went, and a friend had an old GTX 580 kicking around. Free upgrade. It's actually 2x faster than the Radeon and runs everything I need.