
SoylentNews is people

posted by martyb on Monday August 14 2017, @09:19PM   Printer-friendly
from the moah-powah dept.

AMD's new Vega 64 GPU offers comparable performance at a similar price to Nvidia's GTX 1080, which was released over a year ago. But it does so while consuming a lot more power under load (over 100 Watts more). Vega 56, however, runs faster than the GTX 1070 at a slightly lower price:

So how does AMD fare? The answer to that is ultimately going to hinge on your opinion of power efficiency. But before we get too far, let's start with the Radeon RX Vega 64, AMD's flagship card. Previously we've been told that it would trade blows with NVIDIA's GeForce GTX 1080, and indeed it does just that. At 3840x2160, the Vega 64 is on average neck-and-neck with the GeForce GTX 1080 in gaming performance, with the two cards routinely trading the lead, and AMD holding it more often. Of course the "anything but identical" principle applies here, as while the cards are equal on average, they can sometimes be quite far apart on individual games.

Unfortunately for AMD, their GTX 1080-like performance doesn't come cheap from a power perspective. The Vega 64 has a board power rating of 295W, and it lives up to that rating. Relative to the GeForce GTX 1080, we've seen power measurements at the wall anywhere between 110W and 150W higher, all for the same performance. Thankfully for AMD, buyers are focused on price and performance first and foremost (and in that order), so if all you're looking for is a fast AMD card at a reasonable price, the Vega 64 delivers where it needs to: it is a solid AMD counterpart to the GeForce GTX 1080. However, if you care about the power consumption and the heat generated by your GPU, the Vega 64 is in a very rough spot.

On the other hand, the Radeon RX Vega 56 looks better for AMD, so it's easy to see why in recent days they have shifted their promotional efforts to the cheaper member of the RX Vega family. Though a step down from the RX Vega 64, the Vega 56 delivers around 90% of Vega 64's performance for 80% of the price. Furthermore, when compared head-to-head with the GeForce GTX 1070, its closest competition, the Vega 56 enjoys a small but nonetheless significant 8% performance advantage over its NVIDIA counterpart. Whereas the Vega 64 could only draw to a tie, the Vega 56 can win in its market segment.

[...] The one wildcard here with the RX Vega 56 is going to be where retail prices actually end up. AMD's $399 MSRP is rather aggressive, especially when GTX 1070 cards are retailing for closer to $449 due to cryptocurrency miner demand. If they can sustain that price, then Vega 56 is going to be real hot stuff, besting GTX 1070 in price and performance. Otherwise at GTX 1070-like prices it still has the performance advantage, but not the initiative on pricing. At any rate, this is a question we can't answer today; the Vega 56 won't be launching for another two weeks.

Both the Vega 64 and Vega 56 include 8 GB of HBM2 memory.

Also at Tom's Hardware.

Previously: AMD Unveils the Radeon Vega Frontier Edition
AMD Launches the Radeon Vega Frontier Edition
AMD Radeon RX Vega 64 and 56 Announced

Original Submission

Related Stories

Nvidia Unveils GTX 1080 and 1070 "Pascal" GPUs 20 comments

Nvidia revealed key details about its upcoming "Pascal" consumer GPUs at a May 6th event. These GPUs are built using a 16nm FinFET process from TSMC rather than the 28nm processes that were used for several previous generations of both Nvidia and AMD GPUs.

The GeForce GTX 1080 will outperform the GTX 980, GTX 980 Ti, and Titan X cards. Nvidia claims that GTX 1080 can reach 9 teraflops of single precision performance, while the GTX 1070 will reach 6.5 teraflops. A single GTX 1080 will be faster than two GTX 980s in SLI.
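Nvidia's teraflops claims are straightforward to sanity-check: peak single-precision throughput is roughly shader count × boost clock × 2 FLOPs per cycle (one fused multiply-add). The shader counts and boost clocks below are not from this article; they are Nvidia's published specs, assumed here for illustration:

```python
# Rough peak FP32 estimate: cores * clock * 2 (one FMA counts as 2 FLOPs).
def peak_tflops(shader_cores: int, boost_mhz: float) -> float:
    return shader_cores * boost_mhz * 1e6 * 2 / 1e12

# Assumed published specs: GTX 1080 = 2560 cores @ ~1733 MHz boost,
# GTX 1070 = 1920 cores @ ~1683 MHz boost.
print(round(peak_tflops(2560, 1733), 1))  # ~8.9, close to the "9 teraflops" claim
print(round(peak_tflops(1920, 1683), 1))  # ~6.5, matching the GTX 1070 figure
```

Real-world throughput depends on actual sustained clocks, so these are ceilings rather than benchmarks.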

Both the GTX 1080 and 1070 will feature 8 GB of VRAM. Unfortunately, neither card contains High Bandwidth Memory 2.0 like the Tesla P100 does. Instead, the GTX 1080 has GDDR5X memory while the 1070 is sticking with GDDR5.

The GTX 1080 starts at $599 and is available on May 27th. The GTX 1070 starts at $379 on June 10th. Your move, AMD.

Original Submission

AMD Unveils the Radeon Vega Frontier Edition 4 comments

AMD has announced the Radeon Vega Frontier Edition, a high-end GPU based on a new architecture (Vega 1) which will launch in June.

Unlike some other recent AMD GPUs such as the Radeon Fury X, the Radeon Vega card has half precision compute capability (FP16 operations) that is twice as fast as single precision compute. AMD is advertising 13 TFLOPS single precision, 26 TFLOPS half precision for the Radeon Vega Frontier Edition.

The GPU includes 16 GB of High Bandwidth Memory 2.0 VRAM. The per-pin memory clock is up to around 1.88 Gbps, but total memory bandwidth is slightly lower than the Radeon Fury X, due to the memory bus being cut to 2048-bit from 4096-bit. However, the Fury X included only 4 GB of HBM1. The new card could include four stacks with 4 GB each, or it could be the first product to include 8 GB stacks of High Bandwidth Memory, a capacity which has not been sold by Samsung or SK Hynix to date.
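The "slightly lower than Fury X" claim follows from the arithmetic: total HBM bandwidth is bus width (in bytes) × per-pin data rate. A quick sketch using the figures above (the Fury X's 1.0 Gbps HBM1 per-pin rate is an assumption taken from its published specs, not from this article):

```python
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    # Total bandwidth = (bus width in bytes) * per-pin data rate.
    return bus_bits / 8 * gbps_per_pin

fury_x = bandwidth_gbs(4096, 1.0)    # HBM1, 4096-bit bus: 512.0 GB/s
vega_fe = bandwidth_gbs(2048, 1.89)  # HBM2, 2048-bit bus: ~484 GB/s
print(fury_x, round(vega_fe))        # the halved bus slightly outweighs the ~1.9x pin rate
```

So despite HBM2 nearly doubling the per-pin rate, halving the bus width leaves the new card a touch behind the Fury X on raw bandwidth.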

The new GPU is aimed at professional/workstation users rather than gamers:

As important as the Vega hardware itself is, for AMD the target market for the hardware is equally important if not more. Vega's the first new high-end GPU from the company in two years, and it comes at a time when GPU sales are booming. Advances in machine learning have made GPUs the hottest computational peripheral since the x87 floating point co-processor, and unfortunately for AMD, they've largely missed the boat on this. Competitor NVIDIA has vastly grown their datacenter business over just the last year on the back of machine learning, thanks in large part to the task-optimized capabilities of the Pascal architecture. And most importantly of all, these machine learning accelerators have been highly profitable, fetching high margins even when the cards are readily available.

For AMD then, Vega is their chance to finally break into the machine learning market in a big way. The GPU isn't just a high-end competitor, but it offers high performance FP16 and INT8 modes that earlier AMD GPU architectures lacked, and those modes are in turn immensely beneficial to machine learning performance. As a result, for the Vega Frontier Edition launch, AMD is taking a page from the NVIDIA playbook: rather than starting off the Vega generation with consumer cards, they're going to launch with professional cards for the workstation market.

To be sure, the Radeon Vega Frontier Edition is not officially branded as a Pro or WX series card. But in terms of AMD's target market, it's unambiguously a professional card. The product page is hosted on the pro graphics section of AMD's website, the marketing material is all about professional uses, and AMD even goes so far as to tell gamers to hold off for cheaper gaming cards later on in their official blog post. Consequently the Vega FE is about the closest analogue AMD has to NVIDIA's Titan series cards, which, although gaming capable, have in the last generation become almost exclusively professionally focused.

Original Submission

AMD Launches the Radeon Vega Frontier Edition 5 comments

First it was unveiled, now it has launched. AMD has launched the Radeon Vega Frontier Edition at $999 for the air-cooled version and $1499 for liquid-cooled. The High Bandwidth Memory 2.0 included has been confirmed to be two stacks of 8-layer 8 GB HBM:

After what appears to be a very unusual false start, AMD has now formally launched their new Radeon Vega Frontier Edition card. First announced back in mid-May, the unusual card, which AMD is all but going out of their way to dissuade their usual consumer base from buying, will be available today for $999. Meanwhile its liquid cooled counterpart, which was also announced at the time, will be available later on in Q3 for $1499.

Interestingly, both of these official prices are some $200-$300 below the prices first listed by SabrePC two weeks ago in the false start. To date AMD hasn't commented on what happened there, however it's worth noting that SabrePC is as of press time still listing the cards for their previous prices, with both cards reporting as being in-stock.

[...] Feeding the GPU is AMD's previously announced dual stack HBM2 configuration, which is now confirmed to be a pair of 8 layer, 8GB "8-Hi" stacks. AMD has the Vega FE's memory clocked at just under 1.9Gbps, which gives the card a total memory bandwidth of 483GB/sec. And for anyone paying close attention to AMD's naming scheme here, they are officially calling this "HBC" memory – a callback to Vega's High Bandwidth Cache design.

Original Submission

AMD Radeon RX Vega 64 and 56 Announced 13 comments

AMD has announced two new GPUs, the Radeon RX Vega 64 and 56. The GPUs are named in reference to the amount of "compute units" included. Both GPUs have 8 GB of High Bandwidth Memory 2.0 VRAM and will be released on August 14.

The Vega 64 is priced at $500 and is said to be on par with Nvidia's GeForce GTX 1080. The GTX 1080 was released on May 27, 2016 and has a TDP 105 Watts lower than the Vega 64.

Previously: AMD Unveils the Radeon Vega Frontier Edition
AMD Launches the Radeon Vega Frontier Edition

Original Submission

Cryptocurrency Mining Wipes Out Vega 64 Stock 6 comments

AMD's Vega 64 GPU is an underwhelming chip that competes against the GTX 1080, a 15-month-old GPU. Nvidia could lower the price of the GTX 1080 and 1070 to better compete against Vega 64 and 56, or launch Volta-based consumer GPUs in the coming months. But Vega 64 is sold out everywhere due to cryptocurrency miners.

AMD has released an updated (Windows-only) driver called Radeon Software Crimson ReLive Edition Beta for Blockchain Compute. The driver improves the hash rate for Ethereum mining significantly, and Vega 56 performance may even exceed that of Vega 64 (it is a beta driver so these results are subject to change):

As you can see, we're getting some pretty significant gains already (at stock speeds) with this beta driver. We wouldn't be surprised if there are even further optimizations to be found, once AMD is ready to go with a production driver, but we'll take what we can get right now. We did have one performance anomaly that we ran into, however. When cranking up the memory speeds, the Vega 56 actually vaulted past the Vega 64, cranking out 36.48 MH/s. That's not bad for a card that's supposed to retail for $399.

Unfortunately, there is some confusion over the true price of Vega 64, although they are out of stock anyway aside from some hardware and game bundles.

Nvidia's market cap hit $100 billion on the day of the Vega 64 launch. Nvidia's CEO told investors that the company has the ability to "rock and roll" with the volatile cryptocurrency market (implying fewer shortages).

Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Interesting) by richtopia on Monday August 14 2017, @09:59PM (11 children)

    by richtopia (3160) on Monday August 14 2017, @09:59PM (#553854) Homepage Journal

I thought it was all about mining coin these days. In which case high power consumption is a showstopper. Not covered in the summary: the Vega 56 is also more of a power hog than the Nvidia 1070, although not as pronounced (somewhere around 45 to 75W higher power consumption for AMD).

    What am I buying you ask? I'm still rocking my 4 year old card and MIGHT buy something in the sub-200 dollar range if prices come down.

    • (Score: 2) by EvilSS on Monday August 14 2017, @10:07PM (6 children)

      by EvilSS (1456) Subscriber Badge on Monday August 14 2017, @10:07PM (#553858)
      I almost wonder if the power consumption isn't a ploy by AMD to get the cards in the hands of gamers by making them less appealing to miners, along with those bundled hardware discount packages you have to buy (and pay $100 for) for the FE cards. I wouldn't be surprised at all if a more power efficient version shows up in their next round of dedicated miner cards.
      • (Score: 2) by takyon on Monday August 14 2017, @10:32PM (1 child)

        by takyon (881) <{takyon} {at} {}> on Monday August 14 2017, @10:32PM (#553868) Journal

        Nice 4D chess scenario, and maybe that will help keep them in stock for gamers, but Occam's razor suggests AMD just sucks at making GPUs compared to Nvidia.

        [SIG] 10/28/2017: Soylent Upgrade v14 []
        • (Score: 2) by EvilSS on Tuesday August 15 2017, @12:37AM

          by EvilSS (1456) Subscriber Badge on Tuesday August 15 2017, @12:37AM (#553949)
Well, I mean they are known for having made cards in the past that could substitute for a furnace during a Siberian winter, so it's not like they don't have a history of it. But either way, it should put off some of the mining crazies so gamers might get some of their cards this time around. I'm sure AMD enjoys the sales but they have to know it's not going to last, and if people can't buy their cards they are going to go to NV eventually, realize how much better the cards are, and stay there.
      • (Score: 2) by looorg on Monday August 14 2017, @10:47PM (3 children)

        by looorg (578) on Monday August 14 2017, @10:47PM (#553873)

Seems like it would be a silly ploy. While power consumption might matter more to miners, it's not like it's an uninteresting factor for gamers either: more power => more heat => more fans => more noise. I like my machines to be as quiet as possible really. I'd prefer passive cooling, but if that isn't an option for a viable card then I want it to be as quiet as it possibly can be. That said, the noise levels of these cards might change when the other card makers start to make their versions, though they probably won't be able to do much about the power consumption unless they want to run down-clocked cards, and I don't think anyone will want to do that.

That said, I did buy a few of the RX cards that came before this new batch, so I don't see a personal reason to upgrade. I don't play any state of the art games or mine coins. I just needed something new and it seemed nice enough, and the then-new Radeon cards had really low power consumption compared to other cards.

        • (Score: 2) by EvilSS on Tuesday August 15 2017, @12:42AM (2 children)

          by EvilSS (1456) Subscriber Badge on Tuesday August 15 2017, @12:42AM (#553956)
          Yea but power consumption means a lot less to a gamer who may be pushing his card full tilt for, on average, a couple hours a day at best, vs a miner running it maxed out 24/7/365. As for fan noise, it depends a lot on design of the cooler. Haven't seen a noise comparison for these yet.

          Not that I plan to get one either. I'm happy with my 1070 which I lucked out and bought in the price dip between the 1080ti coming out and miners suddenly discovering it and driving the prices back up.
          • (Score: 2) by Snow on Tuesday August 15 2017, @01:39AM (1 child)

            by Snow (1601) on Tuesday August 15 2017, @01:39AM (#554010) Journal


I also had some time to check into AMD vs NVidia for hashing ('cause I'm anal...). Looks like NVidia really closed the gap to the point where they are offering similar performance/watt. The AMD cards still outperform (by a small amount), but also suck more power.


            When I was mining LTC in 2013, AMD had a clear lead. So much so NVidia wasn't worth mining on at all. Seems things have changed.

            • (Score: 2) by EvilSS on Tuesday August 15 2017, @01:27PM

              by EvilSS (1456) Subscriber Badge on Tuesday August 15 2017, @01:27PM (#554252)
              Unfortunately yes, NV has caught up in hash performance. Dammit. The ether miners discovered this this year and drove the 1060 and 1070 prices up drastically. Glad I bought my 1070 when I did. They are eating up the good mid-tier cards and making it harder for budget gamers to build new rigs.
    • (Score: 2) by bob_super on Monday August 14 2017, @10:09PM

      by bob_super (1357) on Monday August 14 2017, @10:09PM (#553859)

      What's surprising is that in theory HBM2 should help AMD save power and provide massive bandwidth advantages...
      I want to support AMD, but I'm not buying a new power supply, and I like my PC to not be too noisy.

      Anyway, I just upgraded my ancestral AMD5850 with a free gift from a friend (who works at the right place): a second-hand Titan X (GP102). First time I have a top-of-the-line setup since the Voodoo2. Surprisingly quiet, but then again TF2@1080 isn't exactly pushing it.
      Poisoned gift: gift's gonna cost me a lot of cash in updated games....

    • (Score: 4, Informative) by Snow on Monday August 14 2017, @10:20PM

      by Snow (1601) on Monday August 14 2017, @10:20PM (#553864) Journal

      I haven't been in the mining game for a while, but to the best of my knowledge, AMD is the gold standard for mining crypto.

      AMD and NVidia took different approaches in their video cards. This is all from memory, so there is a good chance I screw it up, but AMD has more shaders, while NVidia has fewer, but faster ones (or maybe more full featured...). Anyways, AMD offered better parallelism because of the increased number of shaders.

      There is a table on this link if you scroll down a bit...: []

    • (Score: 2) by takyon on Monday August 14 2017, @10:30PM

      by takyon (881) <{takyon} {at} {}> on Monday August 14 2017, @10:30PM (#553867) Journal

      Tom's Hardware has a section about the mining performance. Uh oh!,5173-16.html []

      Performance is a little under the 1080 Ti, but Vega is about $200 cheaper.

    • (Score: 2) by LoRdTAW on Tuesday August 15 2017, @12:15PM

      by LoRdTAW (3755) on Tuesday August 15 2017, @12:15PM (#554217) Journal

I have a GTX 580 which was an upgrade to an older mid-range AMD Radeon from the same era. The Radeon died after its fan went, and a friend had an old GTX 580 kicking around. Free upgrade. It's actually 2x faster than the Radeon and runs everything I need.

  • (Score: 2) by tibman on Monday August 14 2017, @09:59PM (2 children)

    by tibman (134) Subscriber Badge on Monday August 14 2017, @09:59PM (#553855)

    Can't find stock unless it's part of a bundle. Paper launch? Maybe miners bought them all up? I'm going to wait another month to see what happens. Vega 64 is looking to be a bit of a let down. Not the end of the world but i'm considering switching to team green. Would really like to upgrade past this RX 480.

Also, the Vega cards all seem to be $100 more than MSRP?

    SN won't survive on lurkers alone. Write comments.
    • (Score: 2) by EvilSS on Monday August 14 2017, @10:10PM

      by EvilSS (1456) Subscriber Badge on Monday August 14 2017, @10:10PM (#553860)
Nope, the initial runs pretty much have to be bought with the plus-$100 bundles (which include a bunch of discounts on other computer parts). AMD did it to discourage miners from snapping up the initial run. ...
    • (Score: 0) by Anonymous Coward on Tuesday August 15 2017, @03:32AM

      by Anonymous Coward on Tuesday August 15 2017, @03:32AM (#554068)

      Just get a real card from nVidia.

  • (Score: 2) by takyon on Monday August 14 2017, @11:44PM

    by takyon (881) <{takyon} {at} {}> on Monday August 14 2017, @11:44PM (#553896) Journal []

    The people are not happy.

    One saving grace is that Nvidia Volta doesn't seem to be coming before Spring 2018, as far as I can tell.

  • (Score: 4, Interesting) by takyon on Monday August 14 2017, @11:52PM

    by takyon (881) <{takyon} {at} {}> on Monday August 14 2017, @11:52PM (#553903) Journal

    Given roughly equal performance, AMD could be the better buy if you use it with a cheaper FreeSync monitor.

    From the Tom's Hardware comments:

    Disappointing? what. I'm impressed. Sits near a 1080. Keep that in mind when thinking that FreeSync sells for around $200 less than Gsync. So pair that with this GPU and you have awesome 1440p gaming.

    FreeSync vs. G-Sync [].
