AMD Radeon RX Vega 64 and 56 Announced

posted by martyb on Monday July 31 2017, @04:49PM
from the graphic-news dept.

AMD has announced two new GPUs, the Radeon RX Vega 64 and 56. The names refer to the number of "compute units" in each GPU. Both cards have 8 GB of High Bandwidth Memory 2.0 (HBM2) VRAM and will be released on August 14.
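
As a back-of-the-envelope check on the naming scheme, here is a minimal sketch, assuming the usual 64 stream processors per GCN compute unit (a carry-over from earlier GCN parts, not a figure from AMD's announcement):

```python
# Stream processor counts implied by the Vega naming scheme.
# Assumes 64 stream processors per compute unit, as in earlier GCN parts.
SP_PER_CU = 64

for name, cus in {"RX Vega 64": 64, "RX Vega 56": 56}.items():
    print(f"{name}: {cus} CUs -> {cus * SP_PER_CU} stream processors")
# RX Vega 64: 64 CUs -> 4096 stream processors
# RX Vega 56: 56 CUs -> 3584 stream processors
```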

The Vega 64 is priced at $500 and is said to be on par with Nvidia's GeForce GTX 1080. The GTX 1080 was released on May 27, 2016 and has a TDP roughly 115 Watts lower than the Vega 64's 295 W board power rating.

Previously: AMD Unveils the Radeon Vega Frontier Edition
AMD Launches the Radeon Vega Frontier Edition


Original Submission

Related Stories

AMD Unveils the Radeon Vega Frontier Edition 4 comments

AMD has announced the Radeon Vega Frontier Edition, a high-end GPU based on a new architecture (Vega 10) which will launch in June.

Unlike some other recent AMD GPUs such as the Radeon Fury X, the Radeon Vega card has half precision (FP16) compute capability that runs at twice the rate of single precision. AMD is advertising 13 TFLOPS single precision and 26 TFLOPS half precision for the Radeon Vega Frontier Edition.
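
The advertised figures follow from shader count and clock. A rough sketch, assuming the Frontier Edition's 4096 stream processors and a boost clock around 1.6 GHz (the clock is an inference from the advertised TFLOPS, not a confirmed spec):

```python
# Theoretical peak throughput for the Radeon Vega Frontier Edition.
# Assumptions: 4096 stream processors and a ~1.6 GHz boost clock;
# each SP retires one fused multiply-add (2 FLOPs) per clock at FP32,
# and packed math doubles that rate at FP16.
stream_processors = 4096
boost_clock_hz = 1.6e9

fp32_tflops = stream_processors * 2 * boost_clock_hz / 1e12
fp16_tflops = 2 * fp32_tflops  # packed half precision

print(f"FP32: {fp32_tflops:.1f} TFLOPS")  # ~13.1
print(f"FP16: {fp16_tflops:.1f} TFLOPS")  # ~26.2
```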

The GPU includes 16 GB of High Bandwidth Memory 2.0 VRAM. The per-pin memory clock is up to around 1.88 Gbps, but total memory bandwidth is slightly lower than the Radeon Fury X's, due to the memory bus being cut to 2048-bit from 4096-bit. However, the Fury X included only 4 GB of HBM1. Since each HBM stack has a 1024-bit interface, a 2048-bit bus implies two stacks of 8 GB each, which would make this the first product to include 8 GB stacks of High Bandwidth Memory, a capacity which has not been sold by Samsung or SK Hynix to date.
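
The bandwidth trade-off is easy to verify: halving the bus width roughly cancels the near-doubling of the per-pin clock. A quick comparison, assuming the Fury X's HBM1 runs at its stock 1 Gbps per pin and using the figures stated above:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth: one pin per bus bit, times the per-pin rate."""
    return bus_width_bits * gbps_per_pin / 8  # bits -> bytes

print(f"Fury X  (4096-bit HBM1 @ 1.00 Gbps): {peak_bandwidth_gb_s(4096, 1.00):.0f} GB/s")  # 512
print(f"Vega FE (2048-bit HBM2 @ 1.88 Gbps): {peak_bandwidth_gb_s(2048, 1.88):.0f} GB/s")  # ~481
```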

The new GPU is aimed at professional/workstation users rather than gamers:

As important as the Vega hardware itself is, for AMD the target market for the hardware is equally important if not more. Vega's the first new high-end GPU from the company in two years, and it comes at a time when GPU sales are booming. Advances in machine learning have made GPUs the hottest computational peripheral since the x87 floating point co-processor, and unfortunately for AMD, they've largely missed the boat on this. Competitor NVIDIA has vastly grown their datacenter business over just the last year on the back of machine learning, thanks in large part to the task-optimized capabilities of the Pascal architecture. And most importantly of all, these machine learning accelerators have been highly profitable, fetching high margins even when the cards are readily available.

For AMD then, Vega is their chance to finally break into the machine learning market in a big way. The GPU isn't just a high-end competitor, but it offers high performance FP16 and INT8 modes that earlier AMD GPU architectures lacked, and those modes are in turn immensely beneficial to machine learning performance. As a result, for the Vega Frontier Edition launch, AMD is taking a page from the NVIDIA playbook: rather than starting off the Vega generation with consumer cards, they're going to launch with professional cards for the workstation market.

To be sure, the Radeon Vega Frontier Edition is not officially branded as a Pro or WX series card. But in terms of AMD's target market, it's unambiguously a professional card. The product page is hosted on the pro graphics section of AMD's website, the marketing material is all about professional uses, and AMD even goes so far as to tell gamers to hold off for cheaper gaming cards later on in their official blog post. Consequently, the Vega FE is about the closest analogue AMD has to NVIDIA's Titan series cards, which, although gaming capable, have in the last generation become almost exclusively professional focused.


Original Submission

AMD Launches the Radeon Vega Frontier Edition 5 comments

First it was unveiled, now it has launched. AMD has launched the Radeon Vega Frontier Edition at $999 for the air-cooled version and $1499 for liquid-cooled. The High Bandwidth Memory 2.0 included has been confirmed to be two stacks of 8-layer 8 GB HBM:

After what appears to be a very unusual false start, AMD has now formally launched their new Radeon Vega Frontier Edition card. First announced back in mid-May, the unusual card, which AMD is all but going out of their way to dissuade their usual consumer base from buying, will be available today for $999. Meanwhile its liquid cooled counterpart, which was also announced at the time, will be available later on in Q3 for $1499.

Interestingly, both of these official prices are some $200-$300 below the prices first listed by SabrePC two weeks ago in the false start. To date AMD hasn't commented on what happened there, however it's worth noting that SabrePC is as of press time still listing the cards for their previous prices, with both cards reporting as being in-stock.

[...] Feeding the GPU is AMD's previously announced dual stack HBM2 configuration, which is now confirmed to be a pair of 8 layer, 8GB "8-Hi" stacks. AMD has the Vega FE's memory clocked at just under 1.9Gbps, which gives the card a total memory bandwidth of 483GB/sec. And for anyone paying close attention to AMD's naming scheme here, they are officially calling this "HBC" memory – a callback to Vega's High Bandwidth Cache design.


Original Submission

AMD's Vega 64: Matches the GTX 1080 but Not in Power Consumption 17 comments

AMD's new Vega 64 GPU offers comparable performance at a similar price to Nvidia's GTX 1080, which was released over a year ago. But it does so while consuming a lot more power under load (over 100 Watts more). Vega 56, however, runs faster than the GTX 1070 at a slightly lower price:

So how does AMD fare? The answer to that is ultimately going to hinge on your opinion on power efficiency. But before we get too far, let's start with the Radeon RX Vega 64, AMD's flagship card. Previously we've been told that it would trade blows with NVIDIA's GeForce GTX 1080, and indeed it does just that. At 3840x2160, the Vega 64 is on average neck-and-neck with the GeForce GTX 1080 in gaming performance, with the two cards routinely trading the lead, and AMD holding it more often. Of course the "anything but identical" principle applies here, as while the cards are equal on average, they can sometimes be quite far apart on individual games.

Unfortunately for AMD, their GTX 1080-like performance doesn't come cheap from a power perspective. The Vega 64 has a board power rating of 295W, and it lives up to that rating. We've seen power measurements at the wall anywhere between 110W and 150W higher than the GeForce GTX 1080, all for the same performance. Thankfully for AMD, buyers are focused on price and performance first and foremost (and in that order), so if all you're looking for is a fast AMD card at a reasonable price, the Vega 64 delivers where it needs to: it is a solid AMD counterpart to the GeForce GTX 1080. However if you care about the power consumption and the heat generated by your GPU, the Vega 64 is in a very rough spot.
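
Framed as performance per watt, matching a competitor at higher board power is a straight ratio. A rough sketch using the 295W rating quoted above against the GTX 1080's 180W TDP (the latter is Nvidia's spec sheet figure, not a number from this review):

```python
# Rough efficiency ratio at equal performance, from the power figures above.
# The GTX 1080's 180 W TDP is Nvidia's rated spec, not a measurement here;
# the 110-150 W wall-power gap above suggests the real gap can be wider.
vega64_board_power_w = 295
gtx1080_tdp_w = 180

print(f"Vega 64 rated power vs GTX 1080: {vega64_board_power_w / gtx1080_tdp_w:.2f}x")
# -> 1.64x the rated power for roughly the same gaming performance
```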

On the other hand, the Radeon RX Vega 56 looks better for AMD, so it's easy to see why in recent days they have shifted their promotional efforts to the cheaper member of the RX Vega family. Though a step down from the RX Vega 64, the Vega 56 delivers around 90% of Vega 64's performance for 80% of the price. Furthermore, when compared head-to-head with the GeForce GTX 1070, its closest competition, the Vega 56 enjoys a small but none the less significant 8% performance advantage over its NVIDIA counterpart. Whereas the Vega 64 could only draw to a tie, the Vega 56 can win in its market segment.

[...] The one wildcard here with the RX Vega 56 is going to be where retail prices actually end up. AMD's $399 MSRP is rather aggressive, especially when GTX 1070 cards are retailing for closer to $449 due to cryptocurrency miner demand. If they can sustain that price, then Vega 56 is going to be real hot stuff, besting GTX 1070 in price and performance. Otherwise at GTX 1070-like prices it still has the performance advantage, but not the initiative on pricing. At any rate, this is a question we can't answer today; the Vega 56 won't be launching for another two weeks.
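
The value argument reduces to price/performance arithmetic. A sketch using the figures quoted above (an ~8% performance lead, the $399 MSRP, and the ~$449 street price of the GTX 1070); retail prices are of course a moving target:

```python
# Relative value of the RX Vega 56 versus the GTX 1070, from the figures
# quoted above. Performance is normalized to GTX 1070 = 1.00.
vega56_price, gtx1070_price = 399, 449
vega56_perf = 1.08  # ~8% faster than the GTX 1070

advantage = (vega56_perf / vega56_price) / (1.0 / gtx1070_price) - 1
print(f"Vega 56 perf-per-dollar advantage: {advantage:.0%}")
# -> ~22%, but only if the $399 MSRP actually holds at retail
```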

Both the Vega 64 and Vega 56 include 8 GB of HBM2 memory.

Also at Tom's Hardware.

Previously: AMD Unveils the Radeon Vega Frontier Edition
AMD Launches the Radeon Vega Frontier Edition
AMD Radeon RX Vega 64 and 56 Announced


Original Submission

AMD Profits in Q3 2017 9 comments

AMD turned a profit last quarter:

2017 has been a great year for the tech enthusiast, with the return of meaningful competition in the PC space. Today, AMD announced their third quarter earnings, which beat expectations, and put the company's ledgers back in the black in their GAAP earnings. For the quarter, AMD had revenues of $1.64 billion, compared to $1.31 billion a year ago, which is a gain of just over 25%. Operating income was $126 million, compared to a $293 million loss a year ago, and net income was $71 million, compared to a net loss of $406 million a year ago. This resulted in earnings per share of $0.07, compared to a loss per share of $0.50 in Q3 2016.

[...] The Computing and Graphics segment has been a key to these numbers, with some impressive launches this year, especially on the CPU side. Revenue for this segment was up 74% to $819 million, and AMD attributes this to strong sales of both Radeon GPUs and Ryzen desktop processors. Average Selling Price (ASP) was also up significantly thanks to Ryzen sales. AMD is still undercutting Intel on price, but they don't have to almost give things away like they did the last couple of years. ASP of GPUs was also up significantly, and the proliferation of cryptocurrency likely played a large part in that. Operating income for the segment was an impressive $70 million, compared to an operating loss of $66 million last year.
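
The headline numbers are internally consistent, as a quick check of the reported year-over-year figures shows:

```python
# Sanity check on AMD's Q3 2017 year-over-year figures as reported above.
revenue_2017, revenue_2016 = 1.64e9, 1.31e9
segment_revenue_2017, segment_growth = 819e6, 0.74

print(f"Total revenue growth: {revenue_2017 / revenue_2016 - 1:.1%}")  # ~25.2%
print(f"Implied segment revenue a year ago: "
      f"${segment_revenue_2017 / (1 + segment_growth) / 1e6:.0f}M")
# -> roughly $471M for Computing and Graphics in Q3 2016
```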

When AMD turns a profit, it is news. AMD's stock still plunged on concerns over future growth. Citi Research has predicted big losses for AMD as Intel ships its Coffee Lake CPUs.

Previously: AMD Ryzen Launch News
AMD GPU Supply Exhausted By Cryptocurrency Mining, AIBs Now Directly Advertising To Miners
AMD Epyc 7000-Series Launched With Up to 32 Cores
Cryptocoin GPU Bubble?
Ethereum Mining Craze Leads to GPU Shortages
Used GPUs Flood the Market as Ethereum's Price Crashes Below $150
AMD Radeon RX Vega 64 and 56 Announced
First Two AMD Threadripper Chips Out on Aug. 10, New 8-Core Version on Aug. 31
Cryptocurrency Mining Wipes Out Vega 64 Stock
AMD Expected to Release Ryzen CPUs on a 12nm Process in Q1 2018


Original Submission

  • (Score: 2) by Justin Case on Monday July 31 2017, @05:30PM (9 children)

    by Justin Case (4239) on Monday July 31 2017, @05:30PM (#547258) Journal

    Last time I went down this path I found some comparison site that offered to rank cards from the major manufacturers by raw graphics performance and other specs. I carefully compared several and selected one nearer the top end of the moderate price range.

    It sucks.

On a good day I get maybe 7 FPS. On a bad day the graphics lock up entirely for 30 seconds. Can't even alt-tab to see what's going on.

    Task Manager shows all 4 CPUs at around 50% to 70% but never saturated. Plenty of RAM. Disk IO does not seem to be a bottleneck. It really seems to be all on the graphics card.

    I'd like to have a stack of them to swap in and out until I get one that is "fast enough". I'd be willing to pay somewhat more, but not just to get ripped off again.

    Is there any way to know in advance that what you are considering will be worthwhile?

    • (Score: 2) by turgid on Monday July 31 2017, @05:39PM

      by turgid (4318) Subscriber Badge on Monday July 31 2017, @05:39PM (#547264) Journal

      Let the Lunatic Fringe buy the fancy graphics cards. After a year or two, when the drivers have stabilised, choose a few from those charts then google what the people on the forums are saying about them.

    • (Score: 0) by Anonymous Coward on Monday July 31 2017, @06:39PM (1 child)

      by Anonymous Coward on Monday July 31 2017, @06:39PM (#547287)

      the only way to not get ripped off (or treated as a digital slave by your precious slavewareOS) is to educate yourself. anything less and you will be victimized.

      • (Score: 2) by Justin Case on Monday July 31 2017, @06:43PM

        by Justin Case (4239) on Monday July 31 2017, @06:43PM (#547289) Journal

        So I guess asking for thoughts from people I consider my technical peers (or at least some of them) does not count in your world as one way to educate myself?

    • (Score: 2) by gman003 on Tuesday August 01 2017, @12:14AM

      by gman003 (4155) on Tuesday August 01 2017, @12:14AM (#547437)

      What the hell did you buy, and what the hell are you running that gets 7 FPS? I think you can get better than that with integrated graphics on almost any game these days. And the "locking up for 30 seconds" makes me suspect an entirely different issue is behind it all - that sounds more like thrashing swap than a GPU issue, despite your claim that you had plenty and no disk bottleneck.

    • (Score: 2) by tibman on Tuesday August 01 2017, @12:17AM (4 children)

      by tibman (134) Subscriber Badge on Tuesday August 01 2017, @12:17AM (#547440)

      Gamer here. What card did you get? What games couldn't it play? What games do you want to play?

      --
      SN won't survive on lurkers alone. Write comments.
      • (Score: 2) by Justin Case on Tuesday August 01 2017, @04:02PM (3 children)

        by Justin Case (4239) on Tuesday August 01 2017, @04:02PM (#547685) Journal

        Thanks for offering clue.

        I got a Radeon R7 240 (*1). At the time it was one of the highest rated -- and available -- "High Mid Range" AMD cards per www.videocardbenchmark.net. I also saw advice to get something a couple years old so the drivers had a chance to "settle in". From what I hear that is still good advice today.

        I chose AMD over NVidia because I've heard NVidia is more linux-hostile. Not that linux matters in this case, because the game only runs on Windows (where have we heard this before?) and IMHO any type of virtualization or emulation layer could only be expected to slow things down.

        I dug up one of those Windows 7 SP1 hard drives that comes with the computer when you buy it; never booted until I decided to try this game. Given my decades of professional experience with Windows, my personal policy is that a Windows instance will never see a live network connection. So installing drivers, patches, etc. all happens via an airgap-hopping thumb drive.

        The game (Trainz A New Era) requires at least ATI 5550 (*2) but recommends AMD HD 6950 (*3).

        *1: Rated 967, higher is supposed to be better
        *2: Rated 539
        *3: A "High End" card. At the time of purchase, what was available was priced out of my range.

        So today, I could drop $BIGBUX on a higher-rated card, but my fundamental question is how do I know in advance that it will deliver? What I got supposedly had 1.8 times the power (rating) of the minimum, but still barely works.

        In case you're curious, the game is somewhat like SecondLife. (Is that still a thing?) Lots of user-created content; long sight lines leading to potentially millions of polygons to render. The game does have settings to pare down the more distant objects, but that kinda ruins it.

        • (Score: 3, Informative) by takyon on Wednesday August 02 2017, @06:25PM

          by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Wednesday August 02 2017, @06:25PM (#548039) Journal

The R7 240 is not high mid range. It is low end, a hint being the $70-80 price tag. You were fooled by the "High to Mid Range" description of the list at videocardbenchmark.net. From what I can tell it's more like the middle of a long list with the bottom being goddamn old and slow cards. You have to be careful with PassMark, period, as the numbers can be misleading.

          http://www.anandtech.com/show/9217/the-amd-a8-7650k-apu-review-also-new-testing-methodology/7 [anandtech.com]
          http://www.tomshardware.com/reviews/gpu-hierarchy,4388.html [tomshardware.com]

          Nvidia's GT 1030 [videocardbenchmark.net] is around that price and more than double the PassMark at 2281.

          Linux hostile? Maybe more like open source hostile [pcworld.com].

          Now we could call this an epic fail on your part right here, but from the AnandTech benchmarks linked above and this video [youtube.com], you can see that the R7 240 can run modern games at well over 7 FPS. So what's the problem then?

          I can only conclude that Trainz: A New Era is a crappily coded title. Lo and behold [steamcommunity.com]:

          Brand new system:

          Intel 5930k
          Asus X99 Pro MB
          16 GB Corsair 2666 MHz
          2x 980GTX (SLI)

          Fresh Win 7 Install, fully updated, newest drivers all around.

          Max settings both in launcher and in-game.

          Result: 8 FPS

          Total system price: 3000+ $

          So sad...

          Yeah that is a real tear jerker. On the bright side, no other game should give you any problems.

          let me guess...........you maxed out view distance and set shadows to 4096.........DON'TTTTTTTTTTTTTTTTTT until it's patched more.

          Same here high end PC SLI 4GB cards and horrible FPS. People have been complaining since day one, this was nothing but a money grab and the next time they release something remember this. Burn me once shame on you burn me twice shame on me.

          Well. Back in 2015 and 2014, the game was poorly optimised for everything. SP1 Hotfix 4 makes the game a lot better. You only need to be patient.

I know shadows are not the best yet, but what I noticed the most is that 1920 x 1080 @ 60Hz is the maximum resolution to get good FPS (30). Less than 30 FPS means lags. I can run the game with no lags. Shadows off, tree detail normal, scenery detail highest, post processing low, draw distance 15,000+ meters (yes, I overrode the distance).
These are my current settings. Simply download and install SP1, you get performance trees. When you upgrade to SP1 HT4, you will get the best performance in the entire game.

Give it a little time, and this game will be a lot better than any other Trainz game.

Back in July 2015 I was one of the people complaining about Trainz and how badly it looked and ran, but I have to say, after SP1 and now SP2, Trainz has improved hugely. I'm enjoying Trainz very much now; I think it looks very good, especially with some addons, and runs very smooth.

          User created content and long view distances probably hurt too, but you picked the wrong title to judge the state of GPUs by. Make sure you update Trainz to SP2 (horrible patch names for a reason that should be obvious) and come back.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by tibman on Sunday August 06 2017, @03:00AM (1 child)

          by tibman (134) Subscriber Badge on Sunday August 06 2017, @03:00AM (#549357)

          I've got an extra HD 6970 and R9 270. SN doesn't have priv messaging but if you can post a way to contact you then i'll mail you one. Just let me know which you'd prefer.

          --
          SN won't survive on lurkers alone. Write comments.
          • (Score: 2) by Justin Case on Sunday August 06 2017, @03:38PM

            by Justin Case (4239) on Sunday August 06 2017, @03:38PM (#549544) Journal

            This is a generous offer, thank you. I hope at least I can reimburse your packaging and shipping costs.

            Please send an email to imbrie2 who is at the domain zotline doubt com. (Of course I hope you will doubt part of that address.) I will then reply with a mailing address.

            It looks like the 270 is the better card, though I've already proven myself pretty poor at selecting. This page makes me think there are only drivers for WXP. (I'm on W7.)

            https://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-200-Series-Drivers.aspx [amd.com]

            For the 6970 I hear it runs hot and the fans are loud. Has that been your experience?

  • (Score: 0) by Anonymous Coward on Monday July 31 2017, @06:41PM (1 child)

    by Anonymous Coward on Monday July 31 2017, @06:41PM (#547288)

    hey amd, can you actually get cards on the shelves? are you going to rob peter to pay paul or will your previous gen(s) (rx4xx/rx5xx) be available too? when is this going to happen? q4 2017? q1 2018? in the year 3000?

  • (Score: 2) by tibman on Tuesday August 01 2017, @12:35AM

    by tibman (134) Subscriber Badge on Tuesday August 01 2017, @12:35AM (#547446)

Haven't seen real benchmarks yet but it isn't looking great. We're all in "wait and see" mode so i can't be too critical. People who have already invested in a FreeSync monitor will probably get a Vega and pray to the driver gods for some efficiency gains. My thought on this: if AMD is this far behind right now, then what happens when NVIDIA releases their new flagship card in Q1 2018?

    For how hot Vega 64 looks to be i would go liquid cooled. But then i would be thinking "damn, that's the same price as a GTX 1080 Ti !" and spend a month waffling on the purchase. I don't think this is a disaster but it looks grim : /

    --
    SN won't survive on lurkers alone. Write comments.