
posted by martyb on Tuesday April 11 2017, @08:43AM
from the gooder-faster-cheaper dept.

Some Soylentils were disappointed by the gaming performance of AMD's Ryzen CPUs when they were launched last month. By now, updates have eliminated some of the advantage that Intel CPUs had, but the potential gains differ depending on the game:

The first big Ryzen patch was for Ashes of the Singularity. Ryzen's performance in Ashes was arguably one of the more surprising findings in the initial benchmarking. The game has been widely used as a showcase for the advantages of DirectX 12 and the multithreaded scaling it enables. We spoke to the game's developers, and they told us that its engine splits up the work it has to do between multiple cores automatically.
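Oxide hasn't said publicly what the patch or the engine does internally, so the following is only a minimal sketch of what "splits up the work between multiple cores automatically" usually means in practice: per-frame work divided into chunks and farmed out to however many hardware threads the CPU reports (16 on an 1800X or 6900K). The Unit type, the chunk sizing, and the use of std::async below are illustrative assumptions, not anything taken from the Ashes engine.

```cpp
// Minimal illustration only -- not Oxide's code. Per-frame work is split into
// chunks and updated concurrently on every hardware thread the CPU exposes.
#include <algorithm>
#include <future>
#include <thread>
#include <vector>

struct Unit { float x = 0, y = 0; };   // hypothetical per-entity state

void update_chunk(std::vector<Unit>& units, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {   // one slice of the frame's simulation work
        units[i].x += dt;
        units[i].y += dt;
    }
}

void update_frame(std::vector<Unit>& units, float dt) {
    const size_t workers = std::max(1u, std::thread::hardware_concurrency());
    const size_t chunk   = (units.size() + workers - 1) / workers;

    std::vector<std::future<void>> tasks;
    for (size_t begin = 0; begin < units.size(); begin += chunk) {
        const size_t end = std::min(begin + chunk, units.size());
        tasks.push_back(std::async(std::launch::async,
                                   update_chunk, std::ref(units), begin, end, dt));
    }
    for (auto& t : tasks) t.get();   // wait for every slice before presenting the frame
}
```

How much a change like this helps a particular CPU then depends on how evenly the chunks divide and how much of a frame can actually run in parallel, which is presumably where a Ryzen-specific tuning pass has room to work.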

In general, the Ryzen 1800X performed at about the same level as Intel's Broadwell-E 6900K. Both parts are 8-core, 16-thread chips, and while Broadwell-E has a modest instructions-per-cycle advantage in most workloads, Ryzen's higher clock speed is enough to make up for that deficit. But in Ashes of the Singularity under DirectX 12, the 6900K had average frame rates about 25 percent better than the AMD chip.

In late March, Oxide/Stardock released a Ryzen performance update for Ashes, and it has gone a long way toward closing that gap. PC Perspective tested the update, and depending on graphics settings and memory clock speeds, Ryzen's average frame rate went up by between 17 and 31 percent. The 1800X still trails the 6900K, but now the gap is about 9 percent, or even less with overclocked memory (but we'll talk more about memory later on).
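As a rough sanity check of how those percentages fit together, here is the arithmetic with assumed round numbers; the real averages vary with graphics settings and memory clocks, which is why the result below lands near, rather than exactly on, the reported 9 percent.

```cpp
// Rough arithmetic only -- the frame rates are assumed round numbers,
// not PC Perspective's measured averages.
#include <cstdio>

int main() {
    const double intel_fps    = 100.0;                // pretend the 6900K averages 100 fps
    const double ryzen_before = intel_fps / 1.25;     // a 25% Intel lead puts the 1800X at 80 fps
    const double ryzen_after  = ryzen_before * 1.17;  // low end of the 17-31% patch uplift

    std::printf("1800X before patch: %.1f fps\n", ryzen_before);
    std::printf("1800X after patch:  %.1f fps\n", ryzen_after);
    std::printf("remaining 6900K lead: %.1f%%\n",
                (intel_fps / ryzen_after - 1.0) * 100.0);   // ~6.8% with these inputs
    return 0;
}
```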


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Insightful) by tibman on Tuesday April 11 2017, @03:22PM (2 children)

    by tibman (134) Subscriber Badge on Tuesday April 11 2017, @03:22PM (#492298)

    I care and here's why. AMD fans who have been waiting over a year for Ryzen (and hopefully saving $$ for it) bought the very first available processor, which is the R7 1800(X). That was six weeks ago. Since then we've had BIOS tweaking, Windows power-plan tweaking, and now finally some game tweaking. In the beginning everyone was talking about how it's a new platform and it'll take time for things to settle. This AotS update is just part of the proof that there are substantial gains available from these updates/tweaks.

    Calling the R7 a workstation CPU sounds like you're indirectly saying it's not as good at games. If a gamer's budget is R5 then great, get that. If the budget can fit an R7 then go for that instead. Though normally the best advice is to go for an R5 and spend the extra $200 on the GPU. I think that's bad advice right now because AMD doesn't sell a high-end GPU : ( Running an RX 480 will only cost you $200-250 and that's good enough to play every game out there. I've been playing through Doom at 100+ FPS on an ASUS RX 480. If you went NVIDIA then sure, get a nice $500 GPU. Those of us with FreeSync monitors are waiting for AMD to release a high-end GPU.

    --
    SN won't survive on lurkers alone. Write comments.
  • (Score: 2) by Marand on Tuesday April 11 2017, @09:50PM

    by Marand (1081) on Tuesday April 11 2017, @09:50PM (#492476) Journal

    Though normally the best advice is to go for an R5 and spend the extra $200 on the GPU. I think that's bad advice right now because AMD doesn't sell a high-end GPU

    Yeah, I went with the "get the R7 now and upgrade GPU later" logic. I don't regret getting the R7 instead of waiting for the R5, though I do kind of regret that I'm still dealing with the nvidia card I had. It's being really fucking weird with this system: it works fine from a cold boot but won't send a signal to the displays after a soft reset. It's also not the first problem I've had with the card, which has always done strange things if all four inputs (2x DVI, HDMI, DP) are used simultaneously, even before the new hardware.

    I've had enough problems with nvidia lately that I'm most likely going to bite the bullet and go AMD for my next GPU. I was already getting annoyed at certain things nvidia has done, like taking out Linux driver features because Windows doesn't have them, and crippling certain OpenGL calls at the driver level depending on what GPU it detects, but the strong Linux support made it hard to justify a switch. So I kept using nvidia GPUs while hoping AMD's Linux support would improve, but after this last card I think I'd rather deal with driver issues. At least drivers can be updated.

  • (Score: 2) by Hairyfeet on Wednesday April 12 2017, @05:08AM

    by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Wednesday April 12 2017, @05:08AM (#492624) Journal

    Did I say anywhere it couldn't be used as a gaming chip? It's just not what it was designed for, no different than how nobody takes the newest $1100 Intel CPU and claims it's a gaming CPU; in fact the previous-generation i5 has been shown to beat the top-of-the-line Kaby Lake i7 in most games by a decent amount.

    What I did say is it's pointless as a gaming CPU, simply because nowhere in the near future are we gonna have 16-thread games. In all honesty I'd be amazed if we are even up to 8 threads by 2020, what with so many games as of 2017 only using a couple of threads. But ya know what has no problem using all 16 threads? Workstation loads! Things like multitrack audio editing and video editing (frankly the only time my FX-8320e doesn't have half its cores parked from lack of things to do; in fact I don't think I've played a game on my system yet where half my cores aren't parked), DB processing, and a bazillion other workloads one builds a workstation for. And for those tasks? Frankly the R7 BEATS the $1100 Intel CPU, as it has all the features you'd want in a workstation-class chip, like support for ECC and virtual machines.

    Now if you want to buy an R7 1800X for gaming? Go right ahead, but you are buying the equivalent of a top fuel funny car to do your grocery shopping; I seriously doubt even half the threads are ever gonna be used. IMHO the better option would be to buy the R5 (which you can buy right now, as several retailers have jumped the gun and put their stock up for sale the second they got their mitts on 'em) and use the savings to buy a better GPU, which will give you a better all-around experience. But if you just have money to burn and want it just to say you have 16 threads? Hell, there are guys out there running SLI Titans at 1080p; seems kinda pointless, but it's your money, blow away.

    --
    ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
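For what it's worth, the 16-thread argument in the comment above is easy to make concrete. The sketch below isn't tied to any particular audio, video, or database package; it just puts independent work on every hardware thread the machine reports, which is the shape of a workstation load, in contrast to a 2017-era game that keeps only a couple of threads busy.

```cpp
// Minimal sketch of an embarrassingly parallel "workstation" load: one worker
// per hardware thread, each crunching its own slice of data independently.
// An R7 1800X or a 6900K reports 16 hardware threads; the inner loop is just a
// stand-in for encoding, mixing, or query work.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(workers, 0.0);
    std::vector<std::thread> pool;

    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([w, &partial] {
            double sum = 0.0;
            for (long i = 0; i < 50'000'000; ++i)      // busy work, one slice per thread
                sum += static_cast<double>(i % 97) * 0.5;
            partial[w] = sum;                          // each thread writes its own slot
        });
    }
    for (auto& t : pool) t.join();

    const double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("kept %u hardware threads busy, checksum %.0f\n", workers, total);
    return 0;
}
```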