posted by martyb on Tuesday April 11 2017, @08:43AM   Printer-friendly
from the gooder-faster-cheaper dept.

Some Soylentils were disappointed by the gaming performance of AMD's Ryzen CPUs when they were launched last month. By now, updates have eliminated some of the advantage that Intel CPUs had, but the potential gains differ depending on the game:

The first big Ryzen patch was for Ashes of the Singularity. Ryzen's performance in Ashes was arguably one of the more surprising findings in the initial benchmarking. The game has been widely used as a kind of showcase for the advantages of DirectX 12 and the multithreaded scaling it enables. We spoke to the game's developers, and they told us that its engine splits up the work it has to do between multiple cores automatically.

In general, the Ryzen 1800X performed at about the same level as Intel's Broadwell-E 6900K. Both parts are 8-core, 16-thread chips, and while Broadwell-E has a modest instructions-per-cycle advantage in most workloads, Ryzen's higher clock speed is enough to make up for that deficit. But in Ashes of the Singularity under DirectX 12, the 6900K had average frame rates about 25 percent better than the AMD chip.

In late March, Oxide/Stardock released a Ryzen performance update for Ashes, and it has gone a long way toward closing that gap. PC Perspective tested the update, and depending on graphics settings and memory clock speeds, Ryzen's average frame rate went up by between 17 and 31 percent. The 1800X still trails the 6900K, but now the gap is about 9 percent, or even less with overclocked memory (but we'll talk more about memory later on).


Original Submission

 
  • (Score: 5, Insightful) by Hairyfeet on Tuesday April 11 2017, @09:10AM (13 children)

    by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Tuesday April 11 2017, @09:10AM (#492195) Journal

    You have a $500 chip that is within 25% of a $1100 chip? I'd call that a super duper touchdown of a win. And who in the hell is recommending an $1100 8-core Intel as a gaming chip? Oh right, NOBODY, because it's a workstation CPU. Most games don't even use 4 threads these days, so are you REALLY buying a 16-thread CPU for gaming? Why?

    If you want a gaming chip? Get the R5 or R3; those are mainstream chips. The R7 is a workstation CPU that gamers just so happen to be buying simply because it's close to the performance of an $1100 Intel CPU at less than half the price.

    --
    ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
  • (Score: 5, Informative) by Anonymous Coward on Tuesday April 11 2017, @10:01AM (1 child)

    by Anonymous Coward on Tuesday April 11 2017, @10:01AM (#492205)

    Anything without ECC *IS NOT* a workstation chip. It may be classed as 'workstation-class performance', but that is not sufficient for a workstation chip. The usual standard would be: dual-socket is workstation; quad-socket and above with a single system image is server grade. Neither class has ever qualified without at least parity memory, and before ECC became popular, pretty much everything had parity anyway.

    Going off that categorization, the 500 dollar AMD chip is actually a *BETTER* purchase, since it actually supports ECC *AND* overclocking, which on the Intel side is an impossible combination, outside of engineering samples or very unique limited production run chips like the G3258, which AFAIK was the only one-off chip to ever contain both features in a single SKU since they moved the memory controller onto the die.

    • (Score: 1, Funny) by Anonymous Coward on Tuesday April 11 2017, @05:40PM

      by Anonymous Coward on Tuesday April 11 2017, @05:40PM (#492376)

      Unless you're prophesying the imminent arrival of enthusiast-grade (i.e. overclockable) ECC memory, you don't appear to have thought this entirely through.

  • (Score: 2) by ledow on Tuesday April 11 2017, @10:44AM (4 children)

    by ledow (5567) on Tuesday April 11 2017, @10:44AM (#492210) Homepage

    So long as you recompile every piece of software you have, with a compiler that optimises for that architecture, and assuming that this isn't "recoding" to make it better for that chip but just a simple recompilation (hint: it's more than a recompilation, it's a re-optimisation for one particular class of chips from the raw code).

    Notice how this is for ONE GAME. Which is also the poster-child app for this, the one that's mentioned whenever anyone has discussed Ryzen since its launch.

    It's not quite as clear-cut as "this processor is faster and cheaper".
    In fact, it's much more like "if developers want to target our one specific cheaper chip explicitly, they can almost get close to the performance of the rival market-leading chips for a lot of directed effort, which is effort that doesn't translate to performance improvements for those other chips".

    Given that you want gamers to buy this chip, that's probably not the best scenario. Studios will have to choose to support it, put effort into refactoring code to use it, use specific compilers for compiling it, and push it out as a binary that is able to take advantage of it (Would it require two binaries? One optimised for other chips, and one for Ryzen? Possibly, depending on how you manage things).
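    On the two-binaries question: GCC's function multi-versioning means it doesn't have to come to that. With the `target_clones` attribute (GCC 6+ on x86-64 with glibc; the `sum` function below is purely illustrative), the compiler emits several variants of a hot function in one binary and picks the right one at load time:

    ```c
    /* Sketch of GCC function multi-versioning: one source file, one binary,
     * per-architecture variants selected automatically at load time.
     * Assumes GCC 6+ targeting x86-64 Linux; the function is illustrative. */
    #include <stdio.h>

    __attribute__((target_clones("avx2", "default")))
    long sum(const int *a, int n)
    {
        long s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];          /* the AVX2 clone can auto-vectorise this loop */
        return s;
    }

    int main(void)
    {
        int a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        printf("%ld\n", sum(a, 8)); /* same result whichever clone runs */
        return 0;
    }
    ```

    The result is identical on every CPU; only the code path differs, so a studio ships one executable rather than a Ryzen-specific build.
    
    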

    • (Score: 0) by Anonymous Coward on Tuesday April 11 2017, @10:55AM

      by Anonymous Coward on Tuesday April 11 2017, @10:55AM (#492211)

      What are you replying to? It's as if the parent had said "25% better" performance instead of "within 25%".

    • (Score: 1, Funny) by Anonymous Coward on Tuesday April 11 2017, @01:27PM

      by Anonymous Coward on Tuesday April 11 2017, @01:27PM (#492239)

      No problem! I'm a Gentoo user!

    • (Score: 3, Informative) by tibman on Tuesday April 11 2017, @03:00PM

      by tibman (134) Subscriber Badge on Tuesday April 11 2017, @03:00PM (#492283)

      It's not an across the board thing. Some games play better on Ryzen and some don't. Ashes of the Singularity didn't and it's one of the few DX12 games so it sort of stuck out.

      No special compiler required. No special binary required. For Ashes it was a recompile with a newer version of Visual C++, because there was a bug in the code emitted by the older compiler.

      --
      SN won't survive on lurkers alone. Write comments.
    • (Score: 0) by Anonymous Coward on Tuesday April 11 2017, @03:37PM

      by Anonymous Coward on Tuesday April 11 2017, @03:37PM (#492308)

      Or you write a reusable library that encapsulates the differences into a single API. That's CS-101 stuff. Heck, AMD would probably pay devs to do it for AAA titles.
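      That encapsulation pattern can be sketched with GCC/Clang's `__builtin_cpu_supports` (x86 only). The function names here are illustrative, not from any real library; the variant function reuses the scalar loop since only the dispatch mechanism matters:

      ```c
      /* Minimal sketch of "one API, per-CPU implementations": callers use
       * dot() and never care which CPU they run on.  x86 GCC/Clang only. */
      #include <stdio.h>

      static int dot_scalar(const int *a, const int *b, int n)
      {
          int s = 0;
          for (int i = 0; i < n; i++)
              s += a[i] * b[i];
          return s;
      }

      /* A real library would use AVX2 intrinsics here; this sketch only
       * demonstrates the dispatch, so it falls through to the scalar loop. */
      static int dot_avx2(const int *a, const int *b, int n)
      {
          return dot_scalar(a, b, n);
      }

      /* The single public entry point. */
      int dot(const int *a, const int *b, int n)
      {
          __builtin_cpu_init();   /* harmless here; needed only in early init code */
          if (__builtin_cpu_supports("avx2"))
              return dot_avx2(a, b, n);
          return dot_scalar(a, b, n);
      }

      int main(void)
      {
          int a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8};
          printf("%d\n", dot(a, b, 4));   /* 1*5 + 2*6 + 3*7 + 4*8 = 70 */
          return 0;
      }
      ```
      
      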

  • (Score: 3, Insightful) by tibman on Tuesday April 11 2017, @03:22PM (2 children)

    by tibman (134) Subscriber Badge on Tuesday April 11 2017, @03:22PM (#492298)

    I care and here's why. AMD fans who have been waiting over a year for Ryzen (and hopefully saving $$ for it) bought the very first available processor, which is the R7 1800(x). That was six weeks ago. Since then we've had bios tweaking, windows (power) tweaking, and now finally some game tweaking. In the beginning everyone was talking about how it's a new platform and it'll take time for things to settle. This AotS update is just part of the proof that there are substantial gains available from these updates/tweaks.

    Calling the R7 a workstation CPU sounds like you're indirectly saying it's not as good at games. If a gamer's budget is R5 then great, get that. If the budget can fit an R7 then go for that instead. Though normally the best advice is to go for an R5 and spend the extra 200$ on the GPU. I think that's bad advice right now because AMD doesn't sell a high-end GPU : ( Running an RX480 will only cost you 200-250$ and that's good enough to play every game out there. I've been playing through doom at 100+ FPS on an ASUS RX480. If you went NVIDIA then sure, get a nice 500$ GPU. Those of us with FreeSync monitors are waiting for AMD to release a high-end GPU.

    --
    SN won't survive on lurkers alone. Write comments.
    • (Score: 2) by Marand on Tuesday April 11 2017, @09:50PM

      by Marand (1081) on Tuesday April 11 2017, @09:50PM (#492476) Journal

      Though normally the best advice is to go for an R5 and spend the extra 200$ on the GPU. I think that's bad advice right now because AMD doesn't sell a high-end GPU

      Yeah, I went with the "get the R7 now and upgrade GPU later" logic. I don't regret getting the R7 instead of waiting for the R5, though I do kind of regret that I'm still dealing with the nvidia card I had. It's being really fucking weird with this system: works fine from a cold boot but won't send a signal to the displays after a soft reset. It's also not the first problem I've had with the card, which has always done strange things if all four outputs (2x DVI, HDMI, DP) are used simultaneously, even before the new hardware.

      I've had enough problems with nvidia lately that I'm most likely going to bite the bullet and go AMD for my next GPU. I was already getting annoyed at certain things nvidia has done, like taking out linux driver features because Windows doesn't have them, and crippling certain OpenGL calls at the driver level depending on what GPU it detects, but the strong Linux support made it hard to justify a switch. So, I kept using nvidia GPUs while hoping AMD's linux support would improve, but after this last card I think I'd rather deal with driver issues. At least drivers can be updated.

    • (Score: 2) by Hairyfeet on Wednesday April 12 2017, @05:08AM

      by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Wednesday April 12 2017, @05:08AM (#492624) Journal

      Did I say anywhere it couldn't be used as a gaming chip? It's just not what it was designed for, no different than how nobody takes the newest $1100 Intel CPU and claims it's a gaming CPU. In fact, the previous-generation i5 has been shown to beat the top-of-the-line Kaby Lake i7 in most games by a decent amount.

      What I did say is that it's pointless as a gaming CPU, simply because nowhere in the near future are we gonna have 16-thread games; in all honesty I'd be amazed if we are even up to 8 threads by 2020, what with so many games as of 2017 only using a couple of threads. But ya know what has no problem using all 16 threads? Workstation loads! Things like multitrack audio editing and video editing (frankly the only time my FX-8320e doesn't have half its cores parked from lack of things to do; in fact I don't think I've played a game on my system yet where half my cores aren't parked) and DB processing and a bazillion other workloads one builds a workstation for. And for those tasks? Frankly the R7 BEATS the $1100 Intel CPU, as it has all the features you'd want in a workstation-class chip, like support for ECC and virtual machines.

      Now if you want to buy an R7 1800X for gaming? Go right ahead, but you are buying the equivalent of a top fuel funny car to do your grocery shopping; I seriously doubt even half the threads are ever gonna be used. IMHO the better option would be to buy the R5 (which you can buy right now; several retailers have jumped the gun and put their stock on sale the second they got their mitts on 'em) and use the savings to buy a better GPU, which will give you a better all-around experience. But if you just have money to burn and want it just to say you have 16 threads? Hell, there are guys out there running SLI Titans at 1080p; seems kinda pointless, but it's your money, blow away.

      --
      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
  • (Score: 2) by Marand on Tuesday April 11 2017, @09:22PM (2 children)

    by Marand (1081) on Tuesday April 11 2017, @09:22PM (#492466) Journal

    You have a $500 chip that is within 25% of a $1100 chip? I'd call that a super duper touchdown of a win.

    Yeah, that's precisely why I went with Ryzen. So what if it's not beating the $1100 chip? Even the R7 1700 is close enough to competitive, at ⅓ the price. That's the one I got, in fact; ~300 USD price, lower TDP (65w), and still ridiculously overclockable, though I haven't felt a need to OC it so far.

    Even if a similarly-priced Intel chip is a little better for games, the Ryzen chips are better overall until you get into the ludicrously expensive ones. Not just that, but AMD doesn't lock away useful features behind obscure SKUs at higher price points, so the Ryzen chips and boards will have things like IOMMU support and can use ECC RAM, while the similarly-priced Intel options won't. So unless you're building a machine only for games, it's a worthwhile trade-off.

    most games don't even do 4 threads these days, are you REALLY buying a 16 thread CPU for gaming? Why?

    To add to this, some games even crash if you have too many threads available. I know Civilization 5 has this problem with >8 threads regardless of CPU (Intel or AMD), and I've run into stability issues with some other games that may also be related. Still, I'd argue that the high thread count will be a benefit for gaming for two reasons: 1) Upcoming games will [hopefully] start using more of the threads, and 2) Other things on the system should be less disruptive to running games when more threads are available. Even if #1 never happens, #2 can benefit a gamer who streams, or just by avoiding stutter caused by the OS or other running applications.
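    For what it's worth, the ">8 threads" class of bug is exactly why engines usually clamp their worker pool rather than spawning one thread per logical CPU. A minimal sketch (POSIX `sysconf` only; the cap of 8 is illustrative, not from any real engine):

    ```c
    /* Sketch: size a game's worker pool from the online CPU count, but cap
     * it so a 16-thread Ryzen doesn't trip code that assumes <= 8 threads.
     * POSIX-only; the cap value is illustrative. */
    #include <stdio.h>
    #include <unistd.h>

    #define MAX_GAME_THREADS 8

    int worker_count(void)
    {
        long hw = sysconf(_SC_NPROCESSORS_ONLN);   /* logical CPUs online */
        if (hw < 1)
            hw = 1;                                /* sysconf reports -1 on failure */
        return hw > MAX_GAME_THREADS ? MAX_GAME_THREADS : (int)hw;
    }

    int main(void)
    {
        printf("spawning %d workers\n", worker_count());
        return 0;
    }
    ```
    
    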

    My opinion is Ryzen 7 is a damn good choice if you want a system that's useful for games and for doing something productive. Sometimes even simultaneously! If I want to play a game while waiting on something CPU-intensive to finish, I can. I suppose if I only cared about making a wintendo, Ryzen 5 might have been a better cost-cutting option, but I'm happy with the R7 1700 so far. I definitely don't feel like I'd have been better off buying Intel right now.

    Intel's been in a comfortable place for a while now because AMD's gamble with the Bulldozer stuff didn't quite pay off, but they're not going to be able to remain lazy because AMD's fighting back again. I'm happy this is the case because I dislike Intel's business practices, so I prefer it when there's viable competition.

    • (Score: 2) by Hairyfeet on Tuesday April 11 2017, @10:14PM (1 child)

      by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Tuesday April 11 2017, @10:14PM (#492495) Journal

      Actually I have to wonder how much of "bulldozer not paying off" was due to the Intel Cripple Compiler [theinquirer.net], because I have an FX-8320e and use a lot of GCC-compiled programs like Audacity, and even on super-heavy workloads like applying audio effects across multiple tracks the performance is just superb. Many of the games I play (like War Thunder) are made by overseas developers that likewise don't use ICC, and I'm getting 90+ FPS with everything cranked up to movie settings. And considering I only paid a hair over $600 for the system, with 5TB of storage, a 240GB SSD for boot, an R9 280, 16GB of RAM, a gamer board with quad SLI/CF support, and a BR drive for data backups? I sure as hell ain't complaining, and could easily see myself using this system through 2020 and possibly later.

      But the whole "ZOMFG its not beating teh Intel on gamez ZOMFG!" smells like a little bit of Intel PR spin to me, especially considering we saw in the antitrust transcript that Intel had employed that tactic before with their "advertising partnerships", where review sites got million-dollar checks to advertise Intel chips as long as they sang the Intel gospel. I mean, we are talking about a chip that gets within 25% of an $1100 CPU at less than half the price; why the fuck isn't every site running with that as the headline? You can literally build an ENTIRE Ryzen system for less than the price of the Intel chip alone, and you are getting 75%+ of the performance. Who doesn't think that is just insanely good, especially when Ryzen doesn't have ANY of the important features removed, like ECC, OCing ability, and support for hardware VMs, unlike what Intel does?

      Like I said, Ryzen 7 is complete overkill for games, but I can see why gamers are snatching them up; I mean, 16 threads with 75% of the performance of an $1100 chip for just $500? That is just nuts, and I'm sure Ryzen will take the place of the FX-8 as the chip for streamers and those of us who record our gameplay footage. But the fact that the mainstream press is completely ignoring all the positives to focus on a single negative that, with any thought, isn't really a negative given the price? The whole thing just smells funny to me.

      --
      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
      • (Score: 2) by Marand on Wednesday April 12 2017, @12:20AM

        by Marand (1081) on Wednesday April 12 2017, @12:20AM (#492541) Journal

        Actually I have to wonder how much of "bulldozer not paying off" was due to the Intel Cripple Compiler

        I'm sure that didn't help, but I'm referring to how AMD gambled on integer-heavy workloads by having each "module" essentially be two cores for integer work but only one for floating point. That didn't pay off in some cases, and on top of that, they also gambled on multi-core workloads but games were really slow to make the switch. The architecture is nearly six years old now, and most games still don't use more than four at best. The multi-core bet is finally starting to pay off (albeit too late for bulldozer at this point), but the int/fp one never really did.

        As for the whole ICC bullshit, it's one of the things I had in mind when I said I dislike Intel's business practices. The FTC busting them for it didn't even change anything, because all they did was add a disclaimer on the site stating that ICC might not be generating good executables for non-Intel CPUs. Since then the same shit happened again against ARM [theregister.co.uk], this time by having ICC skip certain instructions used in a benchmark, but only when the CPU was an Atom.

        But the whole "ZOMFG its not beating teh Intel on gamez ZOMFG!" smells like a little bit of Intel PR spin to me, especially considering we saw in the antitrust transcript that Intel had employed that tactic before

        Could be shilling, tribalism, or just people getting caught up in "LONGER BARS IS BETTERS!" dick-waving. Any are likely.

        One thing I've noticed is that the people ragging on Ryzen performance in games are conveniently ignoring that, even when Ryzen loses in average or maximum FPS, it tends to have fewer lows as well. That's not as brag-worthy as throwing around bigger numbers, but can be a smoother experience overall and shouldn't be ignored. A few reviews have mentioned it, at least, so it hasn't been completely left out.

        I mean we are talking about a chip that gets within 25% of an $1100 CPU at less than half the price why the fuck isn't every site running with that as the headline?

        I'd wager they don't put that directly in the title because then you'd have no reason to read the rest of the article. To be fair, most of the reviews I've seen so far have essentially stated that as the final conclusion, and I remember one even saying that buying a Ryzen 7 is basically like getting the $1000 Intel chip plus a really nice GPU for free. (e.g. "Buy the Ryzen 7 and a nice GPU instead.")

        Overclocking wasn't touched on much in the initial reviews, but it's been getting a lot more attention since, and I've even seen it come up in a couple R5 reviews. I think the general consensus with the R7s, after the initial day-1 reviews, has been that the 1700 is the best bang-for-your-buck because it's the lowest price point and still generally overclockable to ~4GHz, even with the Wraith Spire fan that comes with it.

        I didn't even know about the OCing when I chose the 1700, I was just interested in it for the lower TDP and figured that, based on the benchmarks I'd seen, its performance was close enough to the 1700x and 1800x that I wouldn't feel like I'd be missing out. Finding out that I can OC it well if I get the itch was just a nice post-purchase bonus. :D

        but the fact that the mainstream press is completely ignoring all the positives to focus on a single negative that with any thought shows isn't really a negative given the price? The whole thing just smells funny to me.

        I've seen a couple reviews focus heavily on that, but I've seen just as many argue either that it's mostly irrelevant because everything tested is fast enough, or that it's a good trade-off for most people. Even the ones that emphasise the Ryzen gaming benchmarks and treat them negatively seem to largely be claiming that Ryzen is a return to competitiveness for AMD. Most of the damage control seems to be coming from the comment sections, where a handful of people focus entirely on the gaming benchmarks and nothing else.

        Repeating what I said a few paragraphs ago, this could just as likely be tribalism or people getting too hung up on numbers with no regard for context. That's a trap you can see reviewers fall into often because they review so many products, usually mostly good, and end up nitpicking over things that won't matter most of the time as a way to differentiate the products. Smartphone reviews have had a problem with this for a while, for example, where an otherwise good phone gets slagged for having a slightly worse camera than some other model, or only having a 400ppi screen instead of a 420ppi one, etc.