
posted by azrael on Friday July 11 2014, @09:01PM
from the will-it-run-nethack dept.

EGC has just released their Best Gaming CPUs of 2014, and while there are no real surprises (the high end is i7, the middle is a mix, and at the low end AMD is the better buy), my question would be... does it really matter anymore, with CPUs having been so insanely powerful for several generations now?

Pretty much any mainstream game plays well on even 5-year-old Phenom II X4s and C2Qs, with tests by Tom's Hardware showing both AMD and Intel staying above 30 FPS in most games. So do we really need to care about CPUs when it comes to gaming anymore?

  • (Score: 2, Informative) by turgid (4318) on Friday July 11 2014, @09:19PM (#67841)

    My favourite game is called "compiling the C program."

    I am too old and grumpy to play games and I don't have Windows.

    However, I've always found AMD CPUs to be the best, whatever the pro-Intel hype says. Over the years I've used many CPUs, including Intel i7s with 4 cores/2 threads per core.

    At home I "downgraded" from a 3.0GHz AMD Phenom II X4 to a 2.7GHz AMD Phenom II X6 (with "turbo core", which boosts up to two of the cores to 3.2GHz when the chip isn't otherwise busy).

    The AMD stuff has always been more technologically advanced: faster front-side buses, more instructions per clock, wider superscalar designs, cross-point switches instead of shared buses, HyperTransport, 64-bit three years before Intel, better 64-bit I/O, better SIMD, better performance on binaries compiled for previous chip architectures...

    Over the years, the lawyers have nailed down the reporting of benchmarks. Also, people have become more wishy-washy and less likely to compile software themselves. One of the great advantages of FOSS was always that you could compile applications to suit the specific version and implementation of the CPU you had (see Gentoo Linux for the extreme example).
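
    For instance, on Gentoo that tuning lives in make.conf; a minimal sketch (these are the stock Portage variables, and the flag values are only illustrative):

      # /etc/portage/make.conf -- build everything for the exact CPU in this box
      CFLAGS="-O2 -march=native -pipe"   # -march=native targets the build host's own CPU
      CXXFLAGS="${CFLAGS}"
      MAKEOPTS="-j6"                     # roughly one job per core, e.g. on a Phenom II X6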

    They always used to say that if you were very particular about performance, you should test the particular system you wanted with the software you were interested in using.

    • (Score: 0) by Anonymous Coward on Friday July 11 2014, @09:22PM (#67843)

      Turgid indeed

    • (Score: 2) by kaszz (4211) on Friday July 11 2014, @09:30PM (#67847)

      AMD tends to have way less on-chip cache than Intel, which you notice when synthesizing FPGA cores.
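
      (If you want to see what you're working with, cache sizes are easy to query on Linux; a quick sketch using the standard sysfs layout:)

        lscpu | grep -i cache
        cat /sys/devices/system/cpu/cpu0/cache/index2/size   # L2 for core 0
        getconf LEVEL3_CACHE_SIZE                            # in bytes; 0 if absent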

    • (Score: 1) by Joepublic (4488) on Saturday July 12 2014, @02:53AM (#67952)

      Bullshit. My Core i5 kicks the shit out of your AMD garbage, even with your extra 2 cores. I'm talking 600MHz higher clocks and 40+% higher IPC with 20-30% less energy usage.

      • (Score: 2) by Hairyfeet (75) on Saturday July 12 2014, @10:40AM (#68045)

        You are being had, friend; look up "Intel cripple compiler" to see why the benches are no better than quack.exe when it comes to judging real-world performance. If you were to compare yours to his using a program compiled with GCC, you'd find that at best you'd only be 15% faster (and that's if you have the top o' the line) while saving around 12% power (Intel bases its power figures on a theoretical load, AMD on real-world tests). But you paid as much as 150% more than him, and if you went i7 we'd be talking 200%+.
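
        (A fair shootout is trivial to run yourself; a sketch, where bench.c stands in for whatever workload you actually care about:)

          # same source, same compiler, same flags on both boxes -- no vendor dispatching
          gcc -O2 -march=native -o bench bench.c
          time ./bench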

        Sorry, friend, but unless you are running wave simulations or other software that needs every possible MHz, AND you have high electric rates in your area? You will never make up the cost difference with the trivial amount of power you save.

        --
        ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
      • (Score: 0) by Anonymous Coward on Saturday July 12 2014, @12:48PM (#68079)

        No. No it doesn't. Phenom II X6 blows Core i5 chips right out of the water on heavy work like program compilation. You sound butthurt. Get over it.

    • (Score: 2, Insightful) by nyder (4525) on Saturday July 12 2014, @06:35AM (#67999)

      You are grumpy because you don't game. Getting old is just an excuse.

      I recently bought a Nintendo 2DS and was bummed to find out my eyesight isn't as good as it used to be. Did I get grumpy? Did I go compile some code and say, "Get off my gaming lawn!"?? Hell no. I did, to keep with the lawn theme, play some Plants vs Zombies on my computer. I bought some reading glasses (heck ya, I'm getting old now!!!) and can play my 2DS just great.

      I will never stop gaming, it keeps me young.

      And I'm not knocking AMD chips; they make nice ones. But when it comes to gaming you are pushing a lot of data through the buses, and Intel chips are generally faster at that. This is why Intel has held the gaming CPU crown forever.

      Does it matter as much today? Not really. To rich hardcore gamers? Yes, it matters.

      And compiling software is a game in itself. =)

  • (Score: 1) by richtopia (3160) on Friday July 11 2014, @09:54PM (#67861)

    A few things drive CPU selection for me now:

    Heat - Aiming for a silent PC
    Features - Things like virtualization
    Socket - As mainstream and as modern as possible (trying to preserve an upgrade path if needed)

    And most importantly:
    What is on sale / what mobo is on sale.

    For example, my current i5 was selected two Thanksgivings ago because it was mainstream and on sale, and a mobo with CrossFire support was on sale too (I like CrossFire as an upgrade path: a decent video card today is a cheap eBay purchase and a ~60% improvement later).

    Now, that is for a desktop where gaming is the standard demanding task. Mobile is a different story, but it is largely driven by the graphics selection.

    • (Score: 1) by richtopia (3160) on Friday July 11 2014, @10:06PM (#67867)

      I lied about CrossFire; I get a ~50% improvement.

      Here is a Futuremark comparison with my CrossFire enabled/disabled, if you are interested:
      http://www.3dmark.com/compare/3dm11/5332480/3dm11/5332221 [3dmark.com]
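
      (For anyone checking the math, the scaling figure is just the ratio of the two graphics scores; the numbers below are hypothetical, purely to show the calculation:)

        # e.g. a hypothetical 7800 with CrossFire vs 5200 without => 50% scaling
        echo "scale=2; (7800/5200 - 1) * 100" | bc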

    • (Score: 2) by tibman (134) on Friday July 11 2014, @10:50PM (#67885)

      A 120mm fan on a radiator is all you need for the CPU. I will never air-cool again; it is far too loud : )

      --
      SN won't survive on lurkers alone. Write comments.
      • (Score: 1) by richtopia (3160) on Friday July 11 2014, @10:57PM (#67886)

        Liquid cooling is all fine and dandy for the CPU; the kits available are reasonably priced and completely enclosed, making them easy to install.

        However, in my experience (we are talking gaming PCs here) it is the GPU that really needs help with cooling, and the aftermarket GPU heatsinks just don't light the world on fire - or maybe they will, which is the issue.

        I actually did mount an extra 120mm fan on the side of my case blowing directly on my video cards. I don't think I can find a better solution aside from upgrading to a new card, but I just haven't found a game that really demands more than my 2x ATI Radeon 5770s.

        • (Score: 2) by tibman (134) on Saturday July 12 2014, @04:08PM (#68139)

          I completely agree. But the one thing I have noticed is that the GPU fan is aggressively throttled down when idling. It can easily turn into a turbine when playing games for an hour in a warm room, though : /
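
          (You can actually watch that throttling from sysfs; a sketch assuming the radeon hwmon layout -- the exact paths vary by driver:)

            cat /sys/class/drm/card0/device/hwmon/hwmon*/temp1_input   # GPU temp in millidegrees C
            cat /sys/class/drm/card0/device/hwmon/hwmon*/pwm1          # fan duty, 0-255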

          --
          SN won't survive on lurkers alone. Write comments.
      • (Score: 2) by meisterister (949) on Friday July 11 2014, @11:54PM (#67901)

        The only reason I go with a noisy massive air cooler is that I honestly expect to be using either the same computer or the same heatsink five years from now. Otherwise, water is flipping awesome from what I've heard.

        --
        (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
      • (Score: 0) by Anonymous Coward on Saturday July 12 2014, @12:52PM (#68081)

        Water cooling still requires a fan and still makes as much noise as air cooling with a similar dimensional profile. You're also exposed to the risk that a leak will happen, which is almost guaranteed to damage your computer hardware. Water cooling is for ignorant computer ricers and Tom's Hardware types who like to waste money on putting the computer equivalent of big spoilers and no-purpose hood scoops in their cases.

        • (Score: 2) by tibman (134) on Saturday July 12 2014, @04:06PM (#68138)

          You're wrong, and I'll tell you why. You typically use a larger (120mm) fan on the radiator, and it will spin at a much, much lower speed. I have been water-cooling since 2001 and have never had a leak. A water-cooling kit these days is cheaper than a good air cooler, too. It's only computer ricing if you have lights and a window. I think I have addressed all of your garbage comment : )

          --
          SN won't survive on lurkers alone. Write comments.
    • (Score: 0) by Anonymous Coward on Saturday July 12 2014, @12:04AM (#67905)

      > Heat - Aiming for a silent PC

      I can't hear my fan run over the whine of my video card. What's up with that? GeForce GTX 770 FWIW.

    • (Score: 2) by Hairyfeet (75) on Saturday July 12 2014, @12:34AM (#67916)

      I have a PII X6 that is going on 5 years old now, and it plays games great; with a Cooler Master N520 you hear the HDDs before you ever hear the CPU. I used to be the guy who built a new PC every other year (with a CPU upgrade in the middle), but I really see no point anymore, as games just don't seem to be CPU-bound much these days.

      But as I said in TFA, you can take a CPU from 5 years ago like a Phenom II X4 or C2Q, pair it with a decent mainstream GPU like the HD 7790, and play pretty much every game out there above 30 FPS. I really think the chips have gotten so insanely overpowered that most folks simply won't have to replace them every other year to play games like we used to; with so many cores, there is more power than most games will need any time soon.

      --
      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
  • (Score: 1) by Ethanol-fueled (2792) on Friday July 11 2014, @10:48PM (#67884)

    The summary made me feel all warm and fuzzy inside, because games are traditionally thought of as the only reason to stay up to date with the latest and fastest PC hardware (which I always thought was bullshit, because a 1GHz Athlon chip was enough to handle more than 8 tracks of heavily-processed audio recorded in full-duplex mode). In some cases, with newer consoles for example, hardware upgrades (buying the console) are mandatory because franchises like Metal Gear Solid are exclusive to certain consoles.

    There's been a hipster revolution in classic gaming -- for example, it's not good enough to have the original NES console; no, it now must also be played on a CRT television for ultimate geek cred. What if we see a hipster revolution in Unreal-era gaming as played on original hardware, with "vintage" SoundBlaster sound cards, using joysticks for flight sims, all viewed on a high-end Sony/Silicon Graphics [cfusion.com] CRT monitor bought separately?

    Okay, scratch that last point -- those things may be considered "hipster" by young adults, but they're still business as usual to a lot of you who probably still play Doom on your cream-colored 486DX66 boxes with the "turbo" switches. Bahaha.

  • (Score: 0) by Anonymous Coward on Friday July 11 2014, @11:30PM (#67893)

    As a quick note: My Phenom II X4 810 can't keep up with my R9 270X when playing Space Engineers, which throttles the framerate significantly...

    • (Score: 0) by Anonymous Coward on Saturday July 12 2014, @12:54PM (#68083)

      Most boards that support a Phenom II don't support PCI Express 3.0, you know.

  • (Score: 3, Insightful) by meisterister (949) on Friday July 11 2014, @11:32PM (#67894)

    For the most part, there's no real reason to get an Uber-Haxxorz CPU for gaming now. The best way of selecting a CPU really is by feature set. For example, I picked my 8350 because I'm writing/using mostly multithreaded integer software; I likely would've bought an i7 if I cared that much about floating-point performance. The fact that my CPU is pretty good for games is just a bonus.

    --
    (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
  • (Score: 2) by dbot (1811) on Saturday July 12 2014, @11:31AM (#68056)

    IANAG (minus Civ III/IV/V, and hopefully the new Linux space one!), so this might not apply to me, but when I've been spec'ing out systems the main thing I'm looking for is premium features.

    That is, AMD doesn't charge extra for virtualization, ECC, and the like.
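
    (Checking whether a given chip actually exposes hardware virtualization is a one-liner on Linux; a quick sketch:)

      # vmx = Intel VT-x, svm = AMD-V; no output means no hardware virtualization
      egrep -o 'vmx|svm' /proc/cpuinfo | sort -u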

    With Intel, there is deliberate crippling (i.e. features they must go out of their way to disable) to create market segmentation that is totally artificial. The fact that they've been toying with 'software upgrades/unlocking' just goes to show that the pricing model they have set up already carries enough margin to sell the better parts at lower prices and still turn a good profit. Defective by design. Then there's the whole antitrust side of things. Their whole operation just stinks, and it doesn't match my values -- 'cause I try, I really do.

    But yeah, my frame rate is pretty good -- I mean, that's what we're talking about, right?

  • (Score: 2, Interesting) by Hyperturtle (2824) on Saturday July 12 2014, @06:33PM (#68193)

    I think anyone who plays Dwarf Fortress can agree that there are sometimes good reasons to open the case, overclock the CPU, set up elaborate fans, and design a new computer around getting 5 more FPS. A good recent reason is the release of the new version, which requires even more resources.

    It is still very processor-dependent, although strides have been made to split off various tasks (splitting off video makes little difference since the graphics are not taxing, but it helps).

    I've found that keeping the game on a ramdisk helps a lot with loading and saving, on top of the excessive customizing I've done to my desktop over the years. I don't need another reason to rationalize an upgrade, but DF provides a compelling example of the benefits of CPU upgrades...
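
    (If anyone wants to try the ramdisk trick, it's a couple of commands on Linux; a sketch, where the DF install path is just a placeholder -- and copy your saves back before shutdown, since tmpfs vanishes on reboot:)

      sudo mkdir -p /mnt/dfram
      sudo mount -t tmpfs -o size=2G tmpfs /mnt/dfram   # 2G of RAM-backed scratch space
      cp -r ~/games/df /mnt/dfram/                      # hypothetical DF install path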