posted by martyb on Tuesday May 30 2017, @02:35PM   Printer-friendly
from the moah-fasteh dept.

Recently, Intel was rumored to be releasing 10- and 12-core "Core i9" CPUs to compete with AMD's 10- to 16-core "Threadripper" CPUs. Now, Intel has confirmed these as well as 14-, 16-, and 18-core Skylake-X CPUs. Every CPU with 6 or more cores appears to support quad-channel DDR4:

Intel Core   Cores/Threads   Price    $/core
i9-7980XE    18/36           $1,999   $111
i9-7960X     16/32           $1,699   $106
i9-7940X     14/28           $1,399   $100
i9-7920X     12/24           $1,199   $100
i9-7900X     10/20           $999     $100
i7-7820X     8/16            $599     $75
i7-7800X     6/12            $389     $65
i7-7740X     4/8             $339     $85
i7-7640X     4/4             $242     $61 (fewer threads)
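
The $/core column is just the list price divided by the number of physical cores. A quick sketch reproducing it from the figures in the table (launch prices as quoted above; the table rounds to whole dollars):

```python
# Price-per-core for the CPUs in the table above (launch prices in USD).
lineup = [
    ("i9-7980XE", 18, 1999),
    ("i9-7960X",  16, 1699),
    ("i9-7940X",  14, 1399),
    ("i9-7920X",  12, 1199),
    ("i9-7900X",  10,  999),
    ("i7-7820X",   8,  599),
    ("i7-7800X",   6,  389),
    ("i7-7740X",   4,  339),
    ("i7-7640X",   4,  242),  # 4 cores / 4 threads, so no Hyper-Threading
]

for name, cores, price in lineup:
    # Two decimals to show the raw ratio; the table rounds to whole dollars.
    print(f"{name:10s} {cores:2d} cores  ${price:>5,}  ${price / cores:6.2f}/core")
```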

Last year at Computex, the flagship Broadwell-E enthusiast chip was launched: the 10-core i7-6950X at $1,723. Today at Computex, the 10-core i9-7900X costs $999, and the 16-core i9-7960X costs $1,699. Clearly, AMD's Ryzen CPUs have forced Intel to become competitive.

Although the pricing of AMD's 10-16 core Threadripper CPUs is not known yet, the 8-core Ryzen R7 launched at $500 (available now for about $460). The Intel i7-7820X has 8 cores for $599, and will likely have better single-threaded performance than the AMD equivalent. So while Intel's CPUs are still more expensive than AMD's, they may have similar price/performance.

For what it's worth, Intel also announced quad-core Kaby Lake-X processors.

Welcome to the post-quad-core era. Will you be getting any of these chips?


Original Submission

 
  • (Score: 3, Informative) by zocalo on Tuesday May 30 2017, @03:27PM (10 children)

    by zocalo (302) on Tuesday May 30 2017, @03:27PM (#517686)
    I wouldn't class video processing as a "really special application", yet it's a fairly popular use case that can - with the right software - absolutely benefit mainstream PC users who have a larger number of CPU cores and a good GPU (or two), especially given the rise in popularity of 4K footage from action cams and drones. Along with GPU, storage and memory upgrades, I've gone from dual-core to quad-core to octa-core, and I've still not managed to hit a point where I can't max out every single core of the CPU during a high-res rendering pass - even with GPU-based rendering assistance. Nothing stymies the creative process quicker than having to sit and wait, so while I'm not going to be upgrading immediately, I'm definitely looking forward to seeing what kind of higher preview resolutions and how close to real-time final output rendering I can achieve with one of these chips when I do.
    --
    UNIX? They're not even circumcised! Savages!
  • (Score: 2) by ikanreed on Tuesday May 30 2017, @04:18PM (6 children)

    by ikanreed (3164) Subscriber Badge on Tuesday May 30 2017, @04:18PM (#517722) Journal

    And of course, there's video games. You know, one of the primary reasons people get fancier, more powerful, and substantially more expensive computers. If you can't think of how the graphics pipeline, engine pipeline, resource-loading pipeline, and AI state updates might all run on different threads/cores, you're not very creative.
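
    A purely illustrative sketch of that kind of split - hypothetical subsystem names, not any real engine, which would be C++ with a job system and far finer-grained tasks - where each subsystem runs on its own worker and they meet at a per-frame barrier:

    ```python
    # Toy frame loop: rendering, resource streaming and AI updates each get
    # their own thread and synchronize once per frame at a barrier.
    import threading

    FRAMES = 3
    frame_sync = threading.Barrier(3)   # render + streaming + AI

    def render():
        for frame in range(FRAMES):
            # ... build and submit draw calls for this frame ...
            frame_sync.wait()           # wait for the other subsystems

    def stream_resources():
        for frame in range(FRAMES):
            # ... load textures/models that will be needed soon ...
            frame_sync.wait()

    def update_ai():
        for frame in range(FRAMES):
            # ... advance NPC state machines ...
            frame_sync.wait()

    workers = [threading.Thread(target=fn) for fn in (render, stream_resources, update_ai)]
    for t in workers:
        t.start()
    for t in workers:
        t.join()
    ```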

    • (Score: 2) by EvilSS on Tuesday May 30 2017, @05:10PM

      by EvilSS (1456) Subscriber Badge on Tuesday May 30 2017, @05:10PM (#517764)
      These won't do much for games over their lower-core siblings, though. Most games won't use much more than 4 cores. However, they would be useful to people who play PC games and stream to YouTube or Twitch: live encoding on top of gaming can bottleneck most desktop CPUs, although the AMD CPUs might be at a better price point for that use.
    • (Score: 2) by zocalo on Tuesday May 30 2017, @05:23PM (4 children)

      by zocalo (302) on Tuesday May 30 2017, @05:23PM (#517774)
      Are there any video games that can actually make use of this many CPU cores, though (as opposed to GPU pipelines, in which case the sky is probably still the limit)? Sure, it's becoming the norm to split a complex game engine up into multiple parallel executing threads to take more advantage of modern multi-core CPUs, but there are only so many things you can do that with before managing all the interdependent parallelism becomes more problematic than it's worth. Unlike with video processing, you're not chopping up a set workload into chunks, setting the CPU to work, and not really caring when each task finishes before reassembling the processed data; almost everything is going to need to maintain at least some level of sync.
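
      (A minimal sketch of that "chop it into chunks" pattern, with a hypothetical render_chunk() standing in for whatever per-chunk encode/filter step the editing software runs; completion order doesn't matter, which is exactly what a game frame loop can't tolerate.)

      ```python
      # Illustrative sketch of an embarrassingly parallel render job.
      from multiprocessing import Pool

      def render_chunk(chunk):
          index, frames = chunk
          # ... CPU-heavy encode/filter work on this slice of frames ...
          return index, len(frames)

      if __name__ == "__main__":
          # Split a 1600-frame job into 16 chunks of 100 frames each.
          chunks = [(i, list(range(i * 100, (i + 1) * 100))) for i in range(16)]
          with Pool() as pool:            # one worker per available core by default
              # Workers can finish in any order...
              results = list(pool.imap_unordered(render_chunk, chunks))
          # ...because the chunks only get reassembled, in order, at the very end.
          results.sort(key=lambda r: r[0])
      ```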

      Even assuming an MMO, where you're going to have a few more options that could be readily farmed off to a dedicated thread/core on top of the four examples of execution threads you gave (team comms, for instance), and maybe making some of the main threads (graphics and engine, perhaps?) multi-threaded in their own right, you might get up towards 8-10 threads, but 18? Even allowing for a few threads left over for background OS tasks, I'm not sure we're going to be seeing many games that can take full advantage of the 18-core monster - let alone its 36 threads - for quite some time, if at all.
      --
      UNIX? They're not even circumcised! Savages!
      • (Score: 2) by ikanreed on Tuesday May 30 2017, @05:24PM

        by ikanreed (3164) Subscriber Badge on Tuesday May 30 2017, @05:24PM (#517775) Journal

        Basically only AAA games that explicitly have the PC as a primary target; consoles tend to cap out at 4 cores.

        Which is to say, not a lot.

      • (Score: 2) by tibman on Tuesday May 30 2017, @06:15PM

        by tibman (134) Subscriber Badge on Tuesday May 30 2017, @06:15PM (#517804)

        Like you pointed out, most people have other programs running at the same time. That's often overlooked in PC gaming benchmarks. VOIP is the big one, and I often have a browser open too. So while most games only make good use of 3-4 threads, that doesn't make an 8+ thread CPU useless for gaming. Zero random lurches or stutters from other processes is nice. Six cores is pretty much the next step for gamers. Eight cores is trying to future-proof, but probably not needed (yet).

        --
        SN won't survive on lurkers alone. Write comments.
      • (Score: 0) by Anonymous Coward on Tuesday May 30 2017, @07:02PM (1 child)

        by Anonymous Coward on Tuesday May 30 2017, @07:02PM (#517822)

        Why MMO? Most of the critical game logic, aside from display and control, is run on the server.

        From my (admittedly genre-limited) experience, strategy games can make the most use of additional CPU resources. A modern strategy game AI can use as many cores as you can throw at it; here, the CPU (or sometimes the memory) is the bottleneck. Not to mention thousands or hundreds of thousands of individual units, each one requiring some low-level control routines.

        In some strategy games I've played (Star Ruler 2...), the late game on larger maps can become almost completely unplayable because the CPU just can't keep up. On the other hand, I've never had graphical stutter, even in battles with thousands of units shooting lasers and exploding all over the place (like in this screenshot [gog.com]).

        TL;DR: AlphaGo is a computer-controlled AI for a strategy game. Think about how many cores it can use.

        • (Score: 2) by takyon on Tuesday May 30 2017, @07:19PM

          by takyon (881) on Tuesday May 30 2017, @07:19PM (#517837) Journal

          Agreed on the MMO. Single player first-person games can have "complex" AI to the point where you can't have dozens or hundreds of NPCs on a single map without causing slowdowns (compare that amount to your strategy games... like Cossacks: Back to War), and tend to break maps up with loading zones (for example, cities in Oblivion or Skyrim). Having more cores and RAM allows more AI and stuff to be loaded, which is beneficial for single-map stealth games with no loading zones that have all of the AI loaded and patrolling at the beginning.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by kaszz on Wednesday May 31 2017, @03:35AM (2 children)

    by kaszz (4211) on Wednesday May 31 2017, @03:35AM (#518076) Journal

    I think you are doing the video processing wrong. When they made some of the later Star Wars movies, it was partly edited on a simple laptop using a low-quality version. When the edit was completed, the edit file was sent to a server cluster that rendered the real thing. No creative wait there that mattered.

    • (Score: 2) by zocalo on Wednesday May 31 2017, @11:07AM (1 child)

      by zocalo (302) on Wednesday May 31 2017, @11:07AM (#518206)
      In a professional studio production environment, absolutely (I've done this myself), but the point of the discussion - and my personal example - was for enthusiast/semi-pro users with a single high-end PC doing that CPU-intensive rendering of their 4K action cam/drone footage, and for that you are drawing on the raw full-res video. I'm absolutely editing at low-res (or not so low, these days), but such users are not likely to have access to a render farm - and even if they did, it would just be an array of servers equipped with CPUs like those above, so it would still make a valid "mainstream" usage case. There are still a few editing situations where you'll want to see the full-res output though - selective corrections on part of the frame, for example, so you can make sure you've not got any visual glitches at the boundaries of the edit - and that's where having the extra CPU capacity during editing can really be useful.

      I suppose if you were a really keen self-employed videographer then it might be viable to offload the final rendering to the cloud, but there are a number of problems with that. Firstly, I'm not actually sure how much of a time saving that's going to give you, as you're introducing another potential bottleneck: your Internet link. My fibre internet connection is no slouch so it might work out, but my raw video footage is still typically tens of GB - and sometimes exceeds 100GB if I'm drawing on a lot of B-roll clips - so if you don't have a quick connection that may be an issue. The likely show-stopper though is software support; other than CGI and 3DFX, *very* few video production tools actually support render farms in the first place, let alone cloud-based ones (including Adobe Creative Cloud's Premiere Pro, somewhat ironically), so you're either going to be farming out the work manually or using higher-end tools like the AVID setup used by Disney for their Star Wars movies. Even if your software supports it, you're unlikely to find a commercial service, so you'll be rolling your own VMs in the cloud, and likely paying out a small fortune for licenses as well.
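
      (For a rough sense of the upload bottleneck, a back-of-the-envelope calculation. The 10-100GB footage sizes come from the comment above; the link speeds are made-up examples.)

      ```python
      # Upload time for pushing raw footage to a hypothetical cloud render farm.
      SIZES_GB = [10, 100]
      LINK_MBPS = [20, 100, 1000]   # assumed upload speeds in megabits per second

      for size in SIZES_GB:
          for mbps in LINK_MBPS:
              hours = size * 8 * 1000 / mbps / 3600   # GB -> megabits -> seconds -> hours
              print(f"{size:4d} GB over {mbps:4d} Mbit/s ~ {hours:5.1f} h")
      ```
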
      --
      UNIX? They're not even circumcised! Savages!
      • (Score: 2) by kaszz on Wednesday May 31 2017, @05:17PM

        by kaszz (4211) on Wednesday May 31 2017, @05:17PM (#518385) Journal

        The files are not sent via the internet; they are delivered physically to the rendering farm, and only the edit data is perhaps sent via the internet. As for software, I'd expect professional software to at least be able to deliver the edit file. And there are open source solutions both for the edit console and the server farm, I think. The last part at least shouldn't be too hard to get rolling.