
posted by martyb on Thursday June 02 2016, @12:19PM   Printer-friendly
from the chipping-away-at-the-market dept.

A lot of CPU news is coming out of Computex 2016.

Intel has launched its new Broadwell-E "Extreme Edition" CPUs for "enthusiasts". The top-of-the-line model, the i7-6950X, now includes 10 cores instead of 8, but the price has increased massively to around $1,723. Compare this to a ~$999 launch price for the 8-core i7-5960X or 6-core i7-4960X flagships from previous generations.

Intel has also launched some new Skylake-based Xeons with "Iris Pro" graphics.

AMD revealed more details about the Radeon RX 480, a 14nm "Polaris" GPU that will be priced at $199 and released on June 29th. AMD intends to compete for the budget/mainstream gamer segment, undercutting the $379 launch price of the GTX 1070 by a wide margin while delivering around 70-75% of its performance. It also claims that the RX 480 will perform well enough to allow more gamers to use premium virtual reality headsets like the Oculus Rift or HTC Vive.

While 14nm AMD "Zen" desktop chips should be coming later this year, laptop/2-in-1/tablet users will have to settle for the 7th generation Bristol Ridge and Stoney Ridge APUs. These are still 28nm "Excavator"-based chips with "modules" instead of cores.


Original Submission

Related Stories

Nvidia Unveils GTX 1080 and 1070 "Pascal" GPUs 20 comments

Nvidia revealed key details about its upcoming "Pascal" consumer GPUs at a May 6th event. These GPUs are built using a 16nm FinFET process from TSMC rather than the 28nm processes that were used for several previous generations of both Nvidia and AMD GPUs.

The GeForce GTX 1080 will outperform the GTX 980, GTX 980 Ti, and Titan X cards. Nvidia claims that GTX 1080 can reach 9 teraflops of single precision performance, while the GTX 1070 will reach 6.5 teraflops. A single GTX 1080 will be faster than two GTX 980s in SLI.
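The quoted teraflop figures follow from a simple formula: peak single-precision FLOPS is roughly shader cores × boost clock × 2, since one fused multiply-add counts as two floating-point operations. A quick sketch, using the commonly cited core counts and boost clocks for these cards (assumptions, not figures from the article):

```python
# Back-of-envelope check of the quoted teraflop figures.
# Peak single-precision FLOPS ~= shader cores x boost clock x 2,
# since one fused multiply-add counts as two floating-point ops.
# The core counts and boost clocks below are the commonly cited
# specs for these cards (assumptions, not figures from the article).
def peak_tflops(cores, boost_ghz, flops_per_cycle=2):
    return cores * boost_ghz * flops_per_cycle / 1000.0

gtx_1080 = peak_tflops(2560, 1.733)  # ~8.9, marketed as "9 teraflops"
gtx_1070 = peak_tflops(1920, 1.683)  # ~6.5 teraflops
print(f"GTX 1080: {gtx_1080:.1f} TFLOPS, GTX 1070: {gtx_1070:.1f} TFLOPS")
```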

Both the GTX 1080 and 1070 will feature 8 GB of VRAM. Unfortunately, neither card contains High Bandwidth Memory 2.0 like the Tesla P100 does. Instead, the GTX 1080 has GDDR5X memory while the 1070 is sticking with GDDR5.

The GTX 1080 starts at $599 and is available on May 27th. The GTX 1070 starts at $379 on June 10th. Your move, AMD.


Original Submission

Intel Announces 4 to 18-Core Skylake-X CPUs 31 comments

Recently, Intel was rumored to be releasing 10 and 12 core "Core i9" CPUs to compete with AMD's 10-16 core "Threadripper" CPUs. Now, Intel has confirmed these as well as 14, 16, and 18 core Skylake-X CPUs. Every CPU with 6 or more cores appears to support quad-channel DDR4:

Intel Core   Cores/Threads   Price    $/core
i9-7980XE    18/36           $1,999   $111
i9-7960X     16/32           $1,699   $106
i9-7940X     14/28           $1,399   $100
i9-7920X     12/24           $1,199   $100
i9-7900X     10/20           $999     $100
i7-7820X     8/16            $599     $75
i7-7800X     6/12            $389     $65
i7-7740X     4/8             $339     $85
i7-7640X     4/4             $242     $61 (fewer threads)
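The $/core column follows directly from the other two; a quick sketch recomputing it (rounding half-up to the nearest dollar, which reproduces the table's figures):

```python
# Sketch: recomputing the $/core column of the Skylake-X table.
# Prices and core counts are copied from the table above.
def dollars_per_core(price, cores):
    # Round half-up to the nearest dollar, matching the table.
    return int(price / cores + 0.5)

chips = [
    ("i9-7980XE", 18, 1999),
    ("i9-7960X",  16, 1699),
    ("i9-7940X",  14, 1399),
    ("i9-7920X",  12, 1199),
    ("i9-7900X",  10,  999),
    ("i7-7820X",   8,  599),
    ("i7-7800X",   6,  389),
    ("i7-7740X",   4,  339),
    ("i7-7640X",   4,  242),
]

for name, cores, price in chips:
    print(f"{name}: ${dollars_per_core(price, cores)}/core")
```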

Last year at Computex, the flagship Broadwell-E enthusiast chip was launched: the 10-core i7-6950X at $1,723. Today at Computex, the 10-core i9-7900X costs $999, and the 16-core i9-7960X costs $1,699. Clearly, AMD's Ryzen CPUs have forced Intel to become competitive.

Although the pricing of AMD's 10-16 core Threadripper CPUs is not known yet, the 8-core Ryzen R7 launched at $500 (available now for about $460). The Intel i7-7820X has 8 cores for $599, and will likely have better single-threaded performance than the AMD equivalent. So while Intel's CPUs are still more expensive than AMD's, they may have similar price/performance.

For what it's worth, Intel also announced quad-core Kaby Lake-X processors.

Welcome to the post-quad-core era. Will you be getting any of these chips?


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by LoRdTAW on Thursday June 02 2016, @01:28PM

    by LoRdTAW (3755) on Thursday June 02 2016, @01:28PM (#354038) Journal

    The RX 480 has piqued my interest. I haven't upgraded my rig since I built it around 2011: i7 2600 with a Radeon 6850. Wasn't a powerful GPU when it came out but was midrange and did the job for the past 5 years thanks to many games being hamstrung by consolitis and my shrinking interest in gaming. The only upgrades were an SSD for my games and I doubled the RAM to 16GB. Now Doom has proven that it's time to upgrade my GPU but my price point is $200. The RX 480 is exactly what I am looking for and just in time as I have been researching and shopping around the past two days.

    As for the AMD Zen, if they can release an 8+ core CPU that has similar per core performance of the Intel stuff then I might consider it down the road. As of now, my i7 has yet to appear slow in the CPU sense and my AMD A10 Linux box isn't showing any signs of slowness either. Face it, CPU demand for mainstream has plateaued. There are no CPU killing games and most everything else I do doesn't stress my CPU save for compressing files using pbzip2. That could change in the near future but I'll cross that bridge when I get to it.

    As for AMD, I hope they can survive in the long run. They have pretty much lost the entire server market to Intel which makes their ARM opteron seem like a foolish endeavour. The desktop isn't dead yet but AMD is still a minority player after they were stomped year after year by Intel's Core (to be fair, an AMD APU is more than enough for most people). But their saviour might be VR as it's ramping up to be the hot new item. They should stay focused on making good desktop GPU's, CPU's, APU's and continue their melding of the CPU/GPU ala HSA.

    I also like the idea of their G series SoC and would like to see more 10-15W SoC's with GPU/APU with the same plethora of connectivity and dual channel DDR3/4. The only ARM silicon I'd like to see from AMD would be a mobile SoC with Radeon GPU or some kind of hybrid ARM-APU that is applications oriented like the Freescale/NXP i.MX series and their G SoC. We can finally have a multipurpose ARM chip with a GPU that doesn't need to be reverse engineered or binary driver crap. But save that for when you pull yourself out of the financial rut.

    • (Score: 2) by tibman on Thursday June 02 2016, @02:45PM

      by tibman (134) Subscriber Badge on Thursday June 02 2016, @02:45PM (#354076)

      I'm interested in the RX 480 just for the reduction in power use (heat!). My R9 sounds like a leaf blower. Totally with you on the CPU front. Currently have an A10 and it's good enough. Recently saw that they are releasing A12s but only for laptops?

      Anyways, if you're looking for a game that is CPU hungry, check out Vermintide: http://store.steampowered.com/app/235540/ [steampowered.com]

      --
      SN won't survive on lurkers alone. Write comments.
    • (Score: 2) by cubancigar11 on Thursday June 02 2016, @04:00PM

      by cubancigar11 (330) on Thursday June 02 2016, @04:00PM (#354104) Homepage Journal

      RX480 is pretty much AMD's equivalent of GTX970.
      1070 is there for 2k/1080p gaming. 1080 is for 4k. RX480 will afaik save more power, but 970 will drop in price so you can think about it if you aren't actively against nvidia.

      I personally own 970 so I am skipping this whole upgrade cycle. In 2-3 years I am planning to get a 4k monitor and then I will think about upgrading :)

      • (Score: 2) by purple_cobra on Thursday June 02 2016, @07:07PM

        by purple_cobra (1435) on Thursday June 02 2016, @07:07PM (#354177)

        Ditto on the skipping this generation and for the same reason. The gaming machine has a Xeon in it so I won't be in any great rush to upgrade that either unless there is some compelling reason to do so.
        One thing I would like to see from AMD is the desktop equivalent APUs to the mobile ones they announced here. A motherboard made by a reputable company with an onboard DisplayPort socket for that would be perfect; that's the last hurdle to clear before I upgrade my better half's PC. She's currently using my old Radeon card solely because the DisplayPort output on that will handle the 2560x1440 monitor she uses. I've seen a few boards meeting that requirement advertised in the US but the availability here in the UK seems to be very limited, possibly non-existent.

        • (Score: -1, Offtopic) by Anonymous Coward on Thursday June 02 2016, @07:26PM

          by Anonymous Coward on Thursday June 02 2016, @07:26PM (#354183)

          Well my computer is a male and after reading about your computer he would like to go out with her.

        • (Score: 2) by cubancigar11 on Friday June 03 2016, @07:37AM

          by cubancigar11 (330) on Friday June 03 2016, @07:37AM (#354403) Homepage Journal

          AMD appears to be aiming for APUs only. Both the PS4 and XBONE have them already, so I am hoping that in 2-3 years they will have something good for the PC market too. But AMD also, it appears, isn't aiming for the desktop gaming market, so let's watch and see :)

      • (Score: 2) by LoRdTAW on Thursday June 02 2016, @11:03PM

        by LoRdTAW (3755) on Thursday June 02 2016, @11:03PM (#354267) Journal

        The 970 is still up there in price, $300-350. If the RX 480 does the job of the 970 for the same price but consumes less power, it's still a win on power costs alone. I'll wait for some benchmarks. And before the Radeon 6850 I had an Nvidia 9800 GT in my older AMD rig, so nothing against Nvidia.

    • (Score: 0) by Anonymous Coward on Thursday June 02 2016, @08:15PM

      by Anonymous Coward on Thursday June 02 2016, @08:15PM (#354201)

      > There are no CPU killing games

      You're not playing KSP.

    • (Score: 2) by takyon on Friday June 03 2016, @06:44AM

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Friday June 03 2016, @06:44AM (#354389) Journal

      I'm more interested in AMD's mobile APUs than anything, and it is a segment where they have done well even while lagging at 28nm, because their integrated graphics are good. Intel could catch up in APU graphics but their most powerful iGPUs have been in really expensive CPUs, and I'm not sure AMD will let them take the lead. I have no idea how impressive Zen's integrated graphics will be, but it is likely to be a massive improvement because of the double die shrink. I'll just take graphics and power efficiency improvement for granted and hope that single-threaded performance meets expectations.

      Zen is needed to recover from the Bulldozer disaster. It is AMD's chance to put a real 8-core chip into desktops and recover part of that market. No more failed module design, which is why they can make... or claim such an impressive 40% IPC gain when Intel has been doing 3-12% per generation. 8 cores, 16 threads. Intel isn't exactly trying to compete on core count... although I guess the 6-core i7-6800K is somewhat cheaper than previous >4-core -E chips at $434.

      So AMD's 8-core desktop chip will compete against quad-core i7 chips. No matter what kind of IPC gain AMD makes, we know it will still fall short of Intel's single-threaded performance. But they will sell an 8-core chip for around $150 (compare with launch prices for 4 module FX chips [wikipedia.org]), and it will hopefully have better multithreaded performance than Intel's quad cores. That combination is what could make AMD great again.

      > There are no CPU killing games

      The good news is that the PS4 and XBO have AMD 8-core chips. Sure, they are somewhat puny AMD Jaguar chips, and only 6-7 cores are available to the game devs, but it will hasten the parallelization trend, particularly beyond the ubiquitous quad-cores. The "PS4K" will have a similar 8-core chip, but with a clock rate boost. This is being released "mid-cycle", so consolitis is diminished to a degree.

      Zen will be followed by Zen+, which should improve IPC/etc. further. Maybe around that time (2018?), Intel will decide to make a "mainstream" 6-core chip for once.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Sunday June 05 2016, @12:34AM

      by Anonymous Coward on Sunday June 05 2016, @12:34AM (#355333)

      "They have pretty much lost the entire server market to Intel which makes their ARM opteron seem like a foolish endeavour." (endeavor *)

      It's all relative. The server of yesterday is obsolete today and my phone is more powerful than it. However, by having server options AMD can force Intel to keep upgrading its server offerings to compete, at least to some extent, which takes away money and resources from producing other stuff that AMD can better compete with them on. So AMD can invest a little bit at a time and make Intel invest more to keep ahead of the game and insure they stay ahead of the game. If Intel doesn't invest at all, or reduces its investment, it is threatened with eventually being surpassed by AMD; technology moves so fast that a regular/constant small investment goes a long way relative to not investing at all, since past investments, even large ones, quickly result in obsolete technology if they aren't met with constant investing and advancement going forward.

      • (Score: 0) by Anonymous Coward on Sunday June 05 2016, @12:36AM

        by Anonymous Coward on Sunday June 05 2016, @12:36AM (#355336)

        errr ... ensure not insure *

  • (Score: 3, Interesting) by opinionated_science on Thursday June 02 2016, @03:57PM

    by opinionated_science (4031) on Thursday June 02 2016, @03:57PM (#354102)

    once we get a device with 12 tflops, there'll be a mini-revolution with molecular dynamics....

    The APU would be great to achieve this, but I understand the bus between the x86 CPU and onboard GPU is still PCI!!! on the same die!!!

    For those reading, the latency of PCI-E is a massive overhead for parallelism, until we got GPU's where the bandwidth became the issue.

    My $0.02...
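The latency point can be made concrete with a toy cost model: every offloaded job pays a fixed launch/transfer latency plus a bandwidth term before any compute happens, so small kernels are dominated by overhead. All numbers here are illustrative assumptions, not measurements:

```python
# Toy cost model for GPU offload over PCI-E: every job pays a fixed
# launch/transfer latency plus a bandwidth term before any compute
# happens. All numbers are illustrative assumptions, not measurements.
def offload_time_us(bytes_moved, compute_us,
                    latency_us=10.0,      # assumed round-trip overhead
                    bandwidth_gbs=12.0):  # assumed effective PCI-E 3.0 x16
    transfer_us = bytes_moved / (bandwidth_gbs * 1e9) * 1e6
    return latency_us + transfer_us + compute_us

# A tiny kernel is dominated by overhead...
small = offload_time_us(bytes_moved=64 * 1024, compute_us=2.0)
# ...while a large batch amortizes the fixed costs.
large = offload_time_us(bytes_moved=256 * 1024**2, compute_us=5000.0)
print(f"small job: {small:.1f} us, large job: {large:.1f} us")
```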

    • (Score: 2) by LoRdTAW on Thursday June 02 2016, @10:39PM

      by LoRdTAW (3755) on Thursday June 02 2016, @10:39PM (#354259) Journal

      Correct me if I am wrong, but the idea behind AMD's HSA is to use shared memory between the CPU and GPU. The idea being that since the GPU and CPU share the same memory controller, the data to be processed has its address passed to the GPU and the GPU directly accesses that piece of memory without a pci transfer.
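That pointer-passing idea can be illustrated with a loose analogy in plain Python (not the actual HSA API): the discrete-GPU model copies the buffer across the bus and works on the copy, while the shared-memory model hands the "GPU" a view of the same allocation and mutates it in place, with nothing to copy back.

```python
# Loose analogy in plain Python (not the actual HSA API).
def gpu_copy_model(buf):
    local = bytes(buf)                  # full copy, like a PCI-E DMA transfer
    return bytes(x * 2 for x in local)  # result must be copied back too

def hsa_shared_model(buf):
    view = memoryview(buf)              # no copy: same underlying memory
    for i in range(len(view)):
        view[i] *= 2                    # results immediately visible to the CPU

data = bytearray(b"\x01\x02\x03\x04")
hsa_shared_model(data)
print(bytes(data))  # the original buffer was updated in place
```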

      • (Score: 2) by opinionated_science on Friday June 03 2016, @03:49PM

        by opinionated_science (4031) on Friday June 03 2016, @03:49PM (#354600)

        that's what I was referring to - but originally (and I dug around the APU I had to hand, an A10 something) the GPU was on the other side of a PCI bus, inside the CPU package!!

        If I understand correctly Hypertransport is more efficient (allows interleaved transfer) but is only used for CPU-CPU and CPU-MEM.

        For GPU's to work, they use PCI as they can be directly addressed from the CPU via PCI, as well as be a DMA master.

        If new architectures share GPU/CPU memory with lower latency (1 us), I would be very interested.

        As of now, plans are to load as much on the GPU and let it rip - hence my number 12 Tflops. I calculated it a few years ago, though someone actually built a machine that solves the problem out of custom ASICs...

          (Google Anton, D.E. Shaw)

  • (Score: 2) by jasassin on Thursday June 02 2016, @08:29PM

    by jasassin (3566) <jasassin@gmail.com> on Thursday June 02 2016, @08:29PM (#354208) Homepage Journal

    but the price has increased massively to around $1,723

    Im gonna buy like ten of dem... just in case a couple go bad.

    --
    jasassin@gmail.com GPG Key ID: 0xE6462C68A9A3DB5A
  • (Score: 2, Interesting) by mattTheOne on Friday June 03 2016, @05:13AM

    by mattTheOne (1788) on Friday June 03 2016, @05:13AM (#354357)

    The middle range is very competitive on the GPU front. This new chip will be competing with the GTX1060 and GTX980, 970, etc.

    Also the prior AMD chips as well, like the 390X.

    Channel and OEMs will be dropping the price like mad to clear out the old stock.

    Ironically, this product might spur sales, but of Nvidia or older AMD chips instead... win-win for the consumer tho