posted by martyb on Wednesday May 29 2019, @06:21PM   Printer-friendly
from the better-late-than-never dept.

Intel's 10th Gen, 10nm Ice Lake CPUs: everything you need to know

Intel has a lot to prove. 2018 marked the chipmaker's 50th anniversary, but it was also a year that shook the company to its core. It was the year that Intel lost its CEO, struggled with Spectre and Meltdown, and reportedly lost Apple's confidence as far as chips for future Macs are concerned. Above all, it was the year the world finally realized Intel processors had hit a wall, after yet another failure to shrink its circuits down to the "10 nanometer" process node.

But now, after years of delays, the company is about to bring its first real batch[*] of 10nm CPUs to the world. Today, the company is officially taking the wraps off its 10th Gen Intel Core processors, codename "Ice Lake," and revealing some of what they might be able to do for your next PC when they ship in June.

[*] An 18% IPC improvement *loud coughing* "against the Skylake cores the company released nearly four years ago!"
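For perspective, 18% over roughly four years is a modest compounded rate. A quick back-of-the-envelope (the 18% and four-year figures are from the summary above; the rest is just arithmetic):

```python
# Back-of-the-envelope: annualized IPC gain implied by "18% over ~4 years".
total_gain = 1.18   # Ice Lake vs. Skylake IPC, per Intel's claim
years = 4           # Skylake launched 2015, Ice Lake ships 2019
annual = total_gain ** (1 / years)
print(f"Annualized IPC improvement: {(annual - 1) * 100:.1f}%")
# → Annualized IPC improvement: 4.2%
```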

Also at AnandTech and Tom's Hardware.


Original Submission

Related Stories

Intel CEO Blames "10nm" Delays on Aggressive Density Target, Promises "7nm" for 2021 10 comments

Intel says it was too aggressive pursuing 10nm, will have 7nm chips in 2021

[Intel's CEO Bob] Swan made a public appearance at Fortune's Brainstorm Tech conference in Aspen, Colorado, on Tuesday and explained to the audience in attendance that Intel essentially set the bar too high for itself in pursuing 10nm. More specifically, he pointed to Intel's overly "aggressive goal" of going after a 2.7x transistor density improvement over 14nm.

[...] Needless to say, the 10nm delays have caused Intel to fall well behind that transistor density doubling. Many have proclaimed Moore's Law as dead, but as far as Swan is concerned, Moore's Law is not dead. It apparently just needed to undergo an unexpected surgery.

"The challenges of being late on this latest [10nm] node of Moore's Law was somewhat a function of what we've been able to do in the past, which in essence was define the odds on scaling the infrastructure," Swan explains. Bumping up to a 2.7x scaling factor proved to be "very complicated," more so than Intel anticipated. He also says that Intel erred when it "prioritized performance at a time when predictability was really important."

"The short story is we learned from it, we'll get our 10nm node out this year. Our 7nm node will be out in two years and it will be a 2.0X scaling so back to the historical Moore's Law curve," Swan added.
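Note that those scaling factors refer to transistor density, which shrinks linear dimensions only by the square root. A rough sketch (factor values from the quotes above):

```python
import math

# Density scaling factor -> implied linear (pitch) shrink: linear = 1/sqrt(density)
for label, density in [("Intel 14nm -> 10nm (attempted)", 2.7),
                       ("Historical Moore's Law node", 2.0)]:
    shrink = 1 / math.sqrt(density)
    print(f"{label}: features shrink to ~{shrink:.0%} of the previous node")
# → Intel 14nm -> 10nm (attempted): features shrink to ~61% of the previous node
# → Historical Moore's Law node: features shrink to ~71% of the previous node
```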

Also at Fortune and Tom's Hardware.


Original Submission

  • (Score: 3, Interesting) by ikanreed on Wednesday May 29 2019, @06:40PM (8 children)

    by ikanreed (3164) Subscriber Badge on Wednesday May 29 2019, @06:40PM (#849020) Journal

    Where no matter how pure the substrate, how accurately the semiconductors are cut and laid, quantum interactions start introducing defects into circuits?

    I thought I heard that years ago when we hit 12 nm.

    • (Score: 0) by Anonymous Coward on Wednesday May 29 2019, @06:51PM (1 child)

      by Anonymous Coward on Wednesday May 29 2019, @06:51PM (#849028)

You did. That's one of the reasons it has been taking so long. They had to work out the ramifications of that on top of making things that small and that accurate.

      • (Score: 2) by ikanreed on Wednesday May 29 2019, @06:57PM

        by ikanreed (3164) Subscriber Badge on Wednesday May 29 2019, @06:57PM (#849032) Journal

        What can you even do about random bit flips happening, though?

It's not like every gate having an ECC at 10 nm makes for a smaller or more performant chip than a flat 12 nm one. My abstract algebra is a little rusty, but I don't recall a way to make up the difference.
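For a rough sense of why per-gate error correction doesn't pay, here's a sketch of single-error-correcting Hamming overhead versus block size (the mapping from gates to "data bits" is purely illustrative):

```python
# Overhead of a single-error-correcting Hamming code protecting k data bits:
# we need r parity bits such that 2**r >= k + r + 1.
def hamming_overhead(k: int) -> float:
    r = 1
    while 2 ** r < k + r + 1:
        r += 1
    return r / k  # parity bits per data bit

for k in (1, 4, 8, 64):
    print(f"k={k:>2} data bits -> {hamming_overhead(k):.0%} overhead")
# → k= 1 data bits -> 200% overhead
# → k= 4 data bits -> 75% overhead
# → k= 8 data bits -> 50% overhead
# → k=64 data bits -> 11% overhead
```

At per-gate granularity the parity bits dominate; the overhead only becomes tolerable over blocks far larger than a single gate.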

    • (Score: 2) by datapharmer on Wednesday May 29 2019, @07:10PM (1 child)

      by datapharmer (2702) on Wednesday May 29 2019, @07:10PM (#849038)

      Yep, such a problem in fact that Samsung is having to do it at half the size just to dodge the qubits. https://www.zdnet.com/article/samsung-develops-euv-5-nanometre-chip-process/ [zdnet.com]

      • (Score: 2) by JoeMerchant on Wednesday May 29 2019, @07:19PM

        by JoeMerchant (3937) on Wednesday May 29 2019, @07:19PM (#849042)

        How many 5nm ARM cores does it take to wrestle down a 10nm bloat *cough* CISC Core?

        --
        🌻🌻 [google.com]
    • (Score: 3, Interesting) by takyon on Wednesday May 29 2019, @08:15PM (3 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Wednesday May 29 2019, @08:15PM (#849061) Journal

      Intel's "10nm" process is comparable to TSMC's "7nm". TSMC has better yields, and AMD's chiplet approach can handle defects better.

Intel will move to a chiplet approach and has demoed mixing process nodes. AMD uses a "14nm" I/O die alongside "7nm" core chiplets. Intel demoed a chip that used "14nm" and "10nm" cores.

      TSMC is only using extreme ultraviolet lithography (EUV) for some steps in its "7nm" process. "7nm+" and other processes will use more EUV, leading to lower defect rates.

TSMC has "5nm" coming. [tomshardware.com] TSMC, Samsung, and others are expected to move to a gate-all-around (GAA) design by "3nm", which should have improved tolerance to quantum tunneling/leakage.

      https://semiengineering.com/transistor-options-beyond-3nm/ [semiengineering.com]
      https://spectrum.ieee.org/nanoclast/semiconductors/devices/new-metalair-transistor-replaces-semiconductors [ieee.org]

      As you can see, there are various options for continued scaling improvements.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by ikanreed on Wednesday May 29 2019, @08:45PM (2 children)

        by ikanreed (3164) Subscriber Badge on Wednesday May 29 2019, @08:45PM (#849068) Journal

        So I'm hearing "You were an idiot for taking internet commenters on a tech site 5 years ago as even remotely knowledgeable"

        • (Score: 0) by Anonymous Coward on Wednesday May 29 2019, @08:55PM

          by Anonymous Coward on Wednesday May 29 2019, @08:55PM (#849072)
          Remember, that recommendation applies to itself...
        • (Score: 2) by takyon on Wednesday May 29 2019, @09:05PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Wednesday May 29 2019, @09:05PM (#849076) Journal

I don't think "quantum interactions" really refers to the manufacturing defects. It's the manufacturing process, which uses deep ultraviolet (and soon extreme ultraviolet) light to create feature sizes smaller than the wavelength of the light used. So they have to use multiple patterning to handle that. But it will never be perfect.
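The resolution limit in play here is often estimated with the Rayleigh criterion, CD = k1 * λ / NA. A sketch with representative parameters (these are textbook-style values, not any particular fab's numbers):

```python
# Rayleigh criterion: minimum printable half-pitch CD = k1 * wavelength / NA.
# k1, wavelengths, and numerical apertures below are representative values only.
def min_feature(k1: float, wavelength_nm: float, na: float) -> float:
    return k1 * wavelength_nm / na

duv = min_feature(k1=0.30, wavelength_nm=193.0, na=1.35)  # immersion DUV (ArF)
euv = min_feature(k1=0.30, wavelength_nm=13.5, na=0.33)   # EUV
print(f"DUV single-exposure limit: ~{duv:.0f} nm half-pitch")
print(f"EUV single-exposure limit: ~{euv:.0f} nm half-pitch")
# → DUV single-exposure limit: ~43 nm half-pitch
# → EUV single-exposure limit: ~12 nm half-pitch
```

Which is why pitches much below ~40 nm need multiple DUV exposures per layer, and why EUV can get there in a single exposure.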

          You do have quantum tunneling that causes electrical leakage. But there is at least one transistor design [wikipedia.org] that can take advantage of that effect.

          We are running into fundamental limits, but there are still a lot of ideas left in the bag to improve scaling and performance. Nanotube transistors, for example. The industry would prefer not to use new stuff if they can continue to tweak the old stuff.

I think there are credible ideas in the bag which could lead to orders of magnitude of performance improvements.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by JoeMerchant on Wednesday May 29 2019, @07:17PM (8 children)

    by JoeMerchant (3937) on Wednesday May 29 2019, @07:17PM (#849040)

    Did anyone else perceive the Skylake curse? It seemed that every time I turned around, there was another problem with Skylake that wasn't a thing with the 5th gen chipset. Network drivers that needed updating, embedded graphics glitches, overheating (of the i5 NUCs), and our vendors seemed to delay release of and issue more patches for Skylake than previous, or later, generations.

    They eventually settled down, but if it were my show - I'd have said that Skylake got pushed out the door prematurely.

    --
    🌻🌻 [google.com]
    • (Score: 0) by Anonymous Coward on Wednesday May 29 2019, @07:39PM

      by Anonymous Coward on Wednesday May 29 2019, @07:39PM (#849049)

It was because Intel screwed the pooch badly enough that they were having to work incompatible errata fixes into everything, which was going to break older chips either unintentionally or intentionally.

    • (Score: 2) by bzipitidoo on Wednesday May 29 2019, @09:41PM (6 children)

      by bzipitidoo (4388) on Wednesday May 29 2019, @09:41PM (#849084) Journal

      They've been pushing the thermal limits too far.

I got a NUC, and yeah, it overheats. Most of the time it can handle CPU torture from a distributed computing project such as GIMPS; it overheats only a couple of times per week while running that.

      But, engage those built in 3D graphics, and it'll overheat much more often and sooner-- maybe in 10 minutes, maybe in an hour.

      I also got one of those ASUS Vivosticks. Try to play a video on it, and it'll overheat in 15 minutes plus or minus 5. Hardware powered mpeg-4 decoding would be a lot nicer if it didn't overheat the CPU. So far, the NUC has had no problems with video.

      • (Score: 3, Informative) by JoeMerchant on Wednesday May 29 2019, @11:08PM (4 children)

        by JoeMerchant (3937) on Wednesday May 29 2019, @11:08PM (#849111)

I'm trying to use a NUC (Skylake, I think) to drive a 4K screen, and it struggles big time. The NUC is "installed" (more like lying) on the back of the screen, where some of the screen electronics feed it warm air. If I fiddle it around into a cooler airstream it will handle fullscreen video a little longer, but even when set back to 1080p it still starts glitching after a while of running Netflix or similar. It will run Blender or OpenSCAD at 4K resolution all day long just fine, but when the frame rates come up it is toast, even when completely removed from the warm air input.

        --
        🌻🌻 [google.com]
        • (Score: 2) by takyon on Wednesday May 29 2019, @11:30PM (3 children)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Wednesday May 29 2019, @11:30PM (#849112) Journal

          Smartphones, Amazon Fire TV 4K, etc. can do 4K video output. It's surprising that a NUC can't handle it.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by JoeMerchant on Thursday May 30 2019, @01:58AM (2 children)

            by JoeMerchant (3937) on Thursday May 30 2019, @01:58AM (#849151)

            It's surprising that a NUC can't handle it.

Specs say it does it, and it will do it for a minute or two, sometimes 10 or 20, then it starts tearing: spurious color lines appear, sometimes rectangles of wild color shift, like white -> magenta, etc. I've got another, newer NUC here; I should try an SSD swap between them and see if the newer one handles it better.

Long twisted story: the newer one was bought to drive the living room TV, because the 4th gen NUC in there was starting to make pretty loud fan noise. I cleaned it, which quieted it for about a month, then the fan noise returned and it didn't sound healthy, so I bought the new NUC and prepped it to take over. But there's just one app I made for the old one, running 14.04, which won't recompile under 18.04 (an ancient library called Wt that makes widget apps accessible via http), and the old NUC still soldiers on, and I like that app (I can "press space bar" on the NUC from a webpage served on my local network: pause, resume, KODI, Netflix and others...).

So... waiting for the old NUC to die, but it won't, and the 4K display in the other room is driven by a Windows 10 Skylake NUC that tears video, but we don't really watch much video in there. It's all "not quite broke", so not a high priority for fixing, and if the newer NUC does drive the big screen without tearing, do I really care? Pretty sure I don't want to take the tearing-video NUC and use it for my primary video watching screen.

            --
            🌻🌻 [google.com]
            • (Score: 2) by FatPhil on Thursday May 30 2019, @07:37AM (1 child)

              by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Thursday May 30 2019, @07:37AM (#849214) Homepage
              Glue a coin centrally on top of the noisy fan. It'll increase the rotational inertia, and make it less prone to wobble, which is most of the loud fan noise I've encountered. Of course, it makes it harder to start too, and you don't want that failure mode.

              I don't necessarily advise this variant, but it worked for me in a sparcstation for years, I just used blu-tak to attach the coin, which permitted me to fine-tune its central location, and once it was absolutely central, there was no centrifugal force trying to remove it.
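For a rough idea of how much inertia a coin adds, treat both the rotor and the coin as flat discs (all masses and radii below are illustrative guesses, not measurements):

```python
# Rotational inertia of a flat disc about its axis: I = 0.5 * m * r^2.
# The masses and radii here are illustrative guesses, not measurements.
def disc_inertia(mass_kg: float, radius_m: float) -> float:
    return 0.5 * mass_kg * radius_m ** 2

fan = disc_inertia(0.010, 0.020)   # ~10 g rotor, ~20 mm effective radius
coin = disc_inertia(0.005, 0.012)  # ~5 g coin, ~12 mm radius
print(f"Coin adds ~{coin / fan:.0%} to the rotor's inertia")
# → Coin adds ~18% to the rotor's inertia
```

Even a small coin adds a noticeable fraction of the rotor's inertia, which damps the wobble, at the cost of a slower (and possibly failed) spin-up.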
              --
              Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 0) by Anonymous Coward on Thursday May 30 2019, @03:08AM

        by Anonymous Coward on Thursday May 30 2019, @03:08AM (#849176)

Work got me an ultrabook (newer than Skylake, though; a 2018 Dell XPS-13 with an i7).

The disk is encrypted with LUKS, and the kernel starts spewing spam about thermal throttling immediately while it is booting up. It doesn't start thermal throttling if booted from unencrypted media, so it looks like handling the encrypted disk is what puts it over the edge on startup.

        Intel processors produce way too much heat for the form factors they are trying to package them in.

  • (Score: 0) by Anonymous Coward on Wednesday May 29 2019, @07:52PM (1 child)

    by Anonymous Coward on Wednesday May 29 2019, @07:52PM (#849056)

    how many more bugs does it introduce?
