Intel's 10th Gen, 10nm Ice Lake CPUs: everything you need to know
Intel has a lot to prove. 2018 marked the chipmaker's 50th anniversary, but it was also a year that shook the company to its core. It was the year that Intel lost its CEO, struggled with Spectre and Meltdown, and reportedly lost Apple's confidence as far as chips for future Macs are concerned. Above all, it was the year the world finally realized Intel processors had hit a wall, after yet another failure to shrink its circuits down to the "10 nanometer" process node.
But now, after years of delays, the company is about to bring its first real batch[*] of 10nm CPUs to the world. Today, the company is officially taking the wraps off its 10th Gen Intel Core processors, codename "Ice Lake," and revealing some of what they might be able to do for your next PC when they ship in June.
[*] An 18% IPC improvement *loud coughing* compared "against the Skylake cores the company released nearly four years ago!"
Also at AnandTech and Tom's Hardware.
Where no matter how pure the substrate, how accurately the semiconductors are cut and laid, quantum interactions start introducing defects into circuits?
I thought I heard that years ago when we hit 12 nm.
You did. That's one of the reasons it has been taking so long. They had to work out the ramifications of that on top of making things that small and that accurate.
What can you even do about random bit flips happening, though?
It's not like every gate having ECC at 10 nm makes for a smaller or more performant chip than a flat 12 nm one. My abstract algebra is a little rusty, but I don't recall a way to make up the difference.
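For a sense of the overhead involved: here's a minimal sketch of the parity-bit cost of SECDED Hamming protection (the scheme actual ECC memory uses, applied per word; nobody puts ECC on individual gates, which is rather the point):

```python
def secded_parity_bits(data_bits: int) -> int:
    """Check bits needed for single-error-correct, double-error-detect
    (SECDED) Hamming protection of a word of `data_bits` bits."""
    r = 0
    # Hamming bound: 2^r patterns must cover every single-bit error
    # position (data + parity) plus the no-error case.
    while 2 ** r < data_bits + r + 1:
        r += 1
    return r + 1  # one extra overall parity bit for double-error detection

for k in (8, 64, 512):
    r = secded_parity_bits(k)
    print(f"{k:4d} data bits -> {r:2d} check bits ({100 * r / k:.1f}% overhead)")
```

The overhead shrinks as the protected word grows (62.5% at 8 bits, 12.5% at 64, ~2% at 512), which is why ECC is applied to whole memory words rather than anywhere near the gate level.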
Yep, such a problem in fact that Samsung is having to do it at half the size just to dodge the qubits. https://www.zdnet.com/article/samsung-develops-euv-5-nanometre-chip-process/ [zdnet.com]
How many 5nm ARM cores does it take to wrestle down a 10nm bloat *cough* CISC Core?
Intel's "10nm" process is comparable to TSMC's "7nm". TSMC has better yields, and AMD's chiplet approach can handle defects better.
Intel will move to a chiplet approach and has demoed mixing process nodes. AMD uses a "14nm" I/O die alongside "7nm" core chiplets. Intel demoed a chip that used "14nm" and "10nm" cores.
TSMC is only using extreme ultraviolet lithography (EUV) for some steps in its "7nm" process. "7nm+" and other processes will use more EUV, leading to lower defect rates.
TSMC has "5nm" coming. [tomshardware.com] TSMC, Samsung, and others are expected to move to a gate-all-around (GAA) design by "3nm", which should have improved tolerance to quantum tunneling/leakage.
https://semiengineering.com/transistor-options-beyond-3nm/ [semiengineering.com]
https://spectrum.ieee.org/nanoclast/semiconductors/devices/new-metalair-transistor-replaces-semiconductors [ieee.org]
As you can see, there are various options for continued scaling improvements.
So I'm hearing "You were an idiot for taking internet commenters on a tech site 5 years ago as even remotely knowledgeable"
I don't think "quantum interactions" really refers to the manufacturing defects. It's the manufacturing process, which uses deep ultraviolet (and soon extreme ultraviolet) light to create feature sizes smaller than the wavelength of the light used. So they have to use multiple patterning to handle that. But it will never be perfect.
You do have quantum tunneling that causes electrical leakage. But there is at least one transistor design [wikipedia.org] that can take advantage of that effect.
We are running into fundamental limits, but there are still a lot of ideas left in the bag to improve scaling and performance. Nanotube transistors, for example. The industry would prefer not to use new stuff if they can continue to tweak the old stuff.
I think there are credible ideas in the bag which could lead to orders-of-magnitude performance improvements.
Did anyone else perceive the Skylake curse? It seemed that every time I turned around, there was another problem with Skylake that wasn't a thing with the 5th-gen chips: network drivers that needed updating, embedded graphics glitches, overheating (of the i5 NUCs), and our vendors seemed to delay releases and issue more patches for Skylake than for previous or later generations.
They eventually settled down, but if it were my show - I'd have said that Skylake got pushed out the door prematurely.
It was because Intel screwed the pooch badly enough that they were having to work incompatible errata fixes into everything, which was going to break older chips either unintentionally or intentionally.
They've been pushing the thermal limits too far.
I got a NUC, and yeah, it overheats. Most of the time it can handle CPU torture tests like a distributed computing project such as GIMPS. It overheats only a couple of times per week while running that.
But engage those built-in 3D graphics, and it'll overheat much more often and sooner -- maybe in 10 minutes, maybe in an hour.
I also got one of those ASUS VivoSticks. Try to play a video on it, and it'll overheat in 15 minutes, plus or minus 5. Hardware-accelerated MPEG-4 decoding would be a lot nicer if it didn't overheat the CPU. So far, the NUC has had no problems with video.
I'm trying to use a NUC (Skylake, I think) to drive a 4K screen, and it struggles big time. The NUC is "installed" (more like lying) on the back of the screen, where some of the screen electronics feed it warm air - if I fiddle it around into a cooler airstream, it will handle fullscreen video a little longer, but even when set back to 1080p, it still starts glitching after a while of running Netflix or similar. Now, it will run Blender or OpenSCAD at 4K resolution all day long just fine, but when the frame rates come up it is toast - even when completely removed from the warm air input.
Smartphones, Amazon Fire TV 4K, etc. can do 4K video output. It's surprising that a NUC can't handle it.
It's surprising that a NUC can't handle it.
Specs say it does it, and it will do it for a minute or two, sometimes 10 or 20, then it starts tearing - spurious color lines appear, sometimes rectangles of wild color shift like white -> magenta, etc. I've got another, newer NUC here; I should try an SSD swap between them and see if the newer one will handle it better.

Long twisted story: the newer one was bought to drive the living room TV because the 4th gen NUC in there is starting to have pretty loud fan noise - I cleaned it, which quieted it for like a month, then the fan noise returned and it didn't sound healthy, so I bought the new NUC and prepped it to take over. But there's just one app I made for the old one that's running 14.04 which won't recompile under 18.04 - an ancient library called Wt makes widget apps that are accessible via HTTP - and the old NUC still soldiers on, and I like that app (I can "press space bar" on the NUC from a webpage served on my local network - pause, resume, KODI, Netflix and others...).

So... waiting for the old NUC to die, but it won't, and the 4K display in the other room is driven by a Windows 10 Skylake NUC that tears video, but we don't really watch much video in there... it's all "not quite broke," so not at a high priority for fixing, and if the newer NUC does drive the big screen without tearing, do I really care? Pretty sure I don't want to take the tearing-video NUC and use it for my primary video watching screen.
Cool thought - might try that someday.
Work got me an ultrabook (newer than Skylake, though; a 2018 Dell XPS-13 with an i7).
The disk is encrypted with LUKS, and the kernel starts spewing spam about thermal throttling immediately while it is booting up. It doesn't thermal-throttle if booted from unencrypted media, so it looks like handling the encrypted disk is what puts it over the edge on startup.
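One thing worth checking (an assumption on my part; I don't know your exact config): whether dm-crypt is actually getting hardware AES, since software AES burns far more CPU and heat. A quick sketch on Linux:

```shell
# Does the CPU advertise AES-NI? dm-crypt/LUKS uses it when available,
# which keeps the encryption load (and heat) much lower than software AES.
if grep -q -m1 '\baes\b' /proc/cpuinfo; then
    echo "AES-NI present"
else
    echo "AES-NI absent: dm-crypt falls back to software AES"
fi

# If cryptsetup is installed, this compares cipher throughput directly:
# cryptsetup benchmark
```

If AES-NI shows up but the machine still throttles, the problem is more likely the chassis than the cipher.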
Intel processors produce way too much heat for the form factors they are trying to package them in.
how many more bugs does it introduce?
You don't have clearance to know that information.