
posted by martyb on Tuesday October 23 2018, @09:10AM   Printer-friendly
from the small-progress dept.

Report: Intel is cancelling its 10nm process. Intel: No, we're not

Earlier today, it was reported that Intel is cancelling its troublesome 10nm manufacturing process. In an unusual response, the company has tweeted an official denial of the claims.

[...] The company's most recent estimate is that 10nm will go into volume production in the second half of 2019. The report from SemiAccurate cites internal sources saying that this isn't going to happen: while there may be a few 10nm chips, for the most part Intel is going to skip to its 7nm process.

Typically, Intel doesn't respond to rumors, but this one appears to be an exception. The company is tweeting that it's making "good progress" on 10nm and that yields are improving consistent with the guidance the company provided on its last earnings report. Intel's next earnings report is on Thursday, and we're likely to hear more about 10nm's progress then.

Also at Tom's Hardware and The Verge.

Related: Intel's "Tick-Tock" Strategy Stalls, 10nm Chips Delayed (it has been over 3 years since this article was posted)
Moore's Law: Not Dead? Intel Says its 10nm Chips Will Beat Samsung's
Intel's First 8th Generation Processors Are Just Updated 7th Generation Chips
Intel Releases Open Letter in Attempt to Address Shortage of "14nm" Processors and "10nm" Delays


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 4, Interesting) by bzipitidoo on Tuesday October 23 2018, @03:51PM (2 children)

    by bzipitidoo (4388) on Tuesday October 23 2018, @03:51PM (#752506) Journal

    So... next year, time to ditch the still perfectly functional 14nm stuff, and send it to the 3rd world? I'm way behind. I'm still using a 45nm AMD Phenom II based system, though it's not my main computer any more. I didn't even bother buying a 32nm or 22nm system. Waited until 14nm arrived and dropped a bit in price.

    14nm brought so much improvement over 45nm that I felt pretty much compelled to upgrade. It was getting to the point that the mere savings in electricity might pay for the new computer in 2 years, particularly if the new machine was one of those entry-level stick computers for $100, or heck, a laptop with a screen for $150 to $200. I went with low-power computers rather than raw power, speed, and awesome gaming graphics. The Phenom system uses 80W when idle; the new 14nm system uses 33W max and less than 10W when idling, and it is still a little faster despite the tuning for low power. And then there was the game that required SSE4, which the Phenom is too old to have.
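    The payback claim above is easy to sanity-check. A rough sketch, assuming 24/7 idle uptime and a $0.12/kWh electricity rate (both illustrative assumptions, not figures from the comment):

    ```python
    # Back-of-the-envelope payback estimate for the 45nm -> 14nm upgrade.
    # The 24/7 uptime and $0.12/kWh rate are assumed for illustration.
    OLD_IDLE_W = 80      # Phenom II system, idle
    NEW_IDLE_W = 10      # 14nm system, idle
    RATE_PER_KWH = 0.12  # assumed electricity price, USD

    savings_w = OLD_IDLE_W - NEW_IDLE_W              # 70 W saved while idle
    kwh_per_year = savings_w * 24 * 365 / 1000       # ~613 kWh/year
    dollars_per_year = kwh_per_year * RATE_PER_KWH   # ~$73.58/year

    print(f"Yearly savings: ${dollars_per_year:.2f}")
    print(f"Years to recoup a $100 stick PC: {100 / dollars_per_year:.1f}")
    ```

    Under those assumptions a $100 stick computer pays for itself in well under two years, consistent with the comment.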

    I anticipate in a few years having to upgrade to get AV1 decoding in hardware, more hardware support for Vulkan, and Spectre (and Meltdown) fully fixed, by which time we may be on 5nm.

  • (Score: 2) by bryan on Tuesday October 23 2018, @05:05PM (1 child)

    by bryan (29) <bryan@pipedot.org> on Tuesday October 23 2018, @05:05PM (#752533) Homepage Journal

    I anticipate in a few years having to upgrade to get AV1 decoding in hardware

    Only for mobile devices that have batteries. You would need resolutions over 4K to stress software decoding on 14nm desktop-class hardware. Both hardware video decoding and 4K+ resolution upgrades would probably be best handled with a graphics card upgrade anyway.

    • (Score: 3, Interesting) by takyon on Tuesday October 23 2018, @06:05PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday October 23 2018, @06:05PM (#752550) Journal

      I want to do the same thing as bzipitidoo. I'll skip from a 2011 32nm laptop down to an AMD chip with AV1 hardware decoding and hopefully 6-8 cores. That could come around 2021, maybe on a TSMC "5nm" process. At the same time, waiting until then would skip the deep ultraviolet (DUV) stopgap version of "7nm", so I would be using a chip made with more extreme ultraviolet (EUV) steps, resulting in better performance and power consumption.

      Even if I get a desktop I would still be keen on AV1 hardware decoding (and maybe encoding), particularly if I was using integrated graphics. Given the low power consumption of bzipitidoo's system, that might be what is happening. You could also envision a scenario where you get a relatively low-power CPU with integrated graphics first, and then get a gaming GPU (maybe a lower power one like GTX X060 or AMD equivalent) after waiting a couple of years, since the CPU is usually not the main bottleneck.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]