
posted by martyb on Friday February 03 2017, @04:49PM
from the witty-connection-between-chips,-fries,-and-[Big]-Macs dept.

Apple, which makes its own ARM SoCs for its mobile products such as iPhones and iPads, is planning to include ARM chips in Mac laptops alongside Intel CPUs. The ARM chips will handle various tasks during power conservation modes:

Apple Inc. is designing a new chip for future Mac laptops that would take on more of the functionality currently handled by Intel Corp. processors, according to people familiar with the matter. The chip, which went into development last year, is similar to one already used in the latest MacBook Pro to power the keyboard's Touch Bar feature, the people said. The updated part, internally codenamed T310, would handle some of the computer's low-power mode functionality, they said. The people asked not to be identified talking about private product development. It's built using ARM Holdings Plc. technology and will work alongside an Intel processor.

Although Apple only accounted for 7.5 percent of worldwide computer shipments in the fourth quarter, according to data from IDC, the Mac line has long set the standard for design and component improvements. Its feature additions often start new technology trends that other manufacturers rush to follow. Apple and Intel declined to comment. [...] Apple engineers are planning to offload the Mac's low-power mode, a feature marketed as "Power Nap," to the next-generation ARM-based chip. This function allows Mac laptops to retrieve e-mails, install software updates, and synchronize calendar appointments with the display shut and not in use. The feature currently uses little battery life while run on the Intel chip, but the move to ARM would conserve even more power, according to one of the people.

Do you think we will see Dell, Acer, ASUS, et al. produce mainstream dual-processor laptops? How about big.LITTLE clusters in Chromebooks?
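
Apple hasn't said how macOS will divide work between the two chips, but on existing big.LITTLE systems the operating system simply steers background work onto the efficiency cores. Here is a minimal sketch of that idea in Python, assuming a Linux machine whose LITTLE cores happen to be CPUs 4-7 (the core numbering and the Linux-only affinity call are illustrative assumptions, not anything Apple has documented):

    import os

    # Hypothetical layout: CPUs 0-3 are the big cores, CPUs 4-7 the LITTLE ones.
    # Real numbering varies by board; check /sys/devices/system/cpu/ on Linux.
    LITTLE_CORES = {4, 5, 6, 7}

    def pin_to_little_cores():
        # Restrict this process to the efficiency cores (Linux-only call),
        # so the background work below runs on the low-power cluster.
        os.sched_setaffinity(0, LITTLE_CORES)

    pin_to_little_cores()
    # ... fetch mail, sync calendars, install updates here ...

A Power Nap-style firmware mode would presumably make the same placement decision one level down: wake only the small chip and leave the big cores powered off entirely.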

Also at Ars Technica, TechCrunch, and Computerworld.


Original Submission

 
  • (Score: 0) by Anonymous Coward on Friday February 03 2017, @05:43PM

    by Anonymous Coward on Friday February 03 2017, @05:43PM (#462495)

    A single computer (mobile, desktop, server) doesn't use a lot of power, but when there are millions of them, it starts to matter.
    It is curious how, during the evolution of computers, energy usage was of little concern.
    If energy requirements had been a concern, many of the first computers would never have been built.
    Of course, over time, computers were made to use less energy, or to use it more efficiently.
    Those machines, however, were still "development" computers (their users worked near the software-hardware boundary).
    Now it seems that, because of the "mobile before everything" dogma, everything has to take a back seat to energy consumption.
    Since computers are an evolution and an architecture cannot be restarted from scratch, every new generation incorporates the previous one in some form.
    I fear that processing shortcuts might be taken, and laid down for good, because of one component: the battery.
    Power-saving serves either the source of the energy (battery) or the processing unit itself, which needs it for further computing density (heat).

    If some energy-storage breakthrough happens in the future (hydrogen plus a micro turbine? Tony Stark lost a repulsor?), then this evolutionary layer of "power savings" will become obsolete. No computer will ever run for free (unless it's a Matrix we all live in).

    So is power saving about giving more stamina to the "sheeple social media" window outlet, a.k.a. the dumb-terminal-for-the-cloud, or about packing more computing power into a denser space?

  • (Score: 3, Insightful) by takyon on Friday February 03 2017, @06:01PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday February 03 2017, @06:01PM (#462509) Journal

    If a battery breakthrough lets your laptop/smartphone carry around 10x more watt-hours of energy (or 2-4x with a smaller battery), your device is that much closer to being a bomb.
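
    As a rough back-of-the-envelope check (the battery size and conversion factors below are illustrative, not from the article):

        WH_TO_J = 3600                    # 1 Wh = 3,600 J
        battery_wh = 100                  # large laptop battery today, ~100 Wh
        boosted_j = 10 * battery_wh * WH_TO_J   # 3.6 MJ after a 10x breakthrough
        TNT_J_PER_KG = 4.184e6            # standard TNT equivalence
        print(boosted_j / TNT_J_PER_KG)   # ~0.86 kg of TNT

    The saving grace is release rate: an explosive dumps its energy in microseconds, while even a catastrophically failing battery takes seconds to minutes.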

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 3, Informative) by Nerdfest on Friday February 03 2017, @06:07PM

    by Nerdfest (80) on Friday February 03 2017, @06:07PM (#462513)

    Part of it is the cost of electricity as well. Some of the desktop machines a few years ago absolutely sucked down power, even at idle. An office full of those ends up incurring quite a power cost.
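
    A quick illustrative estimate, with assumed rather than measured numbers, shows how that adds up:

        machines = 100
        idle_kw = 0.120                   # ~120 W at idle, plausible for older desktops
        hours_per_year = 24 * 365         # left running around the clock
        usd_per_kwh = 0.12                # assumed commercial electricity rate
        annual_cost = machines * idle_kw * hours_per_year * usd_per_kwh
        print(round(annual_cost))         # ~12,600 USD per year, at idle alone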

  • (Score: 3, Interesting) by tangomargarine on Friday February 03 2017, @07:26PM

    by tangomargarine (667) on Friday February 03 2017, @07:26PM (#462543)

    Since computers are an evolution and an architecture cannot be restarted from scratch, every new generation incorporates the previous one in some form.

    There's no technical reason you can't do this. Hell, with sufficient determination I bet you could wire wrap [wikipedia.org] an 8086 (that article has a picture of an actual wire-wrapped Z80; that's an 8-bit chip where the 8086 is 16-bit, though).

    Making things from scratch in the computing world isn't as impossible as it sounds--it may be a huge, enormous, gigantic pain in the ass, but it's still possible. One might remember that guy [wikipedia.org] who was insisting that it was impossible for Linus Torvalds to have written his own OS without plagiarizing Minix, and the book he published on the topic was universally derided, even by some of the people he (mis)quoted.

    Of course, trying to market a brand new processor architecture for the general consumer market these days would most likely crash and burn for marketing reasons. But that's a separate argument.

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 2) by sjames on Friday February 03 2017, @11:50PM

    by sjames (2882) on Friday February 03 2017, @11:50PM (#462669) Journal

    The energy savings aren't quite as ephemerally valuable as you might think. On the other side, we still have the power-at-the-expense-of-energy-consumption chips. It turns out that a big limiting factor on them is how much heat we can pull out of them, and how fast, so they don't burn themselves up. The fastest damn-it-all-just-go-fast machines use refrigerant systems to keep the CPU cool. They would be faster still, but they can't be kept cool enough internally no matter how cold we make the surface.
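
    The heat ceiling described above is why power saving and speed are two sides of the same coin: CMOS switching power scales roughly as P ≈ α·C·V²·f, and because voltage enters squared (and a lower clock tolerates a lower voltage), backing off clock and voltage together buys a disproportionate power saving. A sketch with illustrative values:

        def dynamic_power(c_eff, volts, hz, alpha=1.0):
            # Classic CMOS switching-power relation: P ~ alpha * C * V^2 * f
            return alpha * c_eff * volts ** 2 * hz

        base = dynamic_power(1e-9, 1.2, 3.0e9)   # e.g. 1.2 V at 3.0 GHz
        slow = dynamic_power(1e-9, 0.9, 1.5e9)   # e.g. 0.9 V at 1.5 GHz
        print(slow / base)                       # ~0.28: half the speed, ~1/4 power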

    Meanwhile, Intel seems to hate openness. They always have; it's just that they didn't always know how to close things off. ARM is currently the wedge that keeps them from closing everything up. It's not as fast or as powerful, but it's fast and powerful enough to be a credible threat if they dare make their chips totally hands-off.