Apple Inc. is planning to use its own chips in Mac computers beginning as early as 2020, replacing processors from Intel Corp., according to people familiar with the plans.
The initiative, code named Kalamata, is still in the early developmental stages, but comes as part of a larger strategy to make all of Apple's devices -- including Macs, iPhones, and iPads -- work more similarly and seamlessly together, said the people, who asked not to be identified discussing private information. The project, which executives have approved, will likely result in a multi-step transition.
The shift would be a blow to Intel, whose partnership helped revive Apple's Mac success and linked the chipmaker to one of the leading brands in electronics. Apple provides Intel with about 5 percent of its annual revenue, according to Bloomberg supply chain analysis.
Intel shares dropped as much as 9.2 percent, the biggest intraday drop in more than two years, on the news. They were down 6.4 percent at $48.75 at 3:30 p.m. in New York.
No interest in Apple hardware, but it would be interesting to see how they implement a hybrid desktop/tablet OS and DE. I'm sure Ubuntu and GNOME will follow.
(Score: 3, Informative) by fyngyrz on Tuesday April 03 2018, @06:38PM
It's stupid because it also prevents correctly engineered applications from getting the system resources they require to operate properly.
Designers should not automatically opt applications in to a "feature" that breaks software. Instead, it makes sense to default unknown applications to opted-out, and then let the user opt in if they actively decide to, based on criteria they choose, rather than just breaking things willy-nilly without notice or signal.
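Assuming the parent post is describing macOS App Nap (the "Get Info" checkbox mentioned below points that way), the per-user opt-out the poster wants does exist, though it is buried in the `defaults` system rather than surfaced to the user. A sketch, with a placeholder bundle identifier:

```shell
# Hypothetical example: opt one application out of App Nap.
# com.example.MyApp is a placeholder; substitute the real bundle ID.
defaults write com.example.MyApp NSAppSleepDisabled -bool YES

# Or opt out globally for the current user:
defaults write NSGlobalDomain NSAppSleepDisabled -bool YES
```

These are macOS-only configuration commands, so no portable output can be shown; the point is that the control exists only for users who already know what is starving their application.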
Because of the backwards default, this "feature" breaks things without any notice to the user: perfectly good applications quietly stop working properly because they have been starved of resources they require, leaving your average user completely in the dark about WTF is going on. Whereas if it were properly implemented, it would:
The presumption that a user knows to look in the "Get Info" dialog when an application is resource-starved, or even that they will understand that what is wrong is that it is resource-starved, is broken on first principles.
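If the feature in question is App Nap, the "correctly engineered applications" mentioned above do have a developer-side escape hatch: the `ProcessInfo` activity API lets an app declare that its work must not be throttled. A minimal sketch, assuming macOS and that the long-running work is user-visible:

```swift
import Foundation

// Declare a latency-sensitive, user-initiated activity so the system
// does not nap the process while the work is in flight.
let activity = ProcessInfo.processInfo.beginActivity(
    options: [.userInitiated, .latencyCritical],
    reason: "Real-time processing must not be throttled"
)

// ... perform the resource-sensitive work here ...

// End the activity when done so normal power management resumes.
ProcessInfo.processInfo.endActivity(activity)
```

Of course, this only helps applications whose developers knew to add it, which is exactly the poster's complaint: software written before the feature shipped gets throttled with no notice to anyone.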