Apple announces Mac architecture transition from Intel to its own ARM chips, offers emulation story
Apple has just announced its plans to switch from Intel CPUs in Macs to silicon of its own design, based on the ARM architecture. This means that Apple will now be designing its own chips for both its iOS devices and its Mac desktops and laptops. Apple said it will ship its first ARM Mac before the end of the year, and complete the Intel-to-ARM transition within two years.
Apple claims its custom silicon will bring industry-leading performance and performance per watt. Apple's chips will combine a custom CPU, GPU, SSD controller, and many other components. The Apple silicon will also include the Neural Engine for machine learning applications.
[...] "Most apps will just work".
The Next Phase: Apple Lays Out Plans To Transition Macs from x86 to Apple SoCs
[From] an architecture standpoint, the timing of the transition is a bit of an odd one. As noted by our own Arm guru, Andrei Frumusanu, Arm is on the precipice of announcing the Arm v9 ISA, which will bring several notable additions to the ISA such as Scalable Vector Extension 2 (SVE2). So either Arm is about to announce v9 and Apple's A14 SoCs will be among the first to implement the new ISA, or Apple will be setting the baseline for macOS-on-Arm at v8.2 and its NEON extensions fairly late into the ISA's lifecycle. This will be something worth keeping an eye on.
[...] [In] order to bridge the gap between Apple's current software ecosystem and where they want to be in a couple of years, Apple will once again be investing in a significant software compatibility layer in order to run current x86 applications on future Arm Macs. To be sure, Apple wants developers to recompile their applications to be native – and they are investing even more into the Xcode infrastructure to do just that – but some degree of x86 compatibility is still a necessity for now.
The cornerstone of this is the return of Rosetta, the PowerPC-to-x86 binary translation layer that Apple first used for the transition to x86 almost 15 years ago. Rosetta 2, as it's called, is designed to do the same thing for x86-to-Arm, translating x86 macOS binaries so that they can run on Arm Macs. Rosetta 2's principal mode of operation will be to translate binaries at install time.
See also: Apple Announces iOS 14 and iPadOS 14: An Overview
Apple's First ARM-Based (Mac) Product Is a Mac mini Featuring an A12Z Bionic, but Sadly, Regular Customers Can't Buy It
Previously: Apple Will Reportedly Sell a New Mac Laptop With its Own Chips Next Year
Related Stories
Apple's New ARM-Based Macs Won't Support Windows Through Boot Camp:
Apple will start switching its Macs to its own ARM-based processors later this year, but you won't be able to run Windows in Boot Camp mode on them. Microsoft only licenses Windows 10 on ARM to PC makers to preinstall on new hardware, and the company hasn't made copies of the operating system available for anyone to license or freely install.
"Microsoft only licenses Windows 10 on ARM to OEMs," says a Microsoft spokesperson in a statement to The Verge. We asked Microsoft if it plans to change this policy to allow Windows 10 on ARM-based Macs, and the company says "we have nothing further to share at this time."
[...] Apple later confirmed it's not planning to support Boot Camp on ARM-based Macs in a Daring Fireball podcast. "We're not direct booting an alternate operating system," says Craig Federighi, Apple's senior vice president of software engineering. "Purely virtualization is the route. These hypervisors can be very efficient, so the need to direct boot shouldn't really be the concern."
Previously: Apple Announces 2-Year Transition to ARM SoCs in Mac Desktops and Laptops
Apple Announces The Apple Silicon M1: Ditching x86 - What to Expect, Based on A14
The new processor is called the Apple M1, the company's first SoC designed with Macs in mind. With four large performance cores, four efficiency cores, and an eight-core GPU, it features 16 billion transistors on a 5nm process node. Apple is starting a new SoC naming scheme for this new family of processors, but at least on paper it looks a lot like an A14X.
[...] Apple made mention that the M1 is a true SoC, including the functionality of what previously was several discrete chips inside of Mac laptops, such as I/O controllers and Apple's SSD and security controllers.
[...] Whilst in the past 5 years Intel has managed to increase their best single-thread performance by about 28%, Apple has managed to improve their designs by 198%, i.e. 2.98x (let's call it 3x) the performance of the Apple A9 of late 2015.
[...] Apple has claimed that they will completely transition their whole consumer line-up to Apple Silicon within two years, which is an indicator that we'll be seeing a high-TDP many-core design to power a future Mac Pro. If the company is able to continue on their current performance trajectory, it will look extremely impressive.
New report reveals Apple's roadmap for when each Mac will move to Apple Silicon
Citing sources close to Apple, a new report in Bloomberg outlines Apple's roadmap for moving the entire Mac lineup to the company's own custom-designed silicon, including both planned release windows for specific products and estimations as to how many performance CPU cores those products will have.
[...] New chips for the high-end MacBook Pro and iMac computers could have as many as 16 performance cores (the M1 has four). And the planned Mac Pro replacement could have as many as 32. The report is careful to clarify that Apple could, for one reason or another, choose to only release Macs with 8 or 12 cores at first but that the company is working on chip variants with the higher core count, in any case.
The report reveals two other tidbits. First, a direct relative to the M1 will power new iPad Pro models due to be introduced next year, and second, the faster M1 successors for the MacBook Pro and desktop computers will also feature more GPU cores for graphics processing—specifically, 16 or 32 cores. Further, Apple is working on "pricier graphics upgrades with 64 and 128 dedicated cores aimed at its highest-end machines" for 2022 or late 2021.
New Mac models could have additional efficiency cores alongside 8/12/16/32 performance cores. Bloomberg claimed the existence of a 12-core chip (8 performance "Firestorm" cores, 4 efficiency "Icestorm" cores) back in April, which has not materialized yet.
The Apple M1 SoC has 8 GPU cores.
Previously: Apple Announces 2-Year Transition to ARM SoCs in Mac Desktops and Laptops
Apple Has Built its Own Mac Graphics Processors
Apple Claims that its M1 SoC for ARM-Based Macs Uses the World's Fastest CPU Core
Your New Apple Computer Isn't Yours
Linus Torvalds Doubts Linux will Get Ported to Apple M1 Hardware
CNet:
Apple will start selling Macs that use in-house processors in 2021, based on ones in upcoming iPhones and iPad Pros, Bloomberg reported Thursday. The company is apparently working on three of its own chips, suggesting a transition away from traditional supplier Intel.
The initial batch of custom chips won't be on the same level as the Intel ones used in high-end Apple computers, so they're likely to debut in a new type of laptop, the report noted. These processors could have eight high-performance cores and at least four energy-efficient cores, respectively codenamed Firestorm and Icestorm.
Just another brick in the wall[ed garden]?
(Score: 3, Interesting) by Debvgger on Tuesday June 23 2020, @08:23AM (15 children)
I never liked PCs. Apple did wrong when going from the PowerPC to x64. Glad to see them stop producing PCs.
Thanks Apple!
And Amiga rules! Make it big endian compatible so we can run AROS on it! :-)
(Score: 5, Insightful) by canopic jug on Tuesday June 23 2020, @08:43AM (11 children)
I see it as another step away from allowing general purpose computing. They won't achieve their goals all at once but, again, this is just another step. Because of the many previous steps heading that direction, I would say that is the main goal -- eventually -- if too few put up too small a fight.
This move has been in the works for many years. I haven't seen the details on the new Apple PCs with the ARM processors, but if it follows the recent trends (as in from 2007 onward) then it will be about finding a way to lock in the OS and applications even further. This is not about off-the-shelf ARM processors but about custom-designed processors [9to5mac.com]. They already have the T2 chip controlling the audio, disk drives, and several other systems [macworld.com]. The end goal is the data of course.
I applaud the long overdue move away from x86 but not the move into locked-in, custom hardware. M$ has long been aiming at becoming a roach motel for your data, but it still leaks too much. Apple, with control over the hardware like this, has a better chance at putting an end to general-purpose computing.
Money is not free speech. Elections should not be auctions.
(Score: 2) by takyon on Tuesday June 23 2020, @09:54AM (4 children)
I don't think putting more functions into a SoC necessarily puts an end to general purpose computing. There are other ARM SoCs that have added GPU, AI acceleration, audio handling, wireless, etc. but are not particularly locked down and run Linux.
And what's the solution? RISC-V?
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Insightful) by HiThere on Tuesday June 23 2020, @02:35PM (2 children)
While that is technically correct, it ignores the company doing it. Apple has always striven to control the end user with incompatible formats, dating all the way back to the old floppy drives of the Apple ][. This doesn't necessarily mean that a new processor will be designed to control the end-user, but there's a long history saying "that's the way to bet".
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 3, Interesting) by looorg on Tuesday June 23 2020, @02:42PM
True. But I would strongly suspect that there will be some heavily encrypted chip on the motherboard to show that this is a "proper licensed Mac" and things should run smooth and the hardware has not been messed with, cause if you do mess with it, it's brick time. Sort of like all the game consoles have done, and it often takes ages for hacks to come out to circumvent that, and then usually as a start through various software bugs.
(Score: 5, Insightful) by fyngyrz on Tuesday June 23 2020, @03:50PM
That's right. From TFS, bracketed remark mine:
Any Apple Mac customer who does not take seriously the adage "those who do not study history are doomed to repeat it" is certain to be bitten by Apple's repeated, end-user-hostile, "here-now gone-later" treatment. The discarding of the first Rosetta was an exercise in pulling people's feet out from under them, turning huge amounts of invested time and money into vapor.
While I really do like most of OS X itself, Apple has repeatedly stepped all over its users during the decades I've been a customer of theirs. Their idea of a "transition" is to stab the customer who gave them (arguably too much) money in the back.
The upside here is that there are (and will continue to be) many high quality Intel-based Macs available via eBay and so forth, so I'll be able to keep using all the software I've purchased and learned to use into the indefinite — and distant — future.
As for the free software applications I write (image processing, SDR clients), they have always been developed on OS X, and ported to Windows in an OS X VM. I'm going to reverse that now; develop native under Windows on the brand new Intel-based, non-Apple hardware I just ordered today and backport on the Mac until the day Apple throws their Intel users under the bus, as history indicates they will almost certainly do.
Apple has made it clear that the future of Mac hardware is a trip ever deeper into an ever narrower niche. I'm not going to go any further in that direction with them, nor will I develop software that supports such an undertaking. Whatever Microsoft's other faults are (and they are many and significant), they have at least worked hard to maintain compatibility and not shaft their customers by breaking their investments of time and money.
IMO, we don't need less compatibility and more software target fragmentation. The more uniform the playing field is for developers, and the wider the access to software solutions is for end users, the more beneficial to society computers will become. Apple's clearly on a quest to isolate its userbase. I hope a dragon eats them.
TL;DR: Fuck you, Apple.
--
You know that little voice in your
head that keeps you from saying things?
I should probably get one of those.
(Score: -1, Flamebait) by Anonymous Coward on Tuesday June 23 2020, @04:22PM
Arm is a disgusting PITA with Linux. Fuck Arm and the vile whores at Apple.
(Score: 2) by FunkyLich on Tuesday June 23 2020, @11:44AM (3 children)
Wouldn't this move have a side effect of making ARM processors (not only Apple's but the others too) more popular, and a choice to fuel alternative development of, say, Linux on ARM? Developers could be more motivated to make native ARM platform software, starting maybe by porting Apple ARM software to some other ARM platform. I'd imagine that could easily happen. Especially knowing that recently some big companies like Lenovo, Dell, and IBM have started to support Linux directly, and it is not too far fetched to think that they might as well make ARM-based generic computers and laptops. That might easily attract more and more developers away from x86 and towards ARM.
(Score: 1, Interesting) by Anonymous Coward on Wednesday June 24 2020, @02:40AM (2 children)
It's not likely Apple will allow booting other operating systems on their native hardware. So a Mac owner could run Linux in a VM - they demoed that feature - but it's not going to support user freedom in general or Linux on ARM in particular.
Our best hope, and it's slim, is that some backdoor in their boot process that is impossible to patch remotely is discovered in a few years, and the free software community is able to reverse engineer enough to use Apple ARM products for other operating systems. But I wouldn't hold my breath. The PlayStation 4 came out six and a half years ago and running Linux on it only became somewhat easy two years ago, and you still have to be careful what version of the PlayStation firmware you have or else you can't install it. As far as I understand it, it's still impossible to run Linux on the Xbox One. Apple has more money to throw at Digital Rights Management than Sony and even Microsoft.
(Score: 2) by TheRaven on Wednesday June 24 2020, @04:26PM (1 child)
Historically, Apple has been happy for people to run other operating systems on Macs. They have reasonably good margins on the hardware and make money from people who buy their hardware to run something else (plus they don't have any support costs from these people; if they take a Linux or Windows Mac to an Apple Store, they are told to boot macOS before they get any help).
This is quite different from consoles. Consoles are sold either at cost or with very low margins; manufacturers make money from customers buying games on them. Anything that reduces the number of games people buy, or makes it likely that games may not work, reduces their income.
sudo mod me up
(Score: 0) by Anonymous Coward on Wednesday June 24 2020, @09:39PM
I hope you're right, that would make me feel a lot better about this. I'm worried the array of options for people wanting to install Linux or *BSD is starting to shrink - it's obviously far from dying, but this could be a big step in the wrong direction.
(Score: 1, Insightful) by Anonymous Coward on Tuesday June 23 2020, @04:04PM (1 child)
Yeah, because before the creepy Intel / x86 monopoly there weren't general purpose computers.
Monopolies are bad. That stands for CPU architectures as well.
Won't be missed.
(Score: 0) by Anonymous Coward on Wednesday June 24 2020, @01:01PM
The Intel x86 monopoly is terrible, but crucially the overwhelming majority of x86 devices ship with unlocked boot loaders. Device driver support in Linux or *BSD might be inadequate for any particular piece of hardware, but there are on the order of a billion combined laptops and desktops around the world that can boot Linux or BSD instead of whatever it originally shipped with. Outside of the ARM SoCs like the Raspberry Pi and an ever diminishing portion of smart phones, most ARM devices don't allow the same flexibility.
(Score: 2) by TheRaven on Tuesday June 23 2020, @11:44AM
Big endian support is basically gone in AArch64. There are a few instructions that let you do efficient byte swapping, but that's it.
(Score: 2) by SpockLogic on Tuesday June 23 2020, @12:14PM
Good bye PC ?
Worse than that, Good Bye Hackintosh. :-(
Overreacting is one thing, sticking your head up your ass hoping the problem goes away is another - edIII
(Score: 2) by pdfernhout on Wednesday June 24 2020, @12:42AM
https://en.wikipedia.org/wiki/Apple_Newton [wikipedia.org]
"Most Newton devices were based on the ARM 610 RISC processor and all featured handwriting-based input."
https://en.wikipedia.org/wiki/StrongARM [wikipedia.org]
"According to Allen Baum, the StrongARM traces its history to attempts to make a low-power version of the DEC Alpha, which DEC's engineers quickly concluded was not possible. They then became interested in designs dedicated to low-power applications which led them to the ARM family. One of the only major users of the ARM for performance-related products at that time was Apple, whose Newton device was based on the ARM platform. DEC approached Apple wondering if they might be interested in a high-performance ARM, to which the Apple engineers replied "Phhht, yeah. You can’t do it, but, yeah, if you could we'd use it.""
One of the silliest court rulings/settlements ever: "DEC agreed to sell StrongARM to Intel as part of a lawsuit settlement in 1997. Intel used the StrongARM to replace their ailing line of RISC processors, the i860 and i960."
https://www.cnet.com/news/intel-digital-settle-suit/ [cnet.com]
"Intel (INTC) and Digital Equipment (DEC) today announced a settlement in a patent litigation lawsuit filed by Digital in May, an agreement likely to have wide impact on the microprocessor industry. "
DEC literally had the future in the palm of their hand (the StrongARM, as demonstrated in the Apple MP2000, which was finally on the cusp of greatness -- I still have one), won a patent lawsuit against Intel, and then gave Intel the StrongARM as part of the settlement. Just a crazy lack of imagination by DEC executives (or worse).
And if only the Newton had emphasized using an on-screen keyboard instead of handwriting recognition in the first iterations...
The biggest challenge of the 21st century: the irony of technologies of abundance used by scarcity-minded people.
(Score: 2, Troll) by r_a_trip on Tuesday June 23 2020, @09:24AM (17 children)
God save us all. Not because Apple switches to their own silicon. Not because this might be the prelude to even more lock-in. From a technical standpoint it's even interesting. No, the trouble will be all the Mac fans. Now that the Mac becomes that exotic niche thing again, with hardware almost no one has, we will see all the fans making outrageous claims about the "magical" properties of the new silicon. The new Macs are soooo much better than x86 that people using x86 for computing are like Cro-Magnons. It will be the reality distortion of the Power Mac all over again.
(Score: 2) by takyon on Tuesday June 23 2020, @09:40AM (2 children)
Apple was already good at making ARM chips that outperformed its Samsung/Qualcomm/MediaTek/etc. competition, despite having fewer cores and less RAM. And they sold many millions of iPhones and iPads while doing so. So the new ARM SoCs for Macs won't exactly be something no one has, since they will be scaled up versions of the mobile SoCs. Apple IPC is supposedly exceeding Intel's (see confusing AnandTech chart) while performance of the iPad Pro is able to rival Intel ultrabooks.
It remains to be seen whether Apple can thrash the likes of Xeon, Threadripper, and Epyc, but I wouldn't count them out.
(Score: 1) by r_a_trip on Tuesday June 23 2020, @10:10AM (1 child)
You are talking technical specs on hardware. I'm talking behaviour of users. Two different things.
(Score: 2) by takyon on Tuesday June 23 2020, @10:32AM
If the outrageous claims are close to being true, they aren't outrageous. It can be clearly seen with Apple vs. Android, where Apple has put out more powerful hardware and arguably a better software experience year after year after year. That's attributed to such factors as vertical integration, not having to chase after useless specs (like 16 GB RAM), etc.
They could put out something that thoroughly destroys current Xeon Macs if they feel like it. I'm not sure if they will be able to compete against a hypothetical 96-core AMD CPU though. There is a clear path to "destroying" all x86 competition (going full 3DSoC), but that's more of a matter of timing than magic.
(Score: 4, Insightful) by ilsa on Tuesday June 23 2020, @01:38PM (12 children)
I'm not so sure of that. The RDF for the most part vanished with Jobs. There are always gonna be sycophants that create some bizarre emotional dependency on their choice of tech brand, but I think people are going to be a lot more down to earth about this. Just look at last year's WWDC when the audience openly mocked Apple's obnoxious $1000 monitor stand.
At least among the more technical crowd I run with, Apple's endless missteps over the last five-ish years have significantly soured people's feelings towards Apple's products. No maintainability, no usable ports, and the single stupidest keyboard to have graced the world in decades.
IMO Apple should be falling down really hard right now. However, thanks to Microsoft and the god-forsaken cesspool that is Windows 10, Microsoft has handed Apple a license to print money. I've asked people why they keep using Apple and the answer has universally been, "Because it's not Windows." I.e., not because Apple is better, but because it is less worse than the alternative.
(And no, Linux doesn't count. It still falls down in too many ways to be a decent alternative desktop os)
(Score: 2) by looorg on Tuesday June 23 2020, @02:34PM (1 child)
That is the thing in general tho, you don't have to have the best product -- it doesn't even have to be good. It just has to suck less than the competition. I guess that is their niche, or one of them: Apple -- sucking less!
I wonder how many of those 1k monitor stands they sold. The Apple aficionados (or whatever they are called) might mock it in public or on their social media to be cool with their peeps, but then they buy the stand anyway cause it just looks so awesome and fits so great with all their other Apple hardware. As noted, if you keep paying thousands of extra bucks for hardware just cause of the silver plastic Apple logo on the front, then you probably have another 1k left over to spend on a monitor stand.
(Score: 2) by toddestan on Tuesday June 23 2020, @10:34PM
Hey, it works for mobile too. Though I'm of the opinion that Android sucks less than iOS, both are pretty shitty so I can see why someone might prefer iOS.
Admittedly, the biggest draw for Android is that if you want a dumbed down, user-hostile phone, you don't have to spend hundreds of dollars to get one.
(Score: 4, Insightful) by HiThere on Tuesday June 23 2020, @02:42PM (5 children)
*How* does Linux fall down as a desktop system? I've used it for years without significant problems. At one time the applications available were lacking, but these days it's got everything I want to use. To me it appears that what it lacks is a marketing department pushing it, and that's fine with me.
(Score: 3, Informative) by DECbot on Tuesday June 23 2020, @08:22PM
It falls down because you either have to know too much about Linux to use it or treat it like an appliance. There's not enough 'it just works' when doing things like Outlook, Excel, PowerPoint, Final Cut Pro, and so forth. And when you finally do fix those things, you'll alienate the geeks and nerds like me who use Linux to bolster their cred. I want a tinkerer's desktop, not some locked-down, corporate-polished productivity turd that treats me like a product, consumtard, or hostile user.
cats~$ sudo chown -R us /home/base
(Score: 5, Interesting) by ilsa on Wednesday June 24 2020, @04:42PM (3 children)
This horse has been flogged so much that even the skeleton is disintegrating.
Linux covers exactly two use cases.
1. An experience that is so curated that the user doesn't even know they're using Linux (eg: Android, ChromeOS, etc)
2. You really really love dicking around with your computer, and/or are experienced with this kind of thing like sysadmins, etc.
While general usage at any given moment may well be fine, you will run into situations that require you to open a CLI or otherwise make an obscure conf file change in order to accomplish what you need. That has been reduced somewhat over the years, but it's still an issue. Hell, Gnome still won't let you add arbitrary applications to the application menu unless you open a text editor and construct your own .desktop file from scratch. 'Scuze me, what now? Mac and Windows will literally let you do this with a single drag-and-drop operation.
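For context, the hand-written file in question looks roughly like this (the app name and paths here are made-up examples); this is the manual step being complained about, dropped into `~/.local/share/applications/`:

```ini
# ~/.local/share/applications/myapp.desktop -- written by hand, no GUI for this
[Desktop Entry]
Type=Application
Name=My App
Comment=Hypothetical example application
Exec=/opt/myapp/bin/myapp
Icon=/opt/myapp/share/icons/myapp.png
Terminal=false
Categories=Utility;
```

It's not hard once you know the format, but a newcomer has no way to discover it from the desktop itself.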
Some things are so bloody idiotic that it leaves people incredulous. A great example: if you're on a laptop with discrete graphics, you need to choose between battery-saving integrated graphics or battery-destroying discrete graphics, and you need to reboot/relogin (which for a consumer is basically the same thing) to change the setting. It's 2020 and you need to reboot to change your video chip. Something Windows and MacOS solved, what, two decades ago? And no, I don't care what the reasons are, no matter how justifiable you think they are.
The number of examples I could list is nearly endless. And never mind the application ecosystem, which is filled with software that, while it technically ticks the necessary feature checkboxes, varies from unpolished, to difficult to use, to not fit for purpose for anything more than the simplest of use cases.
This is less of a problem for those of us that choose to swim in the deep end, but sometimes I just want to get my effing job done and *don't* want to go down a 4 hour irrelevant rabbit hole to change some obscure setting that is blocking my ability to do the task that I'm actually being paid to perform.
And this doesn't count blatant screw-ups like when Ubuntu 19 shipped with a known-broken crypto library that effectively disabled Active Directory integration, and then never corrected it. The fix required you to install replacement libraries from some random 3rd-party repo, which is completely unacceptable.
Meanwhile, the Linux community seems to care FAR more about bullshit like theme choices and systemd, rather than truly important issues, like the aforementioned total lack of polish.
(Score: -1, Troll) by Anonymous Coward on Wednesday June 24 2020, @06:15PM
Waah waah waah. If you want someone to hold your hand, go pay for a consultant to integrate this free software complete strangers gave you out of the goodness of their hearts. You didn't pay for Ubuntu, right?
(Score: 0) by Anonymous Coward on Wednesday June 24 2020, @09:48PM
With respect to "like the aforementioned total lack of polish": I still stumble over polish issues in Windows, and my friends with Macs stumble over them in Macs. It's not as often, and not as severe, but to me it's a hundred times as infuriating that companies with quite literally more than 1,000 times the resources of the open source community still screw things up. No, a loosely connected group of volunteers that mostly gets zero money for their work on desktop apps can't match Microsoft, Google, or Apple. But they covered 80% of the gap with 0.01% of the resources - which makes me scream in frustration every time my work laptop hangs, or displays an error, or closes the File Explorer suddenly.
( I'm the Linux nerd that responded with https://soylentnews.org/comments.pl?noupdate=1&sid=38132&page=1&cid=1011956#commentwrap [soylentnews.org] )
(Score: 2) by HiThere on Wednesday June 24 2020, @11:54PM
Sorry, but no. Once upon a time, decades ago, I liked fiddling with the OS and manipulating things. Now I just want it to work as I want, be reliable, and have the applications I need. Linux suits me fine.

As it happens I use MATE, because that's the interface I got used to, and the others kept changing to be flashier. Gnome 2 was fine. So was KDE3. Actually, I think KDE3 was the best, but I don't fight city hall, so I use MATE. And I use pretty much the default install, because I *don't* like fiddling with the OS, and it's pretty much what I want without messing around. KDE4 became acceptable, but it did a bunch more stuff than I wanted.

And I've stuck with MATE and avoided fiddling with my OS (except apt-get update or installing a new application) while MSWind has gone through 3 or 4 cycles of update/replace/revise. MSWind is the one that has required lots of fiddling with the OS and adjusting to new GUIs. And I haven't even looked at a Mac since OS 10.4, so I can't say what it's been doing. As for MS, if someone asks for help I say "I don't do windows"... and that's the right answer, because I haven't touched it since slightly before 2000.
(Score: 3, Funny) by fyngyrz on Tuesday June 23 2020, @03:57PM
Well, they needed something to go along with the single stupidest mouse [flickr.com] to have graced the world in decades.
--
Wow. Apparently it's "rude" to ask the parents
of a kid on a leash if it was a rescue.
(Score: 0, Troll) by Anonymous Coward on Tuesday June 23 2020, @04:25PM (1 child)
"And no, Linux doesn't count. It still falls down in too many ways to be a decent alternative desktop os"
for you dumb fucking monkeys maybe.
(Score: 2) by ilsa on Wednesday June 24 2020, @05:13PM
Thank you for demonstrating yet another reason why Linux will always be a failure as a desktop.
Nothing says, "This is the platform for me!" like having random assholes throwing insults at you because you aren't willing or able to invest hours of your life to figure out how to use an OS that was never user-friendly to begin with.
(Score: 2, Insightful) by Anonymous Coward on Wednesday June 24 2020, @01:27PM
Linux falls down for the average person for two reasons:
1. The whole fucking world is hooked on Microsoft file formats. LibreOffice, Google Docs, and other alternatives are fine office suites in their own right but everyone occasionally needs to send a document to a potential employer, a school, a hospital, or a government office. About 10% of the time, all of those Microsoft Office alternatives get the file conversion to the Microsoft format wrong, and it will cause you no end of pain when the recipient can't open the file, or can't read it, or decides you're a complete idiot because your paperwork has weird margin offsets and line breaks in the wrong places. And on top of that, a significant minority of the white collar workers of the world are Excel power users. They are completely unwilling to spend hundreds of hours learning equivalent skills for LibreOffice Calc or Google Sheets - especially since the LibreOffice Calc script they write in Python can't be exported to Excel for their friend that still uses Excel. Microsoft has the world locked down here.
2. The general rule of thumb for almost any Windows video game for any version of Windows on the entire internet is that if you have Windows 10, a sufficiently powerful CPU and GPU, and enough RAM, you can run it. The WINE project to run Windows software on Linux is amazing, especially with the work Valve Corporation is contributing as part of their WINE for Steam project called Proton. The percentage of Windows video games that run flawlessly on Linux using WINE or Steam Proton is increasing rapidly, but it will never reach 100% because it's all but impossible to make WINE support all of the crazy Digital Rights Management tools that are integrated in some games. If you talk your friend into switching to Linux and then she finds out that Soulcalibur VI will open in WINE but is blocked from multiplayer, she will go back to Windows and never try Linux again.
Now, as a Linux enthusiast, I understand these two limitations and I am willing to live with them. I have happily used exclusively Linux for my home computers for years. But whenever I consider helping a friend or family member migrate to Linux I have a long discussion with them about this first - and often, we agree it does not make sense for them to switch.
(Score: 2) by PartTimeZombie on Thursday June 25 2020, @02:45AM
You might be right.
I spent the 1990s working in pre-press, which was exclusively Mac (ignoring for a moment the million dollar Crosfield image editing workstations we had) because Macs were "better."
Which in 1992 may have been true, but when the boss needed 40 new machines for the design studio in 2000, he bought 40 Windows 2000 machines, because it saved him several tens of thousands of dollars. And you know what? Those Win2k machines were just as good as the Macs we had been using, which came as a shock to all of us.
Windows ran Photoshop, Quark XPress, and Illustrator just as well as the Macs had by that point.
(Score: 2) by looorg on Tuesday June 23 2020, @09:41AM (5 children)
As much as I'd like them to once again switch, I'm less certain about how great it will actually be. Apple has this tendency to go for the walled-garden approach to preserve their "specialness", just like they previously didn't want or allow, say, Apple-clone machines to be built by third parties to run their software. This is why they hold just a couple of percent of the desktop market.
Then of course it would be interesting to see some specs here of just how good it will be processor-wise. It will probably be fairly good and competitive by itself, but if it's then locked down and not usable by anything beyond the magical Apple software, then it becomes super niche again and just not very interesting. For example: will their ARM CPU allow you to run Windows 10? There is an ARM version of that. Not that I would want to, but it's nice to have options. If they don't allow that, they probably won't allow you to install whatever ARM version of Linux there is either -- there might be one either way, and it's not like I'm asking them for permission to install whatever I want, but if they lock it down you might have to break shit just to get it on there, and then it becomes a matter of perhaps the hardware just isn't that great after all.
The other thing is whether this will actually be a proper desktop machine, or just the next step in some tablet-phone-desktop-hybrid crap: always connected to the magical internet cloud, and more or less just a viewport into the Apple cloud experience.
(Score: 4, Insightful) by hendrikboom on Tuesday June 23 2020, @09:52AM (1 child)
I bought a PC a few months before the first Mac was released. I wanted to hold out for what promised to be a superior machine, but I needed a computer when I needed it, not when it would be convenient to need it.
Was I ever glad when the Mac did come out. I had a machine I could program, and the Mac purchasers didn't.
It took two years before an ordinary Mac user could program their Mac, and then it was by an interpreter, not a compiler.
For a regular developer to program a Mac, you needed the much more expensive Apple Lisa.
-- hendrik
(Score: 2) by everdred on Tuesday June 23 2020, @07:26PM
I had a similar experience when shopping for my first smartphone shortly before the release of the original iPhone. No third-party app development was the big issue for me (but the keyboard, battery and AT&T situation didn't help either).
I bought a Palm Treo.
(Score: 2, Interesting) by petecox on Tuesday June 23 2020, @10:24AM (1 child)
If Apple develop a UEFI payload for their bootloader then, sure, no reason not to support Windows on ARM Macs as they do on their current machines.
I guess though this will kill off the Hackintosh, i.e. running macOS (subject to the laws in one's country) on generic x86 hardware, if the smarts inside the A13 are part of the trusted boot sequence.
(Score: 0) by Anonymous Coward on Tuesday June 23 2020, @10:36AM
QEMU already emulates various models of ARM; presumably they'll soon support these as well. The GPU will be harder. But just because it's ARM doesn't mean you have to use an on-chip GPU. Apple has a lot of video production customers who aren't necessarily going to be happy about a 90% nerf to their GPU performance, so they'll probably continue to use real PCIe GPUs in at least some models (we know Apple would prefer if nobody played games on their Serious Computers). With a lot of hard work and a little bit of luck, you'll be able to use a PC with QEMU emulating the CPU and a passed-through PCIe GPU to still run macOS on PC hardware. You will probably have to use Linux to accomplish it.
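For the curious, the setup described above can be sketched as a QEMU invocation: full aarch64 CPU emulation via `qemu-system-aarch64`, plus VFIO passthrough of a real PCIe GPU. This is a hypothetical sketch only — the PCIe address and disk image name are placeholders, and actually booting macOS this way would additionally require firmware and images Apple does not provide for this purpose.

```python
# Hypothetical QEMU command line for the scenario above: emulate an Arm CPU
# on x86 hardware while passing a physical PCIe GPU through with VFIO.
# All concrete values (PCIe address, disk image) are placeholders.
gpu_addr = "01:00.0"          # host PCIe address of the GPU (placeholder)
disk = "macos-arm.qcow2"      # guest disk image (placeholder)

qemu_cmd = [
    "qemu-system-aarch64",
    "-M", "virt",                              # generic Arm virtual machine
    "-cpu", "max",                             # most capable emulated CPU
    "-smp", "8", "-m", "16G",                  # 8 cores, 16 GB RAM
    "-device", f"vfio-pci,host={gpu_addr}",    # pass the real GPU through
    "-drive", f"file={disk},if=virtio,format=qcow2",
]
print(" ".join(qemu_cmd))
```

VFIO passthrough is why Linux would be the practical host: the `vfio-pci` driver that hands a PCIe device to the guest is a Linux kernel feature.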
(Score: 3, Interesting) by takyon on Wednesday June 24 2020, @11:05PM
Not initially, due to Microsoft's licensing issues (a consequence of Windows 10 on ARM not being a priority anyway?):
Apple’s new ARM-based Macs won’t support Windows through Boot Camp: It’s up to Microsoft to change that [theverge.com]
It seems like Microsoft could take care of those issues fairly quickly. They will probably try to push the full Windows ARM experience on Raspberry Pi 4 as well since 2-8 GB options exist and it is proven to at least run on 1 GB [windowslatest.com].
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by hendrikboom on Tuesday June 23 2020, @09:48AM (5 children)
I wonder if their code translator will also convert PowerPC Mac code to the new ARM code.
(Score: 3, Insightful) by takyon on Tuesday June 23 2020, @09:57AM (3 children)
No. It will create an instant disatrophe.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Funny) by kazzie on Tuesday June 23 2020, @02:11PM (2 children)
Because of a mis-placed apostrophe?
(Score: 2) by takyon on Tuesday June 23 2020, @02:16PM (1 child)
I was trying to make a new word that has already been made. I'll try again: *disastrophe.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by PartTimeZombie on Thursday June 25 2020, @02:48AM
Disastrophe? Isn't that a Frank Zappa album?
(Score: 0) by Anonymous Coward on Tuesday June 23 2020, @04:10PM
That would be pretty cool to be honest, and emulating a PowerPC using an ARM is much faster than trying with an x64.
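To illustrate what the summary means by Rosetta 2 translating "binaries at install time": the program is converted from the source instruction set to the target instruction set once, ahead of time, rather than being interpreted instruction-by-instruction at runtime. A toy sketch, with entirely made-up opcodes standing in for real machine code:

```python
# Toy sketch of install-time ("ahead-of-time") binary translation.
# Real translators like Rosetta 2 work on actual machine code; here each
# fake source op maps to one or more fake target ops, and the whole program
# is translated once so it can then run with no per-instruction overhead.
TRANSLATION_TABLE = {
    "LOADI": ["MOVZ"],             # load immediate -> one target op
    "ADD":   ["ADD"],              # some ops map one-to-one
    "PUSH":  ["SUB_SP", "STORE"],  # others need several target ops
}

def translate(program):
    """Translate a whole source program once, at 'install time'."""
    out = []
    for op, *args in program:
        for target_op in TRANSLATION_TABLE[op]:
            out.append((target_op, *args))
    return out

source = [("LOADI", "r0", 7), ("LOADI", "r1", 5), ("ADD", "r0", "r1")]
print(translate(source))
```

The one-to-many case is where translation overhead creeps in: an instruction with no direct equivalent on the target expands into a sequence, which is part of why translated code runs slower than native code.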
(Score: 3, Interesting) by shrewdsheep on Tuesday June 23 2020, @11:16AM (5 children)
I have always been confused by the statement that "Apple does its own silicon" and hope to be enlightened here. My understanding has always been that ARM does not distribute its Verilog code, rather you get a mask design. You can arrange a SoC adding components and asking for your preferred mix of cores but you would not design the cores. Is Apple therefore just one more SoC designer like many others or is their contribution deeper than that?
(Score: 5, Informative) by takyon on Tuesday June 23 2020, @12:44PM (1 child)
They design their own ARM SoCs, including redesigned ARM cores, not just the stock designs that ARM licenses out:
https://en.wikipedia.org/wiki/Apple_A13 [wikipedia.org]
They are basically more powerful than all competition in the smartphone space. Obviously, chips like the Fujitsu A64FX [wikipedia.org] that we will be reading about today [soylentnews.org] have more raw multithreaded performance, but Apple will move into that area if they want to replace high core count Xeons.
If it sounds like it would be easy to make custom ARM designs, Samsung has made an embarrassing exit [soylentnews.org] after making several generations of lackluster Exynos SoCs that typically sucked when compared to Qualcomm Snapdragon and Apple. And they are the #2 smartphone manufacturer, just after Huawei, and worth about a third of a trillion dollars (electronics division).
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Tuesday June 23 2020, @06:08PM
They also work hand in hand with ARM on updates to the spec. Apple had a lot of influence during the transition to 64 bit.
The summary discusses the upcoming v9 ISA, and I am willing to bet Apple will be implementing it on day one, most likely providing a lot of engineering resources for the actual implementation.
(Score: 4, Informative) by takyon on Tuesday June 23 2020, @01:49PM
Sites like AnandTech do deep dives into the architectural changes:
https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/2 [anandtech.com]
Apple isn't going to reveal all the details, so there is some guesswork involved. But there is some discussion of specific design changes that impact performance.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Informative) by KilroySmith on Tuesday June 23 2020, @03:21PM (1 child)
ARM has many different IP licensing schemes. In general, they provide a precompiled block that doesn't give you any flexibility to modify but which your tools will still be able to use for timing and place-and-route. Certainly you can get full-source licenses allowing you to modify the design, and companies like Apple, Qualcomm, Samsung will have licenses that extend that to basic architecture - they have access to ARM source, but also have the rights to build full-custom designs that may only be loosely based on that source but implement the ARM-specified chip architecture. For example, it's not clear to me how much of an Apple A13 chip is based on ARM's source, and how much of it is based on Apple's unique sources, but Apple's history of building faster ARM devices than everyone else might indicate that they have been liberal with their modifications to ARM IP. ARM creates the instruction sets, and the Verilog building blocks (here's an L1 cache module or an instruction decode module), and Apple picks and chooses what they'll use directly, what they'll modify, and what they'll rewrite for themselves.
(Score: 0) by Anonymous Coward on Wednesday June 24 2020, @05:21PM
What do ARM provide?
In the typical case ARM *do* provide Verilog to their partners. Typically one is not allowed to change the design although if you have a strong enough case (e.g. some kind of mismatch between the design of components that breaks a reasonable use-case) a waiver is possible. The CPUs in any resulting SoC have to pass a verification test suite before they can be shipped but these rules are enforced through legal measures in the IP license, not technical obfuscation.
What is Apple's contribution?
Some partners have an Architecture license. In the most extreme case this is (a) figuratively a copy of the ARM specification with the "For reference only, you cannot use this to clone an ARM processor" crossed out, (b) a copy of the verification suite and (c) a license to use the relevant ARM patents. The micro-architectural design can start with a blank piece of paper. Other companies might modify an existing design.
All SoC vendors (with or without an Architecture license) have to take all the various bits of IP, from ARM and other companies, and stitch them together, and they have to think about the scenarios they would like to work effectively and plan the size, number and connectivity of those components appropriately. e.g. do you want to allow a 3D game to splat a video file on to some surfaces? Okay, in that case you need an efficient path from the video decoder into the GPU and you need enough bandwidth on the bus to do both simultaneously, or prove to your satisfaction that you can do them in alternate time slices so the bus is never trying to carry the traffic of both the video decoder and the GPU at once. What about a live feed from a camera? Does that mean we need a wider bus? Could we just increase the bus frequency a bit? How does the area of the silicon vary for faster vs wider? Could we settle for either video or live camera input but not both?
Simulations will be run to try to confirm these choices before the design is finished and sent to the fab. Under-spec it and your SoC won't do all the cool things you had planned, or not with the planned battery life. Over-spec it and your SoC will be bigger (more expensive) and potentially less efficient than the competition, and customers will go elsewhere.
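The video-plus-GPU budgeting question above can be sketched as a back-of-envelope calculation: does the raw decoded video stream plus the GPU's own read traffic fit on one shared bus? All figures below are illustrative assumptions, not real SoC numbers.

```python
# Back-of-envelope bus budgeting, in the spirit described above.
# Every concrete number here is an assumed, illustrative figure.
def video_stream_gbps(width, height, fps, bytes_per_pixel=4):
    """Raw bandwidth of an uncompressed decoded frame stream, in GB/s."""
    return width * height * bytes_per_pixel * fps / 1e9

video = video_stream_gbps(3840, 2160, 60)   # 4K60 decoded frames ~ 1.99 GB/s
gpu_texture_traffic = 8.0                   # GB/s, assumed GPU read load
bus_capacity = 12.8                         # GB/s, assumed shared-bus limit

total = video + gpu_texture_traffic
print(f"video {video:.2f} GB/s, total {total:.2f} GB/s, "
      f"fits: {total <= bus_capacity}")
```

Runs like this are crude compared to the cycle-level simulations an SoC team would use, but they show why "wider bus vs. faster bus vs. time-slicing" is a real trade-off rather than an afterthought.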
BTW @KilroySmith, I'm curious about your source for "in general they provide a precompiled block" - that needs correcting.
(Score: 5, Insightful) by SomeGuy on Tuesday June 23 2020, @11:49AM (12 children)
I think this is going to finally kill Apple Mac.
Apple does not understand the power of legacy applications. There is a real need to write things once and let them run forever. This is why 32-bit Windows 10 still exists. This is why Microsoft Internet Explorer is still unremovably mixed into Windows 10. It's shit, but people need that shit.
They keep making the same mistake over and over. There are plenty of classic Mac OS 9 applications that will never run on Mac OS X. There are plenty of PPC Mac OS X applications that never got moved to Intel. Because they adopted x86 a bit too early, they wound up having to move from 32-bit to 64-bit and then dumped 32-bit support entirely, leaving 32-bit applications in the trash. From a software perspective, x86 64-bit is a fairly stable place to be, and that stability would benefit them, even if Intel drags their feet with the hardware sometimes.
The move from PPC to x86 was a disaster, but it was made up for by the fact that their users could now run Windows programs in parallel with MacOS or even just install Windows, without some slow emulator.
"Return of Rosetta"? Remember how long that lasted for PPC applications and how well that worked? I'd expect that to be clunky and they WILL pull the plug after a year or so.
I'd also fully expect ARM hardware to be locked down. If not that could at least be... interesting to play with. Otherwise, it is just a pathetic toy iPhone in the form of a laptop or desktop.
Sigh. I suppose most of their loyal consumertard sheep will still be happy, yet again, to throw away perfectly good hardware and discard useful software all for the latest fashion accessory. They can put that x86 mac in the closet next to their dusty old PPC mac. Meanwhile I surprise myself stumbling over new applications that occasionally still run under Windows XP, 98, or even 95.
(Score: 5, Insightful) by Subsentient on Tuesday June 23 2020, @11:56AM (9 children)
Apple has survived two other transitions so far, M68K to PPC and PPC to Intel. Now that Apple is worth $1.5 trillion, there's no reason to think the Mac won't survive this.
"It is no measure of health to be well adjusted to a profoundly sick society." -Jiddu Krishnamurti
(Score: 3, Insightful) by fyngyrz on Tuesday June 23 2020, @04:10PM (1 child)
Three — the recent 32-bit to 64-bit transition was also extremely disruptive.
--
Science. It's like religion. Except real.
(Score: 3, Touché) by jb on Wednesday June 24 2020, @06:45AM
Surely that should be at least four? Or aren't you counting the transition from the MOS6502 in the (pre-Macintosh) 8 bit Apples to the m68k in the Lisa and the early Macs?
Or to be pedantic, since you counted 32 to 64 bit as a transition, perhaps it should be five -- counting the transition to the 16bit 65C816 in the Apple //gs from the 6502s in the 8bit Apples? Or is that cheating because the //gs was released a few years later than the Lisa?
(Score: 2) by toddestan on Tuesday June 23 2020, @10:50PM (6 children)
Back when they made the M68K to PPC transition, and when they made the PPC to Intel transition, Apple was a computer company. It was even in the name: Apple Computer, Inc. So they had a pretty big vested interest in the transition being successful.
Today they are a phone company and an app store. It's been obvious for most of the last decade that they don't care about the Mac like they used to, which is why their line of Macs languishes and a new Mac often uses hardware that is already years out of date. That is to be expected, since they make their money off of their mobile platform. If the Mac disappeared tomorrow, Apple, Inc. would hardly notice.
This could very well be the end of the Mac. If users shun the new Macs, or if developers decide it's not worth going through another transition for an increasingly shrinking user base and just dump the platform entirely, I could see Apple quietly killing off their computer line in a few years. That is, assuming they don't finally complete the merger with the mobile product line, and a Mac becomes little more than an iPad with an attached keyboard.
(Score: 2) by looorg on Wednesday June 24 2020, @12:02AM (5 children)
I would think your last sentence there is actually the ultimate goal for them, and quite honestly for a lot of "computer" companies. Perhaps it's not all too bad? People who are into, or actually need, computers will still get computers, and everyone else will just get one of them laptop-mobile-tablet hybrids. Come to think of it, isn't that sort of what is already happening?
(Score: 2) by toddestan on Wednesday June 24 2020, @12:39AM (3 children)
I've also suspected that's their ultimate goal - to turn the Mac into the same kind of walled garden that the iPhone/iPad is. Getting a sweet 30% cut of every software sale on the Mac must be mighty tempting.
Anyone who wants or needs a computer will still buy a computer - it just won't be one with a fruit on it. Not that Apple has made a serious computer for some time now - the Mac Pro has been a joke for years.
(Score: 2) by Pino P on Sunday June 28 2020, @06:04PM (2 children)
Unless you develop software for a living or as a hobby. No fruit, no access to the market of Mac, iPhone, and iPad users. Even if you plan to deploy your application as a web application, you still need a Mac in order to debug the client side in Safari, last I checked.
(Score: 2) by toddestan on Monday June 29 2020, @01:49AM (1 child)
The Mac software market has been shrinking for some time now, and some of the big developers on Mac such as Adobe have switched from developing on the Mac and porting to Windows later to developing on Windows and releasing a Mac port later. It's a pretty good bet a lot of them are going to give some thought as to whether it's worth going through another transition.
As you allude to, the big market for Macs is to develop software for iOS, and will continue that way so long as Apple doesn't open up their platform (and iOS remains relevant). This could take a pretty big hit if Apple starts allowing for development to take place on iPads, as you could always just use an iPad + keyboard instead of Mac. Assuming that iPad + keyboard isn't just a Macbook anyway.
While I'm sure anyone developing websites cares about Safari on iOS, I kind of doubt it for Safari on the Mac. Sure, some will care, but I would guess a lot would test on iOS Safari and assume Mac Safari will work too for the small fraction of Mac users who actually use Safari. And if it doesn't, the answer will just be "install Chrome" anyway.
(Score: 2) by Pino P on Sunday July 05 2020, @02:10AM
The debugger and inspector tools for Safari on iOS are displayed inside Safari on macOS.
Chrome for iOS uses the same WKWebView control as Safari for iOS and thus has the same engine quirks.
(Score: 2) by Pino P on Sunday June 28 2020, @06:02PM
I fear that as this trend continues, the total global production of general-purpose home computers will shrink to below replacement. This would mean that "[p]eople that are into, or actually need, computers" might not be able to find one.
(Score: 1, Insightful) by Anonymous Coward on Tuesday June 23 2020, @02:27PM
I'm not sure I agree. I mean, sure, people who do real business work with their computers need that, and people who care about value in their purchases want that. But Apple's market is:
1) People who have to develop for iOS and therefore have to put up with whatever bullshit Apple wants to shove down their throat,
2) People doing video and graphic design who have been trained to pay for everything as a service anyway and are used to throwing everything away every couple of years, and
3) People who want to pay $4000 for a computer that has $500 worth of utility and $3500 worth of logo
I'm not sure backward compatibility is necessary for any of these groups.
(Score: 2) by TheRaven on Wednesday June 24 2020, @04:32PM
macOS Catalina has dropped support for 32-bit apps. They didn't ship 64-bit x86 processors until 2007. They have, at most, 13 years of backwards compatibility. They've also deprecated and then removed a load of APIs, so a lot of software from back then simply doesn't work. Practically, they have closer to 10 years of backwards binary compatibility. It doesn't seem to have hurt them.
sudo mod me up
(Score: 3, Insightful) by Subsentient on Tuesday June 23 2020, @11:54AM (3 children)
Apple was already dead to me ever since it became impossible to boot Linux on the new Mac Minis. Not like I'd have bought one before that though, but I'd probably have accepted it as a gift. Now, it's a doorstop to me.
Their new aarch64 Mac hardware will have locked bootloaders, just like their iPhones and iPads, and will be totally useless for anything other than macOS.
I'm really growing to hate Apple. They're so predictably evil.
"It is no measure of health to be well adjusted to a profoundly sick society." -Jiddu Krishnamurti
(Score: 0) by Anonymous Coward on Wednesday June 24 2020, @01:34PM
Right. If they announced unlocked boot loaders and published documentation for creating device drivers for their new hardware in Linux (or anything else), I would be excited for the change. As is, no thank you.
The only good news I see is that existing x86_64 Macs should depreciate rapidly in value. Maybe in a year or two I'll pick one up and put Linux on it.
(Score: 2) by TheRaven on Wednesday June 24 2020, @04:36PM (1 child)
When did they do that? A quick search gave me instructions for installing Linux on a latest generation Mac Mini. It was broken for a while about a decade ago because Apple shipped a UEFI implementation without legacy BIOS emulation, but Linux has been able to boot on pure-UEFI systems for a very long time.
sudo mod me up
(Score: 2) by toddestan on Monday June 29 2020, @02:08AM
It's the T2 security chip. If that's enabled, your Mac can only boot OSes with the right keys, which means macOS and Windows. Linux (and anything else) gets the middle finger from Apple. Apple does let you disable the T2 chip, but that also disables the internal storage, so you now have to install to an external drive. At which point you're now running Linux on your Mac Mini.
I'll leave it to the pedants as to whether this counts as installing Linux on a Mac Mini since it's not actually installed on the built-in internal storage.
(Score: -1, Troll) by Anonymous Coward on Tuesday June 23 2020, @05:20PM
The transition to ARM would be an excellent opportunity to setup [sting] the nosy cunts with NAWBO...
(Score: -1, Offtopic) by Anonymous Coward on Tuesday June 23 2020, @09:47PM
everything "apple", for me, is like news from a far away place or even another planet.
also, ref. the Foundation TV series: if the portrayed world has a "clean" and "tidy" and "adhd medication fueled" look then you know it's d00med.
only a portrayed world where there are extremes between high-tech and wealth and, in the same shot, (mutant?) rats, hawker stalls and garbage with leaky pipes and, yup, still exhaust steam, is "real" enough that even low-lifes survive ...
in the sleek and shaved and ADHD worlds one always wonders where people poop, get drunk and eat ... and who the heck builds all this "sleek"?
not to mention that an "apple world" really doesn't need much "psycho science" to predict much ...