Apple Will Reportedly Sell a New Mac Laptop With its Own Chips Next Year

posted by janrinok on Friday April 24 2020, @07:58AM   Printer-friendly
from the replacing-the-Apple's-core dept.

CNet:

Apple will start selling Macs that use in-house processors in 2021, based on ones in upcoming iPhones and iPad Pros, Bloomberg reported Thursday. The company is apparently working on three of its own chips, suggesting a transition away from traditional supplier Intel.

The initial batch of custom chips won't be on the same level as the Intel ones used in high-end Apple computers, so they're likely to debut in a new type of laptop, the report noted. These processors could have eight high-performance cores and at least four energy-efficient cores, respectively codenamed Firestorm and Icestorm.

Just another brick in the wall[ed garden]?


Original Submission

Related Stories

Apple Announces 2-Year Transition to ARM SoCs in Mac Desktops and Laptops 71 comments

Apple announces Mac architecture transition from Intel to its own ARM chips, offers emulation story

Apple has just announced its plans to switch from Intel CPUs in Macs to silicon of its own design, based on the ARM architecture. This means that Apple is now designing its own chips for iOS devices and its Mac desktop and laptops. Apple said it will ship its first ARM Mac before the end of the year, and complete the Intel -> ARM transition within two years.

Apple will bring industry-leading performance and performance-per-watt with its custom silicon. Apple's chips will combine a custom CPU, GPU, SSD controller and many other components. The Apple silicon will include the Neural Engine for machine learning applications.

[...] "Most apps will just work".

The Next Phase: Apple Lays Out Plans To Transition Macs from x86 to Apple SoCs

[From] an architecture standpoint, the timing of the transition is a bit of an odd one. As noted by our own Arm guru, Andrei Frumusanu, Arm is on the precipice of announcing the Arm v9 ISA, which will bring several notable additions to the ISA such as Scalable Vector Extension 2 (SVE2). So either Arm is about to announce v9, and Apple's A14 SoCs will be among the first to implement the new ISA, or Apple will be setting the baseline for macOS-on-Arm at v8.2 and its NEON extensions fairly late into the ISA's lifecycle. This will be something worth keeping an eye on.
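For readers who haven't met it: SVE2's headline feature over NEON is vector-length-agnostic code, where one binary adapts to whatever vector width the hardware provides. A minimal sketch using the Arm ACLE intrinsics (illustrative only, not from the article, and it needs an SVE-capable toolchain and target):

    #include <arm_sve.h>    /* Arm C Language Extensions (ACLE) intrinsics for SVE/SVE2 */
    #include <stdint.h>

    /* Vector-length-agnostic add: the same binary runs unchanged whether the
       hardware implements 128-bit, 256-bit, or wider SVE vectors.
       Build for an SVE target, e.g. -march=armv8-a+sve. */
    void add_f32(float *dst, const float *a, const float *b, int64_t n)
    {
        for (int64_t i = 0; i < n; i += svcntw()) {      /* svcntw() = floats per vector */
            svbool_t pg = svwhilelt_b32_s64(i, n);       /* predicate masks off the tail */
            svfloat32_t va = svld1_f32(pg, a + i);
            svfloat32_t vb = svld1_f32(pg, b + i);
            svst1_f32(pg, dst + i, svadd_f32_x(pg, va, vb));
        }
    }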

[...] [In] order to bridge the gap between Apple's current software ecosystem and where they want to be in a couple of years, Apple will once again be investing in a significant software compatibility layer in order to run current x86 applications on future Arm Macs. To be sure, Apple wants developers to recompile their applications to be native – and they are investing even more into the Xcode infrastructure to do just that – but some degree of x86 compatibility is still a necessity for now.

The cornerstone of this is the return of Rosetta, the PowerPC-to-x86 binary translation layer that Apple first used for the transition to x86 almost 15 years ago. Rosetta 2, as it's called, is designed to do the same thing for x86-to-Arm, translating x86 macOS binaries so that they can run on Arm Macs. Rosetta 2's principal mode of operation will be to translate binaries at install time.
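To make "translate binaries at install time" concrete, here is a toy sketch of ahead-of-time binary translation: walk the guest instruction stream once, emit equivalent host instructions, and cache the result so the cost is paid at install rather than on every launch. The opcodes are invented for illustration and bear no relation to real x86, Arm, or Rosetta 2 internals:

    #include <stdint.h>
    #include <stdlib.h>

    /* Toy ahead-of-time binary translator. A real translator (like Rosetta 2)
       must handle variable-length instructions, register mapping, flags, memory
       ordering, and JIT-generated code; this only shows the overall shape. */

    enum { GUEST_NOP = 0, GUEST_ADD = 1, GUEST_RET = 2 };   /* invented "x86-like" opcodes */
    enum { HOST_NOP = 10, HOST_ADD = 11, HOST_RET = 12 };   /* invented "Arm-like" opcodes */

    /* Translate the whole guest binary once, at install time. */
    uint8_t *translate(const uint8_t *guest, size_t n, size_t *out_n)
    {
        uint8_t *host = malloc(n);            /* 1:1 here; real translations may grow */
        if (!host) return NULL;
        for (size_t i = 0; i < n; i++) {
            switch (guest[i]) {
            case GUEST_NOP: host[i] = HOST_NOP; break;
            case GUEST_ADD: host[i] = HOST_ADD; break;
            case GUEST_RET: host[i] = HOST_RET; break;
            default:        host[i] = HOST_NOP; break;  /* unknown: trap/interpret in reality */
            }
        }
        *out_n = n;
        return host;                          /* cached on disk, reused on every launch */
    }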

See also: Apple Announces iOS 14 and iPadOS 14: An Overview
Apple's First ARM-Based (Mac) Product Is a Mac mini Featuring an A12Z Bionic, but Sadly, Regular Customers Can't Buy It

Previously: Apple Will Reportedly Sell a New Mac Laptop With its Own Chips Next Year


Original Submission

Apple Claims that its M1 SoC for ARM-Based Macs Uses the World's Fastest CPU Core 26 comments

Apple Announces The Apple Silicon M1: Ditching x86 - What to Expect, Based on A14

The new processor is called the Apple M1, the company's first SoC designed with Macs in mind. With four large performance cores, four efficiency cores, and an eight-core GPU, it features 16 billion transistors on a 5nm process node. Apple is starting a new SoC naming scheme for this new family of processors, but at least on paper it looks a lot like an A14X.

[...] Apple made mention that the M1 is a true SoC, including the functionality of what previously was several discrete chips inside of Mac laptops, such as I/O controllers and Apple's SSD and security controllers.

[...] Whilst in the past 5 years Intel has managed to increase their best single-thread performance by about 28%, Apple has managed to improve their designs by 198%, or 2.98x (let's call it 3x) the performance of the Apple A9 of late 2015.

[...] Apple has claimed that they will completely transition their whole consumer line-up to Apple Silicon within two years, which is an indicator that we'll be seeing a high-TDP many-core design to power a future Mac Pro. If the company is able to continue on their current performance trajectory, it will look extremely impressive.

ARM-Based Mac Pro Could Have 32+ Cores 29 comments

New report reveals Apple's roadmap for when each Mac will move to Apple Silicon

Citing sources close to Apple, a new report in Bloomberg outlines Apple's roadmap for moving the entire Mac lineup to the company's own custom-designed silicon, including both planned release windows for specific products and estimations as to how many performance CPU cores those products will have.

[...] New chips for the high-end MacBook Pro and iMac computers could have as many as 16 performance cores (the M1 has four). And the planned Mac Pro replacement could have as many as 32. The report is careful to clarify that Apple could, for one reason or another, choose to only release Macs with 8 or 12 cores at first but that the company is working on chip variants with the higher core count, in any case.

The report reveals two other tidbits. First, a direct relative to the M1 will power new iPad Pro models due to be introduced next year, and second, the faster M1 successors for the MacBook Pro and desktop computers will also feature more GPU cores for graphics processing—specifically, 16 or 32 cores. Further, Apple is working on "pricier graphics upgrades with 64 and 128 dedicated cores aimed at its highest-end machines" for 2022 or late 2021.

New Mac models could have additional efficiency cores alongside 8/12/16/32 performance cores. Bloomberg claimed the existence of a 12-core design (8 performance "Firestorm" cores, 4 efficiency "Icestorm" cores) back in April, which has not materialized yet.

The Apple M1 SoC has 8 GPU cores.

Previously: Apple Announces 2-Year Transition to ARM SoCs in Mac Desktops and Laptops
Apple Has Built its Own Mac Graphics Processors
Apple Claims that its M1 SoC for ARM-Based Macs Uses the World's Fastest CPU Core
Your New Apple Computer Isn't Yours
Linus Torvalds Doubts Linux will Get Ported to Apple M1 Hardware


Original Submission

  • (Score: 5, Insightful) by Anonymous Coward on Friday April 24 2020, @08:51AM (17 children)

    by Anonymous Coward on Friday April 24 2020, @08:51AM (#986424)

    This is the fourth processor ecosystem transition they have done: the 68000 to PowerPC, the PowerPC to x86, the x86 to x86-64, and now x86-64 to ARM. Each one worked with relative ease, as old software would still run on the new machines and new software ran on the old ones. Sure, different mechanisms were required for each direction to work and there were some hiccups, but most end users probably wouldn't have really noticed them due to the firm deadlines known in advance. Beautiful illustration of planning and execution.

    Although, I do wonder if they remembered the old lessons and have the right engineers at the helm to pull it off this time. Like launching a manned rocket to space, it isn't exactly easy, despite looking so when done by people who know what they are doing, and success can easily breed institutional overconfidence, complacency, and carelessness.

    • (Score: 3, Interesting) by Mojibake Tengu on Friday April 24 2020, @09:42AM (14 children)

      by Mojibake Tengu (8598) on Friday April 24 2020, @09:42AM (#986429) Journal

      While I modded you up, I still consider the transition of Macs to ARM a significant regression.

      That makes me consider future Apple devices as just unimpressive multimedia toys, not universal computers.
      I may continue to buy some, as I usually pick a dedicated device for a specific task only, free of the maintenance burden, but the awesomeness of their ecosystem went away with Jobs long ago.

      --
      Respect Authorities. Know your social status. Woke responsibly.
      • (Score: 2) by takyon on Friday April 24 2020, @11:07AM (12 children)

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday April 24 2020, @11:07AM (#986430) Journal

        That makes me consider future Apple devices as just unimpressive multimedia toys, not universal computers.

        That is a bizarre take. There are billions of ARM devices, there are ARM server chips, etc. Apple already makes some of the best-performing ARM SoCs, and they can certainly make ARM chips that perform better than some of the low-end Intel x86 chips that were in the cheaper Macs. The 12-core design described in the article will be powerful, and will use more watts per core (at least for the Firestorm portion) than Apple's previous designs.

        If it's about the software, there have been articles signalling this move for years now. Apple has a lot of money to throw around, and developers will target these chips and be able to hit iPad Pro, etc. at the same time.

        Transition will be gradual, start with less-powerful computers

        There are some weak quad-core Intel CPUs in the lineup, and even a dual-core in the MacBook Air.

        Apple is exploring Mac processors with more than 12 cores for further in the future, the people said.

        Yes, they could even challenge Xeon/Threadripper-based systems eventually.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 5, Interesting) by Mojibake Tengu on Friday April 24 2020, @12:30PM (11 children)

          by Mojibake Tengu (8598) on Friday April 24 2020, @12:30PM (#986444) Journal

          I don't think so. The weakest point of the whole ARM franchise is inferior memory bandwidth. Always has been. It is not so obvious in toys, but it will be clearly noticeable in work devices.

          Apple has already slipped into the cult of low power at all costs, and that's what drives (and cripples) their designs. Each new one is worse than the previous. This dogma of low-powered endpoints complements the dogma of central clouds. And the goal of this dual dogma is: make user devices as powerless as possible, and dependent on services.

          The sheer count of processors is no metric for comparing platforms.
          I am looking forward to AMD64 phones, not ARM notebooks.

          --
          Respect Authorities. Know your social status. Woke responsibly.
          • (Score: 1, Interesting) by Anonymous Coward on Friday April 24 2020, @01:04PM

            by Anonymous Coward on Friday April 24 2020, @01:04PM (#986455)

            Do you have a source for that? What about the ARM servers we're starting to see in AWS?

            In certain benchmarks, the iPad Pro is pretty close to the MacBooks. Isn't that all the evidence needed?

          • (Score: 2, Flamebait) by epitaxial on Friday April 24 2020, @03:11PM (2 children)

            by epitaxial (3165) on Friday April 24 2020, @03:11PM (#986498)

            You're so full of shit your eyes are brown. The cheapest iPhone now has a faster processor than the most expensive Android.

            • (Score: 2) by DannyB on Friday April 24 2020, @04:46PM

              by DannyB (5839) Subscriber Badge on Friday April 24 2020, @04:46PM (#986547) Journal

              Disclaimer: somewhat of an android fanboy here

              While I do not dispute that Apple may have faster processors than the most expensive android . . .

              According to the final frame of this animated chart [youtube.com], Android has about 8.5x the market share of Apple. (BTW, watching how mobile operating systems changed from the '90s until 2019 is very interesting and amusing.)

              I was a fan of WebOS at one point (I think about 2009) but then it became clear to me that Android would absolutely win. If you watch that animated chart, as soon as Android appears, it rapidly sweeps away everything else.

              Why?

              Here is an insight.

              In my younger days I was on the flip side. I was a true card-carrying loyal Apple fanboy and developer. And quite smug. After all Mac was clearly superior in every way to PCs with DOS and Windows 3.1. Yet the PC ecosystem was vastly bigger.

              Apple didn't license Mac OS to other hardware manufacturers until it was too late. When they did, they realized that those other hardware makers could drastically underprice Apple, something Steve Jobs had refused to believe was possible. After all, who could possibly make a Mac more inexpensively than Apple?

              Consider this. If you were going to start making computers in the 1980's, what OS would you go with? What choices were there? Oh, yeah. There was DOS / Windows. Apple wasn't going to license Mac OS to you.

              In 2007 with the iPhone, I saw the exact same thing. Superior product. Artificially high prices. No licensing to third parties. Like the Mac, only one design. The one true way. Period.

              Meanwhile, in the Android world (like the PC world before it), devices came in every size, shape, style, color, feature set, and price point. It was so obvious that a blind man could see it: Android would win.

              Now, all that said: how long will it be before flagship Android devices get more powerful processors? It is a myth (one that I once partly believed) that Apple and their engineers somehow have magical powers that nobody else can replicate.

              --
              The lower I set my standards the more accomplishments I have.
            • (Score: 2) by DannyB on Friday April 24 2020, @04:49PM

              by DannyB (5839) Subscriber Badge on Friday April 24 2020, @04:49PM (#986554) Journal

              Also, you get a +1 Informative

              --
              The lower I set my standards the more accomplishments I have.
          • (Score: 2) by DannyB on Friday April 24 2020, @04:49PM (3 children)

            by DannyB (5839) Subscriber Badge on Friday April 24 2020, @04:49PM (#986552) Journal

            Is there some fundamental reason that the ARM architecture cannot be fabricated with faster switches and higher voltages, generate more heat, have more parallelism or certainly more cores per chip, have higher memory bandwidth, etc.?

            I hear about ARM designs in powerful servers.

            --
            The lower I set my standards the more accomplishments I have.
            • (Score: 2) by Mojibake Tengu on Friday April 24 2020, @07:16PM (2 children)

              by Mojibake Tengu (8598) on Friday April 24 2020, @07:16PM (#986649) Journal

              Yes, there is a fundamental reason. ARM has nothing like AMD's HyperTransport, or rather its superset, the Infinity Fabric architecture. A true breakthrough, a new tech level. It cannot be imitated correctly even by Intel with all their money pile, let alone poor ARM.

              --
              Respect Authorities. Know your social status. Woke responsibly.
              • (Score: 2) by DannyB on Friday April 24 2020, @07:51PM

                by DannyB (5839) Subscriber Badge on Friday April 24 2020, @07:51PM (#986656) Journal

                That is interesting information about AMD. Thank you.

                So what about comparing ARM / RISC-V with Intel? What would prevent building high-performance, power-hungry, room-heating processors?

                --
                The lower I set my standards the more accomplishments I have.
              • (Score: 2) by takyon on Friday April 24 2020, @09:36PM

                by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday April 24 2020, @09:36PM (#986696) Journal

                A high-speed interconnect doesn't cure everything, it will be imitated, and Apple is not replacing AMD CPUs, they are replacing Intel CPUs. Apple also has a much bigger money pile than ARM (SoftBank).

                --
                [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2, Interesting) by petecox on Saturday April 25 2020, @12:20AM (1 child)

            by petecox (3228) on Saturday April 25 2020, @12:20AM (#986772)

            I am looking forward to AMD64 phones

            Tried and failed: Intel wanted to cram an Atom into a phone, and it turned out there was no market. If there's ever a challenger to ARM on mobile, think RISC-V (and we're still perhaps half a decade away).

            • (Score: 0) by Anonymous Coward on Saturday April 25 2020, @11:37AM

              by Anonymous Coward on Saturday April 25 2020, @11:37AM (#986885)

              AMD could use RISC-V as well. It would be interesting.

          • (Score: 3, Informative) by Alphatool on Saturday April 25 2020, @11:38AM

            by Alphatool (1145) on Saturday April 25 2020, @11:38AM (#986886)

            Memory bandwidth isn't an inherent problem with the ARM architecture. While there are chips out there with limited memory bandwidth, there are also ARM chips with great memory bandwidth, like the Fujitsu A64FX [anandtech.com]. If Apple wants, it is more than capable of making an A-series chip with huge memory bandwidth.

      • (Score: 3, Insightful) by epitaxial on Friday April 24 2020, @12:06PM

        by epitaxial (3165) on Friday April 24 2020, @12:06PM (#986437)

        What makes you think ARM is less useful than x64?

    • (Score: 4, Interesting) by JoeMerchant on Friday April 24 2020, @01:43PM (1 child)

      by JoeMerchant (3937) on Friday April 24 2020, @01:43PM (#986462)

      Each one worked with relative ease

      That depends on the slant of your fanboi fedora. I inherited my dad's cast-off PowerPC Mac Mini in 2005, and it quickly turned into a paperweight, while the 2006 Intel-based Mac Minis are still relatively useful today. I suppose if your perspective is: "I always upgrade to the latest hardware within 12 months after Apple releases it" then, sure, that transition was painless.

      I do applaud the move to low-power ARM-based PCs. I put it in the "it's about time" category: web browsers, word processors, spreadsheets, and PowerPoints don't really need more than ARM compute power, and the electrical power savings are very significant and appropriate in mobile form factors, including laptops.

      Intel chips continue to have more compute power, but the percentage of Mac (and PC) users who need that power more than a cool running, long lasting, small battery in their mobile device is probably lower than the percentage of people who are going to die of COVID-19. If you're among the heavy compute load minority, sure: buy your $5000 lap-burners if that's what you're into. Personally, I offload my heavy compute tasks to an i7 desktop box and use lightweight laptops and NUCs for the majority of my actual human-computer interface work.

      --
      🌻🌻 [google.com]
      • (Score: 0) by Anonymous Coward on Friday April 24 2020, @07:53PM

        by Anonymous Coward on Friday April 24 2020, @07:53PM (#986658)

        That depends on the slant of your fanboi fedora.

        Well, relative does mean relative, after all. Besides, the transition worked much better than some cold swap. They released Intel computers in 2005 but still released updates for PPC OS X into 2009, and for their official software after that. You could take a single installer or package and have it work on both systems, more or less transparently. A five-year transition period like that is pretty good for a change so fundamental to the operation of the ecosystem.

  • (Score: 4, Interesting) by pkrasimirov on Friday April 24 2020, @11:19AM (2 children)

    by pkrasimirov (3358) Subscriber Badge on Friday April 24 2020, @11:19AM (#986432)

    As a consumer I approve -- the more active CPU platforms there are, the more options I have. And it serves their agenda too: now all hardware/software will be incompatible with the "legacy products" and people are expected to pay (again). My feeling is that engineering was unhappy with Spectre and sales gladly chimed in for the reason above.

    Eventually the companies will understand that the only way to ensure proper quality in huge systems is transparency, i.e. open source. Meanwhile, let them build their wall.

    • (Score: 2) by JoeMerchant on Friday April 24 2020, @01:47PM

      by JoeMerchant (3937) on Friday April 24 2020, @01:47PM (#986466)

      Eventually the companies will understand that the only way to ensure proper quality in huge systems is transparency, i.e. open source.

      What companies are those? I work for a big corp with proper quality, and when it comes to adopting open source they're melting about as fast as the Greenland glacier was 100 years ago. Maybe in another 100 years you'll start to notice some changes.

      --
      🌻🌻 [google.com]
    • (Score: 2) by meustrus on Friday April 24 2020, @05:35PM

      by meustrus (4961) on Friday April 24 2020, @05:35PM (#986587)

      Practically speaking, I can't see how this move makes Macs more closed than they were before.

      Hardware upgrades are non-existent outside of the highest-end product, the Mac Pro. Macs have always had "blessed hardware", and the current Mac Pro is no different; it's even more closed off than ever before. You aren't going to be upgrading any Apple product with non-Apple-approved parts with current tech any more than you are with an ARM-based CPU.

      Apps for them are developed in Xcode. Xcode will surely support cross-compiling to all targets in "fat bundles", as it did during previous transitions (PowerPC->x86 & x86->x64); a rough sketch of what that looks like at the source level is below.
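      (The sketch assumes nothing about Apple's actual build setup: one code base with small per-architecture islands guarded by compiler-defined macros, compiled once per slice. On macOS, something like clang -arch x86_64 -arch arm64 combines the slices into a single universal binary, and the loader picks the matching slice at run time.)

          #include <stdio.h>

          /* One source file, built once per architecture slice of a fat/universal binary.
             __x86_64__ and __aarch64__ are the macros clang defines for each target. */
          static const char *current_arch(void)
          {
          #if defined(__x86_64__)
              return "x86_64";                 /* Intel slice */
          #elif defined(__aarch64__) || defined(__arm64__)
              return "arm64";                  /* Apple silicon slice */
          #else
              return "unknown";
          #endif
          }

          int main(void)
          {
              printf("running the %s slice\n", current_arch());
              return 0;
          }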

      I think the only practical downside here is that Intel-based Macs will see support for new software dry up in a few years. At that point, your only upgrade path for the old hardware will be installing Linux. But that is an option.

      Oh, and some software will never be updated for the new architecture. But that happens all the time. It's not limited to architecture changes, although it always happens when they do. It's not even limited to Apple products, although it is more frequent with them.

      And sure, free software is a better model. But "open source" in the enterprise is not really the gold standard here. If you really want quality, then what you need is to be part of a decentralized network of pro bono free software and hardware developers. What you need is a democratic and meritocratic process to select the best systems designers to lead the network. What you need is iterative improvements to the quality of the abstractions everyone else relies upon, so that the size of the network can scale without requiring each new developer to understand the same percentage of its growing complexity.

      None of that is going to happen in the corporate world, because the corporate world will always be trying to keep its users out of its design process. If the user is capable of modifying the software, the user is capable of designing a competitor to the software. No one will pay for a product once they know how to build their own.

      --
      If there isn't at least one reference or primary source, it's not +1 Informative. Maybe the underused +1 Interesting?
  • (Score: 3, Informative) by takyon on Friday April 24 2020, @12:20PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday April 24 2020, @12:20PM (#986440) Journal

    Some SBCs use big.LITTLE (the Raspberry Pi is a notable holdout). Many mobile devices, some laptops, and the iPad Pro use it.

    Even Intel seems to be getting in on the act with the 5-core Lakefield as well as rumored 8+8 core CPUs:

    Rumor | Alder Lake-S with 16 cores could see Intel bringing big.LITTLE to the desktop, 80-150W TDP and PCIe Gen4 support in tow [notebookcheck.net]
    Intel Alder Lake-S 16-Core Could Bring Hybrid Architecture (big.Little) to Desktop [tomshardware.com]

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 3, Insightful) by DannyB on Friday April 24 2020, @05:17PM (3 children)

    by DannyB (5839) Subscriber Badge on Friday April 24 2020, @05:17PM (#986570) Journal

    I remember thinking, around 2005, that a day was coming when Microsoft's best days would be behind it. In 2007 Ballmer laughed at the iPhone.

    Twelve years earlier, Microsoft had almost missed the internet revolution, blinded by the "desktop monopoly" goggles so that they almost didn't see the potential of web applications. Thus the first browser wars. And IIS vs Apache. And FrontPage. ActiveX. Later XAML/Silverlight as a Flash replacement.

    Microsoft didn't see the netbooks coming in about 2007-2008. Just in time, they killed off netbooks by resurrecting XP for a while.

    Microsoft then entirely missed the mobile device revolution. Smartphones and tablets.

    Microsoft totally did not get Chromebooks and the value they bring: a cheap, secure platform that does what 80% of users need. Others had missed that idea earlier because wireless connectivity was not yet so common.

    Meanwhile, Linux has come to dominate everything that is NOT a desktop PC or laptop. Linux is in everything around us. I could start enumerating things like smart TVs, billions of android phones, routers, and much more.

    Now Intel seems to be stumbling. Maybe it's their highly integrated design and fabrication? I don't pretend to know.

    I've never liked Intel's architecture. Layers upon layers of cruft added over nearly five decades (if you count the 4004/8008 as predecessors). Reading early BYTE magazines, I gather that the original 8080 didn't even have relative branch instructions (I would be happy to be corrected), making relocatable code impossible, or at best difficult.

    Then the 8086/8088 segment registers. For decades, DOS and Windows systems were hobbled by the limitations of 64 K segments. How many compilers had weird restrictions on 64 K arrays, 64 K code segments, etc.? And the justification for this unholy abomination? So that the 8086 could be source-code compatible with the 8080. (A quick sketch of the addressing arithmetic is below.)
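    (For anyone who never had the pleasure: a real-mode address was formed as segment * 16 + offset, with a 16-bit offset, which is exactly where the 64 K limits came from. A quick illustrative sketch of the arithmetic:)

        #include <stdint.h>
        #include <stdio.h>

        /* 8086 real-mode addressing: physical = segment * 16 + offset.
           The 16-bit offset is why a single segment spans at most 64 KiB,
           and 20 address lines capped the whole machine at 1 MiB. */
        static uint32_t real_mode_address(uint16_t segment, uint16_t offset)
        {
            return ((uint32_t)segment << 4) + offset;
        }

        int main(void)
        {
            /* Different segment:offset pairs can alias the same physical byte. */
            printf("0x1234:0x0005 -> 0x%05X\n", (unsigned)real_mode_address(0x1234, 0x0005)); /* 0x12345 */
            printf("0x1000:0x2345 -> 0x%05X\n", (unsigned)real_mode_address(0x1000, 0x2345)); /* 0x12345 */
            return 0;
        }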

    Now our PC processors have management engines? Really? That's entirely crazy insane! Processors once merely executed code starting at power up.

    Intel's Atom tried to compete effectively with ARM as it took over the exploding mobile device market (and by "exploding" I don't mean Samsung Galaxy Note 7s).

    As with Linux, and I think it is interrelated, ARM and other RISC processors are in lots of everyday things around us, even things you don't think of as having a computer: a TV, or even a monitor, printer, digital camera, etc.

    How long before someone fabricates powerful ARM or RISC-V chips for high-performance servers and desktops?

    An amusing thing about Linux and Chrome OS laptops: if the underlying processor changes, the entire Linux ecosystem comes along to the new processor. All of the Debian packages, etc. Just look at the Raspberry Pi 4.

    As (or if) Intel declines, the biggest loser is Microsoft. Microsoft has tried to introduce Windows ARM devices. The failure is that people expect the Windows brand name to mean that all their legacy junk will run. That they won't have to re-purchase every single software package that they own, and that the software vendors won't price-gouge for the change. And how many legacy Windows programs are deeply tied to Intel?

    I assert that the entire value proposition of Windows is in large part tied to Intel and legacy software. Maybe they will sink together, just as "Wintel" was the term that described their ascendancy.

    --
    The lower I set my standards the more accomplishments I have.
    • (Score: 0) by Anonymous Coward on Friday April 24 2020, @08:07PM

      by Anonymous Coward on Friday April 24 2020, @08:07PM (#986665)

      In all seriousness, they should have called their ARM OS "Doors." The commercials are easy, people wouldn't have the inherent sense of backward compatibility, they'd have been more used to the idea of a walled garden, and their enterprise sector could have vacuumed up schools like Chromebooks are now.

    • (Score: 1) by petecox on Saturday April 25 2020, @12:33AM (1 child)

      by petecox (3228) on Saturday April 25 2020, @12:33AM (#986775)

      Microsoft has tried to introduce Windows ARM devices. The failure is that people expect the Windows brand name to mean that all their legacy junk will run. That they won't have to re-purchase every single software package that they own, and that the software vendors won't price-gouge for the change. And how many legacy Windows programs are deeply tied to Intel?

      Intel Mac software won't run natively on ARM either. But MS, with Qualcomm, built in emulation, just as Apple did from 68k -> PPC -> x86, when Adobe wanted everybody to pay for a new architecture for Photoshop.

      I haven't tried a Surface Pro X, but I suspect Microsoft learned from their failed WinRT experiments and now have a two-year head start on MacARM64.

      Or are you suggesting that the checkbox in Visual Studio to compile for ARM64 is somehow different from the checkbox in Xcode to compile for ARM64?

      • (Score: 3, Insightful) by DannyB on Saturday April 25 2020, @05:24PM

        by DannyB (5839) Subscriber Badge on Saturday April 25 2020, @05:24PM (#987018) Journal

        I totally understand what you say, but it is irrelevant. That only applies to NEW software.

        And "new" also means newly compiled old software -- which the end user will have to re-purchase in many cases. Do you really think Adobe is not going to charge you again for an ARM-compiled Photoshop?

        IMO vast amounts of legacy Windows software have x86 dependencies that make it non-trivial to port to ARM or other architectures. (A hypothetical example of what I mean is below.)
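        (Concretely, if hypothetically: anything hand-optimized with x86 SSE intrinsics simply won't compile for ARM until someone rewrites the hot paths against NEON or a portability shim. The function below is invented for illustration.)

            #include <stddef.h>
            #if defined(__x86_64__)
            #include <xmmintrin.h>               /* SSE intrinsics: x86 only */
            #elif defined(__aarch64__)
            #include <arm_neon.h>                /* NEON intrinsics: the ARM rewrite */
            #endif

            /* Scale a float array in place; n is assumed to be a multiple of 4. */
            void scale4(float *v, float s, size_t n)
            {
            #if defined(__x86_64__)
                __m128 vs = _mm_set1_ps(s);
                for (size_t i = 0; i < n; i += 4)
                    _mm_storeu_ps(v + i, _mm_mul_ps(_mm_loadu_ps(v + i), vs));
            #elif defined(__aarch64__)
                float32x4_t vs = vdupq_n_f32(s);
                for (size_t i = 0; i < n; i += 4)
                    vst1q_f32(v + i, vmulq_f32(vld1q_f32(v + i), vs));
            #else
                for (size_t i = 0; i < n; i++)   /* portable (slower) fallback */
                    v[i] *= s;
            #endif
            }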

        --
        The lower I set my standards the more accomplishments I have.