
SoylentNews is people

posted by Fnord666 on Monday November 30 2020, @12:01AM   Printer-friendly
from the what-would-Linus-do? dept.

Linus Torvalds doubts Linux will get ported to Apple M1 hardware:

In a recent post on the Real World Technologies forum—one of the few public internet venues Linux founder Linus Torvalds is known to regularly visit—a user named Paul asked Torvalds, "What do you think of the new Apple laptop?"

If you've been living under a rock for the last few weeks, Apple released new versions of the Macbook Air, Macbook Pro, and Mac Mini featuring a brand-new processor—the Apple M1.

The M1 processor is a successor to the A12 and A14 Bionic CPUs used in iPhones and iPads, and pairs the battery and thermal efficiency of ultramobile designs with the high performance needed to compete strongly in the laptop and desktop world.

"I'd absolutely love to have one, if it just ran Linux," Torvalds replied. "I've been waiting for an ARM laptop that can run Linux for a long time. The new [Macbook] Air would be almost perfect, except for the OS."

[...] In an interview with ZDNet, Torvalds expounded on the problem:

The main problem with the M1 for me is the GPU and other devices around it, because that's likely what would hold me off using it because it wouldn't have any Linux support unless Apple opens up... [that] seems unlikely, but hey, you can always hope.

[...] It's also worth noting that while the M1 is unabashedly great, it's not the final word in desktop or laptop System on Chip designs. Torvalds mentions that, given a choice, he'd prefer more and higher-power cores—which is certainly possible and seems a likely request to be granted soon.

Previously: Apple's New ARM-Based Macs Won't Support Windows Through Boot Camp
Apple Claims that its M1 SoC for ARM-Based Macs Uses the World's Fastest CPU Core
Your New Apple Computer Isn't Yours


Original Submission

Related Stories

Apple’s New ARM-Based Macs Won’t Support Windows Through Boot Camp 42 comments

Apple's New ARM-Based Macs Won't Support Windows Through Boot Camp:

Apple will start switching its Macs to its own ARM-based processors later this year, but you won't be able to run Windows in Boot Camp mode on them. Microsoft only licenses Windows 10 on ARM to PC makers to preinstall on new hardware, and the company hasn't made copies of the operating system available for anyone to license or freely install.

"Microsoft only licenses Windows 10 on ARM to OEMs," says a Microsoft spokesperson in a statement to The Verge. We asked Microsoft if it plans to change this policy to allow Windows 10 on ARM-based Macs, and the company says "we have nothing further to share at this time."

[...] Apple later confirmed it's not planning to support Boot Camp on ARM-based Macs in a Daring Fireball podcast. "We're not direct booting an alternate operating system," says Craig Federighi, Apple's senior vice president of software engineering. "Purely virtualization is the route. These hypervisors can be very efficient, so the need to direct boot shouldn't really be the concern."

Previously: Apple Announces 2-Year Transition to ARM SoCs in Mac Desktops and Laptops


Original Submission

Apple Claims that its M1 SoC for ARM-Based Macs Uses the World's Fastest CPU Core 26 comments

Apple Announces The Apple Silicon M1: Ditching x86 - What to Expect, Based on A14

The new processor is called the Apple M1, the company's first SoC designed with Macs in mind. With four large performance cores, four efficiency cores, and an 8-core GPU, it features 16 billion transistors on a 5nm process node. Apple is starting a new SoC naming scheme for this family of processors, but at least on paper it looks a lot like an A14X.

[...] Apple made mention that the M1 is a true SoC, including the functionality of what previously was several discrete chips inside of Mac laptops, such as I/O controllers and Apple's SSD and security controllers.

[...] Whilst in the past 5 years Intel has managed to increase their best single-thread performance by about 28%, Apple has managed to improve their designs by 198%, or 2.98x (let's call it 3x) the performance of the Apple A9 of late 2015.

[...] Apple has claimed that they will completely transition their whole consumer line-up to Apple Silicon within two years, which is an indicator that we'll be seeing a high-TDP many-core design to power a future Mac Pro. If the company is able to continue on their current performance trajectory, it will look extremely impressive.

Your New Apple Computer Isn't Yours 133 comments

Your Computer Isn't Yours:

On modern versions of macOS, you simply can't power on your computer, launch a text editor or eBook reader, and write or read, without a log of your activity being transmitted and stored.

It turns out that in the current version of the macOS, the OS sends to Apple a hash (unique identifier) of each and every program you run, when you run it. Lots of people didn't realize this, because it's silent and invisible and it fails instantly and gracefully when you're offline, but today the server got really slow and it didn't hit the fail-fast code path, and everyone's apps failed to open if they were connected to the internet.

Because it does this using the internet, the server sees your IP, of course, and knows what time the request came in. An IP address allows for coarse, city-level and ISP-level geolocation, and allows for a table that has the following headings: Date, Time, Computer, ISP, City, State, Application Hash

Apple (or anyone else) can, of course, calculate these hashes for common programs: everything in the App Store, the Creative Cloud, Tor Browser, cracking or reverse engineering tools, whatever.

This means that Apple knows when you're at home. When you're at work. What apps you open there, and how often. They know when you open Premiere over at a friend's house on their Wi-Fi, and they know when you open Tor Browser in a hotel on a trip to another city.
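The "hash (unique identifier)" described above is just a cryptographic digest of some bytes. A minimal sketch of computing one over a file, plus a mock log row using the headings listed in the article (all names and values here are invented for illustration; this is not a description of Apple's actual mechanism):

```python
import hashlib
import os
import tempfile

def app_hash(path):
    """SHA-256 digest of a file's bytes, read in chunks.

    Illustrative stand-in for the per-application identifier
    described in the article; not Apple's actual mechanism.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hash a small temporary file standing in for an app binary.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"not a real binary")
    demo_path = f.name

# A mock server-side log row with the headings from the article
# (every value below is made up).
row = {
    "Date": "2020-11-12",
    "Time": "18:02:11",
    "Computer": "203.0.113.7",
    "ISP": "ExampleNet",
    "City": "Portland",
    "State": "OR",
    "Application Hash": app_hash(demo_path),
}
os.unlink(demo_path)
```

The key property is that the same program always yields the same digest, so anyone who has pre-computed hashes of common binaries can map the logged value back to an application name.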

ARM-Based Mac Pro Could Have 32+ Cores 29 comments

New report reveals Apple's roadmap for when each Mac will move to Apple Silicon

Citing sources close to Apple, a new report in Bloomberg outlines Apple's roadmap for moving the entire Mac lineup to the company's own custom-designed silicon, including both planned release windows for specific products and estimations as to how many performance CPU cores those products will have.

[...] New chips for the high-end MacBook Pro and iMac computers could have as many as 16 performance cores (the M1 has four). And the planned Mac Pro replacement could have as many as 32. The report is careful to clarify that Apple could, for one reason or another, choose to only release Macs with 8 or 12 cores at first but that the company is working on chip variants with the higher core count, in any case.

The report reveals two other tidbits. First, a direct relative to the M1 will power new iPad Pro models due to be introduced next year, and second, the faster M1 successors for the MacBook Pro and desktop computers will also feature more GPU cores for graphics processing—specifically, 16 or 32 cores. Further, Apple is working on "pricier graphics upgrades with 64 and 128 dedicated cores aimed at its highest-end machines" for 2022 or late 2021.

New Mac models could have additional efficiency cores alongside 8/12/16/32 performance cores. Bloomberg claimed the existence of a 12-core SoC (8 performance "Firestorm" cores, 4 efficiency "Icestorm" cores) back in April, which has not materialized yet.

The Apple M1 SoC has 8 GPU cores.

Previously: Apple Announces 2-Year Transition to ARM SoCs in Mac Desktops and Laptops
Apple Has Built its Own Mac Graphics Processors
Apple Claims that its M1 SoC for ARM-Based Macs Uses the World's Fastest CPU Core
Your New Apple Computer Isn't Yours
Linus Torvalds Doubts Linux will Get Ported to Apple M1 Hardware


Original Submission

Linus Torvalds On The Importance Of ECC RAM, Calls Out Intel's "Bad Policies" Over ECC 99 comments

Linus Torvalds On The Importance Of ECC RAM, Calls Out Intel's "Bad Policies" Over ECC

There's nothing quite like some fun holiday-weekend reading as a fiery mailing list post by Linus Torvalds. The Linux creator is out with one of his classical messages, which this time is arguing over the importance of ECC memory and his opinion on how Intel's "bad policies" and market segmentation have made ECC memory less widespread.

Linus argues that error-correcting code (ECC) memory "absolutely matters" but that "Intel has been instrumental in killing the whole ECC industry with it's horribly bad market segmentation... Intel has been detrimental to the whole industry and to users because of their bad and misguided policies wrt ECC. Seriously...The arguments against ECC were always complete and utter garbage... Now even the memory manufacturers are starting [to] do ECC internally because they finally owned up to the fact that they absolutely have to. And the memory manufacturers claim it's because of economics and lower power. And they are lying bastards - let me once again point to row-hammer about how those problems have existed for several generations already, but these f*ckers happily sold broken hardware to consumers and claimed it was an "attack", when it always was "we're cutting corners"."

Ian Cutress from AnandTech points out in a reply that AMD's Ryzen ECC support is not as solid as believed.
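For readers unfamiliar with what ECC actually does: the classic textbook example is the Hamming(7,4) code, which stores 4 data bits in 7 and can locate and flip back any single corrupted bit. A minimal sketch of that textbook code (real ECC DIMMs use wider single-error-correct, double-error-detect codes over 64-bit words, but the principle is the same):

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword (positions 1-7).

    Parity bits sit at positions 1, 2 and 4; each covers the
    positions whose binary index has the corresponding bit set.
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one flipped bit, then return the 4 data bits.

    The recomputed parity checks form a 3-bit syndrome whose value
    is the 1-based position of the corrupted bit (0 means no error).
    """
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    c = list(c)
    if syndrome:               # non-zero: flip the indicated bit back
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

A rowhammer-style bit flip in the stored codeword is silently repaired on read; without the extra parity bits, the corrupted data would simply be returned as-is, which is Torvalds' complaint about non-ECC memory.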

Related: Linus Torvalds: 'I'm Not a Programmer Anymore'
Linus Torvalds Rejects "Beyond Stupid" Intel Security Patch From Amazon Web Services
Linus Torvalds: Don't Hide Rust in Linux Kernel; Death to AVX-512
Linus Torvalds Doubts Linux will Get Ported to Apple M1 Hardware


Original Submission

Booting Linux and Sideloading Apps on M1 Macs 31 comments

Initial Patches Posted for Bringing up Linux Kernel on Apple Silicon M1 Hardware

It was over the weekend that Corellium began posting their work on booting Linux on the Apple M1. That work has now progressed to the point that they can get Ubuntu's Raspberry Pi ARMv8 desktop image booting on Apple M1 hardware to a GUI, albeit without any hardware acceleration. The Apple M1 graphics support will remain the big elephant in the room given the big challenges involved in bringing up an entirely new OpenGL/Vulkan driver stack and needing to carry out all of that reverse engineering first under macOS.

Apple M1 Open-Source GPU Bring-Up Sees An Early Triangle

The open-source/Linux Apple M1 work continues to be quite busy this week... The latest is Alyssa Rosenzweig, who has been working on reverse-engineering the M1 graphics processor and has been able to write some early and primitive code for rendering a triangle.

Alyssa Rosenzweig of Panfrost fame has been working to reverse engineer the Apple M1 graphics as part of the Asahi Linux effort with developer Marcan.

This week the milestone was reached of drawing a triangle using the open-source code. It's an important first milestone, but keep in mind that this isn't an initial driver triangle but rather hand-written vertex and fragment shaders with machine code for the M1 GPU. Those hand-written shaders are submitted to the hardware via the existing macOS IOKit kernel driver. To be clear, this was done on macOS, not under the early Linux port.

Previously: Your New Apple Computer Isn't Yours
Linus Torvalds Doubts Linux will Get Ported to Apple M1 Hardware
ARM-Based Mac Pro Could Have 32+ Cores

Apple Pulls the Plug on User-Found Method to Sideload iOS Apps on M1 Mac

  • (Score: 2) by takyon on Monday November 30 2020, @12:35AM (2 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday November 30 2020, @12:35AM (#1082168) Journal

    Now that Apple has proven ARM's value in the performance as well as the budget space, we broadly expect competing systems using high-end Snapdragon and similar processors to enter the market within the next few years. Such systems wouldn't need to beat—or even match—the M1's standout performance; they'd simply need to compete strongly with more traditional x86_64 systems on performance and price while dominating them in power consumption and thermal efficiency.

    It's also worth noting that while the M1 is unabashedly great, it's not the final word in desktop or laptop System on Chip designs. Torvalds mentions that, given a choice, he'd prefer more and higher-power cores—which is certainly possible and seems a likely request to be granted soon.

    96-128 core Neoverse CPUs [anandtech.com] are one thing. Maybe we'll also see regular ARM chips with more clusters. There's nothing stopping anybody from putting out a 16-core with a mix of Cortex-X1 [wikipedia.org]/X2, A78/A79/A80 [cnx-software.com], A55/A??. It might even be able to fit in a tablet or smartphone.

    In the meantime, the RK3588 looks pretty good [cnx-software.com] and supports double the RAM of the M1.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 4, Interesting) by Anonymous Coward on Monday November 30 2020, @02:46AM (1 child)

      by Anonymous Coward on Monday November 30 2020, @02:46AM (#1082206)

      I don't agree.

      Apple can do this because they control the platform top to bottom. The PC world is compromised by having too many vested interests, all of which are only interested in things that give them more control, and all of which therefore are hated by users who are completely happy with x86 and don't want things that are just there to screw them. There might be enough of a market for a Linux-only laptop running ARM, but it needs to be well designed, not just a proof of concept.

      ARM isn't, after all, some inherently superior alternative. It's just different. It has a heritage in low power devices so it's good at being low power. It's just barely catching up in performance, just like Intel/AMD are just barely catching up in power consumption.

      This is kind of like when Apple switched to PowerPC and everyone said x86 was dead. It's RISC! That's the future! Intel will never keep up! And then it didn't happen.

      ARM will probably continue to make inroads in servers, where power consumption is more important than software compatibility and usually more important than performance, and Microsoft can't get in everyone's way. Among laptops, what will happen is that Intel or AMD will come out with an ultra low power x86 that sips power and it'll be fine. And Apple will change architectures again in ten years like they always do (1975: 6502, 1985: 68K, 1995: PPC, 2005: x86, 2020: ARM). If anything, Apple is five years late this time.

      • (Score: 2) by takyon on Monday November 30 2020, @03:16AM

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday November 30 2020, @03:16AM (#1082221) Journal

        What part of my comment don't you agree with?

        I don't think x86 will die soon, and x86 and ARM will coexist for a long time to come. ARM is already very prevalent in HTPCs, SBCs (many capable of acting as desktops), smartphones, and tablets. It is in laptops [theverge.com], moreso in Chromebooks, but AMD x86 is coming back strong in both. Servers might be one of the worst markets for ARM because of resistance to change, but at least hyperscalers like Amazon can add their own ARM CPUs and rent them out, with low commitment from the renter.

        I just think it's interesting that smartphones, tablets, SBCs, etc. all pretty much end at 8 cores (MediaTek did produce some 10-12 core Helio SoCs for smartphones, but have pulled back), even though DynamIQ (the new big.LITTLE) allows up to 32 clusters with 8 cores each. Bloomberg reported in April that Apple was working on a 12-core SoC, and I believe they are probably planning to add even more cores to replace Xeon-based Mac Pros. So my guess is that other companies will start to boost core counts for things intended to be plugged into the wall, even if they still lag behind Apple A/M chips in single-threaded performance and other areas.

        If Apple does decide to change architectures again, I think it would be a response to monolithic 3D chip development that increases performance by orders of magnitude. That would be a good time to throw everything out, go in a completely new direction, and emulate as needed. But they have the perpetual license and now a huge ecosystem of products using ARM.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 0) by Anonymous Coward on Monday November 30 2020, @12:57AM (11 children)

    by Anonymous Coward on Monday November 30 2020, @12:57AM (#1082171)

    Well maybe OS X will get ported to Pinebooks instead.

    • (Score: 2) by takyon on Monday November 30 2020, @01:14AM (10 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday November 30 2020, @01:14AM (#1082176) Journal
      • (Score: 2) by PartTimeZombie on Monday November 30 2020, @01:26AM (5 children)

        by PartTimeZombie (4827) on Monday November 30 2020, @01:26AM (#1082179)

        I never thought I would ever see Mac OS 8 again. It seems weird anyone would have nostalgia for it, because it was awful.

        • (Score: 1, Funny) by Anonymous Coward on Monday November 30 2020, @01:35AM (4 children)

          by Anonymous Coward on Monday November 30 2020, @01:35AM (#1082183)

          The Amiga will rise again!

          ... because all the other OSes will be so compromised and full of adware and malware, that the few knowledgeable users will retreat to the last known good OS.

          Well I can dream.

          • (Score: 2) by looorg on Monday November 30 2020, @02:23AM (3 children)

            by looorg (578) on Monday November 30 2020, @02:23AM (#1082191)

            It's worth contemplating. Perhaps not Amiga OS, even tho I love it I'm not sure I would want to go back to it as a main and core everyday OS.

            But I'm thinking for my parents if I should get them something old like Mac OS 6 or 7. That said I have them somewhat fooled already -- they still think they are running IE (after all they click on the big blue E icon) but they have been running Pale Moon for years now. If I could only easily inject the Google logo on DDG I would switch them over to that too. I tried once but after the calls about how there was something wrong with Google and the Internet I had to switch them back.

            But in general, if all that is needed is to do some light surfing, check your email, and perhaps every now and then type some documents, there is no need for new and shiny. Problem with a lot of the old systems tho is a lack of modern connectivity. Finding a network card can be a bit of a challenge, and a tcp/ip stack since it didn't come with one from the start. I think Miami is still OK for the Amiga, AmiTCP might also probably be ok, but I'm not sure if they have been updated to the latest standards and/or support IPv6 etc. I guess the browsers might be an issue since they might show a lot of blanks and a lot of messages about upgrading the browser cause it doesn't support their new design etc.

            • (Score: 2) by PartTimeZombie on Monday November 30 2020, @02:47AM (2 children)

              by PartTimeZombie (4827) on Monday November 30 2020, @02:47AM (#1082208)

              From what I remember of the networking in Mac OS 8, it was using AppleTalk (because it's Apple, so it's better, right?), which of course it wasn't. It would shit the bed pretty much every day, usually during some file copy operation, requiring a restart (again).

              My memories of that time are exclusively of waiting for my Mac to restart. But of course we were smugly sure that what we were using was so much better than a PC.

              There was a TCP/IP stack for Apple before OSX but I think that came a bit later.

              • (Score: 2) by EEMac on Monday November 30 2020, @05:14PM

                by EEMac (6423) on Monday November 30 2020, @05:14PM (#1082435)

                > My memories of that time are exclusively of waiting for my Mac to restart. But of course we were smugly sure that what we were using was so much better than a PC.

                You're not wrong about the restarts! But remember Mac OS 8 competed against Windows 95, which was also famous for crashing. Windows people suffered along with us.

                It was roughly 2000-2001 (Windows 2000, OS X, a few really dedicated Linux/BSD people) that microcomputers got really stable.

              • (Score: 2) by Rich on Tuesday December 01 2020, @09:22AM

                by Rich (945) on Tuesday December 01 2020, @09:22AM (#1082772) Journal

                Mac OS 8 still did support AppleTalk on pre-iMac machinery, but by then you'd mostly use it to access printers. While slow, and upsetting the cooperative multitasking, it had the nice property of full network auto-discovery. "Proper" networking was done through the OpenTransport stack by then, which was pretty much what Solaris used. From 8.6 on, there was the Nanokernel, which could do some neat multiprocessing. Unfortunately, the overall infrastructure was held back by too much crap that had piled up, and some really bad mistakes made, when the PPC was introduced (particularly the way of low-memory-accesses), so you'd still have applications take down the whole system.

                Yet, on the surface, 8.6 was the finest OS ever made. Only people with pathological Windows conditioning disease or command line autism can say otherwise. Jobs then messed up the perfection in 9 with the Sherlock 2 search.

      • (Score: 2) by Runaway1956 on Monday November 30 2020, @03:01AM (3 children)

        by Runaway1956 (2926) Subscriber Badge on Monday November 30 2020, @03:01AM (#1082215) Journal

        Yay! I win the Oregon Trail, with something over 5000 points! I didn't figure on playing that game again, LOL!

        • (Score: -1, Flamebait) by Anonymous Coward on Monday November 30 2020, @05:30AM (2 children)

          by Anonymous Coward on Monday November 30 2020, @05:30AM (#1082261)

          No, you are dying of dysentery, and Trump Loss Syndrome, and early onset Old Timer's disease. So just set yerself out on the porch, porch honkey, with a Mint Julep, and wait for your inevitable end. Seems the Grandkids, or them what survives, are not going to miss Asshole Grandpa that much.

          • (Score: 2) by Runaway1956 on Monday November 30 2020, @05:57AM (1 child)

            by Runaway1956 (2926) Subscriber Badge on Monday November 30 2020, @05:57AM (#1082281) Journal

            You sound like you would enjoy that mint julep more than I would. I've never had one, never seen one, don't even know what goes into it besides "mint". WTF is a julep? You pick them off a julep tree? No, I don't even want to know, so don't bother. Knowing how to make haggis would probably be more useful, and I don't really plan on eating sheep guts.

            • (Score: 2) by rleigh on Monday November 30 2020, @07:40AM

              by rleigh (4887) on Monday November 30 2020, @07:40AM (#1082306) Homepage

              Haggis is delicious, you don't know what you're missing!

  • (Score: 3) by Runaway1956 on Monday November 30 2020, @01:05AM (9 children)

    by Runaway1956 (2926) Subscriber Badge on Monday November 30 2020, @01:05AM (#1082174) Journal

    With his doubts, I doubt that Torvalds will be doing the porting. That doesn't mean that someone won't do it. Linus' objections are good reasons not to bother, but some people like crazy stupid challenges. That's how I got my copy of Darwin's drawings, after all, accepting crazy stupid challenges!

  • (Score: 1, Interesting) by Anonymous Coward on Monday November 30 2020, @03:38AM (9 children)

    by Anonymous Coward on Monday November 30 2020, @03:38AM (#1082228)

    "I'd absolutely love to have one, if it just ran Linux," Torvalds replied. "I've been waiting for an ARM laptop that can run Linux for a long time.

    What is my Lenovo tablet running? My android phone? My Raspberry Pi 4? The problem is not the ARM, the problem is Microsoft (who cannot seem to get Windows to run on ARM), and Apple, who are trying to lock down the machines to prevent what they all know would be the better solution. This is not a technical problem, it is a captured market problem.

    • (Score: 1, Touché) by Anonymous Coward on Monday November 30 2020, @05:08AM (1 child)

      by Anonymous Coward on Monday November 30 2020, @05:08AM (#1082255)

      Friend, you sound like a communist. Why do you hate Freedom?

      • (Score: 1, Funny) by Anonymous Coward on Monday November 30 2020, @05:35AM

        by Anonymous Coward on Monday November 30 2020, @05:35AM (#1082264)

        Because, Comrade, freedom isn't free! It is only given to us by a bunch of people stupid enough to join the military and fight to preserve the rights of the 1%! Or by the people, united, who will never be defeated. Which side are you on, which side are you on? We are taking this whole system down. Apples gonna fall, Windows gonna break, and ARM laptops running free and open software is only the beginning!

    • (Score: 2) by kazzie on Monday November 30 2020, @12:28PM (1 child)

      by kazzie (5309) Subscriber Badge on Monday November 30 2020, @12:28PM (#1082341)

      To paraphrase: "Because laptops are built to run Windows, and Microsoft haven't mastered ARM, nobody makes decent ARM laptops for Linux porters to target." Have I got that right?

      • (Score: 2) by DECbot on Monday November 30 2020, @09:34PM

        by DECbot (832) on Monday November 30 2020, @09:34PM (#1082563) Journal

        Yes, though, technically you can target the Arm Chromebooks, and some have successfully, but who wants that eGarbage as their daily driver? I'd rephrase it to say, if the laptop isn't designed for a developer, then the general Linux community of porters won't target it.

        --
        cats~$ sudo chown -R us /home/base
    • (Score: 3, Informative) by theluggage on Monday November 30 2020, @03:14PM (4 children)

      by theluggage (1797) on Monday November 30 2020, @03:14PM (#1082381)

      It's not so much about ARM, as about why 'regular' Linux and other *nix systems have never taken off as personal computer/desktop operating systems. Shifting to ARM (or RISC-V or Whatever) would be much less of a big deal if everybody was already using x86 *nix.

      Microsoft is certainly part of the problem, but as you point out, Linux (the kernel) has succeeded in the form of Android and Chromebooks. Linux has also carved out a significant slice of the server and scientific computing market against stiff competition from MS. Raspberry Pi etc. are a huge hit with tinkerers and even Apple successfully switched to Unix 20 years ago... Meanwhile, Microsoft has gone through 2-3 big crises - Vista, Windows 8, initial hate for Windows 10 - which should have been ideal opportunities for *nix to muscle in.

      So maybe, just maybe, there's something about *nix that people just don't like in a "desktop" operating system?

      Like... er... the desktop experience? Because one thing that the *nix successes - Android, Chromebook, Mac, server - have in common is that they've either rolled their own desktop GUI system and applications, or just don't need a GUI much. Meanwhile, the scientific computing/CS community can use *nix's various X11-descended GUI systems as their designers intended: running 8 copies of vim in translucent windows floating over a shot from the Hubble space telescope...

      It's not that there haven't been great strides in making *nix more user-friendly than it has been in the past (hell, I haven't had to re-compile the kernel for a year or two...) and there is certainly a mass of really powerful *nix application software - and a Linux desktop is perfectly usable - but if an OS is going to displace an incumbent like Windows or MacOS, "perfectly usable" won't cut it - it needs to be better.

      I've used Linux a lot as a server for doing web development stuff, home servers, messing about with Raspberry Pi, making MythTV/TVHeadend PVRs and it's great for that - but doing that via SSH, file sharing and - maybe now - remote development tools in VS Code etc. is fine by me - there's nothing that makes me want to switch to Linux as a main desktop, and when I am confronted with a Gnome/KDE/Mate/whatever (that's part of the problem) desktop I usually just head straight to the terminal. As for applications I know I could switch - all the tools are there - but when you get onto graphics, IDE, Wordprocessing etc. the proprietary tools are just smoother, slicker and more pleasant to use... even when you eschew MS, Adobe etc. in favour of newer (cheaper) Mac/Windows alternatives like Affinity... and, yes, I've used Inkscape and appreciate its power but it's like kicking a dead whale along a beach c.f. Affinity... yes, I actually sat down and used Open/LibreOffice to write a thesis, and got quite familiar with it, but it was, at best, swings and roundabouts c.f. MS Word... the take-home lesson from that was wishing I'd just taken the time to re-learn LaTeX, and hang all the GUI stuff.

      ...and that's really my point: Linux is great as a minimal OS for server stuff, embedded, number crunching etc. but after a few decades of trying and failing to catch up with Windows and Mac in the point-n-drool stakes, is there any point to Linux as a desktop OS other than "because freedom"? Most of the key open source applications have Windows and/or MacOS native builds and- for server/embedded development or any other time you want a minimal sandbox - there's always virtualisation.

      • (Score: 0) by Anonymous Coward on Monday November 30 2020, @08:07PM

        by Anonymous Coward on Monday November 30 2020, @08:07PM (#1082516)

        FOSS apps need to quit worrying about cross-platform and just focus on GNU+Linux. Cross-platform is only desirable, IMO, when enabling an app principally built for an EnslaveOS to work on a FreedomOS. Small dev teams need to quit wasting all their dev effort pimping their projects out for willing users of enemy platforms.

      • (Score: 2) by DECbot on Monday November 30 2020, @09:46PM

        by DECbot (832) on Monday November 30 2020, @09:46PM (#1082570) Journal

        One thing that keeps Arm from really succeeding for general computing is there isn't a standard for determining what hardware is baked into the chip like there is with x86. That's why when you go to a distribution--especially phone distributions--you have your x86 image, arm64 image, and armv7, aarch64, and Raspberry Pi, and RockPro64, and, and, and. Each chip must have a custom-crafted image because of the differences in the silicon, how boot is handled by each device, and so forth. There's not a simple universal bootloader and minimal kernel that gets the system up to start probing devices and such that we take for granted in the x86/arm64 space.
         
        Check out this podcast episode [linuxunplugged.com] if you want to hear more. I won't pretend to remember it all, but a standard on how to boot and recognize Arm architecture would really benefit general OSes like Linux, BSD, and even Microsoft. It would unlock Arm from the system builders' tight control over their hardware. That's probably a good reason why we don't have it now, and I doubt we will see it any time soon given how NVIDIA keeps their own drivers mostly private.

        --
        cats~$ sudo chown -R us /home/base
      • (Score: 0) by Anonymous Coward on Monday November 30 2020, @10:40PM

        by Anonymous Coward on Monday November 30 2020, @10:40PM (#1082591)

        Focusing on the desktop is stupid, because the desktop paradigm is stupid. I don't use a DE, just a window manager, because the DE is just training wheels for the typewriter crowd.

      • (Score: 2) by Pino P on Monday November 30 2020, @11:18PM

        by Pino P (4721) on Monday November 30 2020, @11:18PM (#1082596) Journal

        after a few decades of trying and failing to catch up with Windows and Mac in the point-n-drool stakes, is there any point to Linux as a desktop OS other than "because freedom"?

        In my experience, Xfce desktop on GNU/Linux has better I/O performance, faster process startup time, and lower RAM use than Windows 10 on the same computer. These add up to less disk thrashing during boot and login, less disk thrashing during use, and more work getting done.

        Most of the key open source applications have Windows and/or MacOS native builds and - for server/embedded development or any other time you want a minimal sandbox - there's always virtualisation.

        Virtualization also implies more RAM use to fit the host and guest in RAM at once. Once you've added the highest-capacity RAM module your laptop can take, you're out of luck. Plus the guest OS has to wait on the host OS's pokey I/O.

  • (Score: 2) by theluggage on Monday November 30 2020, @03:56PM

    by theluggage (1797) on Monday November 30 2020, @03:56PM (#1082399)

    Just to point out that although Linux won't (and may never) boot "bare metal" on M1 Macs, ARM Linux was shown running via virtualisation back in June when Apple announced "Apple Silicon". Parallels Desktop (popular MacOS hypervisor) for M1 is in beta and launching Real Soon Now.
    There have also been proof-of-concept demos of ARM64 Linux running in a minimal VM using MacOS's built-in Hypervisor framework - so HyperKit, and hence Docker for Mac and probably Canonical's Multipass, shouldn't be too far behind.

    So, for a lot of people who want Linux VMs/containers for web/server-side development, that was always going to be the preferred solution anyway. Then, the good folk in the Homebrew and MacPorts communities are beavering away to get all the usual free/open source suspects building on M1 MacOS (remember: MacOS = Unix with a BSD userland).

    Meanwhile, it's worth remembering that the whole point of Apple Silicon is vertical integration - and a lot of the impressive performance of the M1 could be down to the CPU, GPU, SSD controller and other on-chip gubbins being tailored to run MacOS, and vice versa, including Apple's "Metal" graphics framework and probably their own APFS filesystem. There's even talk of the M1 having custom features to accelerate x86 code translated by Rosetta 2... Obviously, it is to Apple's advantage, going forward, if they don't have to maintain a "stable" hardware interface for the CPU, GPU etc. and can tweak both hardware and OS drivers in tandem without breaking third-party OSs. So, if you want a "new standard platform" for Linux-based personal computers, M1 is probably not the horse to hitch your wagon to.

    There's more to a "platform" than just the processor ISA. It may be that - if we are witnessing the days of the "Wintel" IBM PC-descended platform coming to a middle - we won't see another such "generic" hardware platform now it's within the grasp of any moderate "Tech Giant" to roll their own system-on-a-chip. Apple have done it. Amazon have done it (for AWS rather than retail). Microsoft could do it. NVIDIA probably plans to do it...

    Remember that the IBM PC was only ever "open" by mainframe industry standards (i.e. third parties were graciously allowed to write their own software and make expansion cards) and was supposed to have a proprietary IBM lock-in in the form of the copyrighted BIOS firmware. The PC clone boom only started when someone found a legal end-run around the copyright by "clean room" reverse-engineering the BIOS. If Oracle and co. have their wicked way over copyrighting APIs, that won't be legally possible in the future, so we might see a lot of closed, vertically-integrated systems. The dancing on Intel's grave may be short lived...

  • (Score: 2) by DannyB on Monday November 30 2020, @05:11PM (1 child)

    by DannyB (5839) Subscriber Badge on Monday November 30 2020, @05:11PM (#1082433) Journal

    A good first step would be to get Linux to run on ARM processors.

    Then we could see Linux on things like mobile phones, tablets, and Raspberry Pis.

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 3, Interesting) by Pino P on Monday November 30 2020, @11:25PM

      by Pino P (4721) on Monday November 30 2020, @11:25PM (#1082598) Journal

      Good luck using GCC or Clang when /home is mounted noexec.

      Mobile phones and tablets run fairly locked-down operating systems that favor reliability and security over flexibility. To hinder malware, for example, mobile operating systems tend to block writing, building, and running an application on the device itself. Instead of docking the device to an external keyboard and display, a developer is expected to use a separate desktop or laptop computer to write and cross-build an application, then push it to the device to run. See, for example, how Android 10's noexec home is making Termux less practical to use [github.com].
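      To check whether this bites on a given system, you can look up which mount options cover a path; a minimal sketch (Linux-only, relying on /proc/mounts; `mount_options` is my own helper name, not a standard API):

```python
# Hedged sketch: report the mount options covering a path, so you can
# see whether (e.g.) your home directory is mounted noexec.
import os

def mount_options(path):
    """Return the option list of the longest-prefix mount point covering path."""
    path = os.path.realpath(path)
    best_mnt, best_opts = "", []
    with open("/proc/mounts") as f:
        for line in f:
            # fields: device, mount point, fstype, options, dump, pass
            _, mnt, _, options = line.split()[:4]
            # match the mount point itself or any path beneath it
            if path == mnt or path.startswith(mnt.rstrip("/") + "/"):
                if len(mnt) > len(best_mnt):
                    best_mnt, best_opts = mnt, options.split(",")
    return best_opts

if __name__ == "__main__":
    opts = mount_options(os.path.expanduser("~"))
    print("noexec" if "noexec" in opts else "exec permitted")
```

      On a stock desktop distro this usually reports "exec permitted" for $HOME; on an Android 10 device (where app data lives under a noexec mount) a compiler has nowhere writable to drop a binary it can then run.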

  • (Score: 0) by Anonymous Coward on Monday November 30 2020, @08:10PM

    by Anonymous Coward on Monday November 30 2020, @08:10PM (#1082519)

    a mapple laptop "near perfect"? yeah, right. maybe for a whore. fuck apple and arm.
