posted by Fnord666 on Monday February 25 2019, @10:11AM
from the the-word-according-to-linus dept.

https://www.realworldtech.com/forum/?threadid=183440&curpostid=183486

Guys, do you really not understand why x86 took over the server market?

It wasn't just all price. It was literally this "develop at home" issue. Thousands of small companies ended up having random small internal workloads where it was easy to just get a random whitebox PC and run some silly small thing on it yourself. Then as the workload expanded, it became a "real server". And then once that thing expanded, suddenly it made a whole lot of sense to let somebody else manage the hardware and hosting, and the cloud took over.

Do you really not understand? This isn't rocket science. This isn't some made up story. This is literally what happened, and what killed all the RISC vendors, and made x86 be the undisputed king of the hill of servers, to the point where everybody else is just a rounding error. Something that sounded entirely fictional a couple of decades ago.

Without a development platform, ARM in the server space is never going to make it. Trying to sell a 64-bit "hyperscaling" model is idiotic, when you don't have customers and you don't have workloads because you never sold the small cheap box that got the whole market started in the first place.

Submitted via IRC for Bytram

Linus Torvalds pulls pin, tosses in grenade: x86 won, forget about Arm in server CPUs, says Linux kernel supremo

Channeling the late Steve Jobs, Linux kernel king Linus Torvalds this week dismissed cross-platform efforts to support his contention that Arm-compatible processors will never dominate the server market.

Responding to interest in Arm's announcement of its data center-oriented Neoverse N1 and E1 CPU cores on Wednesday, and a jibe about his affinity for native x86 development, Torvalds abandoned his commitment to civil discourse and did his best to dampen enthusiasm for a world of heterogeneous hardware harmony.

"Some people think that 'the cloud' means that the instruction set doesn't matter," Torvalds said in a forum post. "Develop at home, deploy in the cloud. That's bullshit. If you develop on x86, then you're going to want to deploy on x86, because you'll be able to run what you test 'at home' (and by 'at home' I don't mean literally in your home, but in your work environment)."

For Torvalds, this supposedly unavoidable preference for hardware architecture homogeneity means technical types will gladly pay more for x86 cloud hosting, if only for the assurance that software tested in a local environment performs the same way in the data center.

Jobs during his time as Apple's CEO took a similar stance toward native application development, going so far as to ban Adobe's Flash technology on devices running iOS in 2010. For Jobs, cross-platform code represented a competitive threat, bugs, and settling for lowest-common-denominator apps.


Original Submission

 
  • (Score: 5, Interesting) by bobthecimmerian on Monday February 25 2019, @12:01PM (11 children)

    by bobthecimmerian (6834) on Monday February 25 2019, @12:01PM (#806280)

    The Raspberry Pi and devices like it are fine pieces of hardware, but I'd hate to have one as my primary development environment.

    But I think Torvalds is downplaying an important aspect of historical context: when x86 took over the data center, I suspect (but admittedly can't prove) that far more data-center workloads were written in C and C++, which made the headaches of cross-compilation significant. Today, if you're writing your server-side code in PHP, Python, Node.js, C#, Java, or any of the other .NET and JVM languages, your developers don't have to worry about cross-compilation at all. The people working in C, C++, Swift, Go, D, and Rust still care, but the other group is large enough that I think ARM should be able to establish a foothold.

  • (Score: 5, Interesting) by DannyB on Monday February 25 2019, @03:22PM (8 children)

    by DannyB (5839) Subscriber Badge on Monday February 25 2019, @03:22PM (#806322) Journal

    Thank you.

    As a Java developer, I was going to bring up that changing OSes, or even architectures, would have very little effect. I could develop on Windows on x86 and deploy on Linux on ARM.

    There are some things in Java that use native libraries, but even those are packaged for the various OS/architecture combinations. The only one I can think of at the moment is for serial I/O, which is not something the vast majority of Java applications even do.
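
    To make that concrete, here is a minimal sketch (the "serialio" library name is made up for illustration): the same JAR can ask the JVM what platform it landed on, and the rare native dependency is resolved from those same properties.

        public class PlatformProbe {
            public static void main(String[] args) {
                // The same bytecode prints different values on Windows/x86
                // versus Linux/ARM -- no recompile needed.
                System.out.println("os.name = " + System.getProperty("os.name"));
                System.out.println("os.arch = " + System.getProperty("os.arch"));

                // How a native dependency would be resolved per platform:
                // "serialio.dll" on Windows, "libserialio.so" on Linux.
                System.out.println(System.mapLibraryName("serialio"));
            }
        }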

    Back in 2014, my boss's boss's boss came into my office, handed me a Raspberry Pi (a Model 1 with 512 MB) and said, "Here, do something cool with this and let me know what you do." I very quickly got my commercial web application running on it, while still accessing an MS SQL Server in my office (for various definitions of "run" -- it was slow). Because of memory constraints, I replaced Tomcat with Jetty, but it all worked. I don't think I spent more than a couple of hours on it, after a couple of initial hours learning a bit about the device: how to create a bootable SD card, configure it, and use SSH, SCP, and VNC to do remote deployment and configuration.

    But that was a real example of moving a good-sized commercial Java web application from Windows to Linux and from x86-64 to ARM, without even recompiling my binaries (e.g., the Java "JAR" file).
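
    In case anyone wants to try the same thing, the Tomcat-to-Jetty swap is only a few lines of embedding code. This is a rough sketch against the Jetty 9 API of that era, not my exact setup, and "myapp.war" is a placeholder path:

        import org.eclipse.jetty.server.Server;
        import org.eclipse.jetty.webapp.WebAppContext;

        public class EmbeddedMain {
            public static void main(String[] args) throws Exception {
                // A bare Server on port 8080 has a much smaller footprint
                // than a full Tomcat install.
                Server server = new Server(8080);

                WebAppContext webapp = new WebAppContext();
                webapp.setContextPath("/");
                webapp.setWar("myapp.war"); // placeholder: path to the existing WAR

                server.setHandler(webapp);
                server.start();
                server.join(); // block until the server is stopped
            }
        }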

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 4, Interesting) by HiThere on Monday February 25 2019, @05:36PM (4 children)

      by HiThere (866) Subscriber Badge on Monday February 25 2019, @05:36PM (#806426) Journal

      Yes, but for *me* and *my application*, Java is a non-starter. I wish this weren't true, as I'd prefer choices besides Python and Ruby, but all the Java serialization approaches concentrate on portability at the expense of compact data. Whoops!

      Actually, C++ and D, and even Go, would be reasonable choices, but Python's a lot faster to develop in. Java would be nearly as fast if it didn't discriminate so against 64-bit integers and UTF-8 (or even if it just handled UTF-32 well). Vala looks good, but it's too tied to the GNOME system. Objective-C has all of its documentation specific to the Mac. So it's going to be Python, with a goal of eventually translating the code into C++ or D.

      (FWIW, the main problem is lists "arbitrarily" embedded into lists as the data structure. Lisp would be the reasonable approach, but it seems essentially a dead language, and one can point to lots of reasons why. Clojure inherits problems from Java, and also makes all data immutable, instead of only externally visible data being immutable. Erlang would require all the mutable data to be stored in a hash table or database. Etc.)

      Then there are the problems with available documentation systems. Fortunately Doxygen will work with Python, because Sphinx is terrible, and when they converted epydoc into PyDoc they made it so garish as to be unreadable. (It also generates truly horrible HTML; for one example, just look at the white-space handling.)

      I should also consider Fortran, but I haven't found any good text on modern Fortran, specifically one with a focus on concurrency; I've got a couple of bad ones. And I did consider Ada... but though it's got a few good features that don't exist in any other language I've seen, it's got so many bad features that I dropped it. (Look into writing a binary file with a header record different from the rest of the contents. It actually *is* doable, but what a pain! And to call its way of documenting your code terrible would be to over-praise it dramatically.)
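
      To make the serialization complaint concrete, here is a rough sketch of the kind of thing I mean (the class is made up, and the exact byte counts depend on the JVM): default Java serialization of a boxed list versus writing the same values as raw binary.

          import java.io.ByteArrayOutputStream;
          import java.io.DataOutputStream;
          import java.io.IOException;
          import java.io.ObjectOutputStream;
          import java.util.ArrayList;
          import java.util.List;

          public class SerializationSize {
              public static void main(String[] args) throws IOException {
                  // A boxed list -- the shape you get from lists nested in lists.
                  List<Long> values = new ArrayList<>();
                  for (long i = 0; i < 10_000; i++) values.add(i);

                  // Default Java serialization: portable and self-describing, but bulky.
                  ByteArrayOutputStream portable = new ByteArrayOutputStream();
                  try (ObjectOutputStream out = new ObjectOutputStream(portable)) {
                      out.writeObject(values);
                  }

                  // Hand-rolled binary: a length plus 8 bytes per value, nothing else.
                  ByteArrayOutputStream compact = new ByteArrayOutputStream();
                  try (DataOutputStream out = new DataOutputStream(compact)) {
                      out.writeInt(values.size());
                      for (long v : values) out.writeLong(v);
                  }

                  System.out.println("ObjectOutputStream: " + portable.size() + " bytes");
                  System.out.println("DataOutputStream:   " + compact.size() + " bytes");
              }
          }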

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 2) by DannyB on Monday February 25 2019, @06:00PM

        by DannyB (5839) Subscriber Badge on Monday February 25 2019, @06:00PM (#806454) Journal

        Wow. As you point out, there are LOTS of choices, all with pros and cons. If there were one perfect choice, everyone would use it. I picked Java long ago because it seemed, and has been, a very good fit for building big-code-base web applications, particularly because of the refactoring tools in modern IDEs.

        I still like Lisp even though, as you say, it is a dead language. Yet there is so much you learn by using it. I guess like Latin. :-) And there are a lot of fun things you can do in Lisp and related languages. Just for pure fun.

        I don't know what you're trying to do, so I don't know specifically why Java is unsuitable, but it sounds like you've really understood your particular problems and workable choices.

        --
        To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
      • (Score: 2) by bobthecimmerian on Wednesday February 27 2019, @11:52AM (2 children)

        by bobthecimmerian (6834) on Wednesday February 27 2019, @11:52AM (#807541)

        Since I'm a Clojure fan, I'll point out that Clojure uses immutability by default but gives you ways to work with mutable data. I suspect it still doesn't fit, though: if Java won't work, it would be weird if an alternative built on top of Java would. That said, if I recall correctly, Clojure syntax makes handling 64-bit uints (or the equivalent) easier than Java does; with Java, the BigInteger syntax gets clunky in a hurry. ClojureScript could also be an option if you go that route, I suppose.
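
        As a sketch of what I mean by clunky (a made-up example, written from memory), compare Java 8's unsigned-long helpers with the BigInteger route:

            import java.math.BigInteger;

            public class Unsigned64 {
                public static void main(String[] args) {
                    // Java 8's escape hatch: keep the bits in a signed long and
                    // use the unsigned helpers for the operations that differ.
                    long max = Long.parseUnsignedLong("18446744073709551615"); // 2^64 - 1
                    System.out.println(Long.toUnsignedString(max));
                    System.out.println(Long.divideUnsigned(max, 7L));
                    System.out.println(Long.compareUnsigned(max, 1L) > 0);

                    // BigInteger works at any width, but every operator
                    // becomes a method call.
                    BigInteger x = new BigInteger("18446744073709551615");
                    BigInteger y = x.add(BigInteger.ONE).mod(BigInteger.valueOf(1000003));
                    System.out.println(y);
                }
            }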

        If you're doing something involving number-crunching, reddit's programming community seems to have a lot of fans of the Julia and Nim languages. I haven't used either, though.

        Good luck no matter how you go.

        • (Score: 2) by HiThere on Wednesday February 27 2019, @05:38PM (1 child)

          by HiThere (866) Subscriber Badge on Wednesday February 27 2019, @05:38PM (#807706) Journal

          Julia won't work, but I do keep looking at Nim. I'm just not sure how seriously to take it.

          --
          Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
          • (Score: 2) by HiThere on Wednesday February 27 2019, @05:57PM

            by HiThere (866) Subscriber Badge on Wednesday February 27 2019, @05:57PM (#807714) Journal

            So, I just looked again. Many of the documentation links are stale. (You need to rewrite them from the git page to the org page; an easy enough translation, but they moved the site without updating the links.)

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
    • (Score: 4, Insightful) by jmorris on Monday February 25 2019, @05:44PM (2 children)

      by jmorris (4844) on Monday February 25 2019, @05:44PM (#806437)

      Yeah, but even today Java is write once, debug everywhere, which is why anyone sane packages the exact JVM used for development and testing with the application. Which means you deploy on the same hardware architecture you develop and test on, unless you are a fool. Of course the IT world is largely fools now, so....

      The bottleneck to ARM deployment is the video problem. Small ARM boards that are more than powerful enough to do development work on exist, but they are all either entirely headless or usable only as media players, because there are no stable accelerated video drivers for Linux. You can get EGL drivers for Android, but not GL, and even now Vulkan is still rare.

      AMD is the one hot and bothered to push ARM into the server space, so they need to recognize the problem and that they are uniquely suited to fixing it. Dump Mali, bolt a medium-spec Radeon core onto a midgrade ARM, and release a developer-targeted board. Aim for a Micro-ATX form factor; put in a pair of DDR4 slots for RAM, a few PCIe slots for the odd peripherals the real world requires that aren't on USB interfaces, and of course some USB 3.1 ports for the stuff that is. Add GigE and call it good. Better still, once you have PCIe, just put in a PCIe x16 slot and list some specific Radeon boards (a low, mid, and high end at minimum) for which ARM support has been upstreamed. Then price it to compete with an AMD x86 part of similar performance. Yes, that would mean taking a loss for a year to seed the market; AMD might have trouble doing that part.

      Then embark on a PR campaign to make having one the new hotness, like the PR campaign that was done for the Raspberry Pi.

      • (Score: 3, Interesting) by DannyB on Monday February 25 2019, @05:54PM

        by DannyB (5839) Subscriber Badge on Monday February 25 2019, @05:54PM (#806447) Journal

        even today Java is write once, debug everywhere.

        Maybe for Swing, creating desktop UIs. It doesn't seem to be the case for web applications, where the Java code is mostly a "function" that takes inputs and serves up web pages. Where you see cross-platform differences that lead to bugs is when you use things that touch platform specifics, like a desktop GUI in Swing, or maybe even JavaFX. I have had to make minor platform adjustments for Swing; I have not used JavaFX, so I can't speak to that. Most of my Java experience is now with web server applications, and I don't see "debug everywhere."

        In practice, what we deploy on is pretty close to what we develop, test, and demo on. (Windows on x86; the biggest variation is which version of Java or Tomcat, but the app seems highly insensitive to that variation.)

        I wish AMD luck with pushing ARM into the server space. Intel needs some serious competition. My dream would be for ARM plus Linux to be cost-effective enough to force managers to take notice.

        --
        To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
      • (Score: 2) by bobthecimmerian on Wednesday February 27 2019, @11:56AM

        by bobthecimmerian (6834) on Wednesday February 27 2019, @11:56AM (#807542)

        I think the "write once, debug everywhere" line is grossly overstated. I work at a Java web shop; we have developers working on Macs, Windows, and Ubuntu Linux, our deployment platform is CentOS Linux, and we never have a cross-platform problem. Twelve or so years ago it was "develop on Windows, deploy on Solaris," and everything was fine then too. If we were writing GUI applications, it might be different.

  • (Score: 2) by MichaelDavidCrawford on Monday February 25 2019, @06:47PM (1 child)

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Monday February 25 2019, @06:47PM (#806499) Homepage Journal

    It's about the same as a P6 from Y2K.

    I was happy with my own P6 back then; I used that box for four solid years.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 3, Funny) by DannyB on Tuesday February 26 2019, @03:14PM

      by DannyB (5839) Subscriber Badge on Tuesday February 26 2019, @03:14PM (#806951) Journal

      Wow. You just made me suddenly realize: a present-day Raspberry Pi 3 with 1 GB of RAM has:
      * a faster CPU
      * more memory
      * much more storage (with a 32 GB or larger SD card)

      than my first decent Linux box in 1999:
      * an Athlon something-or-other
      * 256 MB of memory
      * a 30 GB drive

      --
      To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.