
posted by martyb on Sunday September 17 2017, @02:54AM   Printer-friendly
from the Happy-Birthday-to-You! dept.

September 15th was the 30th anniversary of X11

The X11 window system turns 30 years old today! X11, which still lives on today via the X.Org Server on Linux, BSD, Solaris, and other operating systems, is now three decades old.

It was on this day in 1987 that Ralph Swick of MIT announced the X Window System Version 11 Release 1. As the announcement explained, X11 was a major step beyond earlier versions of X: "This release represents a major redesign and enhancement of X and signals its graduation from the research community into the product engineering and development community. The X Window System version 11 is intended to be able to support virtually all known instances of raster display hardware and reasonable future hardware, including hardware supporting deep frame buffers, multiple colormaps and various levels of hardware graphics assist."

https://www.phoronix.com/scan.php?page=news_item&px=X11-Turns-30

[As a point of reference, Intel introduced the 80386 in 1985 and the 80386SX variant in 1988. --Ed.]


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Interesting) by Rich on Sunday September 17 2017, @01:17PM (5 children)

    by Rich (945) on Sunday September 17 2017, @01:17PM (#569379) Journal

    As Hanlon's Razor goes, "Never attribute to malice that which is adequately explained by stupidity." Even so, I occasionally wonder if malice can be found in this area, keeping Linux away from mainstream success. The X architecture and its detail solutions are so squarely off that it is impossible to improve it out of its misery.

    Bit of history (as I understand it): They saw the Lisa and the Mac and thought "me too". Out came X with Xlib, which looks a lot like the old Mac Window Manager (cf. "GetNextEvent" vs. "XNextEvent" and so on) with remote QuickDraw. Networked, tidied up a bit for that, and working around the evil regions patent. Not too shabby, actually. Then the problems started: seeing that, the workstation vendors became greedy, every one of them thinking "we're the next to make it big", and a "no policy" policy was introduced to create a "healthy market", or whatever the asshats might have called it.
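    For the curious, a minimal sketch of the Xlib pattern in question, with XNextEvent standing in where a classic Mac program would call GetNextEvent (plain Xlib, link with -lX11; error handling elided):

        #include <X11/Xlib.h>

        int main(void) {
            Display *dpy = XOpenDisplay(NULL);   /* connect to the X server */
            Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                             0, 0, 320, 200, 1, 0, 0xffffff);
            XSelectInput(dpy, win, ExposureMask | KeyPressMask);
            XMapWindow(dpy, win);
            for (;;) {                           /* the GetNextEvent-style loop */
                XEvent ev;
                XNextEvent(dpy, &ev);            /* block until the server delivers an event */
                if (ev.type == KeyPress)
                    break;
            }
            XCloseDisplay(dpy);
            return 0;
        }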

    This led to an architecture cut at about one third of where the classic Mac's Control Manager sat, aided by the fact that the bolt-on, semi-official "Athena Widgets" were unpleasant to use and looked like shit. X would still maintain the box outlines for controls (or widgets, as they called them), but leave the rendering to the client side. It also seemed like a good idea to them (which it was not) to move the handling of windows into a vendor-specific window manager. The first issue, not helped by licensing problems, brought the mess of Motif, Qt and GTK, which eventually switched to ignoring the control outlines. With the second, we got the mess around ICCCM and EWMH window manager negotiation. If you ever try to get out of there, you have to do it at the toolkit layer and replace everything below.
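    To make that negotiation concrete: under EWMH a client can't even fullscreen its own window by itself; it has to send the window manager a _NET_WM_STATE ClientMessage and hope for cooperation. A minimal sketch (the constant 1 is _NET_WM_STATE_ADD from the EWMH spec):

        #include <X11/Xlib.h>
        #include <string.h>

        /* Ask the window manager (not the X server!) to fullscreen a window. */
        void request_fullscreen(Display *dpy, Window win) {
            XEvent e;
            memset(&e, 0, sizeof e);
            e.xclient.type = ClientMessage;
            e.xclient.window = win;
            e.xclient.message_type = XInternAtom(dpy, "_NET_WM_STATE", False);
            e.xclient.format = 32;
            e.xclient.data.l[0] = 1;  /* _NET_WM_STATE_ADD */
            e.xclient.data.l[1] = XInternAtom(dpy, "_NET_WM_STATE_FULLSCREEN", False);
            XSendEvent(dpy, DefaultRootWindow(dpy), False,
                       SubstructureRedirectMask | SubstructureNotifyMask, &e);
        }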

    Now at the time that happened, it was not clear-cut that the mainstream would move to "everything client side". Apple's original ideas for Carbon, and BeOS as we saw it, did have server-side "policy", where an application server separate from the application renders the interface. In the case of BeOS not to its disadvantage, as it is still regarded as "teh snappiest" GUI ever. Had Apple bought Be, Carbon would have become a client protocol for the Be Application Server. But after the NeXT deal was done, the engineers had the task of unifying two far more divergent systems. Their solution was to render everything client side, which is what Linux systems today do as well, only to then force it through XRender (which perverts the original low-bandwidth networking design). Wayland is merely a cleanup of this outcome. No malice involved, I guess, merely acceptance of the least common denominator when there was no will to force a coherent and efficient design through the whole stack.

    The true evil lies on the driver side. It started with not cleanly separating out the hardware access, but instead building it into the main X server, leading to a situation where the server is the driver, and every attempt to deal with or adapt a driver requires dealing with the whole server. Because this situation was unmaintainable, the vendors started bolting additional acceleration architectures onto their driver parts (XAA, KAA, EXA). So at this point already, if you wanted to do anything with supported graphics hardware, you had to use X11, because it was impossible to otherwise get a working driver out of the mess.

    Enter OpenGL. Originally mostly a direct representation of whatever graphics hardware Silicon Graphics had running, IIRC with awful limits like power-of-two-only bitmap sizes and a lack of pixel precision, which made it mostly unsuitable for desktop acceleration. When 3D needed to be integrated, a few bright minds had the idea of putting OpenGL on the hardware, which was where it belonged, and putting X on top of OpenGL (probably while nudging OpenGL a bit in the direction of being usable for 2D). This would have had the neat side effect of separating the drivers from X, effectively leveling the playing field for anyone trying to do just the task of windowing and compositing.

    And then the evil happened: a new scheme, AIGLX, was devised and deployed that put OpenGL under the control of the X server. This meant that any and all usable graphics systems from then on had to be part of a steaming X shitpile. This architectural move was so bad that I'm not sure it can still be attributed to stupidity. Eventually, the stench became putrid enough that at least those functions needed for daily survival had to be factored out and put into the kernel (as KMS) under the control of more sane people. The remaining 3D functionality filtered down to Mesa. Which is another interesting thing, because although it sits further down the stack, it seems to insist on being built against X. There seems to be a distinct lack of passion for tidying up the architecture, which is strange, because it looks like little effort compared to, say, implementing the LLVM softpipe.
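    For a sense of what that factored-out KMS layer buys you: enumerating the connected displays takes only libdrm and no X server at all. A minimal sketch (assuming the usual /dev/dri/card0 node; compile against libdrm and link with -ldrm):

        #include <stdio.h>
        #include <fcntl.h>
        #include <unistd.h>
        #include <xf86drm.h>
        #include <xf86drmMode.h>

        int main(void) {
            int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
            if (fd < 0) return 1;
            drmModeRes *res = drmModeGetResources(fd);  /* pure KMS, no X involved */
            if (!res) return 1;
            for (int i = 0; i < res->count_connectors; i++) {
                drmModeConnector *c = drmModeGetConnector(fd, res->connectors[i]);
                if (c && c->connection == DRM_MODE_CONNECTED && c->count_modes > 0)
                    printf("connector %u: %ux%u@%uHz\n", c->connector_id,
                           c->modes[0].hdisplay, c->modes[0].vdisplay,
                           c->modes[0].vrefresh);
                if (c) drmModeFreeConnector(c);
            }
            drmModeFreeResources(res);
            close(fd);
            return 0;
        }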

    To sum it up: the history of the driver architecture of X11 is the main evil that has been holding back Linux.

    This rant is long enough already, so I won't extend it to the rotten network protocol (the one that needs its own "-X" switch in ssh) and all the other stupid things that make it really hard to get a modern media workstation out of this.

  • (Score: 1) by Ethanol-fueled on Sunday September 17 2017, @05:40PM

    by Ethanol-fueled (2792) on Sunday September 17 2017, @05:40PM (#569442) Homepage

    Shuttleworth's faggotry is what kept Linux from "mainstream success."

    The largest potential userbase are Windows converts. But nooooo, Shuttleworth wanted to be fucking Steve Jobs and now Linux has no chance of mainstreaming.

  • (Score: 2, Insightful) by Anonymous Coward on Sunday September 17 2017, @10:04PM (1 child)

    by Anonymous Coward on Sunday September 17 2017, @10:04PM (#569500)

    Lots of changes that to you seem to be crap, but in practice they seem to improve performance and adapt to current hardware. Check benchmarks: what other OSes do on other systems, you get in Linux too, if the driver uses the hardware to the fullest. Check the creativity: people implementing all kinds of window managers and plain apps.

    Maybe X11 is flexible enough to adapt while keeping backwards compatibility, and that is the reason it has lasted so many years. No app cares whether it is processed via EXA or Glamor, does it? Yet you can get video without frame drops, video sync, 3D, multiple monitors, hot plug and many other things inside modern implementations of X11.

    Ideal, clean systems are just theory. Practice is dirty while getting things done.

    • (Score: 2) by Rich on Monday September 18 2017, @01:31AM

      by Rich (945) on Monday September 18 2017, @01:31AM (#569569) Journal

      Now that was awfully apologetic. The software just beneath X11, the Linux kernel, shows how a large-scale project is properly run. It may have its rough edges, but in terms of cleanliness it is about two orders of magnitude ahead of the graphics stack above it. And it took, what, two decades until the multi-monitor screen arrangement preferences looked like those of System 6, from 1988. As far as "flexibility" goes, I have suggested before replacing X11 with TCP/IP, which is even more flexible and carries much less legacy burden.

      The X stack does have to have a certain quality: if it were any worse, it would be booted out and replaced by something more sane, which would probably look a lot like what Google does on Android (e.g. standalone EGL on top of KMS, a rootless compositor on top of that, and VNC for remote access, because once the pixels are client side, screw architectural network transparency). I lost one (fanless, and fortunately cheap) graphics card to the lack of power management in its X driver. Damaging hardware is pretty far up on the scale of crappy software.
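      For the curious, the Android-style stack described above starts roughly like this: a minimal sketch of standalone EGL brought up on KMS via Mesa's GBM platform (assuming /dev/dri/card0; EGLConfig selection, rendering and the drmModePageFlip loop elided):

          #include <fcntl.h>
          #include <gbm.h>
          #include <EGL/egl.h>

          int main(void) {
              int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
              struct gbm_device *gbm = gbm_create_device(fd);   /* KMS-backed device */
              EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)gbm);
              eglInitialize(dpy, NULL, NULL);
              /* ...pick an EGLConfig, create a gbm_surface plus an EGL window
                 surface from it, render, then page-flip via KMS... */
              eglTerminate(dpy);
              gbm_device_destroy(gbm);
              return 0;
          }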

      I'm just armchair-bickering here, of course, but if I were in charge, Linus style (or better, Theo style, given the smell of the mess), I would boot all hardware dependencies and escalated-rights requirements out of X, make it one of many clients of Mesa, and prohibit any lower-level software, under threat of court-martial, from having dependencies on X. (Sort of where Wayland is supposed to go, but I remember reading about them introducing yet more horrible dependencies.) Mesa in turn would get a unit test suite (based on piglit?!) that checks first for pixel precision of test results and then for speed, so that every 2D/3D accelerating module (e.g. a new graphics card) can immediately get a thumbs up or down, while widely visible "user education" teaches people to only spend money on "thumbs up" solutions.

      To be noted, though: these days there's the whole matter of GPU processing (non-uniform processing architecture?! NUPA?), but that has more to do with the kernel and less with X. I haven't really looked into how this would properly be dealt with.

  • (Score: 2) by FatPhil on Monday September 18 2017, @08:01AM (1 child)

    by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Monday September 18 2017, @08:01AM (#569665) Homepage
    > I don't extend it to the rotten network protocol (that needs its own "-X" switch in ssh)

    That's as retarded an argument as saying that taxis are rubbish because you have to ask the driver to open the boot in order to put your luggage there. Setting up an X tunnel is an overhead that most people don't need most of the time - having it optional is the polite thing to do. You'll be complaining about the "-t" switch in netcat next.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 2) by Rich on Monday September 18 2017, @08:59AM

      by Rich (945) on Monday September 18 2017, @08:59AM (#569675) Journal

      I won't complain about netcat adding what's needed to get commands through, but I will complain about the underlying protocol design that, on a system where "everything is a file and can be piped together", makes issuing remote system commands an interactive exercise in terminal capability negotiation. I understand that all of this legacy comes from having to support semi-dumb text-only terminals (as do all the stty options, with stuff like "raw", "cooked" and "sane"), but from a system-design view, none of this overhead belongs where it is now. It's just that they needed to kludge it in somewhere 40 years ago, and it stuck.