Chris Siebenmann, over on his personal web page at the University of Toronto, writes about X networking. He points out two main shortcomings that prevented the original vision of network transparency from being realized. One is network speed and latency. The other is the overly narrow scope of X's communication facilities.
X's network transparency was not designed as 'it will run xterm well'; originally it was meant to let you run almost everything remotely, providing a full environment. Even apart from the practical issues covered in Daniel Stone's slide presentation [warning for PDF], it's clear that it's been years since X could deliver a real first-class environment over the network. You cannot operate with X over the network the same way you do locally. Trying to do so is painful and involves many things that either don't work at all or perform so badly that you don't want to use them.
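For context, "running almost everything remotely" is what classic X forwarding still attempts today. A minimal sketch of the usual approach (host and user names here are placeholders, and this assumes the remote sshd has X11 forwarding enabled):

```shell
# Run a remote X client, displayed on the local screen via an SSH tunnel.
# "user@remote-host" is a placeholder; -X enables X11 forwarding.
ssh -X user@remote-host xterm

# For a full remote environment rather than a single app, one option is
# a nested X server such as Xephyr, with the remote session drawn into it.
```

This works tolerably for simple clients like xterm, which is exactly the article's point: it was supposed to work for everything, not just the simple cases.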
Remote display protocols remain useful, but it's time to admit another way will have to be found. What's the latest word on Wayland or Mir?
Source: X's network transparency has wound up mostly being a failure
(Score: 4, Insightful) by Arik on Saturday February 10 2018, @04:29PM (1 child)
So, his main complaint is a colossal example of missing the point: he's blaming X for the reality of where the hardware is and how those things work. If we had thin clients with simple framebuffers attached to huge, fast network pipes and super-server machines in a datacenter somewhere (as many imagined we would, back when X was being developed) with all the specialized circuitry, RAM, I/O, etc. you could dream of, then it would make perfect sense for those servers to decode videos and send them to the thin client as bitmaps. But that's not how the hardware has developed, so it doesn't make sense. It has nothing at all to do with X transparency either way; it's all about where the various circuits sit on the network, and how fast and responsive that network can be.
The typical suggestion to 'solve' this problem where it is not (in X) usually amounts to adding yet another layer of encode-decode to the process, needlessly complicating it. Again, when you have lots of resources locally and a slow, unresponsive network, you decode locally, and the simple, straightforward way to do that is just to run the danged video player locally.
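The "decode locally" point can be made concrete. One common pattern (a sketch only; the host name and file path are placeholders, and it assumes mpv is installed on the local machine) is to move the compressed bytes across the network and let the local CPU/GPU do the decoding:

```shell
# Stream the compressed file from the remote machine and decode it
# locally, instead of asking the remote side to render frames over X.
ssh user@remote-host cat /path/to/video.mp4 | mpv -

# Equivalently, copy or mount the file and use any local player; either
# way only the compressed stream crosses the network, not raw bitmaps.
```

The design point is that the compressed stream is orders of magnitude smaller than the decoded frames, so this is the only arrangement that makes sense on a slow link.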
This is not the only complaint he has, though. Let's continue.
"The second is that the communication facilities that X provided were too narrow and limited. This forced people to go outside of them in order to do all sorts of things, starting with audio and moving on to things like DBus and other ways of coordinating environments, handling sophisticated configuration systems, modern fonts, and so on."
"Sophisticated configuration systems" is a link, let's follow it and see exactly what he's talking about.
"Remote applications and Gnome settings: an irritation" ( https://utcc.utoronto.ca/~cks/space/blog/linux/RemoteAppsGconf )
You can go read it yourself if you doubt me, but he's unironically blaming X for the fact that Gnome insisted on inventing its own deliberately obtuse and incompatible way of handling settings, which doesn't play well with others (or even alone). If I were writing fiction I would hesitate to include this; it would be too absurd. But like I said, check the link if you doubt me.
"Modern fonts" is also a link. Oh goody. ( https://utcc.utoronto.ca/~cks/space/blog/unix/ModernXFontDrawback )
Again, it's difficult to see how this can fairly be labeled a failure of X transparency so much as a failure to rein in font madness in general. It's absolutely ludicrous to see how many fonts a default install of even a relatively sane OS contains, many of them not even legible. A screen font on a general-purpose PC has one job and one job only: to be clear and unambiguous, to facilitate error-free reading of text on that screen.
A multitude of fonts is not helpful here. The only time I could see an argument for even installing most of this junk is if the machine is specifically being used for a print shop. Which very, very few are.
But no, we need 20 bazillion different fonts on every machine, because lord forbid a web browser be forced to ignore an *optional* font instruction when rendering a web page.
I'll stop now before I get too rambly myself.
If laughter is the best medicine, who are the best doctors?
(Score: 1) by Burz on Saturday February 10 2018, @04:53PM
What?! You want to make the user coordinate multiple tools when the application / use case calls for video in the app? You want to prevent the app author from creating an integrated presentation? Sending a pre-compressed stream to the client isn't an option? HA. That's why people can't conference effectively on an X11-based machine.
As for "font madness", it only seems that way because you use an OS lacking a well-defined set of core fonts.