posted by martyb on Wednesday March 14 2018, @10:57AM
from the even-more-necessary-today dept.

Maciej Ceglowski, proprietor of the Pinboard bookmarking site, spoke back on October 29, 2015, at the Web Directions conference in Sydney, Australia about the problem of increasingly bloated web pages. His talk describes the nature of the bloat problem, token attempts at fixing it, the bloat that advertisements contribute, mishandling of images, unreasonably crufty JavaScript frameworks, time-wasting layouts, sluggish backends, and why it is important to address these issues. The reasons to do so go well beyond aesthetics and efficiency.

Here's the hortatory part of the talk:

Let’s preserve the web as the hypertext medium it is, the only thing of its kind in the world, and not turn it into another medium for consumption, like we have so many examples of already.

Let’s commit to the idea that as computers get faster, and as networks get faster, the web should also get faster.

Let’s not allow the panicked dinosaurs of online publishing to trample us as they stampede away from the meteor. Instead, let's hide in our holes and watch nature take its beautiful course.

Most importantly, let’s break the back of the online surveillance establishment that threatens not just our livelihood, but our liberty. Not only here in Australia, but in America, Europe, the UK—in every free country where the idea of permanent, total surveillance sounded like bad science fiction even ten years ago.

He closes with an appeal to address these concerns in order to improve general accessibility of the WWW, which correlates with its general awesomeness.

From The Website Obesity Crisis (transcript)
The Website Obesity Crisis (video)

[Ed note: Though some of the admin functions for SoylentNews use Javascript, the user-facing side is entirely Javascript-free; everything is done with straight HTML and CSS. --martyb]

[TMB note: I wish. We never could figure out a way to do collapsible comment trees the way we wanted entirely without Javascript, and it's also required for subscriptions paid through Stripe.]


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 2) by c0lo on Wednesday March 14 2018, @11:26AM (1 child)

    by c0lo (156) Subscriber Badge on Wednesday March 14 2018, @11:26AM (#652286) Journal

    some of the admin functions for SoylentNews use Javascript

    One wonders: if the scripts don't deliver ads to the administrators and don't track them, what's the point?

    (grin)

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 1, Informative) by Anonymous Coward on Wednesday March 14 2018, @11:32AM (7 children)

    by Anonymous Coward on Wednesday March 14 2018, @11:32AM (#652288)

    [Ed note: Though some of the admin functions for SoylentNews use Javascript, the user-facing side is entirely Javascript-free; everything is done with straight HTML and CSS. --martyb]

    While it is true that this site is completely usable without JavaScript, and has very little user-facing JavaScript, it is not true that the user-facing side is entirely JavaScript-free.

    Here's an excerpt from the page source of this very page:

    <script src="//soylentnews.org/expandAll.js" type="text/javascript"></script>

    • (Score: 2) by The Mighty Buzzard on Wednesday March 14 2018, @11:33AM

      by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Wednesday March 14 2018, @11:33AM (#652289) Homepage Journal

      Yup, updated to reflect that.

      --
      My rights don't end where your fear begins.
    • (Score: 5, Touché) by theluggage on Wednesday March 14 2018, @11:54AM (4 children)

      by theluggage (1797) on Wednesday March 14 2018, @11:54AM (#652297)

      While it is true that this site is completely usable without JavaScript,

      ...which is all that really matters. Javascript is great for enhancing interactivity and dynamic layout - provided everything "fails safe" to a usable static page. Then, anybody who can't use Javascript - or just has a bug up their arse about it - can just turn it off.

      Of course, if the first few iterations of CSS hadn't done such a good job of giving the impression of being designed by someone who had never visited a website, used a DTP package, or even used stylesheets in a word processor, then we wouldn't need so much Javascript. At least now we can stop worrying about supporting ancient versions of Internet Explorer and start to use things like flex layouts and the sensible box model...
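
      The "fails safe" principle theluggage describes can be sketched in a few lines. This is a hypothetical, DOM-free illustration (the function names and markup are made up, not SoylentNews code): the server emits a fully usable static page, and scripting only adds behaviour on top of it.

      ```javascript
      // Hedged sketch of progressive enhancement: the static markup below
      // is usable on its own; the enhancement is strictly additive.
      function renderCommentTree(comments) {
        // Base case: plain nested HTML that works with scripting disabled.
        return comments
          .map(c => `<li>${c.text}${c.replies.length
            ? `<ul>${renderCommentTree(c.replies)}</ul>` : ""}</li>`)
          .join("");
      }

      function enhance(html, scriptingEnabled) {
        // Without scripting, the static markup is returned untouched.
        if (!scriptingEnabled) return html;
        // With scripting, tag list items so a script could collapse them.
        return html.replace(/<li>/g, '<li class="collapsible">');
      }

      const tree = [{ text: "parent", replies: [{ text: "child", replies: [] }] }];
      const staticHtml = renderCommentTree(tree);
      ```

      Either way, the same readable tree reaches the user, which is the whole point.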

      • (Score: 4, Insightful) by anubi on Wednesday March 14 2018, @12:26PM (2 children)

        by anubi (2828) on Wednesday March 14 2018, @12:26PM (#652317) Journal

        or just has a bug up their arse about it

        No more than I have a bug up my arse about picking up random things and putting them in my mouth.

        Some sites, like this one, are pretty clean. I'm not really afraid of what you guys put into the script. You do not have a record of distributing malware in ad scripts.

        But other sites, well, I find putting scripts from their site into my machine to be similar to putting in my mouth something I found on the street. With some sites it's more like putting in my mouth things handed to me in a public rest room.

        Incidentally, I wish script blockers would not be referred to as "ad" blockers. It was never my intent to block innocent, well-mannered ads (and in fact it doesn't!); my full intent in using one is as a hygienic measure to keep malware out of my machine, just as I wash my hands after using the toilet or use a handkerchief out of consideration for my neighbors to minimize biological viruses.

        Many business sites are like a guy with a cold, deliberately flinging snot, and he wonders why *I* approach him only when wearing a face mask!

        Someone told that businessman that part of being in business is being able to set terms and conditions, one of which is tolerating his snot. If he's gonna fling snot, I am going to approach him properly dressed. Which means script blockers if it comes to that.

        Just For Laughs gag... what happens to me a lot on my computer ... but this time using kids instead of JavaScript to do the deed. [youtube.com]

        --
        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
        • (Score: 2, Interesting) by crb3 on Wednesday March 14 2018, @01:51PM

          by crb3 (5919) on Wednesday March 14 2018, @01:51PM (#652365)

          I keep a console window open and maximized when browsing, say, Reddit. If I suspect that the site will dump a ton of garbage into the page, I swipe the URL, switch to that CLI pane and execute "links <mousepasted-url>". I read a Fortune page about 15 minutes ago; try *that* with an ad-blocker.

        • (Score: 2) by maxwell demon on Thursday March 15 2018, @07:50AM

          by maxwell demon (1608) on Thursday March 15 2018, @07:50AM (#652821) Journal

          Incidentally, I would wish script blockers would not be referred to as "ad" blockers.

          They aren't. Script blockers like NoScript block scripts. If some of those scripts happen to belong to ads, those scripts are also blocked. And if the ad happens to rely on those scripts to be displayed (as is common these days), then the ad will not be displayed.

          On the other hand, ad blockers block ads. If the ads happen to contain scripts, then naturally those scripts are also blocked. Scripts that do not belong to ads are not blocked by ad blockers.
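
          The distinction can be modelled in a few lines of toy code. Everything here (the resource list, host names, filter rules) is invented for illustration; real blockers use far more elaborate rule sets.

          ```javascript
          // Toy model: a script blocker filters by resource *type*,
          // an ad blocker filters by *origin/URL pattern*.
          const resources = [
            { url: "https://soylentnews.org/expandAll.js", type: "script" },
            { url: "https://ads.example.com/banner.js",    type: "script" },
            { url: "https://ads.example.com/banner.png",   type: "image"  },
            { url: "https://soylentnews.org/logo.png",     type: "image"  },
          ];

          const scriptBlocker = r => r.type !== "script";     // blocks all scripts
          const adBlocker     = r => !r.url.includes("ads."); // blocks ad hosts only

          const afterScriptBlock = resources.filter(scriptBlocker);
          const afterAdBlock     = resources.filter(adBlocker);
          ```

          Note that the script blocker still lets the ad *image* through, while the ad blocker still lets the first-party script through: exactly the asymmetry described above.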

          --
          The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 3, Insightful) by maxwell demon on Thursday March 15 2018, @07:42AM

        by maxwell demon (1608) on Thursday March 15 2018, @07:42AM (#652820) Journal

        ...which is all that really matters.

        No, it is not all that matters. But fortunately this site also gets the other part that matters right: the JavaScript is 100% locally hosted.

        --
        The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2) by cmdrklarg on Wednesday March 14 2018, @05:53PM

      by cmdrklarg (5048) Subscriber Badge on Wednesday March 14 2018, @05:53PM (#652522)

      Yup, I do like it when I can get sites to work by having NoScript whitelist ONE item (zero is better, but one or two is OK). Not like some of these damned sites that have 20+ blocked scripts.

      --
      The world is full of kings and queens who blind your eyes and steal your dreams.
  • (Score: 2) by c0lo on Wednesday March 14 2018, @11:43AM

    by c0lo (156) Subscriber Badge on Wednesday March 14 2018, @11:43AM (#652293) Journal

    the user-facing side is can be made entirely Javascript-free

    Otherwise https://soylentnews.org/expandAll.js is included.

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 0) by Anonymous Coward on Wednesday March 14 2018, @12:14PM (1 child)

    by Anonymous Coward on Wednesday March 14 2018, @12:14PM (#652304)

    Does this "news" item set a new personal best for the "oldest news" yet on SN?

    Suggestion to editors--save non-time critical articles like this for times when the queue is low (the queue is jammed today).

    • (Score: 0) by Anonymous Coward on Wednesday March 14 2018, @01:23PM

      by Anonymous Coward on Wednesday March 14 2018, @01:23PM (#652350)

      It would have appeared sooner, but the "Make it so, Soylent Number One" functionality requires JavaScript.

  • (Score: 0) by Anonymous Coward on Wednesday March 14 2018, @12:18PM

    by Anonymous Coward on Wednesday March 14 2018, @12:18PM (#652306)

    As hinted by the ed/TMB comments

    about:config -> javascript.enabled -> false

  • (Score: 2, Redundant) by MichaelDavidCrawford on Wednesday March 14 2018, @12:20PM (4 children)

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Wednesday March 14 2018, @12:20PM (#652309) Homepage Journal

    If you look at the resources that a document is composed of, you'll often see stuff like this:

    http://code.google.com/jquery/jquery.js?browser=Safari&referring-page=https://example.com/&date=2018-03-14 [google.com]

    Yes I do understand that all those query parameters are unnecessary because that information is in the HTTP header.

    This is the kind of destruction that analytics has wrought upon the web. It's not just the extra cruft:

    Putting any query parameters at all on a GET to jquery will defeat the caching.
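
    A toy cache makes the point concrete. The URLs and the cache itself are illustrative, not how any real browser is implemented, but the keying behaviour is the same: a shared cache keys entries on the full URL, so per-visitor query strings guarantee a miss every time.

    ```javascript
    // Minimal sketch of why query strings defeat caching: entries are
    // keyed on the full request URL, parameters included.
    const cache = new Map();

    function fetchWithCache(url, fetcher) {
      if (cache.has(url)) return { body: cache.get(url), hit: true };
      const body = fetcher(url);
      cache.set(url, body);
      return { body, hit: false };
    }

    const fakeFetch = () => "/* jquery source */";

    // Same library, but analytics parameters make the keys distinct:
    const a = fetchWithCache("https://cdn.example/jquery.js?user=1", fakeFetch);
    const b = fetchWithCache("https://cdn.example/jquery.js?user=2", fakeFetch);

    // Without the parameters, the second request is a hit:
    const c = fetchWithCache("https://cdn.example/jquery.js", fakeFetch);
    const d = fetchWithCache("https://cdn.example/jquery.js", fakeFetch);
    ```

    Every visitor re-downloads an identical file purely because the key differs.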

    Having toyed with jQuery some I can readily understand its popular appeal.

    But when I first heard of it, it was from puzzling over why so very many web pages were so very slow to load. That led me to believe that jQuery must be some kind of Tool Of The Devil.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 4, Interesting) by stretch611 on Wednesday March 14 2018, @02:41PM (3 children)

      by stretch611 (6199) on Wednesday March 14 2018, @02:41PM (#652389)

      IMO, jQuery is a Tool of the Devil.

      I remember debugging javascript that was using jQuery.

      More often than not, it seems that people who use jQuery prefer writing crappy code in the fewest characters necessary instead of writing it to be well designed and readable. I remember the hell it was one particular time when I was trying to debug errant javascript behavior, only to realize that the idiot who wrote the original web page used multiple ids with the same exact name. This goes against standards and causes errors, but instead of failing, jQuery trapped the error and let it pass. While some people may think that's a good thing and be happy to avoid such an error, people who actually believe in coding standards realize what a nightmare this can cause. (Ignoring coding standards is why we look at the IE era as hell.)

      Even though I feel that people who use jQuery have a preference for writing crap code, it was nothing compared to the few times I was forced to debug at the level of the jQuery code itself. I was greeted by a new level of hell each time I did this. Spaghetti jumping from one place to another and returns everywhere. And I swear they do not allow people to use more than 2 f-ing letters for any variable name. I learned Commodore BASIC with two-letter names... It was hell then, and it's worse now that there is no need to limit yourself, except to write incomprehensible garbage.

      And to all those that believe that writing the fewest characters is a sign of "good" coding... May you be subject to the eternal hell of trying to read your own code after not seeing it for six months.

      --
      Now with 5 covid vaccine shots/boosters altering my DNA :P
      • (Score: 2, Informative) by Anonymous Coward on Wednesday March 14 2018, @05:14PM (1 child)

        by Anonymous Coward on Wednesday March 14 2018, @05:14PM (#652504)

        I haven't looked at much (any?) jQuery source, but I know their scripts are available in two versions: bloated readable code and 'minified' versions. Hopefully you were using the readable one to debug (that's what it's for), but if it was something you were looking at on somebody else's production page, it should have been using the minified version.

        * minified javascript is machine-reduced to decrease its size: all identifiers are replaced with shorter ones, whitespace is stripped, etc.
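
        For the curious, here is an invented before/after pair showing what minification does: identifiers shortened, whitespace stripped, behaviour unchanged. (This is an illustration, not actual jQuery code.)

        ```javascript
        // Readable development version:
        function sumOfSquares(numbers) {
          return numbers.reduce(function (total, value) {
            return total + value * value;
          }, 0);
        }

        // What a minifier might emit from it (same behaviour, fewer bytes):
        function s(n){return n.reduce(function(t,v){return t+v*v},0)}
        ```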

        • (Score: 3, Informative) by stretch611 on Wednesday March 14 2018, @09:12PM

          by stretch611 (6199) on Wednesday March 14 2018, @09:12PM (#652597)

          No, it was not minified. The "other person" was a prior employee of the company I was working for at the time. I had full access to the server and code.

          --
          Now with 5 covid vaccine shots/boosters altering my DNA :P
      • (Score: 0) by Anonymous Coward on Wednesday March 14 2018, @09:23PM

        by Anonymous Coward on Wednesday March 14 2018, @09:23PM (#652606)
        In practice you can have multiple elements with the same id (though it is a bad thing); you only get the first one with getElementById(), though (but you can do attribute matching if you are really that desperate). I remember there was some talk about removing this restriction, on the grounds that people are dumb and it is a pain to control on the user side. Cloned a bunch of nodes? Time to scan the damn ids. I don't think it has led anywhere as of now, though.
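
        The first-match behaviour, plus a lint-style check that would catch duplicates, can be sketched without a real DOM. The tiny element list below is a stand-in for document contents; `getById` mimics what `document.getElementById` does.

        ```javascript
        // Stand-in "document": a flat list of elements with ids.
        const elements = [
          { id: "comment", text: "first"  },
          { id: "comment", text: "second" },  // invalid duplicate id
          { id: "footer",  text: "footer" },
        ];

        // Mimics getElementById: first match wins, duplicates hide silently.
        const getById = id => elements.find(e => e.id === id);

        // The check that would have caught the page described above.
        function duplicateIds(els) {
          const seen = new Set(), dupes = new Set();
          for (const { id } of els) (seen.has(id) ? dupes : seen).add(id);
          return [...dupes];
        }
        ```

        The second element is simply unreachable by id, which is why the bug is so quiet.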
  • (Score: 2) by All Your Lawn Are Belong To Us on Wednesday March 14 2018, @12:24PM (6 children)

    by All Your Lawn Are Belong To Us (6553) on Wednesday March 14 2018, @12:24PM (#652314) Journal

    Let’s preserve the web as the hypertext medium it is, the only thing of its kind in the world, and not turn it into another medium for consumption, like we have so many examples of already.

    Let’s commit to the idea that as computers get faster, and as networks get faster, the web should also get faster.

    Let’s not allow the panicked dinosaurs of online publishing to trample us as they stampede away from the meteor. Instead, let's hide in our holes and watch nature take its beautiful course.

    Money is now involved in the process of media consumption online. It is now impossible to remove the impact of media consumption on the internet, and I would posit that it is a significant enough part of the U.S. economy that attempting to ignore or remove it would cause more damage than it is worth. It's probably, in its way, literally a national security function now (ever since someone decided that economic strength is a national security matter). The bottom line is: if it is cheaper and more profitable to cruft up a webpage and you still get the requisite number of eyeballs for financial benefit, and taking the time for finely crafted design costs more money than it profits the process, the industry will not change. Users, by and large, will not care if their pages take 3-5 seconds more to load and consume 10-20% more processing power - not when there is an optimum allowable crufting-to-profit ratio that is found empirically. Companies will not care that they are requiring unnecessary libraries and bad code to cross the pipes so long as Joe Sixpack gets his sports news promptly enough and Jane Slimfast gets her cat pictures.

    To truly allow what is being spoken of here requires a new web - one whose selling point is more genuine information or product at better speeds. A new standard, hopefully not polluted by bloat or js.

    --
    This sig for rent.
    • (Score: 2) by MichaelDavidCrawford on Wednesday March 14 2018, @12:43PM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Wednesday March 14 2018, @12:43PM (#652323) Homepage Journal

      The economy is rightfully considered a national security matter.

      Better economies have an easier time building weapons than do worse economies. And should a war break out, the stronger economy will succeed at putting its troops and guns wherever the battlefield may be.

      A while ago I read a truly sorrowful post by an American soldier who served with some Egyptian troops. He said he always had plenty to eat:

      "An Army runs on its stomach. That's why they always had plenty of turkey sandwiches for us."

      But his Egyptian friends had so little to eat, and of such poor quality that they were slowly starving.

      That's not quite what you want when the order to ATTACK AT DAWN comes down.

      --
      Yes I Have No Bananas. [gofundme.com]
    • (Score: 3, Informative) by MichaelDavidCrawford on Wednesday March 14 2018, @12:46PM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Wednesday March 14 2018, @12:46PM (#652325) Homepage Journal

      Jakob Nielsen says you're wrong:

      He studies real users by employing such devices as custom proxies that report where and when the test volunteer is on the web.

      His studies conclusively demonstrated that your page has just two seconds to load and render before the user will click the Back button then go find some other page.

      All that whizzy analytics that so many webmasters are creaming their jeans for must not provide this information, or pages such as Hillary For President's wouldn't have more than a thousand resources in the homepage alone.

      --
      Yes I Have No Bananas. [gofundme.com]
    • (Score: 2) by MichaelDavidCrawford on Wednesday March 14 2018, @12:48PM (2 children)

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Wednesday March 14 2018, @12:48PM (#652327) Homepage Journal

      Tor just has to slow lots of things down.

      Perhaps if Facebook had a .onion address its Javascript would finally work in Pale Moon.

      --
      Yes I Have No Bananas. [gofundme.com]
    • (Score: 3, Funny) by bob_super on Wednesday March 14 2018, @09:34PM

      by bob_super (1357) on Wednesday March 14 2018, @09:34PM (#652611)

      > Joe Sixpack gets his sports news promptly enough and Jane Slimfast gets her cat pictures

      Without touching on the sexism in that statement, I do need to point out that it's now Joe Barrel and Jane Sparetire. Their son, Xavier Xander Lipids does care about how quickly he can multitask between porn pages and teen social media, not that he always sees a difference.

  • (Score: 2) by MichaelDavidCrawford on Wednesday March 14 2018, @12:39PM (2 children)

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Wednesday March 14 2018, @12:39PM (#652321) Homepage Journal

    Kids These Days.

    My iPhone 4 was significantly faster than my iPhone 7 is now. The ARM Cortex A8 in my 4 only had one core. The new and improved-by-Apple kinda sorta ARMy CPU in my 7 has a vast number of cores.

    All the better to address through two-dimensional arrays by row then column with. :-/
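
    For anyone who missed the jab: traversal order over a two-dimensional array changes memory locality (and thus speed on real hardware), not the result. A small illustrative sketch, with made-up sizes:

    ```javascript
    // 4x4 grid holding the values 0..15.
    const rows = 4, cols = 4;
    const grid = Array.from({ length: rows }, (_, r) =>
      Array.from({ length: cols }, (_, c) => r * cols + c));

    // Cache-friendly: walks each row's elements consecutively in memory.
    function sumRowMajor(g) {
      let total = 0;
      for (const row of g) for (const v of row) total += v;
      return total;
    }

    // Same answer, but strides across rows on every single step.
    function sumColumnMajor(g) {
      let total = 0;
      for (let c = 0; c < cols; c++)
        for (let r = 0; r < rows; r++) total += g[r][c];
      return total;
    }
    ```

    On a toy grid the difference is invisible; on large arrays the column-major walk thrashes the cache, which is the complaint being made here.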

    When I worked for Apple I was assigned to a committee whose mission it was to rescue a new box that was way way too slow.

    It wasn't long at all before I produced an internal-consumption-only document that had the same diagram as this World Wide Wonderful resource does:

    hay where the hell did it go?

    ... it was on my local box but not online. Just now I discovered a whole bunch of pages just like that.

    HOWTO Write Code That Doesn't Suck [warplife.com]

    To save yourself a lot of time and trouble, just scroll about halfway down until you find lots of asterisks. Then read just a few of the paragraphs above and below them.

    Perhaps the reason I hadn't posted it yet is that some of it is the work of a shattered mind: I was a patient in a mental hospital when I wrote it.

    And yes some mental hospitals let you keep both your cell phone and your laptop. They won't let you have your power cords but the nurses are quite helpful about recharging your devices when they run low.

    At Stanford Medical Center they even have a large, locked utility room with a vast array of power sockets. At a place like the Stanford Medical Center H-2 Psychiatric Inpatient Unit, many if not most of the wingnuts within are heavily into technology.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 0) by Anonymous Coward on Wednesday March 14 2018, @03:23PM (1 child)

      by Anonymous Coward on Wednesday March 14 2018, @03:23PM (#652417)

      In "HOWTO Write Code That Doesn't Suck [warplife.com]"
      I saw this,

      You may be far, far more familiar with Smalltalk than you think you are: Objective-C is just C with Smalltalk bolted on, you see.

      Can you elaborate?

  • (Score: 3, Interesting) by cocaine overdose on Wednesday March 14 2018, @01:51PM (7 children)

    His notions are not new or revolutionary. Many in the underground technology circles despise what's become of modern technology. Websites are no different from OSes. Just as Salon's page loads 45 MB over 5,000 requests, so too does the latest and most minimal version of systemd/Linux start with more than 50 processes after boot. The same is true for Windows and the Linux kernel in general. Everything has become bloated because of "progress." You can't be a Linux expert anymore, considering there are more lines of code than there are lines in the Encyclopaedia Britannica (and it would take about 11 years to read the whole encyclopaedia). Hell, Firefox alone has more lines than the Encyclopaedia Britannica. There's no way to know what you're working with anymore, and some circles are extremely wary of that. They've ditched Linux for other UNIX-likes. Some go to Redox, some go to OpenBSD, and a very few KISS-fearing men have gone on to champion Plan 9 (Unix's successor).

    Going back to the webpages, I recently tried to reverse engineer the design of a popular website. What I found is the same as with OSes! Many lines, no documentation (expected), poor or no formatting, redundant functionality, and so many goddamn script loads so the startup can have moving buttons (Look, pa, this one slides on its own!). The entire page is 1 MB, and it's just the landing page, primarily composed of numerous CSS stylesheets. (If you want to scream, look at something like VK.com: 15 MB of extremely complicated JS and request routing -- but the most bloated file is the fonts.css!) There has been non-stop talk of how modern design is killing the web and how we need to stop using all of the tools available (or at least slow down and think about whether we really need to max out 8 cores for fancy animations). To this, I say it is a futile pursuit. The reason the web is so popular is all the ways JS/CSS/etc. have attracted the (overbearing majority of) casual users. Despite their relatively non-existent standards for good webpages and extensive mobile use (mobile-first was terrible to wrap my head around), they are the prime reason we have had so much innovation in the WWW sphere. Were it not for them contributing to the snowballing popularity of the internet, and capitalism seeing an opportunity to make money, you likely would not have the internet speeds you have today. Or all the functionality of your favorite websites (which is usually someone else's code), like ajax and music that keeps state across different pages. The problem is most certainly a capitalist problem. And as long as there's still money to be made, no amount of idealism will be able to quench greed. The FSF has been reluctant to accept its failure in this.

    But there's also the other side of the coin: webdevs aren't making the web more bloated. They're just using the tools they have to achieve the goals required of them. One of these is to catch users' eyes and make companies money. That means the website has to be novel. So many times have I heard relatively computer-illiterate, but nonetheless web-related, consumers say that a website "looks like it's from the 90s" and that they refuse to use it. The websites in question usually do not have anything "wrong" with them, but they're boring and not as feature-rich as some of the more bloated examples they use. The webdevs have to yield to these consumer expectations, lest they make no money and get fired. And that is in effect why you cannot "stop progress" or "go back." If you go back, you die. The only time this is acceptable is for a personal blog or a website that doesn't face consumers (open source software and non-profits face consumers, no matter their thoughts on it). Otherwise, you're just fighting a losing battle.

    Now, back to this Polish fellow and his talk. He seems like he's bandwagoning onto the exclamations of the underground without saying anything new. Perhaps the case could be made that he's disseminating their opinions or taking hold on the cusp of how much consumers can tolerate slow websites -- to maximize how much reach his writings have and get more readers. I think the latter is more likely, considering his extensive coverage (IMHO, whining for views) of the atrocity that is technology and the internet. Another point could be made that he's a very dumb entrepreneur and a bad webdev trying to lower standards to make it easier for him to make more money. For evidence of this, look no further than his blog and his "startup" Pinboard. Take note of how complex and intricately designed they seem on the outside.

    As to the points in the OP and the transcript of his talk, I will briefly address them. Rather, I'll try to prove why they're wrong, because I've already invested so much effort into the idea that he is being reactionary for more readers.

    [1/2] See below for rest
    • (Score: 3, Interesting) by cocaine overdose on Wednesday March 14 2018, @01:54PM (2 children)

      [2/3]
      >This talk isn't about any of those. It's about mostly-text sites that, for unfathomable reasons, are growing bigger with every passing year.

      To this, all I have is that websites will indeed grow larger the more years they're around, by virtue of adding more content. However, I don't think that's the point he's making, so we'll wait.

      > While I'll be using examples to keep the talk from getting too abstract, I’m not here to shame anyone, except some companies (Medium) that should know better and are intentionally breaking the web.

      Medium is not a mostly-text site. It's heavily laden with JS, CSS, and images. But I believe he's trying to say those things should not be needed. "Intentionally breaking the web" is a hyperbolic hook, and I don't believe he will give examples or address what he means by "breaking the web", or why he thinks mostly-text sites exist anymore, save remnants of old websites, blogs by internet veterans, and academics' personal pages.

      > What do I mean by a website obesity crisis? Here’s an article on GigaOm from 2012 titled "The Growing Epidemic of Page Bloat". It warns that the average web page is over a megabyte in size. The article itself is 1.8 megabytes long.

      Reader, take note that GigaOm is a news website: it includes many pictures, user accounts and comments, and responsive design. Reader, also take note that the transcript page is itself 1.03 MB in size (because of the numerous photographs he's decided to serve as PNGs).

      >  Here's an almost identical article from the same website two years later, called “The Overweight Web". This article warns that average page size is approaching 2 megabytes. That article is 3 megabytes long. If present trends continue, there is the real chance that articles warning about page bloat could exceed 5 megabytes in size by 2020.

      We're already past this point.

      > The problem with picking any particular size as a threshold is that it encourages us to define deviancy down. Today’s egregiously bloated site becomes tomorrow’s typical page, and next year’s elegantly slim design. I would like to anchor the discussion in something more timeless.

      I would agree, but "timeless" is terrible for standards. The technology is constantly evolving, and yesterday's "logical" standards are today's "outdated" standards.

      > To repeat a suggestion I made on Twitter, I contend that text-based websites should not exceed in size the major works of Russian literature. This is a generous yardstick. I could have picked French literature, full of slim little books, but I intentionally went with Russian novels and their reputation for ponderousness.

      My propensity to call him a "faggot hipster" is growing larger with every passing line. The "major works of Russian literature." I believe he's only read some Dostoevsky and believes himself knowledgeable about Russian literature. Dostoevsky, who normally writes works of around 500-900 pages. I will not deny that many in this group are 400-600 pages, but many are also 100-200 pages (Heart of a Dog, A Hero of Our Time, We, etc.). Even ignoring that his standard is based on a false pretense, the yardstick is not generous. Text-based websites? They're all text-based; first, you'll need to refine that, because that's almost every website. But assuming he means "websites with the majority of content being text," that's still most websites. Assuming he means websites like his blog, which use for the most part ONLY text for display, it's still a misguided thought. Websites aren't only text. They have formatting and positioning to make them readable. Russian literature has this too, but it barely affects the overall page count. That formatting is going to take up a large chunk of data that is necessary to have a readable interface.

      > their reputation for ponderousness.

      Russian novels are also known for being large and bloated.

      >In Goncharov's Oblomov, for example, the title character spends the first hundred pages just getting out of bed. If you open that tweet in a browser, you'll see the page is 900 KB big. That's almost 100 KB more than the full text of The Master and Margarita, Bulgakov’s funny and enigmatic novel about the Devil visiting Moscow with his retinue (complete with a giant cat!) during the Great Purge of 1937, intercut with an odd vision of the life of Pontius Pilate, Jesus Christ, and the devoted but unreliable apostle Matthew. For a single tweet.

      I so very much want to call him a moron. Twitter is not just text; it has further functionality (like replies/searching/images/extensive formatting/liking/following) that has to take up data to be there. The Master and Margarita is just text. He's basing his standards on comparing apples to artichokes: one is a healthy but difficult-to-eat vegetable, while the other is a sweet, juicy, sugary snack. Any book you put on the web will almost certainly be larger than the plain text; HTML takes up space, a lot of space in comparison. A snarky comment: maybe this is why he hasn't achieved anything else except for selling one OK product to Slacker News. He's unaware of his terrible shortcomings.

      >Or consider this 400-word-long Medium article on bloat, which includes the sentence: "Teams that don’t understand who they’re building for, and why, are prone to make bloated products." The Medium team has somehow made this nugget of thought require 1.2 megabytes. That's longer than Crime and Punishment, Dostoyevsky’s psychological thriller about an impoverished student who fills his head with thoughts of Napoleon and talks himself into murdering an elderly money lender. Racked by guilt, so rattled by his crime that he even forgets to grab the money, Raskolnikov finds himself pursued in a cat-and-mouse game by a clever prosecutor and finds redemption in the unlikely love of a saintly prostitute. Dostoevski wrote this all by hand, by candlelight, with a goddamned feather.

      See my previous comments. He's spending more time explaining the books he's read than furthering his point, i.e., contributing to bloat himself.

      > I could go on in this vein. And I will, because it's fun!

      He goes on for another 7 paragraphs drawing comparisons between literature and modern websites, while doing all of the things I've complained about in my previous comments. I feel burned out having to point out every goddamn flaw that should've been addressed by him pre-speech. I won't be able to do the entire thing. So:
      • (Score: 2) by cocaine overdose on Wednesday March 14 2018, @01:57PM (1 child)

        [3/4]
        > These Apple sites exemplify what I call Chickenshit Minimalism. It's the prevailing design aesthetic of today's web. I wrote an essay about this on Medium. Since this is a fifty minute talk, please indulge me while I read it to you in its entirety: "Chickenshit Minimalism: the illusion of simplicity backed by megabytes of cruft."

        This author exemplifies what I call "Chickenshit's Dunning Kruger." It's the prevailing disease of today's "writers" and "entrepreneurs." Succinctly, it can be explained as: "Chickenshit's Dunning Kruger: When your ego is so big and your intelligence so low, that you think one success makes you infallible."

        >Finally, I want to talk about our giant backends. How can we expect our web interfaces to be slim when we're setting such a bad example on the server side?

        Stop. More and more people use websites. Synchronous programming no longer works at that scale, and just about every website needs load balancing. You unintelligible troglodyte.

        >Most website work is pretty routine. You hook up a database to a template, and make sure no one trips over the power cord. But automation at scale? That's pretty sweet, and it's difficult!

        Stop. You cannot compare the simple phpMyAdmin setup you did years ago to what most companies need to do now.

        > That's what it feels like to be a programmer, lost in the cloud. Complexity is like a bug light for smart people. We can't resist it, even though we know it's bad for us. This stuff is just so cool to work on.

        See: "Chickenshit's Dunning Krugar." You're not a programmer, you're a webdev.

        >Adam Drake wrote an engaging blog post about analyzing 2 million chess games. Rather than using a Hadoop cluster, he just piped together some Unix utilities on a laptop, and got a 235-fold performance improvement over the 'Big Data' approach. The point is not that people using Hadoop clusters are foolish, or that everything can be done on a laptop. It's that many people's intuition about what constitutes a large system does not reflect the reality of 2015 hardware. You can do an awful lot on a laptop, or pizza box web server, if you skip the fifty layers of overhead.

        Right tool for the job, etc. etc. Unix utils work great for structured string processing; Hadoop works great for analyzing HUGE amounts of unstructured data (terabytes, where the chess games were only 1.7 GB) and for scaling horizontally across other machines' hardware. Bash cannot do this. And Bash utils are just C anyway.
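
        For what it's worth, Drake's approach is real enough. Here's a minimal sketch of the "pipe Unix utilities instead of Hadoop" idea, counting White's wins from PGN result tags; the four-line inline sample is made up for illustration and stands in for his 1.7 GB chess file:

```shell
# Count decisive wins for White in a PGN-style dump.
# The printf'd sample stands in for a large file of chess games.
printf '[Result "1-0"]\n[Result "0-1"]\n[Result "1-0"]\n[Result "1/2-1/2"]\n' \
  | grep -c '"1-0"'
# prints 2
```

        On real data you'd replace the printf with the file itself; the point stands that a single grep over a stream beats standing up a cluster for gigabyte-scale inputs.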

        >Let me give you a concrete example. I recently heard from a competitor, let’s call them ACME Bookmarking Co., who are looking to leave the bookmarking game and sell their website. [...] Rather than trying to make your overbuilt projects look simple, ask yourself if they can't just be simple.

        The author goes on to bash a competitor that may or may not exist.

        >Here's the hortatory part of the talk: Let’s preserve the web as the hypertext medium it is, the only thing of its kind in the world, and not turn it into another medium for consumption, like we have so many examples of already.

        Too late. He hasn't said why this should be a noble goal, except by pointing to bloat. It seems awkwardly placed, and the intent behind it might be to rouse the crowd with some vapid "rebel against society" hype.

        > Let’s commit to the idea that as computers get faster, and as networks get faster, the web should also get faster.

        It is faster. But just like CPU power, bandwidth, and RAM, if it goes unutilized it's wasted. Note that there is a difference between smart utilization and taking over an entire core.

        > Let’s not allow the panicked dinosaurs of online publishing to trample us as they stampede away from the meteor. Instead, let's hide in our holes and watch nature take its beautiful course.

        Who knows what this means.

        > Most importantly, let’s break the back of the online surveillance establishment that threatens not just our livelihood, but our liberty. Not only here in Australia, but in America, Europe, the UK—in every free country where the idea of permanent, total surveillance sounded like bad science fiction even ten years ago.

        Thank you. I almost thought keeping the web as only HTML was the worst aside. Why is this statement here at all?

        > The way to keep giant companies from sterilizing the Internet is to make their sites irrelevant. If all the cool stuff happens elsewhere, people will follow. We did this with AOL and Prodigy, and we can do it again.

        The cool stuff is novel, flashy, hip websites. The "cool stuff" is not (for the majority) granddad's watercolor-paints encyclopedia.

        > For this to happen, it's vital that the web stay participatory. That means not just making sites small enough so the whole world can visit them, but small enough so that people can learn to build their own, by example.

        For educational and humanitarian purposes, sure. For companies that need to make money? No. People can already learn to build their own websites that rival billion-dollar companies'. The only thing stopping them is the persistence needed to work through tough challenges.

        I'm starting to believe this talk was given for no other reason than to express the author's mediocrity and inability to make a good website. If everyone agrees with his talk, surely he must be "doing fine." Maybe that's what those two awkward statements about "privacy" and "keeping the internet great" were for: ear candy to buy more agreement.

        > I don't care about bloat because it's inefficient. I care about it because it makes the web inaccessible.

        Is this what the talk was about? I really don't know.

        > Keeping the Web simple keeps it awesome. Thank you very much! HEAVY, ROILING, TROUBLED SEAS OF APPLAUSE

        The absolute, chickenshit, madman.
        • (Score: 0) by Anonymous Coward on Thursday March 15 2018, @04:26AM

          by Anonymous Coward on Thursday March 15 2018, @04:26AM (#652776)

          tl;dr lol

    • (Score: 0) by Anonymous Coward on Wednesday March 14 2018, @03:42PM (1 child)

      by Anonymous Coward on Wednesday March 14 2018, @03:42PM (#652437)

      you want to get a blog or something and put this there?

      might save you some time, considering your 1/2 is 3/4 and counting. if you had your own platform you could format it all for your audience in a way that doesn't force them to puzzle out what you're talking about any more than is already required.

    • (Score: 1) by TheSage on Thursday March 15 2018, @06:48AM (1 child)

      by TheSage (133) on Thursday March 15 2018, @06:48AM (#652807) Journal

      It's Dunning Kruger btw.

  • (Score: 2) by darkfeline on Wednesday March 14 2018, @08:05PM (4 children)

    by darkfeline (1030) on Wednesday March 14 2018, @08:05PM (#652570) Homepage

    >We never could figure out a way to do collapsible comment trees how we wanted to entirely without Javascript

    Well, it's quite obvious why. SoylentNews is a web application, not a WWW page or site. The semantics of HTML and HTTP are quite clear with respect to methods like GET/POST and URL semantics/resource naming, and SN does not follow them, being a forum/news web app.

    If SN were a real WWW site, every comment would be found at URLs like soylentnews.org/posts/1234/comments/1234; you would have to send a GET request for each individual comment, or perhaps send a filter query to the directory: soylentnews.org/posts/1234/comments/?rating=2&excludeUser=Arik for a list of IDs which one could then automatically wget using the previous per-resource URL format. New comments and posts would be submitted via raw POST to soylentnews.org/posts/ and soylentnews.org/posts/1234/comments/. And so on and so forth.

    Let this be an opportunity to recognize that it is possible to host non-WWW applications on WWW technologies, and that this is not necessarily a bad thing.

    --
    Join the SDF Public Access UNIX System today!
    • (Score: 0) by Anonymous Coward on Wednesday March 14 2018, @09:25PM

      by Anonymous Coward on Wednesday March 14 2018, @09:25PM (#652608)

      That's not how you define a web page.

    • (Score: 2) by maxwell demon on Thursday March 15 2018, @07:58AM (1 child)

      by maxwell demon (1608) on Thursday March 15 2018, @07:58AM (#652823) Journal

      If SN were a real WWW site, every comment would be found at URLs like soylentnews.org/posts/1234/comments/1234

      Does https://soylentnews.org/comments.pl?noupdate=1&sid=24545&page=1&cid=652570#commentwrap [soylentnews.org] count?

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 3, Interesting) by darkfeline on Friday March 16 2018, @03:12AM

        by darkfeline (1030) on Friday March 16 2018, @03:12AM (#653328) Homepage

        Not really. Those are clearly parameters being passed to a CGI script and aren't compliant with RFC 3986 semantics of "identifying a resource".

        If it was https://soylentnews.org/comments.pl?sid=24545&cid=652570 [soylentnews.org] and it only returned the "identified resource" (i.e., a single comment), then yes (although I personally do not like the URI format). But it returns a bunch of unrelated resources, like the post and child comments, along with the "identified resource" (not to mention all of the website frills, like the sidebar menus).

        --
        Join the SDF Public Access UNIX System today!
    • (Score: 2) by Wootery on Thursday March 15 2018, @11:31AM

      by Wootery (2341) on Thursday March 15 2018, @11:31AM (#652899)

      SoylentNews is a web application, not a WWW page or site

      Not really. An 'application' is a type of program, and SoylentNews doesn't even require a Turing complete language (not counting CSS being technically Turing complete). Remember the old Web 2.0 buzzword for the read/write web? That's what SN is.

      If SN were a real WWW site, every comment would be found at URLs

      As maxwell demon already pointed out, every comment does have its own URL.

      you would have to send a GET request for each individual comment

      Nope. You've just invented a silly architecture and decreed that this is the one true webby way to do things. Being a 'proper' website doesn't mean adding countless pointless roundtrips.

  • (Score: 0) by Anonymous Coward on Thursday March 15 2018, @11:10AM

    by Anonymous Coward on Thursday March 15 2018, @11:10AM (#652889)

    We never could figure out a way to do collapsible comment trees how we wanted to entirely without Javascript

    Hidden checkboxes, with CSS showing the sub-comments when the checkbox is checked. Then make the "show replies" a label with "for" set to the corresponding checkbox.

    I use a similar concept on my personal website to build a menu tree, though I'm using radio buttons instead to make the previous menu collapse when a different one is selected. Zero javascript involved.

    (You may need to use CSS animation to make the expanding/collapsing appear slow, otherwise people might think that nothing happened).
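
    For readers who haven't seen it, here is a minimal sketch of the checkbox technique described above; the class names and ids are made up for illustration:

```html
<style>
  /* Hide the checkbox itself and the replies by default. */
  .toggle, .replies { display: none; }
  /* Reveal the sibling replies block when the checkbox is checked. */
  .toggle:checked ~ .replies { display: block; }
</style>

<input type="checkbox" id="thread-42" class="toggle">
<label for="thread-42">show replies</label>
<div class="replies">
  <p>…child comments go here…</p>
</div>
```

    Clicking the label toggles the hidden checkbox, which the `~` sibling selector picks up. Swapping the checkboxes for radio buttons sharing a `name`, as described above, makes the sections mutually exclusive so opening one collapses the others.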
