

posted by azrael on Tuesday September 02 2014, @01:11PM   Printer-friendly
from the harder-better-faster-stronger dept.

The Web is going to get faster in the very near future. And sadly, this is rare enough to be news. The speed bump won't be because our devices are getting faster, though they are. It won't be because some giant company created something great, though they probably have. The Web will be getting faster very soon because a small group of developers saw a problem and decided to solve it for all of us.

Web developers recognized this problem very early on in the growth of what was called the "mobile" Web back then. So more recently, a few of them banded together to do something developers have never done before—create a new HTML element.

"As of August 2014, the size of the average page in the top 1,000 sites on the Web is 1.7MB. Images account for almost 1MB of that 1.7MB.

If you've got a nice fast fiber connection, that image payload isn't such a big deal. But if you're on a mobile network, that huge image payload is not just slowing you down, it's using up your limited bandwidth. Depending on your mobile data plan, it may well be costing you money.

What makes that image payload doubly annoying when you're using a mobile device is that you're getting images intended for giant monitors loaded on a screen slightly bigger than your palm. It's a waste of bandwidth delivering pixels most simply don't need."

The article covers in depth how a group of volunteer web developers battled with the HTML standards body to get the new element accepted, and also describes how this element will speed up our browsing once it's commonly implemented.
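For readers who want the punchline up front, the syntax that eventually shipped looks roughly like this (the file names and the 600px breakpoint below are invented for illustration): the browser takes the first matching `source`, and older browsers simply render the `img` fallback.

```html
<picture>
  <!-- Wide screens get the large asset; a 2x variant serves high-DPI displays. -->
  <source media="(min-width: 600px)" srcset="large.jpg 1x, large@2x.jpg 2x">
  <!-- Everyone else gets the small asset. -->
  <source srcset="small.jpg 1x, small@2x.jpg 2x">
  <!-- Browsers that don't understand <picture> fall back to this. -->
  <img src="small.jpg" alt="A sample photo">
</picture>
```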

  • (Score: 5, Informative) by pkrasimirov on Tuesday September 02 2014, @01:23PM

    by pkrasimirov (3358) Subscriber Badge on Tuesday September 02 2014, @01:23PM (#88480)
  • (Score: 4, Insightful) by TheLink on Tuesday September 02 2014, @01:38PM

    by TheLink (332) on Tuesday September 02 2014, @01:38PM (#88485) Journal

    Good for them! It's not that easy to get from idea to implementation[1].

    I had an idea some years ago to improve web security:
    http://lists.w3.org/Archives/Public/www-html/2002May/0021.html [w3.org]
    https://www.mail-archive.com/mozilla-security@mozilla.org/msg01448.html [mail-archive.com]
    If it had been implemented, a few of those worms from the past (MySpace, Yahoo, etc.) might not have worked.

    But supposedly they are doing something about it now: https://en.wikipedia.org/wiki/Content_Security_Policy [wikipedia.org]

    [1] Ideas are easy, getting stuff implemented is the hard part - that's why I think nowadays patents in practice are hindering progress more than helping.

  • (Score: 1) by Horse With Stripes on Tuesday September 02 2014, @01:39PM

    by Horse With Stripes (577) on Tuesday September 02 2014, @01:39PM (#88486)
    How many different implementations will this result in? No matter what any browser vendor says, they all want their own version to become the standard. This results in hemorrhoid after hemorrhoid for coders who need to deal with the user experience.
    • (Score: 3, Informative) by DECbot on Tuesday September 02 2014, @02:42PM

      by DECbot (832) on Tuesday September 02 2014, @02:42PM (#88511) Journal

      Considering the same developer wrote the implementation for Blink and WebKit, I imagine it will be fairly universal. Of course IE is the usual exception here.

      --
      cats~$ sudo chown -R us /home/base
      • (Score: 1) by Horse With Stripes on Tuesday September 02 2014, @03:52PM

        by Horse With Stripes (577) on Tuesday September 02 2014, @03:52PM (#88537)

        IE's market share is significant enough that it can't be ignored. We also shouldn't discount the variety of mobile browsers available to Android users.

        • (Score: 2, Informative) by francois.barbier on Tuesday September 02 2014, @06:06PM

          by francois.barbier (651) on Tuesday September 02 2014, @06:06PM (#88585)

          Yet, the other browsers' market share is significant enough that IE can't ignore them either.
          IE can't do its proprietary stuff anymore. The latest IE doesn't even implement IE conditional comments.
          Yay for healthy competition!

        • (Score: 4, Informative) by frojack on Tuesday September 02 2014, @06:56PM

          by frojack (1554) on Tuesday September 02 2014, @06:56PM (#88593) Journal

          As long as the proposed solution is backward compatible there shouldn't be a problem.
          It's not clear that it is.

          Basically, sending the maximum resolution the client is prepared to handle upstream to the server in the tail of the initial request, and letting the server decide which images to send, makes more sense than listing all available image sizes and letting the client choose which one to get. It would also have done less damage to the web as a whole.

          To recap, the markup pattern proposed by this community group is as follows:

          <picture alt="">
            <source src="mobile.jpg" />
            <source src="large.jpg" media="min-width: 600px" />
            <source src="large_1.5x-res.jpg" media="min-width: 600px, min-device-pixel-ratio: 1.5" />
            <img src="mobile.jpg" />
          </picture>

          Which is fine, but relies on web developers crawling through their source code and fixing every image they use. That will take forever to happen, and most will NEVER do it until the web-dev tools do it automatically. So the status quo will last essentially forever.

          By telling the server, either in the initial request or via the browser ID (user agent), the images could be scaled in real time to a size that fits within the browser's capabilities, and the work could all be done server side (and cached), rather than 1) forcing a NEW browser release on everyone, and 2) waiting for every web developer to rewrite their site.

          I suppose the above syntax could be built from the current simple syntax in the server on the fly, to serve pre-cached image sizes, but that too seems fraught with peril.
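frojack's server-side scheme amounts to a small selection function on the server. The size ladder and file names below are entirely hypothetical, and the header carrying the client width is assumed rather than specified here (the modern standardized cousin of this idea is Client Hints):

```javascript
// Sketch: pick the smallest pre-rendered variant that covers the
// client's reported maximum width. Sizes and names are hypothetical.
const VARIANTS = [
  { file: "photo-320.jpg",  width: 320 },
  { file: "photo-600.jpg",  width: 600 },
  { file: "photo-1200.jpg", width: 1200 },
];

function pickImage(clientMaxWidth) {
  // First variant at least as wide as the client, else the biggest one.
  const fit = VARIANTS.find(v => v.width >= clientMaxWidth);
  return (fit || VARIANTS[VARIANTS.length - 1]).file;
}

console.log(pickImage(360));  // a phone-ish width
console.log(pickImage(1920)); // a desktop width
```

The upside is exactly what frojack describes: one line of markup on the page, all the variant logic (and caching) server side.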

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 0) by Anonymous Coward on Tuesday September 02 2014, @08:26PM

            by Anonymous Coward on Tuesday September 02 2014, @08:26PM (#88631)

            Bingo. It won't catch on with developers, and it's the wrong solution to an outdated problem.

            1. Ten years ago or more, when mobile devices had tiny resolutions, this approach would have made sense. Now, the only way to make a nested list of images useful would be to specify one for each level of bandwidth (meaning bandwidth itemprops or new tags would be needed, instead of min-resolution media tags).

            2. It's backward-compatible for mobile devices but not for desktops: this pattern fails over to a mobile image on old versions of IE, which still hold significant market share.

            3. There is already a solution to image bandwidth: Apache's (and other servers') Content Negotiation with the qs parameter. That can be implemented without mucking up the html, although it does require a type-map file. That separates the image quality from the content, which is desirable, although it will probably freak out some graphic designers for whom text files are verboten mysteries of the coding staff. That much could be handled by development environment plugins in the background, though: a plugin in Coda or whatever hip graphic designers use now could, upon publishing/uploading files, generate alternate images and a type-map file automagically, all without mucking up the html.
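For reference, the type-map file the AC mentions looks roughly like this (file names hypothetical). Apache's mod_negotiation serves the variant whose Content-type best matches the request's Accept header, with qs expressing the server's own preference among variants:

```
URI: photo

URI: photo.webp
Content-type: image/webp; qs=0.9

URI: photo.jpg
Content-type: image/jpeg; qs=0.7
```

Note that negotiation here is by media type and quality weight, not by screen size, so this addresses format selection more than the resolution problem the picture element targets.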

  • (Score: 4, Interesting) by kaganar on Tuesday September 02 2014, @01:46PM

    by kaganar (605) on Tuesday September 02 2014, @01:46PM (#88491)

    Most people I know have phones with higher resolutions than their monitors, so I don't anticipate any observable mobile device bandwidth savings when browsing in a landscape orientation. A portrait orientation may be different. Which makes one wonder -- do I have to reload images to get the proper resolution when I switch from portrait to landscape? Or do I just load the larger resolution to begin with? If so, I would expect no bandwidth savings for most mobile devices.

    The more novel use of this tag seems like showing more tightly cropped images for smaller devices where you have seriously limited screen real estate, regardless of resolution.

    The actual news here is that we'll (hopefully) no longer need to use Javascript and some ugly scripts to semi-intelligently choose pictures based on view specifications. I suspect that most people won't notice the difference in speed or design.

  • (Score: 0) by sjwt on Tuesday September 02 2014, @01:50PM

    by sjwt (2826) on Tuesday September 02 2014, @01:50PM (#88493)

    The only thing that's going to come close to 'fixing HTML' is throwing it out the window.

    Start from scratch and develop something that's a true standard, and is workable.

    • (Score: 5, Insightful) by tibman on Tuesday September 02 2014, @02:14PM

      by tibman (134) Subscriber Badge on Tuesday September 02 2014, @02:14PM (#88500)

      You would also be throwing out the only thing that every browser/os is actually agreeing on : /

      --
      SN won't survive on lurkers alone. Write comments.
    • (Score: 2, Interesting) by len_harms on Tuesday September 02 2014, @02:31PM

      by len_harms (1904) on Tuesday September 02 2014, @02:31PM (#88505) Journal

      One thing I wish we could do with HTML and http/https markup is to design it for better cachability.

      I want shorter HTML in the form of shortcuts in URLs. Something like http://soylentnews.org/huge/sub/dir/list/of/stuff/someimage.jpg [soylentnews.org] could become SHORTCUT/someimage.jpg, saving tons of bytes in reference tags all over the place.

      I want the ability to tell the browser that site A is equal to site B, and that you may sometimes get different HTML from me. They are addressing some of that with pipelining, but not enough. So you could declare http://loadbalancer1.soylentnews.org/ [soylentnews.org] = http://loadbalancer2.soylentnews.org/some/sub/dir [soylentnews.org], tag them both with 'SHORT', and then SHORT/someimage.jpg is easily cacheable. As it stands, if I happen to hit both 1 and 2, I end up with two identical items cached, causing two fetches and using twice the space on the client side.

      I don't think HTTP/HTML is necessarily broken. I think it needs refactoring and a rethink of how people use it. Some of the HTTP/2 work that is going on is a good start. We do, however, have to be careful not to end up with something that is complex and hard for people to use just because it is 'easy' to code up.
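For what it's worth, a slice of the "SHORTCUT" idea already exists in HTML as the base element, which resolves every relative URL in the page against one prefix (though there can be only one per document, and it can't declare two hosts equivalent for caching purposes):

```html
<head>
  <!-- All relative URLs below resolve against this prefix. -->
  <base href="https://soylentnews.org/huge/sub/dir/list/of/stuff/">
</head>
<body>
  <!-- Fetches .../stuff/someimage.jpg without repeating the long path. -->
  <img src="someimage.jpg" alt="">
</body>
```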

      • (Score: 2) by joshuajon on Tuesday September 02 2014, @04:11PM

        by joshuajon (807) on Tuesday September 02 2014, @04:11PM (#88546)

        The solution to this already exists and is widely implemented in the form of reverse proxy caching. There should only be one URL that clients should know about. Your internal reverse proxy handles distribution of the requests to the appropriate load balancer. Check out https://www.varnish-cache.org/ [varnish-cache.org] for probably the most common software of this type.

        • (Score: 1) by len_harms on Tuesday September 02 2014, @04:38PM

          by len_harms (1904) on Tuesday September 02 2014, @04:38PM (#88556) Journal

          Sort of. That is a decent solution for that particular sort of thing (which is load balancing). It however requires another server in the mix. When much of this information could be done at the client level. I am talking more about giving more information to the client so they can cache better.

          Another issue is that many sites run into the '2 connections per host' limit in HTTP. So they start creating 'fake' hostnames that are really the same thing. The client can get around that by changing settings in their browser, but the majority of browsers limit you to 2, so you code to that. The browser will see the same object on two hostnames as two different objects, even though it could be the same object.

          Another problem I see is many sites serving up static data as dynamic data, thus fighting the cache. So when you visit a site, you more than likely re-do the object request anyway. But that is a windmill for another day :)

          I think it is all symptoms of the same issue. Which is poor descriptions of what is a retrievable object. So we do things like reverse proxies and dns timeout trickery. The poor description is because the spec does not take into account load balancing. That is left as a network exercise. So we do tricks to get around it.

          • (Score: 3, Interesting) by joshuajon on Tuesday September 02 2014, @05:16PM

            by joshuajon (807) on Tuesday September 02 2014, @05:16PM (#88566)

            Varnish will also alleviate another problem you identified. Specifically, it can cache dynamic pages and serve them as static content. You can discount varnish as trickery, or a workaround, but it's becoming increasingly common and extremely useful for handling incoming web traffic both for caching as well as facilitating load balancing. You could argue that this functionality could be built into the web server, and perhaps you're right, but many prefer smaller simpler tools with more narrowly defined scopes that are as uncoupled as possible.

            But I think it's not realistic to expect clients to solve the caching problem alone. I think you're right that servers can give better information to allow more efficient caching. It's possible for site operators to explicitly define cache-control headers, which will maximize caching when appropriate and minimize it for truly dynamic resources. In fact, in the case of a site using a CDN, these headers become even more important, because it's necessary to have some control over how both the client and the intermediary CDN perform their caching.
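The max-age logic described here can be sketched as a freshness check. This is a deliberately simplified model of what HTTP caches do (see RFC 7234), not a complete implementation:

```javascript
// Sketch: decide whether a cached response is still fresh, given the
// Cache-Control header it was stored with and its current age in seconds.
function isFresh(cacheControl, ageSeconds) {
  // no-store / no-cache responses must always be revalidated or refetched.
  if (/\bno-store\b|\bno-cache\b/.test(cacheControl)) return false;
  const m = /\bmax-age=(\d+)/.exec(cacheControl);
  if (!m) return false;               // no explicit lifetime: revalidate
  return ageSeconds < Number(m[1]);
}

console.log(isFresh("public, max-age=3600", 120)); // still fresh
console.log(isFresh("no-store", 0));               // never cacheable
```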

    • (Score: 2) by kaszz on Wednesday September 03 2014, @02:27AM

      by kaszz (4211) on Wednesday September 03 2014, @02:27AM (#88733) Journal

      Aha.. you mean like Standard Generalized Markup Language [wikipedia.org] standardized in 1986 as ISO 8879 ..? :p

      LaTeX could also be a candidate.

      Usually these things work fine until some corporation wants a piece of the action...

  • (Score: 4, Informative) by tangomargarine on Tuesday September 02 2014, @02:35PM

    by tangomargarine (667) on Tuesday September 02 2014, @02:35PM (#88508)

    It drove me up the wall that I got through the entire summary and halfway through the SECOND PAGE of the article without having any idea what the fuck the proposal technically DOES. Cutting out all the talking-down examples and vague, wishy-washy language:

    ...make Picture a wrapper for img...

    When the browser encounters a Picture element, it first evaluates any rules that the Web developer might specify....Then, after evaluating the various rules, the browser picks the best image based on its own criteria....Once the browser knows which image is the best choice, it actually loads and displays that image in a good old img element....

    This solves two big problems. With the browser prefetching problem, prefetching still works and there's no performance penalty. And for the problem of what to do when the browser doesn't understand picture, now it falls back to whatever is in the img tag.

    Until I found this, I suspected the entire summary could have been replaced with the single word "Compression."

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 2) by tangomargarine on Tuesday September 02 2014, @02:37PM

      by tangomargarine (667) on Tuesday September 02 2014, @02:37PM (#88510)

      They mentioned a few things that are "obviously" bad ideas that I'm not aware of the reason for, either. Why is a mobile version of a website a bad thing? Or are we just talking maintainability reasons?

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 2) by tibman on Tuesday September 02 2014, @02:49PM

        by tibman (134) Subscriber Badge on Tuesday September 02 2014, @02:49PM (#88516)

        It seemed a bit more opinion-based. Some people say one responsive site is easier to maintain than two separate sites (one for mobile, one for desktop). The biggest reason one responsive site seems to be the winner is that it isn't just choosing between big and small formats: it scales to several different display sizes and tries to fill up as much of the display as possible. I know I certainly dislike desktop sites that are a thin vertical strip of content.

        --
        SN won't survive on lurkers alone. Write comments.
        • (Score: 1) by mckwant on Tuesday September 02 2014, @04:15PM

          by mckwant (4541) on Tuesday September 02 2014, @04:15PM (#88548)

          If I'm reading this correctly, it's basically:

          if (Android || iPhone):
              smallImage
          else:
              bigImage

          ...and I'm not sure changing/altering the image gets you the whole way there. Your layout, spacing, etc., isn't image based, and won't (AFAIK, please correct me) adapt to the same inputs as the image. You'll have to reinterpret the inputs for your text and the rest of your page, too.

          So you're effectively doing two layouts anyway, you're just putting the weight on the dynamic engine for the images. And your code is awash in ifs. Have fun debugging.

          I'm not arguing, and the bandwidth benefits are obvious over mobile networking, but this probably won't solve much in managing mobile v. non-mobile web content. Until somebody comes up with the jQuery (read: standard handling of a tricky space) of this thing, anyway.

          I think.

          • (Score: 3, Informative) by tibman on Tuesday September 02 2014, @05:12PM

            by tibman (134) Subscriber Badge on Tuesday September 02 2014, @05:12PM (#88564)

            A responsive layout doesn't use divs/spans in a static way. They stack vertically or layout horizontally to consume the entire display. The image size does play a role in how much content you can fit on the screen. Here is an example: http://adaptive-images.com/ [adaptive-images.com]
            You should see a large image to the left, a graphic legend about image sizes to the right, and a collection of images in a row at the bottom. Now resize your browser to make it smaller. You'll notice the image resizes to fit the window (no scroll bars). When you get small enough you'll see that the legend on the right moves to be below the picture to give the image more space. The legend also changes from a vertical layout to a horizontal one. This effect is done by using CSS, not javascript (you can even noscript the page).

            This isn't about creating a mobile vs non-mobile site. It's about creating a site that uses the end-user's screen to its fullest potential. If someone visits your site with a 4k display they don't want to see a tiny box of content. Some mobile displays are even higher resolution than desktop monitors.

            You are right about "debugging". CSS has always been a pain to work with, especially if the design has to be pixel perfect : / But I hope that example link shows what the intended goal of a responsive site is.

            --
            SN won't survive on lurkers alone. Write comments.
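The reflow tibman describes is done with CSS media queries rather than JavaScript. A minimal sketch (class names and the 600px breakpoint are invented):

```css
/* Legend sits beside the picture on wide screens; the image
   scales down with its container instead of overflowing. */
.legend { float: right; width: 30%; }
img     { max-width: 100%; height: auto; }

/* On narrow screens the legend drops below the picture and
   switches from a vertical to a horizontal layout. */
@media (max-width: 600px) {
  .legend { float: none; width: 100%; display: flex; }
}
```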
      • (Score: 0) by Anonymous Coward on Tuesday September 02 2014, @04:58PM

        by Anonymous Coward on Tuesday September 02 2014, @04:58PM (#88560)

        Why is a mobile version of a website a bad thing?

        Because they have different links, and when you post a link on a web site (such as SoylentNews), it will point either to the desktop version (annoying all mobile users) or the mobile version (annoying all desktop users).

        And yes, I've already been annoyed a lot by links to the mobile Wikipedia site.

      • (Score: 5, Insightful) by emg on Tuesday September 02 2014, @05:20PM

        by emg (3464) on Tuesday September 02 2014, @05:20PM (#88569)

        Why is a mobile version of a website a bad thing?

        Once upon a time, we had a dream. What if we developed a hyperlinked format where content was separated from presentation, so you'd create a document, and the program that opened that document would figure out how to best display it on whatever device you wanted.

        And, Lo, there was HTML.

        But then the Graphic Designers invaded the Web. And, lo, suddenly pages began to say 'best viewed in Internet Exploder at 300x976 pixels', or displayed gibberish if you didn't have Comic Sans font installed.

        And thus was the dream destroyed. Instead of the device deciding how to display the content, the Graphic Designers designed different content for every device. Which will keep them in work for centuries to come, but results in users getting really freaking annoyed when the site decides that their dual-screen 24" monitor Linux desktop is actually an Android phone.

        • (Score: 1) by gtomorrow on Tuesday September 02 2014, @08:22PM

          by gtomorrow (2230) on Tuesday September 02 2014, @08:22PM (#88627)

          Once upon a time, we had a dream. What if we developed a hyperlinked format where content was separated from presentation, so you'd create a document, and the program that opened that document would figure out how to best display it on whatever device you wanted.

          And, Lo, there was HTML.

          Ehmm, no. Content separated from presentation is CSS, not HTML. The program (at the time of "the dream" of HTML becoming reality) was WorldWideWeb, then NCSA Mosaic or some variation (Spyglass comes to mind), and then Netscape. And "whatever device you wanted" was only one...your PC, most likely with a 14-17" CRT monitor.

          1995: then came Microsoft's Internet Explorer. And the "browser tag" wars started...by programmers (or more likely, management), not because of your so-called invasion of the graphic designers. I don't remember a time when a page (not in a non-Roman alphabet) didn't fall back to a serif or sans-serif font when Comic Sans was specified but not installed. Never.

          According to "the dream" everything should look like this. [w3.org] I guess the 1994 version of HotWired [veen.com] was already too "artsy-fartsy" for you.

          CSS, despite being a W3C recommendation since 1996, didn't even begin to enjoy mainstream usage until the mid-2000s. And that's where the presentation, not the content, changes to fit the device (in this case, mobile devices, a problem that's existed maybe only 10-ish years, in my humble estimation). The device never decided how to display content. It was never intended to decide. The HTML headers "decide" for the device.

          With all due respect, I think "we had a dream" is just you dreaming.

          • (Score: 2) by forsythe on Wednesday September 03 2014, @12:39AM

            by forsythe (831) on Wednesday September 03 2014, @12:39AM (#88711)

            I must respectfully disagree. Old standards of HTML separated content from presentation quite nicely - they left it up to the HTML agent. If you examine those older specifications, the wording of the elements tiptoes around stating ``This is how you shall display it.'' Instead, there is a guideline that refers to a ``typical'' implementation, with the explanation

            Typical processing is described for many elements. This is not a mandatory part of the specification but is given as guidance for designers and to help explain the uses for which the elements were intended.

            (Emphasis mine.) Thus you get element descriptions such as

            The STRONG element indicates strong emphasis, typically rendered in bold.

            As a result, older versions of HTML are quite presentation-agnostic (including <img>). You can turn them into audio, for example. That's not a dream. That's something that happened, and which started getting really annoying to do about the time <frame> came on the scene.

          • (Score: 0) by Anonymous Coward on Wednesday September 03 2014, @08:04AM

            by Anonymous Coward on Wednesday September 03 2014, @08:04AM (#88805)

            Ehmm, no. Content separated from presentation is CSS, not HTML.

            No.

            Original HTML idea: The document provides the content, and does not tell at all how it is to be presented. That's left to the browser (the very first browsers were text based, and might not even have had a way to show bold or italics depending on the terminal you ran them on).

            Perverted HTML (mostly thanks to Netscape and Microsoft): "Enrich" HTML with tags telling how the content shall be presented.

            Compromise solution: Separate content from presentation again, but still allow presentation to be specified by the provider instead of the browser, using CSS.

        • (Score: 0) by Anonymous Coward on Wednesday September 03 2014, @12:41AM

          by Anonymous Coward on Wednesday September 03 2014, @12:41AM (#88712)

          Hello.
          I'm sorry; I Torified my Android phone to surf through your Tor exit.
          I suspect that we both visited the same site, but my Android got "registered" to your Tor exit IP first, and the site then assumed that you (coming from the same IP) were also on Android.
          I have fixed this now. My mobile Firefox is now pretending to be desktop Firefox x64 for Win7.

    • (Score: 2) by tibman on Tuesday September 02 2014, @02:51PM

      by tibman (134) Subscriber Badge on Tuesday September 02 2014, @02:51PM (#88519)

      Agreed! But check out pkrasimirov's link. It's a technical article about the new/proposed tag. It sounds like it is already available in FF and Chrome; IE compatibility is noted as under construction.

      --
      SN won't survive on lurkers alone. Write comments.
    • (Score: 0) by Anonymous Coward on Tuesday September 02 2014, @04:13PM

      by Anonymous Coward on Tuesday September 02 2014, @04:13PM (#88547)

      Compression

      Always a great idea, and now easier than ever [trimage.org]. (A free-software frontend to free-software tools that losslessly compress JPG and PNG.)

      However, meanwhile in the real world, we have to remember to be grateful when we don't run into BMP files, PNG photos, or JPG diagrams...

      • (Score: 2) by tangomargarine on Wednesday September 03 2014, @02:31PM

        by tangomargarine (667) on Wednesday September 03 2014, @02:31PM (#88912)

        free software tools losslessly compressing jpg and png

        I was under the impression PNG was already losslessly compressed (why it's so much more efficient than BMP). You mean they recompress it with a more efficient algorithm? Wikipedia says it uses DEFLATE, which apparently isn't hard to beat (cf. .zip vs .7z); presumably the question is just doing it so that the browser doesn't require any extra software to decode it on their end.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 3, Interesting) by calzone on Tuesday September 02 2014, @03:53PM

    by calzone (2181) on Tuesday September 02 2014, @03:53PM (#88538) Journal

    * Making all video and flash opt-in all the time.
    * Create a free, OSS web service called "v2t" (video to text) that takes a video URL and spits back a still image of the video, its transcript, plus a link to the source video. Encourage all blog owners to use v2t instead of directly linking to video.
    * Pass a law making it illegal to serve more than x bytes of ads to consumers on any pay-per-byte connections. Seriously. TV is fine because it's always on for a fixed price. Same for print. Imagine if you paid MORE to support being served MORE ads? That has to stop. I don't want to pay for the bandwidth ads eat up.

    --

    Time to leave Soylent News [soylentnews.org]

    • (Score: 2) by captain normal on Tuesday September 02 2014, @04:23PM

      by captain normal (2205) on Tuesday September 02 2014, @04:23PM (#88551)

      Which is why plug-ins such as NoScript and AdBlock exist. When I'm using a mobile connection I have them on. If I want to see an image or video, then I can select which to load.

      --
      "Everyone is entitled to his own opinion, but not to his own facts." --Daniel Patrick Moynihan
      • (Score: 1) by calzone on Tuesday September 02 2014, @04:28PM

        by calzone (2181) on Tuesday September 02 2014, @04:28PM (#88553) Journal

        Plugins don't always work flawlessly. Enforcing simple netiquette (remember how we killed the blink tag? That.) goes a longer way toward accommodating more users, of more types, on more platforms, for cheaper, while using fewer CPU resources all around. It's win-win-win.

        --

        Time to leave Soylent News [soylentnews.org]

        • (Score: 1, Informative) by Anonymous Coward on Tuesday September 02 2014, @05:34PM

          by Anonymous Coward on Tuesday September 02 2014, @05:34PM (#88573)

          Your "free" v2t service is pie-in-the-sky. Automatic transcription is essentially impossible and doing it for free just ain't going to happen. Look at google - their transcription of voicemail is really hit-or-miss and that's with a relatively constrained input of a single speaker talking directly into a microphone with little to no background noise.

          • (Score: 1) by calzone on Tuesday September 02 2014, @09:51PM

            by calzone (2181) on Tuesday September 02 2014, @09:51PM (#88667) Journal

            Actually, voicemail is pretty crappy audio quality compared to videos, particularly commercial videos, which are the worst offenders on the internet: you get linked to something you're interested in, only to find it's a video and not an article.

            Videos suck not just because they're loud, not just because they consume more bandwidth, not just because you can't excerpt quotes to share with friends... but because they are inherently linear and you can't effectively find the choice pieces of info you care about using random access seeking. They're a major waste of resources and time.

            Now if we had said software, it would probably make a lot of mistakes. A lot of them would be hilarious (and spawn plenty of memes). But the very practice, especially when it comes out wrong, would send a clear message to content providers that video is a crappy medium: people are attempting to circumvent it and, as a result, getting a poor translation of its intent.

            This would incentivize them to avoid doing these stupid pieces in the first place and go back to written content, or to provide transcripts on their own (imagine a <transcript> tag!)

            --

            Time to leave Soylent News [soylentnews.org]

  • (Score: 1) by Username on Tuesday September 02 2014, @05:39PM

    by Username (4557) on Tuesday September 02 2014, @05:39PM (#88575)

    The real solution would be to just compress the files before putting them online. I know it was big in the 90s, but now it seems to be a lost art.

    Also, Opera created something great: Opera Mini, which compresses all images through their proxy service by default. IMHO the fastest version is 6.5, but Blink is still champ.

  • (Score: 3, Interesting) by tempest on Tuesday September 02 2014, @05:40PM

    by tempest (3050) on Tuesday September 02 2014, @05:40PM (#88576)

    "As of August 2014, the size of the average page in the top 1,000 sites on the Web is 1.7MB. Images account for almost 1MB of that 1.7MB.

    And I'm sure that 1.7MB worth of stuff is critically important for whatever stupendous information it holds, compared to 15 years ago when web pages had virtually no useful intelligent content. I mean, there's no way they could get a message across in less than 20k. (Yes, the sarcasm tag is still non-standard, unfortunately.)

    I propose two new tags: <unnecessary-bullshit> and <actual-useful-content>. Then my browser can just download what I want to know and theme <actual-useful-content> with a format I'd like to look at. Then again, that would admit that we're trying to fix bloat (which can never be fixed, because patching the problem simply allows room for more bloat), which is unacceptable.

    • (Score: 0) by Anonymous Coward on Tuesday September 02 2014, @08:25PM

      by Anonymous Coward on Tuesday September 02 2014, @08:25PM (#88630)

      Your solution, sadly, is as unworkable as the 'evil bit' and for the same reason. It requires cooperation from your adversary.

    • (Score: 2) by kaszz on Wednesday September 03 2014, @02:18AM

      by kaszz (4211) on Wednesday September 03 2014, @02:18AM (#88726) Journal

      Removing tags like head, link, style, div, span, etc. would save a lot.

      Same goes for gigantic javascripts, which are usually unnecessary and usually break anyway, i.e. make the browser go to 100% CPU.

      Here's a dump of where your CNN.com bytes goes, tag wise:
      Tag Bytes
      a 32686
      div 19101
      img 18992
      li 4962
      span 2898
      option 1892
      meta 1351
      input 1125
      iframe 838
      link 800
      script 494

      So HTML works; it's most common web designers who seem to fail.

  • (Score: 2) by mcgrew on Tuesday September 02 2014, @07:18PM

    by mcgrew (701) <publish@mcgrewbooks.com> on Tuesday September 02 2014, @07:18PM (#88601) Homepage Journal

    let's build websites that were flexible. Marcotte envisioned sites that used relative widths to fit any screen and worked well no matter what device was accessing it.

    I and many others have tried to get stupid web developers to understand that long before there was mobile browsing. For instance, why is there a damned horizontal scroll at the bottom of my screen? Damn it, use relative positioning!!! Count percentage of screen space, not pixels.

    Some devices are widescreen, some standard aspect, some landscape and some portrait. You're not designing a printed page you can control the size and shape of, you're designing for a fluid medium.
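
    A minimal sketch of that relative-widths approach in CSS (the class name and the particular percentages are illustrative, not taken from any real site):

    ```css
    /* Let the layout scale with the viewport instead of fixing pixel widths */
    .content {
      width: 80%;          /* a percentage of the screen, not pixels */
      max-width: 60em;     /* cap line length for readability on huge monitors */
      margin: 0 auto;      /* center whatever space is left over */
    }

    /* Images shrink with their container, so no horizontal scrollbar */
    img {
      max-width: 100%;
      height: auto;
    }
    ```

    The same markup then reflows on a phone, a laptop, or a 42 inch monitor.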

    Ranting about moron developers aside, I would have rather read about how to use the <picture> element, but the Ars article only went on and on about politics and bureaucracy among those who come up with the standards. I did find the info here [w3.org] and here. [html5hub.com]

    BTW, my own site works as well on a phone as it does on my 42 inch monitor. Lots of sites (newspapers are the worst) don't even render properly on my laptop.

    I think the biggest problem is developers come from the advertising business, where you have to take an IQ test to be hired; any score over 80 disqualifies you.

    --
    mcgrewbooks.com mcgrew.info nooze.org
    • (Score: 2) by kaszz on Wednesday September 03 2014, @02:23AM

      by kaszz (4211) on Wednesday September 03 2014, @02:23AM (#88729) Journal

      Yep, probably a mindset problem and not a technical one. By designing for a target rather than here's my information and hints. Render it as you please.

      This comment is best viewed with 800x600 Windows98 using Inbreed Exploitme 6.

  • (Score: 2) by Geotti on Wednesday September 03 2014, @02:22AM

    by Geotti (1146) on Wednesday September 03 2014, @02:22AM (#88728) Journal

    Ok, so how is this better than the media attribute in CSS? https://en.wikipedia.org/wiki/Media_queries [wikipedia.org]

    Say I have a master stylesheet and overload the elements containing pictures for each resolution I want to provide. This results in cleaner HTML and I can have separate CSS files for each resolution.

    /me believes he must have missed some crucial fact about this new <picture> tag...
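
    For reference, the approach being described could be sketched like this (the selector, breakpoints, and filenames are made up for illustration):

    ```css
    /* Mobile-first: small image by default */
    .hero {
      background-image: url("photo-small.jpg");
    }

    /* Override for wider screens with media queries */
    @media (min-width: 480px) {
      .hero { background-image: url("photo-medium.jpg"); }
    }

    @media (min-width: 1024px) {
      .hero { background-image: url("photo-large.jpg"); }
    }
    ```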

    • (Score: 1) by chris.alex.thomas on Wednesday September 03 2014, @08:28AM

      by chris.alex.thomas (2331) on Wednesday September 03 2014, @08:28AM (#88815)

      how is it better?

      ok, how do you specify an image inside CSS? As a background image, right? But you know that you can't then size a container based on a background image, requiring you to lock the width and height in some way, so it's impractical.

      this way allows all the normal rules of HTML; the browser, depending on the current media query, will download one specific image from a list. So instead of downloading a 4K image on a mobile phone, you'll download the 640x480 version instead, saving you lots of bandwidth.
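
      As a sketch (the filenames and breakpoints here are invented for illustration; the element syntax is per the HTML spec):

      ```html
      <!-- The browser evaluates the media queries and fetches only one image -->
      <picture>
        <source media="(min-width: 1024px)" srcset="photo-large.jpg">
        <source media="(min-width: 480px)" srcset="photo-medium.jpg">
        <!-- Fallback for small screens and for browsers without <picture> support -->
        <img src="photo-small.jpg" alt="Photo description">
      </picture>
      ```

      Unlike a CSS background image, the <img> inside <picture> is real content: it sizes its container normally and follows all the usual HTML rules.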