posted by martyb on Monday March 12 2018, @11:11PM
from the embrace,-extend... dept.

Google promises publishers an alternative to AMP

Google's AMP project is not uncontroversial. Users often love it because it makes mobile sites load almost instantly. Publishers often hate it because they feel like they are giving Google too much control in return for better placement on its search pages. Now Google proposes to bring some of the lessons it learned from AMP to the web as a whole. Ideally, this means that users will profit from Google's efforts and see faster non-AMP sites across the web (and not just in their search engines).

Publishers, however, will once again have to adopt a whole new set of standards for their sites, but with this, Google is also giving them a new path to be included in the increasingly important Top Stories carousel on its mobile search results pages.

"Based on what we learned from AMP, we now feel ready to take the next step and work to support more instant-loading content not based on AMP technology in areas of Google Search designed for this, like the Top Stories carousel," AMP tech lead Malte Ubl writes today. "This content will need to follow a set of future web standards and meet a set of objective performance and user experience criteria to be eligible."

Also at Search Engine Land and The Verge.

Related: Kill Google AMP Before It Kills the Web
Google Acquires Relay Media to Convert Ordinary Web Pages to AMP Pages
Google Bringing Accelerated Mobile Pages to Email


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2, Informative) by Anonymous Coward on Tuesday March 13 2018, @12:02AM (9 children)

    by Anonymous Coward on Tuesday March 13 2018, @12:02AM (#651587)

    I've been living without JS as much as possible for years, and in that time the number of sites that completely break without it has skyrocketed. Not even a week ago YouTube was reduced to a black screen with a white bar across the top. It used to at least show the first few rows of videos/titles, which I could manually youtube-dl by copying the links. I used to be amazed when JS-heavy sites loaded 10-15 scripts; today 80+ is the norm...

    Further, in my years-long effort to avoid JS I've convinced exactly zero people to do the same. Avoiding or boycotting tech rarely evokes the desired change. JS, new versions of Firefox/Chrome/Windows/Linux/whatever, mobile devices, government spying, all continue to march forward at an ever-accelerating rate regardless of how much the masses disapprove of their choices or behavior; and so will AMP.

  • (Score: 3, Touché) by bob_super on Tuesday March 13 2018, @01:01AM (3 children)

    by bob_super (1357) on Tuesday March 13 2018, @01:01AM (#651606)

    The masses care about the social network post, the game streaming, and the porn.
    The masses don't know about Javascript or any of the obscure piping that makes the bits go around.

    I browse the not-neutered web on Linux with NoScript, exchanging the inconvenience for the certainty that my use case is such a tiny fraction of a percent that it's hardly worth writing nasty code against.
    I have no illusions of "the masses" ever learning, let alone bothering, to fight for smaller pages and fewer scripts. You could run crypto-miners all day long without fear of an uprising.

    • (Score: 0) by Anonymous Coward on Tuesday March 13 2018, @08:32AM (2 children)

      by Anonymous Coward on Tuesday March 13 2018, @08:32AM (#651735)

      The masses care about the social network post, the game streaming, and the porn.

      Porn sites are the exception, though. Many porn sites work without JS, some even work better without than with. Probably related to the whole idea that porn sites are especially risky, so many people will browse porn with JS disabled.

      Videos are an exception, though; I'm not sure whether this is for copyright reasons (though Video Download Helper works fine) or simply web devs not knowing that the video tag works fine without JavaScript. Considering the number of posts I've seen on sites like StackExchange where simple CSS questions get answered with "use jQuery", I'm guessing it's the latter.

      • (Score: 3, Informative) by Pino P on Tuesday March 13 2018, @04:15PM

        by Pino P (4721) on Tuesday March 13 2018, @04:15PM (#651867) Journal

        Videos are an exception, though, I'm not sure whether this is for copyright reasons

        True, Encrypted Media Extensions for digital restrictions management of web video require JavaScript. But that's not the only problem. The other problems are seeking, variable throughput of the Internet connection, and live video.

        The naive method of seeking in a recorded video, relying on HTTP range requests, runs into two problems. The first is variable bit rate. Dropping the needle one-third of the way through a file won't get you one-third of the way into the runtime if the first third is encoded with a greater or lesser bitrate than the remainder. So a player has to use bisection search to figure out at which byte offset to start retrieving the video data, and this sort of back and forth can take a while over a high-latency satellite or cellular connection.
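
        To make that back-and-forth concrete, here is a rough sketch of the idea. firstTimestampIn() is a stand-in for demuxing the returned chunk and reading the first timestamp found in it; a real player would consult the container's own index instead whenever one is available.

          async function findSeekOffset(url, fileSize, targetSeconds) {
            let lo = 0, hi = fileSize;
            while (hi - lo > 1 << 20) {                       // stop once within 1 MiB
              const mid = Math.floor((lo + hi) / 2);
              // Each probe is one round trip: fetch a small window at the midpoint.
              const resp = await fetch(url, { headers: { Range: `bytes=${mid}-${mid + 65535}` } });
              const chunk = await resp.arrayBuffer();
              if (firstTimestampIn(chunk) < targetSeconds) {  // stand-in helper, see above
                lo = mid;
              } else {
                hi = mid;
              }
            }
            return lo;  // byte offset to start streaming from
          }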

        The other problem with HTTP range requests is web servers that fail to support range requests. Two decades ago, download managers used range requests to attempt to retrieve several pieces of a file at once, exploiting throttles that operated per connection. But because these used up more connection resources on download servers, several file download services deliberately disabled range requests except for those servers reserved for paying subscribers. A browser might discover that a range request has failed, and the server has fallen back to resending the entire video from the start.
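
        Detecting that fallback is at least cheap, since the status code says whether the server honored the range. A tiny sketch, with videoUrl standing in for the real resource:

          async function supportsRanges(videoUrl) {
            const resp = await fetch(videoUrl, { headers: { Range: 'bytes=0-1' } });
            // 206 Partial Content: the server honored the range request.
            // A plain 200 means it ignored the header and is resending the whole file.
            return resp.status === 206;
          }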

        Another cause of seeking is varying throughput of an Internet connection over short periods. To ensure a seamless experience for viewers, a service might want the player to choose among encodes at different bitrates. But when it switches bitrates, it has to seek to the corresponding position in the lower or higher bitrate encode.
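
        The switching decision itself is simple to sketch. The bitrate ladder below is invented; real players also smooth the throughput estimate and account for how much is already buffered.

          // Invented bitrate ladder, in bits per second.
          const renditions = [
            { bitrate:  400000, playlist: 'low.m3u8'  },
            { bitrate: 1500000, playlist: 'mid.m3u8'  },
            { bitrate: 4000000, playlist: 'high.m3u8' },
          ];

          function pickRendition(measuredBps) {
            const budget = measuredBps * 0.75;  // leave headroom so a small dip doesn't stall playback
            const affordable = renditions.filter(r => r.bitrate <= budget);
            return affordable.length ? affordable[affordable.length - 1] : renditions[0];
          }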

        Viewers expect to drop the needle in a live video at what's happening right now, or perhaps a minute ago, after the video has passed through the buffers of state- or advertiser-required censorship and codecs that use B-frames. The naive solution of encoding the video separately for each viewer doesn't scale. An improvement is to encode once and start each stream at the next keyframe. But the architecture of widely used repeater services (also called content delivery networks) works on a whole file, not a stream that each server would need to demux and remux.

        The common solution to these is breaking the videos into segments 3 to 10 seconds long, storing each at a separate URL, and linking them from a timed playlist. Then the CDNs can cache each segment, and the player can choose which to request based on the current playback position and recent throughput. But if the browser doesn't support common playlist formats, such as Apple HTTP Live Streaming or MPEG-DASH, a polyfill served by the website needs to handle this using the Media Source Extensions API, which requires JavaScript.
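
        For anyone curious what that polyfill path looks like, here is a minimal sketch of the Media Source Extensions side, assuming the playlist has already been fetched and parsed; the segment names and codec string are placeholders.

          const video = document.querySelector('video');
          const mediaSource = new MediaSource();
          video.src = URL.createObjectURL(mediaSource);

          mediaSource.addEventListener('sourceopen', async () => {
            const buf = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f, mp4a.40.2"');
            // Placeholder segment names; a real player would append an initialization
            // segment first and pick segments by playback position and throughput.
            for (const url of ['seg0.m4s', 'seg1.m4s', 'seg2.m4s']) {
              const bytes = await (await fetch(url)).arrayBuffer();
              buf.appendBuffer(bytes);
              // appendBuffer is asynchronous; wait before appending the next segment.
              await new Promise(done => buf.addEventListener('updateend', done, { once: true }));
            }
            mediaSource.endOfStream();
          });

        None of this runs with JavaScript disabled, which is why segmented video usually means a blank player for the NoScript crowd.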

        Not being a fan of porn, I don't know how long a typical porn video is. But if it's under a minute, the compressed video is likely to fit entirely within RAM, allowing the browser to use trivial seeking methods that rely on the whole muxed stream being available.

      • (Score: 2) by bob_super on Tuesday March 13 2018, @04:29PM

        by bob_super (1357) on Tuesday March 13 2018, @04:29PM (#651870)

        > many people will browse porn with JS disabled.

        "AH, you mean incognito mode, right? I always clear my history and cookies too, just in case" - typical user

        Little reminder for the SN dweller: It is likely that over 90% of people browsing the web Do Not Know what that Javascript thingy is. Maybe half have noticed the name exists.
        You could easily sell the casual user a new flonium condensator to rehash their Qbit.
        Computers are an appliance, especially in phone form factor. Most people have no clue.

  • (Score: 2, Interesting) by cocaine overdose on Tuesday March 13 2018, @03:50AM (3 children)

    Rarely does going back and banning a new technology (for the sake of this topic, recommending a boycott counts as banning) achieve the intended result, a drop in usage of said technology. Especially when the ban brings little visible benefit and a whole load of noticeable drawbacks (as you mentioned, everything breaks).

    One of the things I've found successful was reactionary and precise pushback that doesn't remove the quality of life one has grown accustomed to. Think uBlock vs. ads, not NoScript et al.

    In one of my own codebases, we were able to modify the SpiderMonkey JS engine to emulate Chrome's V8 (among other privacy patches that make your browser completely indistinguishable from a Windows 10 Chrome, even on Linux) for the very bare necessities, and then pass fake info for everything else. Less than 1% of the browser population browses without JavaScript enabled (don't look at industry data from fingerprinting companies; this one statistic is misinterpreted/lied about). Our assumption was that NoScript severely lessens the user's experience while offering intangible benefits, so we worked in the opposite direction. The insecure cruft was removed in favor of simple pseudo-data outputters. What's left is already fast and stops the most egregious abuses of JS, but we still added little user-enhancements here and there to make it more "real" (like rate-limiting and scaling CPU usage, so coinminers don't lock out a core). Then slap a simple installer script onto it, and you have people giving a shit now. In the SaaS world, this is called "friction," i.e. how fucking hard do you make it to achieve your intended goal?
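
    For a flavor of the fake-info half of that (the actual work here was done inside the patched engine, and these values are only placeholders), overriding the getters that fingerprinting scripts read looks roughly like this:

      const fakes = {
        userAgent: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 ' +
                   '(KHTML, like Gecko) Chrome/64.0.3282.186 Safari/537.36',  // placeholder UA string
        platform: 'Win32',
        hardwareConcurrency: 4,  // also hides the real core count from in-page miners
      };
      for (const name of Object.keys(fakes)) {
        Object.defineProperty(Navigator.prototype, name, {
          get: () => fakes[name],
          configurable: true,
        });
      }
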
    • (Score: 2) by Pino P on Tuesday March 13 2018, @05:09PM (2 children)

      by Pino P (4721) on Tuesday March 13 2018, @05:09PM (#651883) Journal

      One of the things I've found successful was reactionary and precise pushback that doesn't remove the quality of life one has grown accustomed to.

      Would it also be considered "precise pushback" to block scripts that the end user isn't allowed to understand and improve [gnu.org]?

      • (Score: 2) by cocaine overdose on Tuesday March 13 2018, @07:16PM (1 child)

        Less precise than uBlock, more precise than NoScript. It would take considerable effort, as Stallman said, to identify non-free JS scripts, and in return you have another one of those "intangibles." It's one of my gripes with the FSF and GPL. Stallman tells us "this is bad, this is bad, but this one thing here is good. Get rid of the bad" without lending a hand. The FSF is like a little girl who declares herself queen during a riveting game of house. "I want this and this and this, but not this." "Yes, your majesty, but how are we supposed to get all those?" "I don't care, I want it." Anyway, asides aside, even if you could do it reliably, you're now back to the NoScript dilemma. Or should I say the uMatrix dilemma? uMatrix is one tool that can be used to further the blocking of non-free JS, but if you've used it before, I'm sure you've noticed how even the most arbitrary website functionality is dependent on JS, and if you block even one script the whole thing just doesn't load. The only thing I've found it to be good for is blocking analytics and "high-risk" JS.

        You're not gonna get people to disavow JS using a logical argument. They're going to have to get consecutively more and more sick of it, until everyone drops it for the newest flavor of [insert hyped up tech here].
        • (Score: 2) by Pino P on Wednesday March 14 2018, @03:16PM

          by Pino P (4721) on Wednesday March 14 2018, @03:16PM (#652405) Journal

          It would take considerable effort, as Stallman said, to identify non-free JS scripts

          "Non-free" is hard. "Not machine-readably labeled as free" is easy. Block everything by default and allow only those scripts whose developers have specified their license [gnu.org]. I'd be interested to see which would be the first adtech company to answer the LibreJS challenge.

  • (Score: 2) by FatPhil on Tuesday March 13 2018, @12:43PM

    by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Tuesday March 13 2018, @12:43PM (#651790) Homepage
    I don't browse YouTube, generally, so its JS can go eat itself. However, I happily watch YouTube vids by simply downloading them and playing them. To download them I use a self-modified (he doesn't even acknowledge, let alone merge, my patches) version of Jamie Zawinski's youtubedown (from jwz.org; direct linking to the script is blocked). JWZ's prepared to play whack-a-mole with YouTube's constantly changing my-first-crypto(tm) obfuscation, which is nice.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves