
SoylentNews is people

posted by martyb on Monday April 20 2015, @12:38AM   Printer-friendly
from the pick-a-peck-of-packets dept.

Chromium Blog has published an update on Quick UDP Internet Connections (QUIC). QUIC is a UDP-based transport layer network protocol which began testing in the Google Chrome browser in 2013. One of the goals of QUIC is to reduce latency compared to TCP by making fewer round trips between clients and servers. It also handles multiplexing and packet loss better.
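To see why fewer round trips matter, here is a back-of-the-envelope comparison of handshake costs before any application data can flow. The round-trip counts are simplified illustrations (real handshakes vary with TLS version and session state), and the 50 ms RTT is an assumed figure:

```python
# Rough handshake cost before any application data flows.
# Round-trip counts are simplified illustrations, not protocol specs.
RTT_MS = 50  # assumed round-trip time to the server

tcp_tls_round_trips = 1 + 2   # TCP 3-way handshake + classic TLS handshake
quic_first_contact = 1        # QUIC combines transport and crypto setup
quic_repeat_visit = 0         # cached server credentials: zero round trips

for name, rtts in [("TCP+TLS", tcp_tls_round_trips),
                   ("QUIC (first contact)", quic_first_contact),
                   ("QUIC (repeat visit)", quic_repeat_visit)]:
    print(f"{name}: {rtts} round trips = {rtts * RTT_MS} ms before data")
```

On this assumed link, the repeat-visit case sends useful data 150 ms sooner than TCP with a classic TLS handshake, which is the kind of saving the page-load numbers below reflect.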

QUIC clients store information about QUIC-enabled servers that have been connected to previously, allowing a secure connection to be established immediately (zero-round-trip). Google claims this can enable significant reductions in page load times:

The data shows that 75% of connections can take advantage of QUIC's zero-round-trip feature. Even on a well-optimized site like Google Search, where connections are often pre-established, we still see a 3% improvement in mean page load time with QUIC.
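The zero-round-trip behavior described above rests on a client-side cache of server crypto parameters. The following is a minimal sketch of such a cache; the class and field names are illustrative assumptions, not the actual Chrome implementation:

```python
# Minimal sketch of the client-side cache that makes zero-round-trip
# connections possible. Names and structure are illustrative only.
import time

class ServerConfigCache:
    """Remembers crypto parameters from servers contacted previously."""

    def __init__(self, ttl_seconds=86400):
        self._cache = {}          # hostname -> (server_config, expiry)
        self._ttl = ttl_seconds

    def store(self, hostname, server_config):
        self._cache[hostname] = (server_config, time.time() + self._ttl)

    def lookup(self, hostname):
        entry = self._cache.get(hostname)
        if entry is None:
            return None           # first contact: full handshake needed
        config, expiry = entry
        if time.time() > expiry:
            del self._cache[hostname]
            return None           # stale: renegotiate from scratch
        return config             # hit: encrypt the first packet immediately

cache = ServerConfigCache()
cache.store("example.com", {"public_value": b"...", "kex": "C255"})
print(cache.lookup("example.com") is not None)  # cached -> zero-RTT possible
print(cache.lookup("other.org") is not None)    # unknown -> full handshake
```

A cache hit lets the client send encrypted application data in its very first packet; a miss or expired entry falls back to a full handshake, which is why only a fraction of connections (75% in Google's data) get the zero-round-trip path.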

Another substantial gain for QUIC is improved congestion control and loss recovery. Packet sequence numbers are never reused when retransmitting a packet. This avoids ambiguity about which packets have been received and avoids dreaded retransmission timeouts. As a result, QUIC outshines TCP under poor network conditions, shaving a full second off the Google Search page load time for the slowest 1% of connections. These benefits are even more apparent for video services like YouTube. Users report 30% fewer rebuffers when watching videos over QUIC. This means less time spent staring at the spinner and more time watching videos.
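The never-reuse-a-packet-number rule can be sketched in a few lines. In this simplified model (illustrative only, not the real sender), a retransmission carries the same payload under a fresh packet number, so an ACK always identifies exactly which transmission arrived:

```python
# Sketch of QUIC-style retransmission: every transmission, including a
# retransmit of the same data, gets a fresh packet number. An ACK then
# unambiguously identifies which transmission arrived, so RTT samples
# and loss detection stay clean. Illustrative model only.

class Sender:
    def __init__(self):
        self.next_packet_number = 0
        self.in_flight = {}   # packet number -> payload (stream data)

    def send(self, payload):
        pn = self.next_packet_number
        self.next_packet_number += 1
        self.in_flight[pn] = payload
        return pn

    def retransmit(self, lost_pn):
        payload = self.in_flight.pop(lost_pn)
        return self.send(payload)   # new number, same data

s = Sender()
first = s.send(b"hello")       # goes out as packet number 0
second = s.retransmit(first)   # same bytes, packet number 1
print(first, second)           # numbers are never reused
```

Contrast this with TCP, where a retransmitted segment reuses the original sequence number: an ACK for that range cannot say which copy was received, which pollutes RTT estimates and contributes to the retransmission timeouts the article mentions.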

Google plans to propose QUIC to the Internet Engineering Task Force as an Internet standard, just as it has done with SPDY, which is being superseded by the HTTP/2 standard.

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Monday April 20 2015, @12:57AM (#172983)

    Only if you keep the 'connection' open. At which point it would, I guess, start at 0 again? Or were you speculating too?

  • (Score: 0, Interesting) by Anonymous Coward on Monday April 20 2015, @02:20AM (#172996)

    trying to read between the lines.

    Google would like better and faster connections. That means more traffic per server, since some of the existing overhead is gone, which saves money.

    But at the same time, having a man in the middle helps their ad business, since the client appears to be located after the decryption point. Yes, we could cut down the little cross-site web bugs that track us, so less traffic would come to our clients and fewer cookies would be sent; our personal pipes would be less full, so better speed.

    But it is like using Google Chrome - please log in, tell me who you are. There, they have the full browser. This sounds like it is headed for every other browser too.

    • (Score: 3, Interesting) by Non Sequor (1005) on Monday April 20 2015, @03:04AM (#173004) Journal

      If you look at the arc of all of Google's work in web standards and platforms using those standards, Google's end game seems to be making native apps have no advantage over web apps for a typical user. I figure that since they profit as a middle man on the web, they must see anything that results in more stuff being done with web browsers as growing their market.

      From that perspective, Android was originally backed by Google to drive out the crappy web browsers on most phones at the time it was introduced. Chrome is a testbed for standards work to make new APIs for their web app development, and also originally to put pressure on other browsers to improve JavaScript performance. The work on reducing latency relates to findings from A/B testing that users are highly sensitive to page load times and are much more likely to leave if they encounter even marginally slower page loads.

      --
      Write your congressman. Tell him he sucks.
      • (Score: 2) by kaszz (4211) on Tuesday April 21 2015, @09:30PM (#173680) Journal

        I think the web has become bloated and overloaded on purpose. It's not a slim solution anymore. So let's keep the web octopus out of the core protocol arena so we can actually do useful things on the web. Corporations can't be trusted!