Transatlantic Ping Faster than Sending a Pixel to the Screen

posted by mrcoolbp on Friday March 28 2014, @09:40AM   Printer-friendly
from the 1ms-2ms-3ms-floor dept.

Anonymous Coward writes:

Two years ago John Carmack tweeted, "I can send an IP packet to Europe faster than I can send a pixel to the screen. How f'd up is that?" And if this weren't John Carmack, I'd file it under the interwebs being silly.

Not convinced? You aren't alone, but when called out on it, Carmack appeared and defended the claim.

We looked further and found this informative article from AnandTech about input lag.

  • (Score: 4, Informative) by WizardFusion on Friday March 28 2014, @09:56AM

    by WizardFusion (498) on Friday March 28 2014, @09:56AM (#22437) Journal

    So, not only is this a dupe from a week ago (or thereabouts), but it's also two freaking years old.
    Come on, this should have been binspam'd from the get-go.

    • (Score: 5, Funny) by lx on Friday March 28 2014, @10:07AM

      by lx (1915) on Friday March 28 2014, @10:07AM (#22439)

      Yeah but with the Oculus takeover this is now a Facebook story!

    • (Score: 5, Funny) by mattyk on Friday March 28 2014, @12:49PM

      by mattyk (2632) on Friday March 28 2014, @12:49PM (#22483) Homepage

      > So, not only is this a dupe from a week ago (or thereabouts), but it's also two freaking years old.

      Proof that it takes ages to update the screen?

      --
      _MattyK_
    • (Score: 1) by darinbob on Friday March 28 2014, @09:23PM

      by darinbob (2593) on Friday March 28 2014, @09:23PM (#22715)

      The real problem is that Carmack thinks this is a problem. It's a freaking GAME, have some perspective!

      But yeah, USB is a stupid protocol; most people who've worked with it realize how messed up it is. But it was intended for slooow devices initially, so a stupid protocol is not necessarily bad if it has redeeming features (like keeping the consortium happy and profitable). You just gotta love it, though, when USB interrupt pipes are polled and everyone manages to keep a straight face about it.
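
      For a sense of scale (rough figures, not from TFA): USB interrupt endpoints aren't interrupt-driven at all; the host polls them on a schedule the device requests, so a keypress can sit in the device for up to one polling interval before the host even hears about it.

            full-speed USB frame period:                   1 ms
            typical keyboard polling interval (bInterval): ~8-10 ms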

  • (Score: 1, Insightful) by alioth on Friday March 28 2014, @10:39AM

    by alioth (3279) on Friday March 28 2014, @10:39AM (#22445)

    On my 1982 Sinclair Spectrum (with a Z80 CPU running at 3.5MHz) I can send a pixel to the screen much faster than that. (In fact I can get the entire screen updated in under 20ms; a transatlantic IP packet takes about 60ms.)

    The fastest I can send a pixel to the screen on said computer would be something like this program, which will set the top left pixel on the screen:

          ld a, 0x80      ; bit pattern with the leftmost pixel of the byte set
          ld (0x4000), a  ; write it to the first byte of the Spectrum's screen memory

    This takes approximately 5 microseconds.
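
    As a rough sanity check (using the standard Z80 instruction timings, which aren't stated above):

          ld a, n     takes  7 T-states
          ld (nn), a  takes 13 T-states
          20 T-states / 3,500,000 Hz ≈ 5.7 µs

    which is in line with the "approximately 5 microseconds" figure.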

    • (Score: 5, Informative) by Anonymous Coward on Friday March 28 2014, @12:00PM

      by Anonymous Coward on Friday March 28 2014, @12:00PM (#22466)

      > The fastest I can send a pixel to the screen on said computer would be
      > something like this program, which will set the top left pixel on the screen:
      >
      > ld a, 0x80
      > ld (0x4000), a
      >
      > This takes approximately 5 microseconds.

      That doesn't send a pixel to the screen, it only updates memory that represents the screen. To actually see the new pixel, the electron gun in your CRT has to sweep through that pixel. That will take somewhere around 8 milliseconds on average, with a worst case of around 17 ms.

      Your fast update also excludes the whole "read user input and figure out what to do with it" part. TFA suggests ~10-100ms for that on a modern computer; I'd guess 2-5x that on your Sinclair, for a contemporary game.

      TFA argues that there's maybe 20 or so ms of lag associated with transmitting a frame to an LCD and actually displaying that frame, lag which is not present in direct-drive CRTs but which is really a small part of the user-input-to-screen-update cycle.
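
      (Those numbers follow from the refresh rate alone, assuming a 60 Hz display; a 50 Hz machine would be proportionally worse:

            worst case = one full refresh = 1/60 s ≈ 16.7 ms
            average    = half a refresh        ≈ 8.3 ms

      so the beam reaches a freshly written pixel after anywhere from ~0 to ~17 ms, ~8 ms on average.)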

      • (Score: 0) by Anonymous Coward on Friday March 28 2014, @01:24PM

        by Anonymous Coward on Friday March 28 2014, @01:24PM (#22494)

        > Your fast update also excludes the whole "read user input and figure out
        > what to do with it" part. TFA suggests ~10-100ms for that on a modern
        > computer; I'd guess 2-5x that on your Sinclair, for a contemporary game.

        I wouldn't be so sure about that. I don't know about the Sinclair, but on the C64 the keyboard was laid out (electrically) in an 8x8 matrix, connected to two memory-mapped 8-bit parallel ports. Although you do say "game", which is completely different from Carmack's test. For his test, which only needs to poll one key, you could set up one port ahead of time to select the correct row (assuming interrupts are disabled), after which polling the port is a single instruction, plus a branch if the port read as zero. Updating the screen background color is a third instruction. Along with a jump back to the test, that's four instructions for the entire test routine (after setting everything up), and the jump is not time critical.

        Even at 1 MHz, three instructions is pretty fast.
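
        To make that loop concrete, here is a rough C sketch of the same logic. (A real C64 routine would be a handful of 6502 instructions rather than C; the register addresses are the usual CIA#1 / VIC-II ones, and the select/read bits for the space bar are assumed from the standard keyboard matrix.)

              #include <stdint.h>

              /* C64 memory-mapped I/O registers. */
              #define CIA1_PORT_A  (*(volatile uint8_t *)0xDC00)  /* keyboard matrix select (active low) */
              #define CIA1_PORT_B  (*(volatile uint8_t *)0xDC01)  /* keyboard matrix read   (active low) */
              #define VIC_BG_COLOR (*(volatile uint8_t *)0xD021)  /* screen background color             */

              void poll_key_and_flash(void)
              {
                  /* One-time setup: select the matrix line containing the space bar.
                     Interrupts are assumed to be disabled, as in the parent comment. */
                  CIA1_PORT_A = (uint8_t)~0x80;

                  for (;;) {
                      if ((CIA1_PORT_B & 0x10) == 0) {   /* read + branch: bit pulled low => key down */
                          VIC_BG_COLOR = 0x02;           /* store: change the background color (red)  */
                      }
                  }
              }

        The loop itself is just the read, the branch and the store described above; everything else is one-time setup.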

    • (Score: 2) by Koen on Friday March 28 2014, @03:24PM

      by Koen (427) on Friday March 28 2014, @03:24PM (#22545)

      And thanks to the Z80's LDIR (load, increment & repeat) instruction one could send whole screens (or parts of one) very fast:

      LD HL, 2000h  ; pointer to the source
      LD DE, 4000h  ; pointer to the destination (start of screen memory)
      LD BC, 6912   ; number of bytes to move (6144 bitmap + 768 attribute bytes = one full Spectrum screen)
      LDIR          ; moves BC bytes from (HL) to (DE)
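
      For scale: LDIR costs 21 T-states per byte moved (16 for the final one), so a full 6912-byte screen works out to roughly

            6912 × 21 ≈ 145,000 T-states
            145,000 / 3,500,000 Hz ≈ 41 ms

      fast for a block copy, but still about two 50 Hz frames (and slower still when the screen's contended memory gets in the way).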

      --
      /. refugees on Usenet: comp.misc [comp.misc]
  • (Score: 5, Informative) by Boxzy on Friday March 28 2014, @11:16AM

    by Boxzy (742) on Friday March 28 2014, @11:16AM (#22456) Journal

    Analog CRT displays were always superior for input lag. They had actual variable resistors to control things like contrast, brightness, hue and volume.

    Now every display has to have its own computational engine, capable of painting the menu system on the screen, because it's cheaper. Bad engineering means the stream of pixels has to be sent through the same processing pipeline the menu is. There's no real reason the signals can't still be passed straight through, other than engineering complexity (which is still money).

    --
    Go green, Go Soylent.
    • (Score: 2) by nitehawk214 on Friday March 28 2014, @04:27PM

      by nitehawk214 (1304) on Friday March 28 2014, @04:27PM (#22575)

      You seem to be making two separate arguments.

      First, that direct controls are superior to stupid OSD controls. I fully agree with this. Even if the monitor makers could design a UI worth a damn, physical controls are almost always superior. The only issue is that modern monitors have tons of settings that can be changed, so some sort of on-screen menu will almost always be necessary.

      Secondly, and independently, you argue that analog displays are superior to digital. Do you even recall consumer-grade analog CRT screens? Even brand new they were shit, and after a few years of heavy use they were ready for the scrapheap.

      Arguing that your >$1000 professional super-display was superior is useless. Those high-end analog displays are easily outmatched in size and quality by mid-range LCD displays of today. Spend >$1000 on a current screen and you get... well, I don't know. I haven't needed to spend more than $250 on a display in more than 10 years, and I haven't had to discard one in as long either.

      --
      "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
      • (Score: 2) by nitehawk214 on Friday March 28 2014, @04:29PM

        by nitehawk214 (1304) on Friday March 28 2014, @04:29PM (#22576)

        Addendum: Actually I do know something you get on modern super expensive displays... lots of monitor controls! Just like in the old days you can have separate buttons for brightness, contrast, etc. Are physical buttons really that expensive?

        --
        "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
      • (Score: 1) by Boxzy on Friday March 28 2014, @08:08PM

        by Boxzy (742) on Friday March 28 2014, @08:08PM (#22670) Journal

        The only place where I argue CRTs were ever superior is input lag. In virtually every other respect, apart from a few minor edge cases, LCDs have surpassed CRTs. LCDs introduced a problem CRTs never suffered from. Personally I have never been too bothered by tiny amounts of lag until lip-sync becomes a problem.

        --
        Go green, Go Soylent.
        • (Score: 1) by cybro on Monday March 31 2014, @03:17AM

          by cybro (1144) on Monday March 31 2014, @03:17AM (#23413)

          You forgot black levels.

          • (Score: 2) by Boxzy on Monday March 31 2014, @08:04AM

            by Boxzy (742) on Monday March 31 2014, @08:04AM (#23478) Journal

            Sure, that would be one of those minor edge cases. Not everybody obsesses about how black is black. "I'll stop wearing black when they invent a darker colour!"

            --
            Go green, Go Soylent.
    • (Score: 4, Insightful) by Foobar Bazbot on Friday March 28 2014, @04:36PM

      by Foobar Bazbot (37) on Friday March 28 2014, @04:36PM (#22581) Journal

      This is bullshit. The presence of an internal framebuffer is strongly correlated to LCD vs. CRT, and not at all correlated to OSD vs. non-OSD.

      The "dumb pass-through screens" were CRTs, fed off a VGA or component source. They received a pixel at a time, and displayed a pixel at a time.

      Almost every current screen is an LCD of some sort, and LCDs don't display a pixel at a time; they display a whole row/column (or, depending on the matrix design, some large fraction (usually 1/2) of a row/column) at a time. Neither a true pixel-at-a-time source (VGA/component) nor DVI/HDMI's pixel-serialized-over-ten-bits connection is suitable for directly driving these -- there must be at least a line buffer (or, I suppose, an insanely wide parallel video interface that transfers a line at once). Therefore no LCD screen can really be a "dumb pass-through screen".

      Moreover, if the screen is to be usable at any resolution other than the panel's native one, as is commonly required, you need at least a ring buffer of multiple lines for good vertical scaling, and it's simplest with a full framebuffer.

      Note that many CRTs weren't dumb screens with a half-dozen pots to twiddle, but had menu systems so everything could be adjusted with the front panel buttons/wheels and the OSD. Yet they implemented OSD overlays without routing the video signal through a framebuffer, because adding a framebuffer, ADCing the video signal into it, blitting an OSD on, and DACing it back out to the CRT proper would have been worse and more expensive than the genlocked overlay these screens actually used. Doing the OSD menu in a framebuffer only became cheaper once the framebuffer was already there (and already causing input lag) for scaling reasons, and that was only needed with LCDs.
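
      (A rough way to size that lag, assuming a 60 Hz input signal: a scaler working out of a full framebuffer can't start driving the panel with a frame until it has captured it, so the buffering alone can add up to one frame period,

            1 frame at 60 Hz = 1/60 s ≈ 16.7 ms

      which is in the same ballpark as the "20 or so ms" quoted from TFA further up the thread.)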

  • (Score: 5, Informative) by wonkey_monkey on Friday March 28 2014, @11:29AM

    by wonkey_monkey (279) on Friday March 28 2014, @11:29AM (#22459) Homepage

    Firstly, this is old, old news. The tweet was two years old, and the other linked article is 5 years old.

    Secondly, the headline should actually read:

    Transatlantic ping faster than sending a pixel to a particular screen

    --
    systemd is Roko's Basilisk