
SoylentNews is people

posted by martyb on Saturday December 30 2017, @06:45PM
from the perhaps-providing-prompt-prompts-prompts-perceived-performance-primacy dept.

Have you ever had that nagging sensation that your computer was slower than it used to be? Or that your brand new laptop seemed much more sluggish than an old tower PC you once had? Dan Luu, a computer engineer who has previously worked at Google and Microsoft, had the same sensation, so he did what the rest of us would not: He decided to test a whole slew of computational devices ranging from desktops built in 1977 to computers and tablets built this year. And he learned that that nagging sensation was spot on—over the last 30 years, computers have actually gotten slower in one particular way.

Not computationally speaking, of course. Modern computers are capable of complex calculations that would be impossible for the earliest processors of the personal computing age. The Apple IIe, which ended up being the “fastest” desktop/laptop computer Luu tested, is capable of performing just 0.43 million instructions per second (MIPS) with its MOS 6502 processor. The Intel i7-7700k, found in the most powerful computer Luu tested, is capable of over 27,000 MIPS.

But Luu wasn’t testing how fast a computer processes complex data sets. Luu was interested in testing how the responsiveness of computers to human interaction had changed over the last three decades, and in that case, the Apple IIe is significantly faster than any modern computer.

https://gizmodo.com/the-one-way-your-laptop-is-actually-slower-than-a-30-ye-1821608743


Original Submission

  • (Score: 1, Interesting) by Anonymous Coward on Sunday December 31 2017, @12:56PM (#616128) (2 children)

    As he discovered, but many gamers already know, without driver tweaks there are roughly four frames of latency between an application sending a frame to the GPU, and it actually being sent to the monitor. Monitor display lag [displaylag.com], which runs anywhere from 1/2 frame average up to three or four, is on top of that. Because these sources of latency are per rendered frame rather than defined by wall clock time, they naturally decrease with higher framerate. CRTs have no display lag (although it does take time to actually scan out the image).

    Gaming monitors are specially designed to minimize display lag, but laptops for the most part don't worry about it much.

    It's possible to tweak drivers to reduce rendering latency; like any other pipeline, the rendering pipeline's depth is chosen to strike a balance between latency and throughput.
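A rough sketch of why per-frame latency shrinks with framerate, using illustrative numbers drawn from the comment above (roughly four frames of render pipeline plus about half a frame of average monitor display lag):

```python
def wall_clock_latency_ms(pipeline_frames, display_lag_frames, fps):
    """Convert latency counted in frames into wall-clock milliseconds.

    These delays are per rendered frame rather than fixed amounts of
    time, so raising the framerate shrinks them proportionally.
    """
    frame_time_ms = 1000.0 / fps
    return (pipeline_frames + display_lag_frames) * frame_time_ms

# Both figures are illustrative, not measurements:
print(wall_clock_latency_ms(4, 0.5, 60))   # ~75 ms at 60 Hz
print(wall_clock_latency_ms(4, 0.5, 144))  # ~31 ms at 144 Hz
```

The same 4.5 frames of delay costs less than half as much wall-clock time at 144 Hz, which is why gaming monitors chase high refresh rates even when the extra frames themselves are barely perceptible.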

    "Squishy" (non-mechanical/membrane) keyboards have extra travel between the tactile "bump" and the point where the keypress actually registers, which contributes to latency as well.

    Overall unless you're playing a competitive game where one frame of latency can cause you to lose, latency on a quality PC is probably not worth worrying too much about. The human brain hides latency up to around 200ms, which is (perhaps not coincidentally) also roughly the human reaction time. Now, if you want to talk about ATMs or grocery store checkouts that take two seconds to respond and still drop half your keypresses...

  • (Score: -1, Troll) by Anonymous Coward on Sunday December 31 2017, @01:23PM (#616131) (1 child)

    And it's pretty clear that he still doesn't really know what he's talking about.

    Quote: "At 144 Hz, each frame takes 7 ms. A change to the screen will have 0 ms to 7 ms of extra latency as it waits for the next frame boundary before getting rendered (on average, we expect half of the maximum latency, or 3.5 ms). On top of that, even though my display at home advertises a 1 ms switching time, it actually appears to take 10 ms to fully change color once the display has started changing color. When we add up the latency from waiting for the next frame to the latency of an actual color change, we get an expected latency of 7/2 + 10 = 13.5 ms."

    He fails to distinguish between switching time and display lag. Beyond the unavoidable delay of transmitting the frame over the display cable, display lag comes from the monitor's electronics spending time on things like rescaling the image (even if it's already at the native resolution), dithering colors to make 18-bit panels look like 24-bit panels, buffering while translating HDMI/DVI/DisplayPort into the monitor's internal signaling format, adjusting brightness and contrast, and whatever other image processing the designers and marketing team thought would look good. Switching time is the time it takes the actual pixels in the panel to change color once the electronics have decided to tell them to switch.

    They are not the same. All monitor manufacturers advertise the switching time. Almost nobody, outside of high-end gaming monitors, advertises the electronics-related latency. Even then, they normally just say it's good, and don't bother with actual numbers.
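The arithmetic in the quoted passage can be reproduced directly; the 144 Hz refresh rate and 10 ms measured switching time are the quoted figures, not independent measurements:

```python
def expected_display_latency_ms(refresh_hz, switching_ms):
    """Average latency for a screen change: a change waits on average
    half a frame for the next refresh boundary, then the panel needs
    switching_ms to actually finish changing color.
    """
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / 2 + switching_ms

# 144 Hz and 10 ms are the figures from the quoted passage:
print(expected_display_latency_ms(144, 10))  # ~13.47 ms, i.e. the quoted ~13.5 ms
```

Note the advertised 1 ms switching time appears nowhere in the result: the measured 10 ms dominates, which is the commenter's point about advertised figures.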

    • (Score: 0) by Anonymous Coward on Monday January 01 2018, @09:41AM (#616378)

      Actually, it seems more like you are the one who doesn't know what he is talking about, or is intentionally ignoring it.

      Firstly, he's talking about overall latency. Quote: "Luu was interested in testing how the responsiveness of computers to human interaction had changed over the last three decades."

      So it doesn't really matter what the different latencies are called, what matters is in many cases they add up to quite a high value.

      Secondly, he may actually know the difference; he's stating that just because displays advertise 1 ms switching times doesn't mean the latency is actually 1 ms. Lots of people see the 1 ms figure and assume the total latency is 1 ms.