
posted by Fnord666 on Sunday March 15 2020, @10:31AM   Printer-friendly
from the go-fast-by-going-slow dept.

Ice Lake GPU underperforming? Put it in powersave mode. Wait, what?:

Back in December, Linux users started noticing that Ice Lake-equipped laptops were getting better framerates in powersave mode than in performance mode. This Tuesday, Intel developer Francisco Jerez released a patchset to address the conundrum. Jerez begins by noting that if your system bottleneck is I/O, boosting CPU performance won't help—the CPU can't process more data if the I/O subsystem isn't providing it fast enough.

"In IO-bound scenarios (by definition) the throughput of the system doesn't improve with increasing CPU frequency beyond the threshold value at which the IO device becomes the bottleneck."

Jerez goes on to note that pointlessly boosting the CPU into turbo frequencies when there's no additional data for it to process doesn't just hurt power efficiency. In the case of laptop designs, there's typically no room for desktop- or server-style "overengineering"—you've got limited space as well as limited power. This means, among other things, that there's only so much cooling to go around.

"With the current governors [...] the CPU frequency tends to oscillate with the load, often with an amplitude far into the turbo range, leading to severely reduced energy efficiency."


Original Submission

 
  • (Score: 1) by khallow on Sunday March 15 2020, @02:55PM (7 children)

    by khallow (3766) Subscriber Badge on Sunday March 15 2020, @02:55PM (#971568) Journal
Sorry, there's a need for gaming laptops, such as when you don't have a permanent desk to set a computer up on. For example, most of my coworkers live in dorms. Most people there use laptops for their computing needs because they're easy to relocate, say, when a roommate is using the only desk or sleeping.
  • (Score: 2) by toddestan on Monday March 16 2020, @03:34AM (1 child)

    by toddestan (4982) on Monday March 16 2020, @03:34AM (#971779)

    Where do you work that most of your coworkers live in a dorm?

Even when I lived in the dorms at the university in the very early 2000s, back when laptops were too expensive for most college students, we all had PC towers and clunky CRT monitors and got by just fine. Typically one of us would use the crappy, poorly located desk that was permanently attached to the wall, and the other would get a crappy folding table or maybe a cheap desk from Walmart made from compressed sawdust. Hell, I remember toward the end when people started upgrading to LCDs - not us, we were too poor - but we were more than happy to take that spare CRT off your hands. And now our dorm room had two power-hungry Athlon-based towers, each with dual CRTs. It got warm, especially since the dorms weren't air conditioned.

I've never been impressed with gaming laptops. They are expensive, run super hot, and if you use them for their intended purpose you might get three years out of them before they burn themselves out. On the other hand, they are among the few laptops still built today that have some upgradeability and a keyboard that isn't total rubbish.

    • (Score: 0) by Anonymous Coward on Wednesday March 18 2020, @03:11AM

      by Anonymous Coward on Wednesday March 18 2020, @03:11AM (#972618)

      Where do you work that most of your coworkers live in a dorm?

      My guess would be the field of law enforcement, in the LEA decoy division.

  • (Score: 2) by Freeman on Monday March 16 2020, @05:56PM (4 children)

    by Freeman (732) on Monday March 16 2020, @05:56PM (#971951) Journal

Heat kills electronics. Laptops have tiny exhaust ports, poor circulation, and worse heat sinks than essentially any desktop computer. That leads to excess heat buildup, dying parts, and/or degraded performance. Either get the lightest, lowest-power laptop that does the job or get a desktop. You can also likely buy two decent gaming desktops for the price of one decent gaming laptop.

    --
    Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
    • (Score: 2) by takyon on Monday March 16 2020, @10:42PM (3 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday March 16 2020, @10:42PM (#972038) Journal

The march of MOORE SLAW improvements and the push towards higher graphical resolutions mean that newer laptops are probably going to be just fine for gaming.

      https://www.notebookcheck.net/GeForce-RTX-2080-Ti-Desktop-vs-GeForce-MX350_9526_9980.247598.0.html [notebookcheck.net]

A high-end discrete desktop GPU might only be 5-10 times faster than a budget discrete laptop GPU. That sounds like a big difference, but if the former can do 4K @ 120 FPS, you would expect the latter to do 1080p @ 60 FPS, which is acceptable for most people.
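The framerate comparison checks out as raw pixel throughput (a back-of-the-envelope sketch; real GPU scaling is not perfectly linear in pixels):

```python
# Pixel throughput (pixels/second), assuming rendering cost scales
# linearly with resolution times framerate -- a simplification.
high_end = 3840 * 2160 * 120  # 4K @ 120 FPS
budget = 1920 * 1080 * 60     # 1080p @ 60 FPS

ratio = high_end / budget
assert ratio == 8.0  # squarely inside the quoted 5-10x gap
```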

      Increases in laptop and desktop APU performance will kill lower-end discrete GPUs.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Freeman on Wednesday March 18 2020, @03:16PM (2 children)

        by Freeman (732) on Wednesday March 18 2020, @03:16PM (#972771) Journal

MOORE SLAW doesn't beat the laws of thermodynamics. Moore's Law is irrelevant to the discussion until someone consistently ships laptops that respect those laws, so that the laptops aren't bricks after a couple of years.

As far as I can tell, heat management is that thing they pay lip service to after they stuff in all the goodies they can. Sure, at some point heat management might not be an issue. That point isn't now, unless you can make do with a tablet like an iPad or a Samsung. I've got to say the Samsung tablet I've been (ab)using for the last year or two has held up well. The main difference is ARM vs. AMD/Intel: ARM CPU/GPU combos use a lot less power and function a lot better than AMD/Intel in enclosed environments.

Unless you're predicting that sometime in the near future ARM will kill off the AMD/Intel hold on the desktop/laptop market? The only thing I see getting in the way is software, and Microsoft is still killing Apple on that front. It's definitely an uphill battle.

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
        • (Score: 3, Interesting) by takyon on Thursday March 26 2020, @07:11PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Thursday March 26 2020, @07:11PM (#976035) Journal

          You are overestimating the problem of heat, which is funny since it's one of the biggest problems facing integrated circuits.

          I have an A6-3400M laptop from 2011. That chip has a relatively high 35W TDP. The thing still runs well and it is on basically 24/7, with a fan that is certainly collecting dust, probably not helped by a hole in the chassis near the left hinge. The hinge, keyboard, and screen are the components that are failing, not the APU.

          You can get better performance from today's 15-25W TDP x86 APUs (listed as 15W, often configured to use up to 25W). The Ryzen 7 4800U has 8 cores, can match some Ryzen desktop chips in benchmarks, and has roughly the GPU performance of an original PS4, which used about 10x the power [wikipedia.org].

          Since my laptop was made, the Ultrabook initiative has forced most laptops to become thinner with worse overall thermal situations. But there are improvements [hexus.net] that can balance that out. The build quality of laptops seems to be better these days.

          What I am predicting is that we will see somewhere between a further 1,000x to 10,000,000x improvement in classical CPU performance. The performance and efficiency gains could outpace the ability of OS and software developers to "waste" the performance. x86 will continue to hang around for the foreseeable future. Fanless x86 systems already exist and their performance will continue to go up. Intel's Lakefield [anandtech.com] shows that there is low-hanging fruit that can be picked to bridge the gap between x86 and ARM efficiency, such as the adoption of a "big.LITTLE" multi-core approach.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by takyon on Saturday March 28 2020, @02:10PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday March 28 2020, @02:10PM (#976631) Journal

          https://ownsnap.com/ryzen-5000-series-cezanne-apu-will-have-8-core-16-thread-cpu-navi-24-cus-igpu-built-on-amd-7nm-zen-3-microarchitecture-according-to-latest-leaks/ [ownsnap.com]

          Here's a fun thing. It's based on a YouTube channel [youtube.com] I watch a lot of now.

          The speculation is 8 cores (Ryzen 7 3700X equivalent) and up to Xbox One X GPU performance (6 teraflops) in a ~25 Watt package for laptops. And Microsoft could be using a cut down version of the APU from Xbox Series X (horrible naming, I know) to put in a Surface product. It could even have some of the "raytracing" functionality.

          Then you will see improvements from this kind of baseline from further node shrinks. TSMC "5nm" node should be about 85% more dense than first-gen TSMC "7nm", with higher performance and efficiency. TSMC will do a "3nm" node, and maybe a couple moore.

          Take the END OF SLAW baseline ("0.5nm" to "3nm") and transition to 3D monolithic chips with RAM nanometers or a micron away from cores, and you could easily improve performance per Watt by 100x. I think that applies to GPU performance as well. We'll just see laptops, tablets, or phones that are better than today's high-end desktops (i.e. 64-core Threadripper systems). Fanless TDP? No problem. If a Raspberry Pi 4 or cheap laptop is good enough for 50% of today's home users, that stuff should be good enough for 99.9% of users.

          But reeling it back to Cezanne... a 25 Watt APU in a laptop is doable. It doesn't get bricked after 2 years. Maybe you don't want to overclock it or use it in a sauna, but this kind of system can live a long time. Thermal solutions are better, and build quality is up.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]