posted by hubie on Thursday October 13 2022, @04:04AM
from the for-some-definitions-of-"mini" dept.

You can build a smaller PC yourself, though the NUC will take out the guesswork:

Intel's NUC Extreme series of mini PCs has always tried to straddle the line between keeping the NUC's traditional tininess and providing the power of a full-size desktop. The NUC 11 and NUC 12 Extreme models both included enough room inside for a dual-slot GPU up to around 300 mm in length, but the company is apparently going even further with the NUC 13 Extreme, codenamed "Raptor Canyon." Intel showed off a new version of the box at TwitchCon (via VideoCardz) that is large enough to fit a triple-slot GPU alongside new 13th-generation Intel Core CPUs.

[...] The problem with compact-but-powerful ITX gaming builds—and the opening for the NUC 13 Extreme box—is that these cases are often tricky to build in and require careful measuring, planning, and cable management to ensure that all the components fit and that they're adequately cooled (I say this from sometimes-painful experience). Tiny cases and small-form-factor SFX power supplies also command their own price premium over full-size components. The benefit of building with standard parts is that you'll have more options for upgrading a few years down the road. But the simplicity of the NUC might be worth it for someone who wants something small and fast without all the hassle.

It's sort of funny that we're hearing about this case on the same day that GeForce RTX 4090 reviews are going live—other cards in the 4000-series will surely be small enough to fit in a "mere" triple-slot case, and Nvidia's partners may even figure out how to do it with a 4090. But the trend has been toward ever-larger GPUs, and owners of this new NUC (or many other GPU-compatible ITX cases) may find triple-slot compatibility more limiting in the future than it has been in the past.

A triple-slot GPU in a small box sounds like it has the potential to be a loud space heater.


Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 3, Disagree) by Opportunist on Thursday October 13 2022, @07:14AM (10 children)

    by Opportunist (5545) on Thursday October 13 2022, @07:14AM (#1276392)

    It's already practically impossible to get an affordable GPU that fits a normal PC; now let's try one for some specialized application. What could possibly be the problem?

    • (Score: 2) by RamiK on Thursday October 13 2022, @12:05PM (9 children)

      by RamiK (1813) on Thursday October 13 2022, @12:05PM (#1276411)

      It just so happens that Intel is in the GPU business now and can make sure there are affordable GPUs for that case.

      --
      compiling...
      • (Score: 2) by Opportunist on Friday October 14 2022, @06:38AM (8 children)

        by Opportunist (5545) on Friday October 14 2022, @06:38AM (#1276526)

        If they actually make affordable GPUs worth a damn, people will buy those GPUs to put them in their normal PCs and still not give a fuck about this.

        • (Score: 2) by RamiK on Friday October 14 2022, @12:21PM (7 children)

          by RamiK (1813) on Friday October 14 2022, @12:21PM (#1276558)

          If they actually make affordable GPUs worth a damn...

          The A750 and A770 seem to be doing rather well in all the reviews I've seen: https://www.engadget.com/intel-arc-a750-a770-review-mid-range-gpu-rivals-nvidia-amd-130032507.html [engadget.com] https://www.tomshardware.com/reviews/intel-arc-a770-limited-edition-review [tomshardware.com]

          ...and still not give a fuck about this.

          I haven't done or read any market research, but personally I agree with you that I just don't care about the size of my PC case. Still, people do buy NUCs and Macs just because they're smaller, so maybe something mid-range size-wise is a thing... IDK.

          --
          compiling...
          • (Score: 2) by takyon on Saturday October 15 2022, @10:20AM (6 children)

            by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday October 15 2022, @10:20AM (#1276703) Journal

            The A750 and A770 seem to be doing rather well in all the reviews I've seen

            They are junk compared to e.g. a 6650 XT. Aside from the horrible present, there's a very uncertain future for Intel discrete drivers as well.

            They may be useful for encoding/work. They have AV1 encode.

            But the 6600 through 6700 XT prove there are reasonable "mid-range" options, although they are still not as cheap as some would like.

            At the low-end, the 6500 XT and 6400 make too many compromises to be recommended; it's better to head upwards to the 6600.

            --
            [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
            • (Score: 2) by RamiK on Saturday October 15 2022, @06:06PM (5 children)

              by RamiK (1813) on Saturday October 15 2022, @06:06PM (#1276754)

              They are junk compared to e.g. a 6650 XT

              Don't they win various real-world benchmarks on cost-performance? I mean, I've only googled some benchmarks and the language is fairly positive...

              there's a very uncertain future for Intel discrete drivers as well.

              I don't think there's any reason to suspect they're lying when they say they're working out bugs and will be doing DX9 and the like in software. I mean, there's nothing targeting the old APIs that will suffer from the software overhead, so it might be a bit wasteful on paper, but in practice you won't even feel it... No?

              But the 6600 through 6700 XT prove there are reasonable "mid-range" options, although they are still not as cheap as some would like.

              At the low-end, the 6500 XT and 6400 make too many compromises to be recommended; it's better to head upwards to the 6600.

              I don't game and don't plan to but if I were, I'd still use 1080p so the lack of fairly priced cards under $330 seems pretty ridiculous to me. I understand this came from the mining era, where they couldn't just segment on RAM alone and had to cut into the cores and caches... But as things stand, I think people are just better off getting something really cheap to hold them over until new-generation cards come out that are no longer segmented towards $300-1000 but towards $200-600. And for those I think Intel has a good chance of fitting in nicely if they get their act together and their prices right.

              --
              compiling...
              • (Score: 2) by takyon on Saturday October 15 2022, @06:57PM (4 children)

                by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday October 15 2022, @06:57PM (#1276758) Journal

                Don't they win various real-world benchmarks on cost-performance? I mean, I've only googled some benchmarks and the language is fairly positive...

                Here is the top-of-the-line Arc A770 with 16 GB of VRAM losing to the RX 6600 XT at 1080p. The price of the A770 is around that of a 6700 XT, which beats it in everything:

                https://www.techpowerup.com/review/intel-arc-a770/31.html [techpowerup.com]

                After re-reading your comment I guess you don't care about gaming. But it should be facing strong competition from the 6700 XT and whatever Nvidia has at similar prices.

                I don't think there's any reason to suspect they're lying when they say they're working out bugs

                About the driver prospects: I'm referring to the internal cancellation of Arc discrete graphics. The rumor (that Intel will deny) is that they are selling the Alchemist GPUs they made (about 4 million), will wind down Battlemage to target low-end discrete with a single die (for laptops, low-end, OEMs), and Celestial/Druid may never be made. You'll remember that Intel has cancelled Optane and other products recently, and Bloomberg is reporting that Intel is planning to lay off thousands of employees later this month.

                Intel will continue to make iGPUs and graphics drivers, but those years of working with integrated graphics didn't seem to translate so well to discrete graphics. If they stop making discrete graphics (for consumers, enterprise/AI products will continue), then you could expect Arc owners to be neglected at some point.

                I don't relish Intel failing in graphics, and I hope they can do something interesting for the market like challenge AMD's "G" APUs. All they have to do is combine their biggest and best iGPU with a mid-range CPU, and you have something like a Ryzen 7 5700G. Except Intel can spam it on the market because they have their own fabs. They are moving to "tiles" (chiplets) with Meteor Lake so they should have an ability to mix and match different CPUs and iGPUs easily.

                https://www.youtube.com/watch?v=cZr_LWAlDkg [youtube.com]
                https://www.youtube.com/watch?v=2XiWGuEFCbE [youtube.com]
                https://www.jonpeddie.com/editorials/will-axg-survive-gelsingers-axe/ [jonpeddie.com]

                I don't game and don't plan to but if I were, I'd still use 1080p so the lack of fairly priced cards under $330 seems pretty ridiculous to me.

                6700 XT [slickdeals.net] is hanging around $360. That is considered a 1440p card.

                6650 XT [slickdeals.net] is around $285. 6600 XT prices look weird to me, probably because the 6650 XT is replacing it. 6600 non-XT [slickdeals.net] is around $230. The 6500 XT and 6400 are below $200 but don't represent a good value IMO, with gimped PCIe, decode/encode support, and only 4 GB VRAM. Intel's cheap Arc A380 near those price points might be a good buy if you are not doing gaming.
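                (To spell out the value math I'm using here: it's basically just dollars per average frame at the resolution you actually play at. A minimal sketch of that arithmetic, using the street prices above and placeholder frame rates that are NOT real benchmark numbers, would look something like this in Python:)

                    # Rough sketch of the price/performance comparison above.
                    # Prices are the street prices quoted here; the FPS values are
                    # placeholders for illustration only, NOT actual benchmark results.
                    cards = {
                        "RX 6700 XT": (360, 100.0),  # (price in USD, placeholder avg 1080p FPS)
                        "RX 6650 XT": (285, 80.0),   # placeholder FPS
                        "RX 6600":    (230, 65.0),   # placeholder FPS
                    }

                    for name, (price, fps) in sorted(cards.items(), key=lambda kv: kv[1][0]):
                        print(f"{name}: ${price} / {fps:.0f} fps = ${price / fps:.2f} per average frame")

                Whichever card lands lowest on that number at your resolution is the better value, assuming the drivers behave.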

                --
                [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
                • (Score: 2) by RamiK on Saturday October 15 2022, @08:24PM (3 children)

                  by RamiK (1813) on Saturday October 15 2022, @08:24PM (#1276767)

                  I'm referring to the internal cancellation of Arc discrete graphics. The rumor (that Intel will deny) is that they are selling the Alchemist GPUs they made (about 4 million), will wind down Battlemage to target low-end discrete with a single die (for laptops, low-end, OEMs), and Celestial/Druid may never be made.

                  I think those rumors are misinterpreting things: With mining gone, there's basically no consumer high-end anymore, so everyone in gaming is shifting efforts down a notch*. Nvidia pretty much said they're not going to make faster cards. AMD only really made price-point promises. And as for Intel, it probably means targeting a 1080p "low-end" with less RAM but aiming to push out as much FPS as a mid-range card does at a higher resolution, so they'll have enough market presence to make sure game devs target their hardware. That is, it's a repeat of the Atom's "cancellation" where they'll only really re-segment and shift products around.

                  they should have an ability to mix and match different CPUs and iGPUs easily

                  The power budget for iGPUs just doesn't work: Adding more draw to the die when they're struggling to remove heat from the frequency race as it is... It's just a dead end. They'll continue focusing on video decoders and the like, but I don't see a way to add real value in that area.

                  Intel's cheap Arc A380 near those price points might be a good buy if you are not doing gaming.

                  iGPUs are all I need. Let the farms slot in the space heaters.

                  Anyhow, now that the RAM embargo is signed, we'll see even more shifts so, should Intel make that move, it would be for the better.

                  * I believe the Stadia cancellation comes from this as well.

                  --
                  compiling...
                  • (Score: 2) by takyon on Saturday October 15 2022, @11:39PM (2 children)

                    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday October 15 2022, @11:39PM (#1276784) Journal

                    The rumors are supposedly based on sources within Intel having been briefed on the matter. Intel screwed up their launch really badly. It was delayed by around a year, causing it to miss a wide window of opportunity in which GPUs were sometimes selling at 2-4x their MSRPs. And when it did come out, it had bad drivers and got bad reviews. They may simply not want to trade blows with AMD and Nvidia in the consumer market, where wild supply/demand swings will probably continue to happen.

                    I don't know what you mean by "Nvidia pretty much said they're not going to make faster cards". The RTX 4090 is about a 70% performance improvement.

                    The Xbox Series X and PS5 show what gigantic APUs can do with a relatively large power budget. The Ryzen 9 7950X efficiency results show that you can get most of your performance when limited to a 65 Watt TDP. Raptor Lake can supposedly increase efficiency massively at lower TDPs; you can see such a claim here:

                    https://cdn.arstechnica.net/wp-content/uploads/2022/09/13th-Gen-Intel-Core-Desktop-Pre-Brief-Presentation_Appendix-Embargoed_20220926-32-2.jpeg [arstechnica.net]

                    That's the 13900K at 65W matching the 12900K at 241W. It's a pre-release marketing claim so we'll have to see how it holds up.
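                    (Taking that slide at face value, the arithmetic is just the power ratio at equal performance, roughly 241 / 65 ≈ 3.7x the performance per watt at that operating point. A trivial sketch of that calculation, assuming the claim holds:)

                        # Back-of-the-envelope perf/W from Intel's slide, taking the claim of
                        # equal performance at face value (a marketing number, not a measurement).
                        power_12900k_w = 241  # watts, per the slide
                        power_13900k_w = 65   # watts, per the slide
                        print(f"perf-per-watt improvement: ~{power_12900k_w / power_13900k_w:.1f}x at equal performance")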

                    However, Alder Lake is often more efficient than Zen 3 during gaming:

                    https://www.igorslab.de/en/intel-core-i9-12900kf-core-i7-12700k-and-core-i5-12600k-review-gaming-in-really-fast-and-really-frugal-part-1/5/ [igorslab.de]

                    Why is that? Because games don't stress CPUs that much. There might be a large load on a couple of cores, but with 12+ cores available, many of them are idling. It's the multi-threaded workloads and overclocking stress tests that cause these CPUs to burn 300 Watts. They will happily do so given sufficient cooling; that's AMD's new policy with the AM5 170W TDP.

                    Point being, there is room for a bigger iGPU.

                    Anyway, with AMD's 680M graphics in Rembrandt delivering reasonably good 1080p performance, and the majority of gamers on the planet still using 1080p resolution, we may see a situation where improvements to iGPUs outpace what people "need". Intel currently makes mobile iGPUs that are 3x larger than their desktop iGPUs, so it would just be a matter of putting that in a desktop and raising power limits. The Alder Lake-P [wikipedia.org] Intel Core i7-1280P is packing 14 cores and 96 graphics EUs at a 28W base TDP (64W turbo).

                    Intel Arrow Lake mobile iGPUs may include up to 320 graphics execution units by 2024 [videocardz.com]. So they are more than tripling the size in the near future, although they could change the design to make each EU smaller and slower.

                    --
                    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
                    • (Score: 2) by takyon on Saturday October 15 2022, @11:52PM (1 child)

                      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday October 15 2022, @11:52PM (#1276788) Journal

                      Oh yeah. With Meteor Lake, Arrow Lake, etc., the tile design allows Intel to outsource part of the production. For example, a GPU tile can be made on the TSMC N3 node. So they have a workaround for their own fab issues. I think the video decode/encode is going to be separated out from graphics as well (source: I forgot).

                      --
                      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
                      • (Score: 2) by RamiK on Sunday October 16 2022, @01:45AM

                        by RamiK (1813) on Sunday October 16 2022, @01:45AM (#1276793)

                        Intel screwed up their launch really badly.

                        You know, I can't remember a single Intel launch that didn't miss the market or the deadline by a critical margin. Like, you say a year, but Bitcoin was mined for over a decade, so for them to enter the market the same year it ended... Well, it's very Intel-like.

                        I don't know what you mean by...

                        The bit about Moore's Law being dead... That is, they basically meant the demand and supply for the high-end broke down and they're not expecting to make & sell a lot of high-end models.

                        The Xbox Series X and PS5 show what gigantic APUs can do with a relatively large power budget...

                        Note that most of those improvements are for multi-threaded loads; the new chips get their single-threaded boost from raw transistor density and frequencies. Though admittedly, you're right that they've basically reached the point where everything is so GPU-bound that the CPU probably won't even break a sweat power-wise.

                        Because games don't stress CPUs that much...Point being, there is room for a bigger iGPU.

                        That doesn't mean you don't still need a lot of die space for the CPU to throttle heat between multiple cores. That is, though it may seem like contemporary games only scratch some 33% off the CPU, so you could just cut it in half and stick more GPU in there, in practice it's all being used and can't be spared. It's also how they're reaching those crazy single-threaded 5GHz figures, btw.

                        we may see a situation where improvements to iGPUs outpace what people "need".

                        Same as above.

                        the tile design allows Intel to outsource part of the production.

                        1. Back in the day, Intel licensed a PowerVR GPU for the Z3580. Here, due to the different interconnect, they'll still need to license and fab it (internally or via TSMC). So, not much difference in practice.
                        2. Though the additional flexibility looks promising, outsourcing parts of your chips to other fabs will depend on schedules Intel doesn't control, so I suspect only a small fraction of Intel's actual real-world CPUs will be manufactured with third-party tiles/chiplets and that they'll mostly fab them themselves.
                        3. AMD has been gluing chips together for quite a few years. And while it helped, it wasn't the miracle Intel's current marketing is making it out to be.

                        Regardless, looking at all the delayed game releases and the lower PC sales figures, I suspect we'll end up seeing more delays in the upcoming releases, so talking about the new-gen AMD and Intel products seems a bit premature.

                        --
                        compiling...
  • (Score: 3, Interesting) by hendrikboom on Thursday October 13 2022, @09:01AM

    by hendrikboom (1125) on Thursday October 13 2022, @09:01AM (#1276400) Homepage Journal

    I'd be happy with one that could contain two disk drives and use ECC memory.

  • (Score: 2, Insightful) by MonkeypoxBugChaser on Thursday October 13 2022, @01:36PM (7 children)

    by MonkeypoxBugChaser (17904) on Thursday October 13 2022, @01:36PM (#1276420) Homepage Journal

    The people who need this can just buy a motherboard and a case. You can already buy a multi-GPU mining rig pre-built. The point of the NUC was to be like a Mac mini and fit behind monitors or in other confined spaces.

    • (Score: 2) by JoeMerchant on Thursday October 13 2022, @02:56PM (6 children)

      by JoeMerchant (3937) on Thursday October 13 2022, @02:56PM (#1276436)

      "Next Unit of Computing" and, yes, it was supposed to be compact enough to VESA mount behind a monitor, and most of them were. I actually "lost" a NUC on my desk (that I hadn't seen for 2+ years because: pandemic and post-pandemic shift to WFH), because it wasn't on my desk where I was looking for it, I had stashed it behind the monitor which was leaned up against the wall - took a minute to remember...

      They have kinda bifurcated the NUC market with NUC-extreme for gamerz: skull graphics on the cases, etc.

      My constant tension with the NUC form factor is: compute power vs heat. What I really want is a compact NUC form factor, with working passive cooling (no fans), and enough compute oomph to use as my desktop. This has usually put me into the i5 varieties, i7 being a bit on the hot side, and I end up accepting a little fan in there because: oh, it's quiet enough, but... 3-4 years later it's packed with dust and starts spinning loudly, and that's generally a sign that I should start looking for a replacement before this one dies - find that the passively cooled options are 3x the cost, bite the bullet and buy another one with a fan. I think I've run that cycle 3x now with our home entertainment center NUC, and it just started making excessive fan noise under "normal" loads a few months ago. Maybe this is the replacement cycle where I finally pony up for a passively cooled version.

      I have two NUCs on my working desk, simply because they're small enough to do that without trouble. I needed to set up an M$ domain controller, and rather than screw around with the extra layer of "stuff" in a VM, I decided that a sub-$300 i3 NUC was a good investment in exchange for time saved, also since it takes negligible space. Even if it were "free" but the size of a mini-tower, I might have opted for the VM headaches instead.

      --
      Ukraine is still not part of Russia. Glory to Ukraine 🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
      • (Score: 2) by Immerman on Thursday October 13 2022, @04:12PM (5 children)

        by Immerman (3985) on Thursday October 13 2022, @04:12PM (#1276453)

        Yeah. While I can see the appeal of the "extreme" line, I really wish they'd stick to the low-profile form factor.

        E.g., you could have an edge-mounted PCIe slot so that your high-power video card lies in the same plane as the motherboard, keeping the two largely thermally isolated from each other. All wrapped up in a small, thick "pizza box" form factor that would still fit great either under your monitor or mounted to the back.

        • (Score: 2) by JoeMerchant on Thursday October 13 2022, @04:31PM (4 children)

          by JoeMerchant (3937) on Thursday October 13 2022, @04:31PM (#1276457)

          At launch, Intel sort of presented the NUC as a "reference design" that they expected others to re-make into their own versions, and some have, but I think Intel was so successful selling their own NUCs that they just figured "what the hell" and continue to dominate that market segment today. The extreme line seems like a "jumped the shark" kind of decision to me... losing sight of what made NUC great, just using the brand recognition to launch another line that may not be nearly as great as the original.

          --
          Ukraine is still not part of Russia. Glory to Ukraine 🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
          • (Score: 2) by Immerman on Thursday October 13 2022, @05:08PM (3 children)

            by Immerman (3985) on Thursday October 13 2022, @05:08PM (#1276459)

            Yeah, reference designs tend to be boring, lackluster, and generally uninteresting... but that's kind of the exact niche NUCs fill so well (or can't avoid). Combined with the relatively small size of the niche, I'm not surprised competitors haven't really thrown themselves into the fray. Especially when it puts them in direct competition with their upstream provider, who can maintain the same profit margins at lower prices.

            • (Score: 2) by JoeMerchant on Thursday October 13 2022, @05:24PM (2 children)

              by JoeMerchant (3937) on Thursday October 13 2022, @05:24PM (#1276461)

              Well, first, I think Intel maintains higher profit margins at the same price points, and in most of the NUC market (except that passively cooled type I want) the competition is mostly undercutting Intel's prices. So, yeah, not super appealing for manufacturers.

              But I totally fail to see why the NUC should be any kind of niche from the consumer perspective. I mean, OK, sure, you can't put a graphics card or other big expansion card in it (until now), but, realistically speaking, what percentage of the PC market ever puts anything in their box after they buy it? You are free to upgrade (or often required to purchase separately) the RAM and SSD in a NUC... that should cover like 90% of desktop users' "expansion" needs. And: who really needs or wants a bigger box?

              Back in the 4th or 5th gen Core days, I might have agreed that bigger desktop boxes (which can dissipate more heat) were attractive because the NUCs were a little underpowered, but today? You've got to be a 1% rare number-crunching geek to need more than i5 Rocket Lake (11th gen) performance, and that's right on par, price-wise, whether you get it in an Intel NUC or a Dell pizza box.

              --
              Ukraine is still not part of Russia. Glory to Ukraine 🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end
              • (Score: 2) by Immerman on Thursday October 13 2022, @06:29PM (1 child)

                by Immerman (3985) on Thursday October 13 2022, @06:29PM (#1276464)

                I continue to have my suspicions about cooling quality though: especially with modern CPUs auto-throttling to prevent thermal damage, even a full-size PC can sometimes have a hard time actually running at full speed for any length of time.

                • (Score: 2) by JoeMerchant on Thursday October 13 2022, @06:58PM

                  by JoeMerchant (3937) on Thursday October 13 2022, @06:58PM (#1276467)

                  The ones I'm tempted by are "industrial application" models, which I suppose is part of the justification for their prices... they're also a bit bigger than the traditional Intel NUC form factor:

                  Stuff like: https://www.onlogic.com/ml100g-53/ [onlogic.com]

                  We bought something similar for work about 5 years back and it has been a trooper ever since - it is part of our build server mess, so it gets full CPU loads off and on several times daily.

                  --
                  Ukraine is still not part of Russia. Glory to Ukraine 🌻 https://news.stanford.edu/2023/02/17/will-russia-ukraine-war-end