
posted by martyb on Monday July 13 2020, @03:09AM
from the Apple-of-my-eye dept.

Apple has built its own Mac graphics processors:

Like iPhones and iPads, Apple Silicon Macs will use an Apple-designed GPU – something that makes complete sense when you consider this is how current iOS devices work. But it could give some high-end users pause during the transition period from Intel-based hardware.

[...] You see, while Intel Macs contain GPUs from Intel, Nvidia and AMD, Apple Silicon Macs will use what the company seems fond of calling “Apple family” GPUs. These use a rendering system called Tile Based Deferred Rendering (TBDR), which iOS devices already use.

It works differently from the Immediate Mode rendering system supported in Intel Macs: while Immediate Mode renders imaging data straight to device memory, TBDR makes more use of the GPU by sorting each element first before submitting it to device memory.

You can find out more here.

The effect is that TBDR delivers lower latency, higher performance and lower power requirements, and makes more efficient use of memory bandwidth. The A11 chip and Metal 2 really consolidated this technique.
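To make the distinction concrete, here is a toy sketch in Swift of how the two approaches treat device-memory traffic. It is purely illustrative: the Fragment type, the tiling helper and the write counting are invented for this example and have nothing to do with Apple's actual GPU or driver code.

```swift
// Toy model contrasting immediate-mode and tile-based deferred rendering.
// Purely illustrative; real GPUs are enormously more complicated.
struct Fragment {
    let x: Int
    let y: Int
    let depth: Float
}

// Immediate mode: fragments are depth-tested and written to the framebuffer
// in device memory as they arrive, so overdrawn pixels can be written many times.
func immediateModeMemoryWrites(_ fragments: [Fragment], width: Int, height: Int) -> Int {
    var depthBuffer = [Float](repeating: .infinity, count: width * height)
    var externalWrites = 0
    for f in fragments {
        let i = f.y * width + f.x
        if f.depth < depthBuffer[i] {
            depthBuffer[i] = f.depth
            externalWrites += 1          // each surviving fragment hits device memory
        }
    }
    return externalWrites
}

// TBDR: fragments are first binned by screen tile, visibility is resolved in
// fast on-chip tile memory, and each covered pixel is written out exactly once.
func tbdrMemoryWrites(_ fragments: [Fragment], width: Int, height: Int, tileSize: Int) -> Int {
    let tilesPerRow = (width + tileSize - 1) / tileSize
    var bins: [Int: [Fragment]] = [:]
    for f in fragments {
        let tile = (f.y / tileSize) * tilesPerRow + (f.x / tileSize)
        bins[tile, default: []].append(f)        // sort geometry by tile first
    }
    var externalWrites = 0
    for (_, tileFragments) in bins {
        var nearestDepth: [Int: Float] = [:]     // stands in for on-chip tile memory
        for f in tileFragments {
            let i = f.y * width + f.x
            if f.depth < (nearestDepth[i] ?? .infinity) {
                nearestDepth[i] = f.depth        // resolved on chip, no external traffic
            }
        }
        externalWrites += nearestDepth.count     // one device-memory write per covered pixel
    }
    return externalWrites
}

// Example: two overlapping fragments at the same pixel, far one drawn first.
let frags = [Fragment(x: 1, y: 1, depth: 0.9), Fragment(x: 1, y: 1, depth: 0.1)]
print(immediateModeMemoryWrites(frags, width: 8, height: 8))     // prints 2
print(tbdrMemoryWrites(frags, width: 8, height: 8, tileSize: 4)) // prints 1
```

The example at the bottom shows the point: overlapping fragments cost extra external writes in immediate mode, while the tile pass resolves them on chip and writes each covered pixel out once.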

It’s important to note that the GPU in a Mac with Apple silicon is a member of both GPU families, and supports both Mac family and Apple family feature sets. In other words, using Apple Silicon and Rosetta, you should still be able to use software designed for Intel-based Macs.
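In Metal this is something an app can query directly at runtime through MTLDevice.supportsFamily(_:). A minimal sketch; the specific family constants checked here (.apple6 and .mac2) are only examples, and which families a machine actually reports depends on its GPU and OS version.

```swift
import Metal

// Ask the default GPU which Metal feature-set families it belongs to.
// On a Mac with Apple silicon the expectation, per the article, is that it
// reports membership in both an Apple family and the Mac family.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU: \(device.name)")
    print("Apple family (TBDR-style feature set): \(device.supportsFamily(.apple6))")
    print("Mac family feature set: \(device.supportsFamily(.mac2))")
} else {
    print("No Metal device available")
}
```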

[...] How will Apple exploit this? Will it ditch fans in order to make thinner Macs? Will it exploit the opportunity to explore a new design language for its PCs? At what point will an iPhone become all the Mac you ever need, given your choice of user interface and access to a larger screen?


Original Submission

 
  • (Score: 2) by takyon on Monday July 13 2020, @03:52AM (12 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @03:52AM (#1020139) Journal

    If they just want to replace the integrated graphics in Intel laptops, then Apple A14 and later chips will probably do just fine. But what will they use to replace the discrete AMD GPUs in higher-end products (e.g. a Mac Pro with up to 28-core Intel Xeon and AMD Radeon Pro)?

    They could keep using discrete AMD GPUs in some products after the 2 year transition off of x86, make their own discrete GPUs, or make giant SoCs with lots of ARM CPU cores (64+) and scaled-up GPU performance (enough to call it a graphics-oriented workstation). Not unlike the next-gen consoles which pack 8 Zen 2 cores and a fairly powerful GPU in a roughly 360 mm2 die. At the prices Apple sells those things for, they could provide a massive interposer-based design [tomshardware.com] combining CPU, GPU, AI acceleration, and stacks of High Bandwidth Memory.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by Mojibake Tengu on Monday July 13 2020, @06:24AM (3 children)

      by Mojibake Tengu (8598) on Monday July 13 2020, @06:24AM (#1020165) Journal

Apple is heading toward complete technological solipsism. That's how far.

And I say that as an owner of 8 working Apple devices in total (MacBook Pro model 2015, iPhone 6s, two iPads, Watch, Pencil, Magic Keyboard, AirPort router, a complete ecosystem good enough for platform software development). Anything designed by Apple after Jobs's death makes me scared. Their software is rolling downhill too. No future.

      --
      Respect Authorities. Know your social status. Woke responsibly.
      • (Score: 2) by takyon on Monday July 13 2020, @07:47AM (2 children)

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @07:47AM (#1020179) Journal

        As far as the CPU is concerned, Apple has everything it needs in order to completely leave Intel in the dust. Or to be more specific, they can leave these Intel Xeon W CPUs [wikipedia.org] found in the 2019 Mac Pro models in the dust, ignoring anything Intel or AMD puts out over the next few years. But it won't matter because they can make ARM hardware that is faster than the 2019 x86 baseline (as long as the software runs on it).

8-16 big cores clocking somewhere between 3 GHz and 4 GHz [anandtech.com]? No problem. Small cores? How about 64-128? 28-core Xeon crushed. Throw 32-64 GB of HBM on the interposer. 128-192 GB of HBM is plausible using eight 8-Hi or 12-Hi stacks. Throw another $10,000 at the system to continue.
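The arithmetic behind that upper range, assuming 2 GB (16 Gb) dies per HBM layer (an assumption about the stack configuration, not a confirmed spec):

```swift
// Rough HBM capacity math. Assumes 2 GB (16 Gb) per die layer, which is an
// assumption about the stack configuration, not a confirmed spec.
let gbPerDie = 2
let stacks = 8
let eightHi  = stacks * 8  * gbPerDie   // eight 8-Hi stacks  -> 128 GB
let twelveHi = stacks * 12 * gbPerDie   // eight 12-Hi stacks -> 192 GB
print("\(eightHi) GB to \(twelveHi) GB across \(stacks) stacks")
```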

        Other hardware decisions and software will be harder than striking down Intel during its moment of weakness, but that's not what I'm interested in.

        You have a tough decision to make too. Will you complete your development ecosystem with the ultimate piece of kit [soylentnews.org]?

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by Mojibake Tengu on Monday July 13 2020, @09:14AM (1 child)

          by Mojibake Tengu (8598) on Monday July 13 2020, @09:14AM (#1020193) Journal

Most probably not. I consider Apple's betrayal of their user-centric ideals intolerable.

I hold and use some other dev ecosystems as well. For a VR paradigm, Sony's PS4 Pro itself is unimpressive as a HW platform (it's bad technical engineering all over the box inside), but its PSVR, while clumsy, is quite hackable; the protocol is known from reverse engineering, so the gadget is experimentally usable on free platforms too. I have already focused on that one, since I consider a terminal running in a bulky cabled VR headset optically safe. It may not be as safe on any kind of transparent glasses because of a possible optical side-channel leak.
Hacking Sony controllers is fun too, not bad for repurposing.

As for phone and mobile comm stuff, my future is Huawei, no doubt. They have no competition.

          --
          Respect Authorities. Know your social status. Woke responsibly.
          • (Score: 2) by takyon on Monday July 13 2020, @09:35AM

            by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @09:35AM (#1020198) Journal

            Get yourself a MateStation [wccftech.com].

            One detail I missed from the article:

            The company has also committed to introducing new Intel-based Macs that do support these external systems for some time during the current transition.

            So I guess there could be 1-2 new generations of Intel x86 Xeon Macs.

            --
            [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by c0lo on Monday July 13 2020, @07:19AM (1 child)

      by c0lo (156) Subscriber Badge on Monday July 13 2020, @07:19AM (#1020172) Journal

      But what will they use to replace the discrete AMD GPUs in higher-end products

All Apple products are high-end products and, by God, our customers are going to be pleased [theonion.com] with whatever we tell them is ultra-high-end.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 1) by petecox on Monday July 13 2020, @08:43AM (1 child)

      by petecox (3228) on Monday July 13 2020, @08:43AM (#1020186)

That Apple was eschewing traditional GPUs should have been evident when they stopped licensing Imagination's PowerVR for their iPhones.

Expect iOS/macOS-only software titles that rely on Metal/TBDR instead of OpenGL/Vulkan/DirectX/OpenCL/CUDA.

    • (Score: 2) by richtopia on Monday July 13 2020, @02:48PM (1 child)

      by richtopia (3160) on Monday July 13 2020, @02:48PM (#1020327) Homepage Journal

      This was a concern for me too when I first heard of Apple migrating to their own chips.

      I think they will probably continue with AMD GPUs for a few years, but they will move towards an in-house GPU solution in the future. This is speculation on my part, but my supporting evidence:

1. Even Intel is struggling to get a dGPU to market. Their challenges might be different from Apple's (Intel also has to figure out the manufacturing side), but the moral of the story is that graphics is difficult.
2. Apple has already been designing their graphics for the A-series chips in house, so they have some experience.
3. Apple also has experience with their FPGA-based Afterburner card. It is hard to find technical details on this piece of hardware, but I assume it was designed in-house by Apple. It's a more specific use case than a GPU, but a similar domain.

      • (Score: 2) by takyon on Monday July 13 2020, @05:02PM

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @05:02PM (#1020457) Journal

        That's the Afterburner card, right? What I remember is that it can handle 16K video.

Graphics difficulty is exaggerated. They already make their own integrated graphics hardware. To some extent, they can just throw more GPU cores at the problem. The Apple A12X/A12Z has 7-8 GPU "cores" vs. 4 for the A12; the A13 [wikipedia.org] has 4 again. Each A13 "core" is just 3.25 mm2 on TSMC N7P (total GPU area is 15.28 mm2, not 13 mm2). Instead of 4-8 cores, they could put more like 128 of them on a non-mobile SoC, especially on denser nodes like "5nm". HBM can provide more memory bandwidth and lower power draw. Other scaling issues can be dealt with.
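Back-of-the-envelope area math for that idea; the roughly 1.8x logic-density gain from N7 to N5 is my assumption based on TSMC's public figures, and shared GPU logic, caches and memory controllers are ignored:

```swift
// Rough GPU core area scaling. 3.25 mm^2 per A13 GPU core on N7P comes from
// the figures above; the ~1.8x N7-to-N5 density gain is an assumption.
let coreAreaN7P = 3.25     // mm^2 per core
let cores = 128.0
let n5DensityGain = 1.8

let areaOnN7P = cores * coreAreaN7P          // ~416 mm^2 of GPU cores
let areaOnN5  = areaOnN7P / n5DensityGain    // ~231 mm^2 on "5nm"
print("N7P: \(Int(areaOnN7P)) mm^2, N5: ~\(Int(areaOnN5)) mm^2")
```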

        Reaching the 5-56 teraflops levels of GPU performance of various Mac(Book) Pro products with dGPUs could be difficult, but not impossible.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Monday July 13 2020, @10:06PM (1 child)

      by Anonymous Coward on Monday July 13 2020, @10:06PM (#1020763)

I'm a bit curious about some of these moves, because Apple simply doesn't have the market share to get developers to put software through an entire porting process just to run on their hardware. Once they've swapped the Intel processors for anything else, though, swapping out the GPU chips for something other than AMD, Intel or NVidia is not that big of a deal.

What doesn't make much sense to me is that they used to be completely unable to run the software that ran on PCs without going through the process of porting it over, and the amount of software suffered for it. If you wanted to work with people who were using PCs, you had to hope that either the software being used was available on both sides or, failing that, that there was a file format compatible on both sides that didn't cause weird things to happen due to subtly different implementations.

      We'll see how this works out, but this is a really strange move to make.

      • (Score: 2) by takyon on Monday July 13 2020, @10:40PM

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @10:40PM (#1020792) Journal

        There's already a lot of software that targets Windows x86 but not Mac x86. At least this move towards ARM will unify iPhone, iPad, Mac, etc.

        It also seems like emulation can happen one way or another. Windows 10 on ARM will probably end up booting on ARM Macs at some point, and it will be able to emulate x64 in 2021.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 3, Interesting) by ledow on Monday July 13 2020, @07:19AM (2 children)

    by ledow (5567) on Monday July 13 2020, @07:19AM (#1020171) Homepage

    Er... what?

TBDR? Welcome to the Sega Dreamcast era. "Early in the development of desktop GPUs, several companies developed tiled architectures. Over time, these were largely supplanted by immediate-mode GPUs with fast custom external memory systems."

"Immediate Mode" - that's terminology so old that it's obsolete in itself; no modern graphics card "renders immediately" to device memory, and most modern protocols don't even allow it. We've had display lists and all kinds of things in the DECADES since that was last true. "Sorting each element first before submitting it to device memory"... you mean like every other graphics card/protocol on the planet at the moment?

    This is either something completely different to how it's written, or the author knows nothing of graphics programming for the last 20 years.

Shall I give you a literal example of TBDR support in a modern ARM chip? "Broadcom VideoCore IV series" - a Raspberry Pi. Well renowned for their high-end 3D rendering capabilities, right?

    • (Score: 2) by takyon on Monday July 13 2020, @07:50AM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @07:50AM (#1020180) Journal

      Looks like they filled out the article with some random stuff given that there is not much new information here.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Monday July 13 2020, @09:44PM

      by Anonymous Coward on Monday July 13 2020, @09:44PM (#1020735)

Now that it's out of patent protection, and they think whatever other techniques they're using can be kept from Nvidia/AMD/etc. discovering them in time to sue over it. Or they have more money than God and can simply buy whatever company or patent war chest is needed if push comes to shove.

That said, a lot of these articles recently have HUGELY FUCKING RETARDED technical declarations from people who obviously have no knowledge of tech history that is less than 25 years old. Hell, back in the 1990s I knew more about computer history that was already old then than these kids are showing now. Although this does seem par for the course for what the tech industry and tech reporting are declining into (and that is saying a lot given how bad some of it was in the 90s and 2000s).

  • (Score: 1, Insightful) by Anonymous Coward on Monday July 13 2020, @08:18AM (6 children)

    by Anonymous Coward on Monday July 13 2020, @08:18AM (#1020184)

    GPUs are hard, a lot harder than CPUs. 3dFX invented modern GPUs, and nVidia bought them. ATI was the largest maker of 2D and OEM graphics before inventing the Radeon and later being bought by AMD, and has spent 25 years trying to barely keep up with nVidia. Lots of other companies have tried to make competitive GPUs and it never works out. Matrox, Rendition, PowerVR, S3, SiS, Number Nine, Intel.

    Magic sauce technology (especially if it's the same as technology that has already been tried) rarely pans out. Fast memory, efficient render pipelines, and advanced drivers are what wins the day.

    Most Mac users barely care if the thing even switches on, but the ones doing serious graphics and video work need a real GPU. Apple hates games and doesn't really want them on the Mac at all, but the Mac can't survive long term exclusively as a fashion statement.

    Maybe Apple is OK with that. Maybe in a few years they're planning to come out with a dock where you plug in your iPhone and that's your Mac. Maybe Apple is counting on their strong position in smartphones to let them replace the PC with a phone entirely. It's a big, risky bet, especially since the dock is just naturally going to cost more than a whole Intel laptop (and the iPhone is slowly but surely losing its popularity). I'd bet more on the whole thing blowing up in their face, but it's their decision to make.

    • (Score: 1) by petecox on Monday July 13 2020, @08:52AM

      by petecox (3228) on Monday July 13 2020, @08:52AM (#1020189)

      Maybe in a few years they're planning to come out with a dock where you plug in your iPhone and that's your Mac. Maybe Apple is counting on their strong position in smartphones to let them replace the PC with a phone entirely.

      e.g. Convergence on Ubuntu Touch. Continuum on Windows 10 Mobile.

      p.s. Google could already own that market if they put Chrome OS on phones. But instead they've settled on Android apps running in a window on touchscreen tablets.

    • (Score: 2) by takyon on Monday July 13 2020, @09:32AM (3 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @09:32AM (#1020197) Journal

      Intel never stopped making integrated graphics, and now they are poised to beat AMD's iGPUs temporarily with Tiger Lake, make a return to discrete gaming GPUs, and more importantly, sell big discrete GPUs for AI and supercomputers [anandtech.com].

      Apple can probably scale up its current mobile iGPUs significantly for laptop/desktop-oriented versions. With sufficient die area and cooling, even SoCs can compete with discrete GPUs (see next-gen consoles that will have around RTX 2070 Super / RTX 2080 Super performance in an APU/SoC).

      Apple's hatred for games is greatly exaggerated.

      Revolution Software’s Beyond a Steel Sky brings new adventure to Apple Arcade [venturebeat.com]
      Apple is getting serious about iPad gaming with better gamepad and keyboard support [theverge.com]
      Apple Rumored to Be Working on an ARM-Based Console Most Likely Featuring Its Own A-Series Silicon [wccftech.com]

      They've sold billions of dollars worth of casual-ish games on iOS, all of which should be able to run on ARM-based Macs. Whether or not they will do much beyond that remains to be seen. The Nintendo Switch has shown that x86 games will be ported to an ARM device, even one with serious hardware constraints, if there is demand for it.

      The strategy they are talking about right now is putting ARM in Macs, and eventually Mac Pros, not docking. I don't see why they couldn't make iPhone docking a thing immediately, unless the connector needs changes to handle 6K+ resolutions or they want to try a WiGig approach. Seeing as Ubuntu Edge failed (by setting expectations too high) and nobody talks about Samsung DeX docking, maybe there isn't much demand for it yet.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Mojibake Tengu on Monday July 13 2020, @04:17PM (1 child)

        by Mojibake Tengu (8598) on Monday July 13 2020, @04:17PM (#1020418) Journal

Gaming on iPad is as painful as it gets. Consider this: iOS gives the user no way to terminate an app. I don't understand why, if not for the pure purpose of stalking.
Instead, the system kills an app when it misbehaves or when resources run out. I'll come back to this situation later.

In iOS's fake multitasking system, originally only one application, and now just 2 (in words: two) in the current iteration of the system, can run in the foreground, facing the user.
Anything in the background can be killed at any moment at the system's discretion. Killing means all network resources get disconnected. One piece of collateral damage from this tactic is that iOS devices are nearly unusable as a simple remote terminal, even if you buy some fancy terminal app, because an ssh session cannot survive reconnection. Sure, there are hacks for this condition, but no standard protocol.
I hate Apple for this illogic. On the protocol side, it practically means the user is shackled to the http(s) world.

Badly behaving means an app can be killed for good reasons, like consuming too much memory or CPU time, but also for dubious system reasons like a sudden rise in battery temperature. The gaming device itself (iPad 6th generation, max possible memory) is badly underengineered. When running games, it is very easy to get the device overheated, so the game seemingly crashes, usually at the climax of a thrilling combat, because the system killed it over some internal temperature condition caused by heavy 3D rendering. Many well-known iOS games have inferior graphics compared to other ARM platforms just because of that.

Considering the fact that Apple is a partner of GE, pushing iOS into industrial control panels, and also of the Pentagon, providing custom iOS devices for SOG, this funny behavior is a good recipe for industrial accidents or even possible combat losses. I can't even conceive how people could be so stupid. The only comparable level of stupidity is the out-of-memory killer (the famous OOM killer) in the Linux kernel.

This is why I cannot tolerate the transition of the Mac platform to ARM and the upcoming convergence of macOS with iOS. I simply do not believe Apple's marketing propaganda about it.

        --
        Respect Authorities. Know your social status. Woke responsibly.
        • (Score: 2) by takyon on Monday July 13 2020, @07:04PM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @07:04PM (#1020578) Journal

          The propaganda has only just begun:

          Windows PCs Will Have to Switch Over to ARM CPUs Eventually to Match Apple’s Future Offerings, Says Former Mac Chief [wccftech.com]

          “Specifically, what are Dell, HP, Asus, and others going to do if Apple offers materially better laptops and desktops and Microsoft continues to improve Windows on ARM Surface devices? In order to compete, PC manufacturers will have to follow suit, they’ll “go ARM” because, all defensive rhetoric aside, Apple and Microsoft will have made the x86 architecture feel like what it actually is: old.”

          [...] “This leaves Microsoft with a choice: Either forget Windows on ARM and cede modern PCs to Apple, or forge ahead, fix app compatibility problems and offer an ARM-based alternative to Apple’s new Macs. It’s a false dilemma, of course. Microsoft will forge ahead…with repercussions for the rest of the Windows PC industry.”

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Tuesday July 14 2020, @08:46PM

        by Anonymous Coward on Tuesday July 14 2020, @08:46PM (#1021465)

Docking will be the killer feature when smartglasses start getting really mainstream, maybe another 5-10 years out. Phone SoCs are already fast enough for 99% of people, but interfacing is the pain point. Chorded input is out (nobody makes chorded input devices for consumers except tapwithus, and despite being very cool, that is definitely not going to be mainstream), so a Bluetooth (or USB-C hub) keyboard/mouse, an AR "touchscreen", or gestures might be enough for most people.

    • (Score: 2) by takyon on Tuesday July 14 2020, @09:37PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday July 14 2020, @09:37PM (#1021489) Journal

      There's a new GPU player in town:

      Asia based Zhaoxin has plans for a dedicated graphics card series [guru3d.com]

      That one looks pretty lackluster compared to this concept from another Chinese company, also on a "28nm" node:

      Look out Nvidia and AMD… Chinese GPU maker has a GTX 1080-level card in development [pcgamesn.com]

      Chinese companies can take a cheap and readily available older node, make a large and power hungry GPU, put lots of High Bandwidth Memory on it to help performance and lower power consumption a bit, and then sell it at cutthroat margins so that it has decent price/performance.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]