
posted by martyb on Monday July 13 2020, @03:09AM   Printer-friendly
from the Apple-of-my-eye dept.

Apple has built its own Mac graphics processors:

Like iPhones and iPads, Apple Silicon Macs will use an Apple-designed GPU – something that makes complete sense when you consider this is how current iOS devices work. But it could give some high-end users pause during the transition period from Intel-based hardware.

[...] You see, while Intel Macs contain GPUs from Intel, Nvidia and AMD, Apple Silicon Macs will use what the company seems fond of calling “Apple family” GPUs. These use a rendering system called Tile Based Deferred Rendering (TBDR), which iOS devices already use.

It works differently from the Immediate Mode rendering system supported in Intel Macs: while Immediate Mode renders imaging data straight to device memory as it arrives, TBDR makes fuller use of the GPU by sorting each element first and only then submitting it to device memory.


The effect is that TBDR delivers lower latency, higher performance, and lower power requirements, and makes better use of available memory bandwidth. The A11 chip and Metal 2 really consolidated this technique.
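To make the distinction concrete, here is a toy sketch in plain Swift. Every type and function in it is hypothetical and for illustration only; real GPUs do this in fixed-function hardware, not application code. The immediate-mode path depth-tests and writes every fragment against device memory as it arrives, while the TBDR-style path bins fragments into tiles, resolves visibility in fast on-chip tile storage, and writes each finished tile to memory once.

```swift
// Toy illustration of immediate-mode vs. tile-based deferred rendering.
struct Fragment {
    let x: Int, y: Int     // screen position
    let depth: Float       // smaller = closer to the camera
    let color: UInt32
}

let width = 8, height = 8, tileSize = 4

// Immediate mode: every fragment is depth-tested against device memory
// and written out as it arrives, so hidden fragments still cost bandwidth.
func renderImmediate(_ fragments: [Fragment]) -> [UInt32] {
    var color = [UInt32](repeating: 0, count: width * height)
    var depth = [Float](repeating: .infinity, count: width * height)
    for f in fragments {
        let i = f.y * width + f.x
        if f.depth < depth[i] {   // read-modify-write against "device memory"
            depth[i] = f.depth
            color[i] = f.color
        }
    }
    return color
}

// TBDR-style: bin fragments by tile ("sorting each element first"),
// resolve visibility per tile, then write each tile to memory once.
func renderTiled(_ fragments: [Fragment]) -> [UInt32] {
    var color = [UInt32](repeating: 0, count: width * height)
    let tilesX = width / tileSize
    var bins = [[Fragment]](repeating: [], count: tilesX * (height / tileSize))
    for f in fragments {
        bins[(f.y / tileSize) * tilesX + (f.x / tileSize)].append(f)
    }
    for bin in bins {
        var tileDepth = [Int: Float]()    // stands in for fast on-chip memory
        var tileColor = [Int: UInt32]()
        for f in bin {
            let i = f.y * width + f.x
            if f.depth < tileDepth[i, default: .infinity] {
                tileDepth[i] = f.depth
                tileColor[i] = f.color
            }
        }
        for (i, c) in tileColor { color[i] = c }  // single write-out per tile
    }
    return color
}
```

The binning pass is extra work up front, which is why the article describes TBDR as sorting everything out first, but it is what lets the GPU avoid memory traffic for fragments that lose the depth test.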

It’s important to note that the GPU in a Mac with Apple silicon is a member of both GPU families, and supports both Mac family and Apple family feature sets. In other words, using Apple Silicon and Rosetta, you should still be able to use software designed for Intel-based Macs.
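In Metal terms, that dual membership is something you can query at runtime. A minimal sketch using the real MTLDevice.supportsFamily(_:) API (macOS 10.15+); .apple5 is just an example case here, since the highest Apple family a given chip reports will vary:

```swift
import Metal

// Ask the default GPU which feature-set families it belongs to.
// Per Apple, an Apple Silicon Mac GPU should report membership in
// both an Apple family and the Mac family.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU:", device.name)
    print("Apple family (TBDR feature sets):", device.supportsFamily(.apple5))
    print("Mac family (Intel-era feature sets):", device.supportsFamily(.mac2))
}
```

Both checks coming back true on an Apple Silicon Mac is what allows software written against the Intel-era Mac feature sets to keep working.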

[...] How will Apple exploit this? Will it ditch fans in order to make thinner Macs? Will it exploit the opportunity to explore a new design language for its PCs? At what point will an iPhone become all the Mac you ever need, given your choice of user interface and access to a larger screen?


Original Submission

 
  • (Score: 2) by takyon on Monday July 13 2020, @03:52AM (12 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @03:52AM (#1020139) Journal

    If they just want to replace the integrated graphics in Intel laptops, then the Apple A14 and later chips will probably do just fine. But what will they use to replace the discrete AMD GPUs in higher-end products (e.g. a Mac Pro with up to a 28-core Intel Xeon and an AMD Radeon Pro)?

    They could keep using discrete AMD GPUs in some products after the two-year transition off x86, make their own discrete GPUs, or make giant SoCs with lots of ARM CPU cores (64+) and scaled-up GPU performance (enough to call it a graphics-oriented workstation). Not unlike the next-gen consoles, which pack 8 Zen 2 cores and a fairly powerful GPU into a roughly 360 mm2 die. At the prices Apple sells those things for, they could provide a massive interposer-based design [tomshardware.com] combining CPU, GPU, AI acceleration, and stacks of High Bandwidth Memory.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by Mojibake Tengu on Monday July 13 2020, @06:24AM (3 children)

    by Mojibake Tengu (8598) on Monday July 13 2020, @06:24AM (#1020165) Journal

    Apple is heading toward complete technological solipsism. That's how far.

    And I say that as an owner of 8 working Apple devices in total (MacBook Pro model 2015, iPhone 6s, two iPads, Watch, Pencil, Magic Keyboard, AirPort router, a complete ecosystem good enough for platform software development). Anything designed by Apple after Jobs's death scares me. Their software is going downhill too. No future.

    --
    Respect Authorities. Know your social status. Woke responsibly.
    • (Score: 2) by takyon on Monday July 13 2020, @07:47AM (2 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @07:47AM (#1020179) Journal

      As far as the CPU is concerned, Apple has everything it needs to leave Intel completely in the dust. Or, to be more specific, it can leave the Intel Xeon W CPUs [wikipedia.org] found in the 2019 Mac Pro models in the dust. Whatever Intel or AMD puts out over the next few years won't matter much, because Apple can make ARM hardware that is faster than that 2019 x86 baseline (as long as the software runs on it).

      8-16 big cores clocking somewhere between 3 GHz and 4 GHz [anandtech.com]? No problem. Small cores? How about 64-128? 28-core Xeon crushed. Throw 32-64 GB of HBM on the interposer. 128-192 GB of HBM is plausible using eight 8-Hi or 12-Hi stacks. Throw another $10,000 at the system to continue.
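      The capacity claim is just stack arithmetic. A quick sketch, assuming 16 Gb (2 GB) DRAM dies, which is typical of HBM2E and is what puts 8-Hi and 12-Hi stacks at 16 GB and 24 GB each (my assumption, not an Apple spec):

      ```swift
      // Back-of-envelope HBM capacity: stacks * dies-per-stack * GB-per-die.
      // Assumes 16 Gb (2 GB) DRAM dies, typical of HBM2E.
      let gbPerDie = 2
      let stacks = 8
      for stackHeight in [8, 12] {
          let totalGB = stacks * stackHeight * gbPerDie
          print("\(stacks) x \(stackHeight)-Hi stacks = \(totalGB) GB")
      }
      // Prints: 8 x 8-Hi stacks = 128 GB
      //         8 x 12-Hi stacks = 192 GB
      ```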

      Other hardware decisions and software will be harder than striking down Intel during its moment of weakness, but that's not what I'm interested in.

      You have a tough decision to make too. Will you complete your development ecosystem with the ultimate piece of kit [soylentnews.org]?

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Mojibake Tengu on Monday July 13 2020, @09:14AM (1 child)

        by Mojibake Tengu (8598) on Monday July 13 2020, @09:14AM (#1020193) Journal

        Most probably, no. I consider Apple's betrayal of its user-centric ideals intolerable.

        I hold and use some other dev ecosystems as well. For a VR paradigm, Sony's PS4 Pro itself is unimpressive as a hardware platform (it's bad technical engineering all over the inside of the box), but its PSVR, while clumsy, is quite hackable: the protocol is known through reverse engineering, so the gadget is experimentally usable on free platforms too. I have already focused on that one, for I consider a terminal running in a bulky cabled VR headset optically safe. It may not be as safe on any kind of transparent glasses, because of a possible optical side-channel leak.
        Hacking Sony controllers is fun too, and not bad for repurposing.

        As for phone and mobile comm stuff, my future is Huawei, no doubt. They have no competition.

        --
        Respect Authorities. Know your social status. Woke responsibly.
        • (Score: 2) by takyon on Monday July 13 2020, @09:35AM

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @09:35AM (#1020198) Journal

          Get yourself a MateStation [wccftech.com].

          One detail I missed from the article:

          The company has also committed to introducing new Intel-based Macs that do support these external systems for some time during the current transition.

          So I guess there could be 1-2 new generations of Intel x86 Xeon Macs.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by c0lo on Monday July 13 2020, @07:19AM (1 child)

    by c0lo (156) Subscriber Badge on Monday July 13 2020, @07:19AM (#1020172) Journal

    But what will they use to replace the discrete AMD GPUs in higher-end products

    All Apple products are high-end products and, by God, our customers are going to be pleased [theonion.com] with whatever we tell them is ultra-high-end.

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 1) by petecox on Monday July 13 2020, @08:43AM (1 child)

    by petecox (3228) on Monday July 13 2020, @08:43AM (#1020186)

    That Apple would eschew traditional GPUs should have been evident when it stopped licensing Imagination's PowerVR for its iPhones.

    Expect iOS/macOS-only software titles that rely on Metal/TBDR instead of OpenGL/Vulkan/DirectX/OpenCL/CUDA.

  • (Score: 2) by richtopia on Monday July 13 2020, @02:48PM (1 child)

    by richtopia (3160) on Monday July 13 2020, @02:48PM (#1020327) Homepage Journal

    This was a concern for me too when I first heard of Apple migrating to their own chips.

    I think they will probably continue with AMD GPUs for a few years, but they will move toward an in-house GPU solution in the future. This is speculation on my part, but here is my supporting evidence:

    1. Even Intel is struggling to get a dGPU to market. Its challenges may differ from Apple's (Intel also has to figure out the manufacturing side), but the moral of the story is that graphics is hard.
    2. Apple has already been designing the graphics for its A-series chips in house, so it has some experience.
    3. Apple also has experience with its FPGA-based Afterburner card. It is hard to find technical details on this piece of hardware, but I assume it was designed in-house by Apple. It serves a more specific use case than a GPU, but it's a similar domain.

    • (Score: 2) by takyon on Monday July 13 2020, @05:02PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @05:02PM (#1020457) Journal

      That's the Afterburner card, right? What I remember is that it can handle 16K video.

      Graphics difficulty is exaggerated. They already make their own integrated graphics hardware, and to some extent they can just throw more GPU cores at the problem. Apple's A12X/A12Z has 7-8 GPU "cores" vs. 4 for the A12; the A13 [wikipedia.org] has 4 again. Each A13 "core" is just 3.25 mm2 on TSMC N7P (total GPU area is 15.28 mm2, not 13 mm2). Instead of 4-8 cores, they could put more like 128 of them on a non-mobile SoC, especially on denser nodes like "5nm". HBM can provide more memory bandwidth at lower power draw. Other scaling issues can be dealt with.

      Reaching the 5-56 teraflops of GPU performance offered by the various dGPU-equipped Mac(Book) Pro products could be difficult, but not impossible.
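      The "throw more cores at it" argument can be sanity-checked from the die-area numbers above. A rough sketch; the ~1.8x N7-to-N5 logic-density gain is an assumed industry figure, not anything Apple has confirmed:

      ```swift
      import Foundation  // for String(format:)

      // Rough area budget for a scaled-up Apple GPU, using the A13 numbers
      // above: 15.28 mm2 of GPU area for 4 cores on N7P.
      let areaPerCore = 15.28 / 4.0   // ~3.82 mm2 per core, shared logic included
      let densityGainN5 = 1.8         // assumed N7 -> N5 logic-density scaling
      let cores = 128

      let areaN7 = Double(cores) * areaPerCore
      let areaN5 = areaN7 / densityGainN5
      print(String(format: "%d cores: ~%.0f mm2 on N7P, ~%.0f mm2 on \"5nm\"",
                   cores, areaN7, areaN5))
      // 128 cores: ~489 mm2 on N7P, ~272 mm2 on "5nm" -- big, but in the
      // same ballpark as the ~360 mm2 console dies mentioned upthread.
      ```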

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 0) by Anonymous Coward on Monday July 13 2020, @10:06PM (1 child)

    by Anonymous Coward on Monday July 13 2020, @10:06PM (#1020763)

    I'm a bit curious about some of these moves, because Apple simply doesn't have the market share to get developers to put software through an entire porting process just to run on its hardware. That said, once they've swapped the Intel processors for something else, swapping the GPUs for something other than AMD, Intel, or Nvidia is not that big a deal.

    What doesn't make much sense to me is that Macs used to be completely unable to run software written for PCs without it being ported over, and the amount of available software suffered for it. If you wanted to work with people who were using PCs, you had to hope that either the software in question was available on both sides or, failing that, that there was a file format compatible with both that didn't cause weird things to happen due to subtly different implementations.

    We'll see how this works out, but this is a really strange move to make.

    • (Score: 2) by takyon on Monday July 13 2020, @10:40PM

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @10:40PM (#1020792) Journal

      There's already a lot of software that targets Windows x86 but not Mac x86. At least this move towards ARM will unify iPhone, iPad, Mac, etc.

      It also seems like emulation can happen one way or another. Windows 10 on ARM will probably end up booting on ARM Macs at some point, and it will be able to emulate x64 in 2021.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]