SoylentNews is people

posted by martyb on Monday July 13 2020, @03:09AM   Printer-friendly
from the Apple-of-my-eye dept.

Apple has built its own Mac graphics processors:

Like iPhones and iPads, Apple Silicon Macs will use an Apple-designed GPU – something that makes complete sense when you consider this is how current iOS devices work. But it could give some high-end users pause during the transition period from Intel-based hardware.

[...] You see, while Intel Macs contain GPUs from Intel, Nvidia and AMD, Apple Silicon Macs will use what the company seems fond of calling “Apple family” GPUs. These use a rendering system called Tile-Based Deferred Rendering (TBDR), which iOS devices already use.

It works differently from the Immediate Mode rendering system supported in Intel Macs: while the latter renders imaging data straight to device memory, the former makes fuller use of the GPU by first sorting each element into screen tiles and only then submitting the finished result to device memory.

You can find out more here.

The effect is that TBDR delivers lower latency, higher performance and lower power requirements, and makes better use of memory bandwidth. The A11 chip and Metal 2 cemented this technique in Apple's ecosystem.
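The bandwidth claim above follows from how the two pipelines touch device memory. The following is a toy sketch (not Apple's API or Metal code; all names are illustrative) of the difference: immediate mode writes every pixel of every primitive to device memory as it arrives, while TBDR bins primitives into tiles, resolves each tile in fast on-chip memory, and flushes the finished tile once.

```python
# Toy model only: contrasts immediate-mode rendering with tile-based
# deferred rendering (TBDR) by counting writes to "device memory".
# A primitive is (color, list_of_covered_pixels); pixels are (x, y).

TILE = 2  # tile edge length in pixels (tiny, for illustration)

def immediate_mode(primitives):
    """Every covered pixel is written to device memory as it arrives."""
    framebuffer = {}
    writes = 0
    for color, pixels in primitives:
        for xy in pixels:
            framebuffer[xy] = color   # one device-memory write per covered pixel
            writes += 1
    return framebuffer, writes

def tbdr(primitives):
    """Primitives are binned per tile; each tile is shaded in on-chip
    memory, then flushed to device memory exactly once."""
    # Binning pass: sort each primitive's pixels into the tiles they touch.
    bins = {}
    for color, pixels in primitives:
        for (x, y) in pixels:
            bins.setdefault((x // TILE, y // TILE), []).append((color, (x, y)))
    framebuffer = {}
    writes = 0
    for tile_pixels in bins.values():
        on_chip = {}                  # fast tile memory: overdraw is free here
        for color, xy in tile_pixels:
            on_chip[xy] = color
        framebuffer.update(on_chip)   # single flush of the resolved tile
        writes += len(on_chip)        # only final pixels reach device memory
    return framebuffer, writes

# Two overlapping quads: the overdraw costs extra device-memory writes
# in immediate mode but is absorbed on chip under TBDR.
quad = [(x, y) for x in range(4) for y in range(4)]
prims = [("red", quad), ("blue", quad)]
fb_im, w_im = immediate_mode(prims)
fb_tb, w_tb = tbdr(prims)
assert fb_im == fb_tb                 # identical final image
print(w_im, w_tb)                     # 32 immediate-mode writes vs 16 under TBDR
```

Real hardware is vastly more complex (depth testing, hidden-surface removal, compression), but the core trade is the same: TBDR spends extra work sorting geometry up front to avoid expensive round-trips to device memory, which is why it suits bandwidth- and power-constrained mobile chips.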

It’s important to note that the GPU in a Mac with Apple Silicon is a member of both GPU families, and supports both the Mac family and Apple family feature sets. In other words, between Apple Silicon and Rosetta, you should still be able to run software designed for Intel-based Macs.

[...] How will Apple exploit this? Will it ditch fans in order to make thinner Macs? Will it exploit the opportunity to explore a new design language for its PCs? At what point will an iPhone become all the Mac you ever need, given your choice of user interface and access to a larger screen?


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Interesting) by ledow on Monday July 13 2020, @07:19AM (2 children)

    by ledow (5567) on Monday July 13 2020, @07:19AM (#1020171) Homepage

    Er... what?

    TBDR? Welcome to the Sega Dreamcast era. "Early in the development of desktop GPUs, several companies developed tiled architectures. Over time, these were largely supplanted by immediate-mode GPUs with fast custom external memory systems."

    "Immediate Mode" - that's terminology so old that it's obsolete in itself, no modern graphics card "renders immediately" to device memory, and most modern protocols don't even allow it. We've had display-lists and all kinds in the DECADES before that was true. "Sorting each element first before submitting it to device memory"... you mean like every other graphics card/protocol on the planet at the moment?

    This is either something completely different to how it's written, or the author knows nothing of graphics programming for the last 20 years.

Shall I give you a literal example of TBDR support in a modern ARM chip? "Broadcom VideoCore IV series" - a Raspberry Pi. Well renowned for their high-end 3D rendering capabilities, right?

    Starting Score:    1  point
    Moderation   +1  
       Interesting=1, Total=1
    Extra 'Interesting' Modifier   0  
    Karma-Bonus Modifier   +1  

    Total Score:   3  
  • (Score: 2) by takyon on Monday July 13 2020, @07:50AM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Monday July 13 2020, @07:50AM (#1020180) Journal

    Looks like they filled out the article with some random stuff given that there is not much new information here.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 0) by Anonymous Coward on Monday July 13 2020, @09:44PM

    by Anonymous Coward on Monday July 13 2020, @09:44PM (#1020735)

Now that it is out of patent protection, and they think whatever other techniques they are using can be kept from Nvidia/AMD/etc. discovering in time to sue them over it. Or they have more money than God and can simply buy the company or patent war chest needed if push comes to shove.

    That said, a lot of these articles recently have HUGELY FUCKING RETARDED technical declarations from people who obviously have no knowledge of tech history that is less than 25 years old. Hell I knew more about computer history that was old in the 1990s than these kids are acting like now. Although this does seem part for the course of what the tech industry and tech reporting is declining into (and that is saying a lot given how bad some of it was in the 90s and 2000s.)