posted by LaminatorX on Monday November 24 2014, @09:12AM   Printer-friendly
from the semi-approximate dept.

SemiAccurate pitted AMD's Mantle 3D rendering API against Microsoft's DirectX 11 by comparing the frame rates achieved by five video games that ship with support for both rendering engines, on various hardware configurations outfitted with AMD GPUs or APUs (integrated CPU/GPU). Thomas Ryan wrote up the results in a five-part series. The short answer is that while Mantle produced superior frame rates for practically every game and every hardware configuration, in many cases the difference was small, 10 percent or less. Mantle's performance advantage is more telling on systems with a dedicated GPU than on an APU, and was most consistently realized in "Civilization: Beyond Earth", ironically a strategy game rather than a shooter. In those scenarios, one could indeed say that "Mantle knocked it out of the park."
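For scale, the gains discussed here are relative frame-rate uplifts. A quick sketch of the arithmetic, with made-up fps figures rather than SemiAccurate's measurements:

```python
# Relative frame-rate uplift of Mantle over DirectX 11.
# The fps numbers below are hypothetical, not SemiAccurate's data.

def uplift_pct(mantle_fps: float, dx11_fps: float) -> float:
    """Percentage improvement of mantle_fps relative to dx11_fps."""
    return 100.0 * (mantle_fps - dx11_fps) / dx11_fps

print(uplift_pct(66.0, 60.0))  # 10.0 -- the "small" kind of gain
print(uplift_pct(81.0, 60.0))  # 35.0 -- the "out of the park" kind
```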

AMD claims that Mantle provides game developers more opportunities to directly invoke functionality on the GPU, removing the CPU bottleneck. It is supported by the Graphics Core Next generation of AMD GPUs and APUs, although it is not currently supported by either the PlayStation 4 or the Xbox One. More details are provided in AMD's white paper.
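The claim is essentially about per-draw-call CPU overhead: a thin API lets the application pre-record and pre-validate work instead of paying driver-side validation on every call. A minimal cost-model sketch of that idea (hypothetical numbers and function names; this is not Mantle's actual API):

```python
# A "thick" API validates state on every draw call, while a "thin" API
# validates once when a command buffer is recorded and then replays it
# cheaply each frame. All costs are arbitrary illustrative units.

VALIDATION_OPS = 50  # per-draw CPU work in the thick API
REPLAY_OPS = 2       # per-draw CPU work when replaying a recorded buffer

def thick_api_frame(num_draws: int) -> int:
    """Every draw pays full validation, so CPU cost scales with draw count."""
    return num_draws * VALIDATION_OPS

def record_command_buffer(num_draws: int) -> int:
    """One-time cost: validate everything up front while recording."""
    return num_draws * VALIDATION_OPS

def thin_api_frame(num_draws: int) -> int:
    """Replaying pre-validated commands is cheap per draw."""
    return num_draws * REPLAY_OPS

draws = 10_000
print(thick_api_frame(draws))  # 500000 CPU units, paid every frame
print(thin_api_frame(draws))   # 20000 CPU units per frame after recording once
```

The one-time recording cost amortizes over many frames, which is why the benefit shows up most when the CPU, not the GPU, is the bottleneck.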

The test-result articles are listed in the order they appear on semiaccurate.com; note that they are not arranged from least to most capable hardware, or vice versa:

  • (Score: 2) by Lagg on Monday November 24 2014, @10:42AM

    by Lagg (105) on Monday November 24 2014, @10:42AM (#119373) Homepage Journal

    Right now I am inclined by default to distrust any kind of benchmarking, of hardware and libs alike. I was already skeptical that Mantle could magically outperform OpenGL, though I have no doubt it can outdo Direct3D just by virtue of not having abstractions with side effects that might not do what people want them to do at runtime. For this reason I was curious whether anyone has dug up concerns about bribery or financial incentives on the part of the guys doing the benchmark.

    Don't get me wrong, I'm not accusing them of anything. It's just that right now benchmarking is such a cesspit of media sensationalism and company dishonesty that it's really hard to assume outright that they're staying in line with proper experimental procedure.

    Even Phoronix/Michael's benchmarks I have issues with, despite trusting him a hell of a lot more than any other site doing the same, just because I know that he gets hardware from companies for free. Of course, since he started a site for archiving and submitting benchmark results, his results can be tested (like any experiment should be), so I don't have issues there so much anymore. The problem is, his benchmarks usually show the obvious, or the tiny differences one would intuitively expect from hardware fulfilling its specialized task. Michael never misrepresents this, but here it's just a little hard to swallow alleged performance gains that are one or two orders of magnitude greater than what it's being compared to.

    And yes, I know that one of the big advantages of Mantle is that it makes it much easier to offload processing to the GPU (or at least that's what AMD claims; I haven't actually written anything with it myself), but by nature these are number crunchers. It stands to reason that offloading the number crunching to the number crunchers would result in gains. But I don't think that indicates anything about Mantle itself, just that AMD makes decent GPUs and Mantle works pretty well with those GPUs. Which is fine. But for all I know, when it isn't running on AMD GPUs the runtime performance is half that of any random hardware-accelerated OpenGL code.

    Also, what's up with the silly photos of the cards? Yes, GPUs have chips. Yes, water looks cool when you sprinkle it on stuff [semiaccurate.com]. Like I said, it's really hard to take things seriously amid this media sensationalism and hype crap.

    --
    http://lagg.me [lagg.me] 🗿
    • (Score: 2) by Hairyfeet on Monday November 24 2014, @03:52PM

      by Hairyfeet (75) <bassbeast1968NO@SPAMgmail.com> on Monday November 24 2014, @03:52PM (#119441) Journal

      THANK YOU!!! This is what I've been saying for years, ever since finding so-called "benchmarks" showing results that the same chips in the shop simply did not deliver. Then, wadda ya know, Cinebench gets caught putting "if AMD, tie boat anchor" in their benchmark software, SiSoft did the same with Sandra a year or two back, and when reviewers try running actual applications instead of benchmarks? They find AMD is better than the benches say it is [youtube.com], and that the results I have been seeing are correct: AMD is within 5%-7% on single-threaded performance, depending on which chip you are comparing it to, while on programs that can multithread, those extra cores have their chips comparing favorably with Intel chips that cost over twice the price. This is after, of course, learning that the GPU benchmarks are so hopelessly rigged they might as well not exist, between the GPU companies writing drivers that change settings based on whether it's a benchmarking program, and the programs themselves favoring one GPU over another by the way they design their so-called "tests".

      So the only logical thing you can do is consider all benchmarks to be rigged and look only at tests using real-world programs, be it games, productivity software, video encoding, whatever. Thanks to YouTube, there are plenty of guys out there running real programs and uploading the results, TechReview and Jayz2Cents to name just two, but there are a ton of 'em. Just remember, folks: benchmark rigging has been going on since Quack.exe, if not earlier, so you need to treat benchmarks as what they are, lies designed to sell the hardware of whoever paid them the most $$$.

      --
      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
    • (Score: 2) by Wootery on Monday November 24 2014, @06:36PM

      by Wootery (2341) on Monday November 24 2014, @06:36PM (#119502)

      I was already skeptical that Mantle could magically outperform OpenGL, though I have no doubt it can outdo Direct3D just by virtue of not having abstractions with side effects that might not do what people want them to do at runtime.

      Why do you assume OpenGL is harder to beat than Direct3D?

      You may already know this, but: as I understand it, the ideal case for Mantle is where the graphics-card is powerful, but the CPU is weak. Reduced CPU overhead seems to be the kicker, not improvements to the way things actually run on the GPU.

      Agree that the water-sprinkled graphics-card photos are just stupid.

  • (Score: 0) by Anonymous Coward on Monday November 24 2014, @12:06PM

    by Anonymous Coward on Monday November 24 2014, @12:06PM (#119379)

    Looks like the poll is a defunct section now. The last poll is archived, but there's no new one. I miss the polls of the early days. They were fun and frequent. Oh well.

    Good night.

  • (Score: 0) by Anonymous Coward on Monday November 24 2014, @02:52PM

    by Anonymous Coward on Monday November 24 2014, @02:52PM (#119419)

    If you think 10 percent is small, donate 10 percent of your income. Hell, donate 10 percent of your wealth!

    Yeah, didn't think you would.

    • (Score: 2) by kaganar on Monday November 24 2014, @04:40PM

      by kaganar (605) on Monday November 24 2014, @04:40PM (#119456)

      Would you buy a product for 90 cents but not 99 cents? What if you were buying ten of them? A hundred? A thousand? Scale changes perspective quite a lot. Economics studies have repeatedly shown that the decision-making process changes drastically as the perceived stakes change; that's a failing of analogies even between relatively similar scenarios. In comparing income changes to game-performance changes, your statement is even less directly applicable.

      • (Score: 0) by Anonymous Coward on Monday November 24 2014, @08:39PM

        by Anonymous Coward on Monday November 24 2014, @08:39PM (#119550)

        Comparing two fairly rigidly fixed quantities is less applicable than comparing a fixed quantity and a fluid quantity?

      • (Score: 1) by monster on Tuesday November 25 2014, @04:32PM

        by monster (1260) on Tuesday November 25 2014, @04:32PM (#119844) Journal

        Even 1% can be a deal breaker: it's the difference between 99 (two digits) and 100 (omg, three digits! expensive!).

        Lately, I've even seen a $0.01 difference in prices (like $999.99).

  • (Score: 2) by meisterister on Monday November 24 2014, @05:39PM

    by meisterister (949) on Monday November 24 2014, @05:39PM (#119479) Journal

    I thought that one of the big reasons for Mantle was to overcome single-threaded performance limitations in modern graphics APIs and such.
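That single-threaded limitation is about command submission: older APIs funnel all draw recording through one render thread, while thin APIs let several threads record command lists in parallel. A rough sketch of the pattern (illustrative names only, not Mantle's real API):

```python
from concurrent.futures import ThreadPoolExecutor

# Each worker records its own command list independently, and the lists
# are submitted together, instead of serializing every draw through a
# single render thread. Purely a hypothetical model of the technique.

def record_command_list(draws):
    """Each worker independently turns draw descriptions into commands."""
    return [("DRAW", d) for d in draws]

all_draws = list(range(8_000))
chunks = [all_draws[i::4] for i in range(4)]  # split recording across 4 workers

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_command_list, chunks))

gpu_queue = []  # stand-in for a single hardware submission queue
for cl in command_lists:
    gpu_queue.extend(cl)

print(len(gpu_queue))  # all 8000 draws recorded in parallel, submitted once
```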

    Tom's did some Mantle benchmarks with the 8350, and it surprises me that SemiAccurate would go for "Crap chip (though pretty much on par with a 2GHz mobile Core 2 Quad), decent chip, super overpriced haxxorz Intel chip" instead of testing more hardware.

    http://www.tomshardware.com/reviews/amd-mantle-performance-benchmark,3860.html [tomshardware.com]

    --
    (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
    • (Score: 0) by Anonymous Coward on Monday November 24 2014, @06:23PM

      by Anonymous Coward on Monday November 24 2014, @06:23PM (#119496)

      It looks like Tom's did more tests than Semi did, but their results presentation is somewhat harder to grasp.

      I agree that the sweet spot in the marketplace is probably a GPU in the $200-300 range. One wouldn't expect Mantle to accelerate an APU as much, because the interconnect isn't physically as big a bottleneck there. That probably explains why AMD and Sony haven't been hustling to get Mantle onto the PS4. Microsoft, of course, has its own reasons for not wanting it on the One.

    • (Score: 2) by LoRdTAW on Tuesday November 25 2014, @01:49PM

      by LoRdTAW (3755) on Tuesday November 25 2014, @01:49PM (#119780) Journal

      What an awful set of graphs. I had to look twice at the color legend to figure out why the author decided to use a tiny fucking gradient instead of a nice big red-and-black square.