
posted by LaminatorX on Tuesday April 14 2015, @01:54PM
from the sound-of-one-hand-clapping dept.

Fudzilla have 'obtained' a slide showing details of a forthcoming APU from AMD based on their new "Zen" architecture.

The highest-end compute/HSA part has up to 16 Zen x86 cores and supports 32 threads, or two threads per core. This is something Intel architectures have offered for a while (as Hyper-Threading), and it seems to work just fine. This will be the first exciting processor from the house of AMD in the server / HSA market in years, and if AMD delivers it on time it might be a big break for the company.
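
For anyone curious what two threads per core looks like from software, here is a minimal sketch that reports the logical CPU count and the SMT siblings of core 0. It assumes a Linux system; the "0,16" pairing mentioned in the comment is purely illustrative.

    /* Minimal sketch (Linux assumed): count logical CPUs and show which
       hardware threads share physical core 0, via sysfs. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        long logical = sysconf(_SC_NPROCESSORS_ONLN);   /* logical CPUs visible to the OS */

        /* thread_siblings_list names the logical CPUs sharing physical core 0,
           e.g. "0,16" on a two-way SMT part (illustrative value). */
        char siblings[64] = "unknown";
        FILE *f = fopen("/sys/devices/system/cpu/cpu0/topology/thread_siblings_list", "r");
        if (f) {
            if (fgets(siblings, sizeof siblings, f))
                siblings[strcspn(siblings, "\n")] = '\0';
            fclose(f);
        }

        printf("Logical CPUs: %ld\n", logical);
        printf("Hardware threads sharing core 0: %s\n", siblings);
        return 0;
    }

On a 16-core, 32-thread part the first line would report 32, with two siblings listed per core.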

Each Zen core gets 512KB of L2 cache, and each cluster of four Zen cores shares 8MB of L3 cache. For a 16-core, 32-thread next-generation Zen-based x86 processor, that comes to a whopping 8MB of L2 cache in total, backed by 32MB of L3 cache.
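
Those totals are just the per-core and per-cluster figures multiplied out; a quick sanity check using only the numbers from the leaked slide:

    /* Back-of-the-envelope check of the cache totals quoted above,
       based purely on the figures from the slide. */
    #include <stdio.h>

    int main(void)
    {
        const int cores             = 16;   /* Zen x86 cores        */
        const int cores_per_cluster = 4;    /* cores sharing one L3 */
        const int l2_per_core_kb    = 512;  /* KB of L2 per core    */
        const int l3_per_cluster_mb = 8;    /* MB of L3 per cluster */

        int total_l2_mb = cores * l2_per_core_kb / 1024;                    /* 16 * 512KB = 8MB */
        int total_l3_mb = (cores / cores_per_cluster) * l3_per_cluster_mb;  /* 4 * 8MB  = 32MB  */

        printf("Total L2: %d MB, total L3: %d MB\n", total_l2_mb, total_l3_mb);
        return 0;
    }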

This new APU also comes with the Greenland Graphics and Multimedia Engine, paired with HBM memory on the side. The specs we saw indicate up to 16GB of HBM with 512GB/s of bandwidth packed on the interposer. This is definitely a lot of memory for an APU's GPU, and it also comes with 1/2-rate double-precision compute, enhanced ECC and RAS, and HSA support.
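
To get a sense of scale for those HBM numbers, a trivial bit of arithmetic (peak figures only, ignoring real-world efficiency):

    /* Rough sense of scale for the claimed HBM figures: how long one full
       pass over the 16GB pool would take at the quoted 512GB/s peak. */
    #include <stdio.h>

    int main(void)
    {
        const double hbm_capacity_gb   = 16.0;   /* GB of HBM on the interposer */
        const double hbm_bandwidth_gbs = 512.0;  /* GB/s peak, per the slide    */

        double seconds_per_pass = hbm_capacity_gb / hbm_bandwidth_gbs;
        printf("One full pass over HBM: %.2f ms (%.0f passes/s at peak)\n",
               seconds_per_pass * 1000.0, 1.0 / seconds_per_pass);
        return 0;
    }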

The new APU sports quad-channel DDR4, with up to 256GB per channel at speeds of up to 3.2GHz (DDR4-3200). There is no information yet on which processor socket this APU will use, but it's safe to assume the DDR4 support alone will render it incompatible with all of AMD's current motherboards. Support is also included for secure boot and AMD's encryption co-processor.
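
For comparison with the HBM above, the theoretical peak of the DDR4 side works out as follows (assuming standard 64-bit channels and treating the quoted 3.2GHz as DDR4-3200; peak figures only):

    /* Theoretical peak bandwidth of the quad-channel DDR4-3200 interface,
       compared against the HBM figure above. Peak numbers only. */
    #include <stdio.h>

    int main(void)
    {
        const double transfers_per_s = 3200e6;  /* DDR4-3200: 3200 MT/s per channel */
        const double bytes_per_xfer  = 8.0;     /* 64-bit channel                   */
        const int    channels        = 4;

        double ddr4_gbs = transfers_per_s * bytes_per_xfer * channels / 1e9;  /* ~102.4 GB/s */
        printf("DDR4 peak: %.1f GB/s vs HBM peak: 512 GB/s (%.1fx)\n",
               ddr4_gbs, 512.0 / ddr4_gbs);
        return 0;
    }

On these peak numbers the HBM pool offers roughly five times the bandwidth of the DDR4 interface, which is presumably why the GPU's working set lives there.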

 
  • (Score: 4, Interesting) by meisterister (949) on Tuesday April 14 2015, @10:45PM (#170606) Journal

    Plus flipping one to this. They seriously feel even worse than 3dfx did (marketing, marketing, marketing!) back in the day. The reason I bought an FX-8350 over a Haswell i7 (which had just come out at the time) was that it offered good compute performance at a good price. If faced with the same decision today, I still would've bought the 8350.

    Single-threaded performance hasn't been relevant since Core 2/K8 hit about 3GHz or so back in 2006, and Intel only has a minimal power consumption advantage due to its raw process advantage (which the other fabs are trying to whittle away at).

    The best advice that I'd have for a computer buyer now is to get anything that fits their budget and does what they need it to. If you want a compute workstation, go for the most cores/biggest FPUs at the lowest price. If you want a gaming box, get the biggest GPU you can afford and build around that. If you just want to do office work, pretty much anything made in the last 8-10 years will do!

    In order to bring my post from -1 Flamebait to at least 0 or +1, I think that one observation is in order:

    The media does an excellent job of making things sound either far better or far worse than they are. When reading reviews for all manner of computer hardware, I often leave with the conclusion that $SHINY_THING is either the second coming or the absolute worst piece of crap that has ever cursed the face of this planet. There is no acknowledgement of scale, namely that the supposedly huge increases usually don't amount to much in the real world. Will most people feel the difference between an 8350 and a 4770K? Not likely. Will they feel the difference between an R9 290 and a 980? Maybe, but if they have VSync on, it's all irrelevant anyway. Will they notice the move from 8GB to 16GB of RAM? Not unless they're doing scientific computing or virtualization.

    Can most people tell the difference between a Pentium D and a high end modern computer when dealing with text documents or checking their email? Hell no! Does one benchmark so much higher than the other that you would expect to notice a difference? Absolutely!

    --
    (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
  • (Score: 3, Insightful) by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Wednesday April 15 2015, @02:06AM (#170693) Journal

    This is what royally pisses me off about the blatant market manipulation: the average person does not have access to multiple chips, so they are often at the mercy of tech sites that 1) use rigged benchmarks, and 2) have their site paid for by Intel.

    Can you imagine any other field being able to get away with this? I mean they aren't even attempting to hide it anymore; they really do not give a flying fuck because they know they have a lock on the media and the benchmark companies. This isn't even the level of Gates and MSFT in the 90s. With them you needed internal memos or insiders to catch 'em, while with the way Intel is paying to rig the benches, all you have to do is grab a VIA CPU, change the CPUID to an Intel one, and gasp! The CPU suddenly scores 30% higher on the same tests!
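
    For context on the CPUID trick described above: the vendor ID is just a 12-byte string returned by CPUID leaf 0, and it is what software keys off when it special-cases Intel. A minimal sketch of reading it (assumes GCC or Clang on x86; purely illustrative):

        /* Read the CPUID leaf-0 vendor string -- the value a benchmark sees
           when it "checks for Intel". GCC/Clang on x86 assumed. */
        #include <stdio.h>
        #include <string.h>
        #include <cpuid.h>

        int main(void)
        {
            unsigned int eax, ebx, ecx, edx;
            char vendor[13] = {0};

            if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
                return 1;

            /* The 12-byte vendor string is packed into EBX, EDX, ECX in that order,
               e.g. "GenuineIntel", "AuthenticAMD", or "CentaurHauls" (VIA). */
            memcpy(vendor + 0, &ebx, 4);
            memcpy(vendor + 4, &edx, 4);
            memcpy(vendor + 8, &ecx, 4);

            printf("CPUID vendor string: %s\n", vendor);
            return 0;
        }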

    And you are 100% correct that nobody would know the difference between the chips except for the hit to the wallet. I've gotten to try out just about every Intel and AMD chip, and the difference? You'd need a stopwatch to notice. Hell, I've pitted Phenom X4s against Intel chips 3 generations newer and we are talking MAYBE mid single digits in time, and that is in a perfect run; on a lot of runs we're talking not even 2 minutes' difference, and that is a $50 chip against a $200+ one!

    This is why I tell folks that unless you are one of those who are literally pushing every core full bore for the majority of the day doing some sort of uber-heavy number crunching, or running some task where all day you are locked into a single thread, you are just wasting your money; that premium you hand over to Intel could be better spent on a better GPU, an SSD, more storage space, etc. Heck, my Phenom II X6 is going on 6 years old now, and even though it spends a lot of time playing games I haven't bothered to upgrade, because I have yet to run into any game I play that it cannot run well.

    Finally, for all the Intel fans I have a single simple question: if the Intel chips are so great, why are they spending all that money and risking getting busted by the EU to rig the benches? If their scores are so fucking great, why would they bother and put themselves at risk? Why are they throwing such huge advertising budgets at all the review sites if they are so far ahead? If you think about it even for a moment, there really is only one logical conclusion: it's because they know the price premium doesn't come anywhere close to being in line with the actual performance, so they rig the numbers to make them seem to justify their insane prices. It's no different than how they threw hundreds of millions at the OEMs to take the P4; they know that in a fair fight their chips wouldn't sell well enough to justify their price and they would have to lower prices to reflect their actual value, so they rig to keep the price inflated.

    If you win a fight fair and square? I'll be the first to trumpet your victory, no different than how I sold many a C2D and C2Q because those first gens were quite impressive. But I will never support a company rigging the market, as we've seen where this leads: insane prices, outrageous profits for the company, and consumers getting shafted. Even those who like Intel should want a fair fight, as a free market with real scores would bring Intel prices more in line with AMD's, and that would be a big win for consumers!

    --
    ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.