
posted by janrinok on Tuesday February 13 2018, @12:02AM   Printer-friendly
from the loads-of-power dept.

AMD has launched two desktop APUs with Ryzen CPU cores and Vega graphics. The $169 Ryzen 5 2400G is a 4 core, 8 thread APU with 11 graphics compute units. The $99 Ryzen 3 2200G has 4 cores, 4 threads, and 8 graphics compute units. Both have a 65 W TDP and support dual-channel DDR4-2933 RAM:

Despite the Ryzen 5 2400G being classified as a 'Ryzen 5', the specifications of the chip are pretty much the peak specifications that the silicon is expected to offer. AMD has stated that at this time no Ryzen 7 equivalent is planned. The Ryzen 5 2400G has a full complement of four cores with simultaneous multi-threading, and a full set of 11 compute units on the integrated graphics. This is one compute unit more than the Ryzen 7 2700U mobile processor, which has 10 compute units but is limited to a 15 W TDP. The 11 compute units of the 2400G translate to 704 streaming processors, compared to 640 SPs on the Ryzen 7 2700U or 512 SPs on previous-generation desktop APUs: an effective 37.5% increase from generation to generation of desktop APU, before factoring in the Vega architecture or the frequency improvements.
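Those SP counts follow directly from the GCN/Vega design of 64 streaming processors per compute unit, so they are easy to sanity-check (the 8-CU figure for previous-generation desktop APUs is assumed here from the 512 SPs quoted above):

```python
# GCN/Vega GPUs pack 64 streaming processors (SPs) into each compute unit (CU).
SPS_PER_CU = 64

ryzen_5_2400g = 11 * SPS_PER_CU  # 704 SPs
ryzen_7_2700u = 10 * SPS_PER_CU  # 640 SPs
prev_gen_apu = 8 * SPS_PER_CU    # 512 SPs

# Desktop APU generation-to-generation growth in raw SP count:
growth = (ryzen_5_2400g - prev_gen_apu) / prev_gen_apu
print(ryzen_5_2400g, ryzen_7_2700u, prev_gen_apu)  # 704 640 512
print(f"{growth:.1%}")  # 37.5%
```

Note that 512 to 704 SPs is a 37.5% jump in raw shader count alone, before counting Vega's architectural and clock-speed gains.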

The integrated graphics frequency defaults to 1250 MHz and the total chip TDP is 65 W. Maximum supported memory frequency will vary depending on how much memory is used and what type, but AMD lists DDR4-2933 as supported with one single-sided module per channel. Aside from the full set of hardware, the CPU frequency of the 2400G is very high, similar to the standard Ryzen 7 desktop processors: a base frequency of 3.6 GHz and a turbo of 3.9 GHz will leave little room for overclocking. (Yes, that means these chips are overclockable.)

The Ryzen 5 2400G more or less replaces the Ryzen 5 1400 at the $169 price point. Both chips will continue to be sold, but at this price AMD will promote the 2400G over the 1400. The 2400G has higher frequencies (3.6 GHz vs 3.2 GHz base, 3.9 GHz vs 3.4 GHz turbo), higher memory support (DDR4-2933 vs DDR4-2666), and no cross-CCX latency between sets of cores, but less L3 cache per core (1 MB vs 2 MB). In virtually every scenario, even if a user never touches the integrated graphics, the Ryzen 5 2400G looks like the better option on paper.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2, Touché) by ealbers on Tuesday February 13 2018, @12:05AM (1 child)

wow! cool, we can do math now! I remember FPUs for floating point; glad we can do arithmetic now!

    • (Score: 1, Funny) by Anonymous Coward on Tuesday February 13 2018, @12:20AM

      You're thinking of a different APU, TFA clearly references thank you, come again [wikipedia.org]

  • (Score: 2) by bob_super on Tuesday February 13 2018, @01:17AM (6 children)

    Since we can't find discrete GPUs at a decent price, it's nice to know that integrated graphics are getting a pretty big boost, to help push enough pixels for a 4k monitor or two.
Gaming? Ain't the target.

    • (Score: 3, Informative) by richtopia on Tuesday February 13 2018, @01:36AM (3 children)

      It is "good enough" gaming for most. The PS4 and Xbox One run Jaguar APUs, and the new 2400G looks comparable to the PS4's.

      PS4: The CPU consists of two quad-core Jaguar modules totaling 8 x86-64 cores.[47][48] The GPU consists of 18 compute units to produce a theoretical peak performance of 1.84 TFLOPS. (wikipedia)

      2400G: 4 CPU cores, 8 threads; 11 Vega cores for 1.76 TFLOPS
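Those TFLOPS figures line up with the standard peak-throughput formula for GCN-family GPUs: SPs × 2 FLOPs per clock (counting a fused multiply-add as two operations) × clock speed. A rough check, assuming the 2400G runs its GPU at the stock 1250 MHz and the PS4 at its 800 MHz GPU clock:

```python
def peak_tflops(compute_units, clock_mhz, sps_per_cu=64, ops_per_cycle=2):
    """Theoretical peak single-precision throughput for a GCN/Vega GPU.

    ops_per_cycle=2 counts a fused multiply-add as two floating-point ops.
    """
    return compute_units * sps_per_cu * ops_per_cycle * clock_mhz * 1e6 / 1e12

print(peak_tflops(11, 1250))  # Ryzen 5 2400G: ~1.76 TFLOPS
print(peak_tflops(18, 800))   # PS4: ~1.84 TFLOPS
```

These are peak theoretical numbers; sustained game performance also depends on memory bandwidth, where the PS4's GDDR5 has a clear edge over dual-channel DDR4.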

      Previously AMD enabled crossfire configurations between the APU and a dedicated GPU, and I'm unsure if that will be supported in this generation. If it is, there could be an upgrade path once dedicated GPU prices decline, although I suspect I would be wanting more than 4 CPU cores at that point.

      • (Score: 2, Insightful) by iru on Tuesday February 13 2018, @05:09AM

        > It is "good enough" gaming for most. The PS4 and Xbox One run Jaguar APUs, and the new 2400G looks comparable to the PS4's.

        On the other hand console games are highly optimized and tuned to their platforms. The same results can’t be expected on similar PC configurations.

      • (Score: 2) by tibman on Tuesday February 13 2018, @03:26PM (1 child)

I'm probably one of the few people who did the APU/GPU crossfire. It was really cool, but crossfire/SLI has fallen away from the mainstream. My only issue was that it could only crossfire with a very specific GPU. R7 270, I think? Back then graphics cards were much cheaper, and getting a slightly better graphics card would completely outperform an APU/GPU crossfire. But I love the innovation : )

        --
        SN won't survive on lurkers alone. Write comments.
        • (Score: 2) by takyon on Tuesday February 13 2018, @04:51PM

          crossfire/sli has fallen away from mainstream

          It should be coming back to the mainstream with Vulkan [wikipedia.org].

At the very least, the weaker GPU could be used for video decoding. For example, the new Intel chips with AMD graphics still include the Intel iGPU.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by takyon on Tuesday February 13 2018, @03:54AM

      It could enable gaming at a reasonable resolution (1080p) without a discrete GPU:

      https://arstechnica.com/gadgets/2018/02/desktop-ryzens-with-integrated-graphics-finds-an-awkward-middle-ground/ [arstechnica.com]

      The new parts don't offer the same sizable core and thread-count advantage. Rather, their big advantage comes from their GPU, with the Vega cores being faster than Intel's Gen 9 GPU cores. The benchmark results reflect this. For example, from Anandtech [anandtech.com], the AMD chips can manage around 30 frames per second at 1080p in Civilization VI, compared to a meager 10 fps from the Intel parts. In Grand Theft Auto V, the 2400G is just shy of 20 fps, to sub-5 fps for the Intel parts. From Tech Report [techreport.com], Dota 2 at 1080p manages 46 fps on the 2400G, compared to just 16 fps on an Intel system.

      If you want 60 FPS in modern titles, VR, or 4K, then you'll probably want a discrete GPU.

    • (Score: 2) by bobthecimmerian on Tuesday February 13 2018, @12:25PM

Tom's Hardware and pcper.com did a pretty thorough review, and for value-for-your-dollar these parts are pretty good. But an Nvidia GT 1030 isn't hard to find these days and beats or matches the better part at the GPU level. So if you're thinking long term, get a dedicated CPU and a GT 1030, and then later swap out the GT 1030 for something better.

      I read about it out of curiosity - my wife's desktop runs one of the older AMD CPU/GPU ('APU') parts and is just fine for what she does with it and Minecraft for the kids. I lost interest in gaming in my late 30s, though, so this part doesn't matter much to me. I'd rather get a Ryzen 7 and some bottom end GPU.

  • (Score: 0) by Anonymous Coward on Tuesday February 13 2018, @04:44AM

Interesting play: if AMD can negate the need for dedicated graphics, it could capture some much-needed market share. Perhaps even change the game entirely.
