
posted by martyb on Friday December 03 2021, @08:40AM
from the these-names-are-getting-terrible dept.

Snapdragon 8cx Gen 3 launched: First 5 nm Windows PC SoC with four Cortex-X1 cores at 3 GHz, 6 nm Snapdragon 7c+ Gen 3 announced too

Qualcomm today unveiled the Snapdragon 8cx Gen 3 and Snapdragon 7c+ Gen 3 Compute Platforms that will power the next wave of Windows-on-ARM Always On Always Connected PCs. The Snapdragon 8cx Gen 3 is the first 5 nm SoC for the PC and features a 4+4 CPU with Cortex-X1 and Cortex-A78 cores along with other Qualcomm connectivity and security features.

[...] With the Snapdragon 8cx Gen 3, PCs are all set to get their first taste of the 5 nm architecture albeit on ARM. According to Qualcomm, the Snapdragon 8cx Gen 3 is about 85% faster in CPU and 60% faster in GPU performance. The exact comparative parameters were not disclosed during the presentation, however.

The Snapdragon 8cx Gen 3 is also slated to offer 29+ TOPS of AI performance. Once again, the comparisons aren't really obvious, but we can hazard a guess that this could be with respect to the Snapdragon 8cx Gen 2.

Previously, ARM SoCs have had only a single Cortex-X core, with the exception of Google's Tensor, found in the Pixel 6, which has 2x Cortex-X1 cores.

Also at CNX Software.

See also: Qualcomm 8cx Gen 3: Too dangerous to deploy

Qualcomm Announces Snapdragon 8 Gen 1: Flagship SoC for 2022 Devices

At this year's Tech Summit from Hawaii, it's time again for Qualcomm to unveil and detail the company's most important launch of the year, showcasing the newest Snapdragon flagship SoC that will be powering our upcoming 2022 devices. Today, as the first of a few announcements at the event, Qualcomm is announcing the new Snapdragon 8 Gen 1, the direct follow-up to last year's Snapdragon 888.

The Snapdragon 8 Gen 1 follows its predecessors with a very obvious change in marketing and product naming, as the company attempts to simplify its naming and line-up. Still part of the "8 series", meaning the highest-end segment for devices, the 8 Gen 1 drops the previous three-digit naming scheme in favor of just a segment and generation number. For Qualcomm's flagship part this is straightforward, but it remains to be seen what it means for the 7 and 6 series, both of which have several parts per generation.

As for the Snapdragon 8 Gen 1, the new chip comes with a lot of new IP: We're seeing the new trio of Armv9 Cortex CPU cores from Arm, a whole new next-generation Adreno GPU, a massively improved imaging pipeline with lots of new features, an upgraded Hexagon NPU/DSP, integrated X65 5G modem, and all manufactured on a newer Samsung 4nm process node.

The Snapdragon 8 Gen 1 notably lacks AV1 decode.

See also: Qualcomm phones it in for the Snapdragon Series-8 Gen 1

Qualcomm announces the Snapdragon G3x Gen 1 Gaming Platform with a Razer developer kit

Qualcomm has chipsets for a ton of different devices, and an expansion to gaming was likely always on the cards. Obviously, its most famous chips are those that it makes for smartphones, but it also makes Snapdragon chips for wearables, extended reality (XR) devices, PCs, and even cars. The aim of the Snapdragon G3x Gen 1 gaming platform is to unite all of the Snapdragon Elite Gaming technologies into one cohesive product. It's a chipset built purely for gaming, and Qualcomm says that it's designed to be "the PC gaming rig of mobile games". It has updateable GPU drivers for game optimizations, true 10-bit HDR gaming, support for external displays up to 4K resolution at 144FPS, USB-C for future XR accessories, and supports game streaming from the cloud, from your PC, and from your console. It has support for Qualcomm's 5G mmWave Modem-RF system too.

Given the proliferation of gaming on Android, Qualcomm has said that for now it's focused exclusively on providing this chipset to Android devices. As a result, it doesn't look like this will turn into an NVIDIA Tegra/Nintendo Switch competitor just yet. Even so, this is the company's first real push into the gaming market, and it has the potential to grow into a whole lot more in the future. Qualcomm didn't go too in-depth on the new chipset's capabilities, though given that it designed a developer kit in tandem with Razer, it clearly has an idea of the direction it wants to push this in. We're not entirely sure whether the G3x is much faster than the Snapdragon 8 Gen 1, but we'll probably find out more in the near future.


Original Submission

Related Stories

Samsung Announces Exynos 2200 SoC with AMD RDNA2 Graphics

Samsung announces Exynos 2200 with AMD "Xclipse" GPU

Now, the Exynos 2200 is finally official. The headline feature is a new "Samsung Xclipse 920 GPU" that was co-developed by AMD. Samsung says the GPU uses AMD's RDNA 2 architecture, the same as AMD's Radeon desktop GPUs, and will bring "hardware-accelerated ray tracing" to mobile devices.

David Wang, the SVP of AMD's Radeon division, said, "Samsung's Xclipse GPU is the first result of multiple planned generations of AMD RDNA graphics in Exynos SoCs." Previous reports have indicated that Samsung isn't just eyeing smartphones but eventually wants to put together an Apple M1-fighting ARM laptop chip.

The CPU is about what you would expect from a 2022 ARM chip. The 4 nm SoC has one Cortex X2 CPU for single-threaded performance, three Cortex A710 cores, and four low-power Cortex A510 cores, just like Qualcomm's 2022 chip, the Snapdragon 8 Gen 1. These are all new ARM v9 cores, with the X2 and little cores both being 64-bit only.

Despite finally announcing the Exynos 2200, Samsung has not put to bed questions about the chip's troubled development. The press release and product site both lack many of the details typically disclosed at this point. For instance, Samsung has made no performance claims about the Exynos 2200's CPU or GPU. Read through the Exynos 2100 press release from this time last year and you'll see claims like 30 percent better multi-core CPU performance and 40 percent faster graphics.

Leaks have pointed to thermal issues with the Exynos 2200 which could potentially lead to lower performance than its main competitors: Qualcomm's Snapdragon 8 Gen 1, MediaTek's Dimensity 9000, and Apple's A15.

Also at The Verge, SamMobile, and Bloomberg.

Related: Samsung Ends Development on Custom ARM Cores, Signals Layoffs at Austin, Texas R&D Center


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 0, Offtopic) by Frigatebird on Friday December 03 2021, @10:34AM (1 child)

    by Frigatebird (15573) on Friday December 03 2021, @10:34AM (#1201790)

    I have a bad feeling about this!

    • (Score: 0) by Anonymous Coward on Friday December 03 2021, @03:36PM

      by Anonymous Coward on Friday December 03 2021, @03:36PM (#1201820)

      Having recently completed work on my time machine, I forwarded this news summary back in time 100 years. I may have accidentally triggered the roaring twenties.

  • (Score: 2) by bzipitidoo on Saturday December 04 2021, @01:37AM (3 children)

    by bzipitidoo (4388) on Saturday December 04 2021, @01:37AM (#1202021) Journal

    Woof, Moore's Law not only won't die, it's stronger than ever. Seems like every year, it's a new generation that has at least 40% more performance. My 2017 hardware is now 0.71^4 = 26% of the performance of new processors.

    Been thinking about getting a new PC. One thing I see is that video cards are insanely expensive, when they're not just plain sold out. Used to be, $200 got you a good video card. Now they're $1000, or higher. I was hoping cryptomining mania would have waned a bit, but, evidently not. So, I'll stick with the integrated graphics.
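The compounding arithmetic above can be checked in a few lines (a sketch; the uniform 40%-per-generation gain is the commenter's assumption, not a measured figure):

```python
# Relative performance of old hardware after several generations of
# compounding yearly gains. A +40%/year gain means each generation is
# 1.4x the previous, so old hardware falls to (1/1.4)^n of current.
def relative_performance(annual_gain: float, years: int) -> float:
    return (1.0 / (1.0 + annual_gain)) ** years

# Four generations at +40% each: 2017 hardware sits at ~26% of current.
print(round(relative_performance(0.40, 4), 2))  # → 0.26
```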

    • (Score: 4, Interesting) by takyon on Saturday December 04 2021, @03:31AM (2 children)

by takyon (881) on Saturday December 04 2021, @03:31AM (#1202034) Journal

      Woof, Moore's Law not only won't die, it's stronger than ever. Seems like every year, it's a new generation that has at least 40% more performance. My 2017 hardware is now 0.71^4 = 26% of the performance of new processors.

The Snapdragon 8cx Gen 3's claimed gains are misleading, since its two predecessors were overpriced and not very fast. But now they've crammed in a whopping 4x Cortex-X1 cores to increase performance at the expense of die space and power efficiency. To be fair, that's what you want when moving from phone to laptop.

      I'm also wondering what your 2017 hardware is, because I doubt today's processors (even if they are Arm mobile) are four times faster. Arm's recent roadmaps are forecasting lower gains.

      3D packaging and monolithic 3D should give us some huge boosts in the future, so it might feel like the 90s again.

      Been thinking about getting a new PC. One thing I see is that video cards are insanely expensive, when they're not just plain sold out. Used to be, $200 got you a good video card. Now they're $1000, or higher. I was hoping cryptomining mania would have waned a bit, but, evidently not. So, I'll stick with the integrated graphics.

      The Cezanne 5700G [slickdeals.net] has been pretty cheap lately. Rembrandt would be a huge improvement if there is a desktop model next year, but it would be DDR5 only. Hopefully, DDR5 prices crash by then to something more reasonable like +10% over DDR4.

      It will probably take until 2023 for the GPU market to normalize, although AMD/Nvidia would prefer if price points were permanently heightened. Intel should be bringing a sub-$200 GPU to market soon, although that will still be expensive compared to pre-2020 prices.

      I hear that gamer demand is still responsible for the majority of GPU sales, not cryptomining mania. Not that crypto helps things.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by bzipitidoo on Monday December 06 2021, @04:27AM (1 child)

        by bzipitidoo (4388) on Monday December 06 2021, @04:27AM (#1202447) Journal

My stuff is low-end, low-power Skylake and Kaby Lake, a 6260U and a 7100U. I also still have my late-2000s-era Phenom II X4 945, with a Radeon HD 5450. I tried to hold out for Ryzen, but a need for SSE4 pushed me into buying before it was released. Interestingly, one of the video cards available, for just $50, is a Radeon HD 5450! Huh, thought that would've vanished by 2015. Must be really hurting for any video card at all to dredge up stuff that old. Integrated graphics has surpassed it, so what reason could there be to want one? For a CPU that doesn't have integrated graphics, or a multiple monitor set up, or, virtual machines?

        Didn't expect gaming demand to be the main reason, but perhaps it's the pandemic? Everyone staying in a lot more, playing more video games? I was guessing it was cryptomining and the chip shortage.

        Anyway, one of my motivations is just how dreadfully long it takes to do AV1 encoding. Takes an hour to make 10 seconds of video! Have tried libaom 3.2, which they warn is very slow, and rav1e. The latter is faster, but not anywhere close to real time, still takes most of an hour. I do not have a handle on what the dozens of options do, just going with default settings for the most part, but no doubt some trade quality or size for much greater encoding speed.
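For reference, both encoders expose a speed knob that trades quality and size for much faster encoding (the flag names are the encoders' real options; the input and output filenames are placeholders):

```shell
# libaom via ffmpeg: -cpu-used ranges from 0 (slowest, best quality)
# to 8 (fastest); the default sits at the slow end, which is why
# untouched settings take so long. -row-mt 1 enables row-based threading.
ffmpeg -i input.mkv -c:v libaom-av1 -crf 30 -cpu-used 6 -row-mt 1 output.mkv

# rav1e standalone: --speed ranges from 0 (slowest) to 10 (fastest).
rav1e input.y4m --speed 8 --output output.ivf
```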

        • (Score: 2) by takyon on Monday December 06 2021, @05:20AM

by takyon (881) on Monday December 06 2021, @05:20AM (#1202457) Journal

          For a CPU that doesn't have integrated graphics, or a multiple monitor set up, or, virtual machines?

          The entire Ryzen non-APU lineup and the Intel 'F' models don't have integrated graphics. But the HD 5450 is an absolutely heinous choice, wow.

          I know that some people get budget GPUs for extra display outputs, but iGPUs should technically be fine if a sufficient number of output ports are provided. According to ARK [intel.com], the lowly N4020 found in some SBCs supports up to 3 displays and 4K60 resolution out of DisplayPort.

          Didn't expect gaming demand to be the main reason, but perhaps it's the pandemic? Everyone staying in a lot more, playing more video games? I was guessing it was cryptomining and the chip shortage.

          And not only that, gaming was becoming much more mainstream and massive even before the pandemic. It's huge.

          Videogames are a bigger industry than movies and North American sports combined, thanks to the pandemic [marketwatch.com]

          Anyway, one of my motivations is just how dreadfully long it takes to do AV1 encoding. Takes an hour to make 10 seconds of video! Have tried libaom 3.2, which they warn is very slow, and rav1e. The latter is faster, but not anywhere close to real time, still takes most of an hour. I do not have a handle on what the dozens of options do, just going with default settings for the most part, but no doubt some trade quality or size for much greater encoding speed.

          I have glanced at posts about libaom and rav1e on Phoronix. Here they are getting as low as 6 FPS encoding 4K with a 12-core 5900X [phoronix.com], but possibly near real-time with 1080p ("two-pass" encoding mode).

          I don't have any specific advice, but I hope that you actually have a need to encode stuff in AV1 at this time.

          Also, it should be fun to see how long AV2 [reddit.com] takes to decode when it hits the scene.
