
posted by martyb on Sunday May 26 2019, @12:48PM
from the what's-in-a-name? dept.

AMD's RX 3080 has been rumoured for quite some time. The name is designed to one-up Nvidia's RTX 20XX series in a literal sense, copying the tactic of the company's CPU division, which released its Ryzen-based X370 platform to compete with Intel's Z270 offerings.

The idea is simple: you see two products on a shelf and you look at the numbers. X370 must be better than Z270; the number is bigger, right? It's a simple marketing tactic, and it makes sense for AMD to reuse it in the graphics card market. AMD's naming schemes have moved from RX 580 to RX Vega 64 to Radeon VII; it's not as if AMD has a defined branding scheme to follow in the GPU market anymore, so why not piggyback on Nvidia? Nvidia even went to the effort of changing GTX to RTX at the high end, practically begging to be confused with AMD's established RX graphics lineup.

[...] Now, it looks like Nvidia wants to stop AMD's games, with recent trademark applications showing that Nvidia claims ownership of the numbers 3080, 4080 and 5080, at least within the world of PC graphics. This move appears to be Nvidia's attempt to stop Radeon from calling its next graphics card the RX 3080, a name which would inevitably cause confusion when Nvidia releases its RTX/GTX 30XX series, which should include a model called the RTX 3080.

https://www.overclock3d.net/news/gpu_displays/nvidia_hopes_to_block_amd_s_rx_3080_with_a_new_trademark/1


Original Submission

 
  • (Score: 3, Interesting) by opinionated_science (4031) on Sunday May 26 2019, @07:43PM (#847955) (2 children)

    Anyone got a way of using CUDA/TensorFlow/PyTorch on AMD GPUs?

    This is probably the only way we'll get some competition, as Nvidia is *crushing* the industry in machine learning, modelling, etc...

    AMD could have the best hardware on the planet, but if it can't be used without a load of software angst, it doesn't really help...
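
    For what it's worth, AMD's ROCm stack is the usual route: TensorFlow and PyTorch both have ROCm-enabled builds (official support has varied by release), and PyTorch's ROCm build reuses the torch.cuda namespace, so CUDA-targeted scripts mostly run unchanged. A minimal sketch, assuming a ROCm (or CUDA) build of PyTorch is installed; the pick_device helper below is illustrative, not a library API:

        import torch

        def pick_device() -> torch.device:
            """Return a GPU device if one is visible, else fall back to CPU."""
            # On ROCm builds of PyTorch, AMD GPUs are exposed through the
            # same torch.cuda namespace used for Nvidia hardware, so this
            # one check covers both vendors.
            if torch.cuda.is_available():
                return torch.device("cuda")
            return torch.device("cpu")

        device = pick_device()
        # torch.version.hip is non-None on ROCm builds; torch.version.cuda
        # is non-None on CUDA builds. Both are None on CPU-only builds.
        print("device:", device)
        print("hip:", getattr(torch.version, "hip", None))
        print("cuda:", torch.version.cuda)

        # Run a small matmul on whichever backend was selected.
        x = torch.randn(1024, 1024, device=device)
        y = x @ x
        print("result shape:", tuple(y.shape))

    The same script runs unmodified on Nvidia hardware, which is the point: the vendor difference shows up in device selection and driver setup, not in the model code.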

  • (Score: 0) by Anonymous Coward on Sunday May 26 2019, @08:19PM (#847974)
  • (Score: 2) by takyon (881) on Monday May 27 2019, @12:38AM (#848045) Journal

    Nvidia will get discrete GPU competition (including datacenter/machine learning) from Intel starting as soon as next year.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]