AMD announced its first RDNA 2 (Radeon RX 6000 series) gaming GPUs during a live stream (24m42s) on October 28.
AMD originally planned for RDNA 2 to have 50% more performance per Watt than GPUs using the RDNA 1 microarchitecture. Now, AMD is claiming 54% more performance per Watt for the RX 6800 XT and RX 6800, and 65% more performance per Watt for the RX 6900 XT. Part of the efficiency gain is due to the use of "Infinity Cache", similar to the L3 cache found in Ryzen CPUs. This allowed AMD to use a 256-bit memory bus with 2.17x the effective memory bandwidth of a 384-bit bus, while using slightly less power.
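As a rough illustration of how a large on-die cache can multiply effective bandwidth, here is a back-of-the-envelope model. The bus widths and the 16 Gbps GDDR6 data rate are public figures; the cache bandwidth and hit rate below are made-up round numbers for illustration, and this is not AMD's methodology.

```python
# Illustrative model: effective bandwidth when a fraction of memory
# accesses hit a fast on-die cache instead of going out to DRAM.

def raw_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak DRAM bandwidth in GB/s for a bus width and per-pin data rate."""
    return bus_width_bits * data_rate_gbps / 8

def effective_bandwidth(dram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    """Blend cache and DRAM bandwidth by hit rate (simple weighted model)."""
    return hit_rate * cache_gbs + (1 - hit_rate) * dram_gbs

gddr6_256 = raw_bandwidth_gbs(256, 16)  # 512.0 GB/s
gddr6_384 = raw_bandwidth_gbs(384, 16)  # 768.0 GB/s

# Assumed cache bandwidth (2000 GB/s) and hit rate (58%) are hypothetical.
eff = effective_bandwidth(gddr6_256, cache_gbs=2000.0, hit_rate=0.58)
print(f"{eff:.0f} GB/s effective, {eff / gddr6_384:.2f}x a plain 384-bit bus")
```

Even with modest assumed numbers the multiplier lands well above 1x, which is the mechanism behind AMD's 2.17x claim: most hits are served from cache, so the narrow bus matters less.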
The RX 6900 XT ($1000) has performance comparable to Nvidia's RTX 3090, with a total board power (TBP) of 300 Watts. The RX 6800 XT ($650) is comparable to Nvidia's RTX 3080, also with a 300 Watt TBP. The RX 6800 ($580) is around 18% faster than Nvidia's RTX 2080 Ti, with a 250 Watt TBP. All three of the GPUs have 16 GB of GDDR6 VRAM and 128 MB of "Infinity Cache".
The 6800 XT and 6800 will be available starting on November 18, while the 6900 XT will be available on December 8.
Also at Tom's Hardware, Phoronix, Ars Technica, and Guru3D.
Previously: Nvidia Announces RTX 30-Series "Ampere" GPUs
AMD Announces Zen 3 CPUs
Related Stories
Nvidia has announced its latest generation of gaming-oriented GPUs, based on the "Ampere" microarchitecture on a customized Samsung "8nm" process node.
The GeForce RTX 3080 ($700) has 10 GB of GDDR6X VRAM and will be released on September 17. TDP is up significantly, at 320 Watts compared to 215 Watts for the RTX 2080. The GeForce RTX 3070 ($500) has 8 GB of GDDR6 and a TDP of 220 Watts. The GeForce RTX 3090 ($1500) is the top card so far with a whopping 24 GB of GDDR6X VRAM. The GPU is physically much larger than the other two models and it has a 350 Watt TDP.
Nvidia's performance benchmarks should be treated with caution, since the company often uses ray-tracing and/or DLSS upscaling in its comparisons. But the RTX 3070 will outperform the RTX 2080 Ti at less than half the launch price, as it has 35% more CUDA cores at higher clock speeds.
Nvidia also announced some new features such as Nvidia Reflex (4m53s video), Broadcast, Omniverse Machinima, and RTX IO. Nvidia Broadcast includes AI-derived tools intended for live streamers. RTX Voice can filter out background noises, greenscreen effects can be applied without the need for a real greenscreen, and an autoframing feature can keep the streamer centered in frame while they are moving. Nvidia RTX IO appears to be Nvidia's response to the next-generation consoles' use of fast SSDs and dedicated data decompression.
NVIDIA GeForce RTX 30 Series | Official Launch Event (39m29s video)
Previously: Micron Accidentally Confirms GDDR6X Memory, and Nvidia's RTX 3090 GPU
AMD announced its first Zen 3 (Ryzen 5000 series) desktop CPUs on October 8.
Compared to Zen 2 (Ryzen 3000 series) CPUs, the Zen 3 microarchitecture has higher boost clocks and around 19% higher instructions per clock. A unified core complex die (CCD) allows 8 cores to access up to 32 MB of L3 cache, instead of two groups of 4 cores accessing 16 MB each, leading to lower latency and more cache available for any particular core. TDPs are the same as the previous generation, leading to a 24% increase in performance per Watt.
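The quoted numbers hang together with simple arithmetic: performance scales roughly as IPC times clock, and performance per watt is performance over TDP. A minimal sketch, where the 19% and 24% figures are from the announcement and the implied clock/other uplift is inferred, not quoted:

```python
# Perf ~ IPC x clock; perf/W = perf / TDP. With TDP unchanged, a 24% perf/W
# gain implies 24% more performance, of which ~19% comes from IPC.
ipc_gain = 1.19                 # ~19% higher IPC (quoted)
perf_per_watt_gain = 1.24       # 24% better perf/W at the same TDP (quoted)

perf_gain = perf_per_watt_gain  # TDP unchanged, so perf/W gain == perf gain
implied_clock_gain = perf_gain / ipc_gain

print(f"implied clock/other uplift: {(implied_clock_gain - 1) * 100:.1f}%")
# -> about 4.2%
```

In other words, the modest boost-clock bumps in the list below account for the gap between the 19% IPC figure and the 24% perf-per-watt figure.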
AMD estimates a 26% average increase in gaming performance at 1080p resolution, with the Zen 3 CPUs beating or tying Intel's best CPUs in most games.
Ryzen 9 5950X, 16 cores, 32 threads, boosts up to 4.9 GHz, 105W TDP, $800.
Ryzen 9 5900X, 12 cores, 24 threads, boosts up to 4.8 GHz, 105W TDP, $550.
Ryzen 7 5800X, 8 cores, 16 threads, boosts up to 4.7 GHz, 105W TDP, $450.
Ryzen 5 5600X, 6 cores, 12 threads, boosts up to 4.6 GHz, 65W TDP, $300.
You may have noticed that these prices are exactly $50 more than the launch prices for the Ryzen 3000 equivalents released in 2019. The 5600X is the only model that will ship with a bundled cooler.
The CPUs will all be available starting on November 5. AMD will stream an announcement for its RX 6000 series of high-end GPUs on October 28.
See also: AMD Zen 3 Announcement by Lisa Su: A Live Blog at Noon ET (16:00 UTC)
AMD Teases Radeon RX 6000 Card Performance Numbers: Aiming For 3080?
Previously: AMD's Zen 3 CPUs Will Not be Compatible with X470, B450, and Older Motherboards
AMD Reverses BIOS Decision, Intends to Support Zen 3 on B450 and X470 Motherboards
AMD Launching 3900XT, 3800XT, and 3600XT Zen 2 Refresh CPUs: Milking Matisse
AMD Zen 3, Ryzen 4000 Release Date, Specifications, Performance, All We Know
(Score: 2, Interesting) by jasassin on Thursday October 29 2020, @12:32AM (23 children)
I know the 2080ti was massively overpriced compared to this generation, but people should stop buying this shit so we can get a good video card for under $300. $300 sounds about right for a 3090. For $1000 I'd rather have a beater backup car.
I really wish the days of $1000 cell phones and video cards would go the way of the buggy whip.
Oh well, I guess I'm just a buster.
jasassin@gmail.com GPG Key ID: 0xE6462C68A9A3DB5A
(Score: 2, Insightful) by Anonymous Coward on Thursday October 29 2020, @01:23AM
I believe the memory alone on the 3090 costs more than $300, but that would be nice ;)
(Score: 3, Insightful) by Immerman on Thursday October 29 2020, @02:01AM (3 children)
I'm a bit out of touch, but when was the last time we could get a good video card for under $300? And what does that price translate to in today's dollars?
I know far too often I find myself thinking "$X is way too much for Y, what happened?" And then I realize 10 to 20 years of inflation happened, and my money is worth only a fraction of what it used to be.
(Score: 3, Interesting) by jasassin on Thursday October 29 2020, @02:39AM (1 child)
I'm totally with you on the out of touch thing. Back in the day it was a Voodoo 2 for $320. It blew the hell out of a Voodoo 1 and was the first, and only, card I spent $300 on. I remember installing it and loading up Quake and almost pissing my pants.
Anecdotally, since then I've bought an Asus EAH Radeon 3450 (new at the time) for $45, and recently (maybe around April or May), after using an Nvidia 9800 GT a friend gave me about 8 years ago, I bought an Nvidia GTX 1650 Super. It was $165, but the main reason I bought it was that I got $50 off for signing up for an Amazon store card, another $60 off for signing up for an Amazon Prime Visa card, and free shipping for signing up for a free month of Amazon Prime. So I got it for a little over $60.
I closed the store account about two weeks after purchase and canceled Amazon Prime two days before the 30 day trial ended. I kept the Visa card to use with my friend's Costco gas card because the pumps don't take Discover. (Yes, I'm frugal.)
(Score: 2) by Immerman on Thursday October 29 2020, @03:03AM
Oh, yeah, the good old Voodoo2 - that was the first "real" 3D card I bought, after getting burned by one of the competing "3D decelerators" in the first generation. Cost over $600 in today's dollars.
I've had a few second-hand upgrades since then, but I'm not sure when the last time I bought a new video card was... well before Bitcoin, and GPU-based supercomputing hadn't yet caught on enough to affect prices. I don't think the Wii had even come out yet. And even then $300 only got you a solid mid-range card. I don't even remember what I bought anymore, but I think it cost around $200, and that was around the time I decided I was no longer interested in chasing high-performance PC gaming. I should probably buy something new soon - at the very least something that supports a 4K desktop natively rather than requiring a 30Hz driver hack with a decent chance of being reverted any time Windows does a major update.
(Score: 0) by Anonymous Coward on Thursday October 29 2020, @03:13AM
GPUs are cheap as hell, just not the newest and best. Navi 10 was less than $300 when it was brand new though, and it will still play any game you throw at it on max settings.
(Score: 2) by EJ on Thursday October 29 2020, @04:16AM (1 child)
Wrong. If people stop buying this stuff, then they will stop making it.
The only reason we're seeing the advancements we are is because people are willing to pay for them. These cards are also helping to advance the power of consoles, because the research that goes into making PC cards carries over to console technology.
If you don't want to pay for the top of the line, then don't. Buy the $300 tier cards.
You just want to have a car that goes 331 MPH without paying the $1.9M it costs to have a Tuatara.
(Score: 2) by jasassin on Thursday October 29 2020, @04:42AM
They might stop making $1000 cards, but it's not like they are going to throw their hands in the air and close up shop. They'd make more realistically priced consumer cards.
I understand where you're coming from, really I do. I think the car analogy is interesting because owning a car that can go 330MPH with 75MPH speed limits is not quite the same as a video card that can run a flight sim on max settings at 4K at 144FPS. At least you could push the card to the limit all the time with no legal or bodily worries... but still... a $1000 video card, in my mind, is an almost obscenely overpriced object.
While we're at it, why don't they mass-produce Lamborghinis and sell them at the same profit margins as other cars? It's the Apple/name/status symbol thing. It's not really because people are driving 300MPH.
If they sold gullwing cars that looked like a Lamborghini but had the engine of a Honda Civic, would they sell like hotcakes, or is it really the horsepower people care about?
(Score: 2) by takyon on Thursday October 29 2020, @05:50AM (13 children)
Inflation is a factor, and the BOM cost is pretty high for a 3090, probably at least $700 or so. One of the components making it expensive is the cooler. GDDR6X is also the newest and most expensive memory, and probably a strategic mistake given how power-hungry it is.
https://www.mooreslawisdead.com/post/nvidia-s-ultimate-play [mooreslawisdead.com]
Intel rejoining the market should provide needed competition, and they plan to start at the low/mid-range first.
https://www.notebookcheck.net/More-Intel-Xe-HPG-gaming-GPU-specs-and-performance-info-leak-out.498703.0.html [notebookcheck.net]
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by fakefuck39 on Thursday October 29 2020, @06:51AM (10 children)
lol. yeah, this is the summary I'd expect from someone approaching it from the point of view of "I'm sitting at my house."
There are companies buying GPU farms for all kinds of data mining, predictive analytics, VDI, high-frequency trading, etc. They don't care if this card costs $1k or $10k - they're spending $2 mil on the farm. And there are many of them. These go into racks with a few hundred Xeon cores, a few hundred Nvidia cards, and a Mellanox backend that the all-flash SAN and NAS all plug into. Each rack costs $10-20 mil, and people are buying them left and right. I know, because I design and sell it.
These nvidia cards are a terrible choice for someone sitting at their house playing a game. And just like an iphone is a terrible choice and value for someone who needs a smartphone, people with extra cash will still buy them. wait till you find out the specs and the cost of a shitty underpowered mainframe.
(Score: 2) by EvilSS on Thursday October 29 2020, @08:24PM (9 children)
The 3090 card is crazy for a gamer unless you also do a lot of video rendering (and do it for money). The 3080 and 3070 are more aimed at gamers. The 3070 is actually a decent card when you compare its performance and price to the prior-gen 2080 Ti (which was a massive rip-off for anyone gaming). I'm sure the majority of gamers will end up with the 3060 when it is released (and has enough stock built up that people can actually buy it). The x060 Nvidia cards are usually the most popular in the Steam surveys.
(Score: 2) by fakefuck39 on Thursday October 29 2020, @09:29PM (8 children)
"For instance you need to buy per-user licenses to use them for things like VDI"
I have no idea what you're talking about. I literally sell this, daily. VDI solutions like Citrix need per-user licenses - to use Citrix, regardless of what hardware you're on. You absolutely don't need a per-user license, or any license, for the hardware.
But it's cute you say if I never worked with DC GPU farms that I should check it out. I did. Over 10 years ago when I moved from implementation to architecture and started designing and selling the solutions.
Although I admit, maybe if you buy something from nvidia, maybe they do have the weird restrictions for licensing. I mostly sell these inside of Dell servers, so the bom is dell, and it includes no licensing for the GPU, because none is needed.
(Score: 2) by EvilSS on Friday October 30 2020, @12:31AM (7 children)
As for the datacenter licensing, again, you really need to take a sabbatical and update your industry knowledge: https://www.datacenterdynamics.com/en/news/nvidia-updates-geforce-eula-to-prohibit-data-center-use/ [datacenterdynamics.com]
Oh, and for the record, I also literally sell this stuff daily, and have for over 20 years for multiple Citrix platinum partners. If you sold me a solution for VDI and vGPUs I'd be pissed when the cards won't work because you didn't tell me I needed to buy licenses for them.
(Score: 2) by fakefuck39 on Friday October 30 2020, @05:51AM (6 children)
I don't need to read any of that. Here's what I do:
go to https://www.dellemc.com/partner/en-us/auth/sales/solution-configurator.htm [dellemc.com]
log in to my VAR account.
select servers or HCI
launch configurator for the compute model
add nvidia GPUs, pick ram, disk, cpu/etc.
add per-GPU/vGPU licenses, grid if needed. Each license is tied to a GPU. I can select annual or perpetual. Yes, I have the option to select per concurrent user as well. I have not had a single customer ask me to do that. Ever.
then I add prosupport or prosupport plus
click validate, generate bom, apply discounts the customer has
click build quote
I have no idea what that datacenter dynamics site is. Maybe you're talking about buying something from nvidia directly.
Point is, you're saying it's not possible, when I literally just checked, right now, before making this post. What's annoying is that your way is absolutely an option. It's also absolutely not the only option, and you're completely full of crap, condescendingly telling the guy who builds these quotes to "read up." That's just funny. Whoever told you this (and you're clearly a customer, not a vendor), you should find a new VAR. As a customer, you have literally never logged in to build the quote, and you're here telling people how the quotes work. That's just funny.
(Score: 2) by EvilSS on Friday October 30 2020, @11:29AM (5 children)
Honestly at this point I have to think you are just a troll or a liar about what you actually do. This has all been known by people who actually work in the industry for years now. And, again, I've been doing this for over 20 years, and never as a customer.
(Score: 2) by fakefuck39 on Friday October 30 2020, @12:02PM (4 children)
I get it. You're talking about purchasing it from nvidia. I am not. I am talking about purchasing it from a server vendor. Do you understand that? Do you know what vmw horizon is? Heck, I might even be able to quote out a ucs from cisco w/o per-user, but that's a slow one to try and I'm not going to spend the time.
You're talking about vcomputeserver. you can run some vdi on that too if you want btw.
Now, maybe we have a misunderstanding here too. The per-GPU license package does cut out at 96vPCs or some high number of vApps. There is another in the dropdown that's up to 192, costing about a third more. If I only add 4 GPUs, I get the option for 64 that's cheap.
That does not make it a per-user license. That's the max for the package. Like when you sign up for a comcast account and you get a max of 6 static IPs. That doesn't mean their package is a cost for each IP.
You know how your phone plan is 10gig/month max usage? That doesn't make it a per-byte price plan.
I'm not a troll, buddy. I'm the guy competent customers buy things from. Now I'll admit, VDI is not my shtick. I'm mostly storage and compute. But I do build some for VDI every quarter. When you say you've never been a customer and you sell this stuff, then in other places you say things only a customer would say:
"If you sold me a solution for VDI and vGPUs I'd be pissed when the cards won't work because you didn't tell me I needed to buy licenses for them."
that makes me think you lied so much you forgot what lies you told. sold you a solution? so you're buying them. strange, now you say you're selling them. why don't you make up your mind about what you do, then get back to us.
(Score: 2) by EvilSS on Friday October 30 2020, @12:42PM (3 children)
There is no option to do VDI without a CCU license from NVidia. None. You really need to contact your Dell rep to have them go over this with you. And again, it does not matter where you buy the license from, for example:
https://docs.netapp.com/us-en/hci-solutions/hcvdivds_nvidia_licensing.html [netapp.com]
https://www.cisco.com/c/dam/en/us/solutions/collateral/data-center-virtualization/desktop-virtualization-solutions-vmware-horizon-view/whitepaper_c11-739654.pdf [cisco.com]
https://h20195.www2.hpe.com/v2/GetDocument.aspx?docname=a00059765enw [hpe.com]
That was a hypothetical. I have never, in my career, been employed by a end-user company. I've been implementing, selling, and architecting Citrix and VMWare VDI solutions since the Winframe days.
(Score: 2) by fakefuck39 on Friday October 30 2020, @10:48PM (2 children)
Yes, and the CCU license comes in a base package that sets a max, usually 96 or 192 CCUs. That does not make it a per-user license. Just like your 10GB/month plan does not make it a per-byte plan. What is it you're not getting here?
When you buy a 5-seater car, that is the max you can fit in the car. It's not a car that charges you per seat. I'm really boggled by the mental gymnastics you're doing here. Yes, you can buy GRID licenses per user. But it is not limited to that - you buy a package with a max CCU count included. That is not a per-user license. A per-user license is about $350, per user. That is not the only option, nor a common option.
Seriously, what is it you do not understand here?
(Score: 2) by EvilSS on Saturday October 31 2020, @03:46AM (1 child)
(Score: 2) by fakefuck39 on Saturday October 31 2020, @10:47AM
and now you lose. backed into a corner, instead of addressing the actual points, you have nothing.
btw, I don't believe you for a second from the way you talk that you build or sell anything. you're a customer, and someone from a VAR showed you a configurator once, then you read some pdfs. you say you quote out UCS. that's a great test, since google won't give you the answer. to build a UCS config and quote it out, what is the url of the cisco site?
nice try, linux admin at midsize commercial account.
(Score: 0) by Anonymous Coward on Thursday October 29 2020, @08:45PM (1 child)
*sigh* ... if only they wouldn't use sooo much power. Sheesh, >300 Watts?
maybe wait for the die-shrink? else i am gonna go spend the money on 2 square meters of silicon that MAKE energy instead of consuming it? or so sayZ alyx in half life 3 ^_^
(Score: 2) by takyon on Friday October 30 2020, @01:04AM
Technically, they did die shrink. But they allegedly tried to play hardball with TSMC and got shafted onto Samsung's shitty "8nm" ("10nm" derivative) node instead. Then they put fast but hot GDDR6X on top of that.
(Score: 0) by Anonymous Coward on Thursday October 29 2020, @12:33PM
The tech market, like the appliance or car market, always has halo products at absurd prices for rich stupid people. Then in a few years that level of features is mainstream. So you and I won't buy these new. But in ten years the equivalent performance will cost $250 (well, not counting inflation). This is the same reason I love the Tesla Model S - I can't afford one and I dislike Elon Musk. But the existence of the Model S means it's exceedingly likely that $20k long range electric family vehicles will exist eventually.
(Score: 3, Interesting) by sjames on Friday October 30 2020, @02:10AM
It seems backwards to me. We used to have $1500 PCs with $50 video cards...
(Score: 2) by fadrian on Thursday October 29 2020, @01:41AM (2 children)
The 6800 XT and 6800 will be available starting on November 18, while the 6900 XT will be available on December 8.
Given AMD's normal lack of capacity, when will people be able to get them without paying scalpers' prices? My Magic Eight Ball says "Better not tell you now."
That is all.
(Score: 3, Interesting) by EvilSS on Thursday October 29 2020, @02:21AM
(Score: 2) by takyon on Thursday October 29 2020, @05:52AM
We'll see, but AMD is said to have a lot more capacity on TSMC "7nm" than Nvidia has on Samsung "8nm". They could still end up selling every card they make though.
(Score: 3, Interesting) by Captival on Thursday October 29 2020, @03:24AM (2 children)
Has total board power (TBP) been a euphemism? The fact that there's a term for it instead of just saying 'this card is 300 watts' makes me suspicious it means something else.
(Score: 5, Informative) by EJ on Thursday October 29 2020, @04:18AM
It's because of the ambiguity of the term GPU. When talking about the GPU, are we talking about just the chip or the entire card? Total Board Power is unambiguous. It says that the entire thing, memory, etc. consumes 300W.
(Score: 4, Interesting) by takyon on Thursday October 29 2020, @05:54AM
I think it's a shot at Nvidia, which has been exceeding TDP on Ampere. Some of the AIB Ampere cards are being pushed to 400 Watts.