AMD is showing off more of its own Radeon RX 6000 benchmarks, edging out the GeForce competition

I sold my 2080 Ti and was all set to buy the 3090 until its (even more ridiculous than the 2080 Ti's) price-to-performance was revealed. Then I settled on the 3080, until the 10 GB of VRAM gave me pause, and then it was impossible to buy one anyway. Now I'm very enthused by the 6800 models (the 6900 XT is still very poor value in terms of uplift over the 6800 XT), but I'm worried that they won't be able to cut it in RT. So I've put in a PS5 preorder in the hope that Sony can actually manufacture enough to satisfy demand at or near launch and I can play Demon's Souls and check out the new controller features with Astro's Playroom (I'd planned to buy the PS5 in a year or so, after a price drop and more exclusives had shown up). If I can get a PS5 by Xmas, I'll let the dust settle on the GPUs and pick one up in spring.
 
Remember when the rumours were that Big Navi would be on par with the 2080 Super? Lol.

I seem to recall the rumours were that it was going to be better than a 2080 Ti, and so it appears. I expected their best to slot in between a 3070 and a 3080; perhaps with Rage Mode and Smart Access Memory turned off it might still land that way.

Still, I wouldn't be so fast with the victory lap until a broad range of reviewers get the cards onto test benches, toggle the AMD-only Smart Access Memory (SAM) feature and Rage Mode's "one-click overclocking" on and off, and run the cards on Intel and older AMD CPUs.

The selection of games pretty much matched what I thought they would use, with the exceptions of Doom Eternal and Wolfenstein: Youngblood. Kudos for picking games that the 5700 XT didn't perform as well in. But you can go look at recent reviews of how the 5700 XT stacked up against the 2070 Super and 2080 and see why they picked games like Battlefield V, Gears 5, Call of Duty: Modern Warfare and the like: those games' engines seem to like the RDNA architecture. So it will be interesting to see which games the reviewers pick, as well as how the different CPU platforms run these cards.
 

Considering the 6800 XT was supposedly benched with neither Rage Mode nor SAM enabled and was still well matched to a 3080, I would expect the 6900 XT to be slightly slower than a 3090 with no features enabled.

And frankly "rage mode" is just allowing the 6900xt to try and consume as much power as a stock 3090.
 
It will be interesting to see how AMD's drivers hold up with this new release. At this stage in GPU performance I would consider paying the premium simply to be on Nvidia's driver platform and software suite, though admittedly it's been a few years since I threw in the towel with AMD.
Use the Pro driver instead of Adrenalin.
 

Even though AMD calls the 6900 XT a "300 W" card, they call for a 750 W PSU for the 6800 XT and an 850 W PSU for the 6900 XT? I loved the "it's too hot" and "it consumes too much power" comments from the AMD crew before the new cards were revealed, though; that was fun. Now we need some independent tests to see what is going on under the hood.

I'll wait for the down-under Steve and the right-side-up Steve reviews, and throw in some Guru3D, TPU, Hardware Canucks and others, before I hand anyone the crown.

This all should be fun. I don't envy the reviewers having to set up three test benches: a Ryzen 5000 CPU to test SAM (and PCIe 4.0), an Intel platform (PCIe 3.0), and a Ryzen 3000 platform (PCIe 4.0, but no SAM possible), just to give people a full dive into the various architectures at play now. Maybe they just won't test the older AMD CPUs? But I would be interested to see, because there are a lot of folks out there with those installed.
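If it helps to picture it, the three benches boil down to something like this (just a sketch; the CPU examples are placeholders, not a claim about what reviewers will actually use):

```python
# Rough sketch of the three test platforms described above.
# CPU examples are placeholders, not a claim about reviewers' actual hardware.
test_benches = [
    {"platform": "Ryzen 5000 (Zen 3)",          "pcie": "4.0", "sam": True},
    {"platform": "Intel (e.g. Core i9-10900K)", "pcie": "3.0", "sam": False},
    {"platform": "Ryzen 3000 (Zen 2)",          "pcie": "4.0", "sam": False},
]

for bench in test_benches:
    sam_note = "SAM on/off runs" if bench["sam"] else "SAM unavailable"
    print(f'{bench["platform"]}: PCIe {bench["pcie"]}, {sam_note}')
```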
 
I'm very excited to read 6800 and 6900 reviews and comparisons on like-for-like test systems. While I'm a perfectly content 5700 XT owner, it is amazing that AMD made this big a leap into direct competition with Nvidia's latest and greatest offerings.
 
Lmao.
Post Maxwell: 20% AMD dGPU market share.
Post Pascal: 30%.
Post Turing: 22%, with AMD still not faster than a 1080 Ti.

So at what point did you think AMD had this in the bag with Big Navi?
 
Ray tracing is a gimmick...
Yet both consoles are using it as a selling point.
NVIDIA, AMD, ARM and two massive game engines are supporting it.
It's supported in the DirectX 12 Ultimate API.

We're at the point where graphics cards are getting to be like motherboards. How so? We have STRONG raster performance across the board now, meaning the differentiating factors are drivers, software and features.

Although it was a long time ago, I don't remember the devs or the studio getting death threats for making a game that took years before a single GPU could run it at 1080p/60fps. That game was Crysis.

But ray tracing has to be an instant success...
 
Nope. PCIe 3.0 is just fine.

Something did come up a while ago, though, where we were warned about an AMD Radeon card that was locked to x8 lanes: put it on a board with a PCI Express 3.0 slot and it would be slower than on a board with a PCI Express 4.0 slot. Does anyone remember that?
 
I imagine the PSU recommendation difference between the RX 6800 XT & RX 6900 XT is to make sure the latter can be run with "Rage Mode" enabled w/o issue. I imagine at stock, power draw isn't going to be THAT different between the two cards. You'll probably still be able to get away with a good 650 W PSU running an RX 6900 XT at stock settings.

(And for those unaware, "Rage Mode" simply bumps up the power limit & fan curve a tad for a +1-2% performance bump in exchange for a bit more power draw [it's NOT actually an "auto-overclock" as AMD misleadingly called it in the presentation. It doesn't touch the clock-speed settings AT ALL]).
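Rough numbers on why the PSU guidance still lands where it does (back-of-the-envelope only; every component figure below is assumed, not measured):

```python
# Back-of-the-envelope system power estimate around a 300 W "board power" GPU.
# Every figure below is an assumed/illustrative value, not a measured one.
gpu_board_power = 300   # W, AMD's stated board power for the RX 6900 XT
rage_mode_bump  = 30    # W, assumed extra headroom from the raised power limit
cpu_gaming_load = 150   # W, assumed high-end CPU draw while gaming
rest_of_system  = 75    # W, assumed fans, drives, RAM, motherboard

steady_state = gpu_board_power + rage_mode_bump + cpu_gaming_load + rest_of_system
# ~40% margin for transient spikes and to keep the PSU near its efficiency sweet spot
suggested_psu = steady_state * 1.4

print(f"Estimated load ~{steady_state} W -> suggested PSU ~{suggested_psu:.0f} W")
# ~555 W -> ~777 W, roughly in line with AMD's 750-850 W guidance.
```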
 

PCIe 3.0 being "just fine" is only for sure true in a pre-Smart Access Memory world. With SAM, PCIe bandwidth could have a VERY notable impact on the performance gain the feature can provide, because it gives the CPU a faster path to the GPU's VRAM. Though this is more likely to start making a real difference down the line, when developers actually begin optimizing for the feature (in addition to things like DirectStorage).
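For a sense of scale, here are the theoretical per-direction bandwidth figures (a quick sketch using the standard 128b/130b encoding numbers):

```python
# Theoretical per-direction PCIe bandwidth in GB/s per lane (128b/130b encoding).
per_lane_gbps = {"3.0": 8 * 128 / 130 / 8, "4.0": 16 * 128 / 130 / 8}

for gen, per_lane in per_lane_gbps.items():
    print(f"PCIe {gen} x16: ~{per_lane * 16:.1f} GB/s per direction")
# PCIe 3.0 x16: ~15.8 GB/s vs PCIe 4.0 x16: ~31.5 GB/s --
# that doubling is the headroom SAM-style CPU access to VRAM could benefit from.
```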
 
The card locked to x8 was the 4 GB RX 5500 XT. The reason for its gain on PCIe 4.0 is that it had only eight PCIe lanes & was regularly running out of VRAM, so the faster link helped it shuffle data back & forth to that limited VRAM pool. The 8 GB model didn't see the same benefit because games could actually fit all their assets in VRAM at once.
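To put a rough number on it (illustrative only; the 2 GB "overflow" working set is purely an assumption):

```python
# Time to shuffle an assumed 2 GB of overflowed assets across an x8 link.
overflow_gb = 2.0  # assumed amount of data that doesn't fit in the 4 GB VRAM pool

x8_bandwidth_gbps = {          # theoretical GB/s per direction
    "PCIe 3.0 x8": 0.985 * 8,  # ~7.9 GB/s
    "PCIe 4.0 x8": 1.969 * 8,  # ~15.8 GB/s
}

for link, gbps in x8_bandwidth_gbps.items():
    print(f"{link}: ~{overflow_gb / gbps * 1000:.0f} ms to move {overflow_gb} GB")
# Roughly halving those transfer times is why the 4 GB card gained from PCIe 4.0,
# while the 8 GB model (which rarely spills out of VRAM) did not.
```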
 
RT drags even Nvidia's own cards down so hard that the tech itself remains a question mark as to how it will progress. Even Nvidia might need a few more generations to polish up its own introduction enough to play at a respectable framerate without lowering the resolution or details.

I wouldn't call RT the thing to look for for quite some time to come, IF it ever takes off.

Like VR, it remains a niche area.
RT is a niche, like VR, really? I strongly disagree. It's a step closer to what we've wanted for so long: photorealism. Once you make cards fast enough to drive 80-100 fps at 4K, there's really little incentive to go 8K or beyond. It's like a camera with 1000 megapixels: the image is bigger, but it still sucks. Improving graphics quality is the future. And I'm not saying it the way they said "VR is the future"; I'm saying it because this has been happening since the beginning of games.
PS: the most anticipated game is Cyberpunk 2077. Do you want to buy a top card that will take advantage of all it has to offer, or just more fps? Because then I could argue you should play it on medium settings and buy a regular card...
 
Keen to see if there's any overclocking headroom on these, or if, like the 3000 series, they're already running at or near their absolute max frequencies. A 5% gain from a manual OC could push it beyond the 3090 in all games (with RT off), so it could take the crown.
 
I also agree that RT tech is still kind of immature given the big performance hit (compared to what it offers). It's getting better, though. And calm down, man, nobody is calling it obsolete tech; it just isn't yet practical in many of today's conditions. No one will want to activate RT in their competitive shooter, for example. You'd activate it only in single player, and only if your card can produce high enough FPS; even then most people would prefer higher FPS for fluid gameplay.

And no, gaming will advance further, beyond 4K and even 8K.
 
Yeah, 34 games is a handful, and more are coming, among which is the game most people are going to buy cards for: Cyberpunk 2077. Also, the best shooters support RT: CoD: MW and Cold War, BFV, Metro Exodus, Fortnite.

Remember when HairWorks was the next big thing? Lol
 
You're counting games that are not even out yet. Read the article again.
 
I never said gaming will be stuck at 4K, it's just not that practical beyond it. I almost can't tell the difference between 2K and 4K on a 32-inch monitor. To really see 8K you'll probably need a 50+ inch monitor, which is lame because it's as big as or bigger than your average TV; anything less and you won't be able to play without squinting.
Your arguments against RT are last year's. Nvidia already delivers playable RT framerates at 4K with the 3080 (check DXR benchmarks) for all games that have the feature, so it's not like "oh, it's so new and nobody uses it anyway". Competitive shooters are mainly played at lower resolutions (why? lower resolution, bigger targets), and in CoD: MW, for example, with RT on the 3080 pulls 162 fps at 1080p and 124 fps at 2K, which is very much playable. For BFV it's 125 fps at 1080p and 95 fps at 2K. I imagine it's even better for Fortnite, but I couldn't find any benchmarks.
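The pixel-density part of that is easy to put in numbers (quick sketch; the monitor sizes are just examples and viewing distance is ignored):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution on a given diagonal size (inches)."""
    return math.hypot(width_px, height_px) / diagonal_in

# Example sizes only; viewing distance matters too, which this ignores.
resolutions = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}
for label, (w, h) in resolutions.items():
    print(f'{label}: {ppi(w, h, 32):.0f} PPI at 32", {ppi(w, h, 55):.0f} PPI at 55"')
# 4K at 32" is already ~138 PPI; 8K at 32" (~275 PPI) is well past what most eyes
# resolve at desk distance, which is the "you'd need a 50+ inch panel" argument.
```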
 