AMD Radeon RX 6x50 XT lineup specifications confirmed, plus some early benchmarks

Ok, nice word salad and a hot take there. But still not a game feature.


What exactly is AMD’s version of ray tracing? And how is it easier to implement?
You must mean something other than DXR or VulkanRT. I mean, everybody knows AMD cards are not competitive there, right?


I guess you mean FSR? It doesn’t have a temporal component so it is of course barely comparable. And runs on Nvidia cards. I mean, if you’d want to.


I wouldn’t care to guess what you mean by that. Sounds like hand waving.
You’re wasting your time. These people will never accept the truth that GeForce absolutely humiliates Radeon on features. They are fanboys, they are emotionally attached to AMD. There is no point attempting to rationalise with them.
 
Synthetic benchmark
meh

Higher resolutions? Nvidia couldn't even be bothered to put more than 12GB of VRAM on their 3080 Ti, and they're charging $1200 for it. I also don't know if you've seen the new Unreal Engine demo, but ray tracing is going to be irrelevant soon. Ray tracing was a cool gimmick for like 4 years that no one could run anyway, DLSS or not. If I'm paying $1000+ for a graphics card, I don't want to play at 1080p with DLSS and still get 40 FPS with my "gsync"

1- Above 12GB is useless. The 24GB on the 3090 gives no advantage over the 12GB 3080 Ti in any game. The 3090 is slightly faster because of higher clocks, and that's it.


2- A $1000 card can easily handle ray tracing above 1080p/40fps.

The 3060 Ti runs Metro Exodus Enhanced Edition with normal RT at 1440p at 60+ fps without DLSS:
https://static.techspot.com/articles-info/2447/bench/Metro.png

 
Radeon 6000 series is not really a match for the Nvidia 30xx parts. Sure, at frames per dollar they compete when you strip games down to their bare bones, but in terms of features and driver support the GeForce solutions reign supreme.
What??? Have you been living under a rock since both companies released these cards? Last time I checked, both companies have cards that compete very well with each other. Yeah, ray tracing is in Nvidia's favor, but Nvidia is on its second release of RT while AMD is on its first, and even then AMD pretty much matches Nvidia's first-gen RT, so not too bad. Besides that, there are way more people out there not willing to give up FPS just to have some extra pretty lights and shadows and whatnot added to their games.

I always said RDNA2 was held back by a lack of memory bandwidth; any ***** could see that from how Infinity Cache scaled as the resolution was upped. If these leaked benches are anything to go by, with AMD adding faster VRAM these RDNA2 cards scale to much greater performance, so maybe AMD should have opted for faster memory from the start.
 
Yeah, ray tracing is in Nvidia's favor, but Nvidia is on its second release of RT while AMD is on its first, and even then AMD pretty much matches Nvidia's first-gen RT, so not too bad.
Get ready. I have said things very similar to that and, let's just say, it didn't go over well. What I have said is that I give the 6000 series an A in RT and Nvidia a B-. That is because even though this is Nvidia's 2nd round, they didn't do nearly as well as was predicted leading up to the 3000 series.

I also said I would give the Radeons a reduced score if they don't do any better in their 2nd-gen RT than Nvidia did, but few wanted to hear it.
 
Get ready. I have said things very similar to that and, let's just say, it didn't go over well. What I have said is that I give the 6000 series an A in RT and Nvidia a B-. That is because even though this is Nvidia's 2nd round, they didn't do nearly as well as was predicted leading up to the 3000 series.

I also said I would give the Radeons a reduced score if they don't do any better in their 2nd-gen RT than Nvidia did, but few wanted to hear it.
So, the very essence of grading on a curve. Well, this certainly explains how a lot of people justify defending AMD's poor copies of other people's innovations.

AMD barely matching Turing, which not coincidentally got a lot of undeserved flak at the time, is now an A for effort. Too funny.
 
Get ready. I have said things very similar to that and, let's just say, it didn't go over well. What I have said is that I give the 6000 series an A in RT and Nvidia a B-. That is because even though this is Nvidia's 2nd round, they didn't do nearly as well as was predicted leading up to the 3000 series.

I also said I would give the Radeons a reduced score if they don't do any better in their 2nd-gen RT than Nvidia did, but few wanted to hear it.
Yeah, it seems a lot of people just don't want to hear anything that differs from their own opinion these days, and god forbid someone has an opinion of their own. With AMD matching and maybe slightly beating NV's first-gen RT, I say good effort on AMD's part, given that RT was not really their goal with the RDNA2 cards in the first place and was added late in the dev cycle.
I personally don't care about RT right now because it sucks the life out of your FPS for a few trinkets of special effects yay team.

RT in the future will be a huge part of games and a lot bigger in the industry, but for right now it's just a new toy, nothing more. I tell those that disagree to call me when they make hardware that can handle RT in games without plunging the FPS, and without having to rely on gimmicks like DLSS or FSR to make it look like you're getting better FPS at the expense of graphics quality.

Speaking of which, those that praise DLSS and scorn FSR say, oh, FSR looks so much worse than DLSS. Clearly they were not around when DLSS 1 was released, because it was pretty fugly, and most reviews said that as well. According to most unbiased reviews, FSR was leaps and bounds better than DLSS 1, and quite a few releases after as well. But they all seem to forget just how bad DLSS was when it was released.
 
So, the very essence of grading on a curve. Well, this certainly explains how a lot of people justify defending AMD's poor copies of other people's innovations.

AMD barely matching Turing, which not coincidentally got a lot of undeserved flak at the time, is now an A for effort. Too funny.
Here is the problem, man. People base opinions on experience and not hearsay. Your defense of a company borders on the fanatical and has already crossed into propaganda. I know you don't believe everything you are shoveling, but the thing is, neither do the people that know better.
 
Here is the problem, man. People base opinions on experience and not hearsay. Your defense of a company borders on the fanatical and has already crossed into propaganda. I know you don't believe everything you are shoveling, but the thing is, neither do the people that know better.
The opinions do not seem to be shaped by experience or technical considerations as much as by certain preconceived notions, though. This argument that AMD should be judged in context rather than solely on merit is a good case in point, and one that is quite common in the AMD underdog hive mind that is so vocal on this site.

As for fanaticism, I’m surely passionate about rendering technology, but there are others in this topic that are far more prolific and present in almost all GPU topics, with a reliable one-sided perspective. I am doubtful that this is really your actual problem. Man.
 
The opinions do not seem to be shaped by experience or technical considerations as much as by certain preconceived notions, though.
And that is your problem. You are the one still claiming a lack of features and even more ridiculous, long ago driver problems.
As for fanaticism, I’m surely passionate about rendering technology
You being a fan of rendering tech is fine, but it is overshadowed by the way you act like Nvidia is your mommy. And you completely ignore that the Radeons are every bit as fast as their Nvidia counterparts and even cost less. One of the other Nvidia zealots even went so far as to say they "reign supreme", which they obviously do not.

EDIT - Don't want to go off the rails here Beerfloat, but I just found out that the Gigabyte Gaming OC 6900 XT is priced at MSRP ($999) at the Egg. And the 3090 is very close to its MSRP.
 
Nvidia's 4000 series will be the company's worst product launch to date, from both a competitive and compatibility standpoint. Not only is its monolithic design and PCIe 4.0 going to hurt, but the power draw will be a huge downfall. Hundreds of thousands of consumers have high quality 800-1000W PSUs. This would be just fine under normal circumstances. However, with Jensen desperately wanting the performance crown, he's decided to sacrifice these consumers by pushing out 600W+ flagship cards which, in turn, will be met with system errors/crashes due to inadequate PSUs. It's going to be a real **** show. Negative press and damage control will be monumental. All for a title Jensen never truly stood a chance at winning next round.

And I will not pretend that all PC gamers are inclined enough to know better in regards to PSUs, because that's far from the truth. The unfortunate consumers who purchased pre-assembled gaming rigs are going to be hit the hardest as they will think their bloated systems with an RTX 3070 and a proprietary 1000W PSU will be enough to push a 4080/4090. They will be sadly mistaken.

On the other hand, when an RTX 4090 gets installed into a system capable of thoroughly supplying the power it needs, only compute heavy customers will be truly happy. Gamers will come to the harsh realization that 100 TFLOPs doesn't equate to epic gaming performance. All of the heavy voltage and high wattage will not be enough to dominate at the highest level.

Meanwhile, the RX 7900 XT is going to outpace the RTX 4090 quite spectacularly. Why? Well, a flagship monolithic card cannot compete with a flagship MCM card from an established company like AMD. While the 4090 will surely outclass the monolithic 7000 series designs, it will struggle and often lose against the lower tier RX 7800 MCM card. This means, of course, that there is no chance the 4090 gets close to the 7900. Jensen is pumping everything he's got into the RTX 4090, but his efforts will be regarded as a futile attempt at best.

I'm sure there are plenty of optimistic thoughts concerning DLSS. Truthfully speaking, it's not a huge advantage anymore. FidelityFX Super Resolution closed the gap to the point that the proprietary implementation of DLSS is far less attractive. Once FidelityFX Super Resolution 2.0 with Temporal Anti-Aliasing is released, the momentum will shift even more into AMD's favor, regardless of the latest generation of DLSS.

Nvidia will surely keep the advantage over AMD in Ray Tracing next round. To what extent is anyone's guess. Unless AMD reveals dedicated RT cores for the 7000 series, Nvidia will retain a sizable lead. One thing to note is that upscaling technology from both companies will make RT games playable more than ever before.

Also, contrary to what uninformed consumers are led to believe, the 256-bit bus width is not the 6900XT's Achilles heel. That honor goes to slow VRAM and the small amount of Infinity Cache. 128MB is simply not enough. Despite its shortcomings, at 1080p, 1440p, and 4K, the RX 6900 XT still trades blows with the RTX 3090, winning in many titles. However, the Infinity Cache limitation does show at 4K. If the rumors are true, that's all about to change significantly. The 128MB cache has been moved from the GPU core die to separate cache dies (128MBx4). Each cache die is 3D stacked onto the GCDs (2 on each). This offers a total of 512MB. The performance uplift and sustainability will be incredible.
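A back-of-envelope sketch of why cache pressure grows with resolution: estimate the size of one frame's render targets against a 128MB cache. The bytes-per-pixel figure and the number of render targets below are illustrative assumptions for a typical deferred renderer, not AMD's actual working-set numbers.

```python
# Rough render-target working-set estimate vs. Infinity Cache capacity.
# Assumption: four 32-bit RGBA G-buffer targets plus a 32-bit depth buffer
# (20 bytes per pixel) -- illustrative, not AMD's real figures.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BYTES_PER_PIXEL = 4 * 4 + 4  # 4 RGBA8888-sized targets + depth
CACHE_MB = 128

for name, (w, h) in RESOLUTIONS.items():
    working_set_mb = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    verdict = "fits in" if working_set_mb <= CACHE_MB else "exceeds"
    print(f"{name}: ~{working_set_mb:.0f} MB of render targets ({verdict} a {CACHE_MB} MB cache)")
```

Under these assumptions, 1080p and 1440p frames sit comfortably inside 128MB while a 4K frame spills past it, which matches the pattern of the cache limitation showing up mainly at 4K.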

At the end of the day, many people are expecting Nvidia to stay on top. The reality is that the green team won't take the performance crown next round. AMD will dominate. If Nvidia isn't careful, AMD may run away with the crown for years to come.

This is all my humble opinion, of course. I actually can't wait to see what happens as I love a good battle. I'll surely keep an eye on this thread and others like it right up until reviews are in. May the best team win.
 
How legit are these leaks? This implies the 6950 is 20% faster than the 6900, which would be wild.
I feel you should not expect a 20% increase in performance. The reason for this significant bump in the numbers is the skewed benchmarking method in 3DMark. The use of the 5800X3D clearly benefited the scores, since 3DMark generally runs a "physics" test as part of its bench, so just having a better processor is enough to lift the numbers despite using the same GPU.

3DMark numbers therefore have little bearing on real-life usage, where anything above 1440p is mostly GPU bound and a very fast CPU will not make a meaningful improvement. That is why I will not use 3DMark to gauge performance; it is that bad and irrelevant. The whole idea of the 3DMark "benchmark" is to make you upgrade your hardware often, because it gives you the sense that your system is slower than others using the same GPU, so people go out and upgrade. In games, you are highly unlikely to see the same kind of performance increase.

The same applies to their SSD benchmark, which has very little bearing on gaming if you actually consider what it tests. I've looked at what's being tested and never bothered to run it, since it is a waste of power and time.
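The CPU sensitivity described above can be sketched with a toy composite score. The weighted-harmonic-mean formula and the 0.85/0.15 weights here are assumptions for illustration, not 3DMark's published scoring method; the point is only that folding a CPU sub-score into the overall number inflates it on a faster CPU even with an identical GPU.

```python
# Toy composite benchmark score: same GPU, different CPUs.
# Formula and weights are illustrative assumptions, not 3DMark's actual ones.

def overall_score(graphics, cpu, w_gfx=0.85, w_cpu=0.15):
    """Weighted harmonic mean of graphics and CPU sub-scores."""
    return (w_gfx + w_cpu) / (w_gfx / graphics + w_cpu / cpu)

slow_rig = overall_score(graphics=20000, cpu=10000)  # same GPU, slower CPU
fast_rig = overall_score(graphics=20000, cpu=15000)  # same GPU, faster CPU
print(f"slow CPU: {slow_rig:.0f}, fast CPU: {fast_rig:.0f}")
# The faster-CPU rig posts a noticeably higher overall number
# even though the graphics sub-score is unchanged.
```

In a GPU-bound game at 4K, that CPU-side gap would largely disappear, which is why the composite number overstates the real-world difference.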
 
I don't see the fun in siding with either AMD or Nvidia. Both camps have their pros and cons. It is true that from a feature standpoint AMD's cards are at a disadvantage, for example in ray tracing. However, not everyone is looking to use RT, since the performance loss is quite significant. I started off with an RX 6800 XT, and I thoroughly enjoyed using it. Most of the games I play don't use RT anyway, and I never really found it a must-have at any point. The only RT games I've tested are Control, Guardians of the Galaxy and Metro EE (Metro Exodus EE more to see the improvement over the previous version). While the RX 6800 XT loses on RT and DLSS (and the ability to use CUDA and Tensor cores), it makes up for it with a cheaper price (I got it at launch, and it was indeed cheaper than the RTX 3080) and more VRAM, which is very welcome considering Nvidia only offered 10GB on the RTX 3080 back then. However, the value advantage is kind of hazy now, because in my country cards from the RX 6700 all the way to the top-end 6900 XT cost more than the equivalent Nvidia Ampere cards.
Anyway, be it Ampere or the RDNA2 refresh here, I see little reason to get them. Most of the cards are already pushed hard, so a meaningful improvement in performance is unlikely. This is especially the case when next-gen GPUs, expected to be significantly more powerful, are poised to be announced/released later this year.
 