Nobody will sway you from your deep delusion that RT and DLSS mean nothing. Those features are exactly why GeForce is absolutely destroying Radeon right now. You are so deeply wrong about these technologies that there is simply no way you have actually used them. It's quite pathetic, really. That said, I have absolutely no doubt that once Radeon eventually catches up on RT and AI upscaling, you will happily use both.
But you are unlikely to disagree with hard numbers, and as the results in this very survey show, for every Radeon AMD sells, Nvidia sells at least ten GeForce parts. The market isn't stupid. If it were, people would be buying Radeon...
I find it hilarious that Nvidia has dedicated - D E D I C A T E D - RT cores to handle RT, yet without DLSS dropping the image to a lower internal resolution (and leaving a motion-blur-like after-effect once the image is upscaled), Nvidia cards take a huge hit to their ray-tracing performance. Perhaps DLSS 3.0 will eliminate the blurry after-effect that 1.0 and 2.0 have... although 2.0 is certainly much improved over 1.0.
Without help from DLSS, Nvidia suffers a serious performance loss with RT enabled: not as bad as AMD's, but very noticeable.
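To illustrate where that blur comes from (and why DLSS recovers so much performance), here's a toy Python sketch of a generic temporal upscaler. To be clear, this is my own simplified illustration, not Nvidia's actual DLSS code: the resolutions, the nearest-neighbour upscale, and the `alpha` blend factor are all stand-ins for the learned network.

```python
import numpy as np

NATIVE = (2160, 3840)     # 4K output (rows, cols)
INTERNAL = (1440, 2560)   # ~67% per-axis internal render, like DLSS Quality

# Shading cost scales with pixel count: the internal frame is ~44% of the
# native one, which is where most of the recovered frame time comes from.
print(INTERNAL[0] * INTERNAL[1] / (NATIVE[0] * NATIVE[1]))  # ~0.444

def upscale_nearest(img, shape):
    """Nearest-neighbour resize: a crude stand-in for the upscale network."""
    rows = np.arange(shape[0]) * img.shape[0] // shape[0]
    cols = np.arange(shape[1]) * img.shape[1] // shape[1]
    return img[np.ix_(rows, cols)]

def temporal_blend(lowres_frame, history, motion_px, alpha=0.1):
    """Reproject last frame's output by a global motion vector, then blend in
    the new low-res sample. A small alpha smooths aliasing, but it also smears
    fast-moving detail into a trailing blur when stale history isn't rejected,
    which is the ghosting after-effect people complain about."""
    reprojected = np.roll(history, shift=motion_px, axis=(0, 1))
    return alpha * upscale_nearest(lowres_frame, history.shape) \
        + (1 - alpha) * reprojected

# One step of the accumulation loop: each output frame is mostly old frames.
history = np.zeros(NATIVE, dtype=np.float32)
frame = np.random.rand(*INTERNAL).astype(np.float32)
history = temporal_blend(frame, history, motion_px=(0, 4))
```

The newer DLSS versions are (as far as anyone outside Nvidia can tell) much better at rejecting bad history samples, which is presumably why 2.0 ghosts so much less than 1.0 did.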
As for AMD, they have no DLSS equivalent and nothing released yet to counter it, and the 6800 XT's RT performance tends to fall somewhere between what a 2080 Ti can do and what a 3070 can do.
As it stands, neither company has any kind of bragging rights for a technology as new as RT in games. Give both sides two more generations to hash things out. Until then, anyone touting Nvidia as better (when they have dedicated RT cores and still can't deliver high RT performance without DLSS's help) is jumping the gun.
Two generations out, then you can say company X has better RT performance than company Y... and don't forget that Intel might be part of the equation by then; they could very well be a contender in all of this a couple of generations down the road.