I find it hilarious that Nvidia has dedicated - D E D I C A T E D - RT cores to handle ray tracing, and yet without DLSS rendering the image at a lower resolution and upscaling it (which leaves a motion-blur-like after effect), Nvidia cards take a huge hit to their raytracing performance. Perhaps DLSS 3.0 will eliminate that blurry after effect that 2.0 and 1.0 have...albeit 2.0 is certainly much improved over 1.0.
Without help from DLSS, Nvidia suffers a lot of performance loss. Not as bad as AMD, but it is very noticeable.
As for AMD, they don't have DLSS and haven't yet released anything to counter it, and the 6800 XT's RT performance tends to fall between what a 2080 Ti can do and what a 3070 can do.
As it stands, neither company has any kind of bragging rights for a technology as new as RT in games. Give both sides two more generations to hash things out. Until then, anyone who touts Nvidia as better (given that they have dedicated RT cores and still can't deliver high RT performance without the help of DLSS) is jumping the gun.
Two generations out, then you can say company X has better RT performance than company Y....and don't forget that Intel might be part of the equation by then; they could very well be a contender in all of this a couple of generations down the road.