You know XeSS and FSR are GPU-independent? So you can choose as you like - and if you have Nvidia you get one extra option. Though what really matters is what developers actually support - or whether it matters at all.
It's still early days.
RT is still in its infancy for consumer GPUs - AMD, Nvidia and probably Intel will all bring massive improvements in their next iterations - plus I'm sure software cheats/techniques will come.
I know nothing about it - but I could imagine one: instead of running 4K RT in real time on, say, a water texture, run a quick Monte Carlo simulation* at 540p, then use AI and a known database to build the 4K image.
*Monte Carlo simulations build quite an accurate model of things that are nearly impossible or very slow to calculate directly - e.g. maybe 20 quick, slightly varied 540p snapshots, when combined, give you a pretty accurate visual that AI can use with an existing database to build a pretty realistic 4K image.
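The core idea - averaging many cheap, noisy samples so the error shrinks as roughly 1/sqrt(N) - can be sketched in a few lines. This is just a toy illustration of the statistics, not any real renderer: `render_noisy_sample` is a hypothetical stand-in for one quick low-res snapshot, and the per-pixel average stands in for combining the 20 snapshots (the AI upscaling step is not modelled here).

```python
import random

def render_noisy_sample(true_value, noise=0.2):
    # Hypothetical stand-in for one quick, cheap 540p snapshot:
    # the "true" pixel value plus random sampling noise.
    return true_value + random.uniform(-noise, noise)

def monte_carlo_pixel(true_value, n_samples=20):
    # Combine many slightly varied cheap samples; the averaged
    # estimate's error shrinks roughly as 1/sqrt(n_samples).
    samples = [render_noisy_sample(true_value) for _ in range(n_samples)]
    return sum(samples) / len(samples)

if __name__ == "__main__":
    random.seed(42)
    trials = 200
    # Average absolute error using 1 snapshot vs 20 snapshots
    err_1  = sum(abs(monte_carlo_pixel(0.5, 1)  - 0.5) for _ in range(trials)) / trials
    err_20 = sum(abs(monte_carlo_pixel(0.5, 20) - 0.5) for _ in range(trials)) / trials
    print(f"1 sample:  avg error {err_1:.4f}")
    print(f"20 samples: avg error {err_20:.4f}")
```

Running it shows the 20-sample average lands much closer to the true value than any single noisy snapshot, which is exactly why stacking a handful of cheap variations can beat one expensive render.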
True, but I would choose XeSS at this stage, and XeSS will no doubt still perform better on Intel. Choosing Nvidia gives me all three solutions, but as I said I'm not writing off RDNA3, and I don't expect Arc 2 until we see RDNA4 and Hopper.
MLisD has said RDNA3 should bring big gains in RT, but so will Lovelace. AMD needs to improve RT by at least 200%, as Lovelace will probably get at least 50% RT gains; RDNA2 is so far behind it's embarrassing, so even a 100% improvement would still leave AMD a fair way behind Lovelace.