AMD Radeon RX 7900 XT Re-Review: What Should Have Been

Five years later, ray tracing is supported by only a few games
It was already more than a hundred a year ago.

169 currently supported games are more than a "few".
yeah.

If AMD were to improve their RT performance, you can bet the AMD crowd would be touting that over Nvidia in a heartbeat. One wonders why there was no RT testing in this article?
Wait for the 7800 XT/7700 XT reviews; RT will suddenly matter for 6700 XT owners, despite the 6800 beating the 7700 XT in rasterization with more VRAM.
"Nonetheless, we believe this generation was destined to fail. It failed because the crypto boom ended overnight"

No, this generation was destined to fail due to:
- greed

- planning the silicon and price tag more for "desperate" crypto miners and AI clients than for their loyal customers: gamers and small multimedia businesses

- greed, greed, greed

Based on those aspects, I refuse to buy any current-generation (Nvidia, AMD or Intel) chip, as I like to show that I don't like being taken for an idiot and I don't like being screwed.

Everyone is free to do whatever they want, but the more we buy this gen and help these companies maintain high prices and get rid of stock, the more freedom they feel to treat us badly. By not buying, we can make them rethink their attitude and perhaps rebadge the current gen with a lower price tag.
QFT. Got a used, mint-condition 3080 and 6800 in April/May, 789 altogether, 1.5 years of warranty left. Waiting for another shortage to buy myself a small city car off selling these.
 
169 currently supported games are more than a "few". And I do not think RT is a useless feature. If I pay the same price for a GPU and raster performance is within 10%, then the one with better RT performance is likely the one I would go with. When I'm pushing 150+ fps in a game, losing 20 or 30 fps to RT isn't a big deal.
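
For what it's worth, the frame-time math backs this up; a quick back-of-the-envelope sketch in Python (the example frame rates are mine, purely illustrative):

# The wall-clock cost of losing 30 fps shrinks as the base frame rate rises.
def frame_time_ms(fps):
    return 1000.0 / fps  # milliseconds spent on one frame

for before, after in [(150, 120), (60, 30)]:
    extra = frame_time_ms(after) - frame_time_ms(before)
    print(f"{before} -> {after} fps costs +{extra:.2f} ms per frame")
# 150 -> 120 fps costs +1.67 ms per frame
# 60 -> 30 fps costs +16.67 ms per frame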

If AMD were to improve their RT performance, you can bet the AMD crowd would be touting that over Nvidia in a heartbeat. One wonders why there was no RT testing in this article?
According to Nvidia's website: https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/

There are SIX games that have Full RT. That's few TBH. There are more games that have "support" for RT, but many of those are very old titles and that "support" is very limited.

150+ FPS? Cyberpunk 2077 runs at 76 FPS at 1440p high with RT ultra on an RTX 4090. So basically even the RTX 4090 cannot do proper ray tracing in an almost three-year-old game. Simply put, RT on current cards (including the most expensive one) is way too slow to matter.

When the RTX 2000 series launched, it was supposed to be good "because it has ray tracing". Now we know RTX 2000 series RT is totally useless. And 4000 series RT is also way too slow. Basically we need at least two times the RTX 4090's RT performance for it to have some effect (roughly doubling that 76 FPS to get into the 150+ FPS range mentioned above). And that will take years. Like it or not.

AMD will improve RT performance with the 7800 series (compared to the 6800 series). I didn't bother waiting for the 7800 because the 6800 is more than competitive in raster and RT is still useless.
 
It was already more than a hundred a year ago.
I expect a game to fully support RT, not just "have some support that means nothing". There you go: https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/

As said previously, there are six titles that properly use RT.
Wait for the 7800 XT/7700 XT reviews; RT will suddenly matter for 6700 XT owners, despite the 6800 beating the 7700 XT in rasterization with more VRAM.
It didn't matter for me; I just bought a 6800 XT for a fair price instead of waiting for better (but still useless) RT on the 7800 XT.
 
There are SIX games that have Full RT.
and a hundred that have one/two effects.

As said previously, there are six titles that properly use RT.
And AMD runs like crap in any that uses 3/4 effects, with the 7900 XT matching the 3080 10G.

 
and a hundred that have one/two effects.
Exactly, and no-one cares about those small effects. Like this: https://www.tomshardware.com/news/hands-on-with-crysis-remastereds-new-ray-tracing-upgrade
Overall, Boost mode seems like a cool idea, but I'd like to see a more noticeable visual improvement if it's to graduate past the experimental feature phase. As it stands, the ray tracing boost is barely noticeable at all when actually playing the game.
Definitely a game-changing feature ;) But hey, it "supports RT", so one more game added to the list :D What's funny, it doesn't even require hardware RT support from the GPU.
 
Plenty of people do; reflections in the Spider-Man games/WD Legion looked amazing on higher presets, and so did RTGI in Metro EE. Plenty more examples. What people don't care about is console-level noisy quarter-resolution "RT" (which is actually hybrid SSR/SSGI plus some RT) like in Resident Evil/Far Cry.
Watch Dogs Legion, for example, uses DXR that doesn't use the tensor cores found on Nvidia cards. And looking at benchmarks, the differences are pretty small: https://www.guru3d.com/articles-pages/amd-radeon-rx-7900-xtx-review,19.html

Tbh, I hardly see a reason to pick Nvidia because of "better RT", also considering the price. AMD even wins the "most important" FHD resolution battle :p
 
Watch Dogs Legion, for example, uses DXR that doesn't use the tensor cores found on Nvidia cards. And looking at benchmarks, the differences are pretty small: https://www.guru3d.com/articles-pages/amd-radeon-rx-7900-xtx-review,19.html

Tbh, I hardly see a reason to pick Nvidia because of "better RT", also considering the price. AMD even wins the "most important" FHD resolution battle :p
Tensor cores are not used for RT acceleration; RT cores are. Seems pretty self-explanatory.

Native FHD is dead as a dodo. Even 1440p FSR2 Balanced looks a lot less pixelated and runs the same or better. Same with 1440p native vs 4K DLSS Performance.

Small difference? I don't think playable vs unplayable is small. The 7900 XTX runs at 42 fps and has no upscaler support in WD Legion; the 4080 runs 10% faster and has DLSS.
 
Tensor cores are not used for RT acceleration; RT cores are. Seems pretty self-explanatory.

Native FHD is dead as a dodo. Even 1440p FSR2 Balanced looks a lot less pixelated and runs the same or better. Same with 1440p native vs 4K DLSS Performance.

Small difference? I don't think playable vs unplayable is small. The 7900 XTX runs at 42 fps and has no upscaler support in WD Legion; the 4080 runs 10% faster and has DLSS.
Of course; the point was that the RT was probably only or mostly a software implementation.

CPU tests always favour lower resolutions; that's why 1080p is "very important".

WD Legion is an Nvidia-sponsored title, hence no FSR support. However, when talking about RT performance, it's a pretty abysmal showing for Nvidia. Even in an Nvidia-sponsored title, AMD grabs the crown at lower resolutions. Nv*****s make it sound like Nvidia is miles ahead on RT, but in practice the differences are pretty small. Leaving out some Nvidia-sponsored titles, of course.
 
Of course; the point was that the RT was probably only or mostly a software implementation.
It isn't.

Even in an Nvidia-sponsored title, AMD grabs the crown at lower resolutions.
Lol, who buys a 7900 XT for 1080p? Check 4K.
1080p native is absolutely DEAD with how good upscaling (FSR/FSR2) and reconstruction (DLSS/XeSS) have become. DLSS at 1440p is better than native 1440p more often than not (10 won, 9 lost, 5 tied); counting more precisely, it's 17 vs 11 in image-quality pluses over native 1440p (about 50% better overall). And it runs like 1080p native + TAA in terms of performance, so it'll crush 1080p native in 100 out of 100 cases on image quality for the performance difference.
This wasn't even manually updated to DLSS 2.5.1 and it's already a lot better than the native 1440p target.
To properly emphasize how poor 1440p native is these days compared to DLSS: DLSS Quality wins 10 titles vs native at 1440p but only 9 at 4K, and the plus-vs-plus ratio there is just 16 vs 15, not 17 vs 11 like at 1440p.
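
To put numbers on that: with the per-axis scale factors AMD and Nvidia publish (FSR2 Balanced = 1.7x, DLSS Performance = 2.0x), 1440p FSR2 Balanced renders internally at about 1506x847 and 4K DLSS Performance at exactly 1920x1080, which is why both run in 1080p-native territory. A small illustrative sketch:

# Internal render resolution for the upscaler modes mentioned above.
# Per-axis scale factors: FSR2 Balanced = 1.7x, DLSS Performance = 2.0x.
modes = [
    ("1440p FSR2 Balanced", 2560, 1440, 1.7),
    ("4K DLSS Performance", 3840, 2160, 2.0),
]
for name, width, height, scale in modes:
    print(f"{name}: renders internally at {round(width / scale)}x{round(height / scale)}")
# 1440p FSR2 Balanced: renders internally at 1506x847
# 4K DLSS Performance: renders internally at 1920x1080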

 
It seems I was looking at a different game there.
Lol, who buys a 7900 XT for 1080p? Check 4K.
1080p native is DEAD with how good upscaling (FSR/FSR2) and reconstruction (DLSS/XeSS) have become. DLSS at 1440p is better than native 1440p more often than not (10 won, 9 lost, 5 tied), and it runs like 1080p native + TAA in terms of performance, so it'll crush 1080p native in 100 out of 100 cases on image quality for the performance difference.
This wasn't even manually updated to DLSS 2.5.1 and it's already better than the native 1440p target.

Still, if Nvidia has much better RT, it should win at 1080p too. The 4080 barely wins at 4K, so Nvidia has a very small lead.

I expect those are just opinions. Some may not find the "better" quality actually better.
 
It seems I was looking at a different game there.

Still, if Nvidia has much better RT, it should win at 1080p too. The 4080 barely wins at 4K, so Nvidia has a very small lead.

I expect those are just opinions. Some may not find the "better" quality actually better.
Because it's only one effect, where the 7900 XTX should still easily win if it's the better card. TBH, AMD only looks like they've made progress in RT; what really happened is that more games use console-level "RT", where rasterization still matters more. If you measure real ray/path tracing, RDNA3 actually compares worse to Ada than RDNA2 did to Ampere.

Use real RT, even one effect, not the console type, and expect the 3080 to match the 7900 XT more often than not, even at 3440x1440.

 
Because it's only one effect, where the 7900 XTX should still easily win if it's the better card. TBH, AMD only looks like they've made progress in RT; what really happened is that more games use console-level "RT", where rasterization still matters more. If you measure real ray/path tracing, RDNA3 actually compares worse to Ada than RDNA2 did to Ampere.

Use real RT, even one effect, not the console type, and expect the 3080 to match the 7900 XT more often than not, even at 3440x1440.
AMD decided not to invest in RT and stuck with "RT" because real RT is too slow even on an RTX 4090. You would expect Nvidia's top card to be enough to run RT in a game published in 2020 (Cyberpunk 2077) without frame-guessing crap. But no. How about future games?

Even if Nvidia were 100 times faster than AMD at RT, Nvidia would still be too slow. The technology just isn't there yet. It will be some day, but by then current cards will be long obsolete. AMD just made a wise move not to invest too much in a still-useless technology.
 
RTGI only
[benchmark chart: Metro Exodus RT, 3840x2160]


RTGI+RTAO+RT Reflections + RT Shadows

[benchmark chart: Control RT, 3840x2160]


And this is native 4K; with DLSS Quality it'll look the same or better but run at 90+ fps in Control and 150+ fps in Metro.
Metro games have always been trash when it comes to optimization. Control is an Nvidia-sponsored title.

Also, Nvidia considers neither of those a Full RT title. The only modern Full RT title according to Nvidia is Cyberpunk 2077, and even the RTX 4090 is not enough for that. Slower cards would have been better off with only "RT" capability.

Summary: Nvidia's 1,500-dollar card is too slow for the only even somewhat modern Full RT game. That pretty much summarizes the whole RT hype.
 
Metro games have always been trash when it comes to optimization. Control is an Nvidia-sponsored title.
If it were AMD-sponsored, it'd never get any RT, let alone RTGI+RTAO+shadows+reflections. I don't get your point. Complaining about a game looking better instead of worse?
"Full RT" in Nvidia's naming is actually path tracing, and it runs fine (with DLSS for now). See how AMD runs it: the 4090 with DLSS 2 is literally 4x faster than the 7900 XTX with FSR2. My 6800 got 19-20 fps with FSR2 1440p Ultra Performance (looked like something from the CRT era; I've seen 800x600 look better) while a 3080 10G gets 45-50 fps with 1440p DLSS Performance (not quite 1440p native, but a lot better than 1080p native).


The difference in PC gaming is, Nvidia tries to push CGI-level effects while AMD wants to sell you console-level IQ at enthusiast PC component prices.
In 5-7 years, Nvidia will run CGI-level path tracing natively at 60 fps, and at 120+ fps with DLSS, while AMD will be where the 4090 is now (not even sure it gets that far, with alleged reports of MCM derailing architecturally). They're just not investing money into R&D, substituting it with pushing more VRAM onto cards. At the point when Nvidia does a 20 GB xx70 card, AMD will be simply powerless to respond. They could do it now, but the leather jacket guy is still making off like a bandit while AMD lets him. The 7900 GRE (80 CU Navi 31) beating the 4070 (actually a cut 46 SM AD104 that should have been a 4060 Ti) by just 10% overall is just more proof that Nvidia doesn't even need to take them seriously, so they don't. It's like the 6800 XT beating the 3060 Ti by just 10%. Wait till you see the 6800 XT match the 7800 XT and the 6800 beat the 7700 XT.

To sum up: either 1080p or no RT belongs on a PS4, not on 2023 enthusiast cards. Dunno why you're bringing it up in a 7900 XT review.

Best regards anyway; hope you'll appreciate progress a little more.
 
If it were AMD-sponsored, it'd never get any RT, let alone RTGI+RTAO+shadows+reflections. I don't get your point. Complaining about a game looking better instead of worse?
"Full RT" in Nvidia's naming is actually path tracing, and it runs fine (with DLSS for now). See how AMD runs it: the 4090 with DLSS 2 is literally 4x faster than the 7900 XTX with FSR2. My 6800 got 19-20 fps with FSR2 1440p Ultra Performance (looked like something from the CRT era; I've seen 800x600 look better) while a 3080 10G gets 45-50 fps with 1440p DLSS Performance (not quite 1440p native, but a lot better than 1080p native).
The Metro series has always been badly optimized. AMD-sponsored or not, they still run like trash. I'm not complaining about a game looking better, but RT is still overhyped. It could some day be excellent, but today it's just not.

Path tracing, or "full RT" as Nvidia calls it, is still very uncommon. Five years after RTX cards launched there are six PT or "full RT" games. That's 1.2 per year. Yeah, there are a few games that look a bit better, but I cannot see why I should care about RT performance when buying a new GPU. Just one marginal feature, nothing else.

The difference in PC gaming is, Nvidia tries to push CGI-level effects while AMD wants to sell you console-level IQ at enthusiast PC component prices.
In 5-7 years, Nvidia will run CGI-level path tracing natively at 60 fps, and at 120+ fps with DLSS, while AMD will be where the 4090 is now (not even sure it gets that far, with alleged reports of MCM derailing architecturally). They're just not investing money into R&D, substituting it with pushing more VRAM onto cards. At the point when Nvidia does a 20 GB xx70 card, AMD will be simply powerless to respond. They could do it now, but the leather jacket guy is still making off like a bandit while AMD lets him. The 7900 GRE (80 CU Navi 31) beating the 4070 (actually a cut 46 SM AD104 that should have been a 4060 Ti) by just 10% overall is just more proof that Nvidia doesn't even need to take them seriously, so they don't. It's like the 6800 XT beating the 3060 Ti by just 10%. Wait till you see the 6800 XT match the 7800 XT and the 6800 beat the 7700 XT.

To sum up: either 1080p or no RT belongs on a PS4, not on 2023 enthusiast cards. Dunno why you're bringing it up in a 7900 XT review.

Best regards anyway; hope you'll appreciate progress a little more.
We already know whether Nvidia was right or not. Five years, one AAA game with PT/Full RT. ONE. AMD was right; it's not yet time for RT. The technology just isn't good enough yet. AMD does offer decent RT capability that is more than enough for the current market.

Oh, now you claim AMD will continue on the same path for 5 years. Or perhaps AMD will invest more heavily in RT when it's actually usable speed-wise. AMD is not taken seriously because the market has decided to buy Nvidia even when AMD is better. So why bother? On the CPU side, the market finally accepted that Intel is not better just because it's Intel.

As said many times, there is ONE AAA game with good-level RT. Not much to bother with, is there? There have been many new GPU technologies supported by a handful of games, but a single AAA game in 5 years? Sorry to repeat myself, but that just proves the RT hype was overblown and has been an almost total disappointment so far. I wonder why we are even talking about RT as a meaningful technology right now.
 
The Metro series has always been badly optimized. AMD-sponsored or not, they still run like trash.
Metro EE runs like a dream on a 3080 with RTGI + RT reflections on. So it did on a 3060 Ti: with DLSS Quality it looked great and got 80+ fps.

Path tracing, or "full RT" as Nvidia calls it, is still very uncommon. Five years after RTX cards launched there are six PT
Path tracing only became a thing with Ada; Turing and Ampere were for ray tracing. Get your information straight. CP2077 is the first PT AAA game, and it only got its PT patch this year, after the 4090 launch.

perhaps AMD will invest more heavily in RT
And cover 10 years of R&D in one gen? Not how things work.

As said many times, there is ONE AAA game with good-level RT
That's your opinion said ten times over in this thread, not the opinion of tens or hundreds of people. Many people love RT in Spider-Man, Metro, Control, Guardians of the Galaxy, Doom Eternal, Dying Light 2, Modern Warfare 2019 and many, many others.
Already told you, just wait and see AMD buyers justify buying the 7800 XT over a cheaper and faster 6950 XT just because it runs one or two RT effects at the same time 1% faster.
And in rasterization it most certainly will be slower than even the 6800 XT. Not saying it will be you buying them over RDNA2, but it will happen. And since you care about what is said many times.....
The 7900 GRE is only 12-15% faster than the 6800 XT while having 33% more CUs than the 7800 XT.
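
For what it's worth, normalizing those quoted numbers per CU shows how flat the generational gain looks: the 7900 GRE has 80 CUs to the 6800 XT's 72 (about 11% more) for 12-15% more performance. A rough sketch (the relative performance figure is the midpoint of the range quoted above, not my own measurement):

# Normalize quoted raster performance by CU count (6800 XT = baseline 1.00).
cards = [
    ("6800 XT (RDNA2)", 72, 1.00),   # (name, CUs, relative performance)
    ("7900 GRE (RDNA3)", 80, 1.13),  # midpoint of the quoted 12-15%
]
baseline_cus = cards[0][1]
for name, cus, perf in cards:
    print(f"{name}: {perf / cus * baseline_cus:.2f} per-CU throughput (normalized)")
# 6800 XT (RDNA2): 1.00 per-CU throughput (normalized)
# 7900 GRE (RDNA3): 1.02 per-CU throughput (normalized)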
 
Metro EE runs like a dream on a 3080 with RTGI + RT reflections on. So it did on a 3060 Ti: with DLSS Quality it looked great and got 80+ fps.
That doesn't really tell you anything. Even a badly optimized game runs well if there is enough computing power available.
Path tracing only became a thing with Ada; Turing and Ampere were for ray tracing. Get your information straight.
Well, whatever Nvidia means by "full RT".
And cover 10 years of R&D in one gen? Not how things work.
And why not? Or do you really think Nvidia made the best RT implementation possible? Or was it more like "something that works and is barely enough to get software support"?

AMD could easily make a GPU that is miles faster than any Nvidia card at RT. Die size, cost and other things are another question. Basically, it wouldn't make any sense right now. Who would want a "superior RT card" that is either very expensive or sucks at everything else?
That's your opinion said ten times, not the opinion of tens of people.
Not an opinion, but fact. Even Nvidia agrees with me. I don't consider Minecraft or Quake 2 AAA games, for obvious reasons.
 
The best deal on enthusiast graphics just dropped at Lenovo's website: the 4090 Suprim Liquid, normally selling for $1,749, can be bought for $1,562.49 with free shipping, FYI. Use codes BUYMORELENOVO and EXTRA5.
As for the 7900 XT, the best price is $749 via the PCPartPicker website.
Update: we are moving on an upward trajectory.
$789.99 as of 8/12/23 for an ASRock 7900 XT.
Source: PCPartPicker website.
FYI, I am not AI, I am human. 😌
 
I just don't get the pricing strategy. I was holding out for the new 7800 XT. I assumed that when it is released it would come in at a price similar to the 6800 XT's at launch, or $649. However, I really don't see that happening. I was hoping for a $599 release price and performance close to the 6950 XT.

Having said this, it's a moot point, as I just purchased an XFX 6950 XT Merc from Amazon for $575 and got a free copy of Starfield worth $99 (which I was going to buy anyway). So effectively my upgrade to the 6950 XT cost $476.

And that's another thing: if I can buy the 6950 XT at $575, just how much profit are the manufacturers and resellers making at MSRP?

BTW, this video card is seriously big, the biggest card I have ever owned. Also, the Merc is loud at full load unless you tweak the fan curve to 40%, which still keeps load temps below 72C. So, why are the fan curves so poorly designed on these expensive cards???
 
That 4090 is way ahead of the rest of the pack; every new CPU released pushes the card even further. There is no current CPU that can max out the 4090.
What's funny is that Nvidia could slip 3-4 SKUs between the 4080 and the 4090 with that gap.
 