AMD Radeon RX 7900 XTX vs. Nvidia GeForce RTX 4080: The Re-Review

What isn't covered here are the driver issues:

Forcing vsync off via the Nvidia control panel:

Works in STALKER Shadow of Chernobyl; FPS uncapped

Forcing 16xAF via Nvidia drivers:

Works in TES IV: Oblivion

Works in Operation Flashpoint

Playing Combat Mission games on the 2014 engine with the Nvidia driver:

Works fine

vs

Forcing vsync off via the AMD control panel:

Doesn't work in STALKER Shadow of Chernobyl; FPS locked at monitor refresh rate

Forcing 16xAF via AMD driver:

Doesn't work in TES IV: Oblivion

Doesn't work in Operation Flashpoint

Playing Combat Mission games on the 2014 engine with the AMD driver:

Crashes when loading the 3D engine

I've verified this with R9 290X, Vega 64, and RX 7900 XT cards from AMD, and GTX 780 Ti, GTX 980 Ti, GTX 1080 Ti, and RTX 3080 Ti cards from Nvidia. AMD drivers significantly degrade the user experience as outlined above, so at a certain point AMD's relatively lower cost becomes irrelevant once you realize how inferior the drivers are and that AMD continues to ignore significant compatibility breakage, as demonstrated when they broke Combat Mission about two years ago. Not acceptable.

I'd hate to go full team green, but issues like this simply are not acceptable in 2023.

The experience has been reminiscent of trying to use my 3dfx Voodoo 4 and 5 cards in Windows XP with third-party drivers.

As for performance in games where the drivers aren't broken, at max-to-ultra settings the story is far less rosy in favor of the 7900 XT.

On my end, with an i9-10920X, an i7-10700KF with all cores at 5.129 GHz, an i7-6850K at 4.2 GHz, an i7-5930K at 4.5, an i7-980X at 4.5, an i7-960 at 4.1, and an i7-920 at 3.8, the difference in FPS between my 7900 XT and RTX 3080 Ti is marginal. On my 10920X with per-core turbos locked and XMP at 3200 MHz, across a 42-game benchmark tally, the difference is practically margin-of-error close in favor of the 7900 XT at 3440 (140 FPS vs. 133 FPS) and similarly in favor of my 3080 Ti at 1080p (172 to 171); both results are within the margin of error.
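For anyone who wants to sanity-check that, here's the arithmetic behind calling it margin-of-error close (just the two averages I quoted above, nothing per-game):

```python
# Relative difference between the average FPS figures quoted above.
# These are only the overall averages from my 42-game tally, not per-game data.

def rel_diff(a, b):
    """Percent advantage of a over b, relative to b."""
    return (a - b) / b * 100

# 3440x1440: 7900 XT (140 FPS avg) vs. 3080 Ti (133 FPS avg)
print(f"3440x1440: {rel_diff(140, 133):+.1f}% for the 7900 XT")  # about +5.3%

# 1080p: 3080 Ti (172 FPS avg) vs. 7900 XT (171 FPS avg)
print(f"1080p:     {rel_diff(172, 171):+.1f}% for the 3080 Ti")  # about +0.6%
```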
 
Can you quote the actual numbers, not the percentages? From your example, it's possible the Intel chip is doing something to the RT performance of AMD GPUs.
To prove that either Intel's flagship i9-13900K CPU or AMD's flagship 7800X3D CPU is doing anything to the RT performance of the 4080 and/or 7900 XTX, we would need a direct side-by-side (ideally with everything identical except the CPU/motherboard).

To summarize:
Is the i9 boosting the 4080's RT performance, or is it holding back the 7900 XTX's RT performance? Is the 7800X3D holding back the 4080's RT performance, or is it boosting the 7900 XTX's?
As a 4090 owner on the AM5 7800X3D platform, I'm actually very interested in this. Also, would throwing more cores at it via the 7950X3D change anything? 🤔

Steve, I know you spent countless hours benchmarking these, which is much appreciated. I love your work! No pressure 😅.
 
Why the hell does anyone compare the 7900 XTX to the RTX 4080 when, in 90+% of markets, the 7900 XTX is priced the same as the RTX 4070 Ti, and for that money it demolishes it? Meanwhile, the RTX 4080 is up to $700 more than the 7900 XTX, and let's not forget that the RTX 4090 in most markets is more than double the cost of the 7900 XTX, with the AMD offering nipping at its heels and nearly matching its performance in some applications. All I can say is that, from a neutral stance, comparisons showing the 7900 XTX vs. the RTX 4080 or RTX 4090 are dishonest, as they deliberately put far more expensive cards up against a card that is much more affordable. People want to know where their money is going, not which card can beat another for ~$600 more lol
90% of markets? Not in the US. The XTX is running mostly around $1,000; the cheapest models I saw in a quick look on Amazon were $920. The 4070 Ti is running around $800, with the low price being about $769. The 4080 is running mostly around $1,100-ish, so about $100 more than the XTX.

Once you turn on RT, the 4080 makes sense. The 5-10% performance difference in raster is not as meaningful when you're already pushing 150+ FPS. When you're below 100 FPS, it's more noticeable, IMHO.
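To put that raster gap in frame-time terms (rough, illustrative numbers, not benchmark data):

```python
# Sketch: what the same 5% FPS gap is worth in frame time at high vs. low frame rates.
# The frame rates below are illustrative, not measured results.

def frame_time_ms(fps):
    """Average frame time in milliseconds at a given FPS."""
    return 1000.0 / fps

for base_fps in (150, 60):
    faster = base_fps * 1.05  # a card that is 5% faster
    saved = frame_time_ms(base_fps) - frame_time_ms(faster)
    print(f"{base_fps} -> {faster:.0f} FPS saves {saved:.2f} ms per frame")

# Roughly 0.32 ms per frame at 150 FPS vs. 0.79 ms at 60 FPS:
# the same percentage gap buys you more the lower your starting frame rate.
```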

Either card is a good choice. I would go Nvidia, just for the RT performance, because we don't know what new games will require to get good RT performance.
 
As I have posted in your Cyberpunk 2.0 benchmarks article, please start benchmarking dual 4k. After having my Samsung G9 57" for a month now, I am convinced that ultra-wide dual 4k, with ray tracing and leveraging upscaling technologies, is the future of gaming.

That is a super niche resolution that benefits almost nobody to benchmark. 61% of Steam users still use 1080p. Standard 4K is only 3.3%. Double wide 4K is such a small amount that it didn't even make the list, so it's less than .23%.
 
I don't know; the 7900 XTX has 2,500 more cores, a massive amount of L3 cache, and a wider memory bus. I think you see that a little in the 4K results (likely due to that memory bus), but the GPU should perform better compared to the 4080. Even though it may have improved some, it's clear AMD did not get the performance it wanted out of RDNA 3. Also, with the limited number of games here, a lot of them are the very ones that AMD performs better on.

If you take one particularly terrible game ("Ratchet & Clank") out of this survey, the 7900 XTX probably pulls EVEN with the 4080 in RAY TRACING. And I do not think that "these games favor AMD," because Dying Light, Cyberpunk, and Ratchet & Clank are clearly Nvidia-biased games. The only clear AMD-favoring game is Starfield.

The choice of games in this review is HIGHLY questionable. Games like PUBG, Apex Legends, GTA V, and CS:GO are probably 10x more popular than the games where Nvidia dominates on this list (Ratchet & Clank, Hitman 3, Dying Light: who plays those? They are not even in the top 200 on Steam, and their average rank is about #1200!) - see https://steamdb.info/charts/. I wouldn't be surprised if Nvidia paid for this obvious ad placement.

The reason to buy an Nvidia 4080 is VR and ray tracing. That's about it at the moment.
 
The same sh....t all over again. Try to do a review on how many care or not; the only thing that matters is that the overpriced GPUs are for people who can afford them. Maybe this should be a column on FORBES, NOT HERE!! Bored to death, tell us something new!!
 
I really appreciate your review, though maybe not many people will buy this GPU because the price is still too high.
Where I live, the prices of both GPUs are way over MSRP...
 
I find this review extremely interesting, even more so than the original one. I believe the choice of games is correct because, even if they are not widely played (as one user said), these are titles released or updated after the release of these GPUs; maybe I'd also have added Immortals of Aveum. Personally, for better objectivity, I'd have removed the extremes from the percentage calculation. It should also be said that, at 4K RT and in absolute terms, the differences are often quite laughable: it is not a drastic change to go from 34 to 38 FPS in Hogwarts Legacy, for example. It is also true that I don't think frame generation was taken into consideration, which would have increased the gap in favor of Nvidia. Nevertheless, I continue to believe that the 7900 XTX is preferable to the 4080, in the hope that AMD releases an 8900 XTX...
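By removing the extremes I mean something like a trimmed mean; here is a rough sketch of the idea (the per-game deltas below are made up, just to show the effect, not real benchmark data):

```python
# Rough sketch of a trimmed mean: drop the best and worst results before averaging,
# so a single outlier title doesn't skew the overall percentage.
# The per-game deltas below are made-up placeholders, not real benchmark data.

def trimmed_mean(values, trim=1):
    """Average after dropping the `trim` lowest and `trim` highest values."""
    s = sorted(values)
    kept = s[trim:-trim]
    return sum(kept) / len(kept)

# Hypothetical per-game advantage of card A over card B, in percent.
deltas = [-22.0, -4.0, -1.0, 0.0, 2.0, 3.0, 5.0, 18.0]

plain = sum(deltas) / len(deltas)
print(f"plain mean:   {plain:+.1f}%")                 # pulled around by the outliers
print(f"trimmed mean: {trimmed_mean(deltas):+.1f}%")  # closer to the typical game
```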
 
That is a super niche resolution that benefits almost nobody to benchmark. 61% of Steam users still use 1080p. Standard 4K is only 3.3%. Double wide 4K is such a small amount that it didn't even make the list, so it's less than .23%.
I 100% agree. It's a "now" or "future" discussion. We benchmark the RTX 4090 [yes, I know this review is the 4080; I'm talking in general]. I couldn't justify that, but I could justify the G9 57 at Samsung's launch education special price, allowing for the distorted AUD pricing of the RTX 40 series when it released. I'd say an RTX 4090 with a dual 4K monitor is more indicative of the future than one mated with a 16:9 4K monitor. My point is, 4K is no longer the future goal of gaming; dual 4K is.

But, again, I do agree: the few of us who have the G9 57 can happily chat together on Reddit.
 
I doubt that. With ray tracing and path tracing becoming the next step in gaming graphics, getting to 4K 60 FPS is a struggle even for the 4090. The use of ultrawide monitors is also not that prevalent; most are still playing at a 16:9 aspect ratio. Additionally, most are not even playing at 4K; it's a small percentage. Just because you have a 4K ultrawide monitor does not make it the future.
 
So what do you think the future is then? Real-life-looking 1080p? In 16:9 or 4:3? People spending $2,000 USD on a GPU to run it on a $200 monitor? You can't quote "the average gamer only uses x, y, z" in a review of the RTX 4080 vs. the RX 7900 XTX.

CP2077 on my RTX 4080 running dual 4K (32:9) with ray tracing on Ultra, and then with path tracing and ray reconstruction, looks unbelievable. But it's currently only 40 FPS and unplayable. Resolution increases are not dead. The reason we have been stuck at 1080p, and more recently 1440p, is that GPUs could not handle more. Nvidia knows this and developed DLSS and AI. They are betting that resolutions will increase and that their upscaling technologies will deliver their ray-traced visuals to bigger, higher-definition monitors. Monitor manufacturers have instead been making quicker and quicker gaming monitors (500 Hz now... seriously, you can see that?) because bigger means more pixels, and GPUs couldn't handle that.
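For the pixel-count point, here are the rough numbers (just resolution arithmetic, nothing measured):

```python
# Back-of-envelope pixel counts behind "bigger means more pixels":
# how much more work each resolution asks of the GPU, relative to 1080p.

resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K (3840x2160)": 3840 * 2160,
    "dual 4K (7680x2160)": 7680 * 2160,
}

base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name:>21}: {pixels / 1e6:4.1f} MP, {pixels / base:.1f}x the pixels of 1080p")
```

Dual 4K works out to roughly eight times the pixel load of 1080p, which is why upscaling ends up doing so much of the lifting at that resolution.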

Yes, I spent a stupid amount of money on a monitor. But in two years' time they will be half that price and available from multiple manufacturers. The RTX 50 series will be nearing end of life, and everyone will be looking towards the next thing.
 
Deal of the day for Prime Members: ZOTAC Gaming GeForce RTX 4080 16GB Trinity OC GDDR6X 256-bit 22.4 Gbps PCIE 4.0 Graphics Card, IceStorm 2.0 Advanced Cooling, Spectra 2.0 RGB Lighting, ZT-D40810J-10P https://a.co/d/iOBBb9I
$989
with Alan Wake 2

FYI
 
What I know is that drivers and certain features are just as important as raw FPS.

- Do you need JUST raster at the best price, and the apps or games you need work well with AMD? Go AMD.

- Do you need great RT, upscaling, stable and flexible drivers that are compatible with a lot of 3D, AI, video, and photo-editing apps, great streaming, CUDA tech, resale value... then go Nvidia, even if it costs more.

Nvidia stands for very few compromises, though it is more expensive; AMD is the cheaper option, though it has its own target audience and value. If the stretch isn't too big, I would go Nvidia.
 
I did obscene levels of research before buying these cards. I ultimately went with the 4080 for myself, for its better live-streaming capabilities, and went with the AMD card for my wife.

After months of owning both cards, I can say the performance numbers in this article are spot on. In some games we play together, I look over and her FPS is generally higher. This is on high-refresh 1440p and 1600p widescreen monitors, respectively.

DLSS is truly awesome but more limited, whereas FSR is more flexible.

What the article doesn't mention is the generally better recording and streaming on the 4080. However, with newer tech like AV1 on YouTube, for example, the gap is narrowing. It also doesn't mention the most pressing problem my wife faces: the constant driver timeouts in DX12 games with the 7900 XTX and the annoying workarounds you have to do in Windows to prevent outdated driver installs. Nothing resolves the timeouts: cleaning out past drivers, disabling the annoying Windows-installed drivers, beta drivers, nothing. It's a constant headache that I assumed was a thing of the past.
Same here. I actually just went ahead and purchased it after seeing that nothing looked likely to change in the GPU market until late 2024 or 2025. I mostly went based on DLSS and RT performance. Although it's clear that the 7900 XTX will get a 5-15 FPS lead on the 4080, it's not the kind of difference you'll clearly notice.

Additionally, if we're being honest, nearly every game today requires the use of DLSS or FSR to get above 60 FPS at 4K. It's a reality that has no end in sight, so that narrows it down quickly for me. My 6950 XT does OK once FSR is turned on, but in most games it just doesn't look very good. I think the upscaler in UE5 looks better than FSR.

Turning to RT: in lighter RT games like RE4, the 7900 XTX and RTX 4080 tend to be fairly close, but once we get into heavy RT it's not even close. Turn on path tracing and the 7900 XTX performs about as well as a 2080 Ti. At $1,000, I think that's unacceptable. The future is RT, PT, and upscaling, so to me, as someone who is actually spending my own money on this, the 4080 is a no-brainer over the 7900 XTX.
 
That is a super niche resolution that benefits almost nobody to benchmark. 61% of Steam users still use 1080p. Standard 4K is only 3.3%. Double wide 4K is such a small amount that it didn't even make the list, so it's less than .23%.
Yeah, that's the sad state of gaming: 4K is still in the sci-fi category, not something viable, because we have games that struggle to maintain even 60 FPS at 4K, even without VSync, even on current flagship products. No wonder there's no 4K gaming; there's no GPU in existence to play them on.

We've been hearing since the GTX 900 series that it's "4K ready." It's been nine years, but it's still the same: no 4K 60 FPS for you, only "frame generation" and "dynamic resolution" and all the other shortcuts to avoid making games actually work at 4K, at all costs. They kinda, sorta improve things, but it's still not good, only somewhat better.
 

I remember when the 1080 Ti came out and it was touted as THE 4K card. Now, thanks to lazy devs, the 1080 Ti can barely run a 1080p game without upscaling (Alan Wake II, Remnant 2). But the joke's on you, since the latest DLSS doesn't work with it.
 
The RTX 4080 will NEVER beat the 4090 in any game, ever....

But the Radeon 7900 XTX beats both^ in several games, and it is currently $749 less than the 4090's $1,800 price. Radeon is just a better choice for gamers; the only people buying high-end Ada are CUDA workers.
 