Nvidia GeForce RTX 4090 vs. AMD Radeon RX 7900 XTX: Is the GeForce Premium Worth It?

Let's just look at today's pricing...

7900 XTX --> $900-950
4090 --> $1,700-1,900

So, do you think that 20% more performance is worth almost a 100% price increase?

And even if it were only that... there is also this to consider...

12VHPWR connectors melting

PCBs cracking

So if you ask me whether I can recommend buying a 4090... then my answer is a resounding NO! I can't do that as a PC enthusiast.
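For anyone who wants to sanity-check that claim, here is a quick back-of-the-envelope sketch in Python. The ~20% performance gap and the prices are simply the figures quoted in this thread, taken as midpoints of the ranges above, so treat them as assumptions rather than benchmark results:

# Price/performance sketch using the figures quoted in this thread.
# Assumptions: ~20% performance advantage for the 4090, prices taken as
# midpoints of the quoted ranges.
xtx_price, xtx_perf = 925, 1.00        # midpoint of $900-950, baseline performance
r4090_price, r4090_perf = 1800, 1.20   # midpoint of $1,700-1,900, ~20% faster

print(f"7900 XTX: {1000 * xtx_perf / xtx_price:.2f} perf per $1,000")
print(f"RTX 4090: {1000 * r4090_perf / r4090_price:.2f} perf per $1,000")
print(f"{r4090_price / xtx_price - 1:.0%} more money for {r4090_perf / xtx_perf - 1:.0%} more performance")

By that crude measure the XTX delivers roughly 60% more performance per dollar, which is the whole argument in one number.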
 
I still admire how Nvidia manages to squeeze those extra frames out of their flagships.
The only thing I wonder about is whether the 12-pin connector was really a necessity.
Obviously not for mid-range cards, but I would really like to hear their engineers
explain why they actually implemented it.
 
It comes down to the resolution you want to play at.
At 4K the 4090 is worth it. At anything else it's pointless to spend the money.

4090s have been sold out in Canada for a while. I think the sales numbers speak for themselves.
People want a 4090.

I've tried AMD in the past and drivers were a major issue. Every generation there seem to be issues.

With many Intel and Nvidia systems I have had rock-solid performance. My system doesn't turn off for weeks and weeks on end: AAA gaming the whole time, no crashes, no issues, my hardware just works.
There's nothing worse than paying for top-tier hardware and, because of a crappy software interface, not being able to use it properly and consistently.
 
So, do you think that 20% more performance is worth almost a 100% price increase?
So if you ask me whether I can recommend buying a 4090... then my answer is a resounding NO! I can't do that as a PC enthusiast.
20%? What are you playing at, 1080p?

The 4090 is sold out; as a PC enthusiast it's the only option. Or you're using the term "PC enthusiast" improperly.
 
I don't understand why I have to keep explaining this. Ray tracing lowers the bar of entry, as you mentioned. However, it is the most heavily advertised feature in games. If the most advertised feature in a game is the one that requires less development, then the game probably doesn't have a lot else to offer.

Sure, but it's not ray tracing's fault that other areas of gameplay haven't improved. You're implying that adding ray tracing to a game makes the game worse. If you think the gameplay sucks, it would still suck without ray tracing.
 
You're implying that adding ray tracing to a game makes the game worse.
You say that as if I hadn't already tried to explain the connection between ray tracing and lazy development several times. If you feel that I implied adding RT to games makes them bad, then that is your fault for misinterpreting what I said.
 
So why haven't you compared the 7900 XTX with a 4080, since that's such a big part of your 'conclusion'? You knew before starting where the limitations, shortcomings and problems would show up, but didn't do anything about it. Did you not take it seriously, or did you have a lot of data you could reuse and rehash with little work? Or both?
 
So why haven't you compared the 7900 XTX with a 4080, since that's such a big part of your 'conclusion'?
What the fu*k?
There are numerous articles comparing the 4080 and the 7900 XTX.
What is wrong with you?
 
So why haven't you compared the 7900 XTX with a 4080, since that's such a big part of your 'conclusion'?
https://www.techspot.com/review/2746-amd-radeon-7900-xtx-vs-nvidia-geforce-rtx-4080/

"Bottom line, with a 15% discount, the Radeon 7900 XTX stands as a considerable alternative to the RTX 4080, but it's not an automatic choice. With the most affordable 4080s priced around $1,090, we believe the 7900 XTX should be around $900 to be the preferred choice, and the PowerColor Hellhound is nearly there. We hope this is useful during the holiday season, so you know how these GPUs need to be priced to get the best bang for your buck."
 
20%? What are you playing at, 1080p?

The 4090 is sold out; as a PC enthusiast it's the only option. Or you're using the term "PC enthusiast" improperly.

You were saying? Can't you even look at the numbers up above? My overclocked XFX XTX is 15% slower at 2160p than a 4090.

You are confusing "enthusiast" with "whale" tier.

My display is a $7,000 82-inch flagship 8K TV doing 4K 120Hz HDR and VRR with FreeSync Pro, if you want to talk seriously...

[Attached image: 2160p.png]
 
I don't think the 40 series is going to age well, especially now that ultrawide is slowly becoming standard. Nearly all high-end displays now are ultrawide and, it can be assumed, those with expensive displays are also going to have an expensive graphics card. Those ultrawide displays use a good bit more VRAM than 16:9 displays.
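For scale, here is a small sketch comparing raw pixel counts and a rough render-target footprint for common 16:9 and ultrawide resolutions. The 16 bytes-per-pixel figure is an illustrative assumption (a handful of full-resolution render targets), not a measurement from any particular engine:

# Pixel counts and rough render-target sizes for 16:9 vs ultrawide displays.
# BYTES_PER_PIXEL is an assumed figure for illustration only.
BYTES_PER_PIXEL = 16

resolutions = {
    "2560x1440 (16:9)": (2560, 1440),
    "3840x2160 (16:9)": (3840, 2160),
    "3440x1440 (21:9)": (3440, 1440),
    "5120x2160 (21:9)": (5120, 2160),
    "5120x1440 (32:9)": (5120, 1440),
}

for name, (w, h) in resolutions.items():
    megapixels = w * h / 1e6
    mib = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name}: {megapixels:5.1f} MP, ~{mib:4.0f} MiB of render targets")

The render targets are only part of the picture; textures and geometry scale with settings rather than resolution, but every per-pixel buffer does grow with the wider panels.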

I think the 7000 series is going to age better than the 40 series. People tolerated the 30 series' small amount of VRAM, but if you have a 3080 you're already being limited at the high end and don't even have access to all the DLSS features. Frankly, if you have a 30 series card you're better off using FSR 3.0.

I also think driver support for the 7000 series is only going to get better because AMD is putting RDNA3 in their APUs. I don't know why AMD drivers have been bad for RDNA3, but AMD is going to be manufacturing new products with RDNA3 in them for at least two more years.

On the positive side, AMD has better (unofficial) Linux drivers for those looking to escape Windows. Many of the problems AMD has had for a while have been fixed by the open-source community for several months now. It's actually really bizarre.
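If you're curious which kernel driver your card is actually bound to on a Linux install, a minimal sketch like this works on most systems; the card0 index is an assumption, and laptops or multi-GPU boxes may expose the dGPU as card1 instead:

# Report the kernel driver bound to the first DRM device (Linux only).
# Assumes the GPU is enumerated as card0; adjust if you have more than one.
import os

driver_link = "/sys/class/drm/card0/device/driver"
driver = os.path.basename(os.path.realpath(driver_link))
print(f"card0 is using the '{driver}' kernel driver")  # e.g. amdgpu, nouveau, nvidia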

But in conclusion, I think RDNA3 is going to age better than the 40 series: one, because of memory, and two, because of Windows. The 40 series has a lot of interesting tech in it, but anyone who thinks they might want to escape Windows almost has to go with an AMD graphics card. That said, if you want to buy a graphics card for Linux gaming, go with the 6000 series rather than the 7000 series if you can.

Erm, no.
Ultrawide will never become standard.
 
Well, I'm not an expert, nor do I own an AMD card, but when I read that the "experience can range from great to downright horrible" when playing a game, I think that's a garbage GPU. Maybe instead of only charts and numbers you could show some pictures or video to support such statements.
 
The only reason I can think of is price. Why else would you choose an XTX over a 4080? At the same price the 4080 is the obvious choice.
How is it obvious if the 4080 is the worst-selling card in Nvidia's history? They stopped making it months ago and they have enough inventory on hand to sell it through 2024 at a minimum...
 
It comes down to the resolution you want to play at.
At 4K the 4090 is worth it. At anything else it's pointless to spend the money.
It's not just the resolution... turn down a few settings to hit the frame rate you want with the 7900 XTX and, within five minutes of playing, most won't tell a difference in quality.
 
The cheapest 4090 I can find on Amazon is $2,400, while the 7900 XTX is at $960. Nvidia has been shipping everything to China for a while now. I hate that they artificially restrict supply all the time.
 
Yes and no. First off, people should only buy these cards for 4K or 1440p. The other thing is that at those resolutions 12GB of memory can make or break a game, but 16GB is still good... for now.

I'm worried about the longevity of the 4080 simply because I think we have two years, tops, of 16GB being good enough for the high end. I don't think this is a big deal on low-end cards, but when paying $1,000 for a card I should be able to expect it to perform well in next-gen games.

So I actually find the 4070 Ti Super with 16GB the more interesting card. We also don't know if Nvidia is going to make any more 4090s, so we might end up with the 4080 Super being the flagship until the 5090 comes out, allegedly this fall.
The 4090 struggles to hit 1080p60 in plenty of games due to poor optimization...
 
4090s have been sold out in Canada for a while. I think the sales numbers speak for themselves.
People want a 4090.
Nvidia has been artificially restricting the supply of 4090s. Real demand isn't known. Just looking at industry volume numbers, the 7900 XTX is much more in demand.
 
Agree with this; as UE5 gets more utilised and more developers get on board the current-gen train, I reckon 16GB will start to be a bottleneck at 4K.
16GB won't be a bottleneck at 4K anytime soon. Consoles are target #1, and the baseline console, the Series S, has 10GB of RAM TOTAL, which has to serve system duties as well as graphics duties. Yes, I know the Series X and PS5 have 16GB of RAM, but the Series S must be supported, so it becomes the baseline. Any game made will have to be compatible with it. That means your world design and character design have to fit on that muddy box first, then get upscaled on the more powerful consoles.

Even if a developer ignores the Xbox altogether, the PS5 only has 16GB, and historically a PC game running at max settings will use the same or less VRAM than consoles have in total RAM. Most developers will not put in the effort to make the PC version's graphics substantially superior to the console's.

Until the PS6/Xbox Series XXX come out, 16GB will be more than enough for 4K.
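A small sketch of that budget argument. The total-RAM numbers are the public console specs; the OS-reservation figures are rough assumptions just to show the shape of the math, not official platform numbers:

# "Console RAM is the ceiling" heuristic from the post, as rough numbers.
# os_reserved values are assumptions for illustration, not official figures.
consoles = {
    "Xbox Series S": {"total_gb": 10, "os_reserved_gb": 2.0},
    "PS5":           {"total_gb": 16, "os_reserved_gb": 3.5},
    "Xbox Series X": {"total_gb": 16, "os_reserved_gb": 2.5},
}

for name, mem in consoles.items():
    # On a console this single pool covers CPU-side game data AND graphics data,
    # which is why the post treats 16GB of dedicated PC VRAM as comfortable for now.
    budget = mem["total_gb"] - mem["os_reserved_gb"]
    print(f"{name}: ~{budget:.1f} GB shared budget for the entire game")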
 
The 4090 struggles to hit 1080p60 in plenty of games due to poor optimization...
There are two types of "optimization," and people are increasingly using the term the wrong way. When something was poorly optimized, it used to mean that the game's code was so CPU-intensive that the only way to increase FPS was to brute-force it.

Today, people say games are poorly optimized when they can't play at max settings. I'm here to inform you that you're perfectly capable of doing your own optimizations in the settings menu.
 
16GB won't be a bottleneck at 4K anytime soon.
So you're saying I shouldn't be able to find any existing games right now that will eat close to or exceed 16GB of VRAM on my 4090 at 4K? Or really anything above 10GB?

Because I'm pretty sure 3080 owners already run into "more than 10GB required" territory.
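If you want to check this on your own machine rather than argue about it, a minimal sketch that polls VRAM usage on an Nvidia card looks something like this (it assumes nvidia-smi is installed and on PATH, and reads only the first GPU it reports):

# Poll current VRAM usage on an Nvidia GPU a few times via nvidia-smi.
# Assumes nvidia-smi is available on PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

for _ in range(5):
    first_gpu = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    used_mib, total_mib = (int(x) for x in first_gpu.split(","))
    print(f"VRAM in use: {used_mib} MiB / {total_mib} MiB")
    time.sleep(2)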
 
Today, people say games are poorly optimized when they can't play at max settings. I'm here to inform you that you're perfectly capable of doing your own optimizations in the settings menu.
On the one hand, I do agree with your sentiment; I've spent the last three years running games at 1080p and occasionally lowering settings while I was keeping my 1080 Ti going. On the other hand, I also frequently see games that run at lower framerates than other games with significantly higher visual appeal. That is something I would call a case for optimization. A good example would be Ratchet & Clank: Rift Apart. I've witnessed scenes in that game with *insane* amounts of visual clutter on screen and a very good render distance still running at 45fps+ on max settings, while a similar-looking scene in Cyberpunk 2077 runs at 25 or less on the same hardware. And R&C is only 35GB, none of this "120GB install" nonsense.

Yes, sometimes you do just need to bump some settings down and accept that your hardware has limitations, but there are still plenty of times when studios are just too cheap to hire (and retain) qualified developers who can squeeze the most performance out of a given level of visual fidelity.
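The frame rates being compared here are easier to reason about as per-frame time budgets; a tiny sketch, using the numbers mentioned in this exchange plus the usual 60 and 120 fps targets:

# Convert frame rates into per-frame time budgets in milliseconds.
for fps in (25, 45, 60, 120):
    print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms per frame")

Going from 25fps to 45fps means finding roughly 18ms of savings every frame, which gives a sense of how large the gap being described really is.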
 
The RX 7900 XTX definitely loses head to head, but look at it this way: if you can get 60fps at 4K in all games by tweaking a couple of settings, and I believe you can, you're pretty much good to go. What more do you want?
The only game I have played that would require lowering settings is Cyberpunk.

Nvidia and their $2K GPU can eat a big one.
 