Nvidia GeForce RTX 4090 vs. AMD Radeon RX 7900 XTX: Is the GeForce Premium Worth It?

My biggest issue with the AMD cards right now is Blender performance. It has improved, but they need to do a lot more to match Nvidia there. The 7900 XTX pretty much just matches or loses to the 4070 in Blender 4.0 in most rendering scenarios, provided you don't hit a VRAM limit.

That being said, for gaming there are many reasons why somebody would go for AMD GPUs.
 
I don't see any modern game worth spending $2k to play. I also see ray tracing as an excuse for lazy development. For anything involving compute, well, you can buy two 7900 XTX cards for the cost of a single 4090.

Then you have my biggest issue with it: I'm not paying $2k for something that won't have access to the latest tech. If I had bought a 3090 I'd be furious that I didn't get the latest DLSS features. They did it to the 20 series, too, so I have every reason to believe they will do it to the 40 series.
 
I also see ray tracing as an excuse for lazy development.
How do you come to that conclusion? What makes current games so expensive to produce is the sheer amount of time spent getting lighting and art looking good and correct; ray tracing massively speeds this up and can look even better when done properly. It's not lazy to find a better way of doing something.

Digital Foundry got 4A Games to send over some footage of them developing Metro Exodus and did a comparison between building a scene with traditional lighting systems versus ray tracing, and we're talking about a big difference here: 5 minutes with ray tracing versus an hour to light the exact same scene.
 
How do you come to that conclusion?
Because games that heavily advertise their ray tracing usually have very little to offer in terms of story or gameplay. Also, making games easier to make also makes it easier for companies to pump out overpriced garbage, something that everyone from indie to AAA devs has shown they can and will do. And the cost of games has recently gone up to $70, so I don't see why I should care if ray tracing reduces development costs.
 
Because games that heavily advertise their ray tracing usually have very little to offer in terms of story or gameplay. Also, making games easier to make also makes it easier for companies to pump out overpriced garbage, something that everyone from indie to AAA devs has shown they can and will do. And the cost of games has recently gone up to $70, so I don't see why I should care if ray tracing reduces development costs.
Then do what normal people do and don't buy the crap games?

Ray tracing has nothing to do with how good a game actually is; it's just a lighting technology that's better than the old way of doing things. Have a watch of this to see what ray tracing really enables for developers, because it's actually a really good technology:
I'm not saying buy all the crap games that come out; I'm simply confused how you've correlated rubbish games with a lighting technology.
I love the original Unreal Tournament, but not for its lighting technology.
I hate (modern) Call of Duty, but not for its lighting technology.

Same with development costs: I don't care how much a game costs to make; whether the game is any good is what will decide if I buy it or not.
 
The comparison may have come about by tortured logic but the outcome seems familiar nonetheless.
There are no technical reasons for choosing Radeon, just like at any other price point. Lower cost remains the only objective reason why anyone would want one.
 
RDNA 4 with hardware BVH for RT can't come soon enough.

Dress it up any way you want, but RDNA 3 is a dumpster fire for RT.
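
For anyone wondering what "hardware BVH" actually buys you: ray tracing spends most of its time walking a bounding volume hierarchy, testing the ray against box after box before it ever touches a triangle. Here's a minimal sketch (my own illustration, not any vendor's implementation) of the per-node slab test that gets run millions of times per frame; it's exactly the kind of work you want in dedicated hardware rather than emulated in shader code:

```python
# Ray vs. axis-aligned bounding box (slab test): the core operation of BVH traversal.
# Purely illustrative; real GPUs run tests like this (and ray/triangle tests) in hardware units.

def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """True if a ray origin + t*dir (inv_dir = 1/dir per axis) intersects the box."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# Example: a ray heading along (1, 1, 1) from the origin against a box around (5, 5, 5).
print(ray_hits_aabb((0, 0, 0), (1.0, 1.0, 1.0), (4, 4, 4), (6, 6, 6)))  # True
```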
 
Let's revisit the topic at the end of the month, with the 4080 Super rumored to be 10% more powerful than the 4080 and 20% cheaper at $999. That should cause a shakeup in the pricing structure of the current lineup. 7900 XTX at $899? Or $849, depending on where the 4070 Ti Super 🤪 falls in terms of price/performance as well.
This is all probably going to be irrelevant in a month or two.
 
My biggest issue with the AMD cards right now is Blender performance. It has improved, but they need to do a lot more to match Nvidia there. The 7900 XTX pretty much just matches or loses to the 4070 in Blender 4.0 in most rendering scenarios, provided you don't hit a VRAM limit.

That being said, for gaming there are many reasons why somebody would go for AMD GPUs.


Blender quads or solos...? (*smirk)

Illustrative^ that people do not buy the RTX 4090 solely for gaming, but for CUDA and work, because the RTX 4090 at $1,900 is still cheaper than a $3k pro card. Perhaps that is why the 4090 has sold more than the 4080...? (Because most people buying them are doing so for their workflow.)

Perhaps that is why Nvidia is coming out with a bunch of rebranded 4080s to entice non-CUDA users to purchase the 4080 for gaming...!

Radeon 7900 XTX = $949
Nvidia RTX 4090 = $1,900
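
Quick sanity check on the "two XTXs for one 4090" point from earlier in the thread, using those street prices:

```python
# Street prices quoted above (region-dependent, obviously).
xtx = 949
rtx_4090 = 1900

print(f"Two 7900 XTX cards: ${2 * xtx}")   # $1898
print(f"One RTX 4090: ${rtx_4090}")        # $1900
print(f"4090 premium over one XTX: {rtx_4090 / xtx:.2f}x")  # ~2.00x
```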


It's 2024 and the XTX is on its way out... replacing it is a $700 single chip.
 
Let's revisit the topic at the end of the month, with the 4080 Super rumored to be 10% more powerful than the 4080 and 20% cheaper at $999. That should cause a shakeup in the pricing structure of the current lineup. 7900 XTX at $899? Or $849, depending on where the 4070 Ti Super 🤪 falls in terms of price/performance as well.
This is all probably going to be irrelevant in a month or two.
Considering that the 4070 Ti Super is going to be 16GB, I think it's going to be what most people go for. Also, keep in mind these are the "base prices" for these cards. We're going to see a $100 premium from all of the board partner cards because Nvidia doesn't leave much in the way of profit margin for anyone.
 
Instead of price, I like to look at perf/$. Something could be cheaper and not be worth buying. Where I live the 4080 is about 200 euros more expensive.
Well, the 4080 and the 7900 XTX have very similar raster performance, so whichever is cheaper will have the better perf/$.
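
For what it's worth, perf/$ is trivial to work out yourself from any review's average-FPS chart and local pricing. A quick sketch; the FPS and euro figures below are made-up placeholders, so plug in numbers from whatever review and shop you trust:

```python
# Performance per euro from average FPS and local price.
# All figures are placeholders for illustration; substitute real review data and prices.
cards = {
    "RTX 4080":    {"avg_fps": 100.0, "price_eur": 1200.0},
    "RX 7900 XTX": {"avg_fps": 100.0, "price_eur": 1000.0},
}

for name, d in cards.items():
    print(f"{name}: {d['avg_fps'] / d['price_eur']:.3f} FPS per euro")
```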
 
Well, the 4080 and the 7900 XTX have very similar raster performance, so whichever is cheaper will have the better perf/$.
Yes and no. First off, people should only buy these cards for 4K or 1440p. The other thing is that at those resolutions 12GB of memory can make or break a game, but 16GB is still good... for now.

I'm worried about the longevity of the 4080 simply because I think we have two years tops of 16GB being good enough for the high end. I don't think this is a big deal on low-end cards, but when paying $1,000 for a card I should be able to expect it to perform well in next-gen games.

So I actually find the 4070 Ti Super with 16GB the more interesting card. We also don't know if Nvidia is going to make any more 4090s, so we might end up with the 4080 Super being the flagship until the 5090 comes out, allegedly this fall.
 
Yes and no. First off, people should only buy these cards for 4K or 1440p. The other thing is that at those resolutions 12GB of memory can make or break a game, but 16GB is still good... for now.
Agree with this. As UE5 gets more utilised and more developers get on board the current-gen train, I reckon 16GB will start to be a bottleneck at 4K.
 
I wish you had read your own conclusions BEFORE embarking on this ridiculous test! Tell us something we don't know... or compare apples to apples.

"If you're willing to spend $1,600 – or right now it would have to be $2,000 – to acquire the GeForce RTX 4090, you're unlikely to even consider the 7900 XTX. Likewise, if you're considering the 7900 XTX, it's extremely unlikely that you'd be choosing between it and the RTX 4090. Instead, you'd be more likely considering possibly stepping up to the RTX 4080.

So in a sense, it's a bit of a pointless comparison, but we knew this going into it. Still, it's nice to have an updated look at AMD and Nvidia's best offerings."
 
Agree with this. As UE5 gets more utilised and more developers get on board the current-gen train, I reckon 16GB will start to be a bottleneck at 4K.
I don't think the 40 series is going to age well, especially now that ultrawide is slowly becoming standard. Nearly all high-end displays now are ultrawide and, it can be assumed, those with expensive displays are also going to have an expensive graphics card. Those ultrawide displays use a good bit more VRAM than 16:9 displays.
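
To put rough numbers on that (render-target pixel counts only; actual VRAM use depends far more on textures and the engine):

```python
# Pixel counts for common resolutions, relative to 16:9 1440p.
# Render-target size only; total VRAM use is dominated by textures and assets.
resolutions = {
    "2560x1440 (16:9 1440p)": (2560, 1440),
    "3440x1440 (ultrawide)":  (3440, 1440),
    "3840x2160 (16:9 4K)":    (3840, 2160),
}

base = 2560 * 1440
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 1440p")
```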

I think the 7000 series is going to age better than the 40 series. People tolerated the 30 series' small amount of VRAM, but if you have a 3080 you're already being limited at the high end and don't even have access to all the DLSS features. Frankly, if you have a 30 series card you're better off using FSR 3.0.

I also think driver support for the 7000 series is only going to get better because AMD is putting RDNA3 in their APUs. I don't know why AMD drivers have been bad for RDNA3, but AMD is going to be manufacturing new products with RDNA3 in them for at least two more years.

On the positive side, AMD has better (unofficial) Linux drivers for those looking to escape Windows. Many of the problems AMD has had for a while have been fixed by the open-source community for several months now. It's actually really bizarre.

But in conclusion, I think RDNA3 is going to age better than the 40 series: one, because of memory, and two, because of Windows. The 40 series has a lot of interesting tech in it, but anyone who thinks they might want to escape Windows almost has to go with an AMD graphics card. That said, if you want to buy a graphics card for Linux gaming, go with the 6000 series, not the 7000 series, if you can.
 
How many hours did this testing take? I'm in awe of the amount of work that goes into these tests!
 
Because games that heavily advertise their ray tracing usually have very little to offer in terms of story or gameplay. Also, making games easier to make also makes it easier for companies to pump out overpriced garbage, something that everyone from indie to AAA devs has shown they can and will do. And the cost of games has recently gone up to $70, so I don't see why I should care if ray tracing reduces development costs.

Well, this makes absolutely no sense. If your problem is that games are overpriced, then ray tracing can only help. Continuing to use expensive lighting hacks that require tons of artist and developer time will not get you cheaper games.

 
Well, this makes absolutely no sense. If your problem is that games are overpriced, then ray tracing can only help. Continuing to use expensive lighting hacks that require tons of artist and developer time will not get you cheaper games.
I don't understand why I have to keep explaining this. Ray tracing lowers the bar to entry, as you mentioned. However, it is the most heavily advertised feature in games. If the most advertised feature in a game is the one that requires the least development, then the game probably doesn't have a lot to offer.
 
"upscaling" should really be called "downscaling and blurry"; such a classic Emperor's New Clothes; it is alway so blurred vs just using native. I guess for the fake looking ray tracing crap to be useable it has to be pushed for games to attain high FPS via this de-resolution by another name.

I would say that a 7900 XT is a far more sensible option, one that doesn't require what seems to be the ticking time bomb that is the Melter Connector 1000 used by Nvidia on their top-end 4000 cards.

My 7900 XT and 3080 Ti handle 4K just fine, and the games that suck on them do so because of the engines involved, like ArmA 3 or X-Plane 12.

And the 7900 XT is cheaper than either the 7900 XTX or the RTX 4090.
 