GeForce RTX 4070 Super vs. Radeon RX 7900 GRE: Rasterization, Ray Tracing, and Upscaling Performance

It might need to drop in price by another 5-10% (and even more in some regions with higher-than-normal prices, like where I live).
 
"as notable examples, show minimal visual difference with ray tracing enabled, primarily affecting the frame rate. This trend is common across most ray tracing-supported games, with few exceptions demonstrating a substantial improvement."

Finally, a better portrayal of the real world! The only two games where RT makes a relevant difference run properly, without gimmicks, only on the 4090. Mid-range GPUs shouldn't even support RT; it's a waste of die space.

Buy the cheapest one!
 
Good article, and I’m glad to see some Helldivers 2 benchmarks! Any chance we will see a deep dive on this title?

This is the first game I’ve bought that actually uses all the cores on my 12700K, so I’d love to see how much impact both the CPU and GPU have on this title.
 
If you're using a 7900 GRE or 4070 Super to play games at 1080p, you're doing it wrong.
 
So it's the ongoing, recurring theme: AMD offers more VRAM in general but struggles with ray tracing, compute performance is somewhat similar, and efficiency-wise AMD has mostly been more power hungry, it seems.

Maybe this is a thing of the past, but Nvidia cards just feel more fully fleshed out out of the box, as their drivers seem more stable. I've always been open-minded about both brands, but I find myself going the Nvidia route exclusively thus far.
 
Anything that delivers an average of 45 fps at maximum quality in the games that interest me today should last me about 4 years before falling below 30 fps in the games that will interest me then, and about 6 years if I lower the quality in some areas.
That's the rule I've built up from experience, and today DLSS/FSR/XeSS help that rule even more, unless the technology changes in a more radical way, as it did with the move to DirectX 11.
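To put rough numbers on that rule, here is a back-of-the-envelope sketch in Python. It assumes the slide relative to new games is roughly constant each year and uses only the 45 fps / 30 fps / 4-year figures above; the helper name and the 60 fps example are just illustrative, not from the article.

```python
import math

# Rule of thumb above: a card averaging 45 fps at max settings today falls below
# 30 fps in then-current games after ~4 years. Assuming a constant yearly factor
# for "performance relative to the games of the moment":
start_fps, floor_fps, years = 45.0, 30.0, 4.0
yearly_factor = (floor_fps / start_fps) ** (1.0 / years)  # ~0.90, i.e. roughly 10% lost per year

def years_until_floor(fps_today: float, fps_floor: float = 30.0) -> float:
    """Project how long until fps_today decays to fps_floor at the implied rate."""
    return math.log(fps_floor / fps_today) / math.log(yearly_factor)

print(f"Implied decline: ~{(1 - yearly_factor) * 100:.0f}% per year")
print(f"A 60 fps card today -> below 30 fps in ~{years_until_floor(60):.1f} years")
```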
 
I don't know how much faster third-party 4070 Supers get, but the 7900 GRE can now show significant performance gains since its overclocking limits were raised, and you can easily get another 15%+ over the gimped standard card. Taking that into account pushes favour to AMD for sure, IMO.
 
AMD are a generation behind on ray tracing (as expected, given Nvidia was pushing it back when ray tracing was only available in demos). So while "struggling", they still match the previous Nvidia generation if you look at the actual head-to-heads (x900 cards vs. x080 Nvidia cards, and so on down the stack). So each time someone says AMD sucks at ray tracing, they're also saying anything except the current 4000 series sucks at ray tracing.

DLSS isn't a bonus for most Nvidia cards either; it's pretty much essential except for the top two tiers (and essential even for those if you add path tracing). Because AMD wasn't focusing on ray tracing, the need for FSR was never as urgent as it was for DLSS, so FSR 3 is still behind the latest DLSS, but it's not objectively bad, and FSR is a lifeline for Pascal owners.

Whilst AMD may be behind in RT in the latest games, the better rasterisation performance carries over to a much larger number of titles (RT is available in less than 2% of the games in my Steam library), so for my use case RT and DLSS are not primary purchasing concerns.
 
We need a simple two-test benchmark. The first test shows the most beautiful, most advanced graphics scene with all the latest technologies. The second shows a fairly light scene and imitates a shooter.
The first test tells you how comfortable a GPU is with the best-looking games; the second tells you how many fps a GPU can produce in a typical competitive game.
This is because most people don't typically crank up graphics in a shooter; the extra frames are worth sacrificing the best picture for. A lot of good BF players I sometimes watch on Twitch do exactly this.
 
The thing is, the reason DLSS is essential for Nvidia cards below the 4080 is twofold:
That devs are awful these days.
That Nvidia goes short on VRAM.

For the 4080 and up, well, as I've mentioned elsewhere, a solid comparison of Nvidia's pricing against AMD's peers, and between models within one Nvidia tier, is very telling. For almost all of last year the 4080 ran £1,350-2,100 low to high, and the 4090 ran £1,650-2,800. The 7900 XTX ran £900-1,150, with mine, a premium Sapphire Nitro+ model with features to match the higher end of the others, at £1,070.

I guess RT and upscaling that are only 0.5-1 generation ahead are expensive. Though to be honest, I'd expect the 2-3x price of a 4090 to account for its +30 fps and lossless RT (still needs DLSS), even if the same can't be said for the 4080 at double the price. I'm sure RT and DLSS are nice, just not that much nicer; Nvidia are tripping otherwise. For those kinds of differences I could get a card fit for 3440x1440 and 4K gaming plus at least a brand-new 7800X3D/mobo/RAM upgrade, or another 1440p+ capable card, or more.

Other than that, similar pros re AMD and similar usage priorities to yours. Given the potentially massive price-to-feature disparity, AMD absolutely win on VRAM capacity and raster, still the most important things for running any game, especially when implementation vs. outcome varies so much for RT and upscalers, never mind them still being pretty thin on the ground. I have plenty of long-term games that have neither or only one of them, but will happily use 16+ GB of VRAM well before 4K.
 
AMD is on the right path IMHO; it needs to fine-tune some features, like its video encoder to rival NVENC, and improve perf/watt. That's something AMD did well with the Ryzen line against Intel, but beating Nvidia is far different from beating Intel (who got lazy). Nvidia is well aware of what AMD could do with the next tech jump, which is why Nvidia will make sure the 50xx series stakes out a solid position and keeps AMD on the previous gen.
 
This is how reviews should look in the near future:

If fps ≥ 60 = Playable

Game Title 1 (with Max Settings) //this is what we really care about as gamers!
1080p - Playable
1440p - Playable
4K - Playable

Game Title 1 (with RT Settings) //because we want to support NV
1080p - Playable
1440p - Playable for NV only
4K - Not Playable

Game Title 1 (with RT + Upscale Settings) //because we want to give hope to some gamers
1080p - Playable
1440p - Playable for all
4K - Not Playable


As long as a game can stay above 60 fps at the highest resolution our monitor supports, we're happy.
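A minimal sketch of how that verdict table could be generated: the 60 fps cutoff is the one above, while the game entries and fps numbers are made-up placeholders, not real benchmark results.

```python
# Hypothetical helper and placeholder data, purely to illustrate the proposed layout.
def verdict(avg_fps: float) -> str:
    return "Playable" if avg_fps >= 60 else "Not Playable"

results = {
    "Game Title 1 (Max Settings)":   {"1080p": 110, "1440p": 82, "4K": 48},
    "Game Title 1 (RT)":             {"1080p": 71,  "1440p": 55, "4K": 33},
    "Game Title 1 (RT + Upscaling)": {"1080p": 95,  "1440p": 68, "4K": 44},
}

for title, per_resolution in results.items():
    print(title)
    for resolution, fps in per_resolution.items():
        print(f"  {resolution}: {verdict(fps)} ({fps} fps avg)")
```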
 