Doom: The Dark Ages, 36 GPU Benchmark

This is silly, because you're drawing definitive conclusions from the first available round of drivers.
Read my edit on my last reply.

Even based on previous testing, the 9070XT is roughly 16% slower at 4K than the 5080 while being at least 35-40% cheaper, if not more. So the 5080 is a very underwhelming card. Awfully so.
Again, nothing silly, just current reality.
 
Sure, if you pretend that it's 2010 and ignore ray tracing, upscaling quality, and other features. I couldn't sell my 7900XTX fast enough to snag a 4080 Super.
 
I think you are.
I understand now why you argue endlessly for Nvidia. Enjoy your 4080 Super. It has a slightly better price/quality ratio than the 5080. Again, reality.
 
How a mid-range 70-class card can beat most of Nvidia's stack is beyond me.

If Nvidia doesn't do something, next gen they might be in trouble.
The 50 series has been a mess from the start; it seems they didn't put much effort into its design and validation (whereas Ada was at least a significant improvement over Ampere). I guess it's one consequence of using pretty much the same manufacturing node as the previous gen, and they were unable to repeat the engineering magic of going from Kepler to Maxwell on the same 28nm node.
 
Can someone clarify this for me: is RT on or off?
Because if the GPUs are tested with some crucial features off, it makes the whole test void.
Otherwise, team AMD looks stronk.
 
TechPowerUp and ComputerBase used the 576.40 drivers, which improve RTX 50 performance but only bring it level with RTX 40. If I remember correctly, in both benchmarks the 9070 cards still punched a whole class above their weight.
Got it. Thanks for the info.
 
(Edit)
Even if we discount this Doom benchmark and take the average 4K results from the 5060Ti test, the 9070XT is 15.66% slower than the 5080 while costing at least 35-40% less. So yes, the 5080 is a terrible card no matter how you slice or dice it.
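For what it's worth, the ratio math here is easy to sanity-check. A quick sketch; the absolute prices below are made-up placeholders (5080 normalised to $1000), and only the 15.66% and ~35-40% figures come from the posts above:

```python
# Sanity-check of the price/performance claim above.
# All absolute numbers are illustrative placeholders, not measured data.

def value_ratio(perf_a, price_a, perf_b, price_b):
    """How much performance-per-dollar card A delivers relative to card B."""
    return (perf_a / price_a) / (perf_b / price_b)

# Claim: the 9070XT is 15.66% slower than the 5080 at 4K
# while costing roughly 35-40% less (midpoint ~37.5% used here).
perf_5080, price_5080 = 100.0, 1000.0    # hypothetical baseline
perf_9070xt = perf_5080 * (1 - 0.1566)   # 15.66% slower
price_9070xt = price_5080 * (1 - 0.375)  # ~37.5% cheaper

print(round(value_ratio(perf_9070xt, price_9070xt, perf_5080, price_5080), 2))
# -> 1.35: ~35% more performance per dollar for the 9070XT under these numbers
```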
Well, using your methodology, AMD cards are crap as well. The 9070XT is twice the price of the 5060Ti while it's only 40% faster. :)

No matter how you slice it, it's a terrible card, right? I mean, going by HUB's review, the 9070XT is very similar, almost identical, to last year's 4070Ti S, and it costs the same money the 4070Ti S cost last year. So yeah, great card: a year later for the same performance at the same price.
 
That's why we have quality settings in the main menu.
And your point? My point is that 8GB of VRAM isn’t enough for modern games to use 4K textures.

With DLSS, modern GPUs seem pretty capable of half-decent 4K performance, but you can’t load up the high-res textures because 8GB of VRAM simply isn’t enough.

What’s your point about settings?
 
I agree, 8GB isn't enough for 4K textures. So what? How does that stop devs from adding 4K textures to games? Can't you just lower the settings if you have less VRAM?
 
Then what’s the point of the GPU being capable of it and costing £350+ but pairing it with insufficient VRAM?

Might as well halve the GPU performance (and lower the price to £200 or less) and leave it at 8GB, saving a ton of money for everyone; then, as you say, it wouldn’t matter that it can’t run 4K textures, as the GPU wouldn’t be capable of it anyway.

The problem is that the GPU itself is capable, but pairing it with 8GB of VRAM to save a small amount on build cost stops you, the consumer, from even using all the GPU you’ve paid for.
 
Well, using your methodology, AMD cards are crap as well. The 9070XT is twice the price of the 5060Ti while it's only 40% faster. :)

No matter how you slice it, it's a terrible card, right? I mean, going by HUB's review, the 9070XT is very similar, almost identical, to last year's 4070Ti S, and it costs the same money the 4070Ti S cost last year. So yeah, great card: a year later for the same performance at the same price.
Actually, judging by the prices of the cards I can go out and buy in Canada, the 5060Ti 16GB is roughly 33% cheaper while being roughly 40% slower than the 9070XT.
So…
The 5060Ti 16GB or 5070Ti are OK-ish cards for the money. The 5080 is not, at least not from a gamer's POV.
Again, this is reality.
 
Then what’s the point of the GPU being capable of it and costing £350+ but pairing it with insufficient VRAM?

Might as well halve the GPU performance (and lower the price to £200 or less) and leave it at 8GB, saving a ton of money for everyone; then, as you say, it wouldn’t matter that it can’t run 4K textures, as the GPU wouldn’t be capable of it anyway.

The problem is that the GPU itself is capable, but pairing it with 8GB of VRAM to save a small amount on build cost stops you, the consumer, from even using all the GPU you’ve paid for.
What does "insufficient VRAM" mean? The card plays the majority of games at high or ultra settings. In the few where it doesn't, you just drop the settings. You know, just like I do on my 4090, or any other card for that matter.

But the discussion was about 8GB holding back texture quality, which is not true.
 
Actually, judging by the prices of the cards I can go out and buy in Canada, the 5060Ti 16GB is roughly 33% cheaper while being roughly 40% slower than the 9070XT.
So…
The 5060Ti 16GB or 5070Ti are OK-ish cards for the money. The 5080 is not, at least not from a gamer's POV.
Again, this is reality.
You judge cards based on your local pricing?
 
Would I care about the prices in Timbuktu? (No disrespect to the proud country of Mali intended.)
Of course I judge everything, not only cards, on what I can get my hands on where I live.

Therefore my initial argument still stands. The 5080 represents a poor price/performance ratio for gamers. Most likely in the whole of North America, certainly in Canada.
 
When it comes to a buying decision, of course you care about your local prices. But declaring a product crap because your local pricing is kinda whack is disingenuous. It's not an objective way to judge a product.
 
It is very objective when the local pricing structure is pretty much the same everywhere, as it is, at least with GPUs, but that’s beside the point. It is VERY objective from a consumer POV.
In any case, prices pulled from Micro Center show the 5060Ti 16GB being roughly 27% cheaper than the 9070XT, while the 9070XT is at least 52% cheaper than the 5080.
So Nvidia’s US prices are even whackier than they are in Canada.
Therefore, my argument still stands: the 5080 is a terrible price/performance ratio card for gamers, at least in North America.
 
Well, in the EU the 5060Ti 16GB is 429 euros and the 9070XT is 749. That's 75% more expensive. Does that mean that the 9070XT is both great (cause it's cheap in your region) and crap (cause it's expensive in mine)?
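The 75% figure checks out, by the way; a one-liner using the euro prices quoted above:

```python
def pct_more_expensive(price_a, price_b):
    """Percentage by which price_a exceeds price_b."""
    return (price_a / price_b - 1) * 100

# EU prices quoted above: 9070XT at 749 euros vs 5060Ti 16GB at 429 euros
print(round(pct_more_expensive(749, 429), 1))  # -> 74.6, i.e. roughly 75% more
```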
 
You finally got it, at least partially: we were talking about performance/price ratios, which are… well… dependent on the price a card sells for locally. The cards are what they are, that does not change, but, depending on the price they sell for, they can be a good or a poor choice. For. The. Money.

Nvidia is currently dumping their 5060Ti stock at under MSRP in Europe, which makes them a good, perhaps even great buy. They are a somewhat decent buy in North America as well.

However I was talking about the 5080 and that in Europe is at least 49% more expensive than the 9070XT.

So my initial argument not only still stands, but you helped expand it. Thank you! The 5080 is a poor performance/price ratio card from a gamer’s POV in North America AND Europe.

That will never change, no matter how hard you try to work the 5060Ti in a 5080 vs 9070XT argument. That CAN change if, and ONLY if, Nvidia decides to reliably supply the 5080 under MSRP at perhaps around $900.

We could restart this conversation if that happens, OK?

Cheers!
 
Well, I don't think that price/performance is that simple. For example, the 5080 is ~30% faster (35% at 4K) in RT (TPU test, btw), and even more so in PT games, to the point that the 9070XT is closer to a 5060Ti in path tracing while the 5080 is twice as fast. So if I'm playing Alan Wake PT, Cyberpunk PT, Witcher 3 remake, etc., the 5080 isn't as bad a value as you make it out to be, and the 9070XT is terrible, barely matching the 5060Ti.
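That workload point can be made concrete by weighting per-workload performance by how much of your play time each workload gets. The relative figures below are just the rough ratios claimed in this post (5080 = 100), not benchmark data:

```python
# Workload-weighted performance: value depends on what you actually play.
# Relative figures are the rough ratios claimed above, not benchmark results.
perf = {
    "5080":   {"raster": 100, "rt": 100, "pt": 100},
    "9070xt": {"raster": 84,  "rt": 74,  "pt": 50},  # ~half the 5080 in PT
}

def weighted_perf(card, mix):
    """Average a card's performance over a workload mix (weights sum to 1)."""
    return sum(perf[card][w] * share for w, share in mix.items())

raster_heavy = {"raster": 0.9, "rt": 0.1, "pt": 0.0}
pt_heavy     = {"raster": 0.2, "rt": 0.3, "pt": 0.5}

for name, mix in [("raster-heavy", raster_heavy), ("PT-heavy", pt_heavy)]:
    ratio = weighted_perf("9070xt", mix) / weighted_perf("5080", mix)
    print(f"{name}: 9070XT at {ratio:.0%} of a 5080")
# raster-heavy -> 83%, PT-heavy -> 64%
```

Under a raster-heavy mix the gap stays modest; under a PT-heavy mix it widens sharply, which is exactly the disagreement here.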
 