Doom: The Dark Ages, 36 GPU Benchmark

I find the reviewer's sentiment on 8GB amusing and hyperbolic. I don't think shipping them is killing PC gaming. I mean, what games can you not play if you have an 8GB card?

I definitely wouldn't buy an 8GB card myself, but to claim this will kill the industry is a bit far. There are many, many users out there who don't need 8GB of VRAM.
 
Between 2 and 6 fps at 4K on the RTX 4060 is disgraceful! The game engine must have very poor optimization.

P.S. Somehow, you guys keep missing my video card, the RTX 3080 Ti (12GB).
 
> I find the reviewer's sentiment on 8GB amusing and hyperbolic. I don't think shipping them is killing PC gaming. I mean, what games can you not play if you have an 8GB card?
>
> I definitely wouldn't buy an 8GB card myself, but to claim this will kill the industry is a bit far. There are many, many users out there who don't need 8GB of VRAM.
Flip side: how many games are being held back by our lowest-common-denominator requirements? See also: Space Marine 2 with its 4K texture pack. It looks way better, and it wasn't in the original release because the game had to run on 8GB cards. Once it's installed, 8GB is totally unusable. The reviewer you're referencing points this out in plain English.

The silly 8GB baseline is absolutely holding back game development, just like the Series S. We can't push tech forward if we have to account for new hardware stuck with VRAM limitations.

Maybe once consoles come with 32GB of RAM, we can move past 8GB GPUs? 64GB?
 
> Flip side: how many games are being held back by our lowest-common-denominator requirements? See also: Space Marine 2 with its 4K texture pack. It looks way better, and it wasn't in the original release because the game had to run on 8GB cards. Once it's installed, 8GB is totally unusable. The reviewer you're referencing points this out in plain English.
>
> The silly 8GB baseline is absolutely holding back game development, just like the Series S. We can't push tech forward if we have to account for new hardware stuck with VRAM limitations.
>
> Maybe once consoles come with 32GB of RAM, we can move past 8GB GPUs? 64GB?
I have more of a problem with the claim that 8GB cards are harming the industry. I firmly believe that having cheaper 8GB variants of cards is actually a good thing for PC gaming: it provides cheaper access to gaming, whether through budget or second-hand cards. And that 8GB baseline isn't going away anytime soon. You act like if Nvidia had released only 12GB-or-higher cards this year, game devs would stop optimising for 8GB. On the Steam survey, 34% have 8GB cards, and an even higher percentage have less than that. Games will be optimised for 8GB for years to come.

And you don't have to go too far back in time to find this very website praising and recommending budget cards with 8GB on them. Look at the best graphics cards of 2024 article from last year; there are a few 8GB cards in there. So maybe they shouldn't have been doing that if they were so concerned about the baseline? How many games have launched since then that don't run on those cards?
 
The reviewers mention that the game seems to be well optimized, which makes my next sentence all the more surprising.

The RTX 5080 is a new high-performance card, generally only bettered by the RTX 5090. But in this test, the lower-specced but very reasonable AMD 9070 XT beats it across the board.

I find that incredible. It shows that NV have really screwed up Blackwell in many ways. In this case I suspect it's the typical, as-of-late, NV drivers. Even if the game were optimized for AMD, which there is no mention of, the 5080 should still be better pretty much across the board.

NV really need to get their act together. I'm very glad I didn't blow loads of cash on a very sloppy Blackwell release. We all know about the awful power distribution over the connectors melting them, and a few other things too. But it extends to the drivers as well. People over on the official NV forums are fuming about it, as they should be.

NV should drop out of the gamer GPU market and focus on their big cash earner, AI. They clearly just couldn't be bothered with Blackwell, and/or the entire team of devs for it and its drivers are interns.
 
> I mean, what games can you not play if you have an 8GB card?
It's more like: what games COULD we play if most of us had more than 8GB cards? Ultra-quality texture packs, for example, could be built into games by default. Just look at how much better SM2 with the texture pack looks if you have the VRAM to run it enhanced.

It's not about lazy developers; it's about better games. Textures eat up a lot of space. Locally run AI for better NPC interactions or generative textures eats up VRAM like nothing. And let's not forget how much more you could do for particles, shading, etc. Even the mod scene greatly benefits from more VRAM.
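As a back-of-the-envelope illustration of how quickly textures add up (assuming RGBA8 at 4 bytes per texel, BC7 at 1 byte per texel, and roughly a third extra for the mip chain; generic numbers, not figures from this game):

```python
# Rough VRAM cost of a single 4K texture, uncompressed vs block-compressed.
# Assumptions: RGBA8 = 4 bytes/texel, BC7 = 1 byte/texel, full mip chain adds ~1/3.

def texture_vram_mb(width, height, bytes_per_texel, mipmaps=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base  # mip chain is ~1/3 of the base level
    return total / (1024 ** 2)

print(f"4096x4096 RGBA8: {texture_vram_mb(4096, 4096, 4):.0f} MB")  # ~85 MB
print(f"4096x4096 BC7:   {texture_vram_mb(4096, 4096, 1):.0f} MB")  # ~21 MB
```

Even compressed, a few hundred unique 4K materials add up to several gigabytes before the framebuffer, geometry, and ray tracing structures are counted.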

IMHO, if you truly love games and technological progression in gaming, you should overcome the "8GB is enough" narrative, at least in 2025 and going forward.
 
What an insane comment section: people actually sticking up for 8GB cards.

You can pick up an entire console for less money than some of these 8GB cards; it's a complete self-contained system and comes with 16GB of unified memory.

Intel don't even sell a GPU with less than 10GB, and mobile phones have more memory to hand these days.

If these 8GB cards were less than £200, sure (preferably closer to the £150 mark), but they aren't; they're double that.

Their pricing is indefensible. Seeing how much better a 4060 Ti or 5060 Ti with 16GB does goes to show how much of the expensive bit (the actual GPU) is wasted when paired with such a small amount of VRAM.

What a waste of money modern GPUs have become.
 
> I have more of a problem with the claim that 8GB cards are harming the industry. I firmly believe that having cheaper 8GB variants of cards is actually a good thing for PC gaming: it provides cheaper access to gaming, whether through budget or second-hand cards. And that 8GB baseline isn't going away anytime soon. You act like if Nvidia had released only 12GB-or-higher cards this year, game devs would stop optimising for 8GB. On the Steam survey, 34% have 8GB cards, and an even higher percentage have less than that. Games will be optimised for 8GB for years to come.

You're SO close to getting it. No, it would not happen immediately, but it WILL happen once 8GB cards get phased out.

That can't happen if we keep releasing generations with 8GB cards.

> And you don't have to go too far back in time to find this very website praising and recommending budget cards with 8GB on them. Look at the best graphics cards of 2024 article from last year; there are a few 8GB cards in there. So maybe they shouldn't have been doing that if they were so concerned about the baseline? How many games have launched since then that don't run on those cards?

How much better would the games we get be if they didn't have to account for a VRAM capacity that launched 10 years ago? The RX 480 and 5060 Ti both have 8GB variants, and the 5060 Ti is leagues more powerful. We can demonstrate today that the 8GB 5060 Ti is severely constrained in scenarios where the 16GB card runs fine. At what point is it OK to drop 8GB?

This didn't happen with 2GB cards, or 512MB cards. But for some reason people get REAL hung up on 8GB GPUs.
 
> Between 2 and 6 fps at 4K on the RTX 4060 is disgraceful! The game engine must have very poor optimization.
>
> P.S. Somehow, you guys keep missing my video card, the RTX 3080 Ti (12GB).

Most places aren't benchmarking the 3080 12GB or the 3080 Ti, for a few reasons:
1) The 3080 12GB isn't as widespread as the 3080 10GB or the 3080 Ti, and samples weren't handed out to very many testers.
2) The 3080 Ti's performance is so close to the 3090's that it's rather pointless to spend time benching both.
3) The 3080 12GB is about 5% faster than a 3080 10GB (depending on resolution) and is only about 2% behind the 3080 Ti. The 3080 Ti, in turn, is about 2% behind the 3090.

Unless the game being benched has cards with 12GB or less struggling, just look at the performance the 3090 gives and that's where your 3080 Ti is going to land (minus a couple of percent). That's what I do; I've got a 3080 Ti as well, and I just look at the 3090.
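A quick sketch of that rule of thumb, scaling a 3090 result down by the rough gaps quoted above (the 3090 fps value below is a made-up example, not a number from this benchmark):

```python
# Estimate slower RTX 30-series cards from a published 3090 result,
# using the approximate gaps mentioned above (~2% per step).

def scale_down(fps_3090: float, gap: float) -> float:
    """Return an estimated fps as the 3090 figure reduced by a fixed fraction."""
    return fps_3090 * (1 - gap)

fps_3090 = 100.0  # hypothetical 3090 result, purely for illustration
print(f"3080 Ti:   ~{scale_down(fps_3090, 0.02):.0f} fps")  # ~2% behind the 3090
print(f"3080 12GB: ~{scale_down(fps_3090, 0.04):.0f} fps")  # another ~2% behind the 3080 Ti
```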
 
> The reviewers mention that the game seems to be well optimized, which makes my next sentence all the more surprising.
>
> The RTX 5080 is a new high-performance card, generally only bettered by the RTX 5090. But in this test, the lower-specced but very reasonable AMD 9070 XT beats it across the board.
>
> I find that incredible. It shows that NV have really screwed up Blackwell in many ways. In this case I suspect it's the typical, as-of-late, NV drivers. Even if the game were optimized for AMD, which there is no mention of, the 5080 should still be better pretty much across the board.
>
> NV really need to get their act together. I'm very glad I didn't blow loads of cash on a very sloppy Blackwell release. We all know about the awful power distribution over the connectors melting them, and a few other things too. But it extends to the drivers as well. People over on the official NV forums are fuming about it, as they should be.
>
> NV should drop out of the gamer GPU market and focus on their big cash earner, AI. They clearly just couldn't be bothered with Blackwell, and/or the entire team of devs for it and its drivers are interns.
TechPowerUp and ComputerBase used the 576.40 drivers, which improve RTX 50 performance but only bring it in line with RTX 40 performance. If I remember correctly, in both benchmarks the 9070 cards still punched a whole class above their weight.
 