AMD takes aim at Nvidia, highlights the importance of VRAM in modern games

midian182

In brief: As Nvidia's RTX 4070 is expected to launch tomorrow with 12GB of GDDR6X, AMD has decided now's a good time to take a shot at its rival by pointing out how much memory new games require when played at higher resolutions. Team Red notes that its mid- to high-end cards have more VRAM than Nvidia's equivalents, though some of its benchmark results don't exactly line up with our own.

In a post titled 'Building an Enthusiast PC,' AMD writes that graphics card memory requirements are increasing as the games industry embraces movie-quality textures, complex shaders, and higher-resolution outputs.

The post highlights some of the most VRAM-hungry recent games and their peak memory usage when playing in 4K: Resident Evil 4 (15.2GB/17.5GB with RT), The Last of Us Part 1 (11.2GB), and Hogwarts Legacy (11.2GB/15.5GB RT on). It notes that without enough video memory to play games at higher resolutions, players are likely to encounter lower fps, texture pop-in, and even game crashes.

As such, AMD has listed which of its cards are best suited for various resolutions: the 6600 series with its 8GB of VRAM for 1080p, the 6700 series (12GB) for 1440p, and the 6800 and above (16GB+) for 4K. The latest Radeon cards, the RX 7900 XT and RX 7900 XTX, pack 20GB and 24GB of VRAM, respectively.

The post doesn't miss out on the opportunity to highlight the price difference between AMD cards and Nvidia's. One benchmark comparison is between the $849 RX 7900 XT (20GB) and the $799 RTX 4070 Ti (12GB), though finding the latter at that price isn't easy.

The benchmark charts cover 32 "select" games at 4K. AMD obviously wants to look good here, hence the extensive notes about the various system configurations used in the testing. For example, its RX 7900 XT vs. RTX 4070 Ti chart has the former card 10.1% faster on average at 4K, while our own results put the difference at just 3% in favor of the Radeon.

It's a similar story with the RX 7900 XTX vs. RTX 4080. AMD has its card 7.6% ahead on average; our testing puts the Radeon a mere 1% faster. It's still a couple of hundred dollars cheaper than Nvidia's card, however.

While memory size in graphics cards can make a difference, especially when playing in 4K, it's not the only factor. AMD usually packs more VRAM in its cards, but that doesn't stop Nvidia's equivalents from matching and, in some scenarios, outperforming its competitor's products. However, AMD's claim of offering better performance per dollar at the high end is something we agree with – only the RTX 3060 and its Ti version provide better value than the RX 6800 XT and 7900 XTX in our testing (below). It'll be interesting to see how much the RTX 4070 shakes things up.


 
AMD should talk less and release the 7800 series instead.

It is funny that AMD even mentions ray tracing, considering none of their GPUs can do it properly, and they apparently think most people play games at full 4K (where the 4090 dominates completely anyway, beating the 4080 and 7900 XTX by ~25%).

What will AMD do when the 4080 Ti and 4090 Ti hit? They have pretty much peaked the 7000 series with the 7900 XTX, and it only performs on par with the 4080 in raster (the 4080 is much faster in ray tracing). The 4080 also consumes 50-60 watts less for similar performance and has better and more features.

The 7900 XTX should have been $899 at release and the 7900 XT should have been $699. Then AMD would actually be competitive and win some people over.

AMD dropped below 11% GPU market share in the latest Steam hardware survey (March 2023). They are doing worse than ever in the GPU segment.

The problem is that AMD earns more (per wafer) on CPUs. They eat into their CPU output by making GPUs, which is why they probably don't care that much.

Pricing the 7900 XT only 100 dollars below the 7900 XTX and then producing mostly 7900 XTs was clever, but people are not stupid. The 7900 XTX was better value than the 7900 XT, which is unheard of. Flagship = worst value for money, but you get the fastest card. This just confirms that the pricing was off. AMD's intention was to sell the 7900 XT.

Slowly but surely, the 7900 XT is closing in on 4070 Ti pricing. It launched at $899 and is now close to $750. The $899 MSRP was way too high, and only $100 less than the flagship? C'mon AMD. I ordered pizza for 100 dollars last night. Hardly anyone cares about 100 dollars. 899 or 999, who cares, too little difference. If it were not for AMD holding back 7900 XTX production, NO-ONE would have bought the 7900 XT.

I am no Nvidia fanboy. I am actually considering the 7900 XTX, however not at ~1000 dollars, maybe at 800 dollars. I am not considering the 7900 XT before it drops to 650-700 dollars either.

With my 3080 I have superior features, features I'm afraid to lose. DLDSR is amazing, DLAA is awesome as well, and DLSS 2 let me max out some games I thought were impossible.

I know for sure that I will get worse features with AMD, which is why I am not willing to pay a premium. If my 3080 died today, I would probably grab a 4080 over the 7900 XTX because 200 dollars is not enough of a difference when you add in the other stuff.

Does AMD even have an answer to DLDSR and DLAA? DLDSR especially is a feature I will miss a lot if I make the switch. Games look amazing downsampled from a much higher resolution, and it needs zero game support; you just choose the virtual resolution in-game.

Bad RT performance I can deal with. I don't really care about RT (the performance hit is not worth it for me). However, the RT and Tensor cores are about much more than just RT.

Imagine Nvidia had not pushed RTX. Then they would have had 25-33% more die space for CUDA cores. No-one thought about this? It would have been game over for AMD. Nvidia started to push RTX because they knew they were years ahead and could afford to give up some raster performance.

Nvidia beat AMD last generation while using a subpar (but cheap) Samsung 8nm node (closer to TSMC 10nm) and dedicating only 66.7-75% of the chip to CUDA cores, whereas AMD used 100% of the chip for pure raster performance and made use of a superior node, TSMC 7nm.

Food for thought..
 
"4GB MEANS 4GB"
Pepperidge Farm remembers.

What soon followed was a 20% loss in dGPU market share.
Please stop, AMD.
 
I clearly remember AMD claiming that 4GB on the Fury X was enough (while the 380 series had 8GB variants). The Fury series aged like milk. The 980 Ti beat it by more and more over time, and the 980 Ti also had insane OC headroom (from ~1200 MHz stock to 1500-1600 MHz on custom cards after OC; hell, even reference cards could sometimes reach 1450-1500 MHz).

When I think back, the 980 Ti was the true king compared to the 1080 Ti. My 1080 Ti could not even OC 10% in comparison. My 980 Ti ran at 1580 MHz for years; that's 40% higher than stock/reference. That is the true overclocker's dream. Remember when Lisa Su called the Fury X an overclocker's dream because it had AIO cooling? Sadly the OC gain was like 1%, and power usage spiked doing so. It was the worst OC card in the history of GPUs, yet Lisa Su claimed it was an overclocker's dream ON STAGE (look up the YouTube video).

However, let's hope AMD comes back from this. The MCM approach looks like kind of a fail for GPUs so far. The 7900 XTX uses 50-60 watts more than the 4080 while delivering the same raster performance. MCM was meant to lower power usage, last time I checked.

I also hear the RTX 5000 series is going to be monolithic again. Will AMD go back as well? Maybe it just doesn't make sense for GPUs...

I want competition again.
 
To be clear, HUB (Hardware Unboxed) were the first to take this Nvidia VRAM issue seriously and to expose Nvidia's dark-pattern, planned-obsolescence business model. Other reviewers followed and did it too.
AMD, of course, is backing HUB and others who pointed out, for good reason, the VRAM weakness of Nvidia video cards.

I bet that Nvidia will continue to release video cards with less memory (as an intended bottleneck) to lure potential customers into buying the most expensive, higher-end models which have the needed VRAM, especially if reviewers and the community do not expose them.

In a previous article I mentioned this kind of dark-pattern business model regarding DLSS too, through a joke:
A "fisherman" was selling his fish in the market. A guy named Joe asked him the prices.
The "fisherman" said: $100 for 10 fish, $1,000 for 10 fish heads.
"Why are the fish heads so expensive?" asked Joe.
"Because fish heads contain phosphorus, which makes you smarter. If you eat 10 fish heads, you'll turn into a genius."
"Really?" asked Joe. "OK, then I'll buy 10 fish heads."
While eating the 2nd fish head, Joe screamed:
"Wait a minute, if I had bought 10 fish, I would have gotten 10 fish heads too, for only $100. You cheated me!"
The "fisherman" answers: "See, the phosphorus is working just as I told you; now you are more intelligent than before."

Nvidia PR is the "fisherman". Who is Joe? Spoiler: uninformed customer(s).
How many Joes fell into the DLSS 2/3 net?

Back to reality: watching Nvidia's business approach, I propose the Nvidia "conjecture":
The DLSS version number will increase with every "new" generation of Nvidia video cards, and the new version will "work" only on that latest generation.
Spoiler warning: DLSS 4 will come, and its newly added features will "work" only on the next-gen Nvidia 5xxx (or 6xxx) video cards.

I also propose the Nvidia VRAM-vs-performance "conjecture":
Nvidia will offer the proper VRAM and performance only on their top, most expensive video cards; the rest will not have enough VRAM, or enough performance, or both, to play the games released roughly two years later without bottlenecks (stutters).

And believe me, I will be glad if Nvidia contradicts these "conjectures", because it would mean that customers get better products instead of intentionally crippled ones.
 
To be clear, HUB (Hardware Unboxed) took up the matter first and other reviewers did it too.
AMD is backing them and others who pointed out, for good reason, the weakness of Nvidia video cards and their dark-pattern, planned-obsolescence business model.
PO (planned obsolescence)? Everybody is doing it. VRAM is not going to save a weak GPU anyway.

If you look at a recent TechPowerUp GPU review and go to the page with minimum fps, you will see that the 3080 still performs just as well as it did when it was released. The 3080 even beats the 6800 XT by 5% in 1% lows at 3840x2160 while having 6GB less VRAM. And these cards are not even good for 4K to begin with; they are mostly suited to 1440p gaming.

And you think VRAM is an issue..? Nah.

If you eventually need to lower the VRAM and GPU requirements, DLSS exists.

[Image: NVIDIA DLSS 2 vs. AMD FSR 2 image quality and performance comparison, Hardware Unboxed]
 
PO (planned obsolescence)? Everybody is doing it. VRAM is not going to save a weak GPU anyway.

If you look at a recent TechPowerUp GPU review and go to the page with minimum fps, you will see that the 3080 still performs just as well as it did when it was released. The 3080 even beats the 6800 XT by 5% in 1% lows at 3840x2160 while having 6GB less VRAM. And these cards are not even good for 4K to begin with; they are mostly suited to 1440p gaming.

And you think VRAM is an issue..? Nah.

If you eventually need to lower the VRAM and GPU requirements, DLSS exists.

[Image: NVIDIA DLSS 2 vs. AMD FSR 2 image quality and performance comparison, Hardware Unboxed]
I suggest playing the games released in 2023, like Hogwarts Legacy or the games HUB tested, and then coming back.
My opinion is that VRAM matters more and more with every passing year.
For example, I got a good offer of $130 for an RX 5600 XT and declined because it has only 6GB of VRAM and is not good enough to play the new games of 2023 properly.
 
High-end GPUs should have a minimum of 16GB this generation. So, the 4070 Ti and 4070 both have less VRAM than they should in my opinion. Nvidia got away with 8GB for anything under the 3080 last gen, but that is proving to be too little in many newer titles. Of course, you can always drop some settings and still get great looking games, so it's not like 8GB cards are completely obsolete yet or even close. But, having to dial back textures in higher resolutions is painful when you consider that it's not because the GPU itself is not capable, but that the card just lacks the VRAM.
 
Joe Nvidia customer: "You are right, I'm more clever now, can I have another 5 fish heads"
Lool, yep. Sometimes, when buyers do not have healthy competition and the market is imbalanced, some may find themselves asking: can I have 5 more fish heads?
Epic.

Or maybe they need to take baby steps.
 
I suggest playing the games released in 2023, like Hogwarts Legacy or the games HUB tested, and then coming back.
My opinion is that VRAM matters more and more with every passing year.
For example, I got a good offer of $130 for an RX 5600 XT and declined because it has only 6GB of VRAM and is not good enough to play the new games of 2023 properly.
I just booted up Hogwarts Legacy on my ASUS laptop w/2070 Super. I used the recommended settings (global medium settings) and I'm using between 5 and 6GB of VRAM. Even at Ultra settings I'm only using slightly more than 7GB. I'd think 12GB is plenty for this game. Granted, I'm not gaming at 4K, but I wouldn't expect to game at 4K with a 2070 or even a 4070Ti.
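For anyone who wants to double-check numbers like these on their own machine, here is a minimal sketch (purely an illustration, not anything from the article) of how VRAM usage could be polled from a second terminal while the game runs, using the pynvml bindings for NVIDIA's NVML library; the GPU index 0 and the one-second sampling interval are assumptions you would adjust, and MSI Afterburner or nvidia-smi will report the same figures anyway:

# Minimal VRAM-usage logger via NVIDIA's NVML bindings (pip install nvidia-ml-py).
# Run it in a second terminal while the game is running; press Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the game renders on GPU 0
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # used/free/total in bytes
        print(f"VRAM used: {mem.used / 1024**3:.1f} GB of {mem.total / 1024**3:.1f} GB")
        time.sleep(1.0)  # sample once per second
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()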
 
I just booted up Hogwarts Legacy on my ASUS laptop w/2070 Super. I used the recommended settings (global medium settings) and I'm using between 5 and 6GB of VRAM. Even at Ultra settings I'm only using slightly more than 7GB. I'd think 12GB is plenty for this game. Granted, I'm not gaming at 4K, but I wouldn't expect to game at 4K with a 2070 or even a 4070Ti.
Yes, at 2K it plays nicely, and it's great that you can enjoy your games.
I recently made the transition from a 2K to a 4K monitor, and this is one of the reasons I am more attentive to the VRAM issue.
Gaming in 4K at high and ultra settings will be a VRAM bottleneck for video cards with 8GB, and even 12GB, in the near future.
From a 2K gaming point of view, Nvidia's 8GB limitation is not so disastrous right now, but it is a concerning issue for 4K, especially since Nvidia imposed it mostly for market segmentation and planned obsolescence.
Granted, I'm not gaming at 4K, but I wouldn't expect to game at 4K with a 2070 or even a 4070Ti.
About the 4070/4070 Ti: Nvidia compared it to the 3090, which was promoted as a 4K gaming card.
I think Nvidia exposed their own artificial market segmentation when they released the 3060 with 12GB of VRAM. They could easily have given 12GB to the 3070, or 16GB or more to the 4070 and 4070 Ti, and everybody would have been pleased. An RTX 3070 with 12GB of VRAM, or a 4070 with 16GB or more, would have won the mid-range market and consumers' hearts too.
Instead, Nvidia's mid-range lineup is stuck with this planned-obsolescence VRAM limitation and artificial market segmentation.

One of the best solutions for Nvidia now would be to release new versions of their video cards with the proper VRAM for their level of performance; then this whole VRAM-gate saga would disappear and consumers would be happier.
 
Yes, at 2K it plays nicely, and it's great that you can enjoy your games.
I recently made the transition from a 2K to a 4K monitor, and this is one of the reasons I am more attentive to the VRAM issue.
Gaming in 4K at high and ultra settings will be a VRAM bottleneck for video cards with 8GB, and even 12GB, in the near future.
From a 2K gaming point of view, Nvidia's 8GB limitation is not so disastrous right now, but it is a concerning issue for 4K, especially since Nvidia imposed it mostly for market segmentation and planned obsolescence.

About the 4070/4070 Ti: Nvidia compared it to the 3090, which was promoted as a 4K gaming card.
I think Nvidia exposed their own artificial market segmentation when they released the 3060 with 12GB of VRAM. They could easily have given 12GB to the 3070, or 16GB or more to the 4070 and 4070 Ti, and everybody would have been pleased. An RTX 3070 with 12GB of VRAM, or a 4070 with 16GB or more, would have won the mid-range market and consumers' hearts too.
Instead, Nvidia's mid-range lineup is stuck with this planned-obsolescence VRAM limitation and artificial market segmentation.

One of the best solutions for Nvidia now would be to release new versions of their video cards with the proper VRAM for their level of performance; then this whole VRAM-gate saga would disappear and consumers would be happier.
Certainly, higher resolutions are going to be an issue for all GPUs, one way or the other. But I think we've seen that this can be handled. Given that some of these games, like The Last of Us Part 1, are console ports, I'd suggest that VRAM is not the sole issue. It feels like the developers just threw them out there not knowing (or not caring) that there were going to be problems. If these games can run on consoles, then they can run on PCs. The PlayStation has 16GB of shared memory, so how is it that a PC with 16GB of system RAM and 12GB of GPU VRAM cannot compete with that? My guess: poor optimization for the PC platform.

As for the 4070 Ti, it does beat the 3090 at 4K; it's just when you turn on RT that it starts to lose out. Which, I think, points out that with today's crop of games, VRAM isn't the primary performance inhibitor, especially when looking at raster-only performance. Still, it would be nice to have more for future-proofing. This may push me to a 7900 XTX or 4080 to get more VRAM.
 
Certainly, higher resolutions are going to be an issue for all GPUs, one way or the other. But I think we've seen that this can be handled. Given that some of these games, like The Last of Us Part 1, are console ports, I'd suggest that VRAM is not the sole issue. It feels like the developers just threw them out there not knowing (or not caring) that there were going to be problems. If these games can run on consoles, then they can run on PCs. The PlayStation has 16GB of shared memory, so how is it that a PC with 16GB of system RAM and 12GB of GPU VRAM cannot compete with that? My guess: poor optimization for the PC platform.

As for the 4070 Ti, it does beat the 3090 at 4K; it's just when you turn on RT that it starts to lose out. Which, I think, points out that with today's crop of games, VRAM isn't the primary performance inhibitor, especially when looking at raster-only performance. Still, it would be nice to have more for future-proofing. This may push me to a 7900 XTX or 4080 to get more VRAM.
I agree, and I can add that the hardware market, especially the video card market, is quite a strange one nowadays.
 
It is funny that AMD even mentions ray tracing, considering none of their GPUs can do it properly
"Properly"?
So Ampere/3090 level RT is now useless suddenly?
What will AMD do when the 4080 Ti and 4090 Ti hit? They have pretty much peaked the 7000 series with the 7900 XTX, and it only performs on par with the 4080 in raster (the 4080 is much faster in ray tracing). The 4080 also consumes 50-60 watts less for similar performance and has better and more features.
They are not competing with the 4090, much less a 4090 Ti (if that materializes), anyway. And considering how much more the 4080 already costs, I doubt they would have problems with a 4080 Ti.
The 7900 XTX should have been $899 at release and the 7900 XT should have been $699. Then AMD would actually be competitive and win some people over.
Yes, and the 4090 should have been 1200, the 4080 at 800, and so on. It's always the price argument with AMD: "they need to be $$$ lower to be competitive." As if that's the only value they bring.
AMD dropped below 11% GPU market share in the latest Steam hardware survey (March 2023). They are doing worse than ever in the GPU segment.
The Steam HWS is bs. Last time, Chinese users increased by 50%. That about sums up its accuracy. Radeon's market share is around 30% based on more reliable sources.
The 7900 XTX was better value than the 7900 XT, which is unheard of. Flagship = worst value for money, but you get the fastest card.
Nvidia did the same with the 4090, making it artificially better value than the rest.
I am no Nvidia fanboy. I am actually considering the 7900 XTX, however not at ~1000 dollars, maybe at 800 dollars. I am not considering the 7900 XT before it drops to 650-700 dollars either.
With my 3080 I have superior features, features I'm afraid to lose. DLDSR is amazing, DLAA is awesome as well, and DLSS 2 let me max out some games I thought were impossible.
I know for sure that I will get worse features with AMD, which is why I am not willing to pay a premium. If my 3080 died today, I would probably grab a 4080 over the 7900 XTX because 200 dollars is not enough of a difference when you add in the other stuff.
Says the guy who owns a 3080 and would pay $200+ for "features". If someone says they're not a fanboy, it's a dead giveaway that they are a fanboy.
Does AMD even have an answer to DLDSR and DLAA? DLDSR especially is a feature I will miss a lot if I make the switch. Games look amazing downsampled from a much higher resolution, and it needs zero game support; you just choose the virtual resolution in-game.
You know for sure, and yet you have not used AMD's cards and do not even know they have had VSR support for ages. Downsampling has been around for a long time. Sure, they don't have AI-assisted downsampling.
AMD's control panel is much better than Nvidia's Win95-era control panel and mandatory-login GFE.
Bad RT performance I can deal with. I don't really care about RT (the performance hit is not worth it for me). However, the RT and Tensor cores are about much more than just RT.
So again, Ampere-level RT performance is bad because it's on AMD?
Name one use case for RT cores outside of games. I can't think of any. There are some uses for Tensor cores, but AMD's FSR proves that these tasks can be done with regular compute units as well.
Imagine Nvidia had not pushed RTX. Then they would have had 25-33% more die space for CUDA cores. No-one thought about this? It would have been game over for AMD. Nvidia started to push RTX because they knew they were years ahead and could afford to give up some raster performance.
The RT core area is nowhere near 25% or more. And even if it were, I'm sure it would have gone over really well to push the TDP past 400W to accommodate those extra cores, or to lower clock speeds to keep power in check, negating the performance benefit those extra cores would bring.
Nvidia beat AMD last generation while using a subpar (but cheap) Samsung 8nm node (closer to TSMC 10nm) and dedicating only 66.7-75% of the chip to CUDA cores, whereas AMD used 100% of the chip for pure raster performance and made use of a superior node, TSMC 7nm.
They did not "beat" AMD last gen. They barely scraped by and were bailed out by miners willing to buy their stuff at any price.
 
And you think VRAM is an issue..? Nah.
It really isn't, unless you want to hang your hat on a handful of recently released garbage-spec ports, some of which you were able to get refunded regardless of play time, OR, like Steve, you have a conclusion ready and design a whole test around proving yourself right by creating scenarios virtually nobody would use. But of course AMD's real and volunteer marketing departments are going to play into it; they'd be foolish not to, it's all they've got...
 
Yes, at 2K it plays nicely, and it's great that you can enjoy your games.
I recently made the transition from a 2K to a 4K monitor, and this is one of the reasons I am more attentive to the VRAM issue.
Gaming in 4K at high and ultra settings will be a VRAM bottleneck for video cards with 8GB, and even 12GB, in the near future.
From a 2K gaming point of view, Nvidia's 8GB limitation is not so disastrous right now, but it is a concerning issue for 4K, especially since Nvidia imposed it mostly for market segmentation and planned obsolescence.

About the 4070/4070 Ti: Nvidia compared it to the 3090, which was promoted as a 4K gaming card.
I think Nvidia exposed their own artificial market segmentation when they released the 3060 with 12GB of VRAM. They could easily have given 12GB to the 3070, or 16GB or more to the 4070 and 4070 Ti, and everybody would have been pleased. An RTX 3070 with 12GB of VRAM, or a 4070 with 16GB or more, would have won the mid-range market and consumers' hearts too.
Instead, Nvidia's mid-range lineup is stuck with this planned-obsolescence VRAM limitation and artificial market segmentation.

One of the best solutions for Nvidia now would be to release new versions of their video cards with the proper VRAM for their level of performance; then this whole VRAM-gate saga would disappear and consumers would be happier.
Yeah, but who's buying an 8GB video card for 4K? That would be insane unless you play old games or want to rely on DLSS/FSR.
 
Yeah, but who's buying an 8GB video card for 4K? That would be insane unless you play old games or want to rely on DLSS/FSR.
True; unfortunately, uninformed buyers may still buy them, even though Nvidia video card prices are close to or higher than AMD's.
Nvidia is selling 8GB RTX 3xxx and 12GB RTX video cards for more than AMD's 16GB RX 6xxx video cards.
With those Nvidia video cards you cannot play games properly at 4K.
With AMD's 16GB RX 6xxx video cards you can.
 
True; unfortunately, uninformed buyers may still buy them, even though Nvidia video card prices are close to or higher than AMD's.
Nvidia is selling 8GB RTX 3xxx and 12GB RTX video cards for more than AMD's 16GB RX 6xxx video cards.
With those Nvidia video cards you cannot play games properly at 4K.
With AMD's 16GB RX 6xxx video cards you can.
Well, I would not use either for 4K gaming. Only the 4080, 4090 and 7900 XTX seem to be good enough for proper 4K gaming...

However, the 3070 Ti performs on par with the Radeon 6800 16GB at 4K in TechPowerUp's latest GPU review.


And in the minimum-fps test, the 3070 Ti even beats the 6800 by 1%.


So it's not that simple. Yes, more VRAM is better if you actually need it, but you won't be running max settings on an 8GB card at 4K anyway. You won't on the 6800 and 6900 series either. They are simply too slow.

Pretty much no last-gen card will play new and demanding games at 4K using high settings at a good framerate. Yes, some games will run fine, but some games will also run fine on a 3060 at 4K. I am talking about the most demanding titles, and going forward this will only get worse and worse.

The 3090 Ti and 6950 XT might do 4K in more games; however, they will still crumble in others. The GPU is too slow.

If you buy a last-gen card for 4K, you are simply doing it wrong. The 4090 is in a league of its own currently for 4K+ gaming: 25% faster than the 4080 and 7900 XTX on average, and that is in pure raster.
 
So it's not that simple. Yes, more VRAM is better if you actually need it, but you won't be running max settings on an 8GB card at 4K anyway. You won't on the 6800 and 6900 series either. They are simply too slow.
Hardware Unboxed showed exactly that: the 6800 XT and 6900 XT, which have 16GB of VRAM, can run games at 4K, while the Nvidia cards that are direct or close price competitors to the 6800 XT and 6900 XT and have less VRAM can NOT do so without stutters or, the other bigger issue, lower image quality.
Why you disregard this is your choice; it is up to you how you address it.
Also, your responses seem very focused on praising Nvidia's strengths. That could be OK, except that at the same time you dodge, or refuse to acknowledge, the weaknesses of those same Nvidia cards and the fact that the lack of proper VRAM is a huge problem.
For me it is OK: I agree with the strong points of Nvidia cards, and I also acknowledge their big weakness. Together with many hardware reviewers and other users, I see this Nvidia VRAM deficit as planned obsolescence, a dark-pattern business model, which I choose to expose rather than accept.
When I buy video cards my wallet chooses the best deals, which in my regional market are AMD deals. I would gladly buy an Nvidia 4xxx card too, for the right price, but unfortunately the current 4070 and 4070 Ti cards are all overpriced and do not have the proper VRAM to justify the investment.
 
Hardware Unboxed showed exactly that: the 6800 XT and 6900 XT, which have 16GB of VRAM, can run games at 4K, while the Nvidia cards that are direct or close price competitors to the 6800 XT and 6900 XT and have less VRAM can NOT do so without stutters or, the other bigger issue, lower image quality.
Why you disregard this is your choice; it is up to you how you address it.
Also, your responses seem very focused on praising Nvidia's strengths. That could be OK, except that at the same time you dodge, or refuse to acknowledge, the weaknesses of those same Nvidia cards and the fact that the lack of proper VRAM is a huge problem.
For me it is OK: I agree with the strong points of Nvidia cards, and I also acknowledge their big weakness. Together with many hardware reviewers and other users, I see this Nvidia VRAM deficit as planned obsolescence, a dark-pattern business model, which I choose to expose rather than accept.
When I buy video cards my wallet chooses the best deals, which in my regional market are AMD deals. I would gladly buy an Nvidia 4xxx card too, for the right price, but unfortunately the current 4070 and 4070 Ti cards are all overpriced and do not have the proper VRAM to justify the investment.
How come TechPowerUp shows they can, with perfectly fine 1% lows?

The 6800 XT performs miserably at 4K compared to the best 4K options, and the 6800 XT and 6900 XT were NEVER 3070 competitors to begin with :laughing:

The 3070 launched at $499.
The 6800 launched at $579.
The 6800 XT launched at $649.
The 6900 XT launched at $999.

3070 still beats 6700XT 12GB in pretty much every game. 3070 is around 12-15% faster in most games.

And you compare these cards with the 3070? The 6700 XT launched at $479 and is the true competitor to the 3070. The 6900 XT launched at twice the price. And the 6900 XT doesn't even beat the 3080 by much at 4K, and the 3080 launched at $699. The 6900 XT performs like 1 fps higher on average at 4K while costing $300 more at launch.

NONE of those cards are running 4K on high settings in demanding games anyway. The GPU is simply too slow. The 3090 Ti and 6950 XT don't even impress much at 4K these days either. A lot of VRAM will not save a dated GPU.


The 3070 beats the 6700 XT by almost 20% in the 4K test. Again, minimum fps, 1% lows. 4GB more VRAM and it still loses by a lot.

Yes more VRAM is always better >> if the GPU is actually solid and up to the task <<

The fact is that the 3080 with 10GB performs 1.4 fps lower on average across all the games tested compared to the 6900 XT 16GB, and this is 1% minimum fps, which would reveal any stutters or issues if VRAM were a problem.

The only TRUE 4K card is the 4090; the 4080 and 7900 XTX can work as well. Every other card will feel subpar, including the 4070 Ti and 7900 XT, which both beat the 6800 XT and 6900 XT with ease in 4K gaming anyway. Zero last-gen cards will play new and demanding games at 4K with good fps on high/ultra settings, unless you like console fps.

You simply don't buy mid-range cards for 4K. This is why you WILL GET 16GB or more if you actually buy a proper 4K card. If you can't afford a 4080/7900 XTX or better, then you can't afford proper 4K gaming and you are better off settling for 2560x1440 or 3440x1440.

My 3080 performs just as well at 1440p as it did back in 2020 when I bought it. Zero problems with VRAM. I am not using RT and I am not using ultra presets; I always tweak my settings, which includes max textures (always), 16x AF and all the stuff you actually want, with motion blur, DOF and other crap disabled. Those are often turned ON in "ultra presets", which sucks. Those features also increase VRAM usage while making image quality worse. It's pointless to test games on ultra settings.
 