8GB VRAM vs. 16GB VRAM: RTX 3070 vs. Radeon 6800

It's a shame Nvidia stagnated VRAM on the xx70-series cards, and most cards in general, for the last few generations. 8GB is what the 1070 had way back; they should have been bumping it 2-4GB per generation, but elected not to.
 
AMD "fine wine" hits again. Yep this is exactly what I noticed on my 3070, a messy stutter in some newer games. But coming from a 2060 6GB it's still a good uplift for me playing at 1440p. I can turn the dial down a bit and still be happy. Not a fan of Ultra High settings and not going to break the bank for it.

 
If it weren't for the fact that Ampere's video encoder is better than RDNA2's, I'd have gone for a 6700 XT or 6800 (these were non-existent for ages during the mining boom). I've decided I'm just gonna buy a PS5 and call it a day. The RTX 3060 Ti will have to do for some games, but with the PS5 I know that whatever game I like will at least run as expected.
 
AMD "fine wine" hits again. Yep this is exactly what I noticed on my 3070, a messy stutter in some newer games. But coming from a 2060 6GB it's still a good uplift for me playing at 1440p. I can turn the dial down a bit and still be happy. Not a fan of Ultra High settings and not going to break the bank for it.

Yeah, the Fury series aged very well.

Which games cause you issues? I don't see the 3070 having issues outside of flawed console ports like TLOU (using settings that actually make sense for a three-year-old mid-range GPU). I have a 3080, but several of my friends have 3070s and they don't complain, running at 1440p/144-240Hz.

It's really a non-issue for most people, because you have to run games completely maxed out, sometimes with RT enabled on max as well, to make 8GB struggle at 1440p.

Besides, the 6800 was not the 3070's competitor; the 6700 XT was. The 6800 was $579, the 3070 was $499.

The 3070 _still_ beats the 6700 XT in pretty much all games, and it's 12% faster on average at 1440p (minimum fps) -> https://www.techpowerup.com/review/asus-geforce-rtx-4070-tuf/36.html

6800 16GB is like 1.5% faster than 3070 8GB in the same test.

This is using high settings.

The Ultra preset often means garbage motion blur, depth of field and other crap that costs a lot of VRAM while delivering WORSE image quality. Always tweak your games; the Ultra preset often sucks.
 
Would love to hear some developer commentary on how this large VRAM requirement came to pass.

As these games were being designed and tested, and even today, the fraction of the gaming audience with > 8 GB VRAM is tiny. The consoles probably don't have > 8 GB available either, although it's a little murkier since they have a shared RAM space that is over 8 GB but that also must accommodate all system needs, not just VRAM.

So given that it is no surprise at all that most systems will not have this larger VRAM capacity available, why is it only post-launch that gamers and apparently even the developers (given they are only now pushing out post-launch patches to handle it) are realizing the games don't actually run right on most systems?

I wonder if something internally changed at the driver level where the same API calls now require more VRAM to work right than they did, say, two years ago.
 
While I'm in full agreement that Nvidia did not offer enough VRAM in these cards, it is worth noting that these games come with a full range of settings and that 8 GB of VRAM will be playable, even at higher resolutions, for a long time to come. You just will not get the top tier of texture and effects quality. It is also worth noting that "Ultra" settings are often only a slight visual upgrade over High and sometimes even Medium settings. A lot of the time, max settings really are just for those who have the really high-end GPUs. Many times you might not even notice the difference between Ultra and Medium settings unless you pause and look for the differences. "Oh look, the shadow under that tree is not soft enough, I need a new $1000 GPU!"

That being said, I have said before that 12 GB is just not enough for the 4070 and 4070 Ti, making Nvidia's least expensive GPU with sufficient VRAM the 4080 at $1200, though you may be able to find a deal on a 3090. AMD has plenty of offerings with 16GB of VRAM, both this generation and last, that can be found starting around $500. If you must buy a new GPU right now, you are much better off with AMD in my opinion, and I own a 3080.
 
Why large VRAM? The GPU stores all the textures, surfaces (meshes) and lighting info for the scenes you're playing. It needs to store them on the card so they can be accessed quickly as you move about in the scene. If there isn't enough space on the card to store all the textures needed, then there'll be stuttering as those textures are loaded from RAM or disk. If you play at higher resolutions, the textures that are used take up more space.

The skill is in managing these textures so that only those that are needed are loaded onto the card and those no longer needed are discarded. In a game like GTA you'll get a certain class of car appearing in certain areas (i.e. Porsches and Ferraris in expensive parts of town), and this helps the GPU as it only needs to load textures for that particular class of car. Additionally, some textures in the game files might be compressed to save disk space, and these will need decompressing before being used.

When you used to get a static "completed" screen at the end of a level, it usually meant the game was feverishly loading all the new textures needed for the next level.
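To make the eviction idea concrete, here's a minimal, hypothetical sketch in C++ of a fixed VRAM budget with least-recently-used eviction. Names like TextureCache and request() are invented for illustration and aren't any engine's real API; real engines stream asynchronously, drop mip levels and use partial residency. But the underlying trade-off is the same: once the working set no longer fits the budget, textures get evicted and re-uploaded constantly, which is what shows up as stutter.

```cpp
// Minimal, hypothetical sketch: a VRAM budget with LRU eviction.
#include <cstdint>
#include <list>
#include <stdexcept>
#include <string>
#include <unordered_map>

class TextureCache {
public:
    explicit TextureCache(uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Request a texture for the current scene; "loads" it if not resident,
    // evicting least-recently-used textures to stay within the VRAM budget.
    void request(const std::string& name, uint64_t sizeBytes) {
        auto it = resident_.find(name);
        if (it != resident_.end()) {
            // Already on the card: just mark it as recently used.
            lru_.splice(lru_.begin(), lru_, it->second.lruPos);
            return;
        }
        if (sizeBytes > budget_)
            throw std::runtime_error("texture larger than the whole budget");

        // Evict until the new texture fits. When the working set is bigger
        // than the budget, this loop runs constantly -> re-uploads -> stutter.
        while (used_ + sizeBytes > budget_) {
            const std::string victim = lru_.back();
            used_ -= resident_.at(victim).sizeBytes;
            resident_.erase(victim);   // stand-in for freeing GPU memory
            lru_.pop_back();
        }

        // Stand-in for decompressing and uploading from disk or system RAM.
        lru_.push_front(name);
        resident_[name] = Entry{sizeBytes, lru_.begin()};
        used_ += sizeBytes;
    }

    uint64_t usedBytes() const { return used_; }

private:
    struct Entry {
        uint64_t sizeBytes;
        std::list<std::string>::iterator lruPos;
    };
    uint64_t budget_;
    uint64_t used_ = 0;
    std::list<std::string> lru_;                      // front = most recent
    std::unordered_map<std::string, Entry> resident_;
};

int main() {
    // Rough sizes for illustration: a 4096x4096 RGBA8 texture with mipmaps
    // is about 85 MB uncompressed, or roughly 21 MB with BC7 compression.
    TextureCache cache(8ull * 1024 * 1024 * 1024);    // an 8 GB card
    cache.request("city_block_albedo", 21ull * 1024 * 1024);
    cache.request("sports_car_albedo", 21ull * 1024 * 1024);
    return 0;
}
```

The LRU policy here is only a stand-in; the point is that the budget itself, not the cleverness of the policy, becomes the hard limit once a scene's working set outgrows it.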
 
""After today's testing we believe this is definitive proof that 8GB of VRAM is no longer sufficient for high-end gaming because gaming studios have cut so much DEV pay they don't care to fix it''"

Fixed it for you ...
 
For most games 8GB should still be enough. A mid-range card today should have more than that, but the notion that you will wake up one day and won't be able to run games due to a lack of VRAM is laughable. A card such as the RTX 3070 is not meant for 4K and/or any serious ray tracing, which is where the trouble begins. This is now "the current thing" people get upset about.
 
Would love to hear some developer commentary on how this large VRAM requirement came to pass.

As these games were being designed and tested, and even today, the fraction of the gaming audience with > 8 GB VRAM is tiny. The consoles probably don't have > 8 GB available either, although it's a little murkier since they have a shared RAM space that is over 8 GB but that also must accommodate all system needs, not just VRAM.

So given that it is no surprise at all that most systems will not have this larger VRAM capacity available, why is it only post-launch that gamers and apparently even the developers (given they are only now pushing out post-launch patches to handle it) are realizing the games don't actually run right on most systems?

I wonder if something internally changed at the driver level where the same API calls now require more VRAM to work right than they did, say, two years ago.
The PS5 and Xbox Series X have roughly the same amount of RAM available for developers to use when making games, and it's between 12 and 14GB. The actual amount is variable, as it depends on what functionality is running in the background with the operating system. Devs can 'disable' some of these to get more, but it only releases hundreds of MB, rather than large amounts of GB. For the older consoles, there's obviously far less RAM -- in the case of the PS4, despite having 8GB, only 4.5GB was available for use by games.

In either case, though, the big, AAA titles use pretty much every available byte, storing the game's code, all of the assets in current use, caches for streaming, movie files, as well as all of the working buffers and render targets in that footprint. Because RAM is in such demand on consoles, a decent group of devs will spend a considerable amount of time streamlining and fine-tuning everything, especially if they're only making that title for one platform.

But when it comes to ports or multi-platform releases, PCs don't have anything like the restrictions that a console does -- an average gaming PC might have 16GB of system RAM and 8GB for the GPU. This, and the fact that a port (or a single version of a multi-platform release) is unlikely to have the same amount of resources (human, time, fiscal, etc.) as a single-platform release, means that one has to target a middle ground when it comes to settings, system configuration, and how it all gets used. That target may well bear little resemblance to what's actually required to run the game at full settings, max resolution, and so on.
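As a rough back-of-the-envelope illustration of that mismatch (the split between CPU-side and GPU-side data below is my assumption, not something the post above specifies), the arithmetic looks something like this:

```cpp
#include <cstdio>

int main() {
    // Hypothetical split: a console title built against a ~13.5 GB unified
    // budget, with roughly a third of it used CPU-side for code, game state
    // and streaming caches. These numbers are illustrative only.
    const double consoleBudgetGB = 13.5;
    const double cpuSideShare    = 0.35;
    const double gpuResidentGB   = consoleBudgetGB * (1.0 - cpuSideShare); // ~8.8 GB

    const double pcVramGB = 8.0;   // the mid-range cards discussed here

    std::printf("GPU-resident working set: %.1f GB vs %.1f GB of VRAM -> %s\n",
                gpuResidentGB, pcVramGB,
                gpuResidentGB > pcVramGB ? "spills over, expect stutter"
                                         : "fits");
    return 0;
}
```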
 
I'm sure readers have already noted that something is up with the Resident Evil 4 "RT off" chart in this piece. That also happens to be the only game in this article that I've played on my 3070 (plus COD). The thing is though, I rarely play games with everything on ultra.

My 3070 is able to stay above 100fps in RE4 almost all the time with RT turned off (it's only reflections and is bugged anyway), with a couple of settings turned down a notch (in all the modern RE games that max volumetric lighting setting is unnecessary), while playing in 4K DLSS performance mode with the mod (effectively rendering at 1080p). The mod has some minor visual glitches but the performance uplift more than makes up for it.

DLSS has done a lot of heavy lifting for me and optimizing settings is a given, but it was insane for Nvidia to ship xx70-class cards with 8GB of VRAM for three generations straight.

The problem for developers, though, is that if the Steam hardware survey is to be believed, the majority of their audience has 8GB of VRAM or less. Less than 20 percent have 12GB or more. Devs are gonna have to deal with that. Recent games are already listing the 8GB 2070 Super as the "recommended" GPU. It'll be interesting to see how Unreal Engine 5 works into the equation.
 
Hogwarts Legacy is a weird game. When it first released, my 3070 Ti handled it quite nicely; then one of the last patches ruined it. I can stand still inside the castle and textures may not load in for a minute. Meanwhile, my brother's 2070 Super runs the game fine, and other than GPUs we have the same system. There's something else lurking in that game causing its issues, imo.

I borrowed a friend's 6800 and the textures still didn't load in. At this point I'm thinking a lot of games just have non-existent optimization going on, and it's a gamble whether you'll have a good or bad experience.
 
Too many system files running in the background for these GPU cards to really work well; you would have to load a system without those files. Plus, the PC's CPU and GPU and the entire case or laptop should be cooled. I play a lot of games on the desktop over WiFi now, and on the laptop over WiFi as well. The laptop has a 120mm cooling pad with a 120mm fan right on the hottest point; it's like a Frankenstein laptop. I removed the keyboard to access the heat better. This laptop is just used with an external gaming RGB mouse and keyboard. Most of the games I play online run well on 8GB HD graphics. My main system has 32GB of RAM and an APU with 4 CPU cores and 6 GPU cores.
 
Excellent article!

8GB has been around on the 'low end' since the RX 480. Why do $500 products still have the same amount?

If I had bought a 3070 Ti for $600+ last year, I'd be so mad right now that four games in the first quarter of 2023 were having serious issues with my new, expensive card.

I do own a 3070 and don't feel quite as bad, but I was forced into getting this card because I had to wait eight months to get it. It was this or nothing, so that's what I got. It was overpriced, but it has done OK for a year. The 3080 or better I wanted never came in while it was a prime product.
 
"We told you back in late 2020 that the RTX 3070 was the GPU to buy if you cared about ray tracing performance and image quality, and at times even mocked the weak RT performance of RDNA 2 GPUs, such as the Radeon 6800."
Yeah, that was some pretty irresponsible reporting on your part back then. Now you have egg on your face and are being all diplomatic.

Let's look at that original comparison from two years ago, because I have a memory like an elephant and I remember being so disgusted with the article that I didn't even bother commenting. Sometimes it's not what is presented, but how it's presented. In a tech comparison article, the tone of the overarching narrative is very often the most persuasive aspect of said article. This is because PC tech is pretty far up there in the complexity category, which means that most people are somewhat to completely lost when it comes down to the brass tacks of these products. Thus, they're forced to read between the lines in a tech article: what they can understand is what the author sees as "good" and what the author sees as "bad" from the phrasing used.

I'm not going to cherry-pick what is said here, I'm going to let Steve's own words speak for themselves.

In the gaming benchmarks:

Battlefield V:
"The margin at 1440p remains the same. Here the Radeon GPU was 32% faster on average hitting 173 fps. However, as we move to the highest resolution tested, 4K, the margin is reduced to 20%, which is still a victory for the red team. It's worth noting that the Radeon 6800 costs 16% more, so the margin isn't as impressive as it appears, therefore we'll go through the rest of the graphs keeping that price discrepancy in mind."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Hitman 2:
"Moving to Hitman 2 we find that the Radeon RX 6800 is far less impressive in this title and although it still manages to beat the RTX 3070 in all tested resolutions, the margins are lower than the price premium. For example, at 1080p the Radeon GPU was just 4% faster, 5% faster at 1440p and 11% faster at 4K. As the resolution scales and we become more GPU bound, the RX 6800 performs better, but even the 11% margin at 4K isn't terribly impressive."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Borderlands 3:
"Borderlands 3 shows more impressive margins for the Radeon. Here the RX 6800 beats the 3070 by 27% at 1080p and 1440p, averaging 128 and 95 fps respectively, using the ultra quality preset. At 4K the RX 6800 leads the RTX 3070 by a 23% margin, pushing up to 54 fps on average from 44 fps, so a solid performance uplift there."
Simplified Translation: The 6800 won, but I can't insult it this time.

Fortnite:
"Next up is Fortnite performance, a game that didn't run as well on Radeon GPUs. AMD put in the work to fix performance in this title and as you can see the RX 6800 series is very competitive, beating the 3070 by a 17% margin at 1080p, 20% at 1440p and 21% at 4K. Given the price difference however those margins hardly make the GeForce GPU a poor value choice."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

Apex Legends:
"In Apex Legends we're looking at very similar performance with either GPU. The RX 6800 is just 6% faster at 1080p, 3% faster at 1440p and up to 5% faster at 4K. The margins are slim in this title, making the GeForce the more attractive option."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

PUBG:
"PUBG is another game that like Fortnite didn't do too well for AMD historically. Given the Radeon 6800 is fetching a price premium, seeing it only match the RTX 3070 isn't a great result."
Simplified Translation: It's a tie, nVidia wins!

CP2077:
"Moving on from battle royale games we have a little known game called Cyberpunk 2077, maybe you've heard of it. At 1080p we find that the Radeon RX 6800 is 8% faster and then 10% faster at 1440p, while we're looking at identical performance at 4K. Now you can enable DLSS with the RTX 3070 for greater performance, something we'll discuss more about later in the article."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

COD:MW:
"Call of Duty Modern Warfare appears to show a strong performance uplift for the RX 6800 over the RTX 3070. Around a 20% performance boost at all three resolutions, meaning that scaling between these two GPUs is very consistent. With the RX 6800 costing ~16% more but delivering about 20% more performance, it's the better value here, though not by a meaningful margin."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Witcher 3:
"The Witcher 3 is now a 5 year old game. With the visual quality settings all maxed out the Radeon is 13% faster at 1080p, 6% faster at 1440p and just 4% faster at 4K. So again it's a situation where the performance difference fail to offset the price premium."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Control:
"Control is yet another game where these AMD and Nvidia GPUs seem evenly matched, however this title supports DLSS 2.0, so you can easily boost the performance of the RTX 3070, which makes it a no contest."
Simplified Translation: It's a tie in an nVidia title, nVidia wins!

RDR2:
"We tested Red Dead Redemption 2 in game (as opposed to using the built-in benchmark) and as you can see the RX 6800 was 14% faster at 1080p, though we wonder if we're running into some kind of system limitation that's not GPU related as that margin opens up quite considerably at 1440p. Here the Radeon was 23% faster but then things close right up at 4K to just a 10% margin in favor of the Radeon GPU, though that's not uncommon to see as the Ampere architecture is often better utilized at higher resolutions."
Simplified Translation: The RX 6800 won, but I'll praise the Ampere architecture.

The Outer Worlds:
"The RX 6800 struggles in The Outer Worlds. Performance isn't bad by any means, but unlike other games where it typically beats the RTX 3070, here it's at best able to match the GeForce GPU. Historically this hasn't been a great title for AMD, so these results should not surprise anyone."
Simplified Translation: It's a tie, nVidia wins!

Warhammer Vermintide 2:
"It wasn't all smooth sailing with the Radeon RX 6800, as we did run into a few titles where performance wasn't where you'd expect it to be, Warhammer Vermintide 2 being one such example. The game plays well and with over 100 fps at all times at 1080p, you could say performance was excellent. However, relative to the RTX 3070 it was kind of terrible, dropping behind the GeForce GPU by a 24% margin. We're not sure what the issue is and we've reached out to AMD as this wasn't the only game to exhibit strange performance, but so far it seems unlikely that this will be addressed given the game's age. Still we felt it was important to include this data as it shows AMD hasn't had the time or resources to optimize for everything. By the time we hit 4K, the RX 6800 is roughly on par with the RTX 3070 though."
Simplified Translation: The RX 6800 lost in an old game, so I get to be savage!

World of Tanks:
"Another game that saw performance issues with the RX 6800 was World of Tanks. Overall performance is still good and well beyond what you'd need to take full advantage of when using a 144Hz monitor, but it was also well down on the RTX 3070 when looking at 1080p and 1440p performance."
Simplified Translation: If I can find a reason to condemn the RX 6800, I'll do it.

Kingdom Come Deliverance:
"A game that's very broken in terms of performance with the Radeon 6800 is Kingdom Come Deliverance and apparently this is a big issue for all games that use the Crytek game engine. Again, we notified AMD of this issue weeks ago, but a fix has yet to surface. The RX 6800 is able to match the 3070 at 4K, but the 1080p and 1440p performance is pretty horrible compared to the GeForce GPU."
Simplified Translation: The RX 6800 is a broken card, don't buy it.

Performance Summary:
"Depending on the game, the Radeon RX 6800 can either look like a hero, or a massive dud. So far we've looked at 15 of the 41 games tested, so let's take a look at how the RTX 3070 and RX 6800 stacked up across all of them..."
In which game was it the hero? All I've seen is compliments that you were quick to minimise, or outright condemnations. I know I haven't missed anything because I've copied and pasted your article word-for-word.

"Starting with 1440p performance, we see that overall the RX 6800 is just 11% faster on average which is slightly lower than the 14% win it enjoyed in our day one review featuring 18 games. Of course, that review didn't include titles such as Kingdom Come Deliverance, World of Tanks, Warhammer Vermintide 2 and Star Wars Jedi Fallen Order."
Oh I see... So you added a bunch of nVidia-friendly titles.

"Moving to 4K reduces the performance deficit for the RX 6800 seen in titles such as Kingdom Come Deliverance and we end up with just over half a dozen games where performance was even. Overall, the margin is much the same, changing from 11% in favor of the Radeon GPU at 1440p to 10% at 4K. While the Radeon 6800 is a tad faster, it's not necessarily better value or the GPU you should buy."
Well, with a "comparison" like that, I'm shocked that anyone bought the RX 6800. It turns out that the RX 6800 WAS the card that people should buy, and I'm forced to wonder how many didn't because YOU implied otherwise. Then, when you were called out for being biased, you had the nerve to say:
"Just so I know, who am I biased towards?"
- You're joking, right?

"TechSpot is about to celebrate its 25th anniversary. TechSpot means tech analysis and advice you can trust."
Well, I'm sure that several of the people who came to read that original article because they didn't know what to do and trusted you are now regretting it. However, I don't see you taking ANY responsibility for your article that definitely made things worse.

"But the goal here is not a big long "I told you so, r/Nvidia," rather it's an attempt to raise awareness, educate gamers about what's going on, and encourage them to demand more from Nvidia and AMD."
Demand more from AMD? The whole root of this problem is that YOU didn't demand more from nVidia, but instead felt like taking the piss out of what is now clearly the superior product! Yet here you are, not taking accountability for what YOU said and the negative effects that YOUR words have had.

Well I have a BIG, LONG "I told you so, Steve Walton" and you're reading it right now. Nothing that I have said here is unfair and nothing that I've said here is wrong. I'm holding you accountable for your actions even as you try to pretend that you were being "impartial". Everyone who reads this will know damn well that it wasn't true.
 
For most games 8GB should still be enough. A mid-range card today should have more than that, but the notion that you will wake up one day and won't be able to run games due to a lack of VRAM is laughable. A card such as the RTX 3070 is not meant for 4K and/or any serious ray tracing, which is where the trouble begins. This is now "the current thing" people get upset about.
That wasn't what the original TechSpot review said. RT and DLSS were king...
 