8GB VRAM vs. 16GB VRAM: RTX 3070 vs. Radeon 6800

New games? Pretty much, yes. Good thing there are lots of older games that are more fun than the shiny puddles with no fun behind them.

Old games and indies are great, but why do I need to buy new hardware for them? I am happy with my old hardware; it works great.
 
"We told you back in late 2020 that the RTX 3070 was the GPU to buy if you cared about ray tracing performance and image quality, and at times even mocked the weak RT performance of RDNA 2 GPUs, such as the Radeon 6800."
Yeah, that was some pretty irresponsible reporting on your part back then. Now you have egg on your face and are being all diplomatic.

Let's look at that original comparison from two years ago, because I have a memory like an elephant and I remember being so disgusted with the article that I didn't even bother commenting. Sometimes it's not what is presented, but how it's presented. In a tech comparison article, the tone of the overarching narrative is very often its most persuasive aspect. PC tech sits pretty high up the complexity scale, which means most people are somewhat to completely lost when it comes down to the brass tacks of these products. Thus, they're forced to read between the lines: what they can understand is what the author frames as "good" and what the author frames as "bad" through the phrasing used.

I'm not going to cherry-pick what is said here, I'm going to let Steve's own words speak for themselves.

In the gaming benchmarks:

Battlefield V:
"The margin at 1440p remains the same. Here the Radeon GPU was 32% faster on average hitting 173 fps. However, as we move to the highest resolution tested, 4K, the margin is reduced to 20%, which is still a victory for the red team. It's worth noting that the Radeon 6800 costs 16% more, so the margin isn't as impressive as it appears, therefore we'll go through the rest of the graphs keeping that price discrepancy in mind."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Hitman 2:
"Moving to Hitman 2 we find that the Radeon RX 6800 is far less impressive in this title and although it still manages to beat the RTX 3070 in all tested resolutions, the margins are lower than the price premium. For example, at 1080p the Radeon GPU was just 4% faster, 5% faster at 1440p and 11% faster at 4K. As the resolution scales and we become more GPU bound, the RX 6800 performs better, but even the 11% margin at 4K isn't terribly impressive."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Borderlands 3:
"Borderlands 3 shows more impressive margins for the Radeon. Here the RX 6800 beats the 3070 by 27% at 1080p and 1440p, averaging 128 and 95 fps respectively, using the ultra quality preset. At 4K the RX 6800 leads the RTX 3070 by a 23% margin, pushing up to 54 fps on average from 44 fps, so a solid performance uplift there."
Simplified Translation: The 6800 won, but I can't insult it this time.

Fortnite:
"Next up is Fortnite performance, a game that didn't run as well on Radeon GPUs. AMD put in the work to fix performance in this title and as you can see the RX 6800 series is very competitive, beating the 3070 by a 17% margin at 1080p, 20% at 1440p and 21% at 4K. Given the price difference however those margins hardly make the GeForce GPU a poor value choice."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

Apex Legends:
"In Apex Legends we're looking at very similar performance with either GPU. The RX 6800 is just 6% faster at 1080p, 3% faster at 1440p and up to 5% faster at 4K. The margins are slim in this title, making the GeForce the more attractive option."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

PUBG:
"PUBG is another game that like Fortnite didn't do too well for AMD historically. Given the Radeon 6800 is fetching a price premium, seeing it only match the RTX 3070 isn't a great result."
Simplified Translation: It's a tie, nVidia wins!

CP2077:
"Moving on from battle royale games we have a little known game called Cyberpunk 2077, maybe you've heard of it. At 1080p we find that the Radeon RX 6800 is 8% faster and then 10% faster at 1440p, while we're looking at identical performance at 4K. Now you can enable DLSS with the RTX 3070 for greater performance, something we'll discuss more about later in the article."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

COD:MW:
"Call of Duty Modern Warfare appears to show a strong performance uplift for the RX 6800 over the RTX 3070. Around a 20% performance boost at all three resolutions, meaning that scaling between these two GPUs is very consistent. With the RX 6800 costing ~16% more but delivering about 20% more performance, it's the better value here, though not by a meaningful margin."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Witcher 3:
"The Witcher 3 is now a 5 year old game. With the visual quality settings all maxed out the Radeon is 13% faster at 1080p, 6% faster at 1440p and just 4% faster at 4K. So again it's a situation where the performance difference fail to offset the price premium."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Control:
"Control is yet another game where these AMD and Nvidia GPUs seem evenly matched, however this title supports DLSS 2.0, so you can easily boost the performance of the RTX 3070, which makes it a no contest."
Simplified Translation: It's a tie in an nVidia title, nVidia wins!

RDR2:
"We tested Red Dead Redemption 2 in game (as opposed to using the built-in benchmark) and as you can see the RX 6800 was 14% faster at 1080p, though we wonder if we're running into some kind of system limitation that's not GPU related as that margin opens up quite considerably at 1440p. Here the Radeon was 23% faster but then things close right up at 4K to just a 10% margin in favor of the Radeon GPU, though that's not uncommon to see as the Ampere architecture is often better utilized at higher resolutions."
Simplified Translation: The RX 6800 won, but I'll praise the Ampere architecture.

The Outer Worlds:
"The RX 6800 struggles in The Outer Worlds. Performance isn't bad by any means, but unlike other games where it typically beats the RTX 3070, here it's at best able to match the GeForce GPU. Historically this hasn't been a great title for AMD, so these results should not surprise anyone."
Simplified Translation: It's a tie, nVidia wins!

Warhammer Vermintide 2:
"It wasn't all smooth sailing with the Radeon RX 6800, as we did run into a few titles where performance wasn't where you'd expect it to be, Warhammer Vermintide 2 being one such example. The game plays well and with over 100 fps at all times at 1080p, you could say performance was excellent. However, relative to the RTX 3070 it was kind of terrible, dropping behind the GeForce GPU by a 24% margin. We're not sure what the issue is and we've reached out to AMD as this wasn't the only game to exhibit strange performance, but so far it seems unlikely that this will be addressed given the game's age. Still we felt it was important to include this data as it shows AMD hasn't had the time or resources to optimize for everything. By the time we hit 4K, the RX 6800 is roughly on par with the RTX 3070 though."
Simplified Translation: The RX 6800 lost in an old game so I get to be savage!

World of Tanks:
"Another game that saw performance issues with the RX 6800 was World of Tanks. Overall performance is still good and well beyond what you'd need to take full advantage of when using a 144Hz monitor, but it was also well down on the RTX 3070 when looking at 1080p and 1440p performance."
Simplified Translation: If I can find a reason to condemn the RX 6800, I'll do it.

Kingdom Come Deliverance:
"A game that's very broken in terms of performance with the Radeon 6800 is Kingdom Come Deliverance and apparently this is a big issue for all games that use the Crytek game engine. Again, we notified AMD of this issue weeks ago, but a fix has yet to surface. The RX 6800 is able to match the 3070 at 4K, but the 1080p and 1440p performance is pretty horrible compared to the GeForce GPU."
Simplified Translation: The RX 6800 is a broken card, don't buy it.

Performance Summary:
"Depending on the game, the Radeon RX 6800 can either look like a hero, or a massive dud. So far we've looked at 15 of the 41 games tested, so let's take a look at how the RTX 3070 and RX 6800 stacked up across all of them..."
In which game was it the hero? All I've seen is compliments that you were quick to minimise or outright condemnations. I know I haven't missed anything because I've copied and pasted your article word-for-word.

"Starting with 1440p performance, we see that overall the RX 6800 is just 11% faster on average which is slightly lower than the 14% win it enjoyed in our day one review featuring 18 games. Of course, that review didn't include titles such as Kingdom Come Deliverance, World of Tanks, Warhammer Vermintide 2 and Star Wars Jedi Fallen Order."
Oh I see... So you added a bunch of nVidia-friendly titles.

"Moving to 4K reduces the performance deficit for the RX 6800 seen in titles such as Kingdom Come Deliverance and we end up with just over half a dozen games where performance was even. Overall, the margin is much the same, changing from 11% in favor of the Radeon GPU at 1440p to 10% at 4K. While the Radeon 6800 is a tad faster, it's not necessarily better value or the GPU you should buy."
Well, with a "comparison" like that, I'm shocked that anyone bought the RX 6800. It turns out that the RX 6800 WAS the card that people should buy and I'm forced to wonder how many didn't because YOU implied otherwise. Then, when you were called out about being biased, you had the nerve to say:
"Just so I know, who am I biased towards?"
- You're joking, right?

"TechSpot is about to celebrate its 25th anniversary. TechSpot means tech analysis and advice you can trust."
Well, I'm sure that several of the people who came to read that original article because they didn't know what to do and trusted you are now regretting it. However, I don't see you taking ANY responsibility for your article that definitely made things worse.

"But the goal here is not a big long "I told you so, r/Nvidia," rather it's an attempt to raise awareness, educate gamers about what's going on, and encourage them to demand more from Nvidia and AMD."
Demand more from AMD? The whole root of this problem is that you didn't demand more from nVidia but instead felt like taking the piss out of what is now clearly the superior product! Yet, here you are, not taking accountability for what YOU said and the negative effects that YOUR words have had.

Well I have a BIG, LONG "I told you so, Steve Walton" and you're reading it right now. Nothing that I have said here is unfair and nothing that I've said here is wrong. I'm holding you accountable for your actions even as you try to pretend that you were being "impartial". Everyone who reads this will know damn well that it wasn't true.

That was interesting. Mentioning a day-to-day variable like price in so many of the game summaries is really a no-no. You normally see price in reviews as a statement at the beginning and at the end, for all types of reviews (e.g. cars, audio), unless it's a specific point, e.g. at this price point of $5,000 for the car one can't expect more than a simple wishbone suspension; at the $1,000 point, just a wooden yoke and springs.
 
In today's test, Steve sets out with the goal to choke an 8GB card, make his point, and conclude he's correct.

The results, to nobody's surprise, show you can choke an 8GB card when you purposely design the test to do so. RT shadows at 1440p with everything maxed, yeah, super realistic. Crap ports that need more patches, nice one.

Well done Steve.
 
"We told you back in late 2020 that the RTX 3070 was the GPU to buy if you cared about ray tracing performance and image quality, and at times even mocked the weak RT performance of RDNA 2 GPUs, such as the Radeon 6800."
Yeah, that was some pretty irresponsible reporting on your part back then. Now you have egg on your face and are being all diplomatic.

Let's look at that original comparison from two years ago because I have a memory like an elephant and I remember being so disgusted with the article that I didn't even bother commenting. Sometimes it's not what is presented, but how it's presented. In a tech comparison article, the tone of the overarching narrative is very often the most persuasive aspect of said article. This is because PC tech is pretty up there in the complexity category which means that most people are somewhat to completely lost when it comes down to the brass tacks of these products. Thus, they're forced to read between the lines in a tech article. This is because what they can understand is what the author sees as "good" and what the author sees as "bad" by the phrasing used.

I'm not going to cherry-pick what is said here, I'm going to let Steve's own words speak for themselves.

In the gaming benchmarks:

Battlefield V:
"The margin at 1440p remains the same. Here the Radeon GPU was 32% faster on average hitting 173 fps. However, as we move to the highest resolution tested, 4K, the margin is reduced to 20%, which is still a victory for the red team. It's worth noting that the Radeon 6800 costs 16% more, so the margin isn't as impressive as it appears, therefore we'll go through the rest of the graphs keeping that price discrepancy in mind."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Hitman 2:
"Moving to Hitman 2 we find that the Radeon RX 6800 is far less impressive in this title and although it still manages to beat the RTX 3070 in all tested resolutions, the margins are lower than the price premium. For example, at 1080p the Radeon GPU was just 4% faster, 5% faster at 1440p and 11% faster at 4K. As the resolution scales and we become more GPU bound, the RX 6800 performs better, but even the 11% margin at 4K isn't terribly impressive."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Borderlands 3:
"Borderlands 3 shows more impressive margins for the Radeon. Here the RX 6800 beats the 3070 by 27% at 1080p and 1440p, averaging 128 and 95 fps respectively, using the ultra quality preset. At 4K the RX 6800 leads the RTX 3070 by a 23% margin, pushing up to 54 fps on average from 44 fps, so a solid performance uplift there."
Simplified Translation: The 6800 won, but I can't insult it this time.

Fortnite:
"Next up is Fortnite performance, a game that didn't run as well on Radeon GPUs. AMD put in the work to fix performance in this title and as you can see the RX 6800 series is very competitive, beating the 3070 by a 17% margin at 1080p, 20% at 1440p and 21% at 4K. Given the price difference however those margins hardly make the GeForce GPU a poor value choice."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

Apex Legends:
"In Apex Legends we're looking at very similar performance with either GPU. The RX 6800 is just 6% faster at 1080p, 3% faster at 1440p and up to 5% faster at 4K. The margins are slim in this title, making the GeForce the more attractive option."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

PUBG:
"PUBG is another game that like Fortnite didn't do too well for AMD historically. Given the Radeon 6800 is fetching a price premium, seeing it only match the RTX 3070 isn't a great result."
Simplified Translation: It's a tie, nVidia wins!

CP2077:
"Moving on from battle royale games we have a little known game called Cyberpunk 2077, maybe you've heard of it. At 1080p we find that the Radeon RX 6800 is 8% faster and then 10% faster at 1440p, while we're looking at identical performance at 4K. Now you can enable DLSS with the RTX 3070 for greater performance, something we'll discuss more about later in the article."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.

COD:MW:
"Call of Duty Modern Warfare appears to show a strong performance uplift for the RX 6800 over the RTX 3070. Around a 20% performance boost at all three resolutions, meaning that scaling between these two GPUs is very consistent. With the RX 6800 costing ~16% more but delivering about 20% more performance, it's the better value here, though not by a meaningful margin."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Witcher 3:
"The Witcher 3 is now a 5 year old game. With the visual quality settings all maxed out the Radeon is 13% faster at 1080p, 6% faster at 1440p and just 4% faster at 4K. So again it's a situation where the performance difference fail to offset the price premium."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.

Control:
"Control is yet another game where these AMD and Nvidia GPUs seem evenly matched, however this title supports DLSS 2.0, so you can easily boost the performance of the RTX 3070, which makes it a no contest."
Simplified Translation: It's a tie in an nVidia title, nVidia wins!

RDR2:
"We tested Red Dead Redemption 2 in game (as opposed to using the built-in benchmark) and as you can see the RX 6800 was 14% faster at 1080p, though we wonder if we're running into some kind of system limitation that's not GPU related as that margin opens up quite considerably at 1440p. Here the Radeon was 23% faster but then things close right up at 4K to just a 10% margin in favor of the Radeon GPU, though that's not uncommon to see as the Ampere architecture is often better utilized at higher resolutions."
Simplified Translation: The RX 6800 won, but I'll praise the Ampere architecture.

The Outer Worlds:
"The RX 6800 struggles in The Outer Worlds. Performance isn't bad by any means, but unlike other games where it typically beats the RTX 3070, here it's at best able to match the GeForce GPU. Historically this hasn't been a great title for AMD, so these results should not surprise anyone."
Simplified Translation: It's a tie, nVidia wins!

Warhammer Vermintide 2:
"It wasn't all smooth sailing with the Radeon RX 6800, as we did run into a few titles where performance wasn't where you'd expect it to be, Warhammer Vermintide 2 being one such example. The game plays well and with over 100 fps at all times at 1080p, you could say performance was excellent. However, relative to the RTX 3070 it was kind of terrible, dropping behind the GeForce GPU by a 24% margin. We're not sure what the issue is and we've reached out to AMD as this wasn't the only game to exhibit strange performance, but so far it seems unlikely that this will be addressed given the game's age. Still we felt it was important to include this data as it shows AMD hasn't had the time or resources to optimize for everything. By the time we hit 4K, the RX 6800 is roughly on par with the RTX 3070 though."
Simplified Translation: The RTX 6800 lost in an old game so I get to be savage!

World of Tanks:
"Another game that saw performance issues with the RX 6800 was World of Tanks. Overall performance is still good and well beyond what you'd need to take full advantage of when using a 144Hz monitor, but it was also well down on the RTX 3070 when looking at 1080p and 1440p performance."
Simplified Translation: If I can find a reason to condemn the RX 6800, I'll do it.

Kingdom Come Deliverance:
"A game that's very broken in terms of performance with the Radeon 6800 is Kingdom Come Deliverance and apparently this is a big issue for all games that use the Crytek game engine. Again, we notified AMD of this issue weeks ago, but a fix has yet to surface. The RX 6800 is able to match the 3070 at 4K, but the 1080p and 1440p performance is pretty horrible compared to the GeForce GPU."
Simplified Translation: The RX 6800 is a broken card, don't buy it.

Performance Summary:
"Depending on the game, the Radeon RX 6800 can either look like a hero, or a massive dud. So far we've looked at 15 of the 41 games tested, so let's take a look at how the RTX 3070 and RX 6800 stacked up across all of them..."
In which game was it the hero? All I've seen is complements that you were quick to minimise or outright condemnations. I know I haven't missed anything because I've copied and pasted your article word-for-word.

"Starting with 1440p performance, we see that overall the RX 6800 is just 11% faster on average which is slightly lower than the 14% win it enjoyed in our day one review featuring 18 games. Of course, that review didn't include titles such as Kingdom Come Deliverance, World of Tanks, Warhammer Vermintide 2 and Star Wars Jedi Fallen Order."
Oh I see... So you added a bunch of nVidia-friendly titles.

"Moving to 4K reduces the performance deficit for the RX 6800 seen in titles such as Kingdom Come Deliverance and we end up with just over half a dozen games where performance was even. Overall, the margin is much the same, changing from 11% in favor of the Radeon GPU at 1440p to 10% at 4K. While the Radeon 6800 is a tad faster, it's not necessarily better value or the GPU you should buy."
Well, with a "comparison" like that, I'm shocked that anyone bought the RX 6800. It turns out that the RX 6800 WAS the card that people should buy and I'm forced to wonder how many didn't because YOU implied otherwise. Then, when you were called out about being biased, you had the nerve to say:
"Just so I know, who am I biased towards?"
- You're joking, right?

"TechSpot is about to celebrate its 25th anniversary. TechSpot means tech analysis and advice you can trust."
Well, I'm sure that several of the people who came to read that original article because they didn't know what to do and trusted you are now regretting it. However, I don't see you taking ANY responsibility for your article that definitely made things worse.

"But the goal here is not a big long "I told you so, r/Nvidia," rather it's an attempt to raise awareness, educate gamers about what's going on, and encourage them to demand more from Nvidia and AMD."
Demand more from AMD? The whole root of this problem is that You didn't demand more from nVidia but instead felt like taking the piss out of what is no clearly the superior product! Yet, here you are, not taking accountability for what YOU said and the negatiove effects that YOUR words have had.

Well I have a BIG, LONG "I told you so, Steve Walton" and you're reading it right now. Nothing that I have said here is unfair and nothing that I've said here is wrong. I'm holding you accountable for your actions even as you try to pretend that you were being "impartial". Everyone who reads this will know damn well that it wasn't true.

All of the content you've linked to fairly pointed out the cons and pros of both products. The concerns with the RTX 3070 were all clearly highlighted. This is all the time I'm giving you on this.
 
The PS5 and Xbox Series X have roughly the same amount of RAM available for developers to use when making games: between 12 and 14GB. The actual amount is variable, as it depends on what functionality is running in the background with the operating system. Devs can 'disable' some of these to get more, but it only releases hundreds of MB rather than large amounts of GB. For the older consoles, there's obviously far less RAM -- in the case of the PS4, despite having 8GB, only 4.5GB was available for use by games.

In either case, though, the big, AAA titles use pretty much every available byte, storing the game's code, all of the assets in current use, caches for streaming, movie files, as well as all of the working buffers and render targets in that footprint. Because RAM is in such demand on consoles, a decent group of devs will spend a considerable amount of time streamlining and fine-tuning everything, especially if they're only making that title for one platform.

But when it comes to ports or multi-platform releases, PCs don't have anything like the restrictions that a console does -- an average gaming PC might have 16GB of system RAM and 8GB for the GPU. This, and the fact that a port (or single version of a multi-platform release) is unlikely to have the same amount of resources (human, time, fiscal, etc.) as a single-platform release, means that one has to target a middle ground when it comes to settings, system configuration, and how it all gets used. That target may well bear little resemblance to what's actually required to run the game at full settings, max resolution, and so on.
Makes sense, thanks for the info. I'm going to take that to mean that the texture packs and other assets can and do fit within a more mainstream VRAM budget unless pushed out of it by enabling additional settings. And if that's true, it's an important counterweight to this article, as it implies there are likely settings that will provide a good experience within the developer's artistic intent without incurring the performance issues being flagged here.

I am a little surprised that publishers will let their games crash, stutter badly, and otherwise appear to malfunction from VRAM exhaustion, as opposed to simply disabling settings beyond a certain point or providing a clear error message ("You do not have sufficient VRAM to play at these settings.")

Whatever those guardrails would cost to develop has to be less expensive than the customer support calls, urgent patch releases, refunds/chargebacks, and brand reputation losses they are incurring now by not having them. The Last of Us in particular is taking a shellacking, and if a lot of it is really just "you need to turn your settings down", what a waste of time for all involved.
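For what it's worth, that kind of guardrail doesn't have to be elaborate. Here is a minimal sketch in Python of the idea being described, purely illustrative: every per-setting cost below is a made-up placeholder, and a real engine would query the actual budget from the graphics API (e.g. DXGI or Vulkan memory queries) and use measured per-asset sizes instead.

# Hypothetical VRAM guardrail: warn before applying settings that would
# likely exceed the GPU's memory budget. All costs are invented placeholders.
TEXTURE_COST_GB = {"low": 1.5, "medium": 2.5, "high": 4.0, "ultra": 6.5}
RT_SHADOWS_COST_GB = {"off": 0.0, "on": 1.5}

def estimated_vram_gb(resolution, texture_quality, rt_shadows):
    width, height = resolution
    # Rough render-target footprint: ~10 full-resolution buffers
    # (G-buffer, depth, post-processing) at 8 bytes per pixel.
    render_targets_gb = width * height * 8 * 10 / 1024**3
    return (render_targets_gb
            + TEXTURE_COST_GB[texture_quality]
            + RT_SHADOWS_COST_GB[rt_shadows])

def check_settings(available_gb, resolution, texture_quality, rt_shadows):
    needed = estimated_vram_gb(resolution, texture_quality, rt_shadows)
    if needed > available_gb:
        return (f"Estimated VRAM use of {needed:.1f}GB exceeds the "
                f"{available_gb:.1f}GB available; expect stuttering. "
                "Lower texture quality or disable RT shadows.")
    return f"Estimated VRAM use of {needed:.1f}GB fits within {available_gb:.1f}GB."

# 1440p, ultra textures, RT shadows on, on an 8GB card:
print(check_settings(8.0, (2560, 1440), "ultra", "on"))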
 
In today's test, Steve sets out with the goal to choke an 8GB card, make his point, and conclude he's correct.

The results, to nobody's surprise, show you can choke an 8GB card when you purposely design the test to do so. RT shadows at 1440p with everything maxed, yeah, super realistic. Crap ports that need more patches, nice one.

Well done Steve.
If this is your takeaway you need to read the article again. I showed where 8GB of VRAM is an issue, showed where it isn't, and clearly explained that this isn't an issue for all games. I also showed what you have to do to get RT working in Hogwarts Legacy and The Last of Us Part 1. I'm also using quality presets like most gamers; I'm not going out of my way to exceed the VRAM buffer in these games.
That was interesting. Mentioning a day-to-day variable like price in so many of the game summaries is really a no-no. You normally see price in reviews as a statement at the beginning and at the end, for all types of reviews (e.g. cars, audio), unless it's a specific point, e.g. at this price point of $5,000 for the car one can't expect more than a simple wishbone suspension; at the $1,000 point, just a wooden yoke and springs.
Yeah, imagine price being relevant. I got it wrong: delivering 11% more performance while costing 16% more is actually good value, silly me.
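For readers who want the value argument in concrete numbers, here is the back-of-the-envelope maths being referenced, using the launch MSRPs mentioned later in this thread ($499 for the RTX 3070, $579 for the RX 6800) and the 11% average lead at 1440p from the performance summary quoted above:

# Rough value comparison from figures quoted in the thread:
# RTX 3070 MSRP $499, RX 6800 MSRP $579, RX 6800 ~11% faster at 1440p.
rtx3070_price, rx6800_price = 499, 579
relative_performance = 1.11  # RX 6800 vs RTX 3070 (1.0 = equal)

price_ratio = rx6800_price / rtx3070_price
print(f"RX 6800 price premium: {price_ratio - 1:.1%}")  # ~16.0%
print(f"RX 6800 perf per dollar vs 3070: {relative_performance / price_ratio:.2f}x")  # ~0.96x

In other words, at launch pricing the RX 6800 delivered slightly less raw performance per dollar at 1440p, which is the point being made here; the counter-argument running through this thread is that the extra 8GB of VRAM has since proven worth that premium.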
 
I’ve been harping on NVIDIA for skimping on their VRAM buffers for several generations, though I don’t think things are presently as gloomy for the 3070 and 3070 Ti as this article portends. My brother owns a 3070 and plays Elden Ring, RDR2, CP2077, MSFS, etc with no issues and good frame rates. He also has a Valve Index and has had zero issues with it.

My father (a former pilot) owns a 3070 Ti that he uses exclusively with MSFS and an HP Reverb G2 and gets great performance (solid 45 fps on VR). I had considered AMD cards, but frankly their drivers were widely reported to not work as well in VR/MSFS.

I get that this article’s point is that the RX 6800 has aged much better (and will continue to do so) due to the higher VRAM buffer. But considering the horrendous supply issues last gen, and how many people were ready to make an anatomical donation to land either card, I wouldn’t feel too terrible if I were a 3070 owner.
 
I’ve been harping on NVIDIA for skimping on their VRAM buffers for several generations, though I don’t think things are presently as gloomy for the 3070 and 3070 Ti as this article portends.
[...]
"That doesn't mean all 8GB graphics cards are now useless or obsolete, or that all graphics cards released in the last few years should have had more than 8GB of VRAM. Rather, we're seeing clear evidence that 8GB graphics cards are shifting towards the low-end, and therefore you can now consider 8GB of VRAM entry-level."

Also none of the games you mentioned were highlighted as having issues.

Again...

"To repeat ourselves, graphics cards with 8GB of VRAM are still very usable, but they are now on an entry-level capacity, especially when it comes to playing the latest and greatest AAA titles.

For multiplayer gamers, the RTX 3070 and other high-end 8GB graphics cards will continue to deliver the goods, as games like Warzone, Apex Legends, Fortnite and so on are typically played with competitive quality settings which heavily reduce VRAM consumption."
 
If this is your takeaway you need to read the article again. I showed where 8GB of VRAM is an issue, showed where it isn't, and clearly explained that this isn't an issue for all games. I also showed what you have to do to get RT working in Hogwarts Legacy and The Last of Us Part 1. I'm also using quality presets like most gamers; I'm not going out of my way to exceed the VRAM buffer in these games.
I've read it and watched it, I stand by my conclusion, but thanks for the reply!
 
All of the content you've linked to fairly pointed out the cons and pros of both products. The concerns with the RTX 3070 were all clearly highlighted. This is all the time I'm giving you on this.
Steve, you sort of proved Avro's point with your reply, and reading it all in text really brings his point across.
 
Steve, you sort of proved Avro's point with your reply, and reading it all in text really brings his point across.
What was his point? That he doesn't understand that if a product is 11% faster but costs 16% more it's not particularly good value? The reality is he's attacking one of the few content pieces that clearly highlighted the 8GB VRAM buffer as being an issue and then years later that same media outlet is the only one to investigate the issue. Talk about friendly fire.

No one in their right mind would argue that the RTX 3070 hasn't been the vastly superior choice for ray tracing over the past few years. But we are starting to see a shift now, as I accurately predicted 2 years ago.

The TechSpot article is also a summary of my opinion (edited by Julio) from the original video, and I feel the video was a bit more critical of the RTX 3070. In the video I basically said I'd buy the RX 6800 over the RTX 3070...

"I was quite impressed with the RX 6800 in my day one review and felt it would be my go to option for $600 US or less, and despite the few hiccups seen in this testing I’m mostly still leaning that way. That said there’s a lot to talk about and depending on your preferences one might be better than the other."

Guys like Avro are very disingenuous with their analysis because they are fanboys. Just look at his comment history: he argues with anyone who says anything slightly negative about an AMD product but is more than happy to **** on Nvidia/Intel.

RDNA2 had its fair share of issues when first released; there were a number of serious driver issues.

Anyway I'm very pleased with how my conclusion has aged and I stand by everything I said in the video:
 
Guys like Avro are very disingenuous with their analysis because they are fanboys. Just look at his comment history: he argues with anyone who says anything slightly negative about an AMD product but is more than happy to **** on Nvidia/Intel.

Who wouldn't **** on Nvidia/Intel when they caused so much harm to the PC market? AMD would possibly have done the same if they had been in their position, but Nvidia/Intel were the ones to do it.

I appreciate your channel more than most channels, but it could be good to reevaluate your conclusions about the 4070 and the RT/DLSS vs RAM benefit.

BTW, interesting article. Thanks for the hard work.
 
Who wouldn't **** on Nvidia/Intel when they caused so much harm to the PC market?
[...]
That's the kind of crap the fanboys get into: we're not reviewing the company, we're reviewing the product. That said, we certainly push back against anti-consumer behavior.

FROM MY REVIEW
"And while on the topic of VRAM, I’ll just quickly note that 12GB’s is now what we consider to be the bare minimum, so that is to say you shouldn’t purchase a graphics card with less than 12 GB’s, certainly not when spending over $200 US. It is our opinion that 8GB’s should be reserved only for the most entry-level gaming products, and anything less than 8 GB’s isn’t suitable for current generation gaming.

Really the RTX 4070 should have at least 16 GB’s of VRAM and the same is also true for the Ti model, more so in fact. That said I’m not saying the 12GB VRAM buffer is a deal breaker, rather I’m cautioning you that this is now the bare minimum moving forward and therefore this product might not age as well as those with 16GB’s or more VRAM."


What would you like me to re-evaluate? Again I mentioned the VRAM several times in my RTX 4070 review. I'm not sure what more you guys want from me? I've been on Nvidia's case about VRAM capacities for years now, and received a huge amount of flack for doing so, but I'm continuing to dig into it as the evidence mounts.
 
"That doesn't mean all 8GB graphics cards are now useless or obsolete, or that all graphics cards released in the last few years should have had more than 8GB of VRAM. Rather, we're seeing clear evidence that 8GB graphics cards are shifting towards the low-end, and therefore you can now consider 8GB of VRAM entry-level."

[...]
Hi Steve,

Thanks for the reply and all the great content you produce. I sincerely appreciate your methods and analysis, and have relied on your testing for many years now.

To be clear, I wasn’t insinuating your commentary in the article was misleading the masses, but I do think the impression that many readers will come away with (despite your very clear and repeated disclaimers as you fairly quoted) is that 8 GB cards like the 3070 have suddenly fallen off a cliff. Reading some of the other comments, I think it might have been interesting to include a few more titles in your charts (e.g., repeat a portion of your tests from your December 2020 article https://www.techspot.com/review/2174-geforce-rtx-3070-vs-radeon-rx-6800/ as a control group). This might have alleviated some of the critiques you received surrounding test selection and reinforced your written comments. Honestly, though, this is really picking nits and I still think this was an excellent article.

As to the point I was making: I was simply attempting to note that there are still many popular titles that are great on a card like the 3070, lest someone think their $700 card (crypto-inflated pricing!) needs to be replaced. 😉

Thanks again, Steve, and please keep harping on VRAM capacity!

P.S. It would be interesting to see how the other 3070 competitor (6700 XT w/ 12 GB) compares in these newer titles. From my experience, it was a lot easier to find a 6700 XT than a 6800 last year at this time.
 
Steve, don't let the fanboys get you down. Just keep putting actual data in and evaluating and reevaluating as we get new data. No human being can predict the future and nobody is always right; intelligent individuals use the info they have at hand and reevaluate over time. Stupid people draw conclusions and then doggedly defend them, until they randomly switch, if they switch at all.

I've put a few people on block on here for their fanboying and aggressive attacks. Tech is just a hobby for me and I have no allegiances.

Also I was actually curious if you were the same Steve as HardwareUnboxed, and I guess you are. I thought there was supposed to be an actual HUB site coming up, but I'm good with articles on TechSpot. I prefer them over the videos, although I'll occasionally check those out, as well as GamersNexus. Not a fan of MLID, but I did see you guest on his channel before I put it on ignore.

We need tech journalists that hold all these companies accountable. I have an AMD GPU and I am actually critical of them. Nvidia's prices and VRAM are dumb, but I think their product is still superior. Of course it depends on the specific product too: I would not buy an 8GB GPU today, and even a 12GB one I wouldn't pay more than $300 for. Maybe I'd pay $500 max for a 16GB 4070.
 
So, Plague Tale Requiem (with RT off) is one of the best-looking games, if not the best, much better than TLOU, Godfall, Forspoken and all these AMD-sponsored games that look mediocre to average. Yet at 4K ultra I see it using between 4.5 and 6.3GB of VRAM depending on the scene. 6.3GB is the MAXIMUM. At 4K ultra. And somehow the conclusion out of all this is that 8GB of VRAM is not enough, because terribly optimized games like TLOU eat up three times that for worse visuals. Okay...
 
So would love to see 12GB RX 6750 XT in this. Be interesting to me anyway.

6700XT was the true counter for 3070.
6750XT was the counter for 3070 Ti.

It makes no sense to compare the 3070, which launched at $499, to the 6800, which launched at $579. The 6800 was the better card to begin with and was priced in another price range for a reason.

3070 still beats 6700XT in 2023 in pretty much any game, outside of rushed and bugged console ports like TLOU maybe.

None of these cards were ever meant for 4K without DLSS/FSR.
Even 6900XT/6950XT and 3090/3090Ti feels dated for 4K gaming in 2023.
 
3070 is still a good card, entry level or not. I got one at $300 just a few days ago and I'm more than happy with it. It's already the most expensive single part I've bought since 1996.
An RX 6800 was at least $400... out of my budget.
 
3070 is still a good card, entry level or not. I got one at $300 just a few days ago and I'm more than happy with it. It's already the most expensive single part I've bought since 1996.
An RX 6800 was at least $400... out of my budget.
That's what I am trying to say: they were never in the same bracket. It's like trying to compare a 3080 with a 6750XT. The 3080 will win with ease.

3070 was a 6700XT competitor, and both cards still do well at 1440p; the 3070 wins in pretty much all games.
 
I'm going to take that to mean that the texture packs and other assets can and do fit within a more mainstream VRAM budget unless pushed out of it by enabling additional settings. And if that's true, it's an important counterweight to this article, as it implies there are likely settings that will provide a good experience within the developer's artistic intent without incurring the performance issues being flagged here.
Oh, certainly! If one takes TLOU as an example, the various graphics settings can be filtered into the following categories:

Affects the VRAM footprint used by assets:
  • Animation Quality
  • Draw Distance
  • Dynamic Objects Level of Detail
  • Character Level of Detail
  • Environments Level of Detail
  • Dynamic Objects Texture Quality
  • Characters Texture Quality
  • Environments Texture Quality
  • Visual Effects Texture Quality

Affects the amount of shading/compute processing done:
  • Texture Filtering
  • Texture Sampling Quality
  • Ambient Shadows Quality
  • Directional Shadow Resolution
  • Directional Shadow Distance
  • Image Based Lighting
  • Spotlights Shadow Resolution
  • Point Lights Shadow Resolution
  • Bounced Lighting
  • Screen Space Shadows
  • Dynamic screen space shadows
  • Contact shadow quality
  • Screen space ambient occlusion
  • Ambient occlusion denoise quality
  • Screen space direction occlusion
  • Screen space cone tracing
  • Screen space reflections

Affects the amount of VRAM used in the process:
  • Ambient Shadows Quality
  • Directional Shadow Resolution
  • Spotlights Shadow Resolution
  • Point Lights Shadow Resolution
  • Bounced Lighting
  • Screen Space Shadows
    • Dynamic screen space shadows
    • Contact shadow quality
    • Screen space ambient occlusion
    • Ambient occlusion denoise quality
    • Screen space direction occlusion
    • Screen space cone tracing
    • Screen space reflections

  • (Put it as a quote to stop it from making this post too long)

So there are a vast number of settings that one can mess about with. Arguably, though, it's far too many -- too many for the internal and out-sourced testing teams to reliably go through the various combinations of settings and PC configurations to ascertain the relevant performance impacts and visual quality changes.
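To put a rough number on "far too many": even under the simplifying (and entirely made-up) assumption of 25 independent settings with four levels each, exhaustive testing of every combination is clearly out of reach.

# Hypothetical illustration of why exhaustive settings testing isn't feasible.
# Assume ~25 independent graphics settings with 4 levels each (made-up counts).
num_settings = 25
levels_per_setting = 4

combinations = levels_per_setting ** num_settings
print(f"{combinations:,} possible combinations")  # 1,125,899,906,842,624

# Even at one automated benchmark pass per second, that's tens of millions of
# years, before multiplying by the GPU/driver/CPU configurations to cover.
years = combinations / (3600 * 24 * 365)
print(f"~{years:,.0f} years at one test per second")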

I am a little surprised that publishers will let their games crash, stutter badly, and otherwise appear to malfunction from VRAM exhaustion, as opposed to simply disabling settings beyond a certain point or providing a clear error message ("You do not have sufficient VRAM to play at these settings.")
It's simply about hitting publishing deadlines -- at the end of the day, it's a product that needs to be sold, and given the hype around a lot of these new titles, the pressure to release on time is huge. I learned this first-hand many years ago, when I started working for Futuremark (or MadOnion as it was back then): 3DMark2001 was released without being tested on any shader-capable graphics cards, because no company could provide one in time. The devs created the whole program using the Direct3D software renderer -- fortunately, it worked just fine.

Whatever those guardrails would cost to develop has to be less expensive than the customer support calls, urgent patch releases, refunds/chargebacks, and brand reputation losses they are incurring now by not having them. The Last of Us in particular is taking a shellacking, and if a lot of it is really just "you need to turn your settings down", what a waste of time for all involved.
It's not, unfortunately. The majority of large dev houses will significantly scale back the amount of resources allocated to a title, post-launch. In some cases, it'll just be a handful of staff who are responsible for creating patches, alongside the rest of their normal duties.
 