New games? Pretty much, yes. Good thing there are lots of older games that are more fun than shiny puddles and no fun.
PC gaming is trash now.
"We told you back in late 2020 that the RTX 3070 was the GPU to buy if you cared about ray tracing performance and image quality, and at times even mocked the weak RT performance of RDNA 2 GPUs, such as the Radeon 6800."
Yeah, that was some pretty irresponsible reporting on your part back then. Now you have egg on your face and are being all diplomatic.
Let's look at that original comparison from two years ago, because I have a memory like an elephant and I remember being so disgusted with the article that I didn't even bother commenting. Sometimes it's not what is presented, but how it's presented. In a tech comparison article, the tone of the overarching narrative is very often its most persuasive aspect. PC tech is complex enough that most people are somewhat to completely lost when it comes down to the brass tacks of these products, so they're forced to read between the lines: what they can understand is what the author frames as "good" and what the author frames as "bad" through the phrasing used.
I'm not going to cherry-pick what is said here, I'm going to let Steve's own words speak for themselves.
In the gaming benchmarks:
Battlefield V:
"The margin at 1440p remains the same. Here the Radeon GPU was 32% faster on average hitting 173 fps. However, as we move to the highest resolution tested, 4K, the margin is reduced to 20%, which is still a victory for the red team. It's worth noting that the Radeon 6800 costs 16% more, so the margin isn't as impressive as it appears, therefore we'll go through the rest of the graphs keeping that price discrepancy in mind."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Hitman 2:
"Moving to Hitman 2 we find that the Radeon RX 6800 is far less impressive in this title and although it still manages to beat the RTX 3070 in all tested resolutions, the margins are lower than the price premium. For example, at 1080p the Radeon GPU was just 4% faster, 5% faster at 1440p and 11% faster at 4K. As the resolution scales and we become more GPU bound, the RX 6800 performs better, but even the 11% margin at 4K isn't terribly impressive."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Borderlands 3:
"Borderlands 3 shows more impressive margins for the Radeon. Here the RX 6800 beats the 3070 by 27% at 1080p and 1440p, averaging 128 and 95 fps respectively, using the ultra quality preset. At 4K the RX 6800 leads the RTX 3070 by a 23% margin, pushing up to 54 fps on average from 44 fps, so a solid performance uplift there."
Simplified Translation: The 6800 won, but I can't insult it this time.
Fortnite:
"Next up is Fortnite performance, a game that didn't run as well on Radeon GPUs. AMD put in the work to fix performance in this title and as you can see the RX 6800 series is very competitive, beating the 3070 by a 17% margin at 1080p, 20% at 1440p and 21% at 4K. Given the price difference however those margins hardly make the GeForce GPU a poor value choice."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.
Apex Legends:
"In Apex Legends we're looking at very similar performance with either GPU. The RX 6800 is just 6% faster at 1080p, 3% faster at 1440p and up to 5% faster at 4K. The margins are slim in this title, making the GeForce the more attractive option."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.
PUBG:
"PUBG is another game that like Fortnite didn't do too well for AMD historically. Given the Radeon 6800 is fetching a price premium, seeing it only match the RTX 3070 isn't a great result."
Simplified Translation: It's a tie, nVidia wins!
CP2077:
"Moving on from battle royale games we have a little known game called Cyberpunk 2077, maybe you've heard of it. At 1080p we find that the Radeon RX 6800 is 8% faster and then 10% faster at 1440p, while we're looking at identical performance at 4K. Now you can enable DLSS with the RTX 3070 for greater performance, something we'll discuss more about later in the article."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.
COD:MW:
"Call of Duty Modern Warfare appears to show a strong performance uplift for the RX 6800 over the RTX 3070. Around a 20% performance boost at all three resolutions, meaning that scaling between these two GPUs is very consistent. With the RX 6800 costing ~16% more but delivering about 20% more performance, it's the better value here, though not by a meaningful margin."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Witcher 3:
"The Witcher 3 is now a 5 year old game. With the visual quality settings all maxed out the Radeon is 13% faster at 1080p, 6% faster at 1440p and just 4% faster at 4K. So again it's a situation where the performance difference fail to offset the price premium."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Control:
"Control is yet another game where these AMD and Nvidia GPUs seem evenly matched, however this title supports DLSS 2.0, so you can easily boost the performance of the RTX 3070, which makes it a no contest."
Simplified Translation: It's a tie in an nVidia title, nVidia wins!
RDR2:
"We tested Red Dead Redemption 2 in game (as opposed to using the built-in benchmark) and as you can see the RX 6800 was 14% faster at 1080p, though we wonder if we're running into some kind of system limitation that's not GPU related as that margin opens up quite considerably at 1440p. Here the Radeon was 23% faster but then things close right up at 4K to just a 10% margin in favor of the Radeon GPU, though that's not uncommon to see as the Ampere architecture is often better utilized at higher resolutions."
Simplified Translation: The RX 6800 won, but I'll praise the Ampere architecture.
The Outer Worlds:
"The RX 6800 struggles in The Outer Worlds. Performance isn't bad by any means, but unlike other games where it typically beats the RTX 3070, here it's at best able to match the GeForce GPU. Historically this hasn't been a great title for AMD, so these results should not surprise anyone."
Simplified Translation: It's a tie, nVidia wins!
Warhammer Vermintide 2:
"It wasn't all smooth sailing with the Radeon RX 6800, as we did run into a few titles where performance wasn't where you'd expect it to be, Warhammer Vermintide 2 being one such example. The game plays well and with over 100 fps at all times at 1080p, you could say performance was excellent. However, relative to the RTX 3070 it was kind of terrible, dropping behind the GeForce GPU by a 24% margin. We're not sure what the issue is and we've reached out to AMD as this wasn't the only game to exhibit strange performance, but so far it seems unlikely that this will be addressed given the game's age. Still we felt it was important to include this data as it shows AMD hasn't had the time or resources to optimize for everything. By the time we hit 4K, the RX 6800 is roughly on par with the RTX 3070 though."
Simplified Translation: The RX 6800 lost in an old game so I get to be savage!
World of Tanks:
"Another game that saw performance issues with the RX 6800 was World of Tanks. Overall performance is still good and well beyond what you'd need to take full advantage of when using a 144Hz monitor, but it was also well down on the RTX 3070 when looking at 1080p and 1440p performance."
Simplified Translation: If I can find a reason to condemn the RX 6800, I'll do it.
Kingdom Come Deliverance:
"A game that's very broken in terms of performance with the Radeon 6800 is Kingdom Come Deliverance and apparently this is a big issue for all games that use the Crytek game engine. Again, we notified AMD of this issue weeks ago, but a fix has yet to surface. The RX 6800 is able to match the 3070 at 4K, but the 1080p and 1440p performance is pretty horrible compared to the GeForce GPU."
Simplified Translation: The RX 6800 is a broken card, don't buy it.
Performance Summary:
"Depending on the game, the Radeon RX 6800 can either look like a hero, or a massive dud. So far we've looked at 15 of the 41 games tested, so let's take a look at how the RTX 3070 and RX 6800 stacked up across all of them..."
In which game was it the hero? All I've seen is compliments that you were quick to minimise, or outright condemnations. I know I haven't missed anything because I've copied and pasted your article word-for-word.
"Starting with 1440p performance, we see that overall the RX 6800 is just 11% faster on average which is slightly lower than the 14% win it enjoyed in our day one review featuring 18 games. Of course, that review didn't include titles such as Kingdom Come Deliverance, World of Tanks, Warhammer Vermintide 2 and Star Wars Jedi Fallen Order."
Oh I see... So you added a bunch of nVidia-friendly titles.
"Moving to 4K reduces the performance deficit for the RX 6800 seen in titles such as Kingdom Come Deliverance and we end up with just over half a dozen games where performance was even. Overall, the margin is much the same, changing from 11% in favor of the Radeon GPU at 1440p to 10% at 4K. While the Radeon 6800 is a tad faster, it's not necessarily better value or the GPU you should buy."
Well, with a "comparison" like that, I'm shocked that anyone bought the RX 6800. It turns out that the RX 6800 WAS the card that people should buy and I'm forced to wonder how many didn't because YOU implied otherwise. Then, when you were called out about being biased, you had the nerve to say:
"Just so I know, who am I biased towards?"
- You're joking, right?
"TechSpot is about to celebrate its 25th anniversary. TechSpot means tech analysis and advice you can trust."
Well, I'm sure that several of the people who came to read that original article because they didn't know what to do and trusted you are now regretting it. However, I don't see you taking ANY responsibility for your article that definitely made things worse.
"But the goal here is not a big long "I told you so, r/Nvidia," rather it's an attempt to raise awareness, educate gamers about what's going on, and encourage them to demand more from Nvidia and AMD."
Demand more from AMD? The whole root of this problem is that you didn't demand more from nVidia but instead felt like taking the piss out of what is now clearly the superior product! Yet, here you are, not taking accountability for what YOU said and the negative effects that YOUR words have had.
Well I have a BIG, LONG "I told you so, Steve Walton" and you're reading it right now. Nothing that I have said here is unfair and nothing that I've said here is wrong. I'm holding you accountable for your actions even as you try to pretend that you were being "impartial". Everyone who reads this will know damn well that it wasn't true.
lol, it speaks volumes that both AMD superfans and Nvidia superfans think Steve is biased toward the respective other team. Your quotes and added commentary gave me a good laugh though - next level, truly.
"Just so I know, who am I biased towards?"
- You're joking, right?
AMD "fine wine" hits again.
"We told you back in late 2020 that the RTX 3070 was the GPU to buy if you cared about ray tracing performance and image quality, and at times even mocked the weak RT performance of RDNA 2 GPUs, such as the Radeon 6800."
Yeah, that was some pretty irresponsible reporting on your part back then. Now you have egg on your face and are being all diplomatic.
Let's look at that original comparison from two years ago because I have a memory like an elephant and I remember being so disgusted with the article that I didn't even bother commenting. Sometimes it's not what is presented, but how it's presented. In a tech comparison article, the tone of the overarching narrative is very often the most persuasive aspect of said article. This is because PC tech is pretty up there in the complexity category which means that most people are somewhat to completely lost when it comes down to the brass tacks of these products. Thus, they're forced to read between the lines in a tech article. This is because what they can understand is what the author sees as "good" and what the author sees as "bad" by the phrasing used.
I'm not going to cherry-pick what is said here, I'm going to let Steve's own words speak for themselves.
In the gaming benchmarks:
Battlefield V:
"The margin at 1440p remains the same. Here the Radeon GPU was 32% faster on average hitting 173 fps. However, as we move to the highest resolution tested, 4K, the margin is reduced to 20%, which is still a victory for the red team. It's worth noting that the Radeon 6800 costs 16% more, so the margin isn't as impressive as it appears, therefore we'll go through the rest of the graphs keeping that price discrepancy in mind."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Hitman 2:
"Moving to Hitman 2 we find that the Radeon RX 6800 is far less impressive in this title and although it still manages to beat the RTX 3070 in all tested resolutions, the margins are lower than the price premium. For example, at 1080p the Radeon GPU was just 4% faster, 5% faster at 1440p and 11% faster at 4K. As the resolution scales and we become more GPU bound, the RX 6800 performs better, but even the 11% margin at 4K isn't terribly impressive."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Borderlands 3:
"Borderlands 3 shows more impressive margins for the Radeon. Here the RX 6800 beats the 3070 by 27% at 1080p and 1440p, averaging 128 and 95 fps respectively, using the ultra quality preset. At 4K the RX 6800 leads the RTX 3070 by a 23% margin, pushing up to 54 fps on average from 44 fps, so a solid performance uplift there."
Simplified Translation: The 6800 won, but I can't insult it this time.
Fortnite:
"Next up is Fortnite performance, a game that didn't run as well on Radeon GPUs. AMD put in the work to fix performance in this title and as you can see the RX 6800 series is very competitive, beating the 3070 by a 17% margin at 1080p, 20% at 1440p and 21% at 4K. Given the price difference however those margins hardly make the GeForce GPU a poor value choice."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.
Apex Legends:
"In Apex Legends we're looking at very similar performance with either GPU. The RX 6800 is just 6% faster at 1080p, 3% faster at 1440p and up to 5% faster at 4K. The margins are slim in this title, making the GeForce the more attractive option."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.
PUBG:
"PUBG is another game that like Fortnite didn't do too well for AMD historically. Given the Radeon 6800 is fetching a price premium, seeing it only match the RTX 3070 isn't a great result."
Simplified Translation: It's a tie, nVidia wins!
CP2077:
"Moving on from battle royale games we have a little known game called Cyberpunk 2077, maybe you've heard of it. At 1080p we find that the Radeon RX 6800 is 8% faster and then 10% faster at 1440p, while we're looking at identical performance at 4K. Now you can enable DLSS with the RTX 3070 for greater performance, something we'll discuss more about later in the article."
Simplified Translation: The RX 6800 won, but I'll praise the 3070.
COD:MW:
"Call of Duty Modern Warfare appears to show a strong performance uplift for the RX 6800 over the RTX 3070. Around a 20% performance boost at all three resolutions, meaning that scaling between these two GPUs is very consistent. With the RX 6800 costing ~16% more but delivering about 20% more performance, it's the better value here, though not by a meaningful margin."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Witcher 3:
"The Witcher 3 is now a 5 year old game. With the visual quality settings all maxed out the Radeon is 13% faster at 1080p, 6% faster at 1440p and just 4% faster at 4K. So again it's a situation where the performance difference fail to offset the price premium."
Simplified Translation: The RX 6800 won, but I'll insult it anyway.
Control:
"Control is yet another game where these AMD and Nvidia GPUs seem evenly matched, however this title supports DLSS 2.0, so you can easily boost the performance of the RTX 3070, which makes it a no contest."
Simplified Translation: It's a tie in an nVidia title, nVidia wins!
RDR2:
"We tested Red Dead Redemption 2 in game (as opposed to using the built-in benchmark) and as you can see the RX 6800 was 14% faster at 1080p, though we wonder if we're running into some kind of system limitation that's not GPU related as that margin opens up quite considerably at 1440p. Here the Radeon was 23% faster but then things close right up at 4K to just a 10% margin in favor of the Radeon GPU, though that's not uncommon to see as the Ampere architecture is often better utilized at higher resolutions."
Simplified Translation: The RX 6800 won, but I'll praise the Ampere architecture.
The Outer Worlds:
"The RX 6800 struggles in The Outer Worlds. Performance isn't bad by any means, but unlike other games where it typically beats the RTX 3070, here it's at best able to match the GeForce GPU. Historically this hasn't been a great title for AMD, so these results should not surprise anyone."
Simplified Translation: It's a tie, nVidia wins!
Warhammer Vermintide 2:
"It wasn't all smooth sailing with the Radeon RX 6800, as we did run into a few titles where performance wasn't where you'd expect it to be, Warhammer Vermintide 2 being one such example. The game plays well and with over 100 fps at all times at 1080p, you could say performance was excellent. However, relative to the RTX 3070 it was kind of terrible, dropping behind the GeForce GPU by a 24% margin. We're not sure what the issue is and we've reached out to AMD as this wasn't the only game to exhibit strange performance, but so far it seems unlikely that this will be addressed given the game's age. Still we felt it was important to include this data as it shows AMD hasn't had the time or resources to optimize for everything. By the time we hit 4K, the RX 6800 is roughly on par with the RTX 3070 though."
Simplified Translation: The RTX 6800 lost in an old game so I get to be savage!
World of Tanks:
"Another game that saw performance issues with the RX 6800 was World of Tanks. Overall performance is still good and well beyond what you'd need to take full advantage of when using a 144Hz monitor, but it was also well down on the RTX 3070 when looking at 1080p and 1440p performance."
Simplified Translation: If I can find a reason to condemn the RX 6800, I'll do it.
Kingdom Come Deliverance:
"A game that's very broken in terms of performance with the Radeon 6800 is Kingdom Come Deliverance and apparently this is a big issue for all games that use the Crytek game engine. Again, we notified AMD of this issue weeks ago, but a fix has yet to surface. The RX 6800 is able to match the 3070 at 4K, but the 1080p and 1440p performance is pretty horrible compared to the GeForce GPU."
Simplified Translation: The RX 6800 is a broken card, don't buy it.
Performance Summary:
"Depending on the game, the Radeon RX 6800 can either look like a hero, or a massive dud. So far we've looked at 15 of the 41 games tested, so let's take a look at how the RTX 3070 and RX 6800 stacked up across all of them..."
In which game was it the hero? All I've seen is complements that you were quick to minimise or outright condemnations. I know I haven't missed anything because I've copied and pasted your article word-for-word.
"Starting with 1440p performance, we see that overall the RX 6800 is just 11% faster on average which is slightly lower than the 14% win it enjoyed in our day one review featuring 18 games. Of course, that review didn't include titles such as Kingdom Come Deliverance, World of Tanks, Warhammer Vermintide 2 and Star Wars Jedi Fallen Order."
Oh I see... So you added a bunch of nVidia-friendly titles.
"Moving to 4K reduces the performance deficit for the RX 6800 seen in titles such as Kingdom Come Deliverance and we end up with just over half a dozen games where performance was even. Overall, the margin is much the same, changing from 11% in favor of the Radeon GPU at 1440p to 10% at 4K. While the Radeon 6800 is a tad faster, it's not necessarily better value or the GPU you should buy."
Well, with a "comparison" like that, I'm shocked that anyone bought the RX 6800. It turns out that the RX 6800 WAS the card that people should buy and I'm forced to wonder how many didn't because YOU implied otherwise. Then, when you were called out about being biased, you had the nerve to say:
"Just so I know, who am I biased towards?"
- You're joking, right?
"TechSpot is about to celebrate its 25th anniversary. TechSpot means tech analysis and advice you can trust."
Well, I'm sure that several of the people who came to read that original article because they didn't know what to do and trusted you are now regretting it. However, I don't see you taking ANY responsibility for your article that definitely made things worse.
"But the goal here is not a big long "I told you so, r/Nvidia," rather it's an attempt to raise awareness, educate gamers about what's going on, and encourage them to demand more from Nvidia and AMD."
Demand more from AMD? The whole root of this problem is that You didn't demand more from nVidia but instead felt like taking the piss out of what is no clearly the superior product! Yet, here you are, not taking accountability for what YOU said and the negatiove effects that YOUR words have had.
Well I have a BIG, LONG "I told you so, Steve Walton" and you're reading it right now. Nothing that I have said here is unfair and nothing that I've said here is wrong. I'm holding you accountable for your actions even as you try to pretend that you were being "impartial". Everyone who reads this will know damn well that it wasn't true.
Makes sense, thanks for the info. I'm going to take that to mean that the texture packs and other assets can and do make sense within a more mainstream VRAM budget if not pushed out of it by enabling additional settings. And if that's true, it is an important counter-weight to this article, by implying that there are likely settings that will provide both a good experience within the developer's artistic intent, and not incur the performance issues being flagged here.
The PS5 and Xbox Series X have roughly the same amount of RAM available for developers to use when making games and it's between 12 and 14GB. The actual amount is variable, as it depends on what functionality is running in the background with the operating system. Devs can 'disable' some of these to get more, but it only releases hundreds of MB, rather than large amounts of GB. For the older consoles, there's obviously far less RAM -- in the case of the PS4, despite having 8GB, only 4.5GB was available for use by games.
In either case, though, the big, AAA titles use pretty much every available byte, storing the game's code, all of the assets in current use, caches for streaming, movie files, as well as all of the working buffers and render targets in that footprint. Because RAM is in such demand on consoles, a decent group of devs will spend a considerable amount of time streamlining and fine-tuning everything, especially if they're only making that title for one platform.
But when it comes to ports or multi-platform releases, PCs don't have anything like the restrictions that a console does -- an average gaming PC might have 16GB of system RAM and 8GB for the GPU. This and the fact that a port (or single version of a multi-platform release) is unlikely to have the same amount of resources (human, time, fiscal, etc) as a single-platform release means that one has to target a middle ground when it comes to settings, system configuration, and how it all gets used. That target may well have little semblance to what's actually required to run the game at full settings, max resolution, and so on.
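To put rough numbers on that middle-ground problem, here's a minimal, purely illustrative sketch (Python). The 12-14GB usable console figure and the 8GB VRAM / 16GB RAM "average PC" example come from the comment above; the GPU-resident fractions are assumptions for illustration only, not measured data.
```python
# Purely illustrative: how a unified console memory budget maps onto a split PC budget.
# The 12-14 GB usable console figure and the 8 GB VRAM / 16 GB RAM "average PC" example
# come from the comment above; the GPU-resident fractions below are assumptions, not data.

CONSOLE_BUDGET_GB = 13.0   # middle of the 12-14 GB usable range quoted above
PC_VRAM_GB = 8.0
PC_SYSTEM_RAM_GB = 16.0

def gpu_resident_gb(fraction: float) -> float:
    """GB of the console budget that would need to live in VRAM on PC, for a given split."""
    return CONSOLE_BUDGET_GB * fraction

for frac in (0.5, 0.6, 0.7, 0.8):
    need = gpu_resident_gb(frac)
    verdict = "fits" if need <= PC_VRAM_GB else "exceeds"
    print(f"{frac:.0%} GPU-resident -> {need:.1f} GB, which {verdict} an 8 GB card "
          f"(total PC memory: {PC_VRAM_GB + PC_SYSTEM_RAM_GB:.0f} GB)")
```
The only point is that a PC having more total memory (8 + 16GB) than a console doesn't help once the GPU-resident slice alone outgrows the card's VRAM, which is why assets have to be retuned rather than simply carried over.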
If this is your take away you need to read the article again. I showed where 8GB of VRAM is an issue, showed where it isn't, and clearly explained that this isn't an issue for all games. I also showed what you have to do to get RT working in Hogwarts Legacy and The Last of Us Part 1. I'm also using quality presets like most gamers; I'm not going out of my way to exceed the VRAM buffer in these games.
In today's test, Steve sets out with the goal to choke an 8GB card, make his point, and conclude he's correct.
The results, to nobody's surprise, show you can choke an 8GB card when you purposely design the test to do so. RT shadows at 1440p with everything maxed, yeah super realistic. Crap ports that need more patches, nice one.
Well done Steve.
Yeah imagine price being relevant. I got it wrong, delivering 11% more performance while costing 16% more is actually good value, silly me.
That was interesting - mentioning a day-to-day variable like price in many game summaries is really a no-no. You normally see price in reviews as a statement at the beginning and at the end, for all types of reviews - e.g. cars, audio - unless there's a specific point to make, e.g. at the $5,000 price point one can't expect more than a simple wishbone suspension for the car, and at the $1,000 point just a wooden yoke and springs.
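For reference, the value math both sides keep invoking works out like this. A minimal sketch: the ~11% performance margin is the figure quoted from the article summary above, while the $499 / $579 launch MSRPs are assumed here only to show where the ~16% premium comes from; they aren't stated in the thread.
```python
# Rough perf-per-dollar math using the figures argued over in this thread.
# The ~11% performance margin is from the article summary quoted above; the
# $499 / $579 launch MSRPs are assumed here to show where the ~16% premium comes from.
cards = {
    "RTX 3070": {"price_usd": 499, "relative_perf": 1.00},
    "RX 6800":  {"price_usd": 579, "relative_perf": 1.11},
}

def perf_per_dollar(card: dict) -> float:
    return card["relative_perf"] / card["price_usd"]

premium = cards["RX 6800"]["price_usd"] / cards["RTX 3070"]["price_usd"] - 1
value_ratio = perf_per_dollar(cards["RX 6800"]) / perf_per_dollar(cards["RTX 3070"])
print(f"RX 6800 price premium: {premium:.0%}")                     # ~16%
print(f"RX 6800 perf-per-dollar vs RTX 3070: {value_ratio:.2f}x")  # ~0.96x at these prices
```
On those numbers the two cards were within roughly 4% of each other on performance per dollar at launch, which is the margin the whole value argument above turns on; the VRAM disagreement is about what happens after launch.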
"That doesn't mean all 8GB graphics cards are now useless or obsolete, or that all graphics cards released in the last few years should have had more than 8GB of VRAM. Rather, we're seeing clear evidence that 8GB graphics cards are shifting towards the low-end, and therefore you can now consider 8GB of VRAM entry-level."I’ve been harping on NVIDIA for skimping on their VRAM buffers for several generations, though I don’t think things are presently as gloomy for the 3070 and 3070 Ti as this article portends. My brother owns a 3070 and plays Elden Ring, RDR2, CP2077, MSFS, etc with no issues and good frame rates. He also has a Valve Index and has had zero issues with it.
My father (a former pilot) owns a 3070 Ti that he uses exclusively with MSFS and an HP Reverb G2 and gets great performance (solid 45 fps on VR). I had considered AMD cards, but frankly their drivers were widely reported to not work as well in VR/MSFS.
I get that this article’s point is that the RX 6800 has aged much better (and will continue to do so) due to the higher VRAM buffer. But considering the horrendous supply issues last gen, and how many people were ready to make an anatomical donation to land either card, I wouldn’t feel too terrible if I were a 3070 owner.
I've read it and watched it, I stand by my conclusion, but thanks for the reply!
If this is your take away you need to read the article again. I showed where 8GB of VRAM is an issue, showed where it isn't, and clearly explained that this isn't an issue for all games. I also showed what you have to do to get RT working in Hogwarts Legacy and The Last of Us Part 1. I'm also using quality presets like most gamers; I'm not going out of my way to exceed the VRAM buffer in these games.
Steve, you sort of proved Avro's point by your reply, and reading it all in text really brings his point across.
All of the content you've linked to fairly pointed out the cons and pros of both products. The concerns with the RTX 3070 were all clearly highlighted. This is all the time I'm giving you on this.
What was his point? That he doesn't understand that if a product is 11% faster but costs 16% more it's not particularly good value? The reality is he's attacking one of the few content pieces that clearly highlighted the 8GB VRAM buffer as being an issue, and then years later that same media outlet is the only one to investigate the issue. Talk about friendly fire.
Steve, you sort of proved Avro's point by your reply, and reading it all in text really brings his point across.
Guys like Avro are very disingenuous with their analysis because they are fanboys. Just look at his comment history: he argues with anyone who says anything slightly negative about an AMD product but is more than happy to **** on Nvidia/Intel.
That's the kind of crap the fanboys get into, we're not reviewing the company, we're reviewing the product. That said, we certainly push back against anti-consumer behavior.
Who wouldn't **** on Nvidia/Intel when they caused so much harm to the PC market. AMD would possibly have done the same if they had had their position, but Nvidia/Intel were the ones to do it.
I appreciate your channel more than most channels, but it could be good to reevaluate your conclusions about the 4070 and the RT DLSS vs RAM benefit.
BTW, interesting article. Thanks for the hard work.
Hi Steve,
"That doesn't mean all 8GB graphics cards are now useless or obsolete, or that all graphics cards released in the last few years should have had more than 8GB of VRAM. Rather, we're seeing clear evidence that 8GB graphics cards are shifting towards the low-end, and therefore you can now consider 8GB of VRAM entry-level."
Also none of the games you mentioned were highlighted as having issues.
Again...
"To repeat ourselves, graphics cards with 8GB of VRAM are still very usable, but they are now on an entry-level capacity, especially when it comes to playing the latest and greatest AAA titles.
For multiplayer gamers, the RTX 3070 and other high-end 8GB graphics cards will continue to deliver the goods, as games like Warzone, Apex Legends, Fortnite and so on are typically played with competitive quality settings which heavily reduce VRAM consumption."
So would love to see 12GB RX 6750 XT in this. Be interesting to me anyway.
That's what I am trying to say, they were never in the same bracket. It's like trying to compare a 3080 with a 6750 XT. The 3080 will win with ease.
3070 is still a good card, entry-level or not. I got one at $300 just a few days ago and more than happy with it. It's already the most expensive single part I got since 1996.
An RX 6800 was at least $400... out of my budget.
I'm going to take that to mean that the texture packs and other assets can and do make sense within a more mainstream VRAM budget if not pushed out of it by enabling additional settings. And if that's true, it is an important counter-weight to this article, by implying that there are likely settings that will provide both a good experience within the developer's artistic intent, and not incur the performance issues being flagged here.
Oh, certainly! If one takes TLOU as an example, the various graphics settings can be filtered into the following categories (a rough code sketch of this grouping follows the lists):
Affects the VRAM footprint used by assets:
- Animation Quality
- Draw Distance
- Dynamic Objects Level of Detail
- Character Level of Detail
- Environments Level of Detail
- Dynamic Objects Texture Quality
- Characters Texture Quality
- Environments Texture Quality
- Visual Effects Texture Quality
Affects the amount of shading/compute processing done:
- Texture Filtering
- Texture Sampling Quality
- Ambient Shadows Quality
- Directional Shadow Resolution
- Directional Shadow Distance
- Image Based Lighting
- Spotlights Shadow Resolution
- Point Lights Shadow Resolution
- Bounced Lighting
- Screen Space Shadows
- Dynamic screen space shadows
- Contact shadow quality
- Screen space ambient occlusion
- Ambient occlusion denoise quality
- Screen space direction occlusion
- Screen space cone tracing
- Screen space reflections
Affects the amount of VRAM used in the process:
- Ambient Shadows Quality
- Directional Shadow Resolution
- Spotlights Shadow Resolution
- Point Lights Shadow Resolution
- Bounced Lighting
- Screen Space Shadows
- Dynamic screen space shadows
- Contact shadow quality
- Screen space ambient occlusion
- Ambient occlusion denoise quality
- Screen space direction occlusion
- Screen space cone tracing
- Screen space reflections
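Purely as an illustration (not anything from the game's actual options code), here's one way that grouping could be encoded, abridged to a few settings from each list above, e.g. to decide which settings to lower first on an 8GB card. The bucket names and helper function are hypothetical.
```python
# Hypothetical, abridged encoding of the setting categories listed above (TLOU as the example).
# Full lists are above; a setting can appear in more than one bucket, exactly as in the lists.
SETTING_CATEGORIES = {
    # grows the VRAM footprint used by assets
    "asset_vram": [
        "Characters Texture Quality", "Environments Texture Quality",
        "Draw Distance", "Environments Level of Detail",
    ],
    # mostly costs shading/compute time rather than memory
    "shading_compute": [
        "Texture Filtering", "Bounced Lighting", "Screen Space Reflections",
    ],
    # consumes extra VRAM for intermediate buffers / render targets
    "process_vram": [
        "Directional Shadow Resolution", "Screen Space Ambient Occlusion",
        "Screen Space Reflections",
    ],
}

def settings_to_lower_first() -> list:
    """On a VRAM-limited (e.g. 8 GB) card, reduce settings that add to the VRAM
    footprint (asset or process) before touching the purely compute-bound ones."""
    vram_buckets = SETTING_CATEGORIES["asset_vram"] + SETTING_CATEGORIES["process_vram"]
    return sorted(set(vram_buckets))

print(settings_to_lower_first())
```
The design point mirrors the comment above: if the settings that actually grow the VRAM footprint are identified and lowered first, there is likely a preset that stays within an 8GB budget without gutting the overall visual experience.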
It's simply about hitting publishing deadlines -- at the end of the day, it's a product that needs to be sold, and given the hype around a lot of these new titles, the pressure to release on time is huge. I learned this first-hand many years ago, when I first started working for Futuremark (or MadOnion as it was back then): 3DMark2001 was released without being tested on any shader-capable graphics cards, because no company could provide one in time. The devs created the whole program using the Direct3D software renderer -- fortunately, it worked just fine.
I am a little surprised that publishers will let their games crash, stutter badly, and otherwise appear to malfunction from VRAM exhaustion, as opposed to simply disabling settings beyond a certain point or providing a clear error message ("You do not have sufficient VRAM to play at these settings.")
It's not, unfortunately. The majority of large dev houses will significantly scale back the amount of resources allocated to a title post-launch. In some cases, it'll just be a handful of staff who are responsible for creating patches, alongside the rest of their normal duties.
Whatever those guardrails would cost to develop has to be less expensive than the customer support calls, urgent patch releases, refunds/chargebacks, and brand reputation losses they are incurring now by not having them. The Last of Us in particular is taking a shellacking and if a lot of it is really just "you need to turn your settings down", what a waste of time for all involved.