Is 6GB VRAM Enough for 1440p Gaming? Testing with Nvidia's RTX 2060

Steve

Staff member
By the time 6GB isn't enough you should be replacing the card. I would say about 3 years or so, because I expect new consoles to be in full swing by then.

That is typically when you see a major uptick in VRAM demand, about a year after new consoles arrive. This is simply because the majority of games released are by then designed with those new consoles in mind. As soon as the consoles get a generational spec upgrade, game engines and PC performance demands increase over a relatively short period.

This is likely to apply to CPU specs as well, if new consoles deliver on the rumoured 8-core Zen upgrade. You could quickly find a lot of games needing at least a very fast 6-core CPU, or a slower 8-core one, just to get good performance.

The game that springs to mind from the start of the PS4/XB1 generation, for me personally, was Watchdogs, out in May 2014. Clearly a game designed for the new consoles, which had 3-4GB of VRAM available to games, and ported (badly) to PC. That poor porting meant the VRAM demands of the game were very high, requiring at least 3GB of VRAM to make it run smoothly on anything better than medium settings.

That ruled out a massive swathe of video cards from 2011-2013 that would otherwise have been fast enough, but lacked the memory to avoid stutter. The GTX 670 2GB, for example, was only two years old and still plenty fast for nearly everything, but crumbled in Watchdogs due to lack of memory.
 
I love the article - and the testing seems to have been quite exhaustive... unfortunately, while I think the title question is addressed admirably (yes, the 2060 has enough VRAM for 1440p gaming), the "future-proofing" question cannot be answered.

Of course, this is partly because there simply is no crystal ball that sees the future (which the article also acknowledged towards the end), but also because there is no real comparison against an equivalent card with a different amount of VRAM. This was a more relevant scenario when AMD had 4 and 8GB variants of their mid-range cards and Nvidia had 3 and 6GB variants (although the 1060 3GB and 6GB were not quite identical despite the misleading branding, they were still pretty similar).

Also, the article summary mentions that it will be compared to a 1070Ti, which would be a better comparison - but all the charts say 2070.... is this a typo?

With the 2060, there are no 6 and 8GB variants... comparing it to an 8GB 2070 cannot really do it justice, as the 2070 is a faster card - even if the memory was the same.
 
Honestly if I didn't have my 2700X + Vega 64 rig and was building a rig I would probably get 2600X + RTX2060 FE :p
 
My 970 is getting old fast. Where is Navi?
Fortunately I don't have too much time to game recently, and when I do, I play my backlog of dirt cheap games from the first half of this decade.
 
My 970 is getting old fast. Where is Navi?
Fortunately I don't have too much time to game recently, and when I do, I play my backlog of dirt cheap games from the first half of this decade.
Curious how the 970 holds up (for 1080p, not 1440p). That also got stick for its 3.5GB back in the day, but IRL it was fine at the time. Has it aged okay?
 
The Ray Tracing Quake 2 demonstrations have basically shown that all ray tracing really offers is more reflection and refraction of light off hallways and the ground.

If I didn't already have a 2080Ti and I was trying to buy a card for gaming that I didn't want to be obsoleted anytime soon, I'd aim for a 2070 over the 2060.
 
It is not really the amount of memory, but the memory bandwidth. You see a huge difference between a 1080 Ti and a GTX 1080 at 2160p. At 1080p the gap is much less important. The 1080 clearly crumbles at higher resolutions.
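
To put rough numbers on that, here's a quick back-of-the-envelope calculation in Python (specs quoted from memory, so treat the figures as approximate; the little helper function is just for illustration):

def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    # effective per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte
    return data_rate_gbps * bus_width_bits / 8

gtx_1080 = bandwidth_gb_s(10, 256)     # GDDR5X at 10 Gbps, 256-bit  -> ~320 GB/s
gtx_1080_ti = bandwidth_gb_s(11, 352)  # GDDR5X at 11 Gbps, 352-bit  -> ~484 GB/s
print(gtx_1080, gtx_1080_ti)           # the Ti ends up with roughly 50% more bandwidth

That extra bandwidth matters far more at 2160p, where each frame moves much more pixel data, than the raw difference in buffer size does.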
 
Very interesting.

It would also be interesting to see how the RTX 2080 (8GB) and GTX 1080 Ti (11GB) stack up against each other when it comes to VRAM.
Will the 11GB of the GTX 1080 Ti be better in the long run compared to the RTX 2080's 8GB, or will other advances in the RTX keep it the better choice come 2020 and beyond? :)
 
Thanks for doing this Steve. Just like with the Fiji cards, people are exaggerating the VRAM concerns with this card. Even after nearly 4 years, the only games that really suffer at MAX settings with 4GB of VRAM are RE2, Wolfenstein 2, Forza Horizon 4, and maybe a couple of others.

Against the 980 Ti, the Fury X has aged very well, granted you dial down textures just a notch, which the 980 Ti will probably want to do as well with some of these newer titles. The Fury X can still smash some newer games at 1440p with just minor tweaks to the settings. As you showed in RE2, it was just 15% behind the RTX 2060, which means it was most likely neck and neck with the 980 Ti.

I suspect the RTX 2060 will do just as well in games 4 years from now, so long as the user doesn't select "psycho" textures.
 
....
Also, the article summary mentions that it will be compared to a 1070Ti, which would be a better comparison - but all the charts say 2070.... is this a typo?

With the 2060, there are no 6 and 8GB variants... comparing it to an 8GB 2070 cannot really do it justice, as the 2070 is a faster card - even if the memory was the same.

The intent of this article was to see if 6GB of VRAM would have an effect on modern titles.

It would not have been wise to use last-generation cards, as you would be adding an unknown: whether any performance degradation came from the VRAM or from the architecture difference.

By using an RTX 2070, Steve was able to account for its 15% or so performance advantage. So long as that gap stayed consistent, with no huge penalties to the minimum frame rates, it is safe to say that the 6GB of VRAM was not a problem.
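
To make that reasoning concrete, here's a toy sketch in Python with entirely made-up numbers (not Steve's data); it just shows the kind of consistency check being described:

# Hypothetical averages and 1% lows for an RTX 2060 vs an RTX 2070 (made-up numbers).
games = {
    # game: (2060 avg, 2060 low, 2070 avg, 2070 low)
    "Game A": (78, 61, 90, 70),
    "Game B": (64, 50, 74, 58),
}

for name, (avg60, low60, avg70, low70) in games.items():
    gap = avg70 / avg60 - 1        # how far ahead the 2070 is on average frame rate
    low_ratio = low60 / avg60      # how well the 2060 holds its minimums
    # If the gap balloons well past the usual ~15%, or the 2060's lows collapse,
    # the 6GB buffer becomes the prime suspect. The thresholds here are arbitrary.
    vram_suspect = gap > 0.25 or low_ratio < 0.6
    print(f"{name}: 2070 ahead by {gap:.0%}, 2060 lows at {low_ratio:.0%} of avg, VRAM suspect: {vram_suspect}")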
 
My 970 is getting old fast. Where is Navi?
Fortunately I don't have too much time to game recently, and when I do, I play my backlog of dirt cheap games from the first half of this decade.
Curious how the 970 holds up (for 1080p, not 1440p). That also got stick for its 3.5GB back in the day, but IRL it was fine at the time. Has it aged okay?

I have the KFA2 GTX 970 HOF, i7 4790K, 32GB RAM, AGON 1440p 165Hz G-Sync monitor, 65-80 FPS on ultra.
 
That's on Quake Champions, by the way.
 
The game that springs to mind from the start of the PS4/XB1 generation, for me personally, was Watchdogs, out in May 2014. Clearly a game designed for the new consoles, which had 3-4GB of VRAM available to games, and ported (badly) to PC. That poor porting meant the VRAM demands of the game were very high, requiring at least 3GB of VRAM to make it run smoothly on anything better than medium settings.

That ruled out a massive swathe of video cards from 2011-2013 that would otherwise have been fast enough, but lacked the memory to avoid stutter. The GTX 670 2GB, for example, was only two years old and still plenty fast for nearly everything, but crumbled in Watchdogs due to lack of memory.

Watchdogs ran fine on my 780Ti at 1080p/1440p ultra with TAA. Allocated VRAM is not a measure of a VRAM breach; only massive stuttering is.
 
Watchdogs ran fine on my 780Ti at 1080p/1440p ultra with TAA. Allocated VRAM is not a measure of a VRAM breach; only massive stuttering is.

780Ti was a 3GB card.

Maybe you missed the bit in my post that said you needed 'at least 3GB of VRAM'.....
 
This is why you should buy a used 1070 for $200 or so instead.

Adaptive sync is a huge upgrade for that card, and it has 8GB of VRAM.
 
I don't understand why Nvidia doesn't like VRAM. I mean, if it were not cheap, AMD would not put 8GB in so many cards.
 
Actually it ran fine at 1440p Ultra even on a 2GB GTX 770: https://www.techspot.com/review/827-watch-dogs-benchmarks/page3.html
You don't get that kind of average FPS with a VRAM breach.

You're using an average FPS measurement as evidence that the stuttering and streaming issues caused by VRAM limitations didn't happen in that game? Mkay.

Nope.

Even the game's technical director, no less, pointed to the large amount of frame buffer memory the game requires for smooth gameplay with high-end texture settings. The issues with Watchdogs were well reported at the time. Don't make me google it for you and everyone else; the evidence abounds. The game was not smooth on higher texture settings without more than 2GB of VRAM.
 
You're using an average FPS measurement as evidence that the stuttering and streaming issues caused by VRAM limitations didn't happen in that game? Mkay.

Nope.

Even the game's technical director, no less, pointed to the large amount of frame buffer memory the game requires for smooth gameplay with high-end texture settings. The issues with Watchdogs were well reported at the time. Don't make me google it for you and everyone else; the evidence abounds. The game was not smooth on higher texture settings without more than 2GB of VRAM.

A VRAM breach would not give an average FPS that high relative to 3/4GB cards. Breaching VRAM causes consistent FPS drops that would pummel the card; there is no indication of that with an average FPS that high. That would also be why there is no mention of stuttering or VRAM limits in that article.
 
Watchdogs ran fine on my 780Ti at 1080p/1440p ultra with TAA. Allocated VRAM is not a measure of a VRAM breach; only massive stuttering is.

780Ti was a 3GB card.

Maybe you missed the bit in my post that said you needed 'at least 3GB of VRAM'.....

You said you need at least 3 GB of vram for better than medium settings, which I find HIGHLY doubtful. Hell, I think RE2 did fine on medium with just 2 GB of Vram.
 
You said you need at least 3 GB of vram for medium settings, which I find HIGHLY doubtful. Hell, I think RE2 did fine on medium with just 2 GB of Vram.

"Requiring at least 3GB of VRAM to make it run smoothly on anything better than medium settings."

This was what I said, in plain ol black and white english. Reading comprehension.

A VRAM breach would not give an average FPS that high relative to 3/4GB cards. Breaching VRAM causes consistent FPS drops that would pummel the card; there is no indication of that with an average FPS that high.

I'll say it again: you're using average frame rates to back up your assertion that the game did not have frame time/stutter issues when VRAM limits were exceeded. Average frame rate shows nothing. It is a near useless measurement for highlighting the known problems the game had with VRAM usage.

As I pointed out, the game was badly optimised and badly ported to PC. I only highlighted the large amount of VRAM said game required for higher textures, which definitely impacted the fluidity of that title. It had other performance issues too... lots of CPU threads were also a must. Which leads back to my point about how much new consoles can influence games.
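
To illustrate the earlier point about average frame rates hiding stutter, here's a toy Python example with fabricated frame times: two runs with roughly the same average FPS but very different frame-time behaviour.

# Two fabricated 100-frame runs: one smooth, one with a handful of huge spikes.
smooth = [16.7] * 100                  # ~60 FPS, every frame delivered on time
stutter = [12.0] * 95 + [110.0] * 5    # mostly fast frames, but five big hitches

def stats(frame_times_ms):
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]  # worst 1% of frames
    low_fps = 1000 / (sum(worst) / len(worst))
    return avg_fps, low_fps

for label, run in (("smooth", smooth), ("stutter", stutter)):
    avg, low = stats(run)
    print(f"{label}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")
# Both runs average close to 60 FPS, but the stuttering run's 1% low collapses to ~9 FPS.

That is exactly why frame-time or 1% low data, not average FPS, is what exposes a card running out of VRAM.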
 