Is 6GB VRAM Enough for 1440p Gaming? Testing with Nvidia's RTX 2060

"Requiring at least 3GB of VRAM to make it run smoothly on anything better than medium settings."

This was what I said, in plain ol' black-and-white English. Reading comprehension.



I'll say it again: you're using average frame rates to back up your assertion that the game did not have frame time/stutter issues when its VRAM limit was exceeded. Average frame rate shows nothing here. It is a near-useless measurement for highlighting the known problems the game had with VRAM usage.

As I pointed out, the game was badly optimised and badly ported to PC. I only highlighted the large amount of VRAM the game required for higher texture settings, which definitely impacted the fluidity of that title. It had other performance issues too....

And I'll say again, you would not have relative AVG FPS that high if it was breaching VRAM.
 

Yes, you easily could. Average frame rate measures just that and little else. 155 FPS one second and 5 FPS the next second: wow, look at that 80 FPS 'average frame rate' over two seconds. Perfectly playable! Shame about the massive, noticeable lag you experienced when it dipped to 5 FPS for a whole second. Extreme example, but it shows how 'average frame rate' graphs have limitations.

Frame times that bounce between, e.g., 16 milliseconds and 90+ are not well represented in any average frame rate graph. Please don't claim otherwise; it's pure nonsense.

I assume you have never heard of SLI/CrossFire micro-stutter or used the technology in the past. It often kicks out hugely impressive average frame rates: ooooh, look at those big average numbers, way above single-GPU cards! But that graph exposes absolutely nought of a game's terrible stutter, because the problem lives in the frame times.

Minimum frame rates are also not always representative of the 0.1 percent lows where the issues often reside, as sites often average those across runs as well. That is a site methodology issue, however, and FCAT-style frame time tests reveal the problems.
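To put rough numbers on that point, here's a minimal Python sketch (using made-up frame times, not measured data) of how a run full of ~90 ms hitches can still report a healthy average FPS while the 1% and 0.1% lows collapse:

```python
# Illustrative only: synthetic frame times, not real benchmark data.
# Shows why average FPS hides stutter that 1% / 0.1% lows expose.

frame_times_ms = [16.7] * 990 + [90.0] * 10   # mostly smooth, with 10 big hitches

def avg_fps(times_ms):
    # Average FPS over the whole run: total frames / total seconds.
    return len(times_ms) / (sum(times_ms) / 1000.0)

def percentile_low_fps(times_ms, percent):
    # "1% low" style metric: average FPS of the slowest `percent` of frames.
    n = max(1, int(len(times_ms) * percent / 100))
    slowest = sorted(times_ms, reverse=True)[:n]
    return 1000.0 / (sum(slowest) / len(slowest))

print(f"Average FPS : {avg_fps(frame_times_ms):.1f}")                   # ~57 FPS, looks fine
print(f"1% low FPS  : {percentile_low_fps(frame_times_ms, 1):.1f}")     # ~11 FPS
print(f"0.1% low FPS: {percentile_low_fps(frame_times_ms, 0.1):.1f}")   # ~11 FPS
```

The average comes out around 57 FPS, which looks perfectly playable on a bar chart, while the 1% low sits near 11 FPS: exactly the stutter an average graph hides.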

Watch Dogs had well, well, well-documented stutter problems and, for the time, massive VRAM requirements on the higher texture settings. Anyone can easily look it up for themselves if they don't believe me.
 
"Requiring at least 3GB of VRAM to make it run smoothly on anything better than medium settings."

This was what I said, in plain ol' black-and-white English. Reading comprehension.



I'll say it again: you're using average frame rates to back up your assertion that the game did not have frame time/stutter issues when its VRAM limit was exceeded. Average frame rate shows nothing here. It is a near-useless measurement for highlighting the known problems the game had with VRAM usage.

As I pointed out, the game was badly optimised and badly ported to PC. I only highlighted the large amount of VRAM the game required for higher texture settings, which definitely impacted the fluidity of that title. It had other performance issues too.... Lots of threads were also a must, leading back to my point about how much new consoles can influence games.

Slow down, Turbo. Pretty sure I edited my post about 30 seconds after sending. In any case, I doubt 3GB is needed even for High.

Edit: disregard, I see the changes were not saved.

As for console influence, you might have a point there, but I am not sure I would blame Watch Dogs' optimization on it.
 

Medium textures needed 2GB @ 1080p. High was one setting up and used something like ~2.6GB. You would start to see issues on 2GB cards at that point: poor frame times, stutter, streaming issues, etc. The higher you went, the more extreme the issues.

Beyond that setting (Very High, Ultra) the VRAM demands increased further, but they had less impact on texture streaming if you had a 3GB card or more; the game shuffled memory a little better there.

The game's technical director stated that on the newest consoles it used over 3GB of their unified memory for video purposes, and the PC version was based on that version. It was a rubbish port anyway, but it's pretty much a given that the steep multi-threaded CPU and VRAM demands were tied to the engine being built around the new consoles.

https://twitter.com/SebViard/status/472030606420115458
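For a rough sense of why one notch on the texture slider moves VRAM so much, here's a back-of-the-envelope sketch; the sizes and compression rates are generic assumptions, not Watch Dogs' actual asset data:

```python
# Rough per-texture VRAM cost: width x height x bytes per pixel,
# plus about one third extra for the full mip chain. Illustrative maths only.

MIP_FACTOR = 4 / 3

def texture_mb(size, bytes_per_pixel):
    return size * size * bytes_per_pixel * MIP_FACTOR / (1024 ** 2)

for size in (1024, 2048, 4096):
    bc1 = texture_mb(size, 0.5)   # BC1 block compression: 0.5 bytes per pixel
    bc7 = texture_mb(size, 1.0)   # BC7 block compression: 1 byte per pixel
    raw = texture_mb(size, 4.0)   # uncompressed RGBA8: 4 bytes per pixel
    print(f"{size}x{size}: BC1 ~{bc1:.1f} MB, BC7 ~{bc7:.1f} MB, RGBA8 ~{raw:.1f} MB")
```

Each step up in texture resolution roughly quadruples the footprint, so a texture pool that fits comfortably in 2GB at one setting can spill well past it one setting higher, before frame buffers, geometry and shadow maps are even counted.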
 
This article was really self-explanatory prior to reading it... If a 980 Ti can last this long at 1440p w/ 6GB VRAM and is a worse card, then why can't the 2060?

Because the 980Ti is a few years old already... and the 2060 has the same amount of RAM... so the question is highly relevant! 6GB might be good for right now... but there's no evidence to suggest it will be sufficient in a couple of years...

The intent of this article was to see if 6GB of VRAM would have an effect on modern titles.

It would not have been wise to use last-generation cards, as you would be adding an unknown: whether any performance degradation came from the VRAM or from the architecture difference.

By using an RTX 2070, Steve was able to account for its roughly 15% performance advantage. So long as that gap stayed consistent, with no huge penalties to the minimum frame rates, it is safe to say that the 6GB of VRAM was not a problem.

True... but it was also implied that this card would "age well", since the 980Ti, with the same amount of VRAM, aged well. This logic doesn't make any real sense.

The 980Ti is about 3.5 years old... it was a fantastic card then, and is still a pretty decent card today - I agree that it has aged well (I'm actually quite peeved about it, as I bought three Titan X cards a few weeks before it came out!!).

But this has no real bearing on whether the 2060 will "age well"... yes, if the 2060 had come out 3.5 years ago, that would be a logical conclusion.... but it came out now!!

Can we expect the 980Ti to still perform 3.5 years from now? Can anyone show me any evidence to support this?

Yes, it can play modern titles now... but we don't know that it will in a year, or 2 years.... or 3....

While I know it's apples to oranges, I find it intriguing that the Maxwell Titan X, which is just about 4 years old now, came with 12GB of VRAM.... The 2080Ti (which costs more than the Titan did) has only 11GB... yes, it's faster... but how will it age? How will any of the Nvidia cards age?

While it pains me to bring this up, I recall ridiculing members on this site (HardReset, I'm looking at you!) who claimed that AMD cards always aged far better than Nvidia. While I still think they were wrong in those specific instances, it would be interesting to revisit this thread in a year or so...

We can't simply make a comparison to a similar card with more VRAM to decide if the card will perform well in the future - we don't know how much RAM future titles will require!
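Going back to the 2060-versus-2070 consistency check described above, a minimal sketch of that kind of test could look like the following; the game names, FPS figures and thresholds are all hypothetical, not Steve's data:

```python
# Hypothetical numbers, purely to illustrate the idea: if the RTX 2070's lead
# over the 2060 stays near its usual margin and the 2060's 1% lows don't
# collapse, VRAM probably isn't the limiting factor in that title yet.

EXPECTED_GAP = 1.15     # assumed typical 2070-over-2060 advantage (~15%)
GAP_TOLERANCE = 0.10    # flag titles where the gap blows out by more than 10%
LOW_RATIO_FLOOR = 0.70  # flag titles where 1% lows drop below 70% of the average

games = {
    # name: (2060 avg FPS, 2060 1% low, 2070 avg FPS)  -- made-up results
    "Game A": (82, 65, 94),
    "Game B": (71, 58, 82),
    "Game C": (66, 31, 88),   # suspicious: oversized gap and collapsed lows
}

for name, (avg_2060, low_2060, avg_2070) in games.items():
    gap = avg_2070 / avg_2060
    low_ratio = low_2060 / avg_2060
    suspect = gap > EXPECTED_GAP * (1 + GAP_TOLERANCE) or low_ratio < LOW_RATIO_FLOOR
    verdict = "possible VRAM limit" if suspect else "looks fine"
    print(f"{name}: gap {gap:.2f}x, 1% low/avg {low_ratio:.2f} -> {verdict}")
```

It's only a heuristic, of course; driver overhead or CPU limits can skew the gap too, which is why same-architecture cards make the cleanest comparison.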
 
This article was really self-explanatory prior to reading it... If a 980 Ti can last this long at 1440p w/ 6GB VRAM and is a worse card, then why can't the 2060?

Because the 980ti was a flagship powerhouse with massive overclock headroom and offered the most VRAM of any card on the market.

IMO this 2060 with 6GB (same as a 1060) really limits its staying power over the next few years.

But I know of one example of positive optimization that could maybe continue. Forza Horizon 3 was a VRAM hog, albeit a beautiful game. Forza Horizon 4 looks even better and requires much less horsepower and VRAM. This could continue.
 
The Titan X was released a few months before it... it had double the VRAM... but yes, your point stands :)
 

Good catch, but that wasn't really a consumer-grade GPU.

I don't keep cards for more than a year at a time anyway. I see no reason to purchase this 2060; it's powerful, but at a cost. I'd prefer to keep my 1070 and let its VRAM ceiling keep me in the game longer now that it has adaptive sync support.
 
Great stuff @Steve - excellent work as usual.

Is there any way to do content creation VRAM testing? E.g. timeline and effects latency in Premiere + After Effects and Resolve 4K editing, and complex Blender model rendering. Might be an interesting thing to compare, with Radeon VII around the corner.
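One low-effort way to gather that kind of data, assuming an Nvidia card with nvidia-smi available, is simply to poll VRAM usage while the render or export runs; a quick sketch along these lines:

```python
# Polls VRAM usage once per second via nvidia-smi while a render/export runs.
# Assumes an Nvidia GPU with nvidia-smi on the PATH; stop with Ctrl+C.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

peak_mb = 0
try:
    while True:
        out = subprocess.check_output(QUERY, text=True)
        first_gpu = out.splitlines()[0]                  # first GPU only
        used_mb, total_mb = (int(v) for v in first_gpu.split(","))
        peak_mb = max(peak_mb, used_mb)
        print(f"VRAM: {used_mb} / {total_mb} MB (peak so far: {peak_mb} MB)")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak VRAM observed: {peak_mb} MB")
```

It won't capture timeline latency the way a proper Premiere or Resolve benchmark would, but it at least shows whether a 4K edit or a complex Blender scene is actually bumping against a 6GB ceiling.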
 
Because the 980Ti is a few years old already... and the 2060 has the same amount of RAM... so the question is highly relevant! 6GB might be good for right now... but there's no evidence to suggest it will be sufficient in a couple of years...

True... but it was also implied that this card would "age well", since the 980Ti, with the same amount of VRAM, aged well. This logic doesn't make any real sense.

The 980Ti is about 3.5 years old... it was a fantastic card then, and is still a pretty decent card today - I agree that it has aged well (I'm actually quite peeved about it, as I bought three Titan X cards a few weeks before it came out!!).

But this has no real bearing on whether the 2060 will "age well"... yes, if the 2060 had come out 3.5 years ago, that would be a logical conclusion.... but it came out now!!

Can we expect the 980Ti to still perform 3.5 years from now? Can anyone show me any evidence to support this?

Yes, it can play modern titles now... but we don't know that it will in a year, or 2 years.... or 3....

While I know it's apples to oranges, I find it intriguing that the Maxwell Titan X, which is just about 4 years old now, came with 12GB of VRAM.... The 2080Ti (which costs more than the Titan did) has only 11GB... yes, it's faster... but how will it age? How will any of the Nvidia cards age?

While it pains me to bring this up, I recall ridiculing members on this site (HardReset, I'm looking at you!) who claimed that AMD cards always aged far better than Nvidia. While I still think they were wrong in those specific instances, it would be interesting to revisit this thread in a year or so...

We can't simply make a comparison to a similar card with more VRAM to decide if the card will perform well in the future - we don't know how much RAM future titles will require!

I think you are overreacting a bit.
The whole point of the Fury X test was to show that a VRAM-deficient card can be a heavy hitter if you make minor sacrifices to the settings. Even using max settings, the Fury X has aged very well if you compare its release performance to what it was able to achieve 3 years later. Even now, only a handful of games really take a bath with 4GB of VRAM at max settings.

Pulled from another forum comparing the FuryX to the 980ti:
At launch: https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html (4K: 2% slower, 1440p: 8% slower, 1080p: 12% slower)
Two years ago: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/24.html (4K: 3% faster, 1440p: equal, 1080p: 7% slower)
Today: https://www.techpowerup.com/reviews/ASRock/RX_580_Phantom_Gaming_X/31.html (4K: 4% faster, 1440p: 2% faster, 1080p: 3% slower)

This didn't show minimum frames, but other sites like this one show the Fury X pulling strong in those games.

Let's forget about the RTX 2060, as it is really not the midrange hero that the 1160 Ti (or whatever it ends up being called) will be. In 3 years, that card will have no business running games at 1440p über settings, but neither will the RX 680 12GB (or whatever it is called), and that has nothing to do with VRAM.

Bottom line: if you plan on playing games at 1080p ultra in 3 years, the 6GB will suffer. However, if you are fine with 1080p or 1440p at SLIGHTLY less detail, 6GB will be no problem.

*several edits. A little drunk.
 
I didn’t mention any FuryX tests.... I only said AMD as a counterpoint to “aging well”....
Very few mid-range cards age well, as they aren’t really meant to.... if you really need a card to game well for more than a few years, you generally have to buy a flagship GPU - hence the 980Ti being useful even today....

I know sites like to give the “cost per frame”, but this isn’t really why people buy high-end GPUs.... while the 2080 might not give double the performance of the 2060 NOW....it probably will in 3 years....

Of course, if you’re swapping out GPUs every year or 2, this is irrelevant... but if you want to keep a GPU for 4-5 years - you get what you pay for.
 

I guess it stems from the HD 7970, which was usually behind the GTX 680 and now takes it to school. It was a combination of VRAM size AND bandwidth, among factors such as overrated Nvidia driver support. The R9 280 was a refresh, and the R9 290/390 stayed strong in comparison to the GTX 700 series. Things kind of went downhill for AMD with the release of Maxwell, but Polaris was never seen as high end, so it sort of got a break in a way. Polaris has stayed strong these last few years, partly due to the 8GB of VRAM but more so the extra attention it gets for being the heart of the Sony and MS consoles.
 
Also don't forget that Tahiti was a fan favorite for its stellar Double Precision performance that came in handy for other computing tasks.
 
I might have gotten carried away, but truth be told it has a lot to do with overrated Nvidia driver support over the years and the legacy of Tahiti. After the R9 290/390, AMD sort of avoided the high end so as to stay the price/performance king in most cases.
 
Rise of the Tomb Raider was an Nvidia title designed to use less than 6GB with no negative impact, and to require more than 4GB, with a big impact if 4GB was the limit. It was like this because Nvidia's flagship at the time was the 980ti, and AMD had just released the Fury cards; a big deal was made out of how the game was a mess with the highest texture setting on a Fury card but silky smooth on a 980ti. My local forum has threads for different games with performance graphs, and ROTTR has virtually no Fury results due to the Ultra texture requirement and how it led to single-figure and even negative minimums. So it's not a good choice for testing whether 6GB of memory is enough, because it was built with that amount in mind.
 
The Titan X was released a few months before it... it had double the VRAM... but yes, your point stands :)

Good catch, but that wasn't really a consumer-grade GPU.

It basically was, though. It did not have superior FP64 performance over the 980Ti, unlike the first-gen Titan/Titan Black, which did compared to the 780/780Ti. Its doubled VRAM was the only selling point for prosumers with apps that could make use of it; otherwise it was just a slightly faster gaming card compared to a 980Ti.

Titan as prosumer branding has been pretty inconsistent, given that only 4 of the 7 cards released have features that make them slightly more appealing than the model below, and one of those (the Titan V) can't even be paired over NVLink to make use of multiple cards with pooled memory, which is the main advantage of that interface :\
 
Yep... what made the Titan X (Maxwell) sell was that it came out before the 980Ti.... I know that I never would have bought it had I known about the 980Ti....
 
The Ray Tracing Quake 2 demonstrations have basically shown that all ray tracing really offers is more reflection and refraction of light off hallways and the ground.

If I didn't already have a 2080Ti and I was trying to buy a card for gaming that I didn't want to be obsoleted anytime soon, I'd aim for a 2070 over the 2060.

In a hybrid rendering system, sure, that is the case. Remember, ray tracing isn't just a one-tick feature like tessellation; it's an actual rendering technique, the eventual next step in the progression sprites -> vectors -> rasterization -> ray tracing.
In a hybrid scenario you're only going to get very specific perks: in this case, according to Nvidia, lighting/shadows, reflections, refraction, and color shading. Keep in mind that with current compute you can only cast so many rays in real time.
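Just to put a rough number on "only so many rays", here's a back-of-the-envelope calculation; the ~5 Gigarays/s budget is an assumed marketing-style figure, not a measured one:

```python
# Rough ray budget arithmetic at 1440p / 60 FPS. Illustrative assumptions only;
# the "Gigarays" number is a best-case, marketing-style figure.

width, height, fps = 2560, 1440, 60
pixels_per_second = width * height * fps      # ~221 million shaded pixels per second

budget_rays_per_second = 5e9                  # assume a "~5 Gigarays/s" class GPU
rays_per_pixel_per_frame = budget_rays_per_second / pixels_per_second

print(f"Pixels per second: {pixels_per_second / 1e6:.0f} million")
print(f"Ray budget per pixel per frame: ~{rays_per_pixel_per_frame:.0f}")
```

That works out to a couple of dozen rays per pixel at best, before bounces or shading cost, which is exactly why hybrid renderers spend them on a handful of specific effects and lean heavily on denoising.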
 
It basically was, though. It did not have superior FP64 performance over the 980Ti, unlike the first-gen Titan/Titan Black, which did compared to the 780/780Ti. Its doubled VRAM was the only selling point for prosumers with apps that could make use of it; otherwise it was just a slightly faster gaming card compared to a 980Ti.

Titan as prosumer branding has been pretty inconsistent, given that only 4 of the 7 cards released have features that make them slightly more appealing than the model below, and one of those (the Titan V) can't even be paired over NVLink to make use of multiple cards with pooled memory, which is the main advantage of that interface :\
Yep... what made the Titan X (Maxwell) sell was that it came out before the 980Ti.... I know that I never would have bought it had I known about the 980Ti....

I bought the Titan X (Maxwell) specifically to play The Witcher 3, which the GTX 980 couldn't handle well. On the other hand, having 12GB of VRAM meant I could play unoptimized titles on launch day without any hiccups, like Batman: Arkham Knight (which was so unoptimized they allowed unrestricted refunds) and Dishonored 2; those two games required something like 10GB of VRAM before being patched.
 
I'm running a GTX 970 @ 1440p and it seems to be doing fine still. By fine, I mean games up to and including The Witcher 3's release date, with a few settings tapped down. Not a big deal.
I'm sure Shadow of the Tomb Raider might give it a slap in the face, but the tweaking you can do these days within the advanced graphics settings really allows you to make it work quite well with very little visual loss.
But heck - I barely have enough time to make a dent in my Steam list of games, let alone install them all, lol. So maybe my gaming interest is waning.

Although it's nice to see GPU prices come back down to acceptable levels once again.
 

It depends on one's preferences and budget. For example, I am a casual gamer who plays at 1440p with maxed-out settings. If I cannot achieve 90+ Hz, I buy a new video card. Playing with lowered settings is not good enough for me.
 