Many Years Later: 3GB vs 6GB GTX 1060 in Today's Games

Honestly, the 3GB card holds up better against the 6GB than I expected. It seems like if you're using a 1080p monitor, you're fine. Sure, you can't max out some games, but come on, you have a lower-midrange card from 5 years ago!
 
I stopped using my GTX 780 only a couple of weeks ago. It had 3GB of VRAM, but I'd been using it with a 4K monitor without any issues for years, and even played some games, like SC-2, where it performed well at 4K resolution.

It was only recently that I upgraded to an RTX 3080 Ti, but since I'm not much of a gamer (I bought it mainly for deep learning), I haven't noticed much difference, LOL.

I think that, to a large extent, investing a lot of money into an expensive graphics card only makes sense for those who have specific needs for it, like playing a lot of modern games, deep learning, or crypto (may they burn in hell).

For a casual gamer who mainly needs a good productivity PC, even an RTX 3060 is overkill. But still, considering it's the latest gen, I would just get the RTX 3060, and that's plenty.
 
I bought the 1060 3GB years ago following your article's suggestion, I remember it clearly. And a few years later you're telling me it was a mistake to buy the 3GB version, and that it would have been better to buy the 6GB. Argh, damn you -_-'. (No worries, I'm not upset, just commenting :p. I'm actually satisfied with my video card.)
 
I never got the whole 1060 thing, Steve. It was overall a bit slower and cost more than an RX 580. Plus, Nvidia always had a consistent progression between tiers in their lineup, but the 1060 was WAY behind the 1070.


Yeah, that's why I bought the GTX 1070. The RX 580 would have been another good option, but the crypto-mining surge was just ending and the RX 580 was still too expensive. I knew the 3GB 1060 was a bad choice, as it was already having issues in GTA 5 and Arkham Knight due to the low VRAM. Those were games that were only about a year old on PC at the time.

The same thing is going to happen with the 3050/3050 Ti, I think. 4GB of VRAM is a low-settings spec for some of the newer demanding games. The price is also too close to the 3060 (laptop), and if the desktop version of the 3050 (Ti) comes out with only 4GB, I think it will meet the same fate. Having the same amount of VRAM as the 1050 that was released 5 years ago doesn't sound good to me.
 
I never got the whole 1060 thing, Steve. It was overall a bit slower and cost more than an RX 580. Plus, Nvidia always had a consistent progression between tiers in their lineup, but the 1060 was WAY behind the 1070.


Agreed.

However, the laptop situation in 2016 was much different than it is now. Laptops were just starting to move away from the "mobile" variants and getting beefier versions of the real thing (though slightly underclocked).

I had a 1070 for my desktop and still use a 1060 gaming laptop. There was no comparison in performance, but the 1060 worked great for 1080p gaming (especially in 2016, when almost all gaming laptops had 1080p monitors). At 1080p, the 1060/i7 combo can still play almost every game maxed out.

The 1060 was an overpriced, odd duck, but (IMO) it filled some niches.
 
I walked into Micro Center Westbury yesterday. They had plenty of 6800 XT and 6700 AMD cards, and they even had some Nvidia Quadro cards.
 
Same thing is going to happen with the 3050/3050ti, I think. 4GB VRAM is a low settings spec for some of the newer demanding games. The price is also too close to the 3060 (laptop) and if the desktop version of the 3050 (Ti) comes out with only 4GB, I think it will meet the same fate. Having the same amount of VRAM as the 1050 that released 5 years ago doesn't sound good to me.
Contrary to the unending false impression tech sites give out, many people don't buy bottom-tier GPUs to run AAA games badly; they buy them to run the bulk of the 30,000+ indie / middleware / older titles on Steam, GOG, etc. (which typically use a lot less VRAM) at an affordable price. I own a GTX 1660 (6GB), but 95% of the games I run (plus those in the top 100 games people are consistently playing) use under 4GB, and most indies in 2021 haven't actually gotten much "VRAM-thirstier" than those from 2014-2015.

It also doesn't hurt to have at least one 4GB card on the market whose "too low" VRAM also makes it undesirable to miners, for the many people who run the above-mentioned lighter-weight games and actually want the card in stock...
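
(For anyone wondering where per-game numbers like those come from, here's a minimal sketch of how you could log them yourself. It assumes an Nvidia card with the stock nvidia-smi tool on the PATH; run it in the background while playing and note the peak. It reports total GPU memory in use, so anything else running on the GPU is included.)

```python
# Minimal VRAM logger sketch, assuming an Nvidia GPU and nvidia-smi on PATH.
# Polls total used GPU memory once a second and tracks the peak; stop it
# with Ctrl+C. Figures include everything using the GPU, not just the game,
# so treat them as an upper bound.
import subprocess
import time

peak = 0
while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used_mib = int(out.strip().splitlines()[0])  # first GPU only
    peak = max(peak, used_mib)
    print(f"VRAM used: {used_mib} MiB (peak so far: {peak} MiB)")
    time.sleep(1)
```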
 
Contrary to the unending false impression tech sites give out, many people don't buy bottom-tier GPUs to run AAA games badly; they buy them to run the bulk of the 30,000+ indie / middleware / older titles on Steam, GOG, etc. (which typically use a lot less VRAM) at an affordable price. I own a GTX 1660 (6GB), but 95% of the games I run (plus those in the top 100 games people are consistently playing) use under 4GB, and most indies in 2021 haven't actually gotten much "VRAM-thirstier" than those from 2014-2015.

It also doesn't hurt to have at least one 4GB card on the market whose "too low" VRAM also makes it undesirable to miners, for the many people who run the above-mentioned lighter-weight games and actually want the card in stock...

Older games, yes; indie titles, not so much. Many of these indie games running on Unity or UE4 are made by people who aren't programmers and don't bother with any optimization at all, leaving their games heavily unoptimized and using lots of VRAM and RAM. I've seen 2D NES-style pixel-art indie games using over 1GB of VRAM and 3GB of RAM! 3D Amnesia-clone hide-and-seek-simulator asset flips with PS2-tier graphics using over 5GB of VRAM and 10GB of RAM!
 
Older games, yes; indie titles, not so much. Many of these indie games running on Unity or UE4 are made by people who aren't programmers and don't bother with any optimization at all, leaving their games heavily unoptimized and using lots of VRAM and RAM. I've seen 2D NES-style pixel-art indie games using over 1GB of VRAM and 3GB of RAM! 3D Amnesia-clone hide-and-seek-simulator asset flips with PS2-tier graphics using over 5GB of VRAM and 10GB of RAM!
Depends on the game. Real-life examples: The Pedestrian and Wildfire (both 2020), Dusk (2018), Dex (2015), Portal 2 (2011), BioShock 1 Classic (2007), and even Unreal Gold (1998) + HD texture pack all use the same 0.9GB of VRAM at 1080p (not even half of a GT 1030). There certainly are some glorified "asset flipper" turds at the lower-quality end of the market, but the bulk of indies, even higher-grade ones like Desperados 3 and SOMA (both 1.8GB), Shadow Tactics: Blades of the Shogun (1.2GB), etc., are well under 4GB of VRAM. It's mostly the AAA industry, stuck in the "I bet we can bloat out our games faster than you can upgrade your GPU" rat race, that actually forces an upgrade to much higher-priced GPUs.
 
Depends on the game. Real-life examples: The Pedestrian and Wildfire (both 2020), Dusk (2018), Dex (2015), Portal 2 (2011), BioShock 1 Classic (2007), and even Unreal Gold (1998) + HD texture pack all use the same 0.9GB of VRAM at 1080p (not even half of a GT 1030). There certainly are some glorified "asset flipper" turds at the lower-quality end of the market, but the bulk of indies, even higher-grade ones like Desperados 3 and SOMA (both 1.8GB), Shadow Tactics: Blades of the Shogun (1.2GB), etc., are well under 4GB of VRAM. It's mostly the AAA industry, stuck in the "I bet we can bloat out our games faster than you can upgrade your GPU" rat race, that actually forces an upgrade to much higher-priced GPUs.

Yes, but most of the games you mention (except The Pedestrian, Wildfire, and Desperados 3) are pretty old already. I was thinking of more recent stuff. Desperados 3 I see more as an A or AA title than a "true" indie; these games tend to have more polish. And Wildfire, I had never heard of it, but I've taken a look at the Steam page, and well, I don't think it's acceptable to see a game with those 2D pixel graphics taking up almost 1GB of VRAM at 1080p, so it sort of proves my point...
 
Yeah, that's why I bought the GTX 1070. [...] I knew the 3GB 1060 was a bad choice, as it was already having issues in GTA 5 and Arkham Knight due to the low VRAM.

The best price/speed ratio has always been the GTX/RTX xx70 versions; they're the low end of the high-end chips, so they're very good and powerful without being too expensive.

If the main goal is to keep a card for 5 years or more, then the xx70 versions are the way to go. And VRAM never hurts; the more, the better. In 2021, buying less than 8GB is not a good idea.

IMHO, 2020/2021 are the worst years to buy a latest-generation card. I would wait for 2022, so prices come down and special versions with more VRAM come out.
 
I walked into Micro Center Westbury yesterday. They had plenty of 6800 XT and 6700 AMD cards, and they even had some Nvidia Quadro cards.


And yet they (AMD) have had some of the worst uptake in new-GPU market share... What this tells us is that people do not see "value" in what AMD is offering.

Maybe if their bait-and-switch pricing were actually real, more people would be interested, but when "retail" on their cards is higher than the next guy's next tier up, people are going to skip the overpriced cards and hold out hope for the cheaper alternatives.

The holy grail this gen has been an MSRP 3080, which in fact does exist (or did before... well, you know), and not just the few that Nvidia pushed out of their own pockets, but actual AIB cards.

Almost everything else above that has been overpriced, and yeah, the whole market is screwed up, but just going by actual retail costs, the 3080 is definitely the best value you can get.

Me, I've got an MSRP Asus 3090 and a water-cooled EVGA FTW3 Ultra 3080 Ti. Both cost significantly more, but compared to what others have paid for much less, I'm happy with what I got.

My 3080 Ti was a direct swap with a friend for my exact same card in 3080 form; he paid for the Ti and gave it to me for the 3080, as he WAS hoping to make a bunch of it back with crypto.

And well...

I'm sure he can still make some, but that's the crypto game, and that's why I was happy to take the extra "free" performance; I don't mess with all that risk.
 
I've gotten used to using my GPUs for longer periods of time.

One: the games that can actually push my GPU properly are perpetually delayed, leaving me a bunch of indie hits to max out.

Two: those games, when finally released, are so broken they need 6-12 months of patches to fix (Horizon Zero Dawn, Fallout 4, and Cyberpunk come to mind).

I have a system where I reuse my old GPU in my HTPC, and the GTX 960 2GB currently there can handle newer titles like Horizon Zero Dawn and Fallen Order at 1080p medium settings, but I'm glad I paid an extra 20 bucks for the 1060 6GB I'm currently FPS gaming on.

The reason I chose this over the RX 570 is that I knew cooling would be harder on a 200W card, and because I know there are thousands of "little nags" you're unofficially signing yourself up for if you ever buy an AMD discrete card (a lot more corner-case incompatibilities from incomplete driver testing than Nvidia ever shipped).
 
"Pascal was an epic release from Nvidia back in the day, but they did have the few bad apples like this one and the GT 1030 DDR4 version. But all that drama is now in the past, and what we have in the present is a second hand 3GB GTX 1060 that we're going to re-test."
Yeah, nVidia has a whole different brand of drama now.
 
I've gotten used to using my GPUs for longer periods of time.

One: the games that can actually push my GPU properly are perpetually delayed, leaving me a bunch of indie hits to max out.

Two: those games, when finally released, are so broken they need 6-12 months of patches to fix (Horizon Zero Dawn, Fallout 4, and Cyberpunk come to mind).

I have a system where I reuse my old GPU in my HTPC, and the GTX 960 2GB currently there can handle newer titles like Horizon Zero Dawn and Fallen Order at 1080p medium settings, but I'm glad I paid an extra 20 bucks for the 1060 6GB I'm currently FPS gaming on.

The reason I chose this over the RX 570 is that I knew cooling would be harder on a 200W card, and because I know there are thousands of "little nags" you're unofficially signing yourself up for if you ever buy an AMD discrete card (a lot more corner-case incompatibilities from incomplete driver testing than Nvidia ever shipped).

There's also the abysmal OpenGL performance on all AMD GPUs (higher-end models might not seem like it, but that's only because they brute-force it, at the cost of much higher GPU usage in that scenario than a similar Nvidia model). It's one of the main reasons I have never purchased a discrete AMD graphics card. Of course, it's something that's becoming progressively less important (mainly thanks to Vulkan), but it was still fairly important until about 3 years ago.

Even if it's not too important anymore, it still bothers me that AMD/ATi have never bothered to fix OpenGL performance since the Radeon 7000 (R100) days. It leaves a bad taste.
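
(If anyone wants to confirm which OpenGL driver and version their system actually exposes before comparing performance, here's a quick sketch. It assumes Linux with glxinfo from the mesa-utils package installed; on Windows, a tool like GPU-Z reports the same info.)

```python
# Quick OpenGL driver check sketch, assuming Linux with glxinfo installed
# (mesa-utils package). Prints the vendor, renderer, and GL version strings
# the driver reports, which tells you whose OpenGL stack you're running on.
import subprocess

out = subprocess.check_output(["glxinfo"], text=True)
for line in out.splitlines():
    if line.startswith(("OpenGL vendor string",
                        "OpenGL renderer string",
                        "OpenGL version string")):
        print(line.strip())
```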
 
Contrary to the unending false impression tech sites give out, many people don't buy bottom tier GPU's to run AAA games badly, they buy them to run the bulk of the +30,000 Indie / middleware / older titles on Steam, GOG, etc (which typically use a lot less VRAM) at an affordable price. I own a GTX 1660 (6GB) but 95% of games I run (plus those on the top 100 games people are consistently playing) are under 4GB, and most Indie's in 2021 haven't actually got much "VRAM thirstier" than those from 2014-2015.

It also doesn't hurt to have at least one 4GB VRAM card on the market whose "too low VRAM" also makes it undesirable for miners for the many people who run above-mentioned lighter weight games and actually want the card in stock...

I see your point; I play mostly AAA titles and only occasionally play indies and esports games, so I may have been biased towards the more mainstream titles. But I have some more reasons for concern.

Here is my issue: the 3050 Ti (laptop version) has double the raw performance of the 1050 Ti (laptop version), yet still has the same amount of VRAM (4GB). It just seems weird that the VRAM didn't scale up as well. It's OK if you use it to play indies, older titles, and esports, but it will struggle in newer, more mainstream games, and you end up in a weird situation where you could have a very high framerate at a resolution like 1080p but have to stick to low settings because of the VRAM limit. And if you only have a 60Hz screen, there's not much you can do with those extra frames.

Furthermore, the 3050 and 3050 Ti are also RTX cards and the first ray-tracing cards to have less than 6GB of VRAM. I don't know if the desktop versions will be the same or will at least have more VRAM. Either way, this is also a problem for cards advertised as "ray-tracing" cards, as you need more VRAM (1-2GB more) in games where you want to use ray tracing.
 
There's also the abysmal OpenGL performance on all AMD GPUs (higher-end models might not seem like it, but that's only because they brute-force it, at the cost of much higher GPU usage in that scenario than a similar Nvidia model). It's one of the main reasons I have never purchased a discrete AMD graphics card. Of course, it's something that's becoming progressively less important (mainly thanks to Vulkan), but it was still fairly important until about 3 years ago.

Even if it's not too important anymore, it still bothers me that AMD/ATi have never bothered to fix OpenGL performance since the Radeon 7000 (R100) days. It leaves a bad taste.

OpenGL games were very, very rare, and AMD had OpenGL issues back in the early 2000s, up until after the HD 4870, like 10-14 years ago. OpenGL is irrelevant now, as it has been superseded by Vulkan. Current AMD drivers and GPUs are far less CPU-reliant than Nvidia's and have been since Kepler.
 
OpenGL games were very, very rare, and AMD had OpenGL issues back in the early 2000s, up until after the HD 4870, like 10-14 years ago. OpenGL is irrelevant now, as it has been superseded by Vulkan. Current AMD drivers and GPUs are far less CPU-reliant than Nvidia's and have been since Kepler.

Yes, that's what I said: AMD has had OpenGL issues since the Radeon 7000 (R100) generation (released in 2000).

OpenGL games aren't as rare as you imply; otherwise it wouldn't be an issue. If you only ever play AAA games, then I'd agree that OpenGL doesn't matter much, but if you play lots of indie games and emulators, then subpar OpenGL performance is still a problem (most emulators offer Vulkan renderers nowadays, but in most cases they're buggier and less stable than the OpenGL renderer).

It's like I said: even if the day arrives when OpenGL performance doesn't matter at all, the fact that ATi/AMD have never bothered to try and fix the problem in 20 years doesn't do them any favors when it comes to gaining my trust.
 
Yes, that's what I said: AMD has had OpenGL issues since the Radeon 7000 (R100) generation (released in 2000).
I just wanted to mention that Nvidia was never much for mastering OpenGL games either, compared to the OpenGL champ, which was 3Dfx. 3Dfx used to release separate mini-drivers for specific OpenGL titles, called MiniGL.
 