GeForce GTX 1060 3GB vs. Radeon RX 570 4GB: 2018 Update

Julio Franco

Posts: 9,097   +2,048
Staff member
Enjoyed the review, but I would like to see the mid-range GPUs tested on a mid-range CPU. The current setup is not representative of what users have. The GPU driver overhead is an important factor, especially for older systems.
 
Enjoyed the review, but I would like to see the mid-range GPUs tested on a mid-range CPU. The current setup is not representative of what users have. The GPU driver overhead is an important factor, especially for older systems.
Even on a mid-range CPU you'll hit the GPU limit before the CPU limit with these two cards. You would have to go for something cheaper than an R5 1600 to see even a very small difference.
 
I would never ever buy or promote the 1060 3GB due to its naming scheme. And yes, to me FreeSync is important enough to choose the RX 570 over the GTX 1060 3GB.

Additionally, it only has 3GB. The RX 570 4GB will likely fare better at 1440p than the 1060 3GB. Would love to see this tested, actually.
 
I would never ever buy or promote the 1060 3GB due to its naming scheme. And yes, to me FreeSync is important enough to choose the RX 570 over the GTX 1060 3GB.

Additionally, it only has 3GB. The RX 570 4GB will likely fare better at 1440p than the 1060 3GB. Would love to see this tested, actually.


Because running a 1080p card on a 2K panel seems like a good idea. Here's another one: get an Audi S8 and stick an LPG system in it... The naming scheme... again with that nonsense. Who gives a rat's *** what it's called? They could've named their whole line-up GTX 1080 and added letters to the end for all intents and purposes. I never heard anyone cry wolf because they couldn't tell the 1060 6GB from the 1060 3GB... FreeSync is a good argument if you can grab a decent enough panel. Other than that, both perform the same, despite one having 1GB less VRAM that will "help" you get to that 30 FPS "sweet spot" on 2K-and-up panels... please...
Anywho, you can't go wrong with either of them if you're on a tight budget and want a decent 1080p card.
 
But, but, but what about those of us who don't understand how RAM on a video card works and/or how certain in-game settings are nothing more than RAM hogs?? We need the comfort of future-proofing with more RAM, not factual evidence of actual in-game performance.
 
Enjoyed the review, but I would like to see the mid-range GPUs tested on a mid-range CPU. The current setup is not representative of what users have. The GPU driver overhead is an important factor, especially for older systems.

Which will give you 1 of 2 results:

1. Both GPUs will see slightly lower performance, but the same percentage difference between the two, so the final conclusion will be identical; or,

2. Both GPUs will see identical performance (because of both being limited by the "mid-range" CPU), making the final conclusion even more of a "pick the GPU based on the game(s) you're going to play", since their perceived performance & price will be identical.
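For illustration only, here's a minimal sketch of that logic in Python. The frame rates are made-up placeholder numbers, and treating the observed frame rate as the lower of a CPU-limited and a GPU-limited rate is a rough simplification, not how real engines behave:

```python
# A toy sketch of the two outcomes described above. The FPS figures are
# hypothetical placeholders, not benchmark results: the point is only that
# the frame rate you observe is roughly the slower of the CPU-limited and
# GPU-limited rates for a given game and settings.

def observed_fps(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
    """Whichever component is slower sets the frame rate (simplified model)."""
    return min(cpu_limited_fps, gpu_limited_fps)

# Hypothetical GPU-limited frame rates for two cards in the same game.
gpu_a = 62.0   # e.g. card A at 1080p ultra
gpu_b = 66.0   # e.g. card B at 1080p ultra

for label, cpu_fps in [("fast CPU", 140.0), ("mid-range CPU", 90.0), ("slow CPU", 50.0)]:
    fps_a = observed_fps(cpu_fps, gpu_a)
    fps_b = observed_fps(cpu_fps, gpu_b)
    gap = (fps_b - fps_a) / fps_a * 100
    print(f"{label:14s}: card A {fps_a:5.1f} fps, card B {fps_b:5.1f} fps, gap {gap:4.1f}%")

# Output pattern: with a fast or mid-range CPU the percentage gap between the
# cards is unchanged (outcome 1); once the CPU-limited rate drops below both
# GPU-limited rates, the two cards tie (outcome 2).
```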
 
Additionally, it only has 3GB. The RX 570 4GB will likely fare better at 1440p than the 1060 3GB. Would love to see this tested, actually.

1. Wrong, the 570 fares the same at 1440p as at 1080p. As Steve stated, it's all about the game.
2. There is a thing called Google, and tech review websites that review these cards all post those results...

See links below; the 570 is a rebranded 470 with a slight OC.

http://www.guru3d.com/articles_pages/msi_geforce_gtx_1060_gaming_x_3gb_review,12.html

https://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1060-3gb-vs-6gb-review_14
 
I would never ever buy or promote the 1060 3GB due to its naming scheme. And yes, to me FreeSync is important enough to choose the RX 570 over the GTX 1060 3GB.
At least with the 1060 3GB the naming scheme has indicated a change in specifications, unlike the GT 1030. A 10% reduction in CUDA cores is more than offset by the 20% reduction in (MSRP) price.
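For reference, a quick sanity check of those percentages, assuming the commonly cited launch specs and MSRPs (1280 vs. 1152 CUDA cores, $249 vs. $199):

```python
# Quick arithmetic behind the claim above, using the widely reported launch
# specs/MSRPs (GTX 1060 6GB: 1280 CUDA cores at $249; GTX 1060 3GB: 1152 at $199).
cores_6gb, cores_3gb = 1280, 1152
msrp_6gb, msrp_3gb = 249, 199

core_cut = (1 - cores_3gb / cores_6gb) * 100   # -> 10.0%
price_cut = (1 - msrp_3gb / msrp_6gb) * 100    # -> ~20.1%

print(f"CUDA core reduction: {core_cut:.1f}%")
print(f"MSRP reduction:      {price_cut:.1f}%")
```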

FreeSync is about the only reason to grab the 570, assuming the same price. I'm considering a switch to AMD when the next gen drops, just for the feature.

Additionally, it only has 3GB. The RX 570 4GB will likely fare better at 1440p than the 1060 3GB. Would love to see this tested, actually.
As others have stated, both cards are more suited to 1080p gaming, unless you're referring to playing lighter e-sports titles and not AAA.
 
I would never ever buy or promote the 1060 3GB due to its naming scheme. And yes, to me FreeSync is important enough to choose the RX 570 over the GTX 1060 3GB.

Additionally, it only has 3GB. The RX 570 4GB will likely fare better at 1440p than the 1060 3GB. Would love to see this tested, actually.

Steve tested the two cards last year across 29 games at 1080p and 1440p.
https://www.techspot.com/review/1411-radeon-rx-570-vs-geforce-gtx-1060-3gb/page8.html

Apart from the anomalous result for Resident Evil, that testing produced a similar outcome to this recent analysis at 1080p, and showed that the 1440p results paralleled the 1080p performance. Meaning, there wasn't an instance where the 3GB choked at 1440p and the 4GB was demonstrably better (apart from Deus Ex, which yielded fps minimums below 30 on both cards).
 
Steve tested the two cards last year across 29 games at 1080p and 1440p.
https://www.techspot.com/review/1411-radeon-rx-570-vs-geforce-gtx-1060-3gb/page8.html

Apart from the anomalous result for Resident Evil, that testing produced a similar outcome to this recent analysis at 1080p, and showed that the 1440p results paralleled the 1080p performance. Meaning, there wasn't an instance where the 3GB choked at 1440p and the 4GB was demonstrably better (apart from Deus Ex, which yielded fps minimums below 30 on both cards).
Fair enough.

Edit: After looking at the games individually... Yeah... The RX 570 is a better deal for 1440p, at least for DX12 games. DX11 is a toss-up.
 
Last edited:
Because running a 1080p card on a 2K panel seems like a good idea. Here's another one: get an Audi S8 and stick an LPG system in it...
Always funny how ensuring there's no CPU bottleneck by testing at 720p, a resolution no one would likely use with a high-end gaming CPU, is considered good practice, but using a 1440p resolution for a 1080p GPU to avoid bottlenecks is suddenly stupid... Uhuh...

The naming scheme... again with that nonsense. Who gives a rat's *** what it's called? They could've named their whole line-up GTX 1080 and added letters to the end for all intents and purposes. I never heard anyone cry wolf because they couldn't tell the 1060 6GB from the 1060 3GB...
It's customer deception, and I don't stand for it. If you want to support that, that's on you. AMD is also guilty of something similar with their RX 560. Don't recommend anyone buy that one either... Nor any 1050 Ti due to the DDR4 deception.

FreeSync is a good argument if you can grab a decent enough panel. Other than that, both perform the same, despite one having 1GB less VRAM that will "help" you get to that 30 FPS "sweet spot" on 2K-and-up panels... please...
Anywho, you can't go wrong with either of them if you're on a tight budget and want a decent 1080p card.
This is true I guess.
 
For someone who needs a $200-$250 graphics card to last the next 12-18 months, the GTX is the obvious option.

For someone who is going to hold onto it for as long as possible, 3+ years, the 570 could be a better bet. That extra GB could become very important if Nvidia doesn't keep putting 3GB cards on the market to keep developers optimizing for that amount of memory.
 
Always funny how ensuring there's no CPU bottleneck by testing at 720p, a resolution no one would likely use with a high-end gaming CPU, is considered good practice, but using a 1440p resolution for a 1080p GPU to avoid bottlenecks is suddenly stupid... Uhuh...


It's customer deception, and I don't stand for it. If you want to support that, that's on you. AMD is also guilty of something similar with their RX 560. Don't recommend anyone buy that one either... Nor any 1050 Ti due to the DDR4 deception.


This is true I guess.


If it's a good idea for testing CPUs, that doesn't mean it's a good one for GPUs. First of all, ANY game engine has a limit on how many FPS it can crank out. Second of all, if a card barely hits 60 FPS at 1080p, would you test it at 1440p? Seeing that everything 1440p is high refresh rate? Ever tried limiting the FPS to 60 on a 144Hz panel to see if there's any difference between 60 and 144? If the damn thing is "playable" at Full HD, care to take a guess what happens when you push the resolution up? Also, you keep forgetting the fact that the GPU tends to bottleneck the system far faster than a CPU ever could, since 90% of games are GPU bound.

Heck no, the label clearly says 3GB or 6GB. Even if you'd never seen a computer part in your whole lifetime you could tell that 6 > 3, and you'd also notice that the one ending in 6 has a higher price... I swear to god, ever since people in America started suing companies because they're drool-dripping morons who try to dry their cats in the microwave because it didn't say not to in the instruction book, everyone is quick to pass judgement on stuff that should be common sense... please... it's written on the box, you know what you're buying, end of story, and a company shouldn't be taken to the cleaners because someone is too dumb to read a 200-word article or the damn label on the box.
 
Last edited:
For someone who needs a $200-$250 graphics card to last the next 12-18 months, the GTX is the obvious option.

For someone who is going to hold onto it for as long as possible, 3+ years, the 570 could be a better bet. That extra GB could become very important if Nvidia doesn't keep putting 3GB cards on the market to keep developers optimizing for that amount of memory.

Nope. With how Nvidia drivers are going vs. AMD, I would stick with AMD now and in the long run, UNLESS Nvidia changes their act with drivers. I never had a problem since the old 7700 series days. I have the better version of the AMD RX 560 4GB with all the shaders and the extra power pin connector, overclocked, running drivers from October, and no issues gaming at 1080p. Here is some proof AMD drivers are better than Nvidia's: http://www.guru3d.com/news-story/th...s-amd-drivers-are-the-most-stable-gamers.html
 
The naming scheme... again with that nonsense. Who gives a rat's *** what it's called? They could've named their whole line-up GTX 1080 and added letters to the end for all intents and purposes. I never heard anyone cry wolf because they couldn't tell the 1060 6GB from the 1060 3GB...

Other than that, both perform the same, despite one having 1GB less VRAM that will "help" you get to that 30 FPS "sweet spot" on 2K-and-up panels... please...

Please do your research on these cards before making false claims. It is simply false to imply the only difference between the 1060 3GB and 6GB variants is the VRAM, when in fact the 3GB version has a disabled SM, a 10% reduction in CUDA cores, and the halved VRAM. Therefore, the 1060 3GB and 1060 6GB are different GPUs and ought to be named differently.
 
I dispute these results! Everyone knows that AMD cards age far better than Nvidia cards!! How is it possible that Nvidia’s lead has grown after a year?!? I still recommend the AMD because in 5 years, it will have 1080 performance! /sarcasm

Not surprising, as most of the games are quite old, like GTA V, which is almost 5 years old, and Far Cry 5, which is almost 6 years old. About the only game that can be considered modern is Wolfenstein II: The New Colossus. There's also total crap like AC Origins with its double DRM, and Battlegrounds, which is still in a beta state.

And before anyone comments that Far Cry 5 launched this year: it uses exactly the same engine as Far Cry 3, and Far Cry 3 is a 2012 title.
 
Please do your research on these cards before making false claims. It is simply false to imply the only difference between the 1060 3GB and 6GB variants is the VRAM, when in fact the 3GB version has a disabled SM, a 10% reduction in CUDA cores, and the halved VRAM. Therefore, the 1060 3GB and 1060 6GB are different GPUs and ought to be named differently.

First of all, I never said that the only difference is 6 and 3. I was trying to put it on the level of the "average not so tech-savvy" consumer... Do you actually think people will start looking for disabled SMs if they don't even know what the term means?
Sigh... No, they could've named it "carrots" for all I care. Do you think I don't know that bit of information? And NO, they are the same GPU - GP106. Just in case it wasn't clear before, they are officially named GTX 1060 6GB and 3GB, so there would be your difference for the name's sake (https://en.wikipedia.org/wiki/GeForce_10_series). There isn't a box out there that's just branded "GTX 1060" (if you don't count the Chinese AIB ones, although I didn't see those without the 6, 5 or 3 for that matter). I didn't say it's good branding, but there is a difference, and unless people can't read and suddenly start buying GPUs, the argument that there is little difference between the names is very far-fetched.
 
The naming scheme... again with that nonsense. Who gives a rat's *** what it's called? They could've named their whole line-up GTX 1080 and added letters to the end for all intents and purposes. I never heard anyone cry wolf because they couldn't tell the 1060 6GB from the 1060 3GB...

Other than that, both perform the same, despite one having 1GB less VRAM that will "help" you get to that 30 FPS "sweet spot" on 2K-and-up panels... please...

Please do your research on these cards before making false claims. It is simply false to imply the only difference between the 1060 3GB and 6GB variants is the VRAM, when in fact the 3GB version has a disabled SM, a 10% reduction in CUDA cores, and the halved VRAM. Therefore, the 1060 3GB and 1060 6GB are different GPUs and ought to be named differently.
I see where you're coming from here. I think the appropriate nomenclature would be 1060 (1060 3GB) and 1060 Ti (1060 5/6GB), but starting a fight with someone isn't helping them see that viewpoint. Being the only tech-savvy person in my household, I can definitely agree with the argument that even a less informed individual would understand that the 1060 6GB would be the better card. Boycotting a specific card because you don't agree with a minor performance/price reduction is petty. Azshadi has been pretty patient with their side of the argument, and therefore looks more competent. C'mon man, don't get up in arms over this. You're making red team fans look bad.
 
If it's a good idea for testing CPUs, that doesn't mean it's a good one for GPUs.
It's the EXACT SAME principle. I don't see how anyone can argue this.

First of all, ANY game engine has a limit on how many FPS it can crank out.
So? APIs also have a limit to how many draw calls they can make. Does that somehow negate the lower res CPU test?

Second of all, if a card barely hits 60 FPS at 1080p, would you test it at 1440p?
Did anyone stop testing a CPU at 720p because it was already the limiting factor at 1080p?

Seeing that everything 1440p is high refresh rate? Ever tried limiting the FPS to 60 on a 144Hz panel to see if there's any difference between 60 and 144? If the damn thing is "playable" at Full HD, care to take a guess what happens when you push the resolution up? Also, you keep forgetting the fact that the GPU tends to bottleneck the system far faster than a CPU ever could, since 90% of games are GPU bound.
That actually depends on a lot of factors: the type of game, the API, the resolution, the AA settings, the power of the card, the drivers, and so on. Saying that a GPU tends to bottleneck faster than a CPU is an empty statement.

Heck no, the label clearly says 3GB or 6GB. Even if you'd never seen a computer part in your whole lifetime you could tell that 6 > 3, and you'd also notice that the one ending in 6 has a higher price... I swear to god, ever since people in America started suing companies because they're drool-dripping morons who try to dry their cats in the microwave because it didn't say not to in the instruction book, everyone is quick to pass judgement on stuff that should be common sense... please... it's written on the box, you know what you're buying, end of story, and a company shouldn't be taken to the cleaners because someone is too dumb to read a 200-word article or the damn label on the box.
Oh so you don't know that the GTX 1060 3GB is a cut down chip?

First of all, I never said that the only difference is 6 and 3. I was trying to put it on the level of the "average not so tech-savvy" consumer... Do you actually think people will start looking for disabled SMs if they don't even know what the term means?
Let me guess. You just looked that up to save face, didn't you?
In any case, that is EXACTLY the problem. The average not-so-tech-savvy consumer does not know the details, and supporting the likes of these cards is basically encouraging nVidia to lie to its unknowing customers. You know, just like what happened with the GTX 970 and the 3.5GB fiasco. But yeah, you don't care because you're too busy adoring nVidia.

Sigh... No, they could've named it "carrots" for all I care. Do you think I don't know that bit of information? And NO, they are the same GPU - GP106. Just in case it wasn't clear before, they are officially named GTX 1060 6GB and 3GB, so there would be your difference for the name's sake (https://en.wikipedia.org/wiki/GeForce_10_series). There isn't a box out there that's just branded "GTX 1060" (if you don't count the Chinese AIB ones, although I didn't see those without the 6, 5 or 3 for that matter). I didn't say it's good branding, but there is a difference, and unless people can't read and suddenly start buying GPUs, the argument that there is little difference between the names is very far-fetched.
The naming implies that the memory is the only difference, and that is the issue. It's completely irrelevant that the underlying GPU is the same. It has disabled shaders, and THAT is what matters. It's the reason the GTX 1070, GTX 1070 Ti and GTX 1080 have different names. Imagine if the 1080 Ti and the 1080 were both named GTX 1080, but one was called the GTX 1080 11GB and one was called the GTX 1080 8GB, and the price was a mere $50 difference. Don't you see the problem with this? If you don't, then you are the problem.

And spare me the argument that the 1080 Ti and 1080 are different chips. Like you yourself said, the average not-so-tech-savvy consumer is not going to look up whether they have different chips or not.
I see where you're coming from here. I think the appropriate nomenclature would be 1060 (1060 3GB) and 1060 Ti (1060 5/6GB), but starting a fight with someone isn't helping them see that viewpoint. Being the only tech-savvy person in my household, I can definitely agree with the argument that even a less informed individual would understand that the 1060 6GB would be the better card. Boycotting a specific card because you don't agree with a minor performance/price reduction is petty. Azshadi has been pretty patient with their side of the argument, and therefore looks more competent. C'mon man, don't get up in arms over this. You're making red team fans look bad.
"Boycotting a specific card because you don't agree with a minor performance/price reduction is petty."
Seriously? That is NOT the issue at all. No one is boycotting the GTX 1070 Ti because it is a minor performance/price reduction compared to the 1080. Don't distort the facts.
Azshadi has not been 'patient'. He has been passive-aggressive and condescending. And the fact that you support him says it all, especially since you have to bring in shaming tactics with regard to team colors and whatnot.

Benjiwenji was correct in his statement. That Azshadi is capable of googling to try and save face after the fact does not change things, nor does it warrant telling BenjiWenji that he's making a particular demographic look bad, when he posted only facts.
 
As long as naive peeps keep buying this overpriced rubbish, they'll just keep selling it. Take Nvidia, for example. Why integrate G-Sync directly into the graphics card when you could just add a module to the monitor and charge double!
 
Stuff you said

Thank you sir, you made me laugh more than the comedy specials on Netflix. Now I know I'm wasting my breath.

A. How the heck is it the exact same principle, since games become MORE GPU DEMANDING over time? Take the most powerful CPU of this time and run it with a 560. Guess what? It's going to be the same pile of crap no matter what CPU you throw at it, mainly because the GPU is too weak. If you would like to see more useless benchmarking of a card that is aimed at 1080p, because at some point you might want to hook a 4K panel to it, Google it. No one in their right mind who spends $250 on a GPU would think "Hey, I wonder how it does at 4K, since it barely spits out 60 FPS at 1080p". Why on God's green Earth do you think 720p tests are used? Newsflash: to eliminate the GPU dependency. What bottleneck are you trying to expose when running a higher resolution, other than the GPU itself? What's your opinion on CPUs performing the same at higher resolutions? How would a GPU test at 4K make any difference for a 1080p card? You are implying that testing two different parts that have different roles with the same test is a good idea... it's not. Although you might think that having more VRAM is a good thing if you want to go to a higher resolution, and while that might be true in part, you have to take into account whether the chip can actually push that amount of VRAM. If this weren't a thing, we would all be gaming on 8800 GTs with 12GB of VRAM... Both of these cards could have 240GB of HBM and they wouldn't be able to use it... That's why there's little to no difference between the 580 4GB and the 8GB version at higher resolutions. Does it help? Yes. Does it make the whole experience better? Hell no.

B. Please educate me more on how a CPU becomes the bottleneck faster than a GPU does. Here's a hint: those APIs you quoted and have no clue about at this point tend to lift the weight off the CPU and shift it onto the GPU. Here's another "empty statement": which part needs more regular updates over, let's say, one year in order to keep performing OK in a title? Also, please educate me on how an Ivy Bridge part still doesn't bottleneck the heck out of a 1080 Ti.

C. Saving face? Hardly. It was plastered all over the tech sites, no one could miss it. Furthermore, no one brought up the cut-down version; the whole debate started with the naming convention being "hard to get". But then again, I'm wasting my breath since I can't convince you otherwise :). If you can't follow a conversation, stop having it (this is me being condescending). All sarcasm aside, as long as it's listed somewhere, everything is fair game. All you have to do is read: the specs are on the manufacturer's website, the retailer's website, take your pick. You don't have to have a degree in computer science to figure out that if a number is higher (CUDA cores), it should perform better.

D. "Adoring nVidia" this would be the time that I would really need to be condescending. Here's the deal I don't care if BOSCH is making my GPU as long as it fits my needs, this wasn't the subject to begin with anyway, stop moving the goal posts. The 3.5 GB was a fiasco and guess what? they got sued and lost :D, that was deceptive marketing according to the court, they paid up, this is a whole different scenario it's not even remotely close to what you're implying. As long as your point is thrown out of court it means nothing.

E. Dude, seriously... Does the box say 6GB? Yes, it does. Deal with it, you know what you're buying, it's common sense.

Last but not least, the average non-tech-savvy consumer's judgment is way simpler than you might think: the more expensive the part, the better the performance. That's why people buy i9s and Threadrippers and play Minecraft on them...
 
Thank you sir, you made me laugh more than the comedy specials on Netflix. Now I know I'm wasting my breath.
There you go again with the arrogance, scorn and condescending attitude. It doesn't get you any points.
A. How the heck is it the exact same principle, since games become MORE GPU DEMANDING over time?
Don't games become more CPU-demanding over time too? Why would anyone upgrade any CPU if this weren't the case?

Take the most powerful CPU of this time and run it with a 560. Guess what? It's going to be the same pile of crap no matter what CPU you throw at it, mainly because the GPU is too weak. If you would like to see more useless benchmarking of a card that is aimed at 1080p, because at some point you might want to hook a 4K panel to it, Google it. No one in their right mind who spends $250 on a GPU would think "Hey, I wonder how it does at 4K, since it barely spits out 60 FPS at 1080p".
Just as no one will spend $400 on a CPU to play at 720p. You still don't get it, do you? Do yourself a favor and stop arguing.

Why on God's green Earth do you think 720p tests are used? Newsflash: to eliminate the GPU dependency. What bottleneck are you trying to expose when running a higher resolution, other than the GPU itself?
What bottleneck are you trying to expose when running a lower resolution other than the CPU itself?

What's your opinion on CPUs performing the same at higher resolutions?
It's most likely a GPU bottleneck, although it can be memory (both system and GPU), the API, the engine... Basically it can be anything that is not a CPU bottleneck.

How would a GPU test at 4K make any difference for a 1080p card?
Uhm... We're testing graphics cards, not only GPUs. Testing the limitations of the memory is valid, considering memory and bandwidth use grow over time.
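As a back-of-the-envelope illustration of how much resolution alone moves the VRAM needle, here's a rough sketch. Every constant in it is an assumption for illustration (uncompressed RGBA8 targets, an arbitrary number of render targets, a fixed texture pool), not a measurement:

```python
# Back-of-the-envelope VRAM estimate. All numbers are illustrative
# assumptions, not measurements: real engines use compressed formats,
# varying numbers of render targets, and much larger texture pools.

BYTES_PER_PIXEL = 4          # assume uncompressed RGBA8
RENDER_TARGETS  = 6          # assume color + depth + a few G-buffer/post targets
TEXTURE_POOL_MB = 2200       # assume a fixed texture/asset budget (resolution-independent)

def estimate_vram_mb(width: int, height: int) -> float:
    targets_mb = width * height * BYTES_PER_PIXEL * RENDER_TARGETS / (1024 ** 2)
    return targets_mb + TEXTURE_POOL_MB

for width, height in [(1920, 1080), (2560, 1440)]:
    print(f"{width}x{height}: ~{estimate_vram_mb(width, height):.0f} MB")

# The resolution-dependent part grows ~1.8x from 1080p to 1440p, but it is a
# small slice of the total; the texture pool (which doesn't scale with
# resolution) dominates, which is one reason the 1440p results tend to track
# the 1080p results in the tests discussed above.
```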

You are implying that testing two different parts that have different roles with the same test is a good idea... it's not.
A graphics card has one specific role.

Although you might think that having more VRAM is a good thing if you want to go to a higher resolution, and while that might be true in part, you have to take into account whether the chip can actually push that amount of VRAM. If this weren't a thing, we would all be gaming on 8800 GTs with 12GB of VRAM... Both of these cards could have 240GB of HBM and they wouldn't be able to use it... That's why there's little to no difference between the 580 4GB and the 8GB version at higher resolutions. Does it help? Yes. Does it make the whole experience better? Hell no.
This is not untrue. Except it generally does make the experience better since the frametimes are superior, but let's leave that out.

B. Please educate me more on how a CPU becomes the bottleneck faster than a GPU does. Here's a hint: those APIs you quoted and have no clue about at this point tend to lift the weight off the CPU and shift it onto the GPU.
Yes... For what? Oh right, to remove the CPU limit. You just answered your own question.

Here's another "empty statement": which part needs more regular updates over, let's say, one year in order to keep performing OK in a title? Also, please educate me on how an Ivy Bridge part still doesn't bottleneck the heck out of a 1080 Ti.
Try playing at 1080p with a 1080 Ti. I bet the newer CPUs do a lot better.

C. Saving face? Hardly. It was plastered all over the tech sites, no one could miss it. Furthermore, no one brought up the cut-down version; the whole debate started with the naming convention being "hard to get". But then again, I'm wasting my breath since I can't convince you otherwise :). If you can't follow a conversation, stop having it (this is me being condescending). All sarcasm aside, as long as it's listed somewhere, everything is fair game. All you have to do is read: the specs are on the manufacturer's website, the retailer's website, take your pick. You don't have to have a degree in computer science to figure out that if a number is higher (CUDA cores), it should perform better.
So basically you expect everyone to do in-depth research before buying a product. It's fine if someone expects to get a large pizza but gets a medium one instead. Got it.

D. "Adoring nVidia" this would be the time that I would really need to be condescending. Here's the deal I don't care if BOSCH is making my GPU as long as it fits my needs, this wasn't the subject to begin with anyway, stop moving the goal posts. The 3.5 GB was a fiasco and guess what? they got sued and lost :D, that was deceptive marketing according to the court, they paid up, this is a whole different scenario it's not even remotely close to what you're implying. As long as your point is thrown out of court it means nothing.
They paid? Not really. The amount they had to pay pales in comparison to what they gained from the card. Additionally, it was only valid in the US, and yes, the US is not the only place in the world.

E. Dude, seriously... Does the box say 6GB? Yes, it does. Deal with it, you know what you're buying, it's common sense.
And then you talk about moving goal posts...

Last but not least, the average non-tech-savvy consumer's judgment is way simpler than you might think: the more expensive the part, the better the performance. That's why people buy i9s and Threadrippers and play Minecraft on them...
If that were the case, the majority of PC parts sold would be high-end. They aren't.
 