The Best $100 GPU: Radeon RX 560 vs. GeForce GTX 1050

Julio Franco

LOL. Believe it or not, due to Ethereum miners buying out all GPUs with more than 4GB of VRAM in some parts of the world, the 1050 Ti is the best offering in the $100-1000 category...
 
That's on my to-do list for next week, just gotta hunt one down locally.

That's great. I hope you're pitting it against an RX 550.

If I might ask for something: I'd love to see some benchmarks with the Radeon power-limited using Wattman. I run my RX 460 with power limited to 50% in Wattman and performance is decent enough, with a big drop in power use. Given that the GeForce GT 1030 is a lot more power efficient than a default RX 550, a comparison with a power-limited 550 or 460/560 would be interesting.
 
I find these benchmarks far more useful than the flagship reviews

Same here. These are the cards that most people use. Plus, these benchmarks often come well after release, when drivers have matured for the card and you have a better understanding of its value. On top of that, you get a better idea of how the card is holding up with new titles (the GeForce 780 review is a good example of that).
 
Just got the EVGA 1050 Ti SC Gaming this past weekend. There's not one game I can't put on max settings and have it stay playable. There was only one game that seemed a little jittery, but I'm not sure that wasn't because I was downloading a 47GB installation file for Witcher 3 at the same time... eating up some resources.
 
"These are the cards that most people use."

I'm not so sure about that: for 2016 the GTX 970 was the most popular card on Steam according to the survey (at 125 million-plus users, it's a very reliable indicator), and that's a $300 mid-tier card. People fork over $300 and more for consoles all the time, so I think the average PC gamer probably spends more than $100.
 
"These are the cards that most people use."

I'm not so sure about that: for 2016 the GTX 970 was the most popular card on Steam according to the survey (at 125 million-plus users this is a very reliable indicator) and that's a $300 mid-tier card. People fork over $300 and more for consoles all the time, so I think the average pc gamer probably spends more than $100.

I'd have to disagree with that for a couple of reasons. First, the Steam survey only covers people who use Steam, which automatically involves self-selection: people who use Steam are more likely to play higher-end games that need better cards.

Second, the survey itself is self-selected. It's an opt-in process; Steam doesn't take your data without your permission (maybe that's changed?). The 125 million users is probably not the sample size they are using -- but, once again, point me to their methodology and raw data and I may change my mind.

Third, the most popular card is less relevant than the distribution of cards and price points. Currently the most popular card is the GTX 750 Ti (it surpassed the 970 last month). Most of the other cards are low-to-mid range, with the occasional high-end ($200+) card thrown in. A major problem with the data from Steam is that the charts are pretty useless without being put into a distribution.

Tl;dr: Self-selection skews the Steam survey heavily toward the high end and, even then, the data seems to suggest sub-$200 cards are the norm.
 
Well, you've made some valid points there. I'm just glad that most tech sites are reviewing the entire range of cards these days... while the less expensive cards are more mainstream, reviews carry even more weight when one is shopping for a $750 AIB GTX 1080 Ti.
 
With this conclusion: "In my opinion, the only real advantage the RX 560 has is FreeSync support, if you can take advantage of that, then it might make sense at the same price as the 1050."

I have to ask: have you tried playing games at this low end with, and without, adaptive sync? Because in my opinion, this is where the real advantage of a monitor with adaptive sync shows. And it is MASSIVE when you want to be rid of tearing, stutter and input lag at these low frame rates.

Your conclusion should really be: "The only reason to go for a GTX 1050 is if you don't have, and don't plan to buy, a FreeSync monitor".

In my country you can get a 75Hz FreeSync gaming monitor for ~150 USD, whilst the cheapest G-Sync monitor is the 144Hz AOC2460PG at ~370 USD. But those 144Hz will be wasted on these cards, and you'll get the same performance below 75 fps (which is where most of these games will sit with these cards) with a monitor at much less than half the price.

Now, good job testing all these cards, and in a more "realistic" setting with an R5 1400 to boot. Thumbs up! Sorry to see you stumble over the finish line like this though. :p
 

Obviously, yes, I have used FreeSync on low-end GPUs like the RX 560, and I get what you are saying, which is why I mentioned the advantage of having FreeSync in the conclusion. That said, not everyone wants to use FreeSync; many complain that it messes with the input, creating a laggy sensation. FreeSync doesn't get rid of input lag as you suggest.
 
I'd hope it was obvious that you have tried it. But seeing how differently a 75Hz FreeSync monitor and a 60Hz fixed refresh rate monitor behave at these low frame rates, I don't think you are giving the FreeSync alternative the attention it deserves, and I had to wonder if it was because you haven't seen it... because I'm guessing you spend most of your time with equipment in a slightly higher price range. :p

As for the claim that "many" people are complaining: I'd say that would only make sense if we are talking about games that switch back and forth between being inside and above the FreeSync range, and the users are experiencing driver or game-specific problems. In such a scenario, the same will very likely apply to G-Sync monitors as well. Neither G-Sync nor FreeSync is some magic thing that fixes everything. But consider this: a 75Hz FreeSync monitor is also a 75Hz fixed refresh rate monitor. So it still makes sense to go with the alternative that can give you both a 25% higher refresh rate than regular 60Hz fixed refresh monitors, and/or adaptive sync.

I do agree with you that the only real reason to go for the RX 560 is the FreeSync option; I guess that didn't come out as clearly as I intended. I just think the FreeSync option makes such a big difference that it dwarfs the GTX 1050's minor 2-3% average frame rate advantage.

Please consider this: the only way the frame rates you got in your GTX 1050 benchmarks actually reach a fixed refresh rate monitor is if, firstly, the monitor can display as many frames per second as the graphics card outputs, and secondly, you turn V-Sync off, which introduces tearing.

Or, if you want to avoid tearing, you can use "Fast Sync" with a GTX 1050. But that will still throw some frames in the garbage (every frame above the monitor's refresh rate), and you will get some stutter. Whereas if you average, say, 70 fps with the RX 560 on a FreeSync monitor, most of those frames will actually be displayed. What good is a graphics card that produces even 10% more frames in a specific game than the RX 560, if you have a fixed refresh rate monitor that caps out and throws everything above 60 fps in the trash?

Of course, not everyone is in the market for a new monitor, especially at this low end of the market. But even for them, I'd make the point that if their old monitor is, well, getting old, there is always the chance that they will soon have no option but to buy a new one. And wouldn't they appreciate it, do you think, if you had advised them just months earlier to go for a FreeSync-compatible graphics card? ;)

Ouuphh... Sorry for a longer post than I first intended. But I am a big fan of your massive benchmarking efforts. You are one of very few who actually do this correctly! Keep up the good work! :)
 
I own a 1050, but I can't help feeling that the way this article is written is biased against AMD.
Funny how every time the numbers lean Nvidia's way, someone claims the review is biased. Just what TechSpot needs: another irrational AMD fanboy. Are you Hard Reset's son by any chance?
 
Oh, come off it! He owns a 1050 himself! I have a 1080 Ti, a 1080, and two 960s lying around, plus 780M SLI in my Alienware laptop. And I have one R9 380 with a 75Hz FreeSync monitor. Nvidia knows how to make great GPUs; that doesn't mean every one they make is the best choice for every situation.

Steven's conclusion makes sense on a spreadsheet, I'm sure: better thermals, less power draw, slightly better performance, and better overclocking, versus just the one thing the RX 560 has going for it, FreeSync. (And, well, it does actually win performance-wise too if you only count the DX12/Vulkan games, which may very well be an indicator of future performance...)

But as I tried to point out, a FreeSync monitor makes the RX 560 the obvious choice between these two cards. Because who cares about 20W saved, especially when you are no longer saving it if you are among the 2% who actually overclock? And almost no amount of overclocking would make up for the better experience you get with FreeSync anyway.

And an argument can absolutely be made that we are no longer even making an apples-to-apples comparison. Say you have both systems set to the "Ultra" preset of some game. With the Nvidia card, you'd have to either turn off V-Sync, which introduces tearing, or turn on Fast Sync, which introduces stuttering and throws frames in the garbage. Then it either no longer looks like "Ultra" the way it still does on the AMD system, or it is no longer the best performing with the smoothest frames.

Seriously? Has this never occurred to anyone else?
 
OK, here we go:
1) No one should be basing current GPU decisions on DX12/Vulkan titles, as they literally make up 0.01% of all titles ever released. Are they the "future"? That's questionable. Hardly ANY are in the works right now, in June 2017. Why aren't developers jumping all over this miracle tech? Must be a reason...
2) The two cards have nearly identical performance, but the Nvidia card runs cooler (and thus likely quieter) and uses less juice, so the choice is obvious except for FreeSync. Speaking of which:
3) FreeSync is cheaper, yes. But some of us don't care about cheaper or "bang for the buck". We can afford it and want the BEST performance, period. Nvidia rules this domain for now. People don't buy Lamborghinis for their bang-for-buck value. Stop looking at everything from a budget standpoint; it doesn't matter to some of us.
 
The thing is, this thread is in response to a review of budget graphics cards, so it is the budget arguments that should weigh heaviest.

So, first of all: most games made today are some iteration of DX11. This is true, and it is also a point, in that most games made today are not DX1, 2, 3, 4, 5, 6, 7, 8, 9, or even 10; they are DX11. So it is safe to assume that future games will be DX12. Adoption of new APIs has always been slow at first, but in the end there will be no way around DX12. I do think you are probably right that it should not be a big selling point for the RX 560 at this early point, which is also why I put it as more of an afterthought in parentheses. But it certainly shouldn't go in the minus column either.

Secondly: I can't see that Steven has presented any thermal or noise measurements? In any case, these parameters come down to the cooler on the card and the ventilation of the case. The RX 560 was drawing 28W more than the 1050, but it also has 2GB more VRAM, so ~20W more power draw (presumably, for a 2GB version) is "nothing". And if you look at the cards tested, you'll see that the RX 560 has a dual-fan cooler with a heatpipe, whilst the 1050 has just one fan mounted on top of a block of aluminium. I would not be surprised if the RX 560 is both the cooler and the quieter of the two. As long as the case is well ventilated, I don't think the 1050 scores more than half a point in this category, if any. It could even be louder.

And thirdly, as for FreeSync: again, "no one" buys a 370 USD G-Sync monitor to pair with a 100 USD GPU. But when you get FreeSync thrown in almost for free by going for the AMD alternative, it makes a lot of sense to take it.

There are so many different use cases that it is difficult to do FreeSync justice with just a few sentences in a comment section like this one. So I'll try an extreme example that is bound to get some protests, but it is just to clarify the point, so please bear with me.

Let's say Steven had tested both cards with a 75Hz FreeSync monitor to make it fair, and was using V-Sync off with the 1050 to get the results. Now, as the 1050 does not support FreeSync, the monitor is in effect a fixed refresh monitor at 75Hz in this case. Let's take the game "Far Cry Primal" as an example. This is a title where the 1050 scores a rather impressive 13.7% better than the RX 560. But of course, with V-Sync off it looks terrible, with tearing all over the place! So when he is actually playing the game, he turns V-Sync on. What happens then is that the GPU only sends a frame to the monitor once the frame has been finished, which on average takes 1000ms / 58fps = 17.24ms. But the monitor refreshes every 13.33ms! This means that, on average, only every other refresh displays a unique frame; whenever the scanout happens before the new frame is finished, the frame already shown on the monitor is still what sits in the outgoing frame buffer. In effect, we now have a 37.5Hz monitor, displaying exactly 37.5 unique frames per second!!
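
To sanity-check that arithmetic, here is a minimal Python sketch under the same assumptions (the helper name vsync_unique_fps and the constants are mine, taken from the 58 fps and 75Hz figures above):

```python
import math

REFRESH_HZ = 75.0                 # the FreeSync monitor driven as a fixed 75 Hz display
SCANOUT_MS = 1000.0 / REFRESH_HZ  # 13.33 ms between refreshes
RENDER_MS = 1000.0 / 58.0         # 17.24 ms per frame at the 58 fps average above

def vsync_unique_fps(render_ms, scanout_ms):
    """Double-buffered V-Sync: a finished frame waits for the next
    vblank, and the GPU stalls until the swap completes, so every
    frame ends up costing a whole number of refresh intervals."""
    refreshes_per_frame = math.ceil(render_ms / scanout_ms)
    return 1000.0 / (refreshes_per_frame * scanout_ms)

print(vsync_unique_fps(RENDER_MS, SCANOUT_MS))  # 37.5 unique frames per second
```

Every frame that takes longer than one 13.33ms refresh gets rounded up to two refreshes, which is exactly the halving described above.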

So we switch over to the RX 560, and with FreeSync turned on there is no tearing and 51 unique frames are actually displayed on screen, as smooth as 51 frames can possibly be. That is also 36% more frames than the 1050 is producing with V-Sync on.

As I said earlier, there is today the Fast Sync alternative for Nvidia cards, and one could certainly try using that. But it will introduce some stuttering. Fast Sync in effect introduces a third frame buffer, which allows the card to start working on the next frame as soon as the last one is finished (as opposed to V-Sync on). But it also means that the frame in the outgoing buffer falls further and further behind in time; the card is playing catch-up, and losing. With a scan of the outgoing buffer to the monitor every 13.33ms and an average frame time of 17.24ms, every third frame will have to be displayed twice. Since the monitor is fixed refresh with an Nvidia card, you get a unique frame two refreshes in a row and then the same frame shown twice, so the "unique frame display times" are in effect 13.3 - 13.3 - 26.6 - 13.3 - 13.3 - 26.6... We are down to 56.25 unique fps, which is still more than the AMD card produces. But at what cost? Most people will find this judder more disturbing than a steady 37.5 fps. Or you have to accept tearing.
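
The Fast Sync cadence can be checked the same way. Again, just an illustrative sketch under the assumptions above (every refresh scans out the newest completed frame), not how the driver actually implements it; it simply counts how many unique frames reach the screen:

```python
def fastsync_unique_fps(render_ms, scanout_ms, duration_ms=60_000.0):
    """Triple-buffered, Fast Sync-style presentation: each refresh scans
    out the newest completed frame, so a frame that straddles a refresh
    boundary is shown twice -- the 13.3 / 13.3 / 26.6 cadence above."""
    unique, last_shown = 0, -1
    t = scanout_ms
    while t <= duration_ms:
        completed = int(t // render_ms)  # frames finished by this refresh
        if completed != last_shown:      # a new frame reached the screen
            unique += 1
            last_shown = completed
        t += scanout_ms
    return unique / (duration_ms / 1000.0)

print(fastsync_unique_fps(1000.0 / 58.0, 1000.0 / 75.0))  # ~56.25 unique fps
```

Three unique frames per four refreshes works out to 75 * 3/4 = 56.25, matching the figure above.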

I think most people will instead just go for a lower graphics preset with the Nvidia card. That might get them a somewhat steady 75 fps, but how far down would they have to go? And what would most people prefer: 75 actual smooth fps at the "Low" preset, or 51 actual smooth fps at "Normal"? I know what I would choose. In fact, I think it is what almost everybody would choose, if they had a chance to see the difference for themselves.

As I said in my first post to Steven, I think the FreeSync argument is worth a lot more than just "it might make sense".
 
Hey Steve,

Thank you for the comparison. I know it took a lot of time and I appreciate your effort, but what about content creation?
There are people like me who don't play games and occasionally edit videos in Premiere Pro.
What would be the difference between these two cards, given that Premiere has been optimized to use CUDA for rendering?
It would be really great if you could compare these two in Premiere. I've been trying to find such a comparison for months, but totally failed. I currently use a GTX 750 Ti and am thinking about getting one of these two; your help would be greatly appreciated.

Thanks in advance,
TheSkyGuy
 