OK, here we go:
1) No one should be basing current GPU decisions on DX12/Vulkan titles, as they literally make up 0.01% of all titles ever released. Are they the "future"? That's questionable. Hardly ANY DX12/Vulkan games are in the works right now in June 2017. Why aren't developers jumping all over this miracle tech? There must be a reason...
2) The two cards have nearly identical performance, but the Nvidia card runs cooler (and thus likely quieter) and uses less juice, so the choice is obvious except for Freesync. Speaking of which:
3) Freesync is cheaper, yes. But some of us don't care about cheaper or "bang for the buck". We can afford it and want the BEST performance, period. Nvidia rules this domain for now. People don't buy Lamborghinis for the bang-for-the-buck value. Stop looking at everything from a budget standpoint; it doesn't matter to some of us.
The thing is: this thread is in response to a review of budget graphics cards. And so it is the budget arguments that should weigh heaviest.
So, first of all: Most games made today are some iteration of DX11. This is true, and it is also telling. Most games made today are not DX1, 2, 3, 4, 5, 6, 7, 8, 9, or even 10; they are DX11. So it is safe to assume that future games will eventually be DX12. Adoption of new APIs has always been slow at first, but in the end there will be no way around DX12. I do think you are probably right that it should not be a big selling point for the RX560 at this early point, and that is also why I put it as more of an afterthought in parentheses. But it certainly shouldn't go in the minus column either.
Secondly: I can't see that Steven has presented any thermal or noise measurements. Anyway, these parameters will come down to the cooler on the card and the ventilation of the case. The RX560 was drawing 28W more than the 1050, but it also has 2 GB more VRAM, so the ~20W extra draw one could assume for a 2GB version is next to nothing. And if you look at the cards tested, you'll see that the RX560 has a dual-fan cooler with a heatpipe, whilst the 1050 has just one fan mounted on top of a block of aluminium. I would not be surprised if the RX560 is both the cooler and the quieter of the two. As long as the case is well ventilated, I don't think the 1050 scores more than half a point in this category, if any. It could even be louder.
And thirdly: As for Freesync. Again, "no one" buys a 370 USD G-Sync monitor to pair with a 100 USD GPU. But when you get Freesync thrown in almost for free by going for the AMD alternative, it makes a lot of sense to take it.
There are so many different use cases that it is difficult to do Freesync justice with just a few sentences in a comment section like this. So I'll try an extreme example that is bound to get some protests, but it is just to clarify the point, so please bear with me.
Let's say Steven had tested both cards with a 75Hz Freesync monitor to make it fair, and was running the 1050 with V-Sync off to get the results. Now, as the 1050 does not support Freesync, the monitor is in effect a fixed-refresh 75Hz monitor in this case. Take the game "Far Cry Primal" as an example. This is a title where the 1050 scores a rather impressive 13.7% better than the RX560. But of course, with V-Sync off, it looks like **** with tearing all over the place! So when he actually plays the game, he turns V-Sync on. What happens then is that the GPU only sends a frame to the monitor once the frame has been finished, which on average takes 1000ms/58fps = 17.24ms. But the monitor refreshes every 13.33ms! This means that on average only every other frame displayed will be a unique frame. Whenever the scan to the monitor happens while the next frame is not finished, it is the frame already displaying on the monitor that is sitting in the outgoing frame buffer. In effect, we now have a 37.5Hz monitor, displaying exactly 37.5 unique frames per second!
Now we switch over to the RX560 with Freesync turned on. There is no tearing, and 51 unique frames are actually displayed on screen, and it is as smooth as 51 frames can possibly be. That is also 36% more frames than the 1050 is producing with V-Sync on.
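To sanity-check the arithmetic above, here is a quick back-of-the-envelope sketch in Python. It assumes an idealized double-buffered V-Sync, where a frame that misses a refresh has to wait for the next scan, so each frame occupies a whole number of refresh cycles. The 58fps and 51fps figures are the example numbers used above.

```python
import math

refresh_hz = 75.0        # fixed-refresh monitor (Freesync unusable on the 1050)
vsync_render_fps = 58.0  # roughly what the 1050 manages in Far Cry Primal
freesync_fps = 51.0      # what the RX560 delivers with Freesync active

# With double-buffered V-Sync, a frame that misses a refresh waits for
# the next one, so each frame is held for a whole number of scan cycles.
refresh_ms = 1000.0 / refresh_hz          # 13.33 ms per scan
frame_ms = 1000.0 / vsync_render_fps      # 17.24 ms to render a frame
cycles_per_frame = math.ceil(frame_ms / refresh_ms)  # -> 2 cycles per frame
effective_fps = refresh_hz / cycles_per_frame        # -> 37.5 unique fps

print(f"V-Sync effective rate: {effective_fps} fps")               # 37.5
print(f"Freesync advantage: {freesync_fps / effective_fps - 1:.0%}")  # 36%
```

So as long as the average frame time (17.24ms) exceeds one refresh interval (13.33ms), double-buffered V-Sync drops straight to half the refresh rate, and 51 Freesync frames really is 36% more than the 37.5 the fixed-refresh setup can show.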
As I said earlier, there is today the alternative of Fast Sync for Nvidia cards, and one could certainly try using that. But it will introduce some stuttering. Fast Sync in effect introduces a third frame buffer, which allows the card to start working on the next frame as soon as the last one is finished (as opposed to with V-Sync on). But it also means that the frame in the outgoing buffer falls further and further behind in time; the card is playing catch-up, and losing. With a scan of the outgoing buffer to the monitor every 13.3ms and an average frame time of 17.24ms, roughly every third frame will have to be displayed twice, since the monitor is fixed refresh with an Nvidia card. You get a unique frame on the monitor two times in a row, and then the same frame displayed twice, so the "unique frame display times" are now in effect 13.3 - 13.3 - 26.6 - 13.3 - 13.3 - 26.6ms. That brings us down to 56.25fps, which is still more than the AMD card produces. But at what cost? Most people will find this jerking more disturbing than a steady 37.5fps. Or you have to accept tearing.
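That uneven cadence can also be simulated in a few lines. This is an idealized model under the same assumptions as before: the GPU renders at a perfectly constant 58fps, the monitor scans at a fixed 75Hz, and each scan displays the newest completed frame (the Fast Sync behaviour). Real frame times vary, but the mixed single- and double-length holds are the point.

```python
# Idealized Fast Sync model: constant 58fps render rate (an assumption;
# real frame times fluctuate), fixed 75Hz scan-out, newest finished
# frame shown on every scan.
refresh_ms = 1000.0 / 75.0   # 13.33 ms between monitor scans
frame_ms = 1000.0 / 58.0     # 17.24 ms per rendered frame

shown = []
for scan in range(30):                 # ~400 ms worth of scans
    t = scan * refresh_ms
    latest_done = int(t // frame_ms)   # newest frame finished by time t
    shown.append(latest_done)

# How many consecutive scans does each unique frame stay on screen?
holds = [shown.count(frame) for frame in sorted(set(shown))]
print(holds)  # a repeating mix of double- and single-length holds
```

The output repeats the pattern 2, 1, 1 — one frame held for two scans (26.6ms), then two frames held for one scan each (13.3ms) — which is exactly the jerky 13.3 - 13.3 - 26.6 cadence described above.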
I think most people will instead just go for a lower graphics preset with the Nvidia card. That might get them a somewhat steady 75fps, but how far down would they have to go? And which would most people prefer: 75 actual smooth fps at the "Low" preset, or 51 actual smooth fps at "Normal"? I know what I would choose. In fact, I think it is what almost everybody would choose, if they got the chance to see the difference for themselves.
As I said in my first post to Steven, I think the Freesync argument is worth a lot more than just "it might make sense".