The Best Graphics Cards for the Money: Nvidia & AMD GPUs tested and compared

We completely ignore them; maybe you'll consider doing the same.
Don't you dare chase Steve away. I enjoy reading his comments as well as his articles. At least he is willing to stick around and join the community. I'd rather he stick around than post and leave.
 
I would never consider 'chasing' Steve away, but ignoring the purposefully misguided sounds like bad advice, Clifford. Educate me: what possible gain is there in reading them, let alone engaging with them? (I specifically quoted the misguided; it was not a general remark. If that was the impression, it was not my intent.)
 

Sometimes it is necessary to reply.

Given how much time is invested in creating these articles, they deserve to be defended. To most it is obvious which opinions are worth ignoring. That said, when someone is replying to your work directly, a response is often required. I wouldn't get sucked into a back-and-forth argument, though; I have said my piece. Plus, some of the long-time members don't oppose a bit of troll slapping.

Don't you dare chase Steve away. I enjoy reading his comments as well as his articles. At least he is willing to stick around and join the community. I'd rather he stick around than post and leave.

More than happy to entertain, mate ;)
 
Good that you stopped pushing 4K, as it's still under 60 fps even on the best GPU. Good comparison.

Haha... I hate when people blindly love the GTX 970 and give this stupid card a win over the R9 390.
The R9 390 beats the 970 any day and is a much more future-proof card with 8 GB of VRAM versus a poor 3.5 GB. And who gives a **** about a 20 W power difference?

I give a ****. Electricity is pricey here and my computer is on almost 24/7. I would still choose the 390, but I give a ****.

And if AMD keeps having troubles, it's not as future-proof either, since you get no drivers from a non-existent company.
 
If you aren't in a hurry, I would hold off a bit to see more DX12 benchmarks before buying.
Also, 14/16 nm cards with HBM2 are slated to launch in 2016. These new cards could be a much better investment.

For so many reasons, the huge amount of speculation surrounding DX12 and which camp will be the best to go with is a bit silly.

Rewind to the release of the first DX11 graphics cards, the GeForce GTX 480 and Radeon HD 5870. How long did it take before there were decent DX11 titles, and were the first DX11 GPUs even relevant in the DX11 gaming picture?

Granted, DX12 is a much more significant step, but I still think the uptake will be much the same. This current generation won't have much to do with it.

dividebyzero said it best ;)



I think you are radically underestimating the adoption rate of DX12 (at least from the big players looking to push performance). In years past, with each leap to a new version of DirectX, cross-platform game makers would target the lowest common denominator, the consoles. That meant sticking to a DX9 code base and porting it over to the PC. Further, to even begin to take advantage of DX10/DX11 titles, you needed an ENTIRELY new GPU.

NEITHER of those two MAJOR factors is at work with DX12. DX12 is backwards compatible (to varying degrees, via different feature levels) with several generations of GPU families from both Nvidia and AMD. More importantly, GCN on the consoles is the basis of the lower-overhead code base for DX12 (however much people deny that reality). Both consoles even have some of the more advanced performance-enhancing features of DX12, like asynchronous compute, which lets compute work run concurrently with graphics work.

I expect nearly EVERY major EA title released in 2016 to support DX12, and probably most other triple-A devs releasing cross-platform games will too. It used to be the case that cross-platform support meant a greater likelihood of getting DX9 ports; that is COMPLETELY flipped this time around.

Indie game devs may not have a pressing need to go DX12, but who cares? If a game is not taxing performance, the API is not critical in the first place. For the games that are, the API is there, the console support is there, and the hardware base is there. The idea that anyone could expect a similar adoption rate for DX12 titles as for previous DirectX advancements seems like a complete misreading of reality.

And yes, early results suggest that the 900 series / Maxwell 2 parts will not age as well as those "dated" "rebrands" AMD released all the way back in 2013.

GCN can handle concurrent graphics and compute workloads; Maxwell can't. Part of the reason is that Maxwell/Nvidia did not need to worry about that in the more serialized world of DX11: the power savings came largely from stripping out the power-hungry hardware scheduler and having the driver handle scheduling instead. Power-sipping GPUs... that will have inferior, crippled legs for handling async compute, along with significantly higher VR and context-switching latencies.

It is a technically inferior part, and we are only now realizing that. At least some of us are. That power saving was NOT without cost. It just took DX12 and VR to make it visible.
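
To make the feature-level and async-compute points concrete, here is a minimal C++/Direct3D 12 sketch (an illustration assuming the Windows 10 SDK and MSVC, not code from anyone in this thread). Creating a device at feature level 11_0 is what lets older GPU generations run DX12 titles to varying degrees, and creating a separate compute queue alongside the graphics queue is the mechanism behind asynchronous compute; whether the two queues actually overlap in execution depends on the hardware and driver.

    // Sketch only: assumes the Windows 10 SDK; error handling trimmed for brevity.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main()
    {
        // A DX12 device can be created on hardware that only supports feature level 11_0,
        // which is why GPU generations that predate DX12 can still run DX12 titles to varying degrees.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
            return 1; // no DX12-capable adapter found

        // Graphics (direct) queue: accepts draw, compute and copy work.
        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> gfxQueue;
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

        // Separate compute queue: work submitted here may run concurrently with graphics
        // on hardware whose scheduler supports it (the "async compute" referred to above).
        D3D12_COMMAND_QUEUE_DESC computeDesc = {};
        computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
        ComPtr<ID3D12CommandQueue> computeQueue;
        device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

        return 0;
    }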
 

You can also add the fact that all major game engines already support DX12.
 
Great comparisons! AMD did pretty well considering that their products are just rebrands of prior products. I'm still using a GTX 660 Ti card from MSI, and it is still getting the job done at high 1080p settings.

I'm glad to see you're still getting by with that 660 Ti. Luckily I was able to upgrade my primary rig to a Z97 build with an i7-4790K and an Nvidia-made GTX 970 (I love it because it has the same awesome cooler as the GTX 980). I also have another X58 build with a Core i7 920 @ 4 GHz and a GTX 670. It's still amazingly fast by today's standards. The only thing you may be worried about is Just Cause 3, which looks to be AWESOME but requires a GTX 670 at a minimum! That's gonna be one taxing game, but also incredible.
 
I have always been an AMD fan, but I broke down and got a GTX 970, and it will be hard to go back. My 970 plays everything with no problems.
Also, I would like to mention that I mostly play racing games; it would be great if you included some in your test comparisons. I would suggest Project CARS or DiRT Rally, as both require good hardware to play at max settings.
Same here... I was ATI/AMD for 15 years and finally got a GTX 980 in March... I will never go back.
 
I think you are radically underestimating the adoption rate of DX12 (at least from the big players looking to push performance). In years past, with each leap to a new version of DirectX, cross-platform game makers would target the lowest common denominator, the consoles. That meant sticking to a DX9 code base and porting it over to the PC. Further, to even begin to take advantage of DX10/DX11 titles, you needed an ENTIRELY new GPU.

NEITHER of those two MAJOR factors is at work with DX12. DX12 is backwards compatible (to varying degrees, via different feature levels) with several generations of GPU families from both Nvidia and AMD. More importantly, GCN on the consoles is the basis of the lower-overhead code base for DX12 (however much people deny that reality). Both consoles even have some of the more advanced performance-enhancing features of DX12, like asynchronous compute, which lets compute work run concurrently with graphics work.

I expect nearly EVERY major EA title released in 2016 to support DX12, and probably most other triple-A devs releasing cross-platform games will too. It used to be the case that cross-platform support meant a greater likelihood of getting DX9 ports; that is COMPLETELY flipped this time around.

Indie game devs may not have a pressing need to go DX12, but who cares? If a game is not taxing performance, the API is not critical in the first place. For the games that are, the API is there, the console support is there, and the hardware base is there. The idea that anyone could expect a similar adoption rate for DX12 titles as for previous DirectX advancements seems like a complete misreading of reality.

And yes, early results suggest that the 900 series / Maxwell 2 parts will not age as well as those "dated" "rebrands" AMD released all the way back in 2013.

GCN can handle concurrent graphics and compute workloads; Maxwell can't. Part of the reason is that Maxwell/Nvidia did not need to worry about that in the more serialized world of DX11: the power savings came largely from stripping out the power-hungry hardware scheduler and having the driver handle scheduling instead. Power-sipping GPUs... that will have inferior, crippled legs for handling async compute, along with significantly higher VR and context-switching latencies.

It is a technically inferior part, and we are only now realizing that. At least some of us are. That power saving was NOT without cost. It just took DX12 and VR to make it visible.
It's almost like I wrote this myself. ^_^
 
It would be nice to see comparisons to legacy cards such as the AMD 7800/7900 series. Power usage is important to me, as electricity costs here are around 35 cents per kilowatt-hour.
 

I am not sure where they would fit into a current-generation buying guide. That said, we have plenty of content that compares older-generation graphics cards to the latest.
 
Generally, the importance of power consumption is exaggerated. If you play 1,000 hours a year (which is around 3 hours of gaming every single day, more than most of us can manage anyway), the difference in power cost between an R9 390X and a GTX 970 will be at most around $20 per year, which is about a dollar and a half per month.
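
For a rough sanity check on that figure, here is the arithmetic as a small C++ sketch; the ~150 W load-power gap and the $0.13/kWh rate are assumed values for illustration, not numbers from this thread.

    #include <cstdio>

    int main()
    {
        // Assumed inputs (illustrative, not from the thread): load-power gap and electricity rate.
        const double watt_gap       = 150.0;  // approx. extra draw of an R9 390X vs. a GTX 970 under load (W)
        const double hours_per_year = 1000.0; // roughly 3 hours of gaming per day
        const double price_per_kwh  = 0.13;   // USD per kilowatt-hour (varies a lot by region)

        const double kwh_per_year  = watt_gap * hours_per_year / 1000.0; // W * h -> kWh
        const double cost_per_year = kwh_per_year * price_per_kwh;

        std::printf("Extra energy: %.0f kWh/year, extra cost: $%.2f/year ($%.2f/month)\n",
                    kwh_per_year, cost_per_year, cost_per_year / 12.0);
        // With these assumptions: 150 kWh/year, about $19.50/year, roughly $1.60/month.
        return 0;
    }

At 35 cents per kilowatt-hour, as mentioned above, the same gap would come to roughly $50 per year, so the rate matters more than the card.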
 
Same here.

I would have preferred to see frame-time performance in the benchmarks though, as it is much more relevant.

How is it more relevant when it has no impact on single-card performance? We proved this over many months before removing the metric. We only include frame-time performance in our multi-GPU reviews.
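
For context on the metric being debated here, a small C++ sketch of how frame-time data is usually summarized; the numbers are made up and this illustrates the general technique, not TechSpot's methodology. Average FPS hides stutter, while the 99th-percentile frame time (the basis of "1% low" figures) exposes it.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main()
    {
        // Hypothetical frame times in milliseconds: mostly smooth, with two spikes (stutter).
        std::vector<double> frame_ms = {16.7, 16.7, 16.7, 16.7, 50.0, 16.7, 16.7, 16.7, 16.7, 45.0};

        double total_ms = 0.0;
        for (double ms : frame_ms) total_ms += ms;
        const double avg_fps = 1000.0 * frame_ms.size() / total_ms;

        // 99th-percentile frame time: sort and take the value below which 99% of frames fall.
        std::vector<double> sorted = frame_ms;
        std::sort(sorted.begin(), sorted.end());
        const std::size_t idx = static_cast<std::size_t>(0.99 * (sorted.size() - 1));
        const double p99_ms = sorted[idx];

        std::printf("Average FPS: %.1f\n", avg_fps);
        std::printf("99th-percentile frame time: %.1f ms (%.1f FPS equivalent)\n",
                    p99_ms, 1000.0 / p99_ms);
        return 0;
    }

With this made-up data the average works out to around 44 FPS, while the 99th-percentile frame time of 45 ms corresponds to roughly 22 FPS, which is the kind of gap frame-time analysis is meant to reveal.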

Generally, the importance of power consumption is exaggerated. If you play 1,000 hours a year (which is around 3 hours of gaming every single day, more than most of us can manage anyway), the difference in power cost between an R9 390X and a GTX 970 will be at most around $20 per year, which is about a dollar and a half per month.

No, the problem is people not understanding why power consumption is important. As you pointed out, it has little or almost nothing to do with the power bill. It is all about operating temperatures, noise levels, and overclocking performance.

The GTX 980 Ti overclocks like a champion and can run 20% faster while making almost no noise, assuming a non-reference design.

The Fury X or 390X, for example, can barely be overclocked, and even at their default frequencies they require either liquid cooling or, in the 390X's case, a massive cooler that still runs very loud under gaming load.
 
Interesting perspective. I can partially agree about the ability to overclock, if that's important to you, but not with the rest... Even with those overheating problems, the R9 290X became the 390X, and the 390X has around 10% of overclocking headroom, so... yeah. It's not 20%, but it's not 1% either.
What's the problem with liquid cooling? And if the Fury X really requires liquid cooling, how are the vanilla Fury and the Fury Nano able to operate?
A massive cooler is only a problem if you have a small case, and if it fits in your case, it's not really an issue. In fact, massive GPUs can look kind of slick in a windowed case.

At idle, the noise will be minimal on pretty much all cards nowadays, and at load you'll have your sound at a higher volume anyway.
And even if it does bother you, I don't think I've seen a noise-level comparison on this site, although I might be wrong. Taken from Guru3D, though, the differences between a GTX 970 and an R9 390X are minimal.
Hexus agrees for a Sapphire card, and so does Tom's Hardware for an MSI.

I stand by my statement that the issue is blown out of proportion.
 
I'm saying you are not the only one who intentionally overlooks potential problems simply because you don't see them as problems. That doesn't mean they are not problems, nor does it mean they are insignificant. Power usage (a.k.a. heat production) is a problem whether you choose to admit it or not.
 
I never said they were not problems. I said the issue is blown out of proportion, which means the problem is smaller than it is portrayed to be. I was quite clear when I stated:
'I stand by my statement that the issue is blown out of proportion.'

Whether they are significant or not depends on the end user, and it should be portrayed as such.
 
I regard the Radeon Nano as the most technologically advanced card of last year. Why? It all comes down to HBM: it's almost as fast as the Titan X, yet it's half the size.
 
I've been an Nvidia user for about 10 years, simply because I've been going with what I know. I'm constantly seeing posts in the Steam forums about issues AMD users are having. Are there still a lot of driver issues with AMD, or are the issues not as bad as people say? Are there compelling reasons to switch?

Well, I'm rocking an HD 7970 (380X-level performance by now) and I have not noticed a single system crash in the 4 years I have had it. The system crashes I have had were due to my inexperience with the Asus motherboard and CPU overclocking, which is not related to the GPU.

So I suspect users on Steam might have some other system issue, perhaps a bad PSU or overclocking? Using a no-brand PSU with a graphics card that needs 200 W is kind of a no-no.

BTW, the fan issue with the first Crimson driver was fixed pretty fast, and Nvidia has had a similar issue:
http://betanews.com/2010/03/18/nvid...-responsible-for-fan-problems-issues-updates/

http://gpuboss.com/gpus/Radeon-R9-380X-vs-Radeon-HD-7970
 
Well, I'm rocking an HD 7970 (380X-level performance by now) and I have not noticed a single system crash in the 4 years I have had it. The system crashes I have had were due to my inexperience with the Asus motherboard and CPU overclocking, which is not related to the GPU.
Well, you've obviously had an excellent user experience.
So I suspect users on Steam might have some other system issue, perhaps a bad PSU or overclocking?
So you are assuming that everyone else's situation is the same as yours - same hardware, same usage scenarios, same applications in use, same OS, same background applications - and that the problems they encounter are purely down to user inexperience? Sounds more like poorly thought-out guerrilla marketing or arrogance than a reasoned argument, IMO, especially when EVERY driver (regardless of graphics vendor) includes a list of known issues. AMD's own driver release notes directly contradict your stance regarding a lack of driver issues:
[Screenshot: AMD driver release notes, known issues list]


BTW: Can you avoid double posting?
 
The most stupid comment I've ever seen.
If you want to troll, then do it somewhere else. Do you want people to post the known issues of both Nvidia and Intel here just to make you look like a fanboy? The Nvidia release notes have several pages of known issues and other problems. (But you don't actually care; you just want to justify your purchase, which you clearly have second thoughts about, by flaming others. It's just sad... and childish.)
 