The Best Graphics Cards for the Money: Nvidia & AMD GPUs Tested and Compared

Some glaring issues here. 22 watts in-game is enough "efficiency" to recommend the 970 over the Hawaii refresh despite lower FPS??? Shill harder... The 260X is mentioned, but not the 290/290X? I've used all three, and I applaud the mention of Bonaire XTX; that said, why no mention of Hawaii? In the last two months Newegg has had the XFX Black Edition 290X (lifetime warranty) for $259 and a Sapphire reference 290 for $220. Want more efficiency? Use Afterburner to drop the power limit by 20-25% without having to reduce the clock speed, and/or use frame limiting to match your monitor's max refresh (see the sketch after the tier list below).
Also, if you discuss price-to-performance and don't include FreeSync vs G-Sync, you are leaving a really big piece of the budget puzzle out of the picture - booooo.
Try this:
<$650 - 980 Ti
<$550 - Fury Pro
<$400 - 390X
<$300 - 290X
<$250 - 290 (AIB ~$240, reference $220)
<$200 - 380* (the 290 @ $220 crushes this, though)
*$100-$200 cards are currently terrible value propositions, courtesy of reduced FPS per dollar and the lack of FreeSync (which, sadly, would benefit these lower tiers the most...)
<$100 - 260X
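
For anyone who wants a feel for what a frame limiter actually does, here's a minimal Python sketch of the idea; the render() callback and target_fps value are hypothetical placeholders, and real limiters (RTSS, driver-level caps) use far more precise timing:

```python
import time

def run_capped(render, target_fps=60):
    """Toy frame limiter: sleep away the remainder of each frame
    so the loop never outruns target_fps. Idling instead of
    rendering extra frames is where the power savings come from."""
    frame_budget = 1.0 / target_fps
    while True:
        start = time.perf_counter()
        render()  # placeholder for the game's per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # idle out the leftover budget
```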
 
Maybe there could be a follow-up article comparing performance per watt of electrical power used, without regard to video card hardware cost.
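
If anyone wants to run that math themselves, here's a back-of-the-envelope sketch; the cards and numbers below are made-up placeholders, not measurements:

```python
# Performance-per-watt comparison, ignoring purchase price.
# All FPS and wattage figures are hypothetical placeholders;
# substitute measured average FPS and in-game board power.
cards = {
    "Card A": {"fps": 60.0, "watts": 150.0},
    "Card B": {"fps": 66.0, "watts": 230.0},
}

ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["fps"] / kv[1]["watts"],
                reverse=True)
for name, d in ranked:
    print(f"{name}: {d['fps'] / d['watts']:.3f} FPS per watt")
```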
 
I would rather have The Witcher 3 or Crysis 3 in this article. All the Ubisoft games used here were Nvidia-sponsored and have exclusive features to show it.
Err, The Witcher 3 has Nvidia GameWorks features just like those Ubi games.

The entire Crysis series is known for favoring Nvidia hardware.
 
Currently the R7 260X can be had for the same price and provides at least 10% more performance than the R7 370, making it a better choice until remaining stock dries up.

Just want to point this out: the R7 260X is not faster than the R7 370; maybe you meant the R7 360.
 
I would rather have The Witcher 3 or Crysis 3 in this article. All the Ubisoft games used here were Nvidia-sponsored and have exclusive features to show it.
Err, The Witcher 3 has Nvidia GameWorks features just like those Ubi games.
Which can be turned off - and has been in this latest comparison between the XFX DD R9 390X ($380 after MIR) and the Asus GTX 970 Strix ($320 after MIR). The highest playable settings are identical for both cards with the 970 slightly shading the 390X in this instance.
 
It's irresponsible of you to recommend the GTX 970 over the R9 390. I own BOTH an R9 290 (at 390 clocks) and a GTX 970. The AMD card is routinely faster, and if your use case allows it (space requirements), the R9 is the smarter choice 100% of the time. Everything else is pretty on point.

You're probably just trying not to piss off Nvidia. The only logical cards they have are the GTX 980 Ti and the GTX 950; for every other price category AMD has a better option. Basically, if you hadn't given the nod to the GTX 970, AMD would have had a clean sweep of the most profitable/popular part of the graphics card market.

If you really own both then you would know that they deliver virtually the same performance and that differentiating between the two is near impossible. The frame rate battle is pretty much a tie; the GTX 970 runs cooler and consumes less power, so I prefer it. That said, gamers could happily go either way. To say it is irresponsible to recommend one over the other is just stupid when they are so similar.

It might be irresponsible to recommend the Fury X over the GTX 980 Ti or the GTX 960 over the R9 380.

FYI, I couldn't care less about Nvidia's feelings or about pissing them off; I just called it as I saw it.

Also, in the chart about power consumption, it says that the difference is only 22W between the GTX 970 and the R9 390. Shouldn't the difference be much higher? If not, why are you recommending the 970 for power efficiency when it's only 7% more efficient but performs worse in all benchmarks?

The graphing error is fixed, thank you; there was a truckload of data to get through. The power consumption figures are going to vary quite a bit from game to game. Typically we test with at least three games, but we didn't have time. We weren't basing our opinions solely on the power consumption results shown in this article, but rather using them as an indicator. Having tested both GPUs extensively since their release, we know the 390 consumes quite a lot more power than the 970 in most games.
 
I have to say that buying a 970 over a 390 is a bizarre recommendation. Having owned both, there is no way I would go with the 970; the MSI 390 is such a great card and was cheaper than the 970 with better performance.

In saying that, the 970 was good but had frame-stuttering issues in about 20% of games.
 
I have to say that buying a 970 over a 390 is a bizarre recommendation. Having owned both, there is no way I would go with the 970; the MSI 390 is such a great card and was cheaper than the 970 with better performance.

In saying that, the 970 was good but had frame-stuttering issues in about 20% of games.

Really? I game at 1600p and often use a GTX 970, and I've never seen frame-stuttering issues in any of the latest games. Based on what I have seen price-wise and performance-wise, they are very much the same.

I prefer the GTX 970 in that comparison but recognize this battle is very much a tie, so I recommend either the GTX 970 or the R9 390.
 
Err, The Witcher 3 has Nvidia GameWorks features just like those Ubi games.

The entire Crysis series is known for favoring Nvidia hardware.

The Witcher 3 has HairWorks, but it's limited to only that. The whole engine was not optimized with Nvidia cards in mind.

Crysis 3 was the only game in the series that could be said to lean Nvidia's way, with the over-tessellation issue. 1 and 2 ran pretty vendor-agnostically.

Which can be turned off - and has been in this latest comparison between the XFX DD R9 390X ($380 after MIR) and the Asus GTX 970 Strix ($320 after MIR). The highest playable settings are identical for both cards with the 970 slightly shading the 390X in this instance.

Thanks for the link, but I'm still seeing HBAO+ turned on in their tests. Can anyone confirm this Nvidia tech doesn't hurt framerates on AMD cards vs SSAO?
 
Great comparisons! AMD did pretty well considering that their products are just rebrands of prior products. I'm still using a GTX 660 Ti card from MSI, and it is still getting the job done at high 1080p settings.
 
It's irresponsible of you to recommend the GTX 970 over the R9 390. I own BOTH an R9 290 (at 390 clocks) and a GTX 970. The AMD card is routinely faster, and if your use case allows it (space requirements), the R9 is the smarter choice 100% of the time. Everything else is pretty on point.

You're probably just trying not to piss off Nvidia. The only logical cards they have are the GTX 980 Ti and the GTX 950; for every other price category AMD has a better option. Basically, if you hadn't given the nod to the GTX 970, AMD would have had a clean sweep of the most profitable/popular part of the graphics card market.

If you really own both then you would know that they deliver virtually the same performance and that differentiating between the two is near impossible. The frame rate battle is pretty much a tie; the GTX 970 runs cooler and consumes less power, so I prefer it. That said, gamers could happily go either way. To say it is irresponsible to recommend one over the other is just stupid when they are so similar.

It might be irresponsible to recommend the Fury X over the GTX 980 Ti or the GTX 960 over the R9 380.

FYI, I couldn't care less about Nvidia's feelings or about pissing them off; I just called it as I saw it.

Also, in the chart about power consumption, it says that the difference is only 22W between the GTX 970 and the R9 390. Shouldn't the difference be much higher? If not, why are you recommending the 970 for power efficiency when it's only 7% more efficient but performs worse in all benchmarks?

The graphing error is fixed, thank you; there was a truckload of data to get through. The power consumption figures are going to vary quite a bit from game to game. Typically we test with at least three games, but we didn't have time. We weren't basing our opinions solely on the power consumption results shown in this article, but rather using them as an indicator. Having tested both GPUs extensively since their release, we know the 390 consumes quite a lot more power than the 970 in most games.

You must have had a bad 390 or used a bad driver. The 390 always outperforms the 970 in every bench or game we test them in. Period.
 
Thanks for the link, but I'm still seeing HBAO+ turned on in their tests. Can anyone confirm this Nvidia tech doesn't hurt framerates on AMD cards vs SSAO?
It should be easy enough to extrapolate from Steve's own review. The 390X has a negligible performance gain over the 290X until the VRAM frame buffer makes an impact at higher resolutions/downsampling. Also note that the GTX 970 Strix in the comparison sports a middle-of-the-pack overclock, not reference clocks (of the 31 models offered by Newegg, 12 offer higher overclocks, one is equal, 10 have OCs within 5% of the Strix, and 7 offer reference clocks), so slightly higher performance, but it is fairly indicative of the market as far as Newegg's selection and pricing is concerned.
You must have had a bad 390 or used a bad driver. The 390 always outperforms the 970 in every bench or game we test them in. Period.
Well, that isn't difficult if you tailor your bench suite (and/or image quality settings) to titles that fit your narrative, is it? ...And blaming a bad AMD driver? That's heretical talk to more than a few.

Last time I checked, some current games definitely favour the GTX 970 over the R9 390: Call of Duty: Advanced Warfare, WoW, Wolfenstein: The New Order, GRID Autosport, and Project CARS spring to mind. Some games, like Metro Last Light/Redux, The Witcher 3, and GTA V, can tip one way or the other depending upon what image quality settings are used. The 390 shades the 970 in many games, but I'm guessing a lot comes down to the choice of games used in the comparison, and what SKUs are used in that comparison.
 
I have to say that buying a 970 over a 390 is a bizarre recommendation. Having owned both, there is no way I would go with the 970; the MSI 390 is such a great card and was cheaper than the 970 with better performance.

In saying that, the 970 was good but had frame-stuttering issues in about 20% of games.

Really? I game at 1600p and often use a GTX 970, and I've never seen frame-stuttering issues in any of the latest games. Based on what I have seen price-wise and performance-wise, they are very much the same.

I prefer the GTX 970 in that comparison but recognize this battle is very much a tie, so I recommend either the GTX 970 or the R9 390.

Was the last table edited, or was it like this all along? I see it sports "$300 - $400 GeForce GTX 970 or Radeon R9 390" now. If so, hats off for fairness and listening.
 
Was the last table edited, or was it like this all along? I see it sports "$300 - $400 GeForce GTX 970 or Radeon R9 390" now. If so, hats off for fairness and listening.

This was how I had it to begin with. Not sure if it was changed during the editing process. They are obviously very similar, making it hard to pick a real winner.
 
They recommend the 970 over the 390 when they're mostly identical and it's all torches and pitchforks. They recommend the 390X over the 980 even though the 980 consistently does better and nobody says a word.

That said, I do find it odd to put the 970 over the 390; power consumption is a really wishy-washy reason. When you spend that much on a GPU you buy it to perform and typically overclock it, not to save a few cents on the electricity bill. I'm not sure a difference of less than a light bulb's worth will impact anyone's experience in any significant way, and nobody will have to run out and buy a new PSU if they can already run a 970. 8GB of VRAM and better DX12 support are hardly speculative, seeing as we're already seeing games hit that 3.5GB ceiling, and DX12 benchmarks are showing significant gaps between the two.
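
On the 3.5GB point, anyone with a 970 can watch it happen. A minimal sketch using Nvidia's NVML Python bindings (pip install nvidia-ml-py; assumes a single Nvidia GPU at index 0):

```python
import time
import pynvml  # Nvidia NVML bindings (nvidia-ml-py)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Poll VRAM usage while a game runs; on a GTX 970, allocations
# creeping past ~3.5 GiB land in the card's slower 0.5 GiB segment.
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} GiB "
              f"of {mem.total / 1024**3:.2f} GiB")
        time.sleep(2)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```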
 
Err, The Witcher 3 has Nvidia GameWorks features just like those Ubi games.

The entire Crysis series is known for favoring Nvidia hardware.

The Witcher 3 has HairWorks, but it's limited to only that. The whole engine was not optimized with Nvidia cards in mind.

Crysis 3 was the only game in the series that could be said to lean Nvidia's way, with the over-tessellation issue. 1 and 2 ran pretty vendor-agnostically.

Which can be turned off - and has been in this latest comparison between the XFX DD R9 390X ($380 after MIR) and the Asus GTX 970 Strix ($320 after MIR). The highest playable settings are identical for both cards with the 970 slightly shading the 390X in this instance.

Thanks for the link, but I'm still seeing HBAO+ turned on in their tests. Can anyone confirm this Nvidia tech doesn't hurt framerates on AMD cards vs SSAO?
The Witcher 3 has HairWorks (PhysX + Nvidia advanced tessellation) and HBAO+... all of which can be turned off.
AC games have HBAO+, PhysX, Nvidia advanced tessellation, PCSS and, as an added blurry bonus, TXAA... all of which can be turned off.

My experience with the Crysis games was not agnostic at all.
Crysis 1 ran better on my 8800 than on my Radeon 5870. Yes, the 5870 had higher peak FPS, but the stutter was horrible, yet nonexistent at the same settings on the 8800.
Crysis 2 was definitely a TWIMTBP game. Again, my experience was that it ran better (smoother, less stutter) on my 760 than on a 280X.
Crysis 3, no comment, as I never played it.
 
They recommend the 970 over the 390 when they're mostly identical and it's all torches and pitchforks. They recommend the 390X over the 980 even though the 980 consistently does better and nobody says a word.

That said, I do find it odd to put the 970 over the 390; power consumption is a really wishy-washy reason. When you spend that much on a GPU you buy it to perform and typically overclock it, not to save a few cents on the electricity bill. I'm not sure a difference of less than a light bulb's worth will impact anyone's experience in any significant way, and nobody will have to run out and buy a new PSU if they can already run a 970. 8GB of VRAM and better DX12 support are hardly speculative, seeing as we're already seeing games hit that 3.5GB ceiling, and DX12 benchmarks are showing significant gaps between the two.

Trust me, I know what you are saying about people overreacting to our picks. That said, the 390X and 980 comparison was a little more cut and dried given the 390X is so much cheaper.

Power consumption is extremely important, and saving on your power bill isn't the issue. The less power a component consumes, the less heat it generates and, as a result, the less noise it makes. Lower-powered components are more flexible as well, as they can often be used in small builds and don't require higher-rated power supplies. I am not sure how anyone could say power consumption is a wishy-washy reason :S

You are right, 8GB of VRAM isn't speculative anymore; it is well known this is entirely pointless and a marketing ploy by AMD. I have tested the 290X 4GB and the 390X 8GB clock for clock in a number of games that use well over 4GB of VRAM, and the 390X is not a single frame faster under playable conditions. I would very much like to see a test where the 390X is much faster than the 290X due to its larger VRAM; that would be interesting. The truth is, in 99% of scenarios where that much VRAM is being used, the Grenada XT (Hawaii XT) architecture isn't powerful enough to move data through the VRAM fast enough for the extra buffer to be an advantage.

DX12 performance, on the other hand, is very much speculative, as we have really only seen one or two pre-release games. If Project CARS were the only DX11 game, what would your conclusion be?

Furthermore, in the latest ‘Ashes of the Singularity’ build I am seeing very competitive DX12 performance between AMD and Nvidia. Again the real DX12 battle will take place between future GPU generations.
 
Thanks for the link, but I'm still seeing HBAO+ turned on in their tests. Can anyone confirm this Nvidia tech doesn't hurt framerates on AMD cards vs SSAO?
If we were talking about plain ol' HBAO, then it runs better than HDAO for both Nvidia and AMD, and in my opinion looks better too.

The problem is there aren't many games available with HBAO+ built in. There's this old article from Steve for Blacklist: with HBAO+ on, the 7970 GHz is on par with a 680, while with it off the 7970 GHz shoots well above the 680. Here performance is really harmed.

Although for AC: Unity, the 290X takes a slightly worse proportional hit (4 frames lost) compared to the 780 (2 frames lost). But the problem here is that Nvidia PCSS is also enabled, so if PCSS were off I think it would perhaps even out? I don't like PCSS anyway.

Then there's this Guru3D review for CoD: AW. At the top of the page it's stated that HBAO+ is enabled, among other things. To me the results don't really seem odd at all.

I'm personally going with the assumption that HBAO+ performance on AMD cards is going to be a game-by-game basis and can't be generalized quite yet.
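
Since raw frame deltas can mislead, the fair way to compare is the percentage cost of the effect on each card. A quick sketch with hypothetical numbers (not taken from those reviews):

```python
def effect_cost(fps_off, fps_on):
    """Percentage frame-rate cost of enabling an effect such as HBAO+."""
    return (fps_off - fps_on) / fps_off * 100

# A 4-frame drop from a lower base is a bigger proportional hit
# than a 2-frame drop from a higher base (placeholder figures).
print(f"{effect_cost(50, 46):.1f}%")  # 8.0%
print(f"{effect_cost(60, 58):.1f}%")  # 3.3%
```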
 
Yes, the 5870 had higher peak FPS, but the stutter was horrible, yet nonexistent at the same settings on the 8800.
Crysis 2 was definitely a TWIMTBP game. Again, my experience was that it ran better (smoother, less stutter) on my 760 than on a 280X.
Crysis 3, no comment, as I never played it.
It was a pretty common theme with Radeon GPUs.
I had a similar experience with my 6970 and 570. The 6970 actually benched a little higher and, according to FRAPS, achieved higher frame rates, but the 570 was noticeably smoother in several games. This same experience was noted by many others; the Radeons seemed to 'dip' more under stress.

Not trying to tick anyone off; I'm pretty sure it's not true anymore with their current hardware & drivers.
 
I thought this was a very well-written article. I know people will have strong opinions because they are too loyal to one particular brand, but he did what any sensible person would do, and that's find the product that fits your budget. I thought he did an excellent job of breaking down what HIS choice would be within each pricing bracket.
 
Recommending a 970 over a 390 today, is this a bad joke? The 970 over the 290 at their release, maybe, but consider the improvements made on the 390: 8GB of VRAM vs 3.5GB with texture resolutions increasing all the time, and the fact that if you buy another 390 in the future (for a reduced price), the CrossFire performance is both future-proof and flawless for 1440p, and with DX12 around the corner 4K seems completely doable as well. I don't comment on these forums often, but this was just too weird to ignore. Then again, trolling the public might've been the goal here; I know I couldn't resist from time to time.
 
Haha... I hate when people blindly love GTX 970 and give this stupid card a win over R9 390.
R9 390 beats 970 any day and is a much future proof card with 8gb vram vs poor 3.5gb. And who gives a **** about 20w power difference?
 
Haha... I hate when people blindly love GTX 970 and give this stupid card a win over R9 390.
R9 390 beats 970 any day and is a much future proof card with 8gb vram vs poor 3.5gb. And who gives a **** about 20w power difference?

I love it when fanboys provide feedback that no one wants and make an ignorant statement or two about things they don't fully understand.

The 390 is a much future proof card with 8gb vram vs poor 3.5gb - Savoz (2015)
 
Haha... I hate when people blindly love GTX 970 and give this stupid card a win over R9 390.
R9 390 beats 970 any day and is a much future proof card with 8gb vram vs poor 3.5gb. And who gives a **** about 20w power difference?

I love it when fanboys provide feedback that no one wants and make an ignorant statement or two about things they don't fully understand.

The 390 is a much future proof card with 8gb vram vs poor 3.5gb - Savoz (2015)

I humbly offer that most of your readers can spot a 'heavily invested opinion' as well as you can, and you do a disservice to your credibility when you respond in kind, i.e., we completely ignore them; maybe you'll consider doing the same. Great article, and I share your disappointment with commenters' love of minutiae, though that is the nature of this particular beast (NV/AMD).
 