Best Graphics Cards of 2016: Top picks for every budget

So much hate for the GTX 970 on here. I own both a GTX 970 and an R9 390 (both from GALAX), and I think I love the GTX 970 more, mainly because of its much lower TDP and because it overclocks much, much better. I get better FPS on my 1080p 144Hz monitor in all my games too.
 
The problem with the "it can't do 4K" argument (which I hate) is that if you want to play at 4K, you can, just not at max settings. And I don't know what's going on with people today, but today's games look AMAZING even at low and medium settings if you tweak them right. Of course you can't max everything out if you want to play at 4K, but you can play at 4K. I know this because I play at 4K on my GTX 680 with my Philips BDM4065UC. Granted, I have to keep things mostly at medium, but I can play 4K at 40-50 FPS semi-reliably. Now of course I don't like playing at 30 FPS and thereabouts, but it is playable.

So you are suggesting that playing at 4K with medium-to-high quality settings looks better than 1440p with ultra/max quality settings. What exactly do you think the point of 4K gaming is, or higher-resolution gaming for that matter? The primary reason you increase the resolution is to improve image quality, so it seems very counterproductive to do this and then lower the quality settings just so you can achieve average performance.

Furthermore, I really have no idea if you are aware of this, but if you game at 4K and then reduce the quality settings, you are reducing the VRAM requirement to roughly that of a lower resolution. Again, the GTX 970 and R9 390 are not powerful enough to effectively use more than 3.5 – 4GB of VRAM. If Nvidia developed a GTX 970 that legitimately had 8GB of VRAM, I am willing to bet that under playable conditions it wouldn’t be any faster than the current model.
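To put rough numbers on that (purely back-of-the-envelope figures I'm assuming for illustration, not anything measured): render-target memory scales with pixel count, so resolution by itself only adds a modest, fixed slice of VRAM, and it's the texture/asset quality settings that do most of the filling.

```python
# Back-of-the-envelope sketch (assumed figures, purely illustrative):
# render-target memory grows with pixel count, but even at 4K it is a
# modest, fixed cost next to high-resolution textures and assets.

def render_target_mb(width, height, bytes_per_pixel=4, num_targets=4):
    """Approximate memory used by colour/depth/G-buffer style render targets."""
    return width * height * bytes_per_pixel * num_targets / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")

# Roughly 32 MB at 1080p, 56 MB at 1440p and 127 MB at 4K with these assumptions.
# A real engine uses more buffers than this, but the point stands: the bulk of
# VRAM goes to textures and assets, which is why dropping those quality settings
# at 4K pulls the requirement back toward that of a lower resolution.
```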

SLI might be a different scenario under extreme conditions.
 
I understand exactly what I'm doing: I'm essentially playing games at the equivalent settings of 4 years ago. 4 years ago, graphics were pretty good, and I am happy with the image quality I'm getting. And I assure you, when blown up to 40 inches, the increase in resolution over graphics effects is a large improvement. I'd rather have a 40" 4K image at medium settings than a 40" 1080p image at high settings. I'm sure you have access to some type of 4K TV; try the experiment for yourself. Until you've seen it for yourself, you really have no business commenting on it.

Games don't need to be played at max settings to be enjoyed. I see things like lens flare and god rays as silly gimmicks that suck up performance. Many graphics effects, I think, make games look worse, since they can't uniformly look the same across an image. I'd rather have okay shadows that cover the whole area I'm looking at than see distant objects completely devoid of shadows. There is a lot more to image fidelity than just shaders. The believability of many games is often ruined by poorly implemented special effects. What I'm essentially doing is playing games at current-gen console image settings in native 4K.
 

I have a Samsung 55” 4K TV, not sure why that matters.

I fail to see how any of this has anything to do with the GTX 970 and its VRAM capacity limitation, especially if you are playing older games such as Crysis 2, The Witcher 2 and Dragon Age 2. The VRAM requirements for those games, even at 4K, are very low.
 
Look, I know you and I fight a lot, but I REALLY want you to do the experiment. Having this expansive screen in front of your face does a lot for making an immersive experience. I'd say the immersion you gain outweighs the special effects you lose. 4K gaming is possible on a budget, and I'm sure many budget-minded readers would be interested in reading about it. Sure, this has gotten a little off topic, but the reason this relates to the 970 is that because of the memory flaw it will have a difficult time handling the larger resolutions at these settings. I'm a really big fan of the increased screen real estate that 4K allows. The 30" 4K panels, IMO, are silly, at least while GUI scaling is as poor as it is.

My situation is that I bought the screen mainly for image quality and screen real estate to work on; gaming was an afterthought. I was thinking, "if I really want to play games I'll just drop it to 1440p or 1080p." I was delighted to find that with minimal tweaking of settings, I can game in 4K. You aren't going to be winning any pro matches of CS:GO, but playing single-player games like Fallout 4 and Skyrim is absolutely amazing. The higher resolution also benefits me greatly in AoE II HD and Civ V. You can see a wider area of the battlefield, and it helps you see options you weren't really aware you had before. You can get a decent-quality 40" 4K monitor for around $400, and I think many people avoid adopting 4K because they think it isn't possible on their current hardware. I'm saying IT IS possible. 4K brings out detail in the "lower" settings you never noticed before.

If I can do this stuff on a GTX 680, imagine the experience people would have on a 780 or a 980. Dropping the settings isn't the horrible thing that marketing teams want us to believe it is, at least not anymore. New games at similar settings to older games often look and perform better; game engines have had time to mature and gain performance. I guess the really important thing here is that everyone keeps saying "4K gaming isn't here yet" and I've been doing it for months on a 680.

All of that said, I am fully aware that my current hardware is inadequate, but I can still see the benefit and enjoy a 4K experience. I do intend to upgrade to the GTX 1080, or whatever the respective Pascal model ends up being called. I also believe that many other people can enjoy a 4K experience as long as they drop the settings and the high-end shaders. This experiment could make for a very interesting article. You should try it, record your results and post it. Even if you disagree with me, I'm sure other people would like to read about the conclusion.
 

I don’t want to have the "4K at lower quality settings" discussion, least of all here. I strongly feel 4K gaming should only be tackled by high-end hardware right now, and lowering the quality settings to medium defeats the purpose entirely.

What I want to know is why you think the GTX 970 will struggle here or present any kind of problem when there is absolutely no evidence to support this.

I mean we have tested both the 390 and 970 extensively at 4K in the latest games using maximum quality settings…

https://www.techspot.com/review/1024-and-radeon-r9-fury-x/page2.html

The 970 and 390 look very evenly matched to me; the 970 is even faster in quite a few of the games.
 

I tend to disagree strongly with the statement in bold. And as far as the 970 goes, I'm not convinced a driver/firmware update can entirely fix a hardware problem. I experiment with many different operating systems and I'm really not sure what performance problems could surface. I also use my GPUs for other tasks, and for that other software there's really no guarantee the software fix would solve the performance issues in my situation. In my situation, I'd rather just pay more to avoid the problem. Granted, I'm part of a fairly small group who uses the performance of a PC for tasks other than gaming, but I still think buying a product with this many red flags around it is a bad idea. I couldn't honestly recommend a 970 to someone with confidence. You wouldn't buy other products with a class-action lawsuit surrounding them, so why is this any different?
 
I have to assume you are talking about the GTX 970? Even so, it makes no sense. What is with you guys and your obsession over VRAM? Firstly, can you please provide some evidence of the GTX 970 suffering from a lack of VRAM under playable conditions? It is outright faster than the 390 at 1080p due to AMD’s driver overhead issue and matches the 390 at 1440p. It consumes less power and overclocks much better. It also costs the exact same amount.
(...)
The GTX 950 is an obvious choice here so I am not even going to bother arguing the point. The R9 270 has been heavily handicapped through poor driver optimisation for the latest games and, well, it is a discontinued product.
(...)
No, the 390 isn’t faster, it has more VRAM it can’t use, and it’s the same price. The 390 will only improve with time? Huh, how is that working out for the 200 series?
Hello, Steve. I've been a lurker on this site for a long time, and even posted occasionally when guest posts were available. I figured today might be a good day to make my account, and address some issues you're raising.
First, you are wrong about the GTX 970 being faster, even at 1080p. The link below is from a recent TechPowerUp review with the most recent drivers, and they do something fantastic that I think every site that reviews GPUs should do, which is to consolidate all the results from the games tested into a summary at the end. It's an excellent way to visualize performance across the board, and as a side note I'd like to suggest TechSpot do that as well, so we can have more such summaries besides TechPowerUp's.
https://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/23.html
As you can see, even at 1080p the R9 390 was faster, although not by much (4% in the relative-percentage unit used on the chart). That difference increases as the resolution goes up, to 8% at 1440p and 11% at 4K (again, these are not absolute percentages between the 970 and the 390, just the chart's unit, where 100% is the factory-OCed 390 being tested). You could argue that at 1080p, if you overclock both cards, the 970 will overclock better and might outperform the 390, but as of now I'm not aware of such a test having been done so we can say for sure, and not all users bother overclocking GPUs either (I'm inclined to guess most don't, but I can't back that up).
As for VRAM, yes, today there are few scenarios where the 970's 3.5 GB of VRAM becomes an issue. But that won't be the case forever. Back in the Crysis days, for example, 512 MB was enough for 1080p gaming and 1 GB was overkill. Then after a few years (in the 5870 vs. 480 days) 1 GB was adequate and 2 GB was overkill. A couple more years, and 1 GB was no longer enough and you needed 2 GB. And today 2 GB is already getting really tight for 1080p and you'd want at least 3 GB to be comfortable, while the avant-garde games of VRAM consumption keep pushing much further, and mods will only increase that requirement. There is no such thing as a static "ideal VRAM for 1080p"; VRAM requirements increase over time at the same resolution. So yes, we'll soon get to the point where you'll need 4 GB for comfortable 1080p gaming, and when that time comes the 970 will be the first to struggle, due to the latency issues that come with 3.5+ GB usage, while other 4 GB cards such as the 380X, 290/290X and 980 still won't. In summation, the problem with the 970 isn't today, it's how it will behave in the future.
As for the 950, you can see in the TechPowerUp summary that, with current drivers, the 270X performs the same as the 960, and the 370 (which should be similar to the 270) performs the same as the 950. There is no clear divide in favor of Nvidia as you seem to suggest. Also, the 270X performing the same as the 960 while the 380 wipes the floor with it suggests either good driver support from AMD or bad driver support from Nvidia. It's hard to conclude from this data that the AMD drivers aren't doing a good job.
As for not being "able" to use the VRAM, I don't think that makes sense. Textures, for example, are an easy way to increase VRAM consumption significantly with minor impact on FPS, so with high-res textures even mid-range GPUs can require a lot of memory and still perform well. That's very significant for the downloadable high-res texture packs we've been seeing, as well as mods.
I mean we have tested both the 390 and 970 extensively at 4K in the latest games using maximum quality settings…
...
The 970 and 390 look very evenly matched to me...
Since TechSpot already has the data, I went ahead and calculated the summary.
Here are the 4K framerates for the 390: 21, 36, 21, 35, 19, 27, 30, 33, 28, 45, 33, 29, 43, 15, 26, 26, 21
Here are the 4K framerates for the 970: 19, 35, 23, 39, 16, 24, 27, 29, 29, 37, 30, 24, 36, 16, 24, 24, 18
In the 17 games tested, the 390 averaged 28.71 FPS, while the 970 averaged 26.47 FPS. That makes the 390 8.44% faster on average at 4K. Note also that it's a review from last July, so it's using relatively old drivers. TechPowerUp's most recent summary says the 390 is on average 12.79% faster at 4K (normalizing the 390's "97%" score as 100% and increasing the 970's 86% score in the same proportion). So I wouldn't exactly say they are evenly matched.
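For anyone who wants to double-check, here is the same math spelled out as a quick script (it just redoes the calculation on the numbers quoted above, nothing new):

```python
# Redoing the 4K summary math from the TechSpot numbers quoted above
# (a sanity check on the averages, not new data).

r9_390  = [21, 36, 21, 35, 19, 27, 30, 33, 28, 45, 33, 29, 43, 15, 26, 26, 21]
gtx_970 = [19, 35, 23, 39, 16, 24, 27, 29, 29, 37, 30, 24, 36, 16, 24, 24, 18]

avg_390 = sum(r9_390) / len(r9_390)    # ~28.71 FPS
avg_970 = sum(gtx_970) / len(gtx_970)  # ~26.47 FPS
print(f"390: {avg_390:.2f} FPS, 970: {avg_970:.2f} FPS")
print(f"390 is {(avg_390 / avg_970 - 1) * 100:.2f}% faster on average at 4K")

# Same normalisation applied to the TechPowerUp relative-performance chart,
# where the factory-OC 390 is the 100% baseline:
tpu_390, tpu_970 = 97, 86  # 4K chart scores
print(f"TPU chart: 390 is {(tpu_390 / tpu_970 - 1) * 100:.1f}% faster at 4K")
```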
 
I'm rocking a 970 with a 4690K CPU. It far outpaces my old AMD 280. I grabbed it for $250 off eBay and it has held up so far. It does everything I want it to do at 1080p, and no one recommends it for higher resolutions. Which is fine; top-grade resolutions are not in my budget. When comparing the two, I looked at a variety of gaming benchmarks, and the 970 seems to win most of the time. This is likely due to Nvidia's optimization; whatever the cause, that is the main reason I passed on the 390.

Also, my 280 ran really hot and AMD's software seemed clunky. Half the time when I clicked on AMD's "turbo boost" the program malfunctioned and froze up. I tried and tried to get it working, and I googled and found out it's a common problem. I know I can OC it on my own, but I live in a hot climate and I don't always want my card boosted.

The argument that this card does not fare as well at resolutions above 1080p is faulty. These cards are built for 1080p, not 4K gaming. Also, that extra 0.5GB of RAM is supposed to kick in automatically when needed.
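To be clear about what that "kicking in automatically" means, here is a toy model of the idea (my own illustration of segmented allocation, not Nvidia's actual memory manager):

```python
# Toy model of the 970's segmented VRAM (not Nvidia's real allocator, just the
# concept): allocations prefer the fast 3.5 GB pool and only spill into the
# slower 0.5 GB segment once the fast pool is full.

FAST_POOL_MB, SLOW_POOL_MB = 3584, 512

def allocate(requests_mb):
    fast_used = slow_used = 0
    for size in requests_mb:
        if fast_used + size <= FAST_POOL_MB:
            fast_used += size            # normal case: fast segment
        elif slow_used + size <= SLOW_POOL_MB:
            slow_used += size            # spill-over: slow segment "kicks in"
        else:
            raise MemoryError("out of VRAM")
    return fast_used, slow_used

print(allocate([1024, 1024, 1024, 512, 256]))  # last 256 MB lands in the slow pool
```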

Side note: Recently I've been reading about LightBoost, an Nvidia exclusive that is supposed to grant a huge improvement in twitch gaming.

I looked up a few major games; the 970 wins at 1080p:
Fallout 4
https://www.techspot.com/review/1089-fallout-4-benchmarks/page2.html

The Witcher 3 (970)
https://www.techspot.com/review/1006-the-witcher-3-benchmarks/page3.html

Metal Gear Solid V: The Phantom Pain (970)
https://www.techspot.com/review/1060-metal-gear-solid-phantom-pain-benchmarks/page2.html

Batman: Arkham Knight (the 290 won this one)
https://www.techspot.com/review/1022-batman-arkham-knight-benchmarks/page2.html

Dying Light (970)
https://www.techspot.com/review/956-dying-light-benchmarks/page3.html

Someone feel free to post some more benchmarks; actual performance at 1080p is the only thing the majority of people care about.
 
The title says "Best Graphics Cards of 2016". I look at my calendar, and it says it is January. This article is false. All it talks about are the graphics cards of 2015, not 2016. I see nothing rating AMD's Radeon R9 490X against Nvidia's GTX 1080.

Where are all the graphics cards of 2016? Oh wait, those are not expected until around July. None of 2016's graphics cards have been released yet.
 
With Nvidia's Pascal GPUs right around the corner (according to the rumours), I think it would be premature to buy a GPU now, to say the least. I have great interest in the HTC Vive, and since those GPUs are being made with VR in mind, I have high hopes that they will be much more efficient than the current lineup.
What do you guys think?
 
I look at my calendar, and it says it is January. This article is false.
That's because you haven't read his August 2016 article yet. You should hop in your time machine and go read it - it's brilliant!
Look, the author clearly states in a comment that this article will be updated as the year progresses. Or are you so pedantic that you need him to change the title of the article to "Best Graphics Cards of 2016 (so far)"?
Sometimes I wish comments had a Dislike button.
 
This is a follow-up to our annual graphics card roundup (Oct/15), and the plan is to keep it updated throughout the year as new GPUs are released. Think of it as your one-stop resource for what GPU to buy at any given time.

As noted in the article, we don't expect new generation GPUs to come out until Q2 or Q3.
Ok.
 
It never ceases to amaze me how many toxic comments are thrown up on articles like this. Calling the TechSpot team "shills" because they recommend mostly Nvidia-based cards this time around is pathetic. Who cares what company the graphics card is from? I as a consumer look at the best value for the money, not whether it is AMD or Nvidia. The fanboys need to stop. My god, it's like all the boobs getting into a shouting match about their Chevy being better than a Ford. Who gives a flying crap.
 
I quickly skimmed this article the day it came out, before any comments were posted. I made a mental note to come back and read all the angry comments from people who lean to one side or the other. I am not disappointed.

Having said that, my GTX 970 has only given me a problem going above the 3.5GB once. I turned some settings down a little bit, and it's been fine ever since. I can't even remember what game it was at this point. When all the drama broke over the 3.5 thing, I convinced Amazon customer service to give me back like $70. Best price/performance I have ever paid for a card. It was a great upgrade after keeping my GTX 480 for so long.
 
It never ceases to amaze me how many toxic comments are thrown up on articles like this. Calling the TechSpot team "shills" because they recommend mostly Nvidia-based cards this time around is pathetic. Who cares what company the graphics card is from? I as a consumer look at the best value for the money, not whether it is AMD or Nvidia. The fanboys need to stop. My god, it's like all the boobs getting into a shouting match about their Chevy being better than a Ford. Who gives a flying crap.
Absolutely. My previous 4 cards were AMD and I converted to Nvidia because AMD fell short of the mark. Simple. The article largely just reiterates the state of play in the market.
 
Hello people.
This is my first post here, as I just wanted to throw in my 2 cents. I hope to eventually become a regular poster, as not many people seem to think of those of us who upgrade very rarely! But I do want to explain how I view the R9 390 and GTX 970.

Now about the 8GB of VRAM. The truth is that it is not an advantage in most situations, EVEN at 4K. That is true. However, it can amount to something with CrossFire support, and people who mod their games heavily CAN see it being used up. Modding is a huge thing for me, far more important than graphical fidelity and even frame rate. It's one of the major reasons why I am a PC gamer at all (after backwards compatibility and cheaper long-term costs, ahead of emulation).

The other reason is that some people do not upgrade every year, or two, or three... or four. The ability of your card to let you max out two of the most important visual quality aspects (texture quality and model quality), even in future titles, is great. EVEN if you have to turn down some other things, this means that for us, the R9 390 has this advantage.

My last GPU lasted me 6 years: the ATI 5770.
If I had listened to people saying back in the day that the 1GB model was somewhat pointless... well, it would have bitten me in the backside. That 1 GB of VRAM allowed it to play even games like The Witcher 3. This is the same thing here. I am certain that 2, 3 or 4 years from now there will be games whose texture and model quality sliders would benefit from more than 4 GB of VRAM. And for those who upgrade even more slowly, like me, at 5 or 6 years... it will be a godsend. We DO exist.

Being a PC gamer means I should be able to choose the best performance/quality settings. So no, it won't struggle at all in 2 years at 1440p... most options would still be on Ultra or High (though not all, I admit). I can manage, though, as long as the heavy hitters can be done well on my card. And my 1 GB ATI 5770 allowed just that. So I guess... the 8GB R9 390 will manage too.

As for power consumption... it is not so simple. The first thing is that idle draw, or draw under less punishing scenarios, is good on the R9 390. Another thing is that FRTC (Frame Rate Target Control) exists and can be used to great effect in many of the most popular titles of today and yesterday.
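If it helps to picture why a frame cap saves power, here is a toy sketch of the concept (FRTC itself lives in the driver, so this is only the idea, not how AMD implements it):

```python
# Toy illustration of frame rate capping: if a frame finishes ahead of the
# target frame time, sleep away the leftover budget instead of immediately
# rendering the next frame, so the GPU spends part of each frame idle.

import time

def run_capped(render_frame, target_fps=60, frames=300):
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                    # the actual work for this frame
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle time = lower power draw

# Example: a "frame" that only needs 5 ms still gets paced out to 60 FPS.
# run_capped(lambda: time.sleep(0.005))
```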

Meaning a power bill difference will exist... but it won't be a major difference to the pocket.

Just my opinion as a person who upgrades much more rarely than most of you.
 
Just a little comment on that. I've just upgraded from an HD 7950 OC. My monitor is 2560x1440. My old card, from what I can tell, is *much* faster than a 5770. I'm not sure what games you were playing if that card lasted you 6 years; you couldn't have been playing modern AAA titles at that resolution. I could barely play BF4 at 1080p, and the minimum fps were very hit and miss.

Same goes for the GTX 970. For AAA games (decent-engine first-person shooters) it just does not have the grunt to do better than 1080p at max detail settings with minimum FPS above 40. Real world... 32/64 player... it's just not going to be playable in anything competitive. I don't know about single-player games like FO4 off the top of my head, but I suspect the case is similar there too.
 
He obviously didn't mean he was playing on the highest settings; it was certainly with reduced visual quality, perhaps even medium or low settings. But playable (and enjoyable) nevertheless. Also, I believe his mention of 1440p was just an example of how the 390 and 970 would perform in the future, not that he uses this resolution on a 5770.
He is absolutely right. A LOT of people out there who play on PC don't have the "everything must be on max all the time" mentality, nor the "I need the best stuff there is to be competitive online" one. You probably saw that recently the 970 became the most popular GPU on Steam, but you can be sure that at least 90% of those people with a 970 won't upgrade any time soon, and that GPU will stay on that list for a long time. Just like the 760, which was the most popular GPU before it and still holds second place (among dedicated GPUs) on that list, meaning most people with 760s still haven't upgraded yet. The "enthusiasts" who upgrade every time something new is released are a minority of the market, and so are people who buy high-end hardware (the bulk of the market has always been the mainstream $150-to-$300 segment).
 
Just a little comment on that. I've just upgraded from an HD 7950 OC. My monitor is 2560x1440. My old card, from what I can tell, is *much* faster than a 5770. I'm not sure what games you were playing if that card lasted you 6 years; you couldn't have been playing modern AAA titles at that resolution. I could barely play BF4 at 1080p, and the minimum fps were very hit and miss.

Same goes for the GTX 970. For AAA games (decent-engine first-person shooters) it just does not have the grunt to do better than 1080p at max detail settings with minimum FPS above 40. Real world... 32/64 player... it's just not going to be playable in anything competitive. I don't know about single-player games like FO4 off the top of my head, but I suspect the case is similar there too.

You do realize there are more options than just Ultra High in a game?
I know how to optimize my graphics settings to squeeze the most out of them. I don't just click the presets (which, BTW, often are not actually the highest settings) in the menu :)

My ATI 5770 1GB model managed to play all games (and probably would still play them) reasonably well (as in not sub-30) at 1680x1050 and 1440x900 (the resolutions of my two monitors... though only the 1440x900 is left now). Sometimes on Low-Medium, others on Low, and sometimes on High (yeah, it depends on the game of course). But it served me well, and new games look great even on Low/Medium settings. If I had listened to people and got a 512MB model... it would not have been able to do so. We don't always upgrade every 2-3 or even 4 years.

So the R9 390 will serve me and people like me (modders, long-time users or multi-GPU users) much better in the long run, both at 1080p and 1440p.
 