The thing I hate about the "it can't do 4K" argument is that if you want to play at 4K, you can, just not at max settings. And I don't know what's going on with people today, but today's games look AMAZING even at low and medium settings if you tweak them right. Of course you can't max everything out if you want to play at 4K, but you can play at 4K. I know this because I play at 4K on my GTX 680 with my Philips BDM4065UC. Granted, I have to keep things mostly at medium, but I can play at 4K at 40-50 FPS semi-reliably. Now of course I don't like playing at 30 FPS and thereabouts, but it is playable.
I understand exactly what I'm doing: I'm essentially playing games at the equivalent settings of 4 years ago. 4 years ago, graphics were pretty good, and I am happy with the image quality I'm getting. And I assure you, when blown up to 40 inches, the increase in resolution is a bigger improvement than the graphics effects. I'd rather have a 40" 4K image at medium settings than a 40" 1080p image at high settings. I'm sure you have access to some type of 4K TV; try the experiment for yourself. Until you've seen it for yourself, you really have no business commenting on it. Games don't need to be played at max settings to be enjoyed. I see things like lens flare and god rays as silly gimmicks that suck up performance. Many graphics effects, I think, make games look worse, since they can't look uniform across the whole image. I'd rather have okay shadows that cover the whole area I'm looking at than see distant objects completely devoid of shadows. There is a lot more to image fidelity than just shaders. The believability of many games is often ruined by poorly implemented special effects. What I'm essentially doing is playing games at current-gen console image settings in native 4K.

So you are suggesting that playing at 4K with medium-to-high quality settings looks better than 1440p using ultra/max quality settings. What exactly do you think the point of 4K gaming is, or higher-resolution gaming for that matter? The primary reason you increase the resolution is to improve image quality; it seems very counterproductive to do this and then lower the quality settings just so you can achieve average performance.
Furthermore, I don't know if you are aware of this, but if you game at 4K and then reduce the quality settings, you are reducing the VRAM requirement to roughly that of a lower resolution. Again, the GTX 970 and R9 390 are not powerful enough to effectively use more than 3.5 – 4GB of VRAM. If Nvidia developed a GTX 970 that legitimately had 8GB of VRAM, I am willing to bet that under playable conditions it wouldn't be any faster than the current model.
SLI might be a different scenario under extreme conditions.
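To put rough numbers on that, here is a back-of-the-envelope sketch of how much memory the render targets themselves take at each resolution. The buffer counts and byte sizes are illustrative assumptions, not measurements from any particular game:

```python
# Render targets scale with resolution; texture pools (what the quality
# sliders mostly control) are what actually fill a 4GB card.
# Buffer counts below are assumed for illustration, not measured.

def render_target_mb(width, height, bytes_per_pixel=4, color_buffers=3):
    """Triple-buffered colour targets plus one 32-bit depth/stencil buffer."""
    one_buffer = width * height * bytes_per_pixel / 1024**2
    return one_buffer * (color_buffers + 1)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")

# Prints ~32 MB, ~56 MB and ~127 MB respectively: going from 1080p to
# 4K adds roughly 95 MB of targets, while dropping from ultra to medium
# texture quality can free a gigabyte or more.
```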
Look, I know you and I fight a lot, but I REALLY want you to do the experiment. Having this expansive screen in front of your face does a lot to create an immersive experience; I'd say the immersion you gain outweighs the special effects you lose. 4K gaming is possible on a budget, and I'm sure many budget-minded readers would be interested in reading about it. Sure, this has gotten a little off topic, but the reason it relates to the 970 is that, because of the memory flaw, it will have a difficult time handling the larger resolutions at these settings. I'm a really big fan of the increased screen real estate that 4K allows. The 30" 4K panels, IMO, are silly, at least while GUI scaling is as poor as it is.
My situation is that I bought the screen mainly for image quality and screen real estate to work on; gaming was an afterthought. I was thinking, "if I really want to play games, I'll just drop it to 1440p or 1080p." I was delighted to find that with minimal tweaking of settings, I can game in 4K. You aren't going to be winning any pro matches of CS:GO, but playing single-player games like Fallout 4 and Skyrim is absolutely amazing. The higher resolution also benefits me greatly in AoE II HD and Civ V: you can see a wider area of the battlefield, and it helps you see options you weren't really aware you had before. You can get a decent-quality 40" 4K monitor for around $400, and I think many people avoid adopting 4K because they think it isn't possible on their current hardware. I'm saying IT IS possible. 4K brings out detail in the "lower" settings you never noticed before.
If I can do this stuff on a GTX 680, imagine the experience people would have on a 780 or a 980. Dropping the settings isn't the horrible thing that marketing teams want us to believe it is, at least not anymore. New games at settings similar to older games often look and perform better; game engines have had time to mature and gain performance. I guess the really important thing here is that everyone keeps saying "4K gaming isn't here yet," and I've been doing it for months on a 680.
All of that said, I am fully aware that my current hardware is inadequate, but I can still see the benefit of and enjoy a 4K experience. I do intend to upgrade to the GTX 1080, or whatever the respective Pascal model will be called. I also believe that many other people could enjoy a 4K experience, as long as they drop the settings and the high-end shaders. This experiment could make for a very interesting article. You should try it, record your results and post them. Even if you disagree with me, I'm sure other people would like to read about the conclusion.

I have a Samsung 55” 4K TV, not sure why that matters.

I fail to see how any of this has anything to do with the GTX 970 and its VRAM capacity limitation, especially if you are playing older games such as Crysis 2, The Witcher 2 and Dragon Age 2, for example. The VRAM requirements for those games, even at 4K, are very low.
I don't want to have the "4K at lower quality settings" discussion, least of all here. I strongly feel 4K gaming should only be tackled by high-end hardware right now, and lowering the quality settings to medium defeats the purpose entirely.
What I want to know is why you think the GTX 970 will struggle here or present any kind of problem when there is absolutely no evidence to support this.
I mean we have tested both the 390 and 970 extensively at 4K in the latest games using maximum quality settings…
https://www.techspot.com/review/1024-and-radeon-r9-fury-x/page2.html
The 970 and 390 look very evenly matched to me; the 970 is even faster in quite a few of the games.
Hello, Steve. I've been a lurker on this site for a long time, and even posted occasionally when guest posts were available. I figured today might be a good day to make my account and address some issues you're raising.

I have to assume you are talking about the GTX 970? Even so, it makes no sense. What is with you guys and your obsession over VRAM? Firstly, can you please provide some evidence where the GTX 970 suffers from a lack of VRAM under playable conditions. It is outright faster than the 390 at 1080p due to AMD's driver overhead issue and matches the 390 at 1440p. It consumes less power and overclocks much better. It also costs the exact same amount.
(...)
The GTX 950 is an obvious choice here, so I am not even going to bother arguing the point. The R9 270 has been heavily handicapped by poor driver optimisation for the latest games and, well, it is a discontinued product.
(...)
No, the 390 isn't faster; it has more VRAM than it can use, and it's the same price. The 390 will only improve with time? Huh, how is that working out for the 200 series?
> I mean we have tested both the 390 and 970 extensively at 4K in the latest games using maximum quality settings…
> ...
> The 970 and 390 look very evenly matched to me...

Since TechSpot already has the data, I went ahead and calculated the summary.
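For what it's worth, here is a minimal sketch of how a summary like that can be computed from per-game results. The fps values below are made-up placeholders for illustration, not TechSpot's actual numbers:

```python
# Hypothetical summary calculation over per-game 4K results.
# The fps values are placeholders, not TechSpot's data.
from statistics import geometric_mean

fps_4k = {  # game: (GTX 970 fps, R9 390 fps)
    "Game A": (31, 33),
    "Game B": (28, 26),
    "Game C": (35, 34),
}

for idx, card in enumerate(["GTX 970", "R9 390"]):
    # A geometric mean weights every game equally, so one outlier
    # title can't dominate the average.
    avg = geometric_mean(v[idx] for v in fps_4k.values())
    print(f"{card}: {avg:.1f} fps average at 4K")
```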
I look at my calendar, and it says it is January. This article is false.

That's because you haven't read his August 2016 article yet. You should hop in your time machine and go read it - it's brilliant!
Ok.

This is a follow-up to our annual graphics card roundup (Oct/15), and the plan is to keep it updated throughout the year as new GPUs are released. Think of it as your one-stop resource for what GPU to buy at any given time.
As noted in the article, we don't expect new generation GPUs to come out until Q2 or Q3.
It never ceases to amaze me how many toxic comments are thrown up on articles like this. Calling the TechSpot team "shills" because they recommend mostly Nvidia-based cards this time around is pathetic. Who cares what company the graphics card is from? I, as a consumer, look at the best value for the money, not whether it is AMD or Nvidia. The fanboys need to stop. My god, it's like all the boobs getting into a shouting match about their Chevy being better than a Ford. Who gives a flying crap.

Absolutely. My previous 4 cards were AMD, and I converted to Nvidia because AMD falls short of the mark. Simple. The article largely just reiterates the state of play in the market.
My last GPU lasted me 6 years: the ATI 5770. If I had listened to people saying back in the day that the 1GB model was somewhat pointless... well, it would have bitten me in the backside. That 1GB of VRAM allowed it to play even games like The Witcher 3. It is the same thing here. I am certain that 2-4 years from now there will be games whose texture and model quality sliders will benefit from more than 4GB of VRAM. And for those who, like me, upgrade even more slowly, at 5 or 6 years... it will be a godsend. We DO exist.

Just a little comment on that. I've just upgraded from an HD7950 OC. My monitor is 2560x1440. My old card, from what I can tell, is *much* faster than a 5770, so I'm not sure what games you were playing if that card lasted 6 years. You couldn't have been playing modern AAA titles at that resolution; I could barely play BF4 at 1080p, and the minimum fps were very hit and miss.

The same goes for the GTX 970. For AAA first-person shooters on a decent engine, it just does not have the grunt to do better than 1080p at max detail settings with minimum FPS above 40. In the real world, in 32/64-player matches, it's just not going to be playable in anything competitive. I don't know about single-player games like FO4 off the top of my head, but I suspect the case is similar there too.

He obviously didn't mean he was playing at the highest settings; it was certainly with reduced visual quality, perhaps even medium or low settings, but playable (and enjoyable) nevertheless. Also, I believe he was saying 1440p was just an example of how the 390 and 970 would perform in the future, not that he uses this resolution on a 5770.
> ...meaning most people with 760s still haven't upgraded yet.

I'm still rocking the 660, and couldn't be happier.