Gaming at 4K: GeForce GTX 970 SLI Performance

Steve

Staff member


Offering 16% more performance than the Radeon R9 290 at 2560x1600 while costing over 20% less, Nvidia's GeForce GTX 970 positioned itself as an ideal candidate for multi-GPU 4K gaming rigs last month. And as expected, AMD was quick to respond with price cuts.

Along with the R9 290 dropping from $399 to $299, the R9 290X moved from $549 to $399 and the R9 285 can now be found for as little as $229. All this means folks looking to game at 4K now have some pretty capable multi-GPU options for as little as $600, which is great news considering a single GTX 980 can't quite handle 4K gaming in the most demanding titles.

For a slight premium over the GTX 980, the R9 290 or GTX 970 can be doubled up for $600 or $660 respectively -- either should outperform a lone GTX 980, though we expect the GTX 970 pair to come out ahead of the R9 290s in both performance and efficiency.
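To put that value math in concrete terms, here's a minimal Python sketch of the price/performance comparison implied above. The per-card prices come from the $600 and $660 pair totals; the relative performance numbers are illustrative assumptions based on the review's 16% figure, with ideal multi-GPU scaling assumed:

```python
# Back-of-the-envelope price/performance for the dual-GPU options above.
# Relative performance is an illustrative assumption, not benchmark data:
# single R9 290 = 1.00, single GTX 970 = 1.16 (the 16% figure from the
# review), and both pairs are assumed to scale equally well -- which real
# SLI/CrossFire setups won't always match.
setups = {
    "2x R9 290":  {"price": 2 * 299, "perf": 2 * 1.00},
    "2x GTX 970": {"price": 2 * 330, "perf": 2 * 1.16},
}

for name, s in setups.items():
    print(f"{name}: ${s['price']} total, "
          f"{s['perf'] / s['price'] * 100:.3f} relative perf per $100")
```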

Read the complete review.

 
Hmm, 600 bucks for a setup that achieves the ideal 60 FPS in two games...

With even higher resolutions coming, we need a revolution in graphics hardware.
 
Thanks for another great review Steve, and a shining example of how the decreased SMM (streaming multiprocessor) count in the 970 directly affects the texture fill rate relative to the full-die 980.
 
But you can't stop progress in one field just because the other one can't keep up. They're gonna be here soon, whether we need them or not, simply because display manufacturers have a lot of leeway in terms of where they can go. The disparity seems inevitable in the end.
 
SSDs are probably a good example; they've been a bit stale of late because we haven't really moved on from SATA III properly.
Thanks for another great review Steve, and a shining example of how the decreased SMM (streaming multiprocessor) count in the 970 directly affects the texture fill rate relative to the full-die 980.
Agreed, also, is the 980 the full die? Or are they going to release a 980 Ti variant with some more SMMs enabled? Just wondering, I'd hold onto my money if there's even a sliver of a chance xD
 
Agreed, also, is the 980 the full die? Or are they going to release a 980 Ti variant with some more SMMs enabled? Just wondering, I'd hold onto my money if there's even a sliver of a chance xD

Honestly I really don't know; half the time I can't work out what Nvidia is doing a week before a new product release, so I am not even going to try to predict months into the future.

That said, big Kepler (GK110) featured up to 36% more transistors on a 40% larger die than GM204 while using the same manufacturing process. That reason alone has people wondering if a bigger, faster Maxwell card is on the horizon -- a GTX 990, maybe.

Nvidia could wait till AMD fires the next shot with their 300 series and then answer with a bigger, more complex Maxwell GPU.

Either way, I can't imagine waiting for that to possibly happen would be a wise move. Given the current state of the GPU market, any new higher-end Maxwell GPUs are probably going to be very pricey, much like the GTX Titan was.
 
Yeah, good point. I'm not actually in the market for a GPU (my 780 is serving me fine with a little overclock) but I build PCs for people, so it's always interesting to see what others reckon Nvidia will do. I can see that being their plan though: once AMD comes out with something, open Maxwell up with a beefier version (maybe on a smaller process, 22nm or less?) and watch competition at its best.
 
Agreed, also, is the 980 the full die?
It certainly is. Nvidia's GM 204 whitepaper (PDF) confirms it as such.
Or are they going to release a 980 Ti variant with some more SMMs enabled? Just wondering, I'd hold onto my money if there's even a sliver of a chance xD
GM 204 is as good as it gets now, aside from maybe a silicon revision and a power limit boost (which should be available via custom sub-zero BIOSes fairly soon).
The 980 Ti will likely be a salvage GM 200 part. There's enough precedent for the naming convention (the GTX 560 Ti 448 Core was GF 110 while the GTX 560 Ti was GF 114, and the GTX 660 Ti was GK 104 while the GTX 660 was GK 106, for example).
If you're thinking about moving up the graphics food chain, then (unless you have dire need or uncontrolled OCD) it might be better to wait for GM 200. It will likely be more expensive again, but it's reasonable to assume it should offer 50-70% better performance, especially at higher resolutions. Even if the higher price tag doesn't appeal, you'll certainly have a better choice of GTX 980s on the resale market from the serial upgraders.
Once AMD come out with something, Open Maxwell up with a beefier version (maybe on a small die?(22nm or less?) and watch competition at it's best.
GM 200 is on 28nm...
[Image: GM200 customer sample]

...although there's a reasonable chance the design could be ported to a smaller process node. TSMC's 20nm planar doesn't seem to offer much benefit (power envelope/transistor density) yet carries significant cost increases over 28nm. It seems more likely that GM 200 is a candidate for 16nm FinFET, where a die shrink could offset the wafer costs. In any event, if you want to see a GeForce-branded GM 200 you'd better hope that AMD gets Bermuda and Fiji out the door. Without a threat to the GTX 980, Nvidia doesn't have a lot of incentive to sell big Maxwells as consumer boards.
 
So in a nutshell, there's no point going to a 4K monitor now, because not even SLI setups can run games effectively.
 
Keep in mind that some of these settings might be a bit extreme when gaming at 4K - like the AA settings. I'm not sure as I've never gamed at 4K before, but I've gotta assume you don't need as much at that resolution. 4K gaming might be doable now with SLI 980s.
 
Yep. Some judicious dialling down of the image quality settings makes a big difference. I'd also note that (motherboard permitting) tri-SLI GTX 970s at ~$1K will pretty much eliminate that compromise.
 
Keep in mind that some of these settings might be a bit extreme when gaming at 4K - like the AA settings. I'm not sure as I've never gamed at 4K before, but I've gotta assume you don't need as much at that resolution. 4K gaming might be doable now with SLI 980s.

It is a personal preference but I feel you still need virtually the same AA settings at 4K resolutions as you do at 2560x1600 using a 30" or larger screen. Without AA enabled there are still plenty of jaggies to be seen.
 
BF4 and Crysis 3 did what I expected them to do to these cards.
That being said, I will admit that overall, yes, it's playable; you just might need to notch it down a tad for a certain game here or there, but it will still look amazing either way.

Thanks for doing this review, it clears up many questions.
 
I feel you still need virtually the same AA settings at 4K resolutions as you do at 2560x1600 using a 30" or larger screen.
Does pixel density really matter? Are you suggesting that with lower pixel densities, AA is more important? Makes sense, just wondering where you were going with the mention of '30" or larger screen'.
 
Does pixel density really matter?
In a word, yes. Which provides the better image: 1920x1080 on a 24" screen or on a 65" panel (other parameters being equal, of course)?
Are you suggesting that with lower pixel densities, AA is more important? Makes sense, just wondering where you were going with the mention of '30" or larger screen'.
Can't speak for Steve, but the proviso of screen size is needed as a qualifier, since (for example) a 30" 2560x1600 screen has almost exactly the same PPI as a 22" 1920x1080 screen (100.63 vs 100.13), so aliasing effects would be similar for both. For a QFHD (3840x2160) panel of 28", which seems to be fairly representative for monitors, the PPI is 157.35, but increase the screen to 40" and you're much closer to the earlier examples (110.15 PPI).
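For anyone who wants to sanity-check those figures, the arithmetic is just the diagonal pixel count divided by the diagonal size in inches; a minimal Python sketch, using the sizes and resolutions from the examples above:

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: length of the pixel diagonal divided by the physical diagonal."""
    return hypot(width_px, height_px) / diagonal_inches

# The examples discussed above:
print(f'30" 2560x1600: {ppi(2560, 1600, 30):.2f} PPI')  # ~100.63
print(f'22" 1920x1080: {ppi(1920, 1080, 22):.2f} PPI')  # ~100.13
print(f'28" 3840x2160: {ppi(3840, 2160, 28):.2f} PPI')  # ~157.35
print(f'40" 3840x2160: {ppi(3840, 2160, 40):.2f} PPI')  # ~110.15
```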

I don't think you can do away with AA just because of the increased resolution, but the effects are lessened (depending on dot pitch, of course, and how much aliasing affects the user in the first place).
It is also a reason why downsampling has gained an immediate following (BTW, the new Nvidia driver (344.48) enables downsampling (DSR) on Kepler and Fermi cards).
 
I really wonder what it will take GPUs to offer a better 4K experience. Is it more memory bandwidth? More cores? @dividebyzero any ideas?
Probably both. The bandwidth issue is presently being alleviated by colour compression, which is likely to stay with us as a regular feature. HBM offers a much wider interface but is limited in its first iteration to 4GB (4x1GB stacks). A higher core count is also a given, which goes hand in hand with transistor density increases (and lower voltage requirements/higher core frequencies) from smaller process nodes, as well as more attention to on-die cache resources (a major reason for GM 204's performance jump over the similarly sized GK 104).
These advances are just increments of GPU evolution that have been in effect since vertex and pixel shaders first shared silicon. Bringing the extra pixels to the screen typically lags 1-2 generations behind the initial adoption of these higher resolutions; the GPU business is somewhat more complicated than the monitor industry once you take into account manufacturing process and additional feature set requirements.
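As a rough illustration of why those extra pixels hurt: 3840x2160 is 4x the pixels of 1080p and 2.25x those of 1440p, and every one of those pixels has to be shaded and written at least once per frame. A minimal sketch of that floor, where the 4-byte RGBA frame buffer and 60 FPS target are assumptions for illustration:

```python
# Back-of-the-envelope pixel throughput at common resolutions.
# Assumes a 4-byte RGBA frame buffer and a 60 FPS target. Real rendering
# touches each pixel many times (overdraw, MSAA, post-processing), so
# treat these figures as a floor, not as total memory traffic.
BYTES_PER_PIXEL = 4
TARGET_FPS = 60

resolutions = {
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),
}

base_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    gb_per_second = pixels * BYTES_PER_PIXEL * TARGET_FPS / 1e9
    print(f"{name}: {pixels / 1e6:5.2f} MPix ({pixels / base_pixels:.2f}x 1080p), "
          f"~{gb_per_second:.1f} GB/s for one write per pixel per frame")
```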
 
In a word, yes. Which provides the better image: 1920x1080 on a 24" screen or on a 65" panel (other parameters being equal, of course)?
I get that. I was probably visualizing someone playing 4K w/AA on 1080P at the time of my question, and questioning how AA could possibly make 1080P look better at lower pixel densities. As if it doesn't matter what pixel density is used, AA still wouldn't make a difference.

To me AA is a process to enhance game texture deficiencies. Much the same way as nVidia's new DSR for older game titles. These deficiencies can only be enhanced so far, before enhancements no longer render improvements. I see pixel density as irrelevant, because the enhancements are predetermined by resolution, and regulated by GPU power. These enhancements will be a goal no matter the pixel density. I understand pixel density is important, I just don't see it as a deciding factor in game settings.
 
I get that. I was probably visualizing someone playing 4K w/AA on 1080P at the time of my question, and questioning how AA could possibly make 1080P look better at lower pixel densities....
I really don't know what you're driving at WRT a catch-all argument. Pixel density and AA are mostly two different discussions. Pixel density determines overall quality of the presented screen space, while AA aims to make objects presented within that screen space more natural in appearance. Rather than write a screed of text, I'll just point you towards Anandtech's DSR review. I'd also note that unless you have failing eyesight, such things as sub-pixel rendering are much more apparent on large panels at standard resolutions - particularly noticeable with text.
As if it doesn't matter what pixel density is used, AA still wouldn't make a difference.
So you're telling me that at, say, 2560x1440, aliasing is equally noticeable on a 5.1" Galaxy S5 LTE-A (577 PPI) as on a 32" monitor (92 PPI)? ...and they'd both be equally affected by the same level of AA? Just for the record, I'm not signing up for that newsletter.
To me AA is a process to enhance game texture deficiencies. Much the same way as nVidia's new DSR for older game titles. These deficiencies can only be enhanced so far, before enhancements no longer render improvements.
:confused: Nobody, and I'll repeat this for effect, NOBODY is saying that it's a perfect system. What AA does is improve on deficiencies in basic pixel rasterization. I hope you aren't taking the stance that just because there is a finite limit to practical AA techniques, you're going to argue that they're irrelevant? I don't think you'll find many people who would argue that just because 8x SSAA might be a practical limit for AA it isn't worth applying if the GPU can handle the workload.
I understand pixel density is important, I just don't see it as a deciding factor in game settings.
I don't believe I said it was, and I also don't believe anyone else stated otherwise. Steve stated it was "a personal preference" (the part you didn't quote), and I used what I thought was a sufficient amount of qualification.
Me...I said:
I don't think you can do away with AA just because of the increased resolution, but the effects are lessened (depending on dot pitch, of course, and how much aliasing affects the user in the first place).
 
I hope you aren't taking the stance that just because there is a finite limit to practical AA techniques, you're going to argue that they're irrelevant?
Let's just say I was asking a question (probably poorly worded, as always) before taking a stance. Thanks for your insight as always. :)
 
Great review @Steve, nice to see how things stack up for the 4K market at the moment. Seems to be in a good place right now from both camps (finally), as you can make a pretty intimidating 4K gaming rig without completely blowing your machine's budget out the window. It's something I do right now, though it's still a challenge from time to time without playing with the settings.
 
I've been playing XCOM: Enemy Unknown with high detail settings on my new 4K monitor using a lowly GTX 670... and it looks and plays FLAWLESSLY. Amazing, in fact. Possibly that's partly due to the G-Sync support on this gem of a screen, I don't know.

Here's my low-tech strategy: buy all the games you want to play but keep about two years' worth of "backlog", so that the game you are currently playing is always about two years behind the generation of your graphics card. It helps to be a strategy-game type of person instead of an FPS one.
 
Just got my second 970 today (a STRIX), but for some reason they don't like my SLI bridge. Gonna have to get another one before I can take full advantage... for now the second 970 is just a PhysX card.
 