GPUs and micro stutter

boagz

Hi all. I was wondering what everyone's experience has been regarding the best GPU for the smoothest performance. I keep hearing that GPUs in SLI/Crossfire suffer from microstuttering, and that it can really distract from the experience. However, I've also heard that this is only a major issue with lower-end cards, and that higher-end cards like the GTX 780 or Titan in SLI work very well. What are your thoughts, guys?

This question stems from the fact that I have an OC'd GTX 770 right now for a 1080p monitor. I guess I'm wondering, if I want to upgrade in the future, should I try another 770 in SLI or just go with a single GTX Titan or 690 (even though I've heard the 690's multi-GPU setup can experience microstuttering as well, however much that is). I know everything will give me high fps, but I really want a smooth ultra-setting experience with future games like BF4.

P.S. Sorry if I seem like an Nvidia fanboy. I also do video editing, which I've heard Nvidia's CUDA cores are good for. Also, let's say money isn't a problem. These are my computer specs:

i7 4770K
750W XFX power supply
Asus Sabertooth Z87 mobo
Cooler Master HAF 912 ATX case
 
Micro stuttering has largely been addressed by Nvidia. Currently, Nvidia cards in 2-way SLI do a great job of both scaling and keeping stuttering quite low.

ATI cards have been heavily derided these past few years, since Nvidia brought the issue to light and fixed it on their own cards, but the latest 13.8 beta drivers show that ATI has come up with a decent solution to stuttering.

TL;DR: it's not really a problem any more.
And yes, in the future a second GTX 770 is probably going to be a good choice.
 
A second 770 is a better option than a Titan; Nvidia's drivers do a good job of reducing microstuttering. BTW, a 690 is two GPUs on a single board, so it will have the same issues as an SLI setup.

From my experience (7970 Crossfire), microstuttering was kind of exaggerated even before the 13.8 beta drivers came out. But I guess some people notice it more than others.
 
Okay, thanks guys. Do you think, though, that the 2GB of VRAM on the 770s will hold me back much? Is it better to get more VRAM? What's VRAM important for?
 
2GB of VRAM at 1080p is fine. You'll only need more than 2GB for multi-monitor setups, e.g. 5760x1080, and even then most benchmarks will show little difference between a 770 2GB SLI setup and a 770 4GB SLI one.
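To put rough numbers on why resolution alone doesn't blow the budget, here's a back-of-the-envelope sketch (the bytes-per-pixel and buffer counts are simplifying assumptions; in practice textures and render targets, not the swap chain, eat most of the VRAM):

```python
# Back-of-the-envelope swap-chain maths (simplifying assumptions:
# 4 bytes per pixel, triple buffering; real VRAM use is dominated
# by textures and render targets, not the swap chain itself).
def swap_chain_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 1024**2

for w, h in [(1920, 1080), (5760, 1080)]:
    print(f"{w}x{h}: ~{swap_chain_mb(w, h):.0f} MB of a 2048 MB card")
```

Even at 5760x1080 the display buffers themselves are a rounding error; it's the higher-resolution assets that push past 2GB.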
 
Okay, thanks guys. Do you think, though, that the 2GB of VRAM on the 770s will hold me back much? Is it better to get more VRAM? What's VRAM important for?
The answer to this question will heavily depend on the number of displays and resolutions you want to use.

You will want more VRAM for larger resolutions. And with XFire/SLI, the card with the least VRAM dictates how much VRAM is used on the other cards. VRAM across cards does not add together; it is cloned across all cards so that each card can process its own portion of the workload. Unfortunately this means each card's memory holds copies of data being worked on by the other cards, which makes memory usage quite inefficient. If you plan on XFire/SLI, it is my opinion that you should choose cards with greater memory capacity.
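Here's a minimal sketch of that idea as I understand it (a toy model for illustration, not real driver behaviour):

```python
# Toy model: in SLI/CFX the working set is mirrored to every card,
# so the usable pool is the smallest card's VRAM, not the sum.
def effective_vram_gb(cards_gb):
    return min(cards_gb)

print(effective_vram_gb([2, 2]))  # -> 2: two 2GB cards still behave as 2GB
print(effective_vram_gb([4, 2]))  # -> 2: a 4GB + 2GB pair is capped at 2GB
```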

Special thanks to dividebyzero for the explanation. Hopefully I didn't misinterpret what he was explaining.
 
You will probably be able to achieve Ultra on BF4 @ 40-60 fps at launch. It will become more stable within 1-3 months once drivers mature.
 
Hopefully I didn't misinterpret what he was explaining.
That's pretty much it.
Each card renders its own frame independently - say GPU #1 renders frames 1, 3, 5, 7, 9..., while GPU #2 renders frames 2, 4, 6, 8, 10...
All output goes via the primary card.
All the frame rendering is done in the back buffer of the VRAM. Once a frame is complete, the back buffer becomes the front (display out) buffer, and what was the front buffer for the previous frame becomes the back buffer for the succeeding frame. With triple buffering you have two back buffers instead of one, so the swap involves:
Back buffer #2 ---> Back buffer #1 ---> Front buffer (which, after sending its frame out, becomes back buffer #2). Pretty straightforward. Card #2 (and 3 and 4) in an SLI/CFX setup does exactly the same thing, except that the display output moves to the primary card's front buffer and is sent to the monitor from there.
And a quick visual representation using my dodgy MS Paint skills
[attached image: GPU_render_display.jpg]
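And for anyone who prefers code to my Paint skills, here's the same rotation as a toy loop (purely illustrative - not any real graphics API):

```python
from collections import deque

# Three physical buffers take turns in the roles described above.
buffers = deque(["buf_A", "buf_B", "buf_C"])  # roles: [front, back #1, back #2]

for frame in range(1, 7):
    gpu = 1 if frame % 2 else 2    # AFR: GPU #1 renders 1, 3, 5..., GPU #2 renders 2, 4, 6...
    finished = buffers[1]          # frame just completed in back buffer #1
    buffers.rotate(-1)             # back #2 -> back #1 -> front -> back #2
    print(f"frame {frame}: GPU #{gpu} finished {finished}; now displaying {buffers[0]}")
```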
 
The thing is, micro-stuttering was blown out of proportion and has been the main scare point against multi-GPU solutions for a while now. Most of the time you don't have to worry about it; the "issue" will not be noticed. Putting low-end GPUs in SLI/CFX is the only time you would really notice something, due to the demands of very high-end games (normally it only shows up when you have to stress the GPUs to their maximum to reach ~60 FPS).

There are many charts showing "stuttering" for various games, but with drivers at this point, you're pretty safe no matter what choice you make. A single GPU is a great solution for future-proofing more than anything, and it sidesteps any chance of this "issue". However, when it comes down to price-to-performance, you can buy something like two GTX 770s and get smooth FPS in BF4, or the same on a GTX 780 (minus about 10-15 frames per second on average), and the extra performance you get from the pair of 770s over one 780 is worth the price difference.
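To put rough numbers on that price-to-performance point (the prices and fps below are my own illustrative assumptions, not quotes):

```python
# Illustrative only: assumed ~2013 street prices and a made-up fps gap
# in the "minus about 10-15 fps" range described above.
price_770, price_780 = 400, 650          # USD per card (assumption)
fps_770_sli, fps_780 = 75, 62            # hypothetical BF4 averages (assumption)

extra_cost = 2 * price_770 - price_780   # $150 more for the SLI pair
extra_fps = fps_770_sli - fps_780        # ~13 fps more on average
print(f"${extra_cost} extra buys ~{extra_fps} fps (~${extra_cost / extra_fps:.0f} per extra fps)")
```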

Everyone's opinion on micro-stuttering is different, and everyone's benchmarks/charts change all the time; really, it's almost unnoticeable in most cases. Also, the third and fourth cards in any SLI/CFX setup (if you add more) begin to be used to help counter this issue. So really, no matter what you choose (unless you go really low-end), you shouldn't have to worry, especially with the options you listed above.

At LAN parties, I have friends whose machines consist of two or more 580s, 6950s, 7970s, 690s, the works. We all play BF3 and lots of other games at them, and the only difference in performance that is ever seen is when the FPS goes below 50.
 
...There are many charts showing "stuttering" for various games, but with drivers at this point, you're pretty safe no matter what choice you make.
A couple of points:
1. Microstutter is a real phenomenon and will always occur to one degree or another so long as AFR is used and SLI/CFX lacks a unified memory pool. The issue isn't whether or not it occurs, but at what threshold the user notices and is bothered by it. This is totally subjective - some people don't notice (or if they do, don't care), while at the other end of the scale even a minimal amount of stutter can be a dealbreaker for those who can perceive it. The same can be said for screen tearing, antialiasing levels (jaggies ruin the immersion factor for many gamers), input lag, and colour (personally, I can't spend any length of time using a TN panel). What affects you will not necessarily be representative of others.

2. Drivers are only as good as the ongoing effort put into keeping up with current game engines. If, for argument's sake, a game engine is tweaked for a particular game, a runtime routine might change the refresh interval for AI actions (say, pathfinding), or add a different/greater range of responses (hunt, cover, survival) that add latency to the render. Until frame metering moves from software to hardware (AFAIK the GTX 690 has the only hardware-based frame metering - see the second-to-last paragraph of the linked article - which likely accounts for it being smoother than SLI'ed 680s/670s), any new game, whether using a new or pre-existing engine, can introduce micro stutter.
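As a rough illustration of what software frame metering does (a conceptual sketch only; real drivers meter frame presentation below the API, not in application code):

```python
import time

TARGET_INTERVAL = 1 / 60  # aim for an even ~16.7 ms between presented frames

def paced_present(last_present):
    """Hold a finished frame until a full interval has elapsed, so a
    fast frame can't land right on the heels of the previous one."""
    elapsed = time.perf_counter() - last_present
    if elapsed < TARGET_INTERVAL:
        time.sleep(TARGET_INTERVAL - elapsed)  # delay the early frame
    return time.perf_counter()                 # timestamp of this present

# usage: last = paced_present(last) each time a frame finishes rendering
```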
 
A couple of points:
1. Microstutter is a real phenomenon and will always occur to one degree or another so long as AFR is used... 2. Drivers are only as good as the ongoing effort put into keeping up with current game engines...

Okay, then do you think the safer option would be a GTX 690? Are drivers for single GPUs usually better or more consistently updated than drivers for SLI configurations?
 
Okay, then do you think the safer option would be a GTX 690? Are drivers for single GPUs usually better or more consistently updated than drivers for SLI configurations?
It is my understanding that the 690 is SLI on a single card. Dual GPUs need XFire/SLI configurations to work together, regardless of whether they are on one card or two.
 
Okay, then do you think the safer option would be a GTX 690? Are drivers for single GPUs usually better or more consistently updated than drivers for SLI configurations?
As Clifford said, 690 is still SLI.
If you can afford it, one powerful card is generally better than two lower-specced cards. Firstly, you can always add a second of the more powerful card further down the track should it be required. Secondly, driver issues are nearly non-existent with single cards in comparison to multi-GPU setups.
Having said that, I wouldn't talk you out of SLI'ing a couple of GTX 770s (or similar) - although that is a little overkill for 1920x1080 - it depends on what kind of tolerance you have for glitches in gameplay. If slight, occasional stuttering isn't likely to ruin the gaming experience for you, then SLI is definitely an option (I've had multi-GPU setups more often than single, and that was with worse stutter issues than are present now). What I would suggest is that you hold off on a purchase until AMD's announcement on the 25th of the month. Even if you have no intention of buying an AMD card, there is a strong possibility that prices will fall for both Nvidia and AMD cards between that date and the new series being available in retail (some sources say mid-October).
 
I'll agree with holding off on buying until the AMD announcement this month; however, micro stuttering only recently (in the last 2 years) became a big deal. The fact is, while some form of it may occur, in most cases it is not visible to the naked eye except in extreme situations (like when Crysis 2 came out, for instance).

SLI/CFX are great bang for the buck, and both card companies now offer some form of software frame metering as it is. The stuttering issue only became big recently, and only to some people; if this were such a humongous deal, there would not be so many 2/3/4-way CFX or SLI setups, because people would be horrified by the horrid performance everyone claims.

I would say, dude, wait and pick out something based on how long you're willing to wait to upgrade. If you want to wait a few years and then scrap it and start over, get the best system you can now using multi-GPU. If you like to upgrade later, get one powerful GPU and grab a second later.

I have also had multiple multi-GPU setups (all Nvidia before this one), and this is my first 4-way setup. They have all been fine, and I'd never heard of/seen any issues with these types of setups until the last 2 years.
 
No one is mentioning frame queuing and how Nvidia thoroughly trumps AMD at the moment. At least, this is what the last reports I've seen tell me.
 
Wait so DBZ, can you have GPU 1 as a 4GB GPU and GPU 2 as a 2GB one? Or do they need to be the same VRAM amounts?
 
I guess my thing is, I've been a console user my whole life, so PC gaming is pretty new to me. I'm used to having games run buttery smooth (minus a few glitches here and there), so it's pretty distracting for me when microstuttering or something along those lines occurs, since I'm not used to it. But I do like the fact that games are better technically on PCs (I always hated watching a video from E3 or something and saying "wow, that looks amazing!" only to find out it was the PC version and consoles won't even come close to looking like that. Very annoying, lol). So I just want the smoothest gameplay I can get, whether that be a single GPU or multiple, on ultra settings. Sounds like, from what you guys are saying, either option is viable, though maybe with fewer issues on single GPUs, even though Nvidia and AMD are getting better at dealing with microstuttering.
 
As Clifford said, 690 is still SLI.
If you can afford it, one powerful card is generally better than two lower-specced cards... What I would suggest is that you hold off on a purchase until AMD's announcement on the 25th of the month...
So then, do you know if the 690 has the same kind of driver issues as other SLI configurations, given that it has the hardware-based frame metering that other cards don't? Just want to know, if I do go with one card, whether I should do the Titan or the 690.
 
Wait so DBZ, can you have GPU 1 as a 4GB GPU and GPU 2 as a 2GB one? Or do they need to be the same VRAM amounts?

It will work; it just only allows the lower VRAM amount to be used.

The 690 will be more powerful, but if you want to go with a single GPU, grab a 780. I don't recommend the Titan because it's way overpriced. I highly doubt you will be disappointed either way; like I said, the whole "issue" is overblown for both sides.
 
GPU =/= graphics card. GPU = Graphics Processing Unit. The GTX 690 is a dual-GPU card; the Titan is a single-GPU card.

The GTX 690 is dual GPUs joined via internal SLI bridging, so it will have the same microstutter issues as any other multi-GPU setup.

You personally haven't experienced microstutter yet. Also, you say you're used to buttery smooth play on consoles, so I don't think microstutter or any other inherent issue will affect you: consoles play at 30fps (far from "buttery"), and the last few years (especially) have been plagued by fps drops in demanding scenes, making for choppy gameplay. If you can't see that, you probably won't see anything along the lines of microstutter.

What you should do: wait for AMD to release new cards and see if the 770 drops a decent amount. If yes, grab another one, as you want Ultra at 1080p.
 
however micro stuttering was only recently (in the last 2 years) a big deal.
That might be your experience, but it certainly isn't indicative of the wider community. What has actually happened is that mainstream benchmarking has, in the last two years, become able to quantify microstutter/frame pacing (FRAPS, FRAFS, real-time gameplay capture). Before this time, observations were deemed anecdotal and apocryphal - something put about by Nvidia card users to slate Crossfire, and/or tall poppy syndrome aimed at the multi-GPU enthusiast.
Having been an SLI and Crossfire user for most of my PC-owning days, I can point you to a great many forum threads and articles - mainly because I was a participant in many of them. As a "for instance", here's PCGH's HD 5850/5870 Crossfire vs GTX 285 SLI review from 2009... first page: Introduction - Micro Stuttering. Also from 2009, one of Overclockers.com's articles on the same subject. Just for a change of pace, here's a thread from April 2008 at Anandtech.

The only real difference now is that the results are measurable. Once that happened, the phenomenon transitioned from subjective to empirical - i.e. verifiable proof. Proof that has a direct impact on the gameplay experience (FWIW, many European sites were already reviewing on this basis rather than average/min/max fps, notably ComputerBase, HT4U, PCGH, Hardware.info, iXBT, and Hardware France)... and of course, all those people who had already known about the issue but had been shouted down now had their collective "I told you so" moment... which of course fed the neverending Nvidia vs AMD forum war - a war that is guaranteed page-click bait, which in turn led to most sites adopting the benchmarking metric, and to the aforementioned rabid F5'ing by the faithful of the GPU jihad.
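For what it's worth, the metric itself is simple once you have the frame-time dumps. A toy example with made-up numbers (not real captures) shows how the same average fps can hide very different delivery:

```python
# Two runs with the same ~60 fps average: one even, one alternating
# fast/slow frames - the classic AFR microstutter signature.
smooth = [16.7] * 6                          # ms per frame
stutter = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]  # also averages ~16.7 ms

def stutter_score(frame_times_ms):
    """Mean absolute change between consecutive frame times."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

print(stutter_score(smooth))   # 0.0 ms  - even delivery, feels smooth
print(stutter_score(stutter))  # 17.4 ms - same average fps, visibly uneven
```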
Wait so DBZ, can you have GPU 1 as a 4GB GPU and GPU 2 as a 2GB one? Or do they need to be the same VRAM amounts?
They don't need to be the same (although they do need to run at the same frequency); it's just that the larger framebuffer is redundant.
Think of it as two pails of water: one holds 4 litres and one holds 2 litres. You can only pour 2 litres into each, whichever way you try. If your 2nd GPU is 4GB, the primary card's framebuffer can only accommodate 2GB of it; likewise, if the 2nd GPU is 2GB, all it can provide to the primary 4GB card is 2GB worth of render.
 
You personally haven't experienced microstutter yet... If you can't see that, you probably won't see anything along the lines of microstutter.
That's what I was thinking :D

No offence, but if you think console games are buttery smooth, then there's no way you'll notice microstuttering, especially on an SLI setup.
 
That might be your experience, but it certainly isn't indicative of the wider community... The only real difference now is that the results are measurable...

The first Crossfire setup was introduced (I believe this is correct, I'll double-check my facts) in 2005 (looked it up: September 27th). The first SLI was introduced in 1998 but disappeared after Nvidia bought it, until 2004. Multi-GPU setups have existed for quite some time but have, in the last few generations, become more common among gamers (my first was actually a pair of 9800s; I was dying for one of those 9800 GX2s, but they were hard to get). I never stated it "did not exist before 2 years ago"; I said it became the next hot-button topic starting 2 years ago. It was mentioned in some reviews before that (well, maybe three years now; I forget how late in the year we are), but it's gone from being something lightly mentioned to being a household test, with lots of programs (like FRAPS) capturing the supposedly horrid results and showing the world.

My opinion for the thread creator is this: don't worry so much about micro-stuttering. It's not that big a deal to begin with, and with the recent drivers from BOTH companies there's little left to worry about. Buying two 770s or one 780, you'll only see a difference in FPS, with the edge going to the 770s; however, your expandability will be a little more limited later down the line, though you shouldn't have to worry for quite some time.
 
They don't need to be the same (although they do need to run at the same frequency); it's just that the larger framebuffer is redundant...

Alright, that is what I thought.
 