How Many FPS Do You Need?

Well put together article - it brings an old topic back to mind for me though: SLI... so with a G-Sync system, a pair of Titan Xps could in theory provide a big boost (if the game supports SLI well enough), as it delivers higher framerates, and G-Sync should fix the stuttering.
 
Great article. I always try to get as many FPS as possible in general. I prefer a high refresh rate over high settings or resolution if I have a choice.
 
Well put together article - it brings an old topic back to mind for me though: SLI... so with a G-Sync system, a pair of Titan Xps could in theory provide a big boost (if the game supports SLI well enough), as it delivers higher framerates, and G-Sync should fix the stuttering.

Those are nice 'theories', but devs never really jumped on the SLI bandwagon. It's a cart-and-horse problem: devs don't support it because most gamers don't use multiple cards and it's tricky; gamers don't use multiple cards because most devs don't support them [well] and it's expensive.

Also, a 1080 Ti will net you similar performance at a fraction of the cost.
 
Fantastic explanation here and video on HUB! Great job, Tim.

One minor point to clarify regarding Fast Sync / Enhanced Sync (with the same intent of dispelling misconceptions): not only should the framerate be higher than the refresh rate when enabling it, it should be at least 2 to 3 times the monitor's refresh rate to see the benefit of minimal latency with no tearing.

My only reason for bringing it up is that I've seen people make blanket statements that Fast Sync / Enhanced Sync "should always be enabled" for fast multiplayer games, which is only true if the optimum conditions for Fast Sync / Enhanced Sync are met, or if responsiveness isn't your top priority.

If the framerate is just above the refresh rate, the card doesn't have enough "back pressure" (an NVIDIA engineer's words) of new/recent frames being juggled into the 'last rendered frame' buffer, so that buffer would contain a frame that is "old". Just like with Vsync=On you wouldn't have any tearing, but also just like with Vsync=On you'd see shortcomings in input latency compared to the responsiveness of Vsync=Off, where new info is sent to the monitor immediately.
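To put a rough number on that "back pressure" idea, here's a toy simulation of the "scan out the most recently completed frame" behavior. This is my own sketch in Python, not NVIDIA's actual driver logic; the frame's age at scanout is a crude stand-in for the added latency:

```python
# Toy model of Fast Sync-style scanout: the display always picks the most
# recently *completed* frame at each refresh. Illustrative only.

def avg_frame_age_ms(fps: float, hz: float, duration_s: float = 1.0) -> float:
    frame_time = 1.0 / fps   # seconds per rendered frame
    refresh = 1.0 / hz       # seconds per monitor refresh
    ages = []
    t = refresh
    while t <= duration_s:
        # completion time of the last frame fully rendered before this refresh
        last_done = int(t / frame_time) * frame_time
        ages.append((t - last_done) * 1000.0)  # age in ms
        t += refresh
    return sum(ages) / len(ages)

for fps in (150, 300, 450):  # just above, ~2x, and ~3x a 144 Hz refresh
    print(f"{fps} FPS on 144 Hz: displayed frame is "
          f"~{avg_frame_age_ms(fps, 144):.1f} ms old on average")
```

At 150 FPS the displayed frame averages around 3 ms old; at 450 FPS it's closer to 1 ms, which is the "back pressure" benefit in a nutshell.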
 
Excellent article... hopefully one day, GPUs will be able to render everything at > 300FPS along with > 300Hz monitors so that none of these technologies will be necessary...

It will just "work" :)
 
Given the now well-known visual response test done on fighter pilots, I'd say 240 Hz / 240 frames per second is a good value.

Quote:
The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment, a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220th of a second, but also the ability to interpret higher FPS.
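For perspective, a quick back-of-the-envelope comparison (my own arithmetic, not from the study) of that 1/220 s flash against common frame times:

```python
# How the USAF's 1/220 s flash compares to common frame times. This only
# shows that a single ~4.5 ms flash is perceivable; it isn't a hard FPS
# ceiling for human vision.
flash_ms = 1000 / 220  # ~4.55 ms
for hz in (30, 60, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:6.2f} ms per frame (flash: {flash_ms:.2f} ms)")
```

240 Hz is the first common refresh rate whose frame time (~4.17 ms) dips under the flash duration, which is why that number keeps coming up.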
 
Super informative... An addition I'd appreciate is a write-up of human perception, e.g. the idea that we can't see more than 30 FPS so anything above is wasted. I know that isn't quite true, but I can never remember why, and up to how many FPS really makes a difference.
 
Super informative... An addition I'd appreciate is a write-up of human perception, e.g. the idea that we can't see more than 30 FPS so anything above is wasted. I know that isn't quite true, but I can never remember why, and up to how many FPS really makes a difference.
The simple answer is that your eyes don't work in frames; every individual receptor works at its own rate, detecting light and signalling the brain. So in theory there could be a receptor ready to detect another bunch of photons every microsecond.
 
Well put together article - it brings an old topic back to mind for me though: SLI... so with a G-Sync system, a pair of Titan Xps could in theory provide a big boost (if the game supports SLI well enough), as it delivers higher framerates, and G-Sync should fix the stuttering.

Those are nice 'theories', but devs never really jumped on the SLI bandwagon. It's a cart-and-horse problem: devs don't support it because most gamers don't use multiple cards and it's tricky; gamers don't use multiple cards because most devs don't support them [well] and it's expensive.

Also, a 1080 Ti will net you similar performance at a fraction of the cost.

I'd argue the real problem with SLI/CrossFire is that the game software has anything to do with it beyond the initial driver. Take MFAA (nothing to do with SLI): it requires no interaction from the game other than MSAA being available. Just build a completely separate driver for SLI systems (not to make it sound simple; if it could be done correctly and easily, it would have been done) where both cards act as one with more bandwidth, using checkerboard coverage.
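To make the "checkerboard coverage" idea concrete, here's a purely hypothetical sketch of splitting a frame into tiles and alternating ownership between two GPUs. Real SLI/CrossFire drivers expose nothing like this; the tile size and the gpu_for_tile/split_frame helpers are made up for illustration:

```python
# Hypothetical checkerboard split of one frame across two GPUs, so both
# cards contribute to every frame instead of alternating whole frames (AFR).
# Purely illustrative; no real driver works this way through a public API.

TILE = 64  # tile edge in pixels (arbitrary choice)

def gpu_for_tile(tx: int, ty: int) -> int:
    """Checkerboard assignment: adjacent tiles go to different GPUs."""
    return (tx + ty) % 2

def split_frame(width: int, height: int) -> dict[int, list[tuple[int, int]]]:
    work: dict[int, list[tuple[int, int]]] = {0: [], 1: []}
    for ty in range(height // TILE):
        for tx in range(width // TILE):
            work[gpu_for_tile(tx, ty)].append((tx, ty))
    return work

work = split_frame(1920, 1080)
print(f"GPU 0: {len(work[0])} tiles, GPU 1: {len(work[1])} tiles per frame")
```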
 
Super informative... An addition I'd appreciate is a write-up of human perception, e.g. the idea that we can't see more than 30 FPS so anything above is wasted. I know that isn't quite true, but I can never remember why, and up to how many FPS really makes a difference.
The simple answer is that your eyes don't work in frames; every individual receptor works at its own rate, detecting light and signalling the brain. So in theory there could be a receptor ready to detect another bunch of photons every microsecond.

True, they don't work in frames, and the claim that you can't see more than 30 FPS is false. 30 FPS is passable, but that's more to do with what your brain can process: it can fill in the missing frames between each of the 30 to complete the video/motion. If your eyes get trained on a higher refresh rate for a period of time, you can see the choppiness that 30 FPS is. The difference gets less noticeable the further up you go, compared to the jump from lower speeds. Not everyone can tell; not everyone has the same vision and brain for detecting subtle changes and improvements.

It's my argument for why I can't stand playing on a TV with a console anymore. I grew up with consoles... but ever since going to 100 Hz+ monitors, I can't deal with it. Most games feel slow without actually being slow, if you know what I mean.

Personally I like 100 Hz+; I have a 21:9 120 Hz monitor and a 16:9 144 Hz one. Additionally, I would strongly suggest a follow-up to the article covering FreeSync, G-Sync, and framerates locked by the driver (not V-Sync). V-Sync causes input lag; we've known this for years and years. What's the difference between frames locked at ~57 and unlocked? What's the quality of tearing between the two?
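On the driver-locked framerate point, the core difference from V-Sync is that a frame cap just delays the start of the next frame, while V-Sync blocks on the display's vertical blank (which is where the queued-frame input lag comes from). A minimal sketch of a cap, assuming a simple sleep-until-deadline loop rather than whatever the actual driver limiter does:

```python
import time

def run_capped(render_frame, cap_fps: float = 57.0, frames: int = 10) -> None:
    """Render `frames` frames, pacing them to at most `cap_fps` per second."""
    frame_budget = 1.0 / cap_fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += frame_budget
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            # Idle instead of queuing another frame -> less input lag than
            # V-Sync's blocking, but tearing is still possible.
            time.sleep(sleep_for)

run_capped(lambda: None)  # stand-in for a real render call
```

A real limiter would spin-wait for the last fraction of a millisecond since time.sleep() is coarse, but the pacing idea is the same.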
 
Excellent article... hopefully one day, GPUs will be able to render everything at > 300FPS along with > 300Hz monitors so that none of these technologies will be necessary...

It will just "work" :)

Those days aren't too far off, I guess :p Though if it stays AMD vs. Nvidia, this may never actually happen; I can see V-Sync and FreeSync sticking around for a long time.
 