FPS in Games

Status
Not open for further replies.

Ray

Posts: 58   +0
Hi
How do you tell how many fps you are running a game at while you are playing it? People talk about running Doom 3 at 60 fps, but how do they know? I am trying to find out what framerates my graphics card is running games at, but am at a loss as to how to do this. Is it built into my computer, or do I need software to do this?

Thanks :wave:
 
Some games have their own built-in framerate counters you can enable, usually from a configuration file or by typing commands into the in-game console.

Doom 3, for example (as well as Quake 4), has a console command, com_ShowFPS 1, to display a framerate counter in the top corner of the screen while playing. Other games, like Oblivion or Source engine games (Half-Life 2, Counter-Strike, etc.), have their own methods, which require searching through their FAQs, forums or websites to determine the proper command and how to activate it.
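For reference, those counters are typically toggled with a single console line. The Doom 3/Quake 4 cvar and the Source engine equivalent look like this (exact names can vary by game and version, so check the game's own FAQ if one doesn't work):

```
com_showFPS 1    // Doom 3 / Quake 4: framerate counter in the corner
cl_showfps 1     // Source engine (Half-Life 2, CS:S): framerate overlay
```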

Lastly, there is a very handy tool called FRAPS that works pretty well for almost all games. You can download this utility at www.fraps.com, and in its free version it will provide framerate counters that are usually pretty accurate. You simply download, install, then run FRAPS prior to launching your game. It will then overlay a framerate counter on the screen, and it also allows for taking screenshots and even capturing videos of the gameplay to your hard disk (short clips of 30 seconds or so in the free version, unlimited in the paid version) if you have enough memory and CPU power.
 
Type cg_drawfps 1 in the console for some games, such as Soldier of Fortune. It would help if you told us which game this concerns.

Regards,

Korrupt
 
Thanks
I downloaded FRAPS and it works great. A simple utility, and easy to use as well. At least now I can see how my frames are going. Is there a decent number of fps that a game should run at? I am a bit outdated as far as graphics cards go.

Thanks, Sharkfood, for the FRAPS info
 
You're looking for a MINIMUM of 30 fps (some games are locked to this anyway - i.e. they won't go any higher).
 
30 fps is a minimum for most games. Anything lower than that will appear glitchy and laggy.

60 fps is usually ideal. For example, 60 fps is good if you play Half-Life 2 or Counter-Strike: Source.

By the way, the Half-Life 2 console command to show framerates is cl_showfps 1 (or net_graph 1 for more detail).
 
There is no real "standard" framerate requirement, as this will vary with the game, how it is written, and what is expected of the player.

For example, if you have a very high-resolution mouse and a super-low-latency internet connection and play fast-action online multiplayer shooters, a very high fps will benefit feel and control.

On the other hand, in a single-player role-playing game (say, Oblivion), you really can play fine with only a fraction of the framerate required by the online action shooter above, where you play against other humans.

Further still, simulations or strategy games such as Civilization IV, SimCity, The Sims 2, or even flight simulators can generally be enjoyed without much detriment at as low as 9-15 fps.

So it really depends on the game, your needs, and your expectations of "doing well" in the game itself. Twitch-style, aggressive action games usually demand higher framerates, though, as milliseconds of response/feel can count!
 
Thanks
I get the picture now. The guidelines of 30 to 60 fps make sense, but it also depends on the demands of the type of game you are playing. The multiplayer part is understandable: if the other guys playing online games such as Far Cry have BFMeanMachines, then I get my **** whipped, as my response time will lag behind. Thanks for the info.
 
The eye sees flicker at frequencies of 25-28 Hz or less. This is why electric companies standardized at twice that frequency (the Nyquist frequency), or approximately 60 Hz. This way we don't "see" the lights flickering.

Framerates lower than 30 would start to be perceived, and would probably be just annoying or slowly drive you stark raving mad.
 
Well, human-eye perception of flicker and the like really doesn't apply much to videogames, in my opinion. Cinema/movie rates are generally 24 frames per second, as this is already enough for the eye to "perceive" fluid motion. The refresh rates of monitors and LCDs also remove flicker beyond whatever a game/3D engine is pumping out, so this becomes a moot point (finally!) now too! hehe.

But it's really not perceived motion that impacts games. It's more about how they are written and how they "look/control/feel" that drives higher framerates.

Example: you could set up a 35mm motion camera and record a racing game running at 24 fps, then record the same racing game running at 60 fps. As the camera only records at 24 fps, a viewer watching the film wouldn't really notice any reduction in fluidity of motion. Heck, even if the camera could record at 60 fps, there still wouldn't be any massive, measurable difference in perception between the two just WATCHING that film after the fact.

This is totally different for the player playing the game being recorded! The response time (for many games) between his thumb, the input loops, and the feedback refresh of the newly presented frame that his brain is seeing and adjusting to is going to make a massive difference in his required response times and hand-eye coordination. While the "motion" isn't any real beef, as the perceived motion will be similar, it's the feel/control that suffers, given how many games tie the input processing and the visual frame representation (i.e. the result of the input) together in sync.

You can also test this with (poorly) written games of yesteryear that tied input processing and response together in sync, run on more powerful computers. I like to pull up the old PC game Wipeout, and at 470 fps it's impossible to play. The second you tap the gas and move the controller a hair, you're into a guardrail in 0.1 seconds lol.

More and more games are handling input processing separately and "scaling" it to the on-screen framerate to better improve feel/control and response under lower-framerate conditions. Of course, a highly caffeinated player with incredible response time and hand-eye coordination can still feel the difference. This is also going to be subjective, since some people can launch a car at the drag strip with a 0.010-second reaction time or less, versus some that can only muster foot-eye response times of 0.500 to 0.800 on the Christmas tree's green light.
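The Wipeout example above can be sketched in a few lines. This is a purely hypothetical simulation, not any real game's code: it contrasts movement added per frame (the old, frame-tied style) with movement scaled by frame time (delta time), showing why a frame-tied game speeds up wildly at high fps while a delta-time game behaves the same at any framerate.

```python
def simulate(fps, seconds, speed_per_frame=None, speed_per_second=None):
    """Advance a position for `seconds` of play at a given framerate.

    Pass speed_per_frame for the frame-tied style, or
    speed_per_second for the delta-time style.
    """
    frames = int(fps * seconds)
    dt = 1.0 / fps          # time per frame
    pos = 0.0
    for _ in range(frames):
        if speed_per_frame is not None:
            pos += speed_per_frame        # frame-tied: more fps = faster car
        else:
            pos += speed_per_second * dt  # delta-time: fps-independent
    return pos

# Frame-tied input: the same one-second input moves ~16x farther at 470 fps.
print(simulate(30, 1.0, speed_per_frame=1.0))    # → 30.0
print(simulate(470, 1.0, speed_per_frame=1.0))   # → 470.0

# Delta-time scaling: roughly 30 units either way, regardless of framerate.
print(simulate(30, 1.0, speed_per_second=30.0))
print(simulate(470, 1.0, speed_per_second=30.0))
```

The names and numbers here (470 fps, a speed of 1 unit per frame) are made up for illustration; the point is only the structural difference between the two update rules.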

Anyways, that's just my $0.02 on the huge "framerate" and "human eye" debate that has revolved around videogames for the past several years. When you get down to the science of feel, control, how a particular game engine is written, how/when input and framerates are timescaled, filtering and such... it's a big, big topic. :)
 