Nvidia: High frame rates could lead to significantly better K/D ratios in competitive...

In a nutshell: Did you know that higher refresh rates can give you a competitive edge during heated gaming sessions? That may seem like common sense to most avid PC gamers, but Nvidia has published a lengthy, in-depth article to hammer this point home even further.

The GPU maker covers quite a bit of ground in its post, but we'd like to focus on some of the more interesting points here. First, let's talk about the relationship between frame rates and refresh rates.

Also read: How Many FPS Do You Need? Frames Per Second, Explained

As Nvidia points out, the terms are sometimes used interchangeably, but they describe two different things: your frame rate is the rate at which your GPU can draw frames for a game or piece of software, while your refresh rate is how many times per second your monitor can actually redraw the image on screen. If your GPU is putting out 144 frames per second but you're gaming on a 60Hz display, you're going to see a not-insignificant amount of screen tearing as your monitor fails to keep up with your video card.

This is why many gamers with lower-end displays prefer to lock their in-game frame rates to around 60 with third-party tools like MSI Afterburner or RivaTuner Statistics Server. As Nvidia states, for optimal results and minimal "ghosting" (the trail that gets left behind by moving images on LCD displays), you generally want both your FPS and refresh rate to be high.
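
If you're wondering what a frame cap actually does, here's a minimal sketch of the idea in Python. It's our own illustration rather than anything from Nvidia's post, and run_capped_loop and render_frame are hypothetical stand-ins for a real engine's render loop:

```python
import time

TARGET_FPS = 60                  # cap chosen to match a 60Hz display
FRAME_BUDGET = 1.0 / TARGET_FPS  # roughly 16.7 ms per frame

def run_capped_loop(render_frame, seconds=5.0):
    """Draw frames, sleeping off whatever is left of each frame's time budget."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                                       # draw the next frame
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                             # don't outrun the display
```

Limiters like RTSS apply the cap at the frame-presentation level rather than inside the game's own loop, but the principle is the same: never deliver frames faster than the rate you've chosen.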

But how high is high enough? For most PC gamers, 60 FPS gaming is the holy grail: the target to pursue and maintain with settings tweaks and hardware upgrades. And that's a perfectly reasonable standard. It's impossible to deny that there is a massive difference in smoothness and playability between 30 and 60 FPS, and most eSports pros will tell you that the latter is the bare minimum you want to achieve before playing games like Overwatch in a competitive capacity.

And then there's 144 FPS gaming: another very noticeable improvement from 60 FPS for many people, and one that can certainly make a difference in competitive gaming. Still, this is often considered an unnecessary luxury, even among many PC gaming enthusiasts.

Speaking from experience, even with the best hardware, it's challenging to maintain upwards of 120 FPS in the latest games without making hefty sacrifices in the visual quality department. See our Red Dead Redemption 2 performance benchmarks for ample evidence of that.

However, Nvidia aims to convince you that not only is 144 FPS-supporting hardware and display tech worth the investment, but you should actually make the jump to 240 FPS for the best competitive results.

In the company's own studies, Nvidia says it found a strong, direct link between a player's FPS and their K/D ratio in PUBG and Fortnite. Compared to gaming at 60 FPS, Nvidia says users who play at 140 FPS enjoyed a relative K/D improvement of 78 percent. That number climbed all the way to 90 percent when testers hit the 180 FPS mark.
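
To be clear about what a "relative K/D improvement" means, here's a quick back-of-the-envelope illustration; it's our own arithmetic, not Nvidia's methodology, and the 1.0 baseline K/D is just an assumed example:

```python
def relative_improvement(baseline_kd, new_kd):
    """Percent change in K/D relative to the 60 FPS baseline."""
    return (new_kd - baseline_kd) / baseline_kd * 100

# If a hypothetical player averaged a 1.0 K/D at 60 FPS, Nvidia's figures
# would correspond to roughly 1.78 at 140 FPS and 1.90 at 180 FPS.
print(round(relative_improvement(1.0, 1.78)))  # 78
print(round(relative_improvement(1.0, 1.90)))  # 90
```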

Those results are certainly intriguing, and there's probably some merit to them, but we likely don't need to tell you that Nvidia is not exactly an unbiased third party here. The company earns revenue from the sale of high-end PC hardware (such as the outlandishly priced RTX 2080 Ti), so it has a vested interest in persuading you that the best way to achieve maximum frame rates and competitive success is to buy its products.

Further, PUBG and Fortnite are not necessarily the best candidates for this sort of experiment. No matter how skilled you are, there's almost no such thing as an "average" battle royale match. By their very nature, these games are inconsistent, and it's quite easy to lose (or win) many matches in a row through no fault of your own. Additionally, you don't need to earn many kills to win a game, and a single death will boot you back to the lobby.

We don't mean to imply that Nvidia has somehow fudged the results here. If your reaction times are fast enough, it's absolutely possible that you'd benefit from gaming at well over 60, or even 144 FPS. Even in non-competitive games, a smoother experience can only be a good thing.

Whether those benefits will be as significant as Nvidia would have you believe, though, is another matter entirely, and that's the topic of much ongoing discussion in the PC gaming community.

To that end, we'd love to hear your thoughts on this situation. Do you agree with Nvidia that very high refresh and frame rates (144 or more) will lead to much better gaming results, or do you think the 60 FPS/Hz mark is satisfactory? Let us know in the comments.


 
I agree with Nvidia on this one, more is always better, BUT on the condition you see some returns on your hefty investment, like if you're a competitive gamer making a living out of killing other blokes in a multiplayer video game.
Any other day I don't see the point in spending thousands of dollars on "casual" gaming like most folks do.
Also, it's obvious to me that pros, who are more devoted to playing a title, will also spend more on equipment that gives them that little edge (at least in their minds?). So maybe they would be great at gaming even without a 240Hz monitor? Who knows?
The science of statistics is a tricky b*tch! How big was their test group? How many control groups? Did they check pros who moved from 240Hz to 60Hz?
Just sayin' - it may all be marketing bullsh*t.
 
There should be a standard FPS for esports games, and right now 60fps is the standard in my opinion. Surely more frames will give better results, but would it be fair for someone with a lot of money who invested in a powerful rig to go up against a normal rig?
 
I knew this two decades ago. It is hysterical that the industry is finally touting this like it's some new discovery. The old "60Hz is the most the eye can see" line has always been bullcrap. Now they see dollar signs in telling us this, so they finally admit it, without saying they were wrong. There is a law of diminishing returns even on this, but they won't admit that, I'm sure.

Really, the only benefit I see in frame rates above 75-85Hz is for side-scrolling in FPS games. It eliminates frame tearing.
 
I'm not a competitive gamer.

Sure I "could" spend twice as much on my CPU, my RAM, my SSD, my monitor, my motherboard and everything else...

But I see no reason to spend that kind of money unless I'm getting huge ROI by actually winning competitions and having the hardware pay for itself.

I bought the 2080Ti and i9 just because I wanted some level of future proofing and the ability to play 100% of games on the market.

But this is nothing more than psychological obsolescence and a way for Nvidia to suck the air out of the room when AMD wants to show off their products.
 

"future proofing"

There is no such thing. Kind of like "job security". It's ok, you can admit you wanted something expensive. Everyone has their crutch. None of us really "need" that kind of hardware. I agree with you to a certain extent.
 
I just moved from 60 to 144fps in Rocket League and it makes a difference. It's not a huge difference, but it's definitely a noticeable one, as I also play another game which my video card can only push 60fps in. When I forget to switch back to 144Hz when starting up RL, I notice the difference right away, partially in lag and partially in the smoothness and solidity of the image. It doesn't seem to make as big a difference in Assetto Corsa (I still suck), and dammit, I haven't tried DiRT Rally yet.

IMO definitely worth it for esports and FPS.
 
"Nvidia says users who play at 140 FPS enjoyed a relative K/D improvement of 78 percent"

I have a 144Hz monitor and even that 78% figure seems far too high. I certainly didn't improve by 78%, nor did I expect to. I don't know anyone who claimed such a massive increase simply by switching monitors, either.

I'd take these numbers with a huge grain of salt, as they are suspiciously high and Nvidia has a direct interest in selling video cards and high refresh rate G-Sync monitors. I love my 144Hz monitor, but this just feels way too misleading.
 
OK and just to add to how utterly arbitrary any statistics on kills are: upgrading from 60 to 144 FPS increased my kill ratio in Fortnite by ∞.

Yup, ∞. I had zero kills in the last 1.5 years (played about 20 times with the kid). After 144fps (and starting Fortnite Season 2) I got 4 kills in one match. And haven't played since. Dude, it wuz all the FPSsseses.

Or, the real reason was playing Tomb Raider, learning how to use a firearm properly in a game, finding a decent sniper rifle in FN (total luck), and having at it. In fact after all that, only 4 kills was pretty lame. Clearly I need 240fps.
 
Alright, one last thought: fps is only part of the picture. I still play Rocket League on a NUC a few times a week and with the right (low) settings it gets 60fps @1080p. However, the difference between this machine at 60fps and my gaming PC at home at 60fps is huge. The lag on the NUC (and its inputs and its 2016 Dell office monitor) was much worse than on my home gaming PC (and its LG office monitor at the time).

The difference between the NUC @60fps and the gaming PC @60fps is subjectively *much* greater than the difference between the gaming PC @60fps and that same PC @144fps.

It ain't all about the FPSes.
 
It will be most noticeable in games like CSGO. The thing is, to keep up that frame rate in modern games you need to spend a lot of money. And the most important thing to know is that no amount of frame rate will make you not suck at the game. It's still 99% about your skill.
 
The question is not whether you see a difference or not. Of course you do. But can a casual player actually get better because of it?
I think not.

Pro players set the resolution to its lowest, tweak their mice, set low details, set up the perfect sitting position and the perfect monitor size, and sometimes even use illegal drugs for better awareness.
Yes, they can get a 1% benefit from 144Hz. For them, 1% is a lot. For you, filthy casual, it's not worth it; just set details high and enjoy the game.
 
"future proofing"

There is no such thing. Kind of like "job security". It's ok, you can admit you wanted something expensive. Everyone has their crutch. None of us really "need" that kind of hardware. I agree with you to a certain extent.


Future proofing...

Can this computer still run games released 2 or 3 years from now on at least High settings?

Well, considering most games are released so that people with a Core i5, a low-end AMD CPU, and a low-end GPU can play them, I'd say the chances are high.

Many people buying cards right now are chasing Ultra 4K@60fps specs and they don't even have monitors capable of that.

There's no game that a Core i7 8700 or i9 9900k can run that a Core i7 4790 can't.

And that CPU is from 2014.
 
Lol, so much anti-Nvidia sentiment in this forum; last I checked, AMD can reach high FPS too if you lower the settings enough. No one is playing competitive shooters such as PUBG or Fortnite at ultra presets. I play PUBG at mostly low settings on a watercooled 2080 Ti because I can see and feel the input lag when fps dips below 140.

Still, I find my shots are not as accurate when switching from a 1440p 144Hz screen to an ultrawide 1440p 120Hz one :/. Maybe I should try a 240Hz panel as the primary gaming screen.
 
There's no game that a Core i7 8700 or i9 9900k can run that a Core i7 4790 can't. And that CPU is from 2014.

Or the entire Ryzen lineup as well
 
"future proofing"

There is no such thing. Kind of like "job security". It's ok, you can admit you wanted something expensive. Everyone has their crutch. None of us really "need" that kind of hardware. I agree with you to a certain extent.

Depends, if you bought an i7 920 in 2008, it can still play any title in 2019, and at 1080p it gets 50+ FPS in 99% of recent titles on high-ultra with an R9 Nano. That's 11 years of future proofing on the CPU and 4+ years on the GPU.
 
Can this computer still run games released 2 or 3 years from now on at least High settings?
Yup, any Intel quad+ core CPU from Nehalem onward can, paired with a Fury/980 Ti GPU.
 
60Hz/120fps is better than locked 60Hz/60fps every time in terms of tearing and input lag. 60Hz/60fps should be your last resort. Period.
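
To put rough numbers on the input lag part of that (a heavily simplified sketch that ignores display scanout, input sampling and everything else in the chain; avg_frame_age_ms is just an illustrative helper): on a fixed 60Hz panel, the most recently completed frame is on average about half a frame time old when the refresh happens, so rendering at 120fps still hands the display fresher frames.

```python
def avg_frame_age_ms(fps):
    """Average age of the newest completed frame, in ms, under this simple model."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms / 2

print(avg_frame_age_ms(60))   # ~8.3 ms of render-side lag at 60fps
print(avg_frame_age_ms(120))  # ~4.2 ms at 120fps, even on a 60Hz panel
```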

If you have a 120Hz monitor, 120fps is not mandatory to benefit from the higher refresh rate. That requirement is a myth.

For more info and corroboration, check out the video Linus did with Shroud (60Hz-240Hz), or any video from Battle(non)sense on the topics of frame rate, frame pacing, etc. You'll thank me later!

Every gamer should be using a variable/high refresh monitor. It's as important as an SSD imo.
 
This isn't Radeon Anti-Lag.
It's trying to be but fails to hit the mark.
But good attempt at selling the lies.
People will blindly follow you, Nvidia, as they continue to do so.
 
Depends, if you bought an i7 920 in 2008, it can still play any title in 2019, and @1080p be getting 50+FPS in 99% of recent titles on high-ultra with an R9 Nano. That's 11 year futureproofing on the CPU and 4+ years on the GPU.

FPS means nothing if your frame times are poop. Quad cores struggle hard in newer titles at 1080p with low settings. A faster GPU helps, but then you aren't getting the most out of your hardware, or you're forced to play at higher resolutions, making your CPU the bottleneck. A faster GPU only gets you so far.
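
To make the frame-time point concrete, here's a quick illustrative calculation with made-up numbers (not measurements from any setup in this thread) showing how a healthy average FPS can hide the stutter that a "1% low" figure catches:

```python
# One 50 ms hitch hiding among 99 otherwise-fast 8.3 ms frames.
frame_times_ms = [8.3] * 99 + [50.0]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_1_percent = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
low_1_fps = 1000 / (sum(worst_1_percent) / len(worst_1_percent))

print(round(avg_fps))    # ~115 FPS average looks fine on paper...
print(round(low_1_fps))  # ...but the 1% low is only ~20 FPS, and you feel it
```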
 