Upcoming GeForce RTX 4090 is so fast Blizzard had to increase Overwatch 2's FPS cap

Daniel Sims

Why it matters: Nvidia continues to demonstrate the ultra-high framerates that its latest flagship GPU can coax out of recent and upcoming high-profile games. Blizzard decided to raise Overwatch 2's framerate cap to take advantage of the graphics card's full potential as both products prepare to launch around the same time.

Nvidia recently told PC Gamer that Blizzard increased Overwatch 2's framerate cap to 600fps after seeing how well the upcoming title ran on the GeForce RTX 4090. Nvidia also confirmed Overwatch 2 would support its latency-reducing feature, Nvidia Reflex.

In a new demo, the RTX 4090 ran Overwatch 2 at framerates consistently over 360fps at 1440p, often topping 400fps and sometimes reaching 500. More impressive, the demo likely ran at native 1440p without any resolution upscaling, showcasing the 4090's raw performance.

Blizzard and Nvidia did not indicate that Overwatch 2 will support Nvidia's framerate-boosting DLSS technology, much less the new DLSS 3, which brought Cyberpunk 2077 up to 171fps at maximum ray tracing settings. The Overwatch 2 beta confirmed that the title supports AMD FSR, but it's unlikely Nvidia would use its competitor's upscaling tech in a demonstration.

Esports shooters like Overwatch 2 are optimized for extremely high framerates, but Blizzard likely didn't foresee a new GPU topping 500fps, hence the raised cap. Neither company confirmed the game's previous framerate cap, but PCGamingWiki suggests the first Overwatch was capped at 400fps.

Most high-end monitors top out at 240Hz or 360Hz, but the 4090's performance may justify the Asus ROG Swift 500Hz, which Nvidia unveiled in May. Another manufacturer, AU Optronics, may be working on a 540Hz display. However, both monitors are only 1080p, a resolution at which the 4090 would push Overwatch 2 even faster than in Nvidia's recent demo, provided it doesn't run into CPU limits.

Overwatch 2 launches on October 4, replacing its predecessor, which hit PCs over six years ago. Legacy servers will go offline permanently a little over a day before Overwatch 2's free-to-play multiplayer version launches. The game's paid PvE portion debuts sometime in 2023.

The RTX 4090 arrives a week after Overwatch 2 on October 12, starting at $1,599. The RTX 4080 hits stores in November with an $899 12GB version and a $1,199 16GB variant.


 
This would have been good news if OW and the OW2 beta hadn't fallen to such low popularity.
 
The 40 series is a dumb release because it's missing the DisplayPort 2.0 spec.
I can already do 300+ frames in games with my 2070 @ 1440p.

4K is what gamers want, and they want 240 frames with it. Monitors that do 240Hz @ 4K are coming out this year and next, and DisplayPort 2.0 is needed for 240Hz at 4K.
 
Rofl, Tripwire should learn a thing or two from these. Killing Floor 2 never ran properly in 4K for me, not even on medium. And I have a 2080 Ti. And the game is what, like six years old now?
 
Rofl, Tripwire should learn a thing or two from these. Killing Floor 2 never ran properly in 4K for me, not even on medium. And I have a 2080 Ti. And the game is what, like six years old now?
Killing Floor 2 uses UE 3.5 or something, so it's just a modified version of UE3, which means it probably doesn't scale past 2-4 threads at most; chances are the engine itself is bottlenecking your 2080 Ti. I'm surprised they haven't ported the game over to UE4.
 
Some of these articles are just pure hype-fluffing pieces.

Instead, why don't we get more reminders of how stupidly priced these cards are and how they're pretty much an insult to us gamers?
 
Just what I'd want to do after spending $1600 on a GPU... Play some OW2 and submerge myself in that toxicity. lol
I play OW all the time. I do not see any major toxicity in that game. There's an ***-hat or two here or there, but I don't feel like it's prevalent in the game.

As for 500 FPS, I'm not really sure why anyone needs that for OW or OW2. I'm running a 2070 Super laptop, I get over 100 FPS, and that seems to be more than fine. I do OK in that game. Newer games may drive GPUs harder, so I get why people want a fast card. I also believe that FPS has diminishing returns.
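For what it's worth, a quick back-of-the-envelope frame-time comparison (a minimal sketch in Python, using arbitrary FPS targets) shows why the returns diminish: each jump in framerate buys less and less time per frame.

```python
# Rough illustration of diminishing returns: the frame-time saving from each
# FPS jump shrinks as the numbers climb. The targets are arbitrary examples.
targets = [60, 120, 240, 360, 500, 600]

for lower, higher in zip(targets, targets[1:]):
    saved_ms = 1000 / lower - 1000 / higher  # milliseconds saved per frame
    print(f"{lower:>3} -> {higher:>3} fps: frame time drops by {saved_ms:.2f} ms")
```

Going from 60 to 120fps saves roughly 8.3ms per frame, while 500 to 600fps saves only about 0.3ms.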
 
Some of these articles are just pure hype-fluffing pieces.

Instead, why don't we get more reminders of how stupidly priced these cards are and how they're pretty much an insult to us gamers?
In a way, yeah, they seem overpriced. In a different way, maybe not so much. You are getting more performance than the previous generation and, IIRC, the 3090Ti was priced at $2K, so you are getting more performance at a 20% reduction in price which is pretty good given the inflation we've had over the past couple of years.

The thing is, very few people need a 4090 and most of them aren't gamers. You'll get good performance with a 4080 or even a 4070 (or 4080 12G). I'd rather see good performance a la 3080 but with lower power and cooling requirements. When you consider that a 4090 will likely require a new $150-200 PSU, maybe a couple fans or a water cooler to keep things frosty, that's when the cost gets stupid. An expensive GPU is one thing, but for many there will be additional costs and, honestly, not that much gain.
 
500 fps... not sure if it's really useful, even for eSport gamers, except for bragging and/or pushing more overpriced hardware down people's throats...
I doubt anyone can benefit from the time advantage this theoretically provides... or maybe I'm too old. Yes, that must be it! :laughing:
 
In a way, yeah, they seem overpriced. In a different way, maybe not so much. You are getting more performance than the previous generation and, IIRC, the 3090Ti was priced at $2K, so you are getting more performance at a 20% reduction in price which is pretty good given the inflation we've had over the past couple of years.

The thing is, very few people need a 4090 and most of them aren't gamers. You'll get good performance with a 4080 or even a 4070 (or 4080 12G). I'd rather see good performance a la 3080 but with lower power and cooling requirements. When you consider that a 4090 will likely require a new $150-200 PSU, maybe a couple fans or a water cooler to keep things frosty, that's when the cost gets stupid. An expensive GPU is one thing, but for many there will be additional costs and, honestly, not that much gain.
US$2K for a 3090 Ti was simply a gigantic FU to us gamers and blatant proof that Nvidia thinks we're all dumb.

And then you dare use that price as an excuse to justify the price of a 4090?

No wonder Nvidia thinks we're all stupid.

And a hint: on suggestions like the one you made, do yourself a favor and add some AMD GPUs, because unlike what the media and rabid cult members tell you, they're not that inferior to Nvidia's offerings.
 
US$2K for a 3090 Ti was simply a gigantic FU to us gamers and blatant proof that Nvidia thinks we're all dumb.

And then you dare use that price as an excuse to justify the price of a 4090?

No wonder Nvidia thinks we're all stupid.

And a hint: on suggestions like the one you made, do yourself a favor and add some AMD GPUs, because unlike what the media and rabid cult members tell you, they're not that inferior to Nvidia's offerings.
I never said the prices were appropriate. I only pointed out that the new pricing is lower than the old and delivering better performance.

I looked at a mid-range build for my grandson. If I go old AMD, then I'm pretty stuck other than GPU and memory upgrades. If I go 12th Gen, I have a path to 13th Gen. If I go new AMD, then I'm paying a premium and having to deal with new-gen issues. So, for him, he gets 12th Gen, a decent GPU, and a path forward in both CPU and GPU. By the time he needs more, there will be two more new generations of CPU/GPU.

Nothing wrong with AMD, I am looking at it for my high-end build. Both CPU and GPU.
 
The 40 series is a dumb release because it's missing the DisplayPort 2.0 spec.
I can already do 300+ frames in games with my 2070 @ 1440p.

4K is what gamers want, and they want 240 frames with it. Monitors that do 240Hz @ 4K are coming out this year and next, and DisplayPort 2.0 is needed for 240Hz at 4K.
First-person shooter gamers, such as Warzone players, tend to go with 1440p monitors so they get a decent number of frames per second for smooth play, and everyone knows first-person shooter gamers are at the top of the gaming ladder.
 
First-person shooter gamers, such as Warzone players, tend to go with 1440p monitors so they get a decent number of frames per second for smooth play, and everyone knows first-person shooter gamers are at the top of the gaming ladder.
You lost me at Warzone... Valorant and CS:GO are still way bigger players in the FPS world, and you can easily get over 300fps. With new cards and CPUs, 4K gaming will be the next move. I get you, though; you're correct that 1440p is the norm now. But why buy a card that does 400fps at 4K and then can't connect to a 4K 240Hz monitor in a year or two? Unless you buy a new card every year... wait for DisplayPort 2.0.
 
Why would they add a 600fps framerate throttle instead of just having "throttle" (with perhaps a handful of choices) and "no throttle"? To be honest, anything above like 60fps isn't really useful, but go ahead and include 120fps for those who have bought those monitors. Why would you include 300, 500, 600fps choices? *shrug* Ahh well.
 
Why would they add a 600fps framerate throttle instead of just having "throttle" (with perhaps a handful of choices) and "no throttle"? To be honest, anything above like 60fps isn't really useful, but go ahead and include 120fps for those who have bought those monitors. Why would you include 300, 500, 600fps choices? *shrug* Ahh well.
Because the higher the FPS, the lower the latency. Also, for those not using G-Sync or FreeSync, there's less tearing at higher framerates.

Pro gamers don't use FreeSync or G-Sync because they add latency. Not much, but every ms counts in comp gaming.

The only time you want to cap your FPS is if you're using FreeSync or G-Sync, and that cap is typically 3 frames below the monitor's max refresh rate.
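As a minimal sketch of that rule of thumb (the refresh rates below are just example values, and the 3-frame margin is the figure quoted above):

```python
# Sketch of the "cap a few frames below max refresh" rule of thumb mentioned
# above, so G-Sync/FreeSync stays inside its variable-refresh window.
def suggested_cap(max_refresh_hz: int, margin_frames: int = 3) -> int:
    return max_refresh_hz - margin_frames

for hz in (144, 240, 360, 500):  # example refresh rates
    cap = suggested_cap(hz)
    print(f"{hz}Hz monitor -> cap at {cap} fps ({1000 / cap:.2f} ms per frame)")
```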
 
More importantly, I had a sandwich for lunch. It was a Classic Cuban Midnight (Medianoche) Sandwich.
 