Call of Duty: Modern Warfare 2 Multiplayer Benchmark

How many people out there will actually take the time to notice the difference between 2 and II? All they hear is Modern Warfare 2 and go, oh boy, another remake of a remake of a remake. They're somewhat right, but at least this time around Modern Warfare II has a new storyline to play through. Too bad they keep reusing the same name; all it does is confuse the minions of the world. lol
You see, knowing the difference between 2 and II is what separates normal people from the so-called "experts" 😉
 
Thank you for all of the benchmarks at both Ultra and Basic. Incredibly thorough as always! If possible, it would be nice to see some side-by-side screenshot comparisons of Ultra vs. Basic visuals.
Exactly what I'm hoping for too. Steve, is an article or video in the works?
 
What's up with the GTX 1060 series performance being so far behind the AMD RX 580?

Is that something that can be improved with drivers?
 
Can we have a technical explanation, please? I'm drowning in all these journalistic figures of speech and innuendoes, like I was drowning a while ago in Intel's lakes. Why is the RTX 4090 so much faster than the RTX 3090 Ti in these graphs, sitting on what looks like an exponential curve while all the other video cards follow a linear one?

I don't know. But I do know that Nvidia has a long tradition of bribing developers to make games run best on its latest hardware and punishing all the competition. And by that I mean not just cards made by other companies, but even older cards made by Nvidia itself. I'm not saying that's the case here, because I can't check it, but if it's true it wouldn't be the first time.
 
What's up with the GTX 1060 series performance being so far behind the AMD RX 580?

Is that something that can be improved with drivers?

Check the stats: even the 1070 is behind the RX 580 now, and it used to be well ahead just a few years ago.
 
Check the stats: even the 1070 is behind the RX 580 now, and it used to be well ahead just a few years ago.

I have a suspicion. AMD has always focused more on memory bandwidth than NVIDIA, whereas NVIDIA tends to favor raw shader performance. NVIDIA's approach works well when its cards are released, but as they age and new games naturally start to use more and more VRAM, that weaker memory performance causes NVIDIA's cards to age less gracefully than AMD's.

If I'm right, we should see NVIDIA's performance track the amount of VRAM on its cards (at a given settings level) up to a point, beyond which the difference in shader performance starts to matter more.
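One crude way to eyeball that against charts like these would be something like the sketch below. It's just a rough idea: the card list and VRAM sizes are from memory, and the fps values are deliberately left blank so nobody mistakes them for real results; you'd fill them in by hand from the article's graphs.

```python
# Sketch: does average fps track VRAM size at a fixed settings level?
# Fill in avg_fps by hand from the article's charts; values start as None
# on purpose so no made-up numbers sneak in.

vram_gb = {
    "GTX 1060 6GB": 6, "GTX 1070": 8, "GTX 1080": 8,
    "RTX 2060": 6, "RTX 3060": 12, "RTX 3070": 8,
}
avg_fps = {name: None for name in vram_gb}  # e.g. avg_fps["GTX 1070"] = ...

def pearson(xs, ys):
    """Plain Pearson correlation, no external libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

pairs = [(vram_gb[c], avg_fps[c]) for c in vram_gb if avg_fps[c] is not None]
if len(pairs) >= 3:
    vram, fps = zip(*pairs)
    print(f"VRAM vs. avg fps correlation: {pearson(vram, fps):.2f}")
else:
    print("Fill in avg_fps from the charts first.")
```

A correlation near 1 would fit the VRAM-tracking idea; anything weak or negative would point back at shader or driver differences.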
 
I have a suspicion. AMD has always focused more on memory bandwidth than NVIDIA, whereas NVIDIA tends to favor raw shader performance.
Except in the case of the RX 580 and GTX 1070 they're pretty much equal in terms of FP32 throughput, global memory bandwidth, and texture rate; only in pixel output rate is there any big difference (and the 1070 is way ahead of the 580 in that respect). The difference is more likely to come down to the fact that the game was designed to run as well as possible on the older consoles (Xbox One and PS4), which use the same GPU architecture as found in the RX 580.
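For what it's worth, here's a quick back-of-the-envelope comparison from the commonly quoted reference specs. The unit counts and boost clocks are approximate, so treat the output as ballpark theoretical numbers, not measured performance.

```python
# Rough theoretical throughput comparison, RX 580 vs GTX 1070.
# Unit counts and boost clocks below are the commonly quoted reference specs,
# so the results are ballpark figures, not measured performance.

cards = {
    # name: (shaders, TMUs, ROPs, boost clock GHz, bus width bits, memory Gbps)
    "RX 580":   (2304, 144, 32, 1.34, 256, 8.0),
    "GTX 1070": (1920, 120, 64, 1.68, 256, 8.0),
}

for name, (shaders, tmus, rops, clk, bus, gbps) in cards.items():
    fp32 = shaders * 2 * clk / 1000   # TFLOPS (2 FLOPs per shader per clock)
    tex  = tmus * clk                 # GTexels/s
    pix  = rops * clk                 # GPixels/s
    bw   = bus / 8 * gbps             # GB/s
    print(f"{name}: {fp32:.1f} TFLOPS, {tex:.0f} GT/s, {pix:.0f} GP/s, {bw:.0f} GB/s")
```

That works out to roughly 6.2 vs. 6.5 TFLOPS, about 195 vs. 200 GT/s, and 256 GB/s on both; the only big gap is pixel fill, around 43 vs. 108 GP/s in the 1070's favor.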
 
I have a suspicion. AMD has always focused more on memory bandwidth than NVIDIA, whereas NVIDIA tends to favor raw shader performance. NVIDIA's approach works well when its cards are released, but as they age and new games naturally start to use more and more VRAM, that weaker memory performance causes NVIDIA's cards to age less gracefully than AMD's.

If I'm right, we should see NVIDIA's performance track the amount of VRAM on its cards (at a given settings level) up to a point, beyond which the difference in shader performance starts to matter more.

Yeah, that could be the cause for the change.
 
Appreciate the benchmarks. Looking forward to the CPU testing, as I'm fairly certain the 1% lows are heavily CPU-bound even where we can see clear GPU-bound scaling in the average fps.
 