Nvidia RTX 3080 Gaming Performance at 1440p: CPU or Architecture Bottleneck?


Posts: 28   +21
The only people I REALLY recommend choosing Intel over AMD are the competitive, low-res, high-refresh gamers, where Intel's single-threaded advantage shows up. That may also be about to change; we'll see how Zen 3 plays out, and recommendations will adapt as new information emerges. THAT is how it's done.
Thanks. Many of these 'bottleneck' benchmarks are run at 1080p, where the frame rates are well into the hundreds of frames per second. I have an Intel machine and honestly, I wish I had gone with AMD, because about the only thing Intel is still better at these days is gaming, and with Zen 3 it looks like even that won't be the case anymore. When I built my latest PC I didn't do enough research; I just knew that my last AMD machine wasn't great at gaming compared to the Intel chips released at the time. I should have looked more closely into Ryzen, but I didn't, and that was my fault. Even the fact that I have to upgrade my motherboard to move to 10th gen is annoying; AMD typically allows more CPU generations to use the same motherboard. I believe the AM4 series motherboards will work with Zen 3, but I'm not 100% sure about that.

But you are correct: the difference between equivalent chips typically only shows up at 1080p, or maybe 1440p, and the frame rates are usually already over 100 fps when you see it. The more you push your GPU, the less of a problem the CPU usually is. The scary "bottleneck" is really meaningless unless you have a really old processor or you bought something that doesn't have enough cores for gaming. A 10% bottleneck at 1080p is probably half that at 1440p, maybe even non-existent.
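The reasoning above can be sketched as a toy model (all numbers are hypothetical, not benchmark data): the delivered frame rate is capped by whichever of the CPU or GPU is slower. The CPU cap is roughly resolution-independent, while the GPU cap falls as resolution rises, so the apparent CPU "bottleneck" shrinks or vanishes at higher resolutions.

```python
def delivered_fps(cpu_fps_cap, gpu_fps_cap):
    """Frame rate is limited by the slower of the two components."""
    return min(cpu_fps_cap, gpu_fps_cap)

def bottleneck_pct(cpu_fps_cap, gpu_fps_cap):
    """Performance the CPU leaves on the table, as a percentage of the GPU cap."""
    if gpu_fps_cap <= cpu_fps_cap:
        return 0.0  # GPU-bound: the CPU is not the limiter at all
    return round(100 * (gpu_fps_cap - cpu_fps_cap) / gpu_fps_cap, 1)

cpu_cap = 144  # hypothetical older CPU; roughly the same at every resolution
gpu_caps = {"1080p": 160, "1440p": 120, "4K": 65}  # hypothetical GPU limits

for res, gpu_cap in gpu_caps.items():
    print(res, delivered_fps(cpu_cap, gpu_cap), "fps,",
          bottleneck_pct(cpu_cap, gpu_cap), "% CPU bottleneck")
```

With these made-up numbers the CPU costs 10% at 1080p and nothing at 1440p or 4K, which is the shape of the effect being described.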


Posts: 53   +45
My 3440x1440p, 120Hz monitor begs to differ.
As do the Samsung G7 240Hz 1440p monitors. Clearly Sir Alex Ice doesn't realize that calling people dumb while asking a question fits the definition of irony quite well. 1440p at high refresh rates is just fine for 3000-series cards, and he clearly doesn't own one, yet talks about where they should or shouldn't be used.


Posts: 53   +45
It’d be interesting to see results but I think you’re probably right
It's true, swaggman nailed it exactly. At 1080p, CPU differences stand out noticeably with higher-end GPUs; at 1440p, those differences shrink to almost nothing, though they're still slightly measurable. At 4K the differences are gone and the GPU is the PRIMARY determiner of performance.

Every Core i7 from the first generation to the latest 10th generation, tested at 1080p and 1440p with an RTX 3080.

Resolution is the key to making CPU differences disappear: the higher the resolution, the lower the CPU's relevance.

Edit - For the record, I have owned three of the CPUs in this list: the i7 920, i7 2600K, and i7 7700K. After seeing this, and currently gaming on a 1440p@165Hz monitor, I'm more interested in upgrading to 4K@144Hz than I am worried in any way about my 7700K being paired with a 3090. At 1440p, these tests show a tiny performance delta between the 7700K and the 10700K. Almost nil, to be exact.

To take it a step further, if you watch the 1440p portion of these tests and extrapolate the roughly 10% improvement of the 3090 over the 3080, then a 1440p@165Hz monitor actually isn't so badly paired either: most of the time, newer games still push these GPUs below 165 fps once things like ray tracing and AA are stacked on top of each other. It's actually a perfect pairing in efficiency in some ways...
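A back-of-envelope version of that extrapolation (the frame rates are hypothetical placeholders, not measured results): scale an RTX 3080's 1440p frame rate by ~10% to estimate an RTX 3090, then check whether the estimate still sits below a 165 Hz refresh ceiling.

```python
def est_3090_fps(fps_3080, uplift=0.10):
    """Estimated 3090 frame rate, assuming a ~10% uplift over the 3080."""
    return round(fps_3080 * (1 + uplift), 1)

refresh_hz = 165
for fps in (110, 130, 145):  # hypothetical averages in demanding titles
    est = est_3090_fps(fps)
    status = "below" if est < refresh_hz else "at/above"
    print(f"3080: {fps} fps -> est. 3090: {est} fps ({status} {refresh_hz} Hz)")
```

Even at the top of that hypothetical range, 145 fps scales to about 159.5 fps, which is still under a 165 Hz ceiling, which is the point being made about the pairing.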

Last edited: