"CPUs Don't Matter For 4K Gaming"... Wrong!

I'm interested in how the 5800X3D compares in these tests, as it's marketed as the ultimate gaming CPU for AM4. We know how it stacks up against the 7800X3D, so that testing doesn't need to be redone; I'm just curious how it holds up at 4K, since this testing produced some interesting results.
TechPowerUp covered it here with an RTX 4090:

TLDR: The 5800X3D trails the 7800X3D by around 5% at 4K, but it's important to note that the gap is nearly 20% at 1080p. IMO, that's most likely down to the faster RAM on the 7800X3D and the minor IPC uptick with the Ryzen 7000 series.
 
Half of this comments section needs to be submitted to r/whoosh. But seriously, do you think the 3700X would be much faster than the 3600 in this testing? How about... a little bit faster; relative to the 7800X3D, a TINY bit faster.
I mean, you're not wrong, the CPU does matter for 4K gaming; of course you need enough CPU power to have a good experience. But I feel that going to the extreme and using a Ryzen 3600 loses the nuance of the argument.

When people say "CPU doesn't matter for 4K", obviously they mean "within reason". Nobody would reasonably build a Ryzen 3600 system today. But if someone is considering a Ryzen 7800X3D for 4K gaming, chances are the Ryzen 7600 will give them pretty much the same experience as the 7800X3D for half the price. And that's the information that is useful to have, because it informs actual, practical purchasing decisions made today.

I think this article would be better with the 3600 + 5600 + 7600 + 7800X3D, for example. That way it still makes the point that the CPU matters and you can't go too low-end, but it also shows that there are diminishing returns between the mid-range and the high-end.
 
Unfortunately you've missed the point of the content. The point is the "CPUs Don't Matter For 4K Gaming" argument is wrong, not just because CPUs can make a difference at 4K, but MORE because it's the wrong way of looking at CPU performance.

Rather it's about how many fps your CPU can drive at a low resolution coupled with how many fps your GPU can deliver at your target resolution, and then of course the quality settings you're willing to downgrade in order to achieve your target resolution/frame rate.

If you're happy playing the next greatest game at whatever frame rate just to enjoy the best visuals at 4K, then sure, the CPU probably doesn't matter to you, though modern games, especially with RT enabled, are becoming increasingly CPU demanding. But if you're happy with sub 60 fps performance then that's fine I guess.

As for the Ryzen 5 7600, there would be next to no difference between the 7600 and the 7800X3D at 4K in the games just tested, as I said in the article. The 7800X3D is just 15% faster on average than the 7600 at 1080p! It also costs 95% more, so you're not tossing up between the 7600 and the 7800X3D. But if you have the mentality that "CPUs Don't Matter For 4K Gaming", then you might still be using a Ryzen 5 3600, and BTW it's generally people using much older hardware who tell us this.
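To sketch that framing in rough terms, here's a minimal Python illustration: your achievable frame rate is roughly capped by whichever ceiling is lower, the CPU's (measured at a low, CPU-limited resolution) or the GPU's at your target resolution. The absolute fps figures below are hypothetical placeholders, not numbers from the article; only the ~15% CPU gap mirrors the figure quoted above.

```python
# Rough model of the CPU-ceiling vs GPU-ceiling framing described above.
# All fps numbers are hypothetical placeholders, not figures from the article.

def effective_fps(cpu_ceiling_fps, gpu_ceiling_fps):
    """Achievable fps is roughly capped by the slower of the two components."""
    return min(cpu_ceiling_fps, gpu_ceiling_fps)

# Hypothetical CPU ceilings measured at 1080p (CPU-limited), ~15% apart:
cpu_ceilings = {"Ryzen 5 7600": 150, "Ryzen 7 7800X3D": 172}

# Hypothetical GPU ceiling for the same game at 4K (GPU-limited):
gpu_ceiling_4k = 110

for cpu, ceiling in cpu_ceilings.items():
    print(f"{cpu}: ~{effective_fps(ceiling, gpu_ceiling_4k)} fps at 4K")
# Both CPUs land on the same ~110 fps at 4K because the GPU is the cap;
# the 15% CPU gap only reappears once the GPU ceiling rises above the CPU's.
```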
 
The GPU of choice is wrong! The 4090 is such a beast that any current CPU will bottleneck it. It's way ahead of current-gen processors. Next-gen Intel (15th gen) and AMD Ryzen 9000 will prove I'm not wrong. Just look at this chart:
[Chart: average fps at 3840x2160]


The better the CPU, the more fps, even at 4K.

Link to the article: HERE
I guess by now we all know the 13900 and 14900 are basically the same CPU, right?
 
The 7700 XT and 7900 XT were also included...
 

Thanks for pointing that out; I wasn't able to notice it in the forum section. And yes, all components used in a build matter, not only the CPU and GPU. The thing is to pair them all so you get optimal performance, according to your gaming needs and budget.



 
Imagine writing an entire “but actually” article and missing the point entirely.

No one ever said CPUs didn't matter for 4K gaming - they said they mattered LESS - i.e., if you're on a specific budget for a 4K rig, it makes more sense to spend the money on the GPU once you've budgeted for a baseline good-enough CPU.

Obviously no one with a fixed budget should get a 4080 with a 3600. But at the same time, getting a 4060 with a 7800X3D makes no sense either. This is all about balance, and for 4K gaming the balance should very much favour the GPU.

Please think through the reasoning behind statements before writing an entire article with benchmarks next time! You can save the effort.
 
As I was explaining just before, if you play at 2160p it's because you crave image FIDELITY. Which means you will likely use max settings AT ALL TIMES, which also means your CPU will have almost no impact on your framerate at 2160p unless you have a 4090.

Ray/Path Tracing has entered the chat.


 
Imagine writing an entire “but actually” article and missing the point entirely

I couldn't agree more.

In particular, starting with the word 'Wrong!' is misleading if your argument is based on an obsolete five-year-old CPU and just three curated games.

The 'Wrong' argument implodes with a tiny 7500F or 13400F, probably even with a 5600(X)/5700X. Yes, a CPU DOES matter for 4K if you pair a modern GPU with a Ryzen 1600(X) or i5-7500. You will get fewer frames, even at 4K with full eye candy. If you stick to the cheapest modern CPU, not so much. Duh.

The article underlines the argument it seeks to counter. I had high hopes for an interesting article after reading the headline. And surely, a lot of work from Steve went into it. I know that and I appreciate it. But unfortunately, there is nothing counterintuitive going on here. Move along, 4k gamer, nothing to see here.
 
The testing completed and published by the author appears to undermine his own title. So you can play games at 4K60 on a six-year-old mid-range budget CPU? I actually did not think that was possible in 2024. So my conclusion from the data presented is that CPUs actually matter less than I thought for 4K gaming. Personally.

 
I have an R5 3600 and recently upgraded from a 5700 XT to a 7900 XT for $600. I had planned on a 7800 XT. Now I see the R7 5700X3D for $200. I already knew a faster, newer CPU would be better for higher-tier GPUs, but I didn't know by how much. I have a better idea now than I did. Despite what some say here, and setting aside the fact that I just happen to have two of the pieces of hardware that were tested, it was interesting to see just how much the GPU/CPU pairing mattered, especially at 2K.
 
I understand the theory of this test, but I would like to see a similar test with a slightly weaker GPU, just to see the results. Obviously you are not going to pair a 3600 with an RTX 4090 (no normal person would). I have seen games that would push a strong CPU to its limit even at 1080p; an example would be Cities: Skylines with mods.

What I would like to see is this test with something like an RTX 3060 Ti at 1440p with the same CPUs. I know you won't pair a 3060 Ti with a 7000-series CPU other than maybe a 7600X. It would just be interesting to see what an older GPU with a strong CPU will do at resolutions higher than 1080p.
 

Read a review of the 7600X and see how it does in games when not restricted by the GPU. You now see its capability.
Read a review of the 3060 Ti to see how it does in the same games when not restricted by the CPU. You now see its capability.

The lower fps number will be the restriction/bottleneck, and when combined in the same system the total fps will likely be a little lower than the lower number, by 10% or so.
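As a rough back-of-the-envelope sketch of that rule (the fps inputs are placeholders you would pull from the respective reviews, not real benchmark figures, and the ~10% factor is just the rule of thumb above):

```python
# Bottleneck estimate described above: take the CPU's fps ceiling from a CPU
# review (GPU-unconstrained) and the GPU's fps ceiling from a GPU review
# (CPU-unconstrained); the combined system usually lands a bit below the
# lower of the two. The 10% overhead factor is a rule of thumb, not a law.

def estimate_system_fps(cpu_review_fps: float, gpu_review_fps: float,
                        overhead: float = 0.10) -> float:
    """Estimate combined fps as the lower of the two ceilings, minus ~10%."""
    return min(cpu_review_fps, gpu_review_fps) * (1.0 - overhead)

# Placeholder numbers for illustration only (not taken from any review):
print(estimate_system_fps(cpu_review_fps=160, gpu_review_fps=95))  # ~85.5 fps
```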
 
I must defend the article writer. You don't need to do these endless tests with different CPUs, different GPUs, and different games, as some claim. They demonstrated that some games are not affected by the CPUs used in the test, and some are. Some people are not members of the "PC Master Race" and will indeed pop a newer GPU into a system with a chip this age -- it's not THAT old.

Really, if a game is CPU-efficient it can handle higher FPS before the CPU maxes out (making it more likely to hit the limit of the GPU). If a game is heavy on shaders, effects, and ray tracing (if you have that on), it puts more load on the GPU and it's easier to hit full GPU load. Some games are... I think the kind term is poorly optimized, so they'll need all the help they can get.

I can say, with my GTX 1650 and 720p gaming, the Ivy Bridge I had before would easily peg the CPU before the GPU hit 100% load in most games (most were not even hitting 50% -- obviously this is with vsync off and the 60 FPS cap off, since naturally GPU load might be low if it's easily hitting the FPS limit). GravityMark was about all that could hit 100% GPU load. CP2077 was only hitting something like 20% load on low, so I turned the quality up to high -- only a 1 FPS drop, but GPU load went up to 80% (and the game looked a lot better). My Coffee Lake has significantly better per-core performance and 6C/12T instead of 4C/4T (and I think a newer PCIe version, so it may get better GB/s to/from the GPU too). It has a much easier time pegging out the GPU.

This is the same in principle -- 4K (3840x2160) pushes about 9x the pixels of 1280x720, while the RTX 4090 is roughly 8x the speed of the GTX 1650. These Ryzens are no slouches, but I'm not surprised they still found a few games where the CPU pegs out first.
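A quick sanity check of that pixel-ratio arithmetic:

```python
# Pixel counts: 4K vs 720p.
pixels_4k = 3840 * 2160      # 8,294,400
pixels_720p = 1280 * 720     #   921,600
print(pixels_4k / pixels_720p)  # 9.0 -> 4K pushes ~9x the pixels of 720p
```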

Personally -- I think it's good to know and good for people to keep in mind. That said, I personally wouldn't sweat getting 75 versus 90 FPS -- I do run my monitor at a 75 Hz refresh, and most games either hit that or have their own 60 FPS cap they hit; but honestly if I hit 30 FPS that's fine with me (only really an issue with things like TLOUI -- which manages to peg out the GPU even on low settings). I wouldn't waste the money putting a 4090 in my current rig, but I also wouldn't sweat putting a newer card in and having a few games only get the GPU to like 75% load instead of 100%, as long as the FPS is decent.
 
I think this is a very interesting and important article; some might claim that everyone plays at max settings, but that just isn't true.
At 4K your image already looks crisp. I'd rather have 120 fps at 4K medium than 45 fps at 4K ultra, so yeah, this is very good info to have. I would encourage reviewers to add 4K medium and low results and their fps. Very nice review, ty!!
 