"CPUs Don't Matter For 4K Gaming"... Wrong!

This is a bad comparison. CPUs don't matter much for 4K, so long as you have a decent CPU, which the 3600 really isn't. Even just a 3700X over the 3600 would have made a huge difference. Also, the CPU doesn't really matter for 4K unless you have a very powerful GPU. If you test the most powerful GPUs on the market with a $50 CPU from about five years ago, yeah, it's obviously going to be a bottleneck.
 
3600? Really? Not even a dirt-cheap 5600X? This is a pointless comparison. Anything above a 5600X only matters if you have an $800+ modern GPU.
In the past week I've seen comments like "I'm upgrading my GTX 1080 to a 5090." So the R5 3600 is probably better than whatever CPU that person has.
Here is TechPowerUp's 4K comparison of the 7800X3D vs. other CPUs. The 7700X shows roughly a 5% performance deficit, which is significant. The upcoming 5090 is likely going to leave a lot of gamers CPU-bottlenecked at 4K.
 
The upscaling argument is legit. Even those mad or rich enough to buy a $2,000 GPU will find they can't do 4K60 at full native high settings in most modern titles, apart from older games and a handful of pleasant exceptions.

You will be using upscaling, and at the very least you'll want to maximise the frame rate when doing so. As such, it's wise to pair the GPU with a CPU that is less likely to leave performance on the table.

All of the possible combinations of hardware and software make PC gaming performance a very volatile thing to predict. The handful of charts and measures we see in the reviews can only be indicative, and won't cover every scenario we come across in real world usage.
 
First off, people playing at 2160p, like me, don't play at low or medium settings. They play at max settings. So your data is not representative of a REAL use case.

Second off, if you look at your data, the only way to be CPU bound while using max settings is with a 4090.

Lastly, you've just proven that if you don't use a 4090, you will likely see barely any improvement at 2160p with a better CPU, so I don't understand the point of this article at all, since EVERY time you ran CPU benchmarks you LITERALLY disregarded any feedback requesting 2160p results on the grounds that you would be GPU bound. And now you're doing a 180?

I can't follow your logic anymore.

The fact of the matter is that if you play games at 2160p with a high-end GPU:
  1. You will most likely play at close to maximum graphical settings
  2. The title you are going to play will be GPU bound
Depends on the game. Online games I play at high/medium @ 4K. Single-player games are high and very rarely ultra, as there isn't a huge difference, if any. I'd rather have the eye candy than 100+ FPS. I'm happy with 60 FPS.
 
It is true that high FPS and RTX on a $2,000 modern GPU need a faster CPU than a five-year-old low-to-mid-range part that cost only about $200 at the time. But even so, without RTX, it holds up quite well in this extremely unbalanced setup.
 
The OP was right to feel that the selection of the Ryzen 3600 was off.
To me, the article proved that the CPU doesn't matter for 4K gaming (within reason). If you really want to simplify the subject, that's my conclusion.
One has to be computer illiterate to have a PC with an RTX 4090, an AM4 mobo, and a Ryzen 3600 in it. The very least a person with that mobo and GPU would do is upgrade to a Ryzen 5600.
The whole article is set up to disprove an oversimplified statement, does only a half-decent job of it, and declares victory. The very minimum would have been to include the R5 5600. That would have been much more meaningful than spending time on tests whose results aren't even included, though probably less satisfying for the OP.
 
I think a lot of you are making arguments against things that weren't tested or brought up in this article. The test was whether the statement "CPUs don't matter for 4K gaming" holds up, and I think they proved that it can. Granted, every game chosen utilizes the CPU quite heavily, but it's still clear that a CPU can matter in 4K gaming.

Now, does that necessarily make the statement "CPUs don't matter as much in 4K gaming" false? No. But that's another test.
That's dumb though. Obviously the CPU is going to matter at some point. They might as well have used a Q6600 if they just wanted to show that it's possible to create a CPU bottleneck. This is nowhere near a realistic scenario, so it's useless data. The question they posed was basically taken out of context, because any sane person would assume you're talking about a realistic scenario when asking if the CPU matters for 4K. The answer is still that it generally doesn't matter, assuming you're talking about a reasonable set of hardware to begin with. Really, no one should be using a CPU with fewer than 16 threads for modern gaming. The 3600 doesn't even have as many threads as a current-gen console.
 
Good article! I'm surprised people would think the CPU doesn't matter for 4K gaming, but this article definitely shows that indeed it does.

One thing I will note: if you are using Linux, as I daresay quite a few gamers are these days, I've read the Nvidia driver is noticeably more CPU-intensive than Mesa Gallium (used by AMD and Intel GPUs). I have no idea if that's also the case on Windows, but it's something to keep in mind if you're considering skimping on the CPU for an Nvidia-based Linux gaming rig.

I'll also just note that with games like CP2077 and The Last of Us Part I both being quite resource-hungry, especially at higher settings, and (in the case of TLOU Part I) a rather poorly optimized port, you may regret skimping on the CPU in a few years if games gain even more CPU-intensive higher-end settings, or are poorly ported (so they need more CPU time than they "should"), or both.
 
Indeed, a 4090 on a PCIe Gen 3 link... maybe the test should be done with a CPU that has at least PCIe Gen 4.
 
This is a clever article, and I do think the upscaling argument is its strongest play, as that has become a popular way to squeeze out some extra performance in a GPU-limited scenario, but people don't generally think about suddenly bumping into a CPU-limited scenario by enabling it.

As mentioned here in the comments, the traditional thinking was that if you are CPU limited, you pile on the image quality features (including VSR/DSR) until you shift back to a GPU-limited scenario with a much nicer-looking game.

Story time: I built a spare Steam PC for the kids out of some old parts I had lying around, a 2200G + 980 Ti combo, a reasonable pairing given the vintage of both parts. The kids wanted to play Hogwarts Legacy and, gawd damn, the system managed to run it at 1080p FSR Performance (so 540p internal resolution), but it was sort of stuck between 20-40 FPS unless you were staring at a wall.

The same lightbulb went off in my head: I went out and snagged a 5600X as a nice, dirt-cheap drop-in replacement. Same settings, and the system now does 30-60 FPS, staying on the 60 end much more than the 30 end.

Essentially the exact same experience with some age-shifted hardware.
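
In case anyone wonders where the "540p internal" figure above comes from, here's a minimal sketch of the upscaler arithmetic, assuming the usual FSR 2 per-axis scale factors (the exact ratios can vary by game and upscaler version):

```python
# Rough upscaler arithmetic: each preset renders at the output resolution divided
# by a per-axis scale factor (typical FSR 2 values shown; treat as approximate).
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output and preset."""
    scale = FSR2_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)  -> the 540p case above
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080) -> 4K upscaled is a 1080p render
```

The second print is really the article's point: "4K with Performance upscaling" asks the GPU to render 1080p, so the CPU starts mattering at roughly the frame rates you'd expect from 1080p.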
 
Oh yeah, I have a Coffee Lake (i7-8700, 6C/12T) now, but I had been running an Ivy Bridge (i5-3470, 4C/4T) with a GTX 1650 GPU. I fired up CP2077 and it would get like 30 FPS on low (on the benchmark where he walks through the bar, etc.; it was generally a bit higher in real play, with no resolution scaling, since I'm running a 1280x1024 monitor on there anyway). Then I looked at "nvidia-smi -l" (I could have gotten the same info via the GUI) and saw GPU utilization was only around 30%! Indeed, I cranked most of the settings up to high, got GPU utilization to about 90%, and I think FPS dropped MAYBE 1-2 FPS, but it sure looked nice!

On the Coffee Lake, on high it of course just pegs the GPU (about 35 FPS, ranging 30 to 40, at 98% utilization), but given that on low the old CPU was getting 30 FPS at 30% utilization, I suppose it would probably be pushing close to 100 FPS on low now. I haven't played it in a while, but it's not exactly a twitch game (and I'm not a member of the "PC Master Race"; I don't care about running at 120 or 240 FPS. 60 FPS is better, but people got by for decades playing at 30 FPS). So I'll probably keep it on high and enjoy the view if and when I get into some gameplay on it again.
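
If anyone wants to do the same sanity check without the GUI, here's a minimal sketch that just polls the GPU utilization nvidia-smi reports while you play; a low number paired with a low frame rate is the classic sign you're CPU-limited rather than GPU-limited. It assumes nvidia-smi is on your PATH and only reads the first GPU listed.

```python
import subprocess
import time

# Ask nvidia-smi for the utilization percentage only, one value per GPU.
QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu",
    "--format=csv,noheader,nounits",
]

def gpu_utilization() -> int:
    """Return the current utilization (percent) of the first GPU."""
    out = subprocess.check_output(QUERY, text=True)
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    # Poll once per second; watch this while the game is running.
    while True:
        print(f"GPU utilization: {gpu_utilization()}%")
        time.sleep(1)
```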
 
I just fear that all those folks not giving a * and pairing their 4090 and their fancy 4K OLED setup with an old Ryzen 5 3600 or slower won't even bother.

Or maybe it's just one reckless guy ;)

To this crazy mf I say: the time has come, my friend. Upgrade to a 7500F with 16GB of 4800 MHz RAM, so this fall your brand-new RTX 5090 has a decent partner for the next five years. And relegate the old 4090 to your second 4K backup rig, where your trusty i7-4790K awaits. Did you read the article? Don't you get the point? Do you still want to be able to look at yourself in the mirror? Go buy the 7500F, for crying out loud :)
 

Half of this comments section needs to be submitted to r/whoosh. But seriously, you think the 3700X would be much faster than the 3600 in this testing? How about... a little bit faster, and relative to the 7800X3D, a TINY bit faster.
 
First off, people playing at 2160p, like me, don't play at low or medium settings. They play at max settings. So your data is not representative of a REAL use case.
spot on
 
Once again: testing the CPU at high resolutions is pointless; the inherent GPU bottleneck will cause most mid-tier and higher CPUs to produce the same FPS.

That's why you test CPUs at lower resolutions; the point isn't to see what FPS they output, but to see how much headroom they leave for a future GPU upgrade.

I remember having this same argument with people back when Bulldozer was released, where people seriously argued the FX-4000 series were just as good as the FX-8000 (and even Sandy/Ivy Bridge) because "they performed the same in high-resolution gaming". And while Bulldozer did not age well AT ALL, at least the 6000 and 8000 were usable, unlike their little cousin.
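
To make the "headroom" point concrete, here's a toy model (my own simplification, not anything from the article): the frame rate you actually get is roughly the lower of what the CPU can feed and what the GPU can render at your resolution and settings. Low-resolution CPU tests are just a way of measuring the first number.

```python
# Toy bottleneck model: delivered FPS is capped by whichever side is slower.
# The numbers below are made up purely for illustration.
def delivered_fps(cpu_ceiling_fps: float, gpu_fps_at_settings: float) -> float:
    return min(cpu_ceiling_fps, gpu_fps_at_settings)

# A CPU that tops out around 90 FPS looks "fine" while today's GPU only manages
# 60 FPS at 4K ultra...
print(delivered_fps(90, 60))    # 60 -> GPU limited, the CPU "doesn't matter"

# ...but the same CPU caps a faster future GPU (or the same GPU with upscaling).
print(delivered_fps(90, 140))   # 90 -> now the CPU is the limit
```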
 
I'm a bit late to the thread, but it still seems to be the case from the data presented that pushing high graphical settings at 4K largely produces performance parity across different CPUs.

If whatever you're playing utilises the GPU to 100% at 4K and it's an "ancient" Ryzen 3000 series driving it, then upgrading the CPU will do little to improve your FPS. Possibly marginal increases in the 0.1% lows with a newer CPU and faster RAM, but nothing groundbreaking.

Money is still better spent on the fastest GPU you can fit within your budget.
 
As long as there is a significant performance delta, around 5%, it's acceptable testing. Also, the unknown here is the 0.1% lows, which improve with the 3D V-Cache CPUs along with less frame-time variance. So even with a mid-range CPU, which would be the R5 7600, or a somewhat higher-end one like the R9 7900, there is definitely a CPU bottleneck with the current lineup, at roughly 7% and 6% performance loss at 4K versus the 7800X3D. If these CPUs are showing a CPU bottleneck here, then you can extrapolate this to your specific use case: if you don't have a 4090 you still have some headroom, but if your future GPU is a better performer than a 4090, you should see significant performance deltas again, even without knowing the 0.1% lows.
 
Surely a better title would be "To bottleneck or not to bottleneck", since most people would just look at this CPU performance metric to decide whether or not to upgrade to a more powerful CPU. If a CPU doesn't bottleneck, moving to a more powerful CPU will gain Joe Average nothing.
 
Do I think 25% more threads would help with a CPU bottleneck? Yes, absolutely. There is a huge difference between a 3600 and a 3700X in CPU-limited scenarios.

The only time a 3700X does poorly is if a game is both very demanding AND poorly optimized. A well-optimized game will perform much better on a 16-thread CPU than a 12-thread one. In a game like Cyberpunk, which makes great use of more threads, there would be a very significant difference.
 
Almost all games are primary-thread limited; it's been this way forever. Also, I think it's a 33% increase (the 3600 has 25% fewer threads than the 3700X); generally we see around a 10% improvement, but of course it does depend on the game. The point here being that relative to the 7800X3D, it makes bugger all difference.
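
Spelling out that thread-count arithmetic (the 6C/12T and 8C/16T figures are the published specs; the ~10% gaming uplift above is the part that actually varies by game):

```python
# Ryzen 5 3600: 6 cores / 12 threads; Ryzen 7 3700X: 8 cores / 16 threads.
threads_3600, threads_3700x = 12, 16

more = (threads_3700x - threads_3600) / threads_3600    # going 3600 -> 3700X
fewer = (threads_3700x - threads_3600) / threads_3700x  # going 3700X -> 3600

print(f"3700X has {more:.0%} more threads than the 3600")    # 33% more
print(f"3600 has {fewer:.0%} fewer threads than the 3700X")  # 25% fewer
```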

First off, people playing at 2160p, like me, don't play at low or medium settings. They play at max settings. So your data is not representative of a REAL use case.
spot on

Not everyone plays like you...
 
That's not really true at all. Many games are well optimized to take advantage of many threads, typically the more demanding games that need the extra CPU power, like Cyberpunk or the BF games. Of course there are exceptions, but in general, the more demanding a game is, the better it will make use of more threads.
 
You didn't read the article...

"Now, we can already envision comments claiming that the data here is misleading because the Ryzen 5 3600 is so much slower than the 7800X3D, and if that's how you feel, we're sorry to say it, but you've missed the point. Yes, we did use an extreme example to illustrate the point we're trying to make, but the point is, it's the frame rate that's important, not the resolution."

However, I don't agree with Steve that the only way to fall into CPU-bound situations at 2160p is by either using a 4090 or using low or medium settings.

Like I was explaining just before, if you play at 2160p, it is because you crave image FIDELITY. Which means you will likely use max settings AT ALL TIMES, which also means that your CPU will have almost no impact on your frame rate at 2160p unless you have a 4090.
I don't think that's a fair assumption. I like high FPS, so I tend to play on a mixture of high and medium settings at 4K.
 
I'm interested in how the 5800X3D compares in these tests, as it's marketed as the ultimate gaming CPU for AM4. We know how it stacks up against the 7800X3D, so that testing doesn't need to be redone; just how it holds up at 4K, as this testing provided some interesting results.

 