Ryzen 5 5600X vs. Ryzen 7 5800X vs. Ryzen 9 5900X vs. 5950X: GPU Scaling Benchmark

131dbl

Posts: 28   +4
Although the results are valid, the value of this test is precisely zero.

Playing on a PC differs from playing on a console - you can have 50 tabs open in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe a download going for later. For example, if I die in a multiplayer match I may jump to Windows, browse the internet and move between multiple apps. How will the smoothness of the system be impacted with all these things running simultaneously? Nobody knows.

The above is exactly why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know an 8-core CPU will help, but because it will mitigate the potential problems tremendously.

Two problems with your statement.

One, open apps may or may not have any effect on a game. For instance, having 20 tabs open in a browser will have a TINY effect if some of the tabs are actively updating the page, but the work on the CPU is TINY. It does, on the other hand, load up memory. I could go on with the other apps you pointed out and it's the same: they ONLY have an effect if they are actually doing work. Otherwise they're mostly just taking up space in memory. That could end up being a problem.

Point two. WHO DOES THAT?

When I'm playing a game, I'm not listening to some tunes or blah blah blah. I have everything I don't need shut down. I don't stream, so a 6-core CPU is fine. But I build systems to last a few years, and various game companies have said they're working on improving CPU usage for higher-core-count CPUs, so I got a 5800X. There's also other work I do where a 5800X is better than the 5600X. I could get more benefit from the 5900X, but that work is only a few hours per month, and it's not worth spending any more money than what I paid for the 5800X.
 

jpuroila

Posts: 319   +180
Although the results are valid, the value of this test is precisely zero.

Playing on a PC differs from playing on a console - you can have 50 tabs open in your web browser, music playing through Spotify, Tidal, etc., the Discord app running, maybe a download going for later. For example, if I die in a multiplayer match I may jump to Windows, browse the internet and move between multiple apps. How will the smoothness of the system be impacted with all these things running simultaneously? Nobody knows.

The above is exactly why somebody would consider a 5800X or 5900X over a 5600X "for gaming". Because who just turns on the PC, starts up the game, and does nothing else? Not because we know an 8-core CPU will help, but because it will mitigate the potential problems tremendously.
If you're streaming or running some other heavy(-ish) workload in the background, that's one thing, but having Spotify, a web browser, etc. open in the background shouldn't even come close to maxing out a single core.

An extra couple of cores wouldn't hurt for the <1% of gamers out there who stream. For the other 99%, Steve's numbers are relevant.

As another anecdote to add to the posts above: my kid wanted to record some gameplay and upload it to YT (not stream, just record), and you know what? A crappy, low-end, slot-power-only GTX 1050 Ti with NVENC actually did a damn good job in Fortnite at 900p, 60 fps locked, once the settings were adjusted (an undervolt and a small underclock) so the card wasn't trying to draw too much power from the slot. I swapped in a GTX 1060 6GB since we were asking quite a bit from that little guy, and now everything just works, no adjustments needed.

So those extra cores must only be for the few streamers out there.
Since YouTube already compresses the crap out of whatever you post there, you really want to start with the highest-quality encoding you can manage. That means software encoding. Obviously that doesn't matter when you're a kid who just wants to upload a clip for friends, but actual YouTubers allegedly care quite a bit.
 
Two problems with your statement.

One, open apps may or may not have any effect on a game. For instance, having 20 tabs open in a browser will have a TINY effect if some of the tabs are actively updating the page, but the work on the CPU is TINY. It does, on the other hand, load up memory. I could go on with the other apps you pointed out and it's the same: they ONLY have an effect if they are actually doing work. Otherwise they're mostly just taking up space in memory. That could end up being a problem.

Point two. WHO DOES THAT?

When I'm playing a game, I'm not listening to some tunes or blah blah blah. I have everything I don't need shut down. I don't stream, so a 6-core CPU is fine. But I build systems to last a few years, and various game companies have said they're working on improving CPU usage for higher-core-count CPUs, so I got a 5800X. There's also other work I do where a 5800X is better than the 5600X. I could get more benefit from the 5900X, but that work is only a few hours per month, and it's not worth spending any more money than what I paid for the 5800X.

Me? I previously had a 4-core, albeit one with strong single-core performance. And do you know what happened when I switched to a 12-core? A goddamn Christmas miracle.
Simply alt-tabbing out of a heavy multiplayer game and then bouncing around the internet was suddenly instant, with no waiting for something to unfreeze itself. Of course, this is an extreme example (4-core to 12-core).
But my point stands: you can say whatever you want about the impact of this app or that app on CPU performance, my experience has become a whole lot more fluid.
The question is, would my experience be the same with a 6-core? I don't know. And this test doesn't cover that question.
 

Athlonite

Posts: 214   +74
Something's not right. I got way better FPS at 1080p Ultra in CP2077 with my Sapphire Pulse RX 5700 8GB (86 fps average with 1% lows of 67) than you show for the RX 5700 XT. I also get better fps with my Sapphire Nitro+ RX 6800 16GB OC (137 fps average with 1% lows of 110) than what you're showing here for the 3070. Same setup as with the RX 5700, which is:
Asus Strix X570-F Gaming
AMD R7 3700X @ 4275MHz all-core, 1.250V
16GB DDR4 TridentZ RGB 3200MHz
Adata XPG SX8200 Pro 1TB NVMe (the good one, not the one with the bait-and-switch **** controller)
Sapphire Nitro+ OC RX 6800 16GB, no OC other than what the GPU comes with and a slightly better fan curve
 
PLEASE CLARIFY FOR THIS OLD MAN...

For those focusing primarily on flight sims, and combat flight sims more specifically (DCS, IL-2, etc.), I have always read these games are CPU-intensive/limited, or whatever the term is. Please help me understand why the quote highlighted in the summary section was presented as a definitive fact... unless, of course, it is.

In any case, I've been consistently told flight sims are more CPU-demanding and CPU-dependent than GPU-dependent (throw RAM into the mix as well). Without a strong/fast CPU you wouldn't be able to run DCS, for example, with all the AI aircraft, scenery, clouds, landscapes, etc. being rendered. In short, the CPU is critical. In fact, the emphasis with these games in particular is that they only utilize one core; they may max that one core out, but they only use one, so 6-12+ cores aren't going to improve these games (am I wrong?).

As I alluded to earlier, my confusion occurred when I read the emphasized quote in the summary/conclusion section of this article. See below (not verbatim, but close):

"Doesn't matter how fast your GPU is, could be an RTX 3090, it will not be compromised by a Ryzen 5 5600X (6-core/12-thread) CPU."

That quote was emphasized as a definitive conclusion, which seems misleading. I realize it's based on the averages taken and discussed; however, from my admittedly limited understanding, the data itself doesn't support that conclusion. Shouldn't the conclusion have been phrased something to the effect of:

"Games which are CPU-intensive/limited will only be impacted by the CPU when the game is played at 1080p. At 1440p or 4K, the CPU dependence of so-called CPU-intensive games evaporates, giving rise to a nearly exclusively GPU-dependent game."

In other words, would it not be best to emphasize the resolution as the break point, rather than the CPU or GPU? On a personal note, concerning flight sims, don't they in truth rely heavily on a robust CPU AND GPU, as well as at least 32GB of RAM at 3200-4000MHz, at any resolution?
 

Gastec

Posts: 102   +46
Perhaps these Radeon RX 5700 XT and 5600 XT results will now please Nvidia, and they will decide to resume their collaboration with TechSpot and Hardware Unboxed. Have they tweeted about it yet?
 

131dbl

Posts: 28   +4
Me? I previously had a 4-core, albeit one with strong single-core performance. And do you know what happened when I switched to a 12-core? A goddamn Christmas miracle.
Simply alt-tabbing out of a heavy multiplayer game and then bouncing around the internet was suddenly instant, with no waiting for something to unfreeze itself. Of course, this is an extreme example (4-core to 12-core).
But my point stands: you can say whatever you want about the impact of this app or that app on CPU performance, my experience has become a whole lot more fluid.
The question is, would my experience be the same with a 6-core? I don't know. And this test doesn't cover that question.
Yes, there's a difference between 4- and 6-core CPUs, even for running GAMES. And if you're using a 4c/4t CPU, those have already been shown to be slower and to stutter in certain games, so you're comparing extreme cases. You didn't have to jump up to a 12c/24t CPU to notice a big improvement; you would have seen that simply by going to a 6c/12t CPU.

This comment thread is about the 5600X, 5800X and 5900X, and there was no reason to throw in the 5950X, but he did anyway. As he showed, in gaming the 6c/12t CPU gave slightly slower performance than the other CPUs, and any of these can be paired with the best GPU for gaming. When you start throwing in other use cases, you are outside the realm of these comments and the video.

So I'm going to stick with what I said, assuming you're using a 6c/12t CPU at a minimum, just like this video was about. Once again: if you have apps loaded but they aren't ACTIVELY running a task, they sit in memory without using CPU cycles, so having 20 tabs open in a browser makes ZERO difference, UNLESS they are actively updating data, as some websites do. If you're running other tasks ACTIVELY in the background, well, that's just not a smart way to game, and it doesn't matter how many cores you have, because at times the apps are going to be competing for the same resources. So, once again, my second point: almost no one does that. When people load games, they stop all active programs so there are no resource conflicts.

Maybe your first comment should have been clearer about what you're doing, since it's outside the realm of what the video is talking about.
 
Yes, there's a difference between 4- and 6-core CPUs, even for running GAMES. And if you're using a 4c/4t CPU, those have already been shown to be slower and to stutter in certain games, so you're comparing extreme cases. You didn't have to jump up to a 12c/24t CPU to notice a big improvement; you would have seen that simply by going to a 6c/12t CPU.

This comment thread is about the 5600X, 5800X and 5900X, and there was no reason to throw in the 5950X, but he did anyway. As he showed, in gaming the 6c/12t CPU gave slightly slower performance than the other CPUs, and any of these can be paired with the best GPU for gaming. When you start throwing in other use cases, you are outside the realm of these comments and the video.

So I'm going to stick with what I said, assuming you're using a 6c/12t CPU at a minimum, just like this video was about. Once again: if you have apps loaded but they aren't ACTIVELY running a task, they sit in memory without using CPU cycles, so having 20 tabs open in a browser makes ZERO difference, UNLESS they are actively updating data, as some websites do. If you're running other tasks ACTIVELY in the background, well, that's just not a smart way to game, and it doesn't matter how many cores you have, because at times the apps are going to be competing for the same resources. So, once again, my second point: almost no one does that. When people load games, they stop all active programs so there are no resource conflicts.

Maybe your first comment should have been clearer about what you're doing, since it's outside the realm of what the video is talking about.

Yes, I think I should have worded my first post differently too. How nimble the CPU is when jumping from task to task (gaming, web, etc.) is important to me.

The decision between 6/12, 8/16 or 12/24 was forced by circumstances. I am not going to buy a 6/12 when the consoles are 8/16. I made that mistake with a 4/4.

So it was either a 5800X for 500€ or a 5900X for 600€. Not until hell freezes over will I buy an 8/16 for 500€. And only a 100€ difference between 8/16 and 12/24? Sadly, there was nothing to think about.
 

131dbl

Posts: 28   +4
Yes, I think I should have worded my first post differently too. How nimble the CPU is when jumping from task to task (gaming, web, etc.) is important to me.

The decision between 6/12, 8/16 or 12/24 was forced by circumstances. I am not going to buy a 6/12 when the consoles are 8/16. I made that mistake with a 4/4.

So it was either a 5800X for 500€ or a 5900X for 600€. Not until hell freezes over will I buy an 8/16 for 500€. And only a 100€ difference between 8/16 and 12/24? Sadly, there was nothing to think about.

I approached the purchase differently. It was: no way in hell am I going to buy a 12c/24t CPU when I know the work I do never requires that much power, so why waste $100 USD? So I bought a 5800X. If you watch gaming reviews, it's right at the top almost all the time, so I'd rather put that $100 elsewhere. A 6c/12t CPU doesn't quite meet my needs for some of my workload, so it was a no-brainer. If it's like the Zen 2 CPUs, the 3700X was the best at power consumption for any given workload (how much power is consumed to complete a particular task, not how much power is drawn at any given moment), and I imagine the same is true for the 5800X. The tests I saw suggested this should be true, and in certain workloads it beats the 5600X by more than you would expect from simply having two more cores. So my choices were the 5600X or the 5800X, and the 5800X seemed more logical for what I do.
 
While you make some valid points, one could also argue that some might say, "Who games without turning off excess browsing tabs?"
Are you serious? You can't be serious. You have to be trolling.

Nobody goes through their tabs and closes the ones they don't need every time they open a game. No one.
 

131dbl

Posts: 28   +4
If one browser instance is open, use the setting in your browser that, on restart, reopens the tabs that were open when you closed it.

If multiple instances of a browser are open, and you use Firefox as an example: open Task Manager, go to Processes, and you'll see quite a few Firefox processes. Find the biggest one by memory usage and end the process. If that doesn't end all the processes, try another big one. Sometimes killing the first one in the list that uses a fair bit of memory will close them all.

I have yet to see how to close multiple instances of Firefox and retain all tabs without killing Firefox in Task Manager.

And if you ask who does this: well, me for one. It's simple, and the tabs recover correctly except on very rare occasions. If you feel this is risky because you want certain pages always open, make a damn folder and stick bookmarks in it, so it's easy to go to that one folder and reopen your tabs.

Browser 101?
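For what it's worth, the Task Manager steps above can also be scripted. This is only a sketch assuming Windows, where `taskkill` can end every process of an app in one go; Firefox's session restore then brings the tabs back on the next launch, same as after ending it in Task Manager. The helper names here are made up for illustration.

```python
# Sketch: end all Firefox processes from the command line instead of Task Manager.
# /IM matches the executable name, /T also ends child processes, /F forces it.
import subprocess
import sys

def build_kill_command(image_name: str = "firefox.exe") -> list[str]:
    """Return the taskkill invocation that force-ends all processes of an app."""
    return ["taskkill", "/IM", image_name, "/T", "/F"]

def kill_browser(image_name: str = "firefox.exe") -> None:
    if sys.platform != "win32":
        raise RuntimeError("taskkill is a Windows command")
    # check=False: taskkill returns non-zero if no matching process is running
    subprocess.run(build_kill_command(image_name), check=False)
```

On the next Firefox start, "Restore Previous Session" (or the always-restore setting) should reopen the tabs, with the same rare-failure caveat as the manual method.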

I agree with the inverse question, and no, I'm not trolling. I could understand if you're getting critical data from certain pages, but then why don't you just game on another system, on a different monitor? It's really hard to game and pay attention to real-time data at the same time. It's easier to have the critical data on another screen: pause the game, look at the other screen, then go back to the game.
 

Axle Grease

Posts: 193   +112
Cyberpunk should have a single overall preset for max quality/performance graphics named "CyberPsycho".

I got very good performance at max settings with my 3090 FTW3 on both a new i9 CPU and an older 5960X.

I remember having to change some settings in Cyberpunk, as my 5960X was running at 90%. It's now running at about 70%, at 2560x1440; I have an RTX 3080. The lower 60-80 fps is helping the CPU, which I think is in line with what this article shows.