Ryzen 7 9800X3D vs. Ryzen 5 7600X: CPU and GPU Scaling Benchmark

Considering I game at 4K, I'm going to wait until next gen to do a platform rebuild. My 5800X3D and 6700 XT are fine for now since I (mostly) play at 4K on my TV. I think the 9800X3D is overkill because I'm likely to buy whatever replaces the 9070 XT. I use a 120 Hz 4K TV as a monitor and I typically target 90+ FPS for gaming. Frankly, I'm more interested in replacing my Samsung with an OLED than I am in a platform rebuild, but the 6700 XT is showing its age and was never the card I actually wanted (GPU shortage and whatnot). The CPU market is way more reasonable than the GPU market right now, so I might as well; I'm still stuck on PCIe Gen 3.
 
This kind of testing is great. I do wish we had seen an MMO or two thrown in there (WoW, etc.). I get that those games are harder to benchmark consistently, but it feels like something a lot of people would benefit from seeing, especially as they are notoriously CPU-heavy.
 
Me when I see TechSpot doing a scaling comparison with my current CPU, and then I realize I will never have enough money set aside for a 5090 lol
 
I saw somewhere that the X3D processors benefit greatly in MMORPGs like World of Warcraft during raids and guild wars,
where there are 50+ players on one map. That's also one area where the 3D V-Cache on X3D processors is able to flex its muscles.
 
RTS games, too
 
Sins of a Solar Empire is unplayable on 8+ player maps late game without X3D chips. Even the 5800X3D struggles to maintain 60 FPS; my i7-9700K was hitting the mid 30s, and since the game is single-threaded, that meant a 50%+ miss rate on mouse clicks. An actual nightmare.

Also, Supreme Commander, even with 3600 MHz memory and an X3D, encounters lag and slowdown in the late game with more than 4 players.
 
Please include UE5 games.

I have a very, very STRONG feeling that this scrapyard engine is maximally CPU-bound. Like, core-pipeline-level bound.
It looks like my 5800X3D is handicapping the whole system in Silent Hill 2, Stalker 2 and Frostpunk 2, especially with 1% lows holding BELOW 30 FPS.

Or there may be an even more obscure bottleneck in DRAM, if there's a difference between squeezed DDR4 B-die and the DDR5 A-die used for both the 9800X3D and the 7600X.
 
PCIe Gen 3 is fine as long as you don't exceed your VRAM buffer. You already have 12 GB of VRAM, and that's probably what's saving you so far; for upgrades, aim for 16 GB minimum. A 9070 XT is quite an upgrade, but I agree on the OLED side. I use an "old" LG OLED C8 from 2018, and even with its shortcomings (limited to 60 Hz, no VRR) I cannot go back to QLED anymore. OLED is just that good.
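For context on why staying inside the VRAM buffer matters so much on an older bus, the rough per-direction x16 bandwidth numbers work out as below (back-of-the-envelope arithmetic only, ignoring protocol overhead beyond line encoding); once assets spill out of VRAM they have to stream over that comparatively narrow link instead of local GPU memory.

```python
# Rough per-direction PCIe x16 bandwidth in GB/s:
# per-lane transfer rate (GT/s) * 16 lanes / 8 bits per byte,
# scaled by the 128b/130b line encoding used since Gen 3.
# Real-world throughput is somewhat lower.
def x16_bandwidth_gbs(gt_per_s: float) -> float:
    return gt_per_s * 16 / 8 * (128 / 130)

for gen, gt in [("Gen 3", 8.0), ("Gen 4", 16.0), ("Gen 5", 32.0)]:
    print(f"PCIe {gen} x16: ~{x16_bandwidth_gbs(gt):.1f} GB/s per direction")
```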
 
Well, the 5800X3D will get stuffed in a server rack in the garage. I'm going to wait for the next generation of GPUs, and I'm going AMD because of Linux, but I might wait until DDR6 to upgrade my CPU, as the 5800X3D is fine for now. It's likely faster than the 7600X, but I'm still on a B350 motherboard, so I'm definitely feeling the age.

Frankly, it was Oblivion Remastered that made me want to upgrade. I was anti-RT for a while, but then Oblivion came out and now I want a new GPU. I actually stopped playing right after Kvatch because I want to save the replay for a good system. That game WAS high school for me. It's all me and my friends talked about junior and senior year.
 
"While this may not be the most insightful metric given we only tested four titles, we include it because skipping it for any reason will definitely generate a lot of comments about that alone, so here you have it to avoid that from happening."

I'm a data scientist in my day job, and with experience in presenting things to stakeholders, I feel this.
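For anyone who wants to roll their own summary from the per-game charts: cross-game averages in reviews are often a geometric mean of per-game FPS, so a single outlier title doesn't dominate, though I don't know which mean TechSpot actually used here. A minimal sketch with made-up numbers:

```python
import math

# Hypothetical per-game average FPS for the two CPUs (numbers invented for illustration)
fps_9800x3d = {"Game A": 180, "Game B": 95, "Game C": 240, "Game D": 130}
fps_7600x   = {"Game A": 150, "Game B": 88, "Game C": 200, "Game D": 115}

def geomean(values):
    # nth root of the product, computed in log space for numerical stability
    vals = list(values)
    return math.exp(sum(math.log(v) for v in vals) / len(vals))

g_new, g_old = geomean(fps_9800x3d.values()), geomean(fps_7600x.values())
print(f"9800X3D geomean: {g_new:.1f} FPS")
print(f"7600X   geomean: {g_old:.1f} FPS")
print(f"relative gain:   {100 * (g_new / g_old - 1):.1f}%")
```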
 
Basically, unless you are playing super CPU-limited games, the CPU makes little difference if it's not paired with the 5090. I haven't played at 1080p since about 2011. Even at 1440p with medium to high settings, the 9800X3D brings little to the table outside of the 5090. If you use a 9070 XT or lower, it's a pointless upgrade from a 7600/7700X in the majority of titles.
 
The 9070 XT is more popular than the 9070…
I was curious to see why the latter was chosen instead of the faster card.
Then, as I progressed through the testing, I understood why the slower 9070 was chosen.
But then I decided to ignore all that and focus on the CPU scaling, which yielded good, usable results.
 
Yeah, now we need to know the list of CPU-heavy games. I know EVE Online has a reputation for this.
EVE runs on such an old engine that a client only uses 300 MB of VRAM with all settings at low. EVE is not a game that can be used for benchmarking.
 
What this testing doesn't reveal is frame time stability. GN proved that the one major advantage of X3D chips is stable gameplay, not just higher FPS.
 
Why not test multiplayer games???
BECAUSE CPU-intensive multiplayer games

that look like Gray Zone Warfare + the new Unreal Engine 5 = an extreme quantity of lag anomalies
 
What this testing doesn't reveal is frame time stability. GN proved that the one major advantage of X3D chips is stable gameplay, not just higher FPS.
At least show 1% lows, but we know the 0.1% lows are needed if you want to know how stable a game is and whether it runs without lag.
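For anyone curious how those numbers come out of a frame time log, here's a minimal sketch of the common "average FPS of the slowest X% of frames" convention (assuming frame times in milliseconds; some reviewers instead report the 99th/99.9th-percentile frame time converted to FPS, which gives slightly different values):

```python
import numpy as np

def low_fps(frame_times_ms, pct):
    """Average FPS of the slowest pct% of frames (the 'pct% low')."""
    ft = np.sort(np.asarray(frame_times_ms, dtype=float))[::-1]  # slowest first
    n = max(1, int(round(len(ft) * pct / 100.0)))
    return 1000.0 / ft[:n].mean()  # mean of the worst frame times, back to FPS

# Example: a run that averages ~58 FPS but has a handful of big stutters
times = [16.7] * 990 + [50.0] * 9 + [120.0]
print(f"avg FPS:      {1000.0 / np.mean(times):.1f}")   # ~58.5
print(f"1% low FPS:   {low_fps(times, 1):.1f}")         # ~17.5
print(f"0.1% low FPS: {low_fps(times, 0.1):.1f}")       # ~8.3
```

The 0.1% figure is exactly where a single long stutter shows up while the average barely moves, which is the "stable gameplay" point above.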
 
With the cost of GPUs swelling out of control, and only now starting to stabilize at about 20% above MSRP, IMO it's best to upgrade your CPU more frequently. Also, the 9800X3D fell as low as $379 at Micro Center with the membership discount. While both CPUs can be tweaked to improve performance, the 9800X3D can bring around 10% further improvement in CPU workloads via tweaks.
 
At 4K, a 5800X3D would still be good enough to drive a 4090/5080-level GPU, or a 9070 XT.
 
I'm more worried about PCIe Gen 3. All of my NVMe drives are either Gen 4 or Gen 5. I also do lots of 3D CAD modeling, so when I'm looking at an assembled project to see how everything fits together, the much higher memory bandwidth of DDR5 would help a lot. I'm trying to hold out for DDR6, as DDR5 likes to overheat and throttle like DDR2 did, but end-of-generation memory is usually better than the best a new generation has to offer, so I might be better off trying to get a Ryzen 9000-series chip to work with DDR5-8000 memory.
 
Yeah, people making weird claims like that about MMOs and 3D cache are usually just wrong. People have been saying the same about RTS games and how much difference 3D cache makes, but when I tested a very heavy scene in Total War, that wasn't the case.
 
So it's a mixed bag then;
sometimes it just works... other times it could be the game engine, the driver, bad optimization, etc.
It's hard to come to a conclusion based on results on a game-by-game basis, I suppose.
 
I had the same issues with Total War, but it makes a HUGE difference in AoE2 and Sins of a Solar Empire. Idk what Total War's problem is. I love the game, but I end up going to play it, play a few games over about a week, and then drop it for another year.

S-tier game, though. I have zero complaints about the gameplay.
 