Faster GPU or Faster CPU: Ryzen 7600 + RTX 4070 Ti vs. Ryzen 7800X3D + RTX 4070

Would you really play in 1080p with a 4070 or a 4070 Ti?

If the answer is yes, then you are clearly clueless about DIY PCs. Consoles are providing higher resolution gaming already.

This analysis would be better with GPUs like the 7600, 7700 XT, 4060 and 4060 Ti.
 
That's one subject where I always go against the grain. I find that GPU bottlenecks are always easier to manage than CPU bottlenecks. So for me it's faster CPU, always.

When games are bottlenecked by the GPU, you get overall low framerates that can be improved in several ways. You can dial down in-game resolution, detail levels, etc. Even if you can't bring frame rates to the level you want, you can at least force a framerate limit to have a smoother console-like experience.

CPU bottlenecks, on the other hand, make most games a stuttery mess - you get high frame rates with lots of stutters and microfreezes. And often there's very little that can be done to remedy it other than overclocking the CPU.
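
Just to put numbers on it: here's a rough Python sketch of how I'd spot the difference in a frame-time capture (the "frametimes.csv" file and its single column of milliseconds are placeholders for whatever your capture tool exports, so adjust accordingly). A GPU bottleneck shows up as a low but steady frame rate; a CPU bottleneck shows up as a high average with terrible 1% lows.

Code:
import csv

def fps_stats(path):
    # Read one frame time (in milliseconds) per row.
    with open(path, newline="") as f:
        frame_ms = [float(row[0]) for row in csv.reader(f) if row]
    avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))
    # 1% low = average FPS over the slowest 1% of frames.
    frame_ms.sort()
    worst = frame_ms[-max(1, len(frame_ms) // 100):]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

avg, low = fps_stats("frametimes.csv")  # hypothetical capture file
print(f"avg: {avg:.1f} fps | 1% low: {low:.1f} fps")
# Big gap at a high average (say 200 avg vs 60 low) = the stuttery CPU-bound
# pattern; small gap at a low average = you're simply GPU-bound.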
 
Can't help but feel people who buy a 4070 Ti (~$800 US) are getting fleeced. It can't even play 4K@60fps natively except in old games.
 
AMD is all-in on min/max.
They actually okay'd pairing a 7900XTX with an 8000 series APU.
An 8700G likely wouldn't bottleneck a 7900 XTX, as it's basically a Ryzen 7700 with an iGPU. As the benchmarks show, at higher resolutions it actually makes more sense to spend more on the GPU than the CPU. There isn't a single benchmark where the 7800X3D paired with the 4070 comes out on top at 4K.

I also don't know why everyone thinks PC gamers have unlimited budgets; compromises will almost always have to be made.
 
So, in the end, higher resolution needs a higher-end GPU,
and whatever CPU is used, it doesn't matter if the GPU isn't strong enough to deliver the performance needed.
 
Consoles are providing higher resolution gaming already.
No, they aren't. The more demanding games already run at 1080p internally (like Returnal), or have dynamic resolutions with a lower bound close to 1080p (like Dead Space, Alan Wake, Cyberpunk and so on). And in especially bad cases, like Jedi Survivor and Final Fantasy 16, the internal resolution can drop all the way down to 720p, as found by Digital Foundry in their tech reviews.

We're not even halfway through the generation yet, and the consoles are already going right back to being 1080p machines.
 
Always favour the GPU. In the esports titles that are CPU dependent you already have like 300 FPS, so getting another 50-100 won't make much of a difference. As soon as you increase resolution, GPU dependence increases, and in most games GPUs matter way more anyway, so... GPU.
 
Spider-man made me upgrade my living room PC from a 3900X to a 5800X3D to keep up with my 3080. (and next GPU upgrade)
 
In the age of 540Hz monitors, expensive GPUs and comp gaming? 100% yes.

ABSOLUTELY wrong when, in this day and age, consoles are aiming for 1440p 120Hz and 2160p 60Hz...

Most of the time, those monitors drop HDR and VRR just to hit their max refresh rate. They are absolute garbage.
 
Sure, if you want to see them all perform at their max with an R5.

This article is really just a comparison of CPU and GPU bottlenecks.

The only time you ask yourself which one will be better for you is when you are on a budget and wondering whether you should invest more in your CPU or your GPU.

This will only happen at 1080p (and below) in this day and age, and nobody in their frigging right mind will buy a 3090 or a 4070 Ti to play at 1080p. For that reason, this eval should have been done with the tier of GPU below, to pinpoint where that threshold really starts to tip the scale.

Lastly, 1080p native resolution is dead in the display industry. All the TV panels are going 2160p. LG, Samsung and TCL are done with those panels.
 
Spider-man made me upgrade my living room PC from a 3900X to a 5800X3D to keep up with my 3080. (and next GPU upgrade)

Upgrading your screen would have made more sense if you were still stuck at 1080p. You could have easily bought a 1440p monitor for about the same amount of money and dodged the CPU bottleneck issue.
 
No, they aren't. The more demanding games already run at 1080p internally (like Returnal), or have dynamic resolutions with a lower bound close to 1080p (like Dead Space, Alan Wake, Cyberpunk and so on). And in especially bad cases, like Jedi Survivor and Final Fantasy 16, the internal resolution can drop all the way down to 720p, as found by Digital Foundry in their tech reviews.

We're not even halfway through the generation yet, and the consoles are already going right back to being 1080p machines.

Oh yes they are. Most of Sony's first-party games run above 1080p. God of War even got a native 2160p mode.

 
Upgrading your screen would have made more sense if you were still stuck at 1080p. You could have easily bought a 1440p monitor for about the same amount of money and dodged the CPU bottleneck issue.
My living room PC is hooked up to my 4K TV. I haven't had a 1080p monitor in over a decade.

But Spider-Man is surprisingly CPU limited. It's the first time a hand-me-down CPU from my main PC wasn't fast enough.
 
For gamers, it's 101. A CPU only matters if it presents your GPU with a bottleneck. Period. If the GPU is able to hit 97-100% usage in all scenarios, a CPU upgrade probably won't help. Yes, 1% lows matter greatly, but other factors beyond just CPU power could influence that.
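
If you want to sanity-check that 97-100% figure on your own machine, something like this quick Python poller works (assuming an NVIDIA card with nvidia-smi available on the PATH; AMD needs a different tool). Run it in the background while you play and watch where GPU load sits.

Code:
import subprocess, time

def gpu_load_percent():
    # Ask nvidia-smi for the current utilization of the first GPU.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.strip().splitlines()[0])

while True:
    print(f"GPU load: {gpu_load_percent()}%")
    time.sleep(1)
# Pinned at ~97-100% -> GPU-bound, a faster CPU won't add much.
# Dipping well below that while FPS still disappoints -> look at the CPU.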
 
Oh yes they are. Most of Sony's first-party games run above 1080p. God of War even got a native 2160p mode.
That's because most first-party Sony titles are cross-gen. God of War Ragnarok is a PS4 game, not a next-gen game. Did you forget that detail?
Returnal, on the other hand, is a PS5-exclusive first-party Sony game and it runs at 1080p.

This isn't something up for debate. The PS5 already has tons of 1080p and sub-1080p games, and that group is only gonna keep growing as newer, more demanding games are announced. Epic literally recommends 900p as the target internal resolution on consoles for UE5 games using Nanite and Lumen.
 
ABSOLUTELY wrong when, in this day and age, consoles are aiming for 1440p 120Hz and 2160p 60Hz...

Most of the time, those monitors drop HDR and VRR just to hit their max refresh rate. They are absolute garbage.
And 540Hz 1080p monitors....
 
This article is really just a comparison of CPU and GPU bottlenecks.

The only time you ask yourself which one will be better for you is when you are on a budget and wondering whether you should invest more in your CPU or your GPU.

This will only happen at 1080p (and below) in this day and age, and nobody in their frigging right mind will buy a 3090 or a 4070 Ti to play at 1080p. For that reason, this eval should have been done with the tier of GPU below, to pinpoint where that threshold really starts to tip the scale.

Lastly, 1080p native resolution is dead in the display industry. All the TV panels are going 2160p. LG, Samsung and TCL are done with those panels.
This is all kinds of wrong.
 
Hello, did anyone proofread the first sentence?

"Would buying a Ryzen 5 7600 with a GeForce RTX 4070 Ti be a better option than the Ryzen 7 7800X3D with the RTX 4070 Ti?"

I was like, what is the point of this article? Then I saw the mistake.
 
That's one subject where I always go against the grain. I find that GPU bottlenecks are always easier to manage than CPU bottlenecks. So for me it's faster CPU, always.

When games are bottlenecked by the GPU, you get overall low framerates that can be improved in several ways. You can dial down in-game resolution, detail levels, etc. Even if you can't bring frame rates to the level you want, you can at least force a framerate limit to have a smoother console-like experience.

CPU bottlenecks, on the other hand, make most games a stuttery mess - you get high frame rates with lots of stutters and microfreezes. And often there's very little that can be done to remedy it other than overclocking the CPU.

Agreed. The GPU is also easier to upgrade later (usually, it depends on the CPU vendor you go with and when in that platform's lifecycle you are buying). For a "right-now" analysis, if you know you will be playing primarily GPU-bound games, it would make sense to go with the better GPU.

But if you're like me, you don't *know* that's what you'll be playing for the couple of years between when you buy the system and when you decide to upgrade or get a new one. Indeed, I overestimated the amount of productivity work I'd be doing, so in retrospect I probably shouldn't have bought a Threadripper. That was a few years ago, so an upgrade now means a mobo upgrade, etc., which speaks to the upgrade point. And one of the main games I play, MSFS 2020, is heavily CPU bound in places.
 