Faster GPU or Faster CPU: Ryzen 7600 + RTX 4070 Ti vs. Ryzen 7800X3D + RTX 4070

For 1440p gaming, I would pick the 7800X3D + 4070 combo any day of the week. The games I play are more likely to be CPU bottlenecked. I don’t care much about current AAA offerings, nor do I play anything competitive or even multiplayer for that matter. I optimize for minimum framerate and cap it so that the framerate is stable. Minimizing stuttering is more important to me than minimizing input latency because I don’t play anything fast paced either. It’s also easier to upgrade the GPU later. When it’s time for me to upgrade the CPU, it usually ends up being a complete system rebuild.
 
Would you really play in 1080p with a 4070 or a 4070 Ti?

If the answer is yes, then you are clearly clueless about DIY PCs. Consoles are providing higher resolution gaming already.

This analysis would be better with GPUs like the 7600, 7700 XT, 4060 and 4060 Ti.
Yeah, that's why GT7 only has ray tracing in replays, for example. That's why pretty much all current-gen AAA games have a "performance" mode and a "fidelity" mode. You can't have both, sorry.

Console gaming is full of hidden compromises, you just seem to have chosen to ignore them.

For reference, the PS5 GPU has 10.3 TFLOPS, while the 4070 and 4070 Ti have 29.1 and 40.1 respectively.

So if you think that a GPU with roughly three times the raw compute will provide worse performance, "you are clearly clueless" about computing in general. Yeah, single-target hardware and razor-sharp optimizations can help somewhat, but not THAT much, especially with cross-platform titles, which are the vast majority.
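Quick back-of-the-envelope math on those published figures (a rough sketch only; raw FP32 TFLOPS aren't a one-to-one predictor of in-game performance across different architectures):

```python
# Published FP32 figures; these ratios are a rough ceiling, not a
# direct predictor of real game performance across architectures.
ps5 = 10.3
rtx_4070 = 29.1
rtx_4070_ti = 40.1

print(f"RTX 4070    vs PS5: {rtx_4070 / ps5:.1f}x")     # ~2.8x
print(f"RTX 4070 Ti vs PS5: {rtx_4070_ti / ps5:.1f}x")  # ~3.9x
```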
 
That's because most first-party Sony titles are cross-gen. God of War Ragnarok is a PS4 game, not a next-gen game, did you forget that detail?
Returnal, on the other hand, is a PS5-exclusive first-party Sony game and it runs at 1080p.

This isn't something up for debate. The PS5 already has tons of 1080p and sub-1080p games, and that group is only gonna continue growing as newer, more demanding games are announced. Epic literally recommends 900p as the target internal resolution for consoles for UE5 games using Nanite and Lumen.

Yeah, but you can't deny that Sony's first-party games, whether cross-gen or not, and whether they run at 1080p, 1440p, or 4K with checkerboarding, are good looking, even on a 4K OLED TV.

Resolution is one thing, but if the artwork and style looks good then does it really actually matter that much? Does it really detract from the gaming experience to the point you literally can't enjoy the game?

It's certainly not the case for me. I own a high-end PC setup, a 13900K paired with an RTX 4080, and have played every PC game under the sun at max settings. I have no problems playing on my PS5 in the living room. In fact, many of the so-called cross-gen games look bloody amazing on the big screen.
 
Hello, did anyone proofread the first sentence?

"Would buying a Ryzen 5 7600 with a GeForce RTX 4070 Ti be a better option than the Ryzen 7 7800X3D with the RTX 4070 Ti?"

I was like, what is the point of this article? Then I saw the mistake.

Seriously, fix the first sentence of the article. A day later it's still wrong. If you don't understand the issue, it's that you wrote Ti for both and you need to delete the second one.
 
No, they aren't. The more demanding games already run at 1080p internally (like Returnal), or have dynamic resolutions with a lower bound close to 1080p (like Dead Space, Alan Wake, Cyberpunk and so on). And in especially bad cases, like Jedi Survivor and Final Fantasy 16, the internal resolution can drop all the way down to 720p, as found by Digital Foundry in their tech reviews.

We're not even halfway through the generation yet, and the consoles are already going right back to being 1080p machines.

100% This.

Reminds me of a guy on reddit telling me his PS5 runs everything great in 4k.

Then I said one word to him: "Upscaling", and he had no clue.
 
My living room PC is hooked up to my 4K TV. I haven’t had a 1080p monitor in over a decade.

But Spider-Man is surprisingly CPU limited. First time my hand-me-down CPU from my main PC wasn’t fast enough.
I had a 5800X, and when I upgraded to the X3D model this game saw a huge boost in frame rates at 1440p ultrawide.
 
You need to edit the first paragraph in this article, as it mentions the 4070 Ti as the GPU for both CPUs...

Would buying a Ryzen 5 7600 with a GeForce RTX 4070 Ti be a better option than the Ryzen 7 7800X3D with the RTX 4070 Ti?
 
Yeah, but you can't deny that Sony's first-party games, whether cross-gen or not, and whether they run at 1080p, 1440p, or 4K with checkerboarding, are good looking, even on a 4K OLED TV.
"Good looking" is subjective and has nothing to do with technology. There are games from the SNES/N64/PS1 eras that are "good looking". Okami is a PS2/Wii game and still one of the most beatiful games out there.

Resolution is one thing, but if the artwork and style looks good then does it really actually matter that much?
It matters when the dude I replied to says "1080p is garbage, even consoles are providing higher resolution than that", which is resoundingly incorrect.
 
That's one subject where I always go against the grain. I find that GPU bottlenecks are always easier to manage than CPU bottlenecks. So for me it's faster CPU, always.

When games are bottlenecked by the GPU, you get overall low framerates that can be improved in several ways. You can dial down in-game resolution, detail levels, etc. Even if you can't bring frame rates to the level you want, you can at least force a framerate limit to have a smoother console-like experience.

CPU bottlenecks, on the other hand, make most games a stuttery mess - you get high frame rates with lots of stutters and microfreezes. And often there's very little that can be done to remedy it other than overclocking the CPU.
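To put rough numbers on that feel, here's a toy sketch with made-up frame times, just to show why a CPU-limited game can look fine on an average-fps counter and still stutter:

```python
# Made-up frame times (ms): mostly fast frames with occasional CPU-side spikes.
frame_times_ms = [8, 9, 8, 45, 9, 8, 50, 9, 8, 9, 8, 48, 9, 8, 9, 8]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)

print(f"average fps: {avg_fps:.0f}")  # ~63 fps, looks fine on paper
print(f"worst frame: {max(frame_times_ms)} ms (~{worst_fps:.0f} fps)")  # the stutter you actually feel
```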

Good point, I didn’t even think of that. In most games it’s much easier to dial down graphic settings than it is CPU settings. Dialing down one or two graphic settings with barely any loss to detail is far better than a CPU bottleneck you can’t do anything about.
 
Agreed. The GPU is also easier to upgrade later (usually, it depends on the CPU vendor you go with and when in that platform's lifecycle you are buying). For a "right-now" analysis, if you know you will be playing primarily GPU-bound games, it would make sense to go with the better GPU.

But if you're like me, you don't *know* that is what you'll be playing for the couple of years between when you buy the system and when you decide to upgrade/get a new one. Indeed, I overestimated the amount of productivity work I'd be doing, so in retrospect I probably shouldn't have bought a Threadripper. This was a few years ago, so an upgrade means a mobo upgrade, etc. which speaks to the upgrade point, and one of the main games I play, MSFS 2020, is heavily CPU bound in places.

I’m glad to see someone mention MSFS. I play that in VR and can confirm the CPU makes a huge difference in that title. I had a very noticeable uplift in performance going from a 9700K / 3080 12G to a 12700K with the same GPU and a HTC Vive. Both systems used DDR4.
 
I’m glad to see someone mention MSFS. I play that in VR and can confirm the CPU makes a huge difference in that title. I had a very noticeable uplift in performance going from a 9700K / 3080 12G to a 12700K with the same GPU and a HTC Vive. Both systems used DDR4.
Yeah, anywhere there's traffic (airports) the frame rate gets jerky quite quickly. Traffic uses a lot of CPU. I was hoping the DX12 implementation would fix that, but they still don't seem to understand the concept of threading. When downloading updates in the content manager, it still freezes up the UI when it switches from downloading to decompressing to working on the next item. Why they couldn't just background thread it, I have no idea. I'm sure there's plenty of stuff like that in the traffic code, too.
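Something like the pattern below would already keep the UI alive (a generic sketch, not MSFS's actual code; just illustrating the "move the heavy work off the UI thread" idea):

```python
# Generic "don't block the UI thread" pattern: the expensive decompression
# runs on a worker thread while the main loop keeps updating the UI.
# This is only an illustration, not how the MSFS content manager works.
import concurrent.futures
import time

def decompress(package: str) -> str:
    time.sleep(2)  # stand-in for an expensive decompress/install step
    return f"{package} installed"

with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(decompress, "world-update")
    while not future.done():
        print("UI still responsive, drawing progress...")
        time.sleep(0.5)
    print(future.result())
```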

They've improved it for sure, and I think they realized the architecture isn't very good, which is why they are making MSFS 2024. I just hope that the marketplace purchases transition smoothly to the new edition.
 
Seriously, fix the first sentence of the article. A day later it's still wrong. If you don't understand the issue, it's that you wrote Ti for both and you need to delete the second one.

Yes, I spent a minute checking what was going on here too.
 