AMD unveils Ryzen 9000 CPUs, debuts new Zen 5 architecture

He is not using a CPU encoder, you maniac. Who the heck would bench his CPU in a game while encoding in the background? Lol.

Whatever man, your opinion just won't change no matter what. Yeah, sure, he was encoding in the background :joy:
So your argument/excuse is that he is lying? Ok.
PS: I added a 5800X3D edit to the above comment.

"your opinion will just not change" - I don't have an opinion. I'm just using facts to counter your argument.
 
So your argument/excuse is that he is lying? Ok.
PS: I added a 5800X3D edit to the above comment.

"your opinion will just not change" - I don't have an opinion. I'm just using facts to counter your argument.
Of course he is lying. He added that to the description after he lost and then blocked me on Twitter. You think a guy who challenges you to come see the gaming performance actually runs encoding in the background? Really?
 
Of course he is lying. He added that to the description after he lost and then blocked me on Twitter. You think a guy who challenges you to come see the gaming performance actually runs encoding in the background? Really?
Yes I do, because I have no reason to believe otherwise. He probably blocked you on Twitter because you've been insulting him, just like you did in the YouTube comments.

I do have examples of the 7800X3D running above 200 FPS at 1080p ultra, just not in the same area as your video, which means it's not a fair comparison. The FPS difference between areas is too big.
 
It's cute that you think that, but it's not true. I've tested all of these CPUs with tuned RAM: the 12900K and the 7800X3D are on par in games, the 5800X3D is nowhere near them, and the 14900K obviously just smashes all of them.

You don't believe me? Pick your game and let's test it. I'll be running stock, no CPU overclocking.
Really, YOU... tested them..? Where is YOUR website?

Do you see how your strawman argument is not with me & others, but with TECHSPOT's own CPU review...? It is more than obvious that AM5 is the superior platform to build a gaming rig on.


And everyone knows the 14900K is slower than the 7800X3D... but somehow YOUR 12th gen chip is better, bcz it's owned by YOU. Do understand that everyone can see through your iNTEL bias.

AMD's platform is so much better than rebuilding a rig every 2 years... iNTEL is always trying to play catch-up. Ryzen is for real.
 
Yes I do, because I have no reason to believe otherwise. He probably blocked you on Twitter because you've been insulting him, just like you did in the YouTube comments.

I do have examples of the 7800X3D running above 200 FPS at 1080p ultra, just not in the same area as your video, which means it's not a fair comparison. The FPS difference between areas is too big.
I didn't insult him, quite the contrary. He said he was going to demolish my 12900K and that it wasn't even worth testing. I said cool, go ahead and demolish it, and after we published the videos he said I'm cheating, added the encoding note to the description, and blocked me. I can show you the messages, no problem.

I also have examples of the 7800X3D since lots of my friends have one; it's on par with a 12900K in TLOU with tuned RAM on both.
 
Really, YOU... tested them..? Where is YOUR website?

Do you see how your strawman argument is not with me & others, but with TECHSPOT's own CPU review...? It is more than obvious that AM5 is the superior platform to build a gaming rig on.


And everyone knows the 14900K is slower than the 7800X3D... but somehow YOUR 12th gen chip is better, bcz it's owned by YOU. Do understand that everyone can see through your iNTEL bias.

AMD's platform is so much better than rebuilding a rig every 2 years... iNTEL is always trying to play catch-up. Ryzen is for real.
If you don't own the hardware, you have to trust bars made in Excel. I own the hardware, so I don't have to. Half of the games in that review you just linked are completely GPU bound, even with a 4090 at 1080p. Just an example: I'm using the same settings on the same planet in Ratchet, on a stock 12900K with just 6000C36 XMP (pretty bad RAM), and I'm completely GPU bound the entire video, lol.


The same applies to TLOU, and probably Starfield (I'm about to test that one soon).
 
I didn't insult him, quite the contrary. He said he was going to demolish my 12900K and that it wasn't even worth testing. I said cool, go ahead and demolish it, and after we published the videos he said I'm cheating, added the encoding note to the description, and blocked me. I can show you the messages, no problem.

I also have examples of the 7800X3D since lots of my friends have one; it's on par with a 12900K in TLOU with tuned RAM on both.
Screenshot 2024-06-19 at 18.37.31.png
Ok.
 
If you don't own the hardware, you have to trust bars made in Excel. I own the hardware, so I don't have to. Half of the games in that review you just linked are completely GPU bound, even with a 4090 at 1080p. Just an example: I'm using the same settings on the same planet in Ratchet, on a stock 12900K with just 6000C36 XMP (pretty bad RAM), and I'm completely GPU bound the entire video, lol.


The same applies to TLOU, and probably Starfield (I'm about to test that one soon).
Are you surprised that some games are GPU bound even at 1080p when using a 4090? Why?
 
Of course he is lying. He added that to the description after he lost and then blocked me on Twitter. You think a guy who challenges you to come see the gaming performance actually runs encoding in the background? Really?
After a bit of searching I found this video. The person who posted it mentions a 10% drop in FPS while streaming and recording a video at the same time (he's using software encoding; these are hard to find). That is about on par with results I've seen for other CPUs.

 
If you don't own the hardware, you have to trust bars made in Excel. I own the hardware, so I don't have to. Half of the games in that review you just linked are completely GPU bound, even with a 4090 at 1080p. Just an example: I'm using the same settings on the same planet in Ratchet, on a stock 12900K with just 6000C36 XMP (pretty bad RAM), and I'm completely GPU bound the entire video, lol.


The same applies to TLOU, and probably Starfield (I'm about to test that one soon).
Just can't take you seriously.

I've poured LN2 for Shimano and K|ngp|n while they were breaking world records, so stop clowning yourself about your OC & tweaking skillz. TECHSPOT, Gamers Nexus, and other review sites illustrate their findings as graphs bcz graphs easily visualize relative performance. Nobody is saying the 14900K is slow, just that it's not as good as the AM5 Ryzen 7 7800X3D in gaming.

Obv, you are stuck on iNTEL. That is why you keep talking about your outdated 12th Gen system, which has no upgrade path. So (again) plan on spending bigly trying to top Zen 5 X3D.

AM5 ftw...
 
After a bit of searching I found this video. The person who posted it mentions a 10% drop in FPS while streaming and recording a video at the same time (he's using software encoding; these are hard to find). That is about on par with results I've seen for other CPUs.

The FPS drops whether you are running a CPU or a GPU encoder. If you're already GPU bound in the game and you start recording with NVENC, you get a performance hit, obviously.
 
And maybe you were rude the whole time. I just don't know. Just like you don't know if he used CPU encoding or not.
I know that his FPS is about what a tuned 7800X3D should be getting in that area of the game. I've seen people get a bit more and I've seen people get a bit less. There is no freaking way he is encoding in the background.
 
The FPS drops whether you are running a CPU or a GPU encoder. If you're already GPU bound in the game and you start recording with NVENC, you get a performance hit, obviously.
The dedicated hardware encoder makes the "hit" a lot less noticeable (if there is one at all). It doesn't matter whether you are GPU bound or not, since you aren't using the CUDA cores.

A quick look at some benchmarks using NVENC shows that the average drop is about 2-3% for 4K 60 FPS at 110 Mbps:

 
I know that his FPS is about what a tuned 7800X3D should be getting in that area of the game. I've seen people get a bit more and I've seen people get a bit less. There is no freaking way he is encoding in the background.
A drop of up to 10% isn't as big as you think in terms of FPS. It should land at around 185-190 FPS, which is in line with the expected performance.
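Rough math behind that estimate, as a sketch. The ~207 FPS no-encoding baseline here is my assumption for illustration only; it is not a number anyone in the thread actually posted.

```python
# Sketch of the FPS-after-encoding arithmetic. baseline_fps is a
# hypothetical "no encoding" figure, not a measured result from the thread.
baseline_fps = 207

def fps_with_overhead(baseline: float, overhead: float) -> float:
    """FPS left after losing a fractional `overhead` to encoding."""
    return baseline * (1.0 - overhead)

# ~2% is an NVENC-style hit; 10% is the software-encoding worst case
# mentioned for the video in question.
for overhead in (0.02, 0.05, 0.10):
    print(f"{overhead:.0%} hit -> {fps_with_overhead(baseline_fps, overhead):.0f} FPS")
```

With that assumed baseline, a 10% hit lands at roughly 186 FPS, i.e. inside the 185-190 range, while an NVENC-style 2% hit barely moves the number.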
 
A drop of up to 10% isn't as big as you think in terms of FPS. It should land at around 185-190 FPS, which is in line with the expected performance.
You are not getting a 10% hit in TLOU, dude; the game maxes out my 16 physical cores.
 
You are not getting a 10% hit in TLOU, dude; the game maxes out my 16 physical cores.
Screenshot 2024-06-19 at 19.29.12.png
This says otherwise. The CPUs are too tightly packed, which means the GPU is maxing out first even at 1080p, with single-threaded performance being more important.

This generally means that the hit to the CPU could be less than 10% since there is some wiggle room.
 
There is no need to argue with a guy who thinks his 12900K is faster than a 14900K... bcz he knows how to tweak.
 
There is no need to argue with a guy who thinks his 12900K is faster than a 14900K... bcz he knows how to tweak.
I also have a 14900K (and a 13900K). It's not faster, but a tuned 12900K gets close to a stock 14900K. With both tuned, obviously the 14900K is much faster. I have uploaded videos comparing the two on my channel, bud.

Here ya go, stock 14900K vs memory-tuned, otherwise stock 12900K:


Pretty close, no?
 
View attachment 89703
This says otherwise. The CPUs are too tightly packed, which means the GPU is maxing out first even at 1080p, with single-threaded performance being more important.

This generally means that the hit to the CPU could be less than 10% since there is some wiggle room.
In the area HUB is testing, yes, the 4090 is the bottleneck. But that wasn't the case in the area we tested.
 
You do understand you are only fibbing to yourself? Gamers have sustained gameplay, not a biased 10-minute "vid" proving your point... that iNTEL is not that bad and CAN BE nearly as fast as AMD, if you do these thingz and jump through these hoops.

Nearly everyone I know has jumped ship. Sadly, AMD is plug & play, and even a semi-professional OC'er like myself has no need to tweak anything, bcz AMD's board engineers already have.


Nothing you have said takes away from the fact that AM5 platform is a better choice for Gamers.
 
You do understand you are only fibbing to yourself? Gamers have sustained gameplay, not a biased 10-minute "vid" proving your point... that iNTEL is not that bad and CAN BE nearly as fast as AMD, if you do these thingz and jump through these hoops.

Nearly everyone I know has jumped ship. Sadly, AMD is plug & play, and even a semi-professional OC'er like myself has no need to tweak anything, bcz AMD's board engineers already have.


Nothing you have said takes away from the fact that AM5 platform is a better choice for Gamers.
I agree, AMD 3D chips are better for plug and play.

But I don't care about that, since I'll tinker with the CPU no matter which one I buy; therefore I buy the fastest CPU after tweaking. That happens to be Intel currently: after tuning, they are just untouchable by current Zen 4 chips. Let's all pray the 9950X3D is actually good, I'll be buying one.
 
I also have a 14900K (and a 13900K). It's not faster, but a tuned 12900K gets close to a stock 14900K. With both tuned, obviously the 14900K is much faster. I have uploaded videos comparing the two on my channel, bud.

Here ya go, stock 14900K vs memory-tuned, otherwise stock 12900K:


Pretty close, no?
I'm pretty sure that making general statements based on a game or two is not normal.

As I've shown, the CPUs are all stacked close to each other in TLOU, so what exactly is your point? We are measuring differences of 5 FPS in this title, and we need to have a huge discussion about how, because of this one title, every other game benchmark is not needed anymore?

I can play the "in this title" game too. Can your tweaked 12900K beat the stock 7800X3D in: A Plague Tale: Requiem, Hogwarts Legacy, Baldur's Gate 3, Star Wars Jedi: Survivor, Assetto Corsa Competizione, Assassin's Creed Mirage, or Watch Dogs: Legion?
 