AMD unveils Ryzen 9000 CPUs, debuts new Zen 5 architecture

I paid twice for it because I got it before the 7700x was even released. That's a very silly argument to make, simply because the 13700K was released alongside the 7700X for the exact same MSRP, and it's faster than my 12900K. I have no idea why you'd even mention that.

And no, the 9700X will get nowhere near the 12900K even at the same power draw. I can bet money on that. Locked to 125W I score 24K in Cinebench R23 at stock, no undervolting or anything. I know AMD fans like to bring up fractional power draws, but in reality most competing AMD CPUs are slower than the Intel parts at the same power draw, lol.
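(For context on what "locked to 125W" means in practice: a hard package-power cap, set as PL1 in the BIOS or XTU. On Linux the same cap can be applied through the powercap sysfs interface; a minimal sketch, assuming the package domain sits at intel-rapl:0 and the script runs as root:)

```python
# Sketch: cap CPU package power on Linux via the intel-rapl powercap
# interface (assumes the package domain is intel-rapl:0; needs root).
RAPL = "/sys/class/powercap/intel-rapl:0"

def set_package_limit_watts(watts: int) -> None:
    # constraint_0 is the long-term (PL1) limit, expressed in microwatts
    with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))

def read_package_limit_watts() -> float:
    with open(f"{RAPL}/constraint_0_power_limit_uw") as f:
        return int(f.read()) / 1_000_000

if __name__ == "__main__":
    set_package_limit_watts(125)        # the 125 W lock discussed above
    print(read_package_limit_watts())   # -> 125.0
```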

What good is an upgrade path if you have to buy multiple CPUs to get what an Intel user had back in 2021? Doesn't make sense, right?
The 7700X released at a bad price point (slightly under the 13700K), but in just a few months it dropped to $300 (I think it even sold for less than the 7700 non-X for a while).

"Locked to 125w I score 24k at stock" - just the IPC increase puts the 9700X on paper at ~23k (~19.9K + 16% = ~23k), and all rumours put the MT performance increase to be above the average IPC. which should place the 9700X right in the middle between the 12700k and 12900k (no 125W TDP limit), or at 12900k levels with 125W limit. (and I'm talking about a 65W CPU).

"What good is an upgrade path" - it is good because it simply makes upgrades cheap. you can go the 14900k "upgrade" and still be behind what AMD puts out with Zen5 (hell, you will be behind the Zen 4 3D CPUs :) ) and then in the future with Zen5 3D and Zen6.

Let me put things into perspective: if you limit the non-binned 7950X to 65W, it still scores more than the 12900K at stock. A 65W limit... ouch.
 
The 9700X isn't a 65W CPU, it's 65W TDP. It will draw around 90W. I highly doubt it will match a 125W-locked 13700K, which is a 2022 CPU.
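(That ~90W figure isn't arbitrary: on AM4/AM5, AMD's actual package power limit, PPT, is 1.35x the rated TDP. A quick check:)

```python
# AMD's rated-TDP to PPT (actual package power limit) relationship:
# PPT = 1.35 * TDP on AM4/AM5 parts.
def ppt_from_tdp(tdp_watts: float) -> float:
    return round(tdp_watts * 1.35, 1)

print(ppt_from_tdp(65))    # 87.8  -> the "around 90W" above
print(ppt_from_tdp(105))   # 141.8
print(ppt_from_tdp(170))   # 229.5
```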

Obviously the 7950X is faster than the 12900K, lol. You should be comparing it to the 13900K/14900K, in which case they score the exact same at 65W. But you are factually wrong: at 65W it is not faster than a stock 12900K. I know you will use AnandTech's graph; the graph is wrong, the reviewer himself said so on page 4. The 7950X pulls a lot more than 65W during that run. My brother has the damn CPU; at 65W it scores 22K in CB R23.

Buying a slow CPU and constantly upgrading to catch up to a CPU that is very fast doesn't really save you money, my man. A 13700K is faster than the 7700X and it will be faster than the 9700X. What money did you actually save here? You're buying two CPUs and you're still behind a $400 2022 Intel CPU.
 
Let's talk about that 9700X 65W TDP argument of yours.

The 7800X3D is a "120W TDP" CPU, and yet it never gets close to 100W in applications and doesn't go past 60W in gaming. The Cinebench MT power draw is about 80W. These results can be confirmed at multiple review sites.

As you wish, I'm going to refrain from using the AnandTech power-scaling results. And Hardware Unboxed had a bug in XTU that invalidates their 7950X vs 13900K results.

"Buying a slow cpu and constantly upgrading to catch up to a cpu that is very fast doesn't really save you money my man." - Yes, it does. The vast majority can't afford to buy the top-end CPU for "future proofing", and sometimes you end up on a dead platform that you're forced to throw away entirely (CPU + mobo + RAM) if you want to upgrade. Simply upgrading from something like an R7 2700X to the 5800X3D is massive, and you can't tell me that you "didn't save" money by doing this.
 
The 7800X3D is clock-limited; the non-3D chips are not. Every non-3D chip behaves the same and draws ~30% more power than its TDP.

The 13700K isn't top of the line though, it cost as much as the 7700X - and I'd argue the latter NEEDS an upgrade much sooner than the former. That's why you aren't really saving money.

Now to your 5800X3D argument: I was in that exact same position. I had an R5 1600 on a B350 and I really needed an upgrade. The 5800X3D launched for $450. For that amount of money I could buy a 12700F plus a brand-new B660 mobo. Brand new, and one that could take even faster CPUs down the line. Sorry, but the upgradability of AM4 didn't save me anything; choosing not to use it landed me a brand-new mobo, a faster CPU, and money in my pocket from selling the old system. No. Just no.
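(To make that math explicit: the $450 is the stated 5800X3D launch price, while the Intel-side prices below are my assumed street prices from that period, not quotes:)

```python
# The cost comparison above, made explicit. Only the $450 is stated in
# the post; the Intel-side prices are illustrative assumptions.
am4_upgrade = 450                                   # 5800X3D alone
intel_rebuild = {"12700F": 310, "B660 board": 130}  # assumed prices

print(sum(intel_rebuild.values()))                  # 440 -> under $450
print(am4_upgrade - sum(intel_rebuild.values()))    # 10 left over
```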
 
Why do you need to tear apart anything? I'm still on a 12900K, a 2021 CPU. Works fine. In fact it's a lot faster than your 7700X, and it'd be faster than the 9700X as well.


lol.. I don't care about your outdated mobo and CPU...!

It seems you understand the pickle your strawman puts you in, because there is nowhere for you to go... so YOU will be tearing apart your system soon to upgrade your outdated platform. I bought my $269 CPU to put on my AM5 board as a placeholder for a Zen 5 X3D CPU coming out later this year. And as mentioned, I could've also bought a second CPU (7800X3D, $349)... having used two CPUs for the cost of just one outdated Intel CPU.

How much faster is your $600 CPU when you don't have USB4 or PCIe 5.0?

Like I said, the AM5 platform is just drop-in.
 
By the time a 12900K needs an upgrade, you'd have spent $1k+ upgrading your mediocre 8-core chips one after another. Sadly, that is the case.
 
No.... I will have spent whatever the cost of the upcoming 8800X3D is..! (Which is my point.)

Try explaining your argument to all the 12th-gen owners who didn't buy a $600 12900K and instead opted for a 12700, etc. What is their upgrade path today?

Also, are you claiming YOUR 12900K is faster than the 16-month-old 7800X3D..? Heck, even 7-year-old AM4 rigs rock hard with a 5800X3D... which was a nice upgrade from my 7-year-old 1800X rig.

The AM5 platform rocks, just like the AM4 platform rocked!
 
The 5800X3D cost $450 - more than a 12700F plus a brand-new mobo - so your upgradability argument doesn't hold any water, since for the cost of these CPUs you could just buy a new mobo too.

Yes, my 12900K is faster than the 16-month-old 7800X3D, be it in ST, MT or games.

This guy, just like you, thought his 7800X3D was the real deal and wanted to compare. He got a reality check.

 
LOL^
Sorry, but for reference, the top dog in the gaming world, the 7950X3D, costs $487....
(A 30-minute install, but I'm still waiting for the 8800X3D..!)

It is more than hilarious that you believe the 12900K is faster at gaming than the 7800X3D ($339), when even the 14900K gets beaten by it (see Techspot's own reviews). For perspective, I was excited to grab a 5800X3D for my 7-year-old 1800X AM4 rig for only $399 (which in many games is faster than a 12900K).

The strawman you keep proposing is about your system vs mine...
I think you do that on purpose, for FUD. The whole point you keep side-stepping/missing is that once you choose AMD you don't have to tear your system apart (ONE MOTHERBOARD)... and can use multiple CPUs over a 4-7 year period.

See, I now have two systems.. the 5800X3D in my simulator is sometimes faster than my lowly 7700X in my AM5 system, depending on the game. That doesn't bother me, because I didn't buy my AM5 Hero board 19 months ago for the 7700X..

I bought into AMD's AM5 platform because I knew Zen 5 was coming and would be a superb upgrade path for a gaming rig.

Your argument is not with me, it's with logic. See, I have 15+ clan members who also buy/build/upgrade their stuff. And they have seen the light with how well AM4 did (by witnessing its upgrade path). Now they are ALL firm believers in the AM5 platform, and many of them skipped the initial release and went bigly with a 7800X3D... so they can choose to upgrade to the 8800X3D, or wait for the 9800X3D in 2027.

One of my fav/best system builds was Devil's Canyon... it ran my sim for many, many years... AMD's platform is just better than Intel's now.
 
It's cute that you think that, but it's not true. I've tested all of these CPUs with tuned RAM: the 12900K and the 7800X3D are on par in games, the 5800X3D is nowhere near them, and the 14900K obviously just smashes all of those.

You don't believe me? Pick your game and let's test it. I'll be running stock, no CPU overclocking.
 
"12900k and the 7800x 3d are on par in games" - sorry, but this is objectively false and no amount of RAM fine tuning will change that.
 
"12900k and the 7800x 3d are on par in games" - sorry, but this is objectively false and no amount of RAM fine tuning will change that.
Have you actually tested it? Purely theoretically, if RAM tuning on Intel gives you 15%, it stands to reason that they would be on par, right?
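(To spell out the reasoning behind that claim, a one-liner under two assumptions: the baseline gap and the tuning gain are the only variables, and they compose multiplicatively. The ~10% gap is the Techspot delta cited later in this thread:)

```python
# If the 12900K trails the 7800X3D by some baseline gap, how much does
# a given RAM-tuning gain close it? (gap assumed from the ~10% delta
# cited later in the thread)
def relative_after_tuning(gap: float, tuning_gain: float) -> float:
    # 12900K performance relative to the 7800X3D after tuning
    return round((1 - gap) * (1 + tuning_gain), 3)

print(relative_after_tuning(0.10, 0.15))  # 1.035 -> slightly ahead
print(relative_after_tuning(0.10, 0.05))  # 0.945 -> still ~5.5% behind
```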
 
No, it doesn't. You will not get 15% unless you are comparing tuned 7200MHz against maybe stock 4800-5200MHz.

In reality, when tuning 7200MHz memory for a 13900K, you gain a 5% improvement on average at 1080p using a 4090, as shown here:

And this isn't the only RAM-scaling video/article that shows it. It's also an unrealistic use case where you are stressing the CPU at 1080p with the best GPU on the market.

PS: notice how all of this talk is about "tuning" everything? How many people do you know who can even enter the BIOS, let alone tune the RAM and CPU?
 
I'll start with your PS. We've had these discussions before; I really don't care about how many people do what. I don't even know why that's relevant to me.

Then surely someone with a 7800X3D can post better numbers than my stock 12900K, right? Why hasn't that happened in, e.g., TLOU, Starfield, Ratchet or Cyberpunk? Everyone I've tested against, at the exact same area with the exact same settings, even with a tuned 7800X3D, is on par or behind.
 
Sorry, but you already have the results for these games, tested and retested by many reviewers and even individuals. What more do you want? Your own personal results have zero meaning when there is no real apples-to-apples comparison.

We've known since its launch that in Starfield you need 13th or 14th gen to beat the 7800X3D, with the 13900K being quite a bit ahead (although after the Starfield performance update I think AMD got closer to Intel).

For Cyberpunk, the results used to be slightly in AMD's favour, and since Cyberpunk 2077: Phantom Liberty the 7800X3D should be ahead in most cases.

In TLOU you need the 13900KS/14900K to get a few FPS more.

In Ratchet & Clank, performance is much the same: Intel has a few more average FPS but loses in the 1% lows.

"Everyone I've tested against at the exact same area with exact same settings even with tuned 7800x 3d is on par or behind." - Yeah, in a few titles it might be ahead. Why are you so surprised?

I did a quick Reddit search on "personal results" from other people. The numbers are so wildly different that it makes zero sense to use them as anything more than specific results for specific personal configs.

TL;DR: no, the 12900K is not ahead of the 7800X3D in gaming. This is an established fact, verifiable across ALL reviewers. It may win by a bit in a few titles, but in most it is well behind, even with expensive 7200MHz RAM on the 12900K (around a 10% delta according to Techspot). You need the 13900K or 14900K to get closer.

PS: here are a few tips for benchmarking. You say "same area", but you don't take into account other factors like the ones below (a logging sketch follows the list):
1. the bloat of their system
2. cooling (even higher room temps can affect max boost clocks)
3. RAM used: was it 6000MHz CL30 or CL40? 16 or 32GB? Intel isn't the only one who gains a few percent with better RAM
4. even the way the FPS was recorded can affect the end result
5. version of the game and mods
6. driver versions
7. Windows versions
8. GPU quality (a 2-3% difference can easily come from this alone)
9. was any antivirus installed? Did it have a gaming mode enabled (if it has such a mode)?
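Since at least the OS, CPU and GPU-driver details can be captured automatically, here is a minimal sketch of the snapshot one could log next to every run (the GPU query assumes an Nvidia card with nvidia-smi on PATH):

```python
# Sketch: capture the environment alongside every benchmark run so the
# numbers are at least auditable (GPU query assumes Nvidia's nvidia-smi).
import json
import platform
import subprocess

def system_snapshot() -> dict:
    snap = {
        "os": platform.platform(),       # Windows build / kernel version
        "cpu": platform.processor(),
    }
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,driver_version",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        snap["gpu"] = out.stdout.strip()
    except (FileNotFoundError, subprocess.CalledProcessError):
        snap["gpu"] = "unknown"
    return snap

if __name__ == "__main__":
    print(json.dumps(system_snapshot(), indent=2))
```

RAM timings, game build and the rest of the list still have to be recorded by hand, which is exactly why drive-by screenshot comparisons fall apart.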
 
There is nothing that can convince you except bars on an Excel sheet. I'm putting out real, actual footage of the CPU in action, which no review does, and here you are handwaving it away. Whatever man, believe what you want; I know you ain't getting convinced. But I can be convinced: just find me some actual footage of a 7800X3D beating my stock 12900K and I'll be sold, I swear.

I've already posted footage in TLOU of a tuned 7800X3D with 6000C30 RAM on an AIO vs my stock, passively cooled 12900K. It still lost. Must have been his keyboard, I guess...
 
Then nothing will convince you, since your own Excel bars are not true to life and don't represent a real comparison.

I trust the results from Steve, where Intel wins in Starfield, loses in Cyberpunk and is equal in TLOU when comparing the 14900K with the 7800X3D (tests done with DDR5-7200 CL34 for Intel and DDR5-6000 CL30 for AMD).

No amount of "fine tuning" expensive RAM will make the 12900K faster than the 14900K.
 
Agreed, no amount of tuning can make the 12900K faster than the 14900K. But it is faster than the 7800X3D.
 
According to nobody but you and your CL10 9001MHz RAM. I wouldn't be surprised if these results weren't actually taken with the CPU limited to 125W, either.
 
Nope, less than 125W. Only in TLOU am I getting around that power draw; in other games it's around 70-90W.

If that's not the case, then surely I'll eventually find someone who will post some actual footage beating the results I posted. Let's all pray that happens.
 
Give us a break with your singled-out game where you have 4-5 more FPS.

You cannot tell us that these two screenshots show similar test conditions:

From your video:
[Screenshot 2024-06-19 at 15.10.49.jpg]

The one you are comparing to:
[Screenshot 2024-06-19 at 15.13.10.jpg]

These look like very different methods of recording the video (application/codecs) and a different FPS tool. The second one mentions using OBS with CPU encoding to get better image quality, and it shows. Yours is using Nvidia Shadowplay/Share.

This is why I told you that your results are not directly comparable.

I can't be bothered to find TLOU in random videos, but here's how 1080p 60FPS CPU encoding affects the 12900K at different settings: the FPS drops are big.
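(For anyone who wants to quantify that overhead themselves, the same desktop session can be captured twice with ffmpeg, once CPU-encoded and once on NVENC. A sketch, assuming Windows with ffmpeg on PATH; the duration, bitrate and file names are arbitrary:)

```python
# Sketch: record the same desktop session twice, once CPU-encoded
# (libx264) and once GPU-encoded (NVENC), to see what each costs in
# frame rate. Assumes Windows + ffmpeg on PATH.
import subprocess

def record(encoder: str, outfile: str, seconds: int = 60) -> None:
    subprocess.run([
        "ffmpeg", "-y",
        "-f", "gdigrab", "-framerate", "60", "-i", "desktop",
        "-t", str(seconds),
        "-c:v", encoder, "-b:v", "8M",
        outfile,
    ], check=True)

if __name__ == "__main__":
    record("libx264", "cpu_encoded.mp4")      # hits the CPU hard
    record("h264_nvenc", "gpu_encoded.mp4")   # offloaded to the GPU
```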
 
You obviously have no idea. If he'd used the CPU to encode, he would be getting half the FPS you are seeing. He has better quality because I'm using the minimum amount of bandwidth; don't be silly. I'll re-record at a high bitrate just for you, but it seems pointless, you won't accept the results.

This is my friend's 5800X3D. He is using the Nvidia encoder just like I am. Now find another excuse, bud.

 
Stop embarrassing yourself, dude. The video description clearly states that he used OBS with CPU encoding. Put out a CPU-encoded video in OBS with the same codec, bitrate and other settings. If you can't, then all you are doing is comparing an apple to a cow.

FYI, it is known that the 5800X3D doesn't do well in TLOU. Benchmarks here put it below the 7600X. Why are you bringing this up as an "argument"? I could also say that the 11900K does even worse than the 5800X3D, but there'd be zero point in doing so.
 
He is not using a CPU encoder, you maniac. Who the heck would bench their CPU in a game while encoding in the background, lol.

Whatever man, your opinion will just not change no matter what. Fine, yeah, he was encoding in the background :joy:
 