The Best CPUs: Productivity and Gaming

Clicking the pricing links on Intel processors like the 12700KF and 13600KF shows that prices have jumped since this article was written (both are now $272). It really skews the analysis unless you can find those parts at the quoted prices.
 
I just built an i7-13700KF system and I have to say I'm not impressed with the heat; even with a 360 mm AIO it runs hot, which is not a good sign, in my opinion, for the future of Intel if they can't get this stuff under control. My i9-10920X is worse in that department, but only my i7-10700KF has what I would call great temps. Speed-wise it's not THAT impressive over my i7-10700KF at 5.129 GHz on all cores, especially at 1440p/4K max settings. I just hope there is some cooling breakthrough in the near future.
 
The 13900K and 14900K are faster than the 7950X3D in productivity apps. See here: https://www.pugetsystems.com/labs/a...tion-review/#Video_Editing_Adobe_Premiere_Pro

And since when did power users care about energy consumption?
You mean Intel is slightly quicker specifically in Photoshop, and pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Let's be honest: if you're spending that much on a top-end CPU, you'll probably have a GPU in your system capable of the same tasks anyway.

It's not that power users care all that much about power consumption, but why buy a slower CPU that draws nearly twice the power? And thanks to all that extra power, you also need to think harder about cooling.

I'll never understand the love for Intel; for over 10 years they were incredibly lazy and got themselves into this mess…
 
You mean Intel is slightly quicker specifically in Photoshop, and pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Let's be honest: if you're spending that much on a top-end CPU, you'll probably have a GPU in your system capable of the same tasks anyway.

It's not that power users care all that much about power consumption, but why buy a slower CPU that draws nearly twice the power? And thanks to all that extra power, you also need to think harder about cooling.

I'll never understand the love for Intel; for over 10 years they were incredibly lazy and got themselves into this mess…
Love is irrational.
 
You mean Intel is slightly quicker specifically in Photoshop, and pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Let's be honest: if you're spending that much on a top-end CPU, you'll probably have a GPU in your system capable of the same tasks anyway.

It's not that power users care all that much about power consumption, but why buy a slower CPU that draws nearly twice the power? And thanks to all that extra power, you also need to think harder about cooling.

I'll never understand the love for Intel; for over 10 years they were incredibly lazy and got themselves into this mess…
Did you even look at the benchmarks? It wins in everything but Blender and V-Ray...
 
Did you even look at the benchmarks? It wins in everything but Blender and V-Ray...
Oh yeah, 5-10% (at most) of “winning” for 92% more power usage…

Kinda embarrassing that Intel doesn't wipe the floor, let's be honest. For such massive power draw I'd want a minimum of 20-30% more performance, and for 92% more power you'd be looking for 50% more.

As it stands, the fact that AMD keeps up with, or even beats, Intel is simply embarrassing.
 
How about the Ryzen 7 5700X? It's been selling on Amazon Canada for CAD $216 (about US $159). And as a 65 W part, you get to reuse the old CPU cooler from your Ryzen 5 2600X.

If that’s not a bargain, I don’t know what is.
 
You mean Intel is slightly quicker specifically in Photoshop, and pulls ahead slightly when Quick Sync is enabled in Premiere Pro? Let's be honest: if you're spending that much on a top-end CPU, you'll probably have a GPU in your system capable of the same tasks anyway.

It's not that power users care all that much about power consumption, but why buy a slower CPU that draws nearly twice the power? And thanks to all that extra power, you also need to think harder about cooling.

I'll never understand the love for Intel; for over 10 years they were incredibly lazy and got themselves into this mess…
Intel draws double the power if you want it to draw double the power. Reviewers go into the BIOS, remove the power limit, and then test. That's nonsense. A 14900K at the same power limit as the 7950X offers very similar performance. AMD is not more power efficient; reviewers are just running Intel balls to the wall so they can pretend it isn't efficient.
 
Oh yeah, 5-10% (at most) of “winning” for 92% more power usage…

Kinda embarrassing that Intel doesn't wipe the floor, let's be honest. For such massive power draw I'd want a minimum of 20-30% more performance, and for 92% more power you'd be looking for 50% more.

As it stands, the fact that AMD keeps up with, or even beats, Intel is simply embarrassing.
Again, that's nonsense. Performance and power draw don't scale linearly. You say the difference is only 10%; OK, how much power do you think the 7950X would need to run 10% faster? Probably double, exactly as much as the 14900K needs.

Fact is, at 220 W the 14900K does about 40k; pushing it to 400 watts to get 43k is just dumb, and you can't do that and then complain about the power draw.
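To put numbers on the diminishing returns being argued here, a quick back-of-the-envelope calculation using the figures quoted above (roughly 40k points at a 220 W limit versus roughly 43k points with the limit removed at about 400 W). These are the rough values from this thread, not measured data:

```python
# Rough figures quoted in this thread (not measured data):
# ~40,000 Cinebench points at a 220 W limit,
# ~43,000 points with the limit removed (~400 W package power).

def points_per_watt(score: float, watts: float) -> float:
    """Simple efficiency metric: benchmark points per watt."""
    return score / watts

capped = points_per_watt(40_000, 220)    # ~181.8 pts/W
uncapped = points_per_watt(43_000, 400)  # 107.5 pts/W

extra_perf = 43_000 / 40_000 - 1   # ~7.5% more performance
extra_power = 400 / 220 - 1        # ~81.8% more power

print(f"capped:   {capped:.1f} pts/W")
print(f"uncapped: {uncapped:.1f} pts/W")
print(f"{extra_perf:.1%} more performance costs {extra_power:.1%} more power")
```

By this metric, removing the limit cuts efficiency nearly in half for a single-digit performance gain, which is exactly the nonlinearity being described.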
 
Man, the Intel fanboys are out today :joy:

I love the fact no matter how much evidence there is, no matter how many reviewers you watch, no matter how many benchmarks are done, some people will simply believe what they want to believe.

I’m glad I’m not one of them.
I just want to leave a quote from the article:

“For example, running Cinebench, the 14900K system consumed 533 watts, which was 92% more than the 277 watts used by the 7950X3D”
 
Fact is, at 220 W the 14900K does about 40k; pushing it to 400 watts to get 43k is just dumb, and you can't do that and then complain about the power draw.
Then why does the 14900K take 400 watts and not 220? Because Intel decided so.

I know it "feels" dumb to run current CPUs at full throttle, but since the large majority of people have no clue about adjusting settings, or are unable to because their motherboard is trash, CPUs are tested at default settings. If Intel makes a CPU take 400 watts by default, then that's how reviewers test it. That's Intel's fault.

Basically, you should ask why the 14900K runs so hot at default settings. I know the answer, but hopefully you do too.
 
Then why does the 14900K take 400 watts and not 220? Because Intel decided so.
No, because the reviewer went into the BIOS and CHOSE the unlimited option.

Every Intel motherboard I've tried, the first time you get into the BIOS to enable XMP it forces you to choose power limits from three options. It's three big pictures taking up the whole screen and you have to pick one. The first option is usually around 125 W, the second is 180-220 W (it really depends on the CPU), and the third is unlimited. The reviewer chose unlimited and then complained about the power draw. Even my Z690 Apex, which is basically a motherboard for balls-to-the-wall overclocking, still forced me to choose on first boot.

And that's not even the issue. The issue is that in all those workloads where Intel is faster (the Adobe apps and the like) it's not just faster, it's actually way more efficient than the AMD counterparts, up to 70% more efficient in fact. Technotice has been running these tests for ages, and from Alder Lake onward Intel is the clear choice for content creators, both for speed and efficiency.

Even Steve himself posted on his Twitter (I think it was back with the 12900K review) that HE did this: he went into the BIOS and chose unlimited. So... this is userbenchmark-level propaganda.

Intel's default specs are a 253 W power limit (PL2) and 307 A ICCmax, which basically translates to around 220 W in CBR23.
 
“For example, running Cinebench, the 14900K system consumed 533 watts, which was 92% more than the 277 watts used by the 7950X3D”
Try to make the 7950X3D match the 14900K's performance and then tell me how much power it uses. If I had to bet, I'd say more than the 14900K, which is the point.
 
Try to make the 7950X3D match the 14900K's performance and then tell me how much power it uses. If I had to bet, I'd say more than the 14900K, which is the point.
So when you limit both CPUs to, let's say... 80 watts? Sound fair?
Let's go with 80 watts... let me find a reviewer that's done that, hold on...

OK, der8auer has done exactly this and found the 14900K to be laughably behind AMD.

Are you now going to try and argue that the Intel CPU is better on power in a very specific range? Because if you are, you're just making AMD look better and better :cool:
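A same-wattage comparison like the one proposed above is easy to sketch. The curves below use entirely made-up numbers (the thread doesn't give der8auer's actual figures), purely to show the shape of the comparison:

```python
# Hypothetical efficiency curves: power cap (W) -> multi-thread score.
# These numbers are invented for illustration; real results depend on
# the chip sample, motherboard, and workload.
cpu_a = {80: 24_000, 125: 32_000, 253: 40_000}
cpu_b = {80: 30_000, 125: 35_000, 230: 38_500}

def relative_perf(cap: int, a: dict, b: dict) -> float:
    """Score of CPU A as a fraction of CPU B at the same power cap."""
    return a[cap] / b[cap]

for cap in (80, 125):
    print(f"at {cap} W, CPU A reaches {relative_perf(cap, cpu_a, cpu_b):.0%} of CPU B")
```

The point of the exercise: a chip that wins with the limits removed can still lose badly at a tight cap, so any efficiency claim has to state the power limit it was measured at.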
 
So when you limit both CPUs to, let's say... 80 watts? Sound fair?
Let's go with 80 watts... let me find a reviewer that's done that, hold on...

OK, der8auer has done exactly this and found the 14900K to be laughably behind AMD.

Are you now going to try and argue that the Intel CPU is better on power in a very specific range? Because if you are, you're just making AMD look better and better :cool:
He tested a multithreaded workload and the 14900K is laughably behind? I call BS. Sorry, I haven't seen the video, but if that's what he found I'm not clicking it; it's just flawed.
 
He tested a multithreaded workload and the 14900K is laughably behind? I call BS. Sorry, I haven't seen the video, but if that's what he found I'm not clicking it; it's just flawed.
Oh yeah, der8auer is famous for flawed reviews /s

I'll leave it at that; it's just hilarious to see Intel fanboys like yourself literally cover your eyes and ears and pretend it all away.
 
Oh yeah, der8auer is famous for flawed reviews /s

I'll leave it at that; it's just hilarious to see Intel fanboys like yourself literally cover your eyes and ears and pretend it all away.
So you're saying he actually tested multithreaded workloads with both CPUs at 80 W and the 14900K was laughably bad? Is that what you're saying? I asked before but you never answered. I have to assume you're lying because, as you've said yourself, he's not famous for flawed reviews, but you get the benefit of the doubt. Please answer the question, thanks.

You call me an Intel fanboy, but you keep squirming around the actual point so you don't have to admit the failure of your position.
 
You call me an Intel fanboy, but you keep squirming around the actual point so you don't have to admit the failure of your position.
I'm squirming? The failure of my position?! HAHAHAHA! You are a funny one.
I have no position. Fact is, the AMD CPU is better in gaming, it's usually faster, and in other workloads it's 5-10% behind while drawing roughly half the power.

Those are the facts. I know you don't like it, and I don't know why you're trying to defend a multi-billion-dollar corporation that doesn't care about you and fed us rubbish for years, but many, MANY reviewers come to the same conclusion: the AMD offerings are overall better (for now).

It can change, you know; maybe AMD's next range won't move the needle significantly and Intel will release something decent. Who knows.
All I know is I'll do what any sane person does: wait for the reviews and buy whatever is best for my budget at the time, with a little eye on power consumption, since power is expensive where I live and heat generation can be a problem during the summer.
 
I'm squirming? The failure of my position?! HAHAHAHA! You are a funny one.
I have no position. Fact is, the AMD CPU is better in gaming, it's usually faster, and in other workloads it's 5-10% behind while drawing roughly half the power.
Your original position was that the 14900K needs 90% more power for 10% more MT performance than its competitors. Which is of course correct, but by the same token, those competitors ALSO need roughly 90% more power for their last 10% of performance. So it's completely irrelevant: in MT workloads, AMD's and Intel's high-end solutions are roughly tied (and Intel wins in the mid to low end). Set them both to the same wattage and then test; that's what sane people who care about efficiency do. If you allow one to draw 4096 watts and limit the other to 150 W, of course the latter is going to look more efficient, lol.

Then you posted a der8auer video that supposedly tested a 14900K at 80 W in MT workloads and found it laughably bad. Which is a lie, because that never happened; I actually checked the video. So why are you making stuff up? I don't get it.
 
Your original position was that the 14900K needs 90% more power for 10% more MT performance than its competitors.
Is that what I said? Wasn't I just quoting the article?
Set them both to the same wattage and then test...
Then you posted a der8auer video that supposedly tested a 14900K at 80 W in MT workloads and found it laughably bad. Which is a lie, because that never happened; I actually checked the video. So why are you making stuff up? I don't get it.
Okay, okay, you got me. I very quickly searched for anyone who'd done this (not many have, to be fair, since nobody buying a 14900K would limit it like that), but it still shows Intel sucks ;)
Edit: What timing:
When your much more expensive, power-guzzling Intel CPU loses almost across the board to the much cheaper and far less power-hungry AMD CPU.

But I'm sure the Intel is better in many other ways :cool:
 
Burty117 and Strawman, please discontinue your personal argument in this thread. If you feel you need to continue, do so using PM. Thank you.
 