Intel Core i9-14900K, i7-14700K and i5-14600K Review

I think upgrading/buying new CPUs has become pretty pointless now for 99.9999% of users. Even CPUs from 15 years ago, like the i7 920, are plenty for everyday tasks/users, and anything from the Ivy Bridge era onward is still plenty good for PC gamers. 4c/8t CPUs from ye olde Bloomfield can still drive most games and give you a decent experience, unless the game requires newer instruction sets, like the last CoD. The i7 10700KF is going to be my last stop, and looking at these reviews only confirms it: just minuscule improvements in terms of the actual user experience... for a whole lot of coin.
 
Then if it's productivity you're interested in, the 7950X is hard to beat at $580; its 16 big cores, if you will, often end up being faster for complex workloads.
As long as legacy USB device reliability isn't important. Anecdotal reports of USB 2.0 connection problems continue despite several AGESA 'fixes' released over the last 2 years for both AM4 and AM5. There are lots of unconfirmed theories about this, ranging from tetchy Infinity Fabric voltages to motherboard chipset layouts. I'm speaking as someone who uses an AMD system as my primary workstation but has to switch to a backup Intel box to read certain SanDisk thumb drives and do some scanning.
 
I think the last time they pulled this stunt was with the Haswell Refresh chips, but you could also argue that everything from Skylake all the way to the 10th-gen desktop chips was just clock and core-count bumps. Zen 5 is supposedly a big jump over Zen 4, so I hope Intel is still competitive with 15th Gen.
 
If I had to build a PC right now, whether for gaming or productivity, I would definitely choose Ryzen 7000. Simply better upgradability and efficiency. Intel is pulling the same trick as the Pentium 4 Extreme Edition: pushing performance with unreasonable power consumption.

Meteor Lake is interesting; I hope Intel will have much better CPUs for the new Core (Ultra) series. More competition is better for us.
 
It's a mind game. If you have the inferior product in terms of efficiency or performance, you just release a higher model number to make it look superior to the previous generation for brand loyalists, naive consumers and the like. This is to keep the price stack from collapsing, imo. Zen 4 3D is probably a big headache, especially with the 7800X3D's price falling to $329.99.
Not really. Those who already have an AM4 system only need a 5800X3D, which is more than enough for the latest graphics cards (no new motherboard, DDR5, etc. required). Someone who wants a whole new system because theirs is very old can wait for Zen 5.
 

Yea, but marketing will always try to sell you something you don't really need. The fact that my 4790K can run Starfield smoothly at an average of 45 fps at 1440p high proves your point. Sure, I removed all the bloatware from my PC and don't have 20 tabs open while playing, but game engines don't really utilise more than those 2-4 main cores anyway (a quick way to check this yourself is sketched below). Starfield has supposedly been in development for several years; the devs knew it was primarily targeting the AMD CPU in the Xbox Series X and still failed to optimise it. Funnily enough, it runs better on Intel.

If they added at least 2 E-cores to the 14100 and kept the same price, I think it would sell, because that's the bare minimum the majority needs. But they are overcharging for extra cores anyway, and the diminishing returns in performance per extra core are in most cases crazy.
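A quick way to sanity-check the "2-4 main cores" claim is to log per-core load while a game is running. This is a minimal sketch, assuming the third-party psutil package is installed (pip install psutil); the sample count, interval, and 50% threshold are arbitrary choices, not anything from the review.

```python
import psutil  # third-party: pip install psutil

SAMPLES = 30    # ~30 seconds of monitoring
INTERVAL = 1.0  # seconds between samples

peak = [0.0] * psutil.cpu_count()
for _ in range(SAMPLES):
    # percpu=True returns one utilisation percentage per logical core
    loads = psutil.cpu_percent(interval=INTERVAL, percpu=True)
    peak = [max(p, l) for p, l in zip(peak, loads)]

for core, load in enumerate(peak):
    print(f"core {core:2d}: peak {load:5.1f}%")

busy = sum(1 for l in peak if l > 50.0)
print(f"{busy} logical cores ever exceeded 50% load")
```

If only a handful of cores ever get busy while the game runs, extra cores past that point buy you nothing in that title.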
 
Where are the 2K & 4K res charts?
But then you are at a GPU bottleneck. They also have 720p data to show the biggest deltas. Even at 1440p there are no significant leads between all the halo CPU products, unless you compare against an older CPU going back 3 generations or more. (A toy model of this is sketched below.)
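The GPU-bottleneck point is easy to see with a toy model: the frame rate you get is roughly the slower of what the CPU and the GPU can each deliver. All the numbers below are made up purely for illustration, not benchmark results.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Whichever component is slower sets the frame rate."""
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 250.0, 200.0  # hypothetical CPU-limited frame rates
gpu_by_res = {"720p": 400.0, "1440p": 180.0, "4K": 90.0}  # hypothetical GPU limits

for res, gpu_fps in gpu_by_res.items():
    fps_a = effective_fps(cpu_a, gpu_fps)
    fps_b = effective_fps(cpu_b, gpu_fps)
    print(f"{res:>6}: CPU A {fps_a:5.1f} fps vs CPU B {fps_b:5.1f} fps")
```

At 720p the 25% CPU gap is fully visible; at 1440p and 4K both CPUs hit the same GPU ceiling, which is exactly why high-resolution charts tell you little about the CPU.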
 
I wish you guys would include an emulation-based benchmark in your testing.

These days I don't really bother with big new games but stick to emulators, where single-core IPC and GHz reign supreme.

It's why to this day I still tend to stick with Intel, but I've heard these 3D V-Cache CPUs from AMD are very effective in some emulators, so I could be tempted to switch.
The latest AMD Zen CPUs seem to have much better IPC than any of these Intel CPUs.
 
More like 13.5th gen; it's the i7-7700K all over again :D

(Why is the 11900K bad at Spider-Man?)
The 7000 series was a refresh of the 6000 series, which was basically a refresh of the 4000 series. So that's three generations of the same architecture with basically zero IPC improvements... lol

On the flip side, I am still running my 4790K because it is still decent for lower-tier gaming and as an HTPC on my secondary computer.
 
Running Blender and Cinebench on a loop with no power limits is something nobody does. Very informative data there.
All the gaming power numbers are there too. Are you whining about too much information? Nobody uses CPUs for rendering? Just ignore the data that's not relevant to you! Gasp!
 
No, I said nobody runs Blender and Cinebench at 6 GHz with a 4096 W power limit. That information is there just for the clicks, imo. It doesn't serve anyone any purpose, since no one is going to run their CPU at 400 W for 10-hour workloads.
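For anyone wondering what that "4096 W power limit" actually is: on Linux, Intel's package power limits (PL1/PL2) are exposed through the RAPL powercap interface, and an effectively unlimited value like 4096 W is what lets the chip pull 400 W in an all-core render. A minimal read-only sketch, assuming the standard powercap sysfs layout (paths can differ per system, and reading may need root):

```python
from pathlib import Path

# Package-level RAPL domain; adjust the index on multi-socket systems.
RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")

name = (RAPL / "name").read_text().strip()  # usually "package-0"
# Constraint 0 is the long-term limit (PL1), constraint 1 the short-term (PL2).
for c in (0, 1):
    limit_uw = int((RAPL / f"constraint_{c}_power_limit_uw").read_text())
    window_us = int((RAPL / f"constraint_{c}_time_window_us").read_text())
    print(f"{name} constraint {c}: {limit_uw / 1e6:.0f} W "
          f"over a {window_us / 1e6:.3f} s window")
```

Writing a lower value back to constraint_0_power_limit_uw (as root) is one way to cap sustained draw without touching the BIOS.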
 
I locked all 8 cores on my 10700KF to 5.128 GHz, and at 1440p max/ultra settings the experience is definitely not night and day compared to my i7 5930K @ 4.5 GHz. It is not the massive difference of going from my PIII 1 GHz to my P4 2.8 GHz, or from my Q9650 to my i7 920.
 
The power usage in games is even worse for Intel compared to AMD.

And yes, it is very informative data, because you don't want to build a PC that will just shut down or blow your power supply if you do a simple 2-3 minute render in Blender, something I do often enough.

You seem to have an issue with testing the CPU for specific workloads.

"noone is gonna be running his CPU at 400w for 10 hour long workloads" - what? tell us more about how you don't know how people actually render. the CPU can 100% be used to speed up the GPU, and the start of the render is heavy on the CPU even if you only render on the GPU.

This is not about running the CPU at 100% for half a day. Why do you have to reach for extreme examples to argue against something that is common sense?
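On the "CPU can speed up the GPU" point: Blender's Cycles engine really can render on the CPU and GPU together, which is one reason CPU power draw matters even for "GPU" renders. A sketch of the relevant settings, assuming a recent Blender 3.x build with a CUDA-capable card; run it from Blender's scripting workspace, not a plain Python interpreter:

```python
import bpy  # only available inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# Enable every compute device, CPU included, for hybrid rendering.
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'  # or 'OPTIX'/'HIP' depending on the card
prefs.get_devices()                 # refresh the detected device list
for dev in prefs.devices:
    dev.use = True
    print(f"enabled: {dev.name} ({dev.type})")
```

With the CPU devices ticked, the CPU contributes for the whole render, so its all-core power behaviour is very much in play.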
 
It's called a power limit. Do you run Blender at 400 watts? Who does that?
 
I'm saying that people who run hours-long workloads don't blast them at 400 W. That's common sense.
And what about short workloads? Do those not count when it comes to power usage?

And the gaming workloads are 100% long-term workloads. The system hit ~650 W in Hitman 3, for crying out loud. That's almost 160 W more than the 7950X system and 200 W more than the 7950X3D. I don't think you understand just how huge that is. (Some quick math on that below.)

Repeat after me: it used 200 more watts in a game.
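To put that 200 W gaming delta in money terms, here is a back-of-the-envelope sketch. The wall-power figures come from the discussion above; the hours per day and electricity price are assumptions, so plug in your own:

```python
INTEL_SYSTEM_W = 650   # approx. total system draw in Hitman 3 (from above)
AMD_SYSTEM_W = 450     # approx. 7950X3D system draw (from above)
HOURS_PER_DAY = 3      # assumed gaming time
PRICE_PER_KWH = 0.30   # assumed electricity price, $/kWh

delta_kwh_year = (INTEL_SYSTEM_W - AMD_SYSTEM_W) / 1000 * HOURS_PER_DAY * 365
print(f"extra energy: {delta_kwh_year:.0f} kWh/year")               # ~219 kWh
print(f"extra cost:   ${delta_kwh_year * PRICE_PER_KWH:.0f}/year")  # ~$66
```

And that is before counting the extra heat dumped into the room and the beefier PSU and cooling needed to handle it.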
 