Intel Core i7-12700KF Review: Better than Core i9?

Yeah, more efficient because the CPU load is far from 100%. That's how you cherry-pick testing methods :D

Socket LGA1700: 2021
Socket AM4: 2017

Basically, Alder Lake is hot as hell, has very poor software compatibility, requires ultra-expensive DDR5 to run properly, and its upgrade path is abysmal.

Intel just sucks, admit it already :bomb:
You test efficiency with the applications you are actually using. If you are not going to render, there is no reason to care about efficiency while rendering, is there? And if you are going to render, you power-limit the CPU to 160 W, where it's most efficient. Not that hard, is it?

In 99% of productivity workloads the 12900K absolutely massacres Zen 3 in both performance and efficiency. That's just a fact: Photoshop, Premiere, transcoding, export, SolidWorks, AutoCAD and a whole load of scientific applications.

Also, no matter which socket you buy right now, you are going to get one upgrade for each (Intel 12th gen and Zen 3D), so that argument is just invalid either way.
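(For anyone who actually wants to try that 160 W cap: below is a rough sketch using the Linux powercap/intel_rapl sysfs interface. The path and the 160 W value are just examples, so check your own system before writing anything; on Windows you would set PL1/PL2 in the BIOS or in Intel XTU instead.)

    # Rough sketch: cap the package long-term power limit (PL1) to 160 W via the
    # Linux powercap interface. Needs root and the intel_rapl driver loaded; the
    # path below is the usual package-0 domain, but verify it on your machine.
    RAPL_PKG = "/sys/class/powercap/intel-rapl:0"
    LIMIT_W = 160  # long-term limit in watts

    with open(f"{RAPL_PKG}/constraint_0_name") as f:
        print("constraint 0 is:", f.read().strip())  # normally "long_term", i.e. PL1

    with open(f"{RAPL_PKG}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(LIMIT_W * 1_000_000))  # the file takes microwatts

    print(f"PL1 capped at {LIMIT_W} W (resets on reboot)")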
 
Somewhat better, yes.

Crushing Ryzen? Lol, no.

Waiting to see what 3D V-Cache can do. And quite frankly, the real show will be Zen 4 vs. Raptor Lake.
No worries. Raptor Lake will have difficulties even against Zen 3 with 3D V-Cache. Zen 4 will be much better.
Render applications that put the CPU at maximum load show quite well how much power the CPU will consume at maximum load. That also sets the minimums for cooling, VRM and PSU, for obvious reasons. 99.9% of users are not limiting CPU power after testing where it's most efficient, or anything like that.

Adobe software is poorly optimized, and the same goes for AutoCAD. Of course Intel is "efficient" there, because the CPU load is very low.

It's not invalid, because there is no guarantee you will get even one upgrade on LGA1700. It also shows AMD offers much better upgrade paths in general. On top of that, there are tons of CPUs you can use in an AM4 socket, while only a few can be used with LGA1700. That makes a huge difference when you end up with a used motherboard without a CPU.
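(On the "sets the minimums for cooling, VRM and PSU" point above, the sizing really is just worst-case arithmetic. A toy sketch, with every number an illustrative placeholder rather than a measurement:)

    # Back-of-the-envelope sizing from worst-case draw rather than "typical gaming"
    # draw. Every number here is an illustrative placeholder, not a measurement.
    cpu_peak_w = 240   # package power under an all-core stress test
    gpu_peak_w = 350   # GPU board power limit
    rest_w = 75        # motherboard, RAM, fans, drives, USB
    headroom = 1.3     # ~30% margin keeps the PSU in its efficient range

    system_peak = cpu_peak_w + gpu_peak_w + rest_w
    print(f"Worst-case draw ~{system_peak} W, suggested PSU ~{system_peak * headroom:.0f} W")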
 
But if you are not rendering, you shouldn't care about consumption during rendering. If you are, you should set a power limit. Most users who get a K chip actually do tinker with it; that's the whole point.

If the CPU load is so low, why isn't Zen 3 as efficient? Why is it getting destroyed in both efficiency and performance?
 
The point is that software other than renderers tends to use cores much better than before, and future software may do it even better. If that happens, efficiency goes down.

Because the Ryzen 5000 desktop series is not meant to be efficient AND it's not even meant to perform as well as it could. They are basically downgraded server chips made with very low effort. For efficient, well-performing chips, look at the Ryzen 5000 laptop APUs.

AMD only uses chiplets for desktop CPUs because:

1. AMD has to use GlobalFoundries 12/14nm for something
2. The chiplet design saves 7nm die area
3. Making a 16-core desktop CPU becomes trivial

But for making an efficient and fast desktop CPU, chiplets are a no-go.
 

In some games, Alder Lake actually is truly crushing Ryzen 5000. Age of Empires 4, for example.

120 vs. 80 minimum fps between the i5-12600K and the 5800X here.

If this brand-new game is a sign of things to come, AMD should be worried. Alder Lake is a brand-new design and its performance is not fully maxed out yet. It's a work in progress, just like the Ryzen 1000 series was.

I expect Raptor Lake vs. Zen 4 to be really interesting.
 
Age of Empires is using an outdated engine; it's not a brand-new game. Just like Far Cry 6 is using an engine from 2012 despite launching this year. Heck, you could release the original Doom today and call it "brand new" despite it technically being 1994 stuff. And honestly, who cares about FPS in a strategy game?

Alder Lake isn't a brand-new design either; the Cove line is already three generations in.

At least learn the basics. Sunny Cove ("Ice Lake") was developed from Skylake. Alder Lake uses Golden Cove, which was developed from Willow Cove, which was based on Sunny Cove. While Ryzen really was brand new, Alder Lake is based on Sunny Cove, which is based on Skylake, which is based on Sandy Bridge from 2011. It's NOTHING like the Ryzen 1000 series and is already pretty much maxed out.
 
Price-wise it is at the 5800X level, but thread-wise it is a 5900X. I think AMD will soon adjust its prices accordingly. Thanks, Intel, for returning to competitive mode; it's not every day Intel does that, to put it mildly.

Having said that, many people choose the 5800X since it can fit into small(ish) chassis. Not this one, with its 360 mm AIO requirement. That in itself pushes it toward non-mainstream users, since not everyone wants a full ATX tower.
 

High refresh rate gamers do. 120-144 Hz is pretty much the bare minimum for most gamers today, so 80 vs. 120 fps is like day and night. Who cares what engine the game uses? I look at reality and performance only.

Do you avoid games that run best on Intel? Haha. The Far Cry series is hugely popular with high critic scores.

The Ryzen 1000 series was a beta test. I know because I had one, and I know several others who sold their Ryzen 1000 chips and went back to Intel with the release of the 8700K, overclocked it to 5 GHz on all cores, and had flawless performance, and still do. An overclocked 8700K easily beats the Ryzen 1000, 2000 and 3000 series; it even performs on par with the 5000 series in many games.

Only with Ryzen 5000 did AMD become a true alternative for high-end users; however, Intel is back with Alder Lake, and AMD is back to focusing on performance for the money, like they should. AMD is the cheaper brand and always will be. Not fanboying, just facts: their R&D budget is about 1/10 of Intel's and Nvidia's.

AMD's timing with Ryzen was perfect, because Intel was stuck on 14nm. However, they are not anymore, and AMD is being forced to cut prices.

3D cache will do next to nothing. It's a stopgap solution to counter Alder Lake. AMD can't use TSMC 5nm before late 2022 or early 2023; the cost is too high, and it would ruin AMD's pricing completely. Apple owns TSMC's 5nm capacity for as long as they want it. Apple is TSMC's best customer and always will be, unless Apple leaves them.

Zen 4 will probably be a beta platform like Ryzen 1000 and Alder Lake, though. Raptor Lake will be mature at that point: a 20% IPC uplift, a better node, more cores, plus Windows 11 and Thread Director perfected and software and game bugs ironed out. I will be very impressed if AMD is even close in terms of performance.

AMD had a good run, though.
 

The i7-12700K uses ~50 watts more than the 5800X at stock and beats it in pretty much everything with ease.

You can easily cool a 12700K with air or a cheap 240 mm AIO, and run it in an ITX case just fine.

Did you watch the reviews? The chip runs cool at stock: 64°C according to TechPowerUp. The 5800X hit 75°C in their test using THE SAME COOLER.

So let's be real here. AMD fanboys are raging and grasping at straws about the watt usage, but Alder Lake only gets hot and draws a lot of watts when you overclock it a lot. When you do that, it destroys the Ryzen 5000 series in terms of performance. No one says you should OC your CPU, and barely anyone does. Stock performance and stock watts are what matter for 99% of people.

Enthusiasts have the option to run Alder Lake at 5.2-5.4 GHz on all cores if they want to; the OC headroom is there. Ryzen is hitting a brick wall in terms of clock speeds, and cooling won't make much difference for 24/7 usage.

It's funny that some people think a CPU that can use 250-300 watts in synthetics and burn-in tests will run anywhere near that in regular workloads and gaming.

An i9-12900K at 5.2 GHz on all cores draws about 100 watts in gaming. Look at reality instead of cherry-picking your numbers. Peak watt usage in AVX2 workloads does not matter for most people and has nothing to do with real-world temps or heat.
 
High refresh rate gamers do. 120-144 Hz is pretty much the bare minimum for most gamers today, so 80 vs. 120 fps is like day and night. Who cares what engine the game uses? I look at reality and performance only.
Playing low-paced strategy games, the difference between 60 and 144 FPS is almost non-existent. You hardly notice anything. Try it if you don't believe me.

Outdated engines usually get improved eventually, and then the performance picture looks very different.
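(For reference, the raw frame-time math behind the numbers being thrown around; this is just arithmetic, and whether the gap matters in a slow RTS is the actual disagreement:)

    # Frame-time view of the fps numbers being argued about here.
    for fps in (60, 80, 120, 144):
        print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
    # 60 fps = 16.7 ms/frame, 144 fps = 6.9 ms/frame; whether that ~10 ms gap is
    # noticeable in a slow strategy game vs. a shooter is exactly the dispute.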
Do you avoid games that run best on Intel? Haha. The Far Cry series is hugely popular with high critic scores.
The engine is outdated, period. "Critics" usually don't understand anything if they give high scores to dumbed-down games. I have played Far Cry 2 for several hundred hours, but Far Cry 5, for example, is way too easy to offer any kind of challenge. I conquered a town with a crappy melee weapon on the hardest difficulty. That sucked.

There you go:

High scores for a 2020 game that is miles behind a 2012 game from the same series 🤦‍♂️
The Ryzen 1000 series was a beta test. I know because I had one, and I know several others who sold their Ryzen 1000 chips and went back to Intel with the release of the 8700K, overclocked it to 5 GHz on all cores, and had flawless performance, and still do. An overclocked 8700K easily beats the Ryzen 1000, 2000 and 3000 series; it even performs on par with the 5000 series in many games.
It does, but with huge power consumption. Just like Ryzen 5000 is the third-generation Ryzen, Alder Lake is the third-generation Cove. And since that line traces back to 2011, there is not much room for improvement without a heavy redesign.
Only with Ryzen 5000 did AMD become a true alternative for high-end users; however, Intel is back with Alder Lake, and AMD is back to focusing on performance for the money, like they should. AMD is the cheaper brand and always will be. Not fanboying, just facts: their R&D budget is about 1/10 of Intel's and Nvidia's.

AMD's timing with Ryzen was perfect, because Intel was stuck on 14nm. However, they are not anymore, and AMD is being forced to cut prices.

3D cache will do next to nothing. It's a stopgap solution to counter Alder Lake. AMD can't use TSMC 5nm before late 2022 or early 2023; the cost is too high, and it would ruin AMD's pricing completely. Apple owns TSMC's 5nm capacity for as long as they want it. Apple is TSMC's best customer and always will be, unless Apple leaves them.
AMD is focusing on the server market; that's where the money is.

AMD is already using TSMC 5nm. They want to build up stock before releasing, though.
Zen 4 will probably be a beta platform like Ryzen 1000 and Alder Lake, though. Raptor Lake will be mature at that point: a 20% IPC uplift, a better node, more cores, plus Windows 11 and Thread Director perfected and software and game bugs ironed out. I will be very impressed if AMD is even close in terms of performance.

AMD had a good run, though.
Why would Zen 4 be a beta platform? Because of the hybrid approach? Luckily Intel has already done much of the beta testing.

Zen 3D cache gives 15% more, and Zen 4 gives another 20% plus a better node, plus perhaps it's not a downgraded server chip but a real desktop chip, which gives at least 10% more. Make your calculations from there (rough math below). Also, in case Raptor Lake is simply Alder Lake on steroids, AMD will again have a huge advantage in power consumption.
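(Taking those figures at face value and stacking them multiplicatively; the inputs are the claims above, not measured numbers:)

    # Compounding the claimed gains multiplicatively (they don't simply add up).
    claimed = {"3D V-Cache": 0.15, "Zen 4 IPC + node": 0.20, "desktop-focused design": 0.10}

    total = 1.0
    for gain in claimed.values():
        total *= 1 + gain

    print(f"Combined uplift over plain Zen 3: ~{(total - 1) * 100:.0f}%")  # ~52%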
Did you watch the reviews? The chip runs cool at stock: 64°C according to TechPowerUp. The 5800X hit 75°C in their test using THE SAME COOLER.
You cannot directly compare temperatures between CPUs of different architectures.
It's funny that some people think a CPU that can use 250-300 watts in synthetics and burn-in tests will run anywhere near that in regular workloads and gaming.

An i9-12900K at 5.2 GHz on all cores draws about 100 watts in gaming. Look at reality instead of cherry-picking your numbers. Peak watt usage in AVX2 workloads does not matter for most people and has nothing to do with real-world temps or heat.
Of course it can, if the "regular workload" is the same as a burn-in.

Do you size cooling, VRM, PSU etc. based on the "worst case" or the "regular" scenario? Exactly.
 

Hahaha, AMD is not using the TSMC 5nm node AT ALL; stop talking BS.
So they are stocking up on 5nm chips for their 2023 Zen 4 release? 😂😂

Intel still dominates the enterprise server market, and the same goes for the enterprise laptop market. Complete Intel domination there. Lenovo's ThinkPad series ships something like 95% Intel-based laptops. Intel was able to deliver during the lockdowns; AMD was not. That is why Intel's financials did not suffer at all. Intel shipped millions and millions of chips during COVID, a perk of owning their own fabs. AMD relies 100% on TSMC in comparison.

3D cache does not give a flat 15%, hahahaha 😂 it's "up to 15%" by AMD's own numbers, which are most likely cherry-picked. It will be a minor refresh like the 3000 XT models were. You will see in 3-4 months.

In workloads that are not cache-limited you will see next to no gains. And Alder Lake will mature in the coming months and deliver even better performance.

Raptor Lake will be a huge step up from Alder Lake.
 
I think staying with the i7 would be best, as the i9s have some shortcomings. The new processors are not able to launch some games, such as Assassin's Creed Valhalla.
 
I don't care how fast your CPU is if it needs 150-250 W to work. My 5600X consumes 30 W gaming (-25 curve optimizer), so no Intel. Plz don't suck with Arc?
 

Hahah, no it does not.

AMD and Intel CPUs are pretty much identical in terms of watt usage in gaming: https://www.techpowerup.com/review/amd-ryzen-5-5600x/19.html

The 5600X even draws more than the 9900K and the 5800X in gaming. Why? Because the 5600X uses the worst silicon.

Funny how many AMD fanboys and owners are spreading BS about watt usage in real-world workloads.

Watt usage has become the most important thing about a CPU now that Alder Lake is out 😂 Grasping at straws.

I remember back in the early Ryzen days, multithreaded Cinebench was the most important number according to AMD fanboys 😂 Single-thread perf and real-world perf did not matter 😂

Intel is back with Alder Lake, and I'm glad they are. Now we will see much better prices from AMD. They overcharged for all the 5000 series SKUs and left out the non-X variants because they knew this day would come; they wanted to reap the most benefit. AMD did not even want to support 3000 and 5000 series CPUs on 300 series chipsets to begin with, but the community raged and AMD gave in. However, tons of 300 series boards don't support 3000 and 5000 series chips anyway: bad VRMs, too little ROM space, etc.

Now AMD is forced back to lowering prices, which is what made Ryzen popular in the first place. Prices took a big step up with the Ryzen 5000 series, and suddenly 6 cores and 12 threads were more than enough for most AMD fanboys, because the 5600X was the only chip they could afford 😂 Before that they praised chips like the 1700, 2700 and 3700/3800 for having 8 cores and 16 threads... Sigh 🤣

AMD would have done EXACTLY the same thing as Intel when they had complete domination over the CPU market; we saw the beginning of that with the 5000 series. Now AMD is forced to cut prices instead 👌😉

Chips like the i5-12600K and i7-12700K are what will sell best. The i9 is a gimmick chip for most people, just like the Ryzen 5950X is.

The i5-12400 looks like a value king according to leaks, smashing the Ryzen 5600X with ease for 199 dollars, even with DDR4.
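(If anyone actually wants numbers instead of screenshots for the "X watts while gaming" argument, here is a minimal logging sketch. It assumes a Linux box where the powercap RAPL interface is exposed; on AMD this needs a reasonably recent kernel, and on Windows HWiNFO's package-power logging does the same job.)

    # Log average CPU package power over a play session by sampling the RAPL
    # energy counter (Linux powercap interface; reading it may require root).
    import time

    ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"
    WRAP = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

    def read_int(path):
        with open(path) as f:
            return int(f.read())

    wrap = read_int(WRAP)                  # the counter wraps around at this value
    e0, t0 = read_int(ENERGY), time.time()
    time.sleep(60)                         # sample window: go play for a minute
    e1, t1 = read_int(ENERGY), time.time()

    joules = ((e1 - e0) % wrap) / 1e6      # microjoules -> joules, one wrap handled
    print(f"Average package power: {joules / (t1 - t0):.1f} W")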
 
No, it doesn't consume 30 W while gaming. That's asinine, and whoever says something like that has no idea about PCs; he is just fanboying.
 
I can't testify on behalf of everyone else, but we built two mobile workstations at the end of 2020, one with a 5950X and the other with a 10900K. "Mobile" meaning smaller chassis, because we need to carry them anywhere we go to do our heavy image and data crunching.
I guess everyone can predict which one hit the thermal ceiling and froze up frequently. I had to dial down the turbo boost in Intel XTU to make it work.

Now this one consumes somewhere in the 10900K ballpark. Oh well.
 
I know a bit more than you.

You are playing a game that has your CPU basically idle (4%) with a hard 60 fps lock. Try an actual game, like AC or Cyberpunk or Battlefield or what have you. I'm pretty sure a 12900K overclocked to 6 GHz consumes 30 watts as well in Journey.

I mean, Minesweeper is technically a game; claiming my 11600K consumes 15 W while gaming would be stupid though, wouldn't it?
 
Keep moving your goalposts; these are actual games (better games, btw). I played Valhalla at 35 W, so no, you lose.
 
Again... provide some THIRD-PARTY evidence of this, not just your own PC. You DO realize that screenshots from your own PC don't really constitute proof of anything, right?
 