Just How Much Faster are Intel CPUs for Gaming?

It is, you just don't listen.
I've got you and anyone arguing dead to rights.
If you're losing by 10 FPS, you're losing by 10 FPS.
If you're losing by 20 FPS, you're losing by 20 FPS.
If you're losing by 30 FPS, you're losing by 30 FPS.

[link]

Battlefield 5: 8700K faster by 11 FPS/18 FPS
8700K: 114/167
3600: 103/149

Tomb Raider: 8700K faster by 10 FPS/19 FPS
8700K: 78/114
3600: 68/95

Far Cry: 8700K faster by 10 FPS/7 FPS
8700K: 84/110
3600: 74/103

World War Z: 8700K faster by 34 FPS/25 FPS
8700K: 169/201
3600: 135/176

Rage 2: 8700K faster by 4 FPS/8 FPS
8700K: 122/168
3600: 118/160

Hitman 2: 8700K faster by 12 FPS/13 FPS
8700K: 94/118
3600: 82/105
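For what it's worth, the absolute deltas above can be converted to percentages with a quick script. The figures are copied straight from the list; I'm assuming the first number in each pair is the 1% low and the second the average, which appears to be the review's format:

```python
# Percentage gaps implied by the FPS numbers posted above
# (assumed to be 1% low / average, 8700K vs Ryzen 5 3600).
# The figures are copied from the post; this only does the arithmetic.
results = {
    "Battlefield 5": ((114, 167), (103, 149)),
    "Tomb Raider":   ((78, 114), (68, 95)),
    "Far Cry":       ((84, 110), (74, 103)),
    "World War Z":   ((169, 201), (135, 176)),
    "Rage 2":        ((122, 168), (118, 160)),
    "Hitman 2":      ((94, 118), (82, 105)),
}

for game, ((i_low, i_avg), (a_low, a_avg)) in results.items():
    low_pct = 100 * (i_low - a_low) / a_low
    avg_pct = 100 * (i_avg - a_avg) / a_avg
    print(f"{game}: +{low_pct:.1f}% (1% low), +{avg_pct:.1f}% (avg)")
```

The gaps work out to roughly 3-25% depending on the game, which is a more useful framing than raw FPS deltas.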

Intel chips game better than AMD chips, sometimes by a little, sometimes by a lot, from 720p and 1080p to 1440p @ 60 Hz. These are not niche results.
I'll keep posting them over and over though. I love ramming data down whining fanboys' mouths and hearing them whimper with comments like 'well, it may not be noticeable.'
No one argued about what was 'noticeable' for the past two decades, going back to my first Voodoo Banshee. That's a new argument from a loser trying to play off a loss. It's pathetic.
If it was the other way around, we would never hear the end of it.
But it's not, so tough sh!t.

Sure, high-end gamers are far from the majority.
But there are still a lot of them.
 

lol, you can keep posting the same drivel over and over.
That has nothing to do with what I quoted you saying.
And as for my response to that quote: as someone else mentioned, your posts are quite meaningless!
Gaming at 120 Hz or 144 Hz is niche?
Lol, do you need a warm washcloth?
 
Drivel, bias, all the same.
Reality? Only one of them.
Gaming? Intel > AMD.
Period.
We're done here. Not trying to be mean, but it's time to accept it and move on.
 

We all know that Intel provides higher FPS than AMD in more games than not, if it's a high-end K chip, at low resolution, with an expensive high-end graphics card, all at the same time, which might be perceptible in certain games if you have a high-refresh monitor. No one is crying about it or denies it. The argument is that this tiny niche benefit is outweighed by AMD's advantages for the VAST majority of gamers. This is made even more obvious by the VAST majority of the most knowledgeable gamers in the DIY market buying AMD. We accept Intel as the 'king' of gaming benchmarks, and you should accept AMD as better for more gamers.
 
Wow, whether or not it's noticeable, 10-30 FPS is pretty substantial.
I knew Intel was better for gaming, but when you compare them like this, I didn't think it was THAT much of a difference.
Intel for gaming all the way, which is why gamers haven't had a reason to switch since Ryzen released. In fact, if building a gaming rig in 2020, Intel still makes a lot of sense. Glad to finally see a review that purely focuses on this aspect; seeing a 20-30 FPS difference in some cases is really eye-opening.
A 10-year-old Intel architecture with a paint job STILL putting a beatdown on AMD's best. Yikes, that will drive a stake through any AMD fanboy's heart.


A 27 FPS increase (2080 Ti, 1080p, settings no one would actually use, chosen to emphasize the CPU bottleneck to its max) from, say, a base of 30 is substantial. 30 to 57: yes, I agree, substantial.

But from 400 to 427 (the nine-game average with the same settings I just mentioned), what is that in percent? Is that substantial? You are conveniently and purposely ignoring that the percent difference needs to be the qualifying calculation, so you can present a slanted view that you think no one will notice. Or maybe you just don't understand basic math principles. Not sure, but consider this a lesson then.

400 to 427 (a nine-game average in the best possible scenario for Intel, with settings that no one plays at) is a difference of roughly 7% (27/400 ≈ 6.75%).

With a 2060 Super we can see it is 0%; with a 2070(S) it will be maybe 3-4%. Start lifting the settings above low, like how real people game in real life, and that tiny difference disappears even more quickly.

I don't think any AMD fanbois would even care about that 7% in an artificially induced situation, nor should they... the Intel fanbois are the ONLY ones who care about it, per my observations (for obvious reasons). If one believes that 427 vs. 400 is going to change their gaming experience, then they might want to go visit the doctor and get that sorted.
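The point about absolute versus relative gains can be sketched in a couple of lines. The 30→57 and 400→427 pairs are the figures from the post; the helper function is just illustrative:

```python
def percent_gain(base, new):
    """Relative gain of `new` over `base`, in percent."""
    return 100 * (new - base) / base

# Same absolute +27 FPS, very different relative meaning:
print(percent_gain(30, 57))    # 90.0  -- substantial
print(percent_gain(400, 427))  # 6.75  -- marginal
```

The identical +27 FPS delta is a 90% jump in one case and under 7% in the other, which is exactly why raw FPS deltas without a baseline are misleading.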
 
IMO, gaming is the only benchmark for a CPU where, after a certain point in performance, the difference becomes indistinguishable for the majority of human beings. In fact, I will go ahead and say that no person on earth can tell the difference between 300 FPS and 330 FPS.
That goes for no other CPU benchmark, be it compiling code, video editing, or any other productivity task. There is a huge difference between getting your job done in 10 minutes vs. 7 minutes.
This is the current scenario with AMD and Intel. Intel supposedly wins against AMD in gaming by a small average margin of 5-10%, but they get destroyed in every other kind of benchmark. And when you mix the price of these CPUs into the consideration, it just doesn't make sense to buy an Intel chip.
 
That's completely true, if you ignore results like these:
https://www.techspot.com/review/1871-amd-ryzen-3600/

Battlefield 5: 8700K faster by 11 FPS/18 FPS
8700K: 114/167
3600: 103/149

... Our graphics card of choice was the MSI Trio GeForce RTX 2080 Ti....

Besides that, the 8700K plus HSF costs twice as much as the 3600.

My point was: if you are using a mid-range GPU like a 2060 Super or a 5700 XT, there is no difference.
 
Comparing the 3600 vs. the 8700K for gaming would be an entirely wrong argument, because for the price difference it's possible to upgrade the GPU of the 3600 system. A $250+ difference is sufficient to make a good upgrade on the GPU side; I've tested it on PCPartPicker. Now, when you have (let's say) $1,500 to build, you have better potential with the 3600 of the two. Which one would you prefer: 8700K + 2060S, or 3600 + 2080/S? That's for us earthlings with limited budgets...

The 3700X vs. the 10600K, which is the subject of this article, makes for a more balanced comparison, in my opinion. Yes, it's 8 vs. 6 cores, but at a similar price.

amstech's point is that, when we take price out of the equation and just look at the raw numbers, Intel has the advantage in gaming. In those conditions he's right.
 
A 9600K is a good gaming CPU, you are right. My R5 1600 is also a good gaming CPU, and I can say that for sure. It's been smooth as silk in every game I play.
A Ryzen 5 1600 is a good gaming machine in which parallel dimension?
It is barely acceptable...
 

I don't think you know reality.

I have 3 gaming rigs that I maintain, no hardware favorites. But when people game, they do not do so in a vacuum.

I have noticed that on my Intel rigs, while streaming, I have to close all my other apps down while playing. I do not even have to worry about such things with my older Zen chip.

The minute you have anything running in the background beyond just the game... AMD wins.
 
In the dimension where it plays thousands of games smooth as butter, including all of my games. You are comparing everything to a 9900K on a 2080 Ti. Folks with old 4c/4t i5s are happily trudging along, enjoying all kinds of games on GTX 1060s and lower.
 
I think we've had enough personal comments. Please confine your remarks to the topic. Thank you.
 
Do these much more often, even in your own real reviews. The size of the demographic that wants to maximize frames while running at competitive settings is significant. The fact that pretty much every review site neglects them is ridiculous.

We need reviewers to cover both: the casual gamer who wants everything to look pretty, and the player who cares much more about achieving every competitive advantage possible.
 
There was a $150 difference between the 3600 and the 8700K. If you are willing to run an overclock, you can't rely on a $120 motherboard like the Ryzen can; decent x90 Intel boards cost at least $200, and I wouldn't touch anything cheaper than that for a costly system. Then you already have to buy an aftermarket cooler (liquid or not, your choice), and the difference grows even further. Let's say you get a 20% FPS advantage on average over a cheap 3600. You should be willing to pay at least $250+ over the Ryzen build, probably north of $300 with liquid cooling, if you prefer. So, you see, you get a perceivable FPS boost only if you ignore the price difference :)

In a world where both systems cost the same, I'd definitely get the 8700K, gladly. But it seems we're not living in a world where we can ignore such price differences.

Then you'd be better off giving the example of the Ryzen 3700X vs. the 10600K, because that would be an apples-to-apples comparison (even though you wouldn't want to put your 10600K on a cheap motherboard, which is still a strong point for Ryzen).
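The platform-cost arithmetic above can be laid out explicitly. The CPU premium and board prices are the poster's own figures (circa-2019 USD); the $40 air cooler is a placeholder assumption of mine, not from the post:

```python
# Rough platform-cost delta sketched from the numbers in the post above.
# Prices are the poster's figures (circa-2019 USD), except the $40
# air cooler, which is an assumed placeholder.
intel_extra = {
    "cpu_premium": 150,  # 8700K over 3600, per the post
    "board": 200,        # decent overclocking-capable Intel board
    "cooler": 40,        # aftermarket air cooler (assumed price)
}
ryzen_cost = {
    "board": 120,        # budget board the post says the Ryzen can run on
    "cooler": 0,         # stock cooler included
}

delta = sum(intel_extra.values()) - sum(ryzen_cost.values())
print(f"8700K platform premium: ~${delta}")
```

Under these assumptions the premium comes out to about $270, consistent with the "$250+" figure in the post, and that is the money the Ryzen build can redirect to a faster GPU.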
This is dumb; even without overclocking, the i5 pulls ahead on average. No need to add all that if the goal is to get better gaming performance than AMD at the same price.
 


You are wrong.

It is the players who have to read a review and determine whether it suits their situation.

If you are a 'casual' gamer reading a review about the performance of a card at 120 Hz, then obviously it will also work for casual gameplay. Each and every game is different, as are their settings and effects.

If you like a lush environment, then turn it up and watch your frames drop, which typically doesn't matter one bit in a casual game.
 