AMD vs. Intel: The Evolution of CPU Gaming Performance

I remember when I moved from my AMD X2 5600+ to my Phenom II x4 940. The improvement was certainly felt, and it seemed that maybe AMD could stay within striking range of what Intel had to offer, but we all know that wasn't the case. Once Haswell hit the shelves, it blew anything AMD had to offer out of the water and completely left them in the dust.

The PII x4 940 had better IPC than AMD's Bulldozer, despite releasing two years prior.

I kept my PII x4 940 and even pushed her to 3.71GHz. I used the CPU for a good 4-5 years, hoping AMD would find a way to make their architecture work better.

I watched Bulldozer come out and felt it was a step backwards, so I waited to see what Piledriver would offer. Piledriver came out and finally brought some IPC gains, surpassing what the Phenom II line could offer, but it still paled in comparison to Intel.

So I waited, wanting to see what Steamroller could do. I waited and waited and waited....
Steamroller dropped; if I remember correctly, a couple of APUs were released using Steamroller, but the rest of the lineup never saw the light of day.

In the end, AMD pulled the plug on the Bulldozer architecture, and I needed something better to drive my GPUs. I moved to Intel's Haswell lineup and picked up an i5-4670K. Running the 4670K at stock blew the PII x4 940 @ 3.71GHz out of the water. It was like I got a whole new system: the 4670K gave my GTX 570s new life, and my fps nearly doubled across most games. I ran her for six years or so, and now I've gone back to AMD with a 5900X.

I always liked AMD. I've had AMD CPUs ever since I purchased my own computer, and the AMD CPU in it smoked the Intel system my college roommate had back in 1999. He talked his new computer up and made fun of my system for running an AMD processor, right up until my system ran games better than his and gave me zero issues while he was constantly struggling with slowdowns. I was hooked on AMD after that; it also smashed the Pentium 4 my step-dad was so very proud of in his computer. So I stuck with AMD and was sad to see them fall so far behind Intel for so many years. I'm glad they're back at it and have made things very competitive now.
 
What speed was the memory running at on the 2600K? The test setup section mentions the DDR3 platforms using DDR3-2400, but the Sandy Bridge 2600K's IMC did not support a memory divider higher than DDR3-2133, overclocked or otherwise.
 
Excellent article (as always).

Last Intel CPU I had was a Pentium 150. From there, I moved to AMD. I went through Duron, Athlon, Phenom and now Ryzen.

I rebuilt my PC in July/August of 2020 when my 10+ year old Phenom II X3 720BE's motherboard finally died. I got 10+ years out of that chip; I was even able to unlock the fourth, disabled core, and it worked without a single issue. I OC'd it, too. I was able to "play" Witcher 3 on it, with "play" meaning just over 40 FPS, dropping into the mid 20s at times, but I could still play many games and do other non-gaming things. Of course, the GTX 560 was also a bottleneck. :)

Of course, once I upgraded to my current Ryzen 5 3600 and a GTX 1660 (I didn't want to spend the money on a 2000-series card, and the 3000 series wasn't out), I could not believe how in the world I ever played Witcher 3 on that old machine.

For me, gaming was a factor, but I also do programming and other non-gaming things with my PC, so the best bang for my buck was AMD. When I was ready to upgrade last year, I looked at both Intel and AMD and considered going back to Intel. But research showed that for what I wanted and needed, the AMD Ryzen 5s were the way to go, especially on price-to-performance. I was able to save a lot more money going with the Ryzen and still get the same performance in most games, if not more, compared to some of the more expensive Intel chips. Even the Intel chips at the same level as the Ryzen 5 3600 were more expensive, and while they might have performed slightly better, the difference was negligible at best. The extra money to squeeze out maybe 3% more performance was not worth it. But that's for me. Other people may have different reasoning, and there's nothing wrong with that.

Overall, having a choice is the best thing we as PC builders have. It will be interesting to see what Alder Lake provides; all that will do is push AMD to come back and try to best it. Competition breeds innovation, and for us as consumers, that's a win-win!
 
Clock-for-clock comparisons are stupid, because some architectures can't clock high. Some Intel CPUs can run at 5GHz out of the box, while AMD CPUs can't reach those clock speeds even when overclocked (especially early Ryzen, which struggled to go above 4GHz).

 
I remember when I moved from my AMD X2 5600+ to my Phenom II x4 940. The improvement was certainly felt, and it seemed that maybe AMD could stay within striking range of what Intel had to offer, but we all know that wasn't the case. Once Haswell hit the shelves, it blew anything AMD had to offer out of the water and completely left them in the dust.

The PII x4 940 had better IPC than AMD's Bulldozer, despite releasing two years prior.

I kept my PII x4 940 and even pushed her to 3.71GHz. I used the CPU for a good 4-5 years, hoping AMD would find a way to make their architecture work better.

I watched Bulldozer come out and felt it was a step backwards, so I waited to see what Piledriver would offer. Piledriver came out and finally brought some IPC gains, surpassing what the Phenom II line could offer, but it still paled in comparison to Intel.

So I waited, wanting to see what Steamroller could do. I waited and waited and waited....
Steamroller dropped; if I remember correctly, a couple of APUs were released using Steamroller, but the rest of the lineup never saw the light of day.

In the end, AMD pulled the plug on the Bulldozer architecture, and I needed something better to drive my GPUs. I moved to Intel's Haswell lineup and picked up an i5-4670K. Running the 4670K at stock blew the PII x4 940 @ 3.71GHz out of the water. It was like I got a whole new system: the 4670K gave my GTX 570s new life, and my fps nearly doubled across most games. I ran her for six years or so, and now I've gone back to AMD with a 5900X.

I always liked AMD. I've had AMD CPUs ever since I purchased my own computer, and the AMD CPU in it smoked the Intel system my college roommate had back in 1999. He talked his new computer up and made fun of my system for running an AMD processor, right up until my system ran games better than his and gave me zero issues while he was constantly struggling with slowdowns. I was hooked on AMD after that; it also smashed the Pentium 4 my step-dad was so very proud of in his computer. So I stuck with AMD and was sad to see them fall so far behind Intel for so many years. I'm glad they're back at it and have made things very competitive now.

I've been running AMD only since '99 myself, and skipped the entire FX line. I had an Athlon II x3 instead (I was far more budget-restricted at the time) for what felt like forever, until jumping to a Ryzen 5 2600 build. I'm currently on an R7 3700X, which I plan on swapping out for an R9 5900X or 5950X soonish (though I may hold out for the 3D cache refresh next year now).

It's been a hell of a ride with them, and good to see them back in the game doing so well.

 
The funniest part, I can confirm because I'm running one, is that 10 years later it's not worth upgrading from an i7-2600K (mine running at 4.0GHz, though it can go up to 4.5GHz if need be) to a 10-year-newer CPU just for gaming. It's good enough to pull 60+ FPS (and many times around 80-100 FPS) in the games I play at 1080p (many at 60 fps even on my old setup of 3840x1024, i.e. three 1280x1024 monitors).
They're a mix of the Tomb Raider, Witcher, and Assassin's Creed series up to Odyssey, World of Warcraft and other Steam MMO and single-player games, Destiny 2, and Star Wars games.
Mine is paired at the moment with a GTX 1060 6GB (the best partner for it, as both get pushed to their limits at around the same load). The only game I've seen this combo struggle with is MS Flight Simulator.
It's also good enough for daily use: browsing, running VirtualBox and BlueStacks machines, and all other kinds of stuff, except maybe rendering and hyper-specific video tooling.
So no need to upgrade, people. Stick to your 10-year-old CPU, it's good enough, and go spend your money elsewhere. Travelling...
 
""Please note no cores were disabled on the FX-8350, so if you believe it’s an 8-core CPU then I guess 8 cores were active. Whatever the case, it’s either a slow quad-core or a beyond terrible 8-core processor, we’ll leave that for you to decide.""

Hahahahaha... since more than 80% of the time you cannot run all 8 "pipelines" simultaneously, it's not a true 8-core CPU. IIRC, the FPUs could never all run at the same time regardless of compilation (Bulldozer paired two integer cores per module around a shared FPU), hence the gaming pit of death.
 
The funniest part, I can confirm because I'm running one, is that 10 years later it's not worth upgrading from an i7-2600K (mine running at 4.0GHz, though it can go up to 4.5GHz if need be) to a 10-year-newer CPU just for gaming. It's good enough to pull 60+ FPS (and many times around 80-100 FPS) in the games I play at 1080p (many at 60 fps even on my old setup of 3840x1024, i.e. three 1280x1024 monitors).
They're a mix of the Tomb Raider, Witcher, and Assassin's Creed series up to Odyssey, World of Warcraft and other Steam MMO and single-player games, Destiny 2, and Star Wars games.
Mine is paired at the moment with a GTX 1060 6GB (the best partner for it, as both get pushed to their limits at around the same load). The only game I've seen this combo struggle with is MS Flight Simulator.
It's also good enough for daily use: browsing, running VirtualBox and BlueStacks machines, and all other kinds of stuff, except maybe rendering and hyper-specific video tooling.
So no need to upgrade, people. Stick to your 10-year-old CPU, it's good enough, and go spend your money elsewhere. Travelling...
I disagree.

At 65/95 for an i7-2600K vs 112/156 for an i7-9700K (the same as the tested i9-9900K for gaming), that's a HUGE upgrade. I guess "good enough" is subjective, but...

Sure, that chip isn't that bad by today's standards, but let's acknowledge the numbers here.

Either way, one couldn't simply "upgrade" to a 10-year newer CPU anyway without building an entirely new system.
 
The funniest part, I can confirm because I'm running one, is that 10 years later it's not worth upgrading from an i7-2600K (mine running at 4.0GHz, though it can go up to 4.5GHz if need be) to a 10-year-newer CPU just for gaming. It's good enough to pull 60+ FPS (and many times around 80-100 FPS) in the games I play at 1080p (many at 60 fps even on my old setup of 3840x1024, i.e. three 1280x1024 monitors).
They're a mix of the Tomb Raider, Witcher, and Assassin's Creed series up to Odyssey, World of Warcraft and other Steam MMO and single-player games, Destiny 2, and Star Wars games.
Mine is paired at the moment with a GTX 1060 6GB (the best partner for it, as both get pushed to their limits at around the same load). The only game I've seen this combo struggle with is MS Flight Simulator.
It's also good enough for daily use: browsing, running VirtualBox and BlueStacks machines, and all other kinds of stuff, except maybe rendering and hyper-specific video tooling.
So no need to upgrade, people. Stick to your 10-year-old CPU, it's good enough, and go spend your money elsewhere. Travelling...

Depending on your GPU, your CPU could be holding it back.

Going from a Phenom II x4 940 (pretty much equivalent in IPC to Piledriver CPUs) at 3.71GHz to a stock i5-4670K netted me double the FPS from my GTX 570s running in SLI (those two cards in SLI perform a bit behind a single GTX 970). A GTX 1060 6GB is about on par with a GTX 980, which in turn is only about 15% faster than a GTX 970.

Going from a 2600K to an 8700K or a 3800X would give a similar increase in performance if you have a decent GPU that's being held back by the CPU.

Even if your GPU isn't a super gaming machine and is just ideal for 1080p, a faster CPU will still improve the lows and bring up your minimum fps in games.

But, as you say, if your setup is working well enough for your needs then there certainly is no reason for you to upgrade.
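The CPU/GPU pairing point above can be sketched as a crude bottleneck model: the frame rate you see is roughly capped by whichever of the two is slower, i.e. fps ≈ min(CPU-limited fps, GPU-limited fps). All the fps figures below are invented for illustration, not benchmark data:

```python
# Crude bottleneck model: observed frame rate is roughly capped by
# whichever of the CPU or GPU is the slower half of the pair.
# The fps numbers below are made up purely for illustration.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# An old CPU holding back a capable GPU (CPU-bound):
print(effective_fps(cpu_fps=55, gpu_fps=110))   # prints 55

# Upgrading only the CPU lets the same GPU stretch its legs (GPU-bound):
print(effective_fps(cpu_fps=140, gpu_fps=110))  # prints 110
```

This ignores frame-time variance and engine overhead, but it captures why a CPU upgrade alone can nearly double fps when the GPU was never the limit.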

I was happy moving from my 4670K to a 5900X, not just for gaming but for transcoding. It would take 2-2.5 hours to transcode a Blu-ray movie to .mp4 for my Plex server; I'd have to run my computer overnight and throughout the day while I was at work so it could do 6-8 movies. With my 5900X, similar-sized Blu-ray movies take around 35 minutes. What used to take around 16-20 hours on my 4670K, I can now do in about 4-5 hours.
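For context, those transcode times work out to roughly a 4x speedup. A quick back-of-the-envelope check, using the times quoted above (taking the midpoint of the 2-2.5 hour range):

```python
# Back-of-the-envelope speedup from the transcode times quoted above.
old_minutes = 2.25 * 60   # midpoint of 2-2.5 hours per movie on the 4670K
new_minutes = 35          # per movie on the 5900X
speedup = old_minutes / new_minutes
print(f"{speedup:.1f}x faster")  # prints "3.9x faster"
```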
 
I was ready for a new build in 2017 and went with the 1800X instantly for three reasons:

1) It gave me 8 cores and 16 threads, up from the measly non-SMT quads I'd been stuck on.

2) Its existence opened my eyes to the mulcting we'd been getting from Intel, and I loathed Intel for that.

3) Most importantly, I also realized what a miracle it was for AMD to have come back, what an awful future we'd have had if they hadn't, and what an awful one we'd still have if they didn't survive.

So I will be buying only AMD CPUs for the foreseeable future. Any premium on them is fine with me. Intel won't get a nickel, because they've made it terribly clear just where we'll be if they can somehow take over again.

Not everything shows up in the FPS charts.

And don't bother telling me that AMD could be just as bad, yadda yadda. Of course they could, theoretically. But we're light-years from that in reality. I just hope they've made up enough ground to withstand a resurgent Intel. It was an awfully close-run thing in 2012-17.
 
Interesting article. It would have been nice to see a few games benchmarked at 4K to remind readers that nearly all games will be GPU-bound, except perhaps for the two bottom CPU performers. Even the venerable 4770K performs within 1-2 FPS of the 5800X at 4K ultra in most games.
 
Clock-for-clock comparisons are stupid, because some architectures can't clock high. Some Intel CPUs can run at 5GHz out of the box, while AMD CPUs can't reach those clock speeds even when overclocked (especially early Ryzen, which struggled to go above 4GHz).


Hahaha, IPC is a separate factor from clock speed; performance depends on both, not on clock speed alone. These Intel fans, lol.
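That exchange boils down to a simple relation: single-thread performance is roughly IPC multiplied by clock speed, so neither number decides it alone. A minimal sketch; the IPC and clock figures below are made-up placeholders, not measurements of any real chip:

```python
# Rough model: single-thread performance ~ IPC x clock (GHz).
# The IPC and clock values below are illustrative placeholders only.
def perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

# A lower-IPC chip at a high clock...
high_clock = perf(ipc=1.0, clock_ghz=5.0)   # ~5.0
# ...can still lose to a higher-IPC chip at a lower clock.
high_ipc = perf(ipc=1.4, clock_ghz=4.0)     # ~5.6

print(high_ipc > high_clock)  # prints True: clock speed alone doesn't decide it
```

This is also why clock-for-clock (fixed-frequency) testing is useful: it isolates the IPC term, even though shipping products compete on the product of both.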
 