How We Test: CPU Gaming Benchmarks

The 7700K was supposed to be future proof because games use at most 4 cores. Now that Intel has 6-core chips like the i5-8400 and i7-8700K, suddenly games seem to demand more than 4 cores; at least nobody recommends 4-core CPUs for gaming anymore. That's Intel fanboy logic: if Intel does not have more than 4 cores to offer, then 4 cores is best for gaming. Now that Intel has a 6-core CPU for the mainstream market, a 6-core CPU is suddenly best for gaming. It's very easy to predict that, according to Intel fanboys, games will run best on 8 cores once Intel finally offers an octa-core to the mainstream market.

Ryzen 2 was just a small manufacturing tech upgrade; even its architectural tweaks were present in Ryzen 1 but not enabled. Zen 2 will be the real upgrade.



That depends on games and settings.

7700K vs 2700 is too close a race to be calling AMD a winner in anything but multi-threaded applications. The majority of gamers are running 1080p displays, meaning you would have to spend more on a monitor and video card to make Ryzen worth buying for gaming at 1440p.

AMD went with high core counts because they suck at raising core clocks. They needed a new process just for cache improvements and a minuscule bump in core clocks. No one with Ryzen 1 should upgrade to Ryzen 2. Intel could raise core counts easily, and they have already started with CFL, the 8086K and the rumoured 8-core CFL-S chips.

Ryzen isn't necessarily a mainstream product, and if it does become even more competitive, you can bet AMD will price them accordingly. You do remember the 9590, Nano and Pro Duo, right? That's just a taste of what will happen if Ryzen takes the lead. It's common sense really.
 

I was talking about this future proof thing. Quad cores were supposed to be future proof, but looking at https://www.tomshardware.com/reviews/cpu-hierarchy,4312.html it's easy to see that hexa-core chips are already faster than the 7700K. Even the 7700K's higher clock speed does not help. So quad cores lost their "future proof" status in less than a year. It will be very interesting to see the situation next year.

I agree that very few Ryzen owners will "upgrade" to Ryzen 2; the upcoming Zen 2 is another story. Intel cannot raise core counts so easily, because a ring bus becomes slower and slower as the core count goes up, while a mesh is more suitable for much larger core counts.
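
As a back-of-envelope illustration of why the ring stops scaling (my own toy model, not anything from Intel or AMD documentation): with uniformly random core-to-core traffic, the average hop count grows roughly linearly with core count on a bidirectional ring, but only with its square root on a 2D mesh.

```latex
% Idealized average shortest-path hop counts for N cores,
% assuming uniformly random source/destination pairs:
\bar{h}_{\text{ring}} \approx \frac{N}{4}
\qquad
\bar{h}_{\text{mesh}} \approx \frac{2\sqrt{N}}{3}
% e.g. N = 16: the ring averages ~4 hops, a 4x4 mesh ~2.7 hops.
```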

I disagree about the reason for the high core count. Currently AMD manufactures only three different Ryzen/Threadripper/Epyc dies: the 8-core Zeppelin ("Ryzen 1"), the 8-core Pinnacle Ridge ("Ryzen 2") and the 4-core Raven Ridge APU. Making just one (or, at this point, a few) CPU die saves a lot of resources and money.
 
As a gaming chip, an OC 7700K beats anything AMD has, and the only people who toss around the term "future proof" about their PC parts are the same people who fail to understand what 720p gaming results represent.
 
The 7700K does not beat several Ryzen 2000-series chips in gaming. It has been proven many times that 720p results are useless; only Intel fanboys consider those low-resolution benchmarks to have some value. When AMD is faster in 720p benchmarks, they will suddenly be useless according to Intel fanboys too. Just wait and you'll see :p
 
Yet every single test shows it does; once again the truth hurts you so, so much lol.
 
Actually there is value in 720p benchmarks and I have no problem with them. What I have a problem with is using 720p benchmarks as an indication of future gaming performance, when that is clearly a very flawed method.
 
Aaaawww! @HardReset go on son! Make AMD proud! I hope you never leave Techspot. So, just like the last however many years you've been spouting your nonsense all over the Nvidia reviews, you can eat your own words on CPUs too :) Or, you never know, you might even get your chance to shine, since AMD are only really competitive on CPUs these days :D
 
Point is, you just got shut out and beaten by... facts! The 2600 can't touch an OC 7700K! Lol, you are zero for life!

https://tpucdn.com/reviews/AMD/Ryzen_5_2600/images/perfrel_1280_720.png

So that means the i7-7700K is a better gaming CPU than the i5-8400 or i5-8500. Then why the heck do all Intel fanboys recommend the i5-8400 over the i7-7700K for gaming *nerd*

Reason: Intel fanboys consider i5-8400 more future proof :D

I keep my opinions; Intel and Nvidia fanboys constantly change their opinions to whatever is most suitable at the time.

So mark my words: when Nvidia releases cards that are much better than current ones in DirectX 12 games, DX12 will immediately become very important. Right now, while Nvidia sucks at DX12, it's completely useless.
 
Nope, absolutely not.

Future proof has nothing to do with current performance; that's why there is a "future" in that phrase. It's completely apparent to anyone with a working brain that modern games require more and more cores to run properly. A recent example is AC: Origins. While the R5 1600 and the i5-7600K were neck and neck in most games, the SMT of the R5 1600 allowed it to easily surpass the 7600K in this game, due to the latter's lack of threads.

When you see CPUs hitting close to 100% usage in games, that's not a good sign for future proofness; they are already at their limits. When it comes to those AAA titles, the 12 threads of the R5 lineup are going to surpass the 6 of the 8400 sooner or later. There is no doubt about that; I mean, they already do.

Now, on that note, both are freaking awesome CPUs for the money; it's just that the R5s give more bang for the buck, and anyone arguing about it is just wrong.

Threads != More Performance. You can quite easily cause the reverse case, such as when a handful of threads overload a single core (which happened with AMD's Bulldozer architecture).

What matters is that no individual core reaches full load. You can achieve that either with fewer, faster cores or with more, weaker ones. As a general rule, the first architecture will be faster, since the second can always be bogged down if any particular core's workload approaches 100%.

And before you criticize: I'm a Software Engineer. I've seen plenty of implementations where unnecessary threading slows applications to a crawl.
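
To make the oversubscription point concrete, here is a minimal toy sketch (my own illustration; the function names and workload size are made up, and the exact timings depend entirely on the machine). It sums a large array with a varying number of threads: timings improve up to roughly the hardware thread count, and far beyond that the thread creation, join and scheduling overhead eats the gains.

```cpp
// Toy demo: CPU-bound array sum with a configurable thread count (C++14).
#include <algorithm>
#include <chrono>
#include <cstdint>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Sum of data[begin, end), done by one worker thread.
std::uint64_t sum_range(const std::vector<std::uint32_t>& data,
                        std::size_t begin, std::size_t end) {
    return std::accumulate(data.begin() + begin, data.begin() + end,
                           std::uint64_t{0});
}

// Time the whole sum when split across n_threads workers, in milliseconds.
double time_with_threads(const std::vector<std::uint32_t>& data,
                         unsigned n_threads) {
    std::vector<std::thread> workers;
    std::vector<std::uint64_t> partial(n_threads, 0);  // one slot per worker, no sharing
    const std::size_t chunk = data.size() / n_threads;

    const auto start = std::chrono::steady_clock::now();
    for (unsigned t = 0; t < n_threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == n_threads) ? data.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            partial[t] = sum_range(data, begin, end);
        });
    }
    for (auto& w : workers) w.join();
    const auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(stop - start).count();
}

int main() {
    const std::vector<std::uint32_t> data(16'000'000, 1);  // ~64 MB of CPU-bound work
    const unsigned hw = std::max(1u, std::thread::hardware_concurrency());
    std::cout << "hardware threads: " << hw << "\n";
    // Expect scaling up to ~hw threads; far beyond that, per-thread chunks
    // shrink while creation/join/scheduling overhead stays, so timings
    // flatten out or regress.
    for (const unsigned n : {1u, 2u, 4u, hw, hw * 8, hw * 64}) {
        std::cout << n << " threads: " << time_with_threads(data, n) << " ms\n";
    }
}
```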
 
Threads != More Performance. You can quite easily cause the reverse case, such as when a handful of threads overload a single core (which happened with AMD's Bulldozer architecture).

That was a weakness of that particular CPU architecture, not a general rule. In general, more threads = more performance.

What matters is that no individual core reaches full load. You can achieve that either with fewer, faster cores or with more, weaker ones. As a general rule, the first architecture will be faster, since the second can always be bogged down if any particular core's workload approaches 100%.

The first architecture will be faster assuming the total horsepower of the CPU is enough. If it's not enough, it doesn't matter how fast your single-thread speed is, since cores will have to run multiple threads simultaneously. A prime example is the 7600K: even overclocked to 5 GHz it can be plenty slower than an R5 1600 in games like AC:O that actually require lots of horsepower.
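
A crude way to put numbers on that (my own back-of-envelope, assuming near-linear scaling across threads and a rough ~25% SMT uplift, both of which are generous for real games):

```latex
% Idealized throughput in "core-GHz"; ignores IPC differences between chips.
T_{\text{7600K}} \approx 4 \times 5.0 = 20
\qquad
T_{\text{R5 1600}} \approx 6 \times 3.6 \times 1.25 = 27
% A game that keeps ~12 threads busy favors the higher total despite the
% lower clocks; a game that loads only 4 threads favors the 5 GHz chip.
```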
 
Low resolution testing measures CURRENT single-thread performance. Is it an indication of future performance? Maybe, but most likely not. Why? Because to get the performance we want, we now have to program sideways, across cores, rather than rely on single-core IPC alone. We are close to a ceiling on single-core performance, so slowly but surely it will be phased out as the deciding factor for anything other than indie games. The only games where higher performance at a lower resolution signals future-proofing are those where single-core performance matters most, and the majority of those are old games, not new ones. For everything else, the low-resolution method does not work as a way to determine future performance. It is an old method from the days when CPUs had only one or two cores. It still works, but only for that purpose; now there are too many variables.

If you really want to isolate the CPU, you need to eliminate as many variables as possible. And there is one glaringly obvious variable that is often not addressed: GPU drivers. nVidia drivers and AMD drivers work quite differently. To rule the drivers out as the limit, you have to test two similar GPUs from both vendors; in this case that would be a Vega 64 and a GTX 1080.

Another variable is the API. DX11, DX12 and Vulkan work quite differently. Ideally the same game should be run with the different APIs to test this. Sadly, there aren't many games out there that support at least two APIs.

And then there are all the different versions of Windows 10 and the Spectre/Meltdown patches, which again make a difference.

Lastly, as time goes on, more and more people play at higher resolutions, where the CPU matters less. In fact, in certain cases, rather than upgrading your CPU, you're better off buying a higher-resolution screen instead.
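
For what it's worth, here is a tiny sketch (purely illustrative; this is not Techspot's actual harness) of the test matrix those variables imply. The run count multiplies fast, which is probably why few reviews cover all of it:

```cpp
// Toy enumeration of a CPU-test matrix: both GPU vendors x APIs x resolutions.
#include <iostream>
#include <string>
#include <vector>

int main() {
    const std::vector<std::string> gpus        = {"GTX 1080", "Vega 64"};
    const std::vector<std::string> apis        = {"DX11", "DX12", "Vulkan"};
    const std::vector<std::string> resolutions = {"720p", "1080p", "1440p"};

    int runs = 0;
    for (const auto& gpu : gpus)
        for (const auto& api : apis)
            for (const auto& res : resolutions) {
                std::cout << gpu << " | " << api << " | " << res << "\n";  // one pass
                ++runs;
            }
    std::cout << runs << " runs per game, per CPU, per Windows patch level\n";
}
```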
 
As usual the hardcore AMD defence league are claiming that AMD is more “future-proof”.

“It might be clearly second best now but in 3 years it will be faster”.

Yeah, games aren't going to suddenly become core heavy, not when the vast majority of people out there have 4 cores or less. Also, improved APIs will see less utilisation of the CPU in the future: DX12 compared to DX11 usually sees a massive reduction in CPU overhead, often allowing i5 performance on an i3.

If you like AMD, be happy; they make the better CPUs at the moment, with gaming being one of the few user-scenario exceptions that's better on Intel.
 
No defense needed; according to the benchmarks the 2600X > 8400 even in gaming. And of course it is more future proof; that's kind of a no-brainer.
 
Way too much bitching about future-proofing when talking about cores vs SMT. The differences are way too small (in the 10-20% range) to count as future-proofing.

What I consider future-proofing is the options I'll have a few years down the line, when I upgrade, that will let me get a significant performance boost (50 to 100%).

For example, if I buy a 2600X now, I'll most likely be able to achieve that performance boost (or close to it) with a Zen 3 CPU, by getting something with more cores and higher IPC/clocks. If I buy an 8600K, then my only option 2-3 years from now would be the 9900K, or buying an entirely new PC.
 
The goal is to work out which CPU will offer you the most bang for your buck at a given price point, now and hopefully in the future.
Not long ago we compared the evenly matched Core i5-8400 and Ryzen 5 2600. Overall the R5 2600 was faster once fine-tuned, but ended up costing more per frame making the 8400 the cheaper and more practical option for most gamers.

So a 6C/6T CPU is more future proof than a 6C/12T CPU (y)

Does not compute.

Edit: IIRC you previously stopped recommending quad-core i5s for gaming and said to get an i7 instead. The difference between an i5 and an i7? SMT.

So your logic is pretty flawed here. You talk about the future and recommend the i7 over the i5 because of SMT, but when AMD has SMT and Intel does not, suddenly SMT is ignored :confused:
Did you read the line after that?
 
As a gaming chip, an OC 7700K beats anything AMD has, and the only people who toss around the term "future proof" about their PC parts are the same people who fail to understand what 720p gaming results represent.
What is this, 2006, to be talking about 720p for god's sake? The shills that use 720p are Intel-biased sites.
 
Hello Steve.

Nice article.

I always enjoy reading your CPU tests, reviews and benchmarks.

However, could you include the i5-6600K / i5-7600K in more benchmarks? I feel like the vast majority still has 4C/4T Intel CPUs, but it's hard to know whether or not it makes sense to upgrade.
 