Making a Fast Quad-Core Gaming CPU

However, his numbers are quite suspect, as the 1% lows we can see in the video are the same as TS's here, but with a notably slower CPU and weaker GPU:

FYI, a much lower-end GPU like the GTX 1070 Ti will drastically reduce CPU load.

So why don't the old X99 parts like the 5960X, which has 20MB of L3 cache, go as fast as a 10600K then?

"And of course, you can’t compare cache capacity of different CPU architectures to determine which is better, just as you can’t do that with cores, because factors such as cache latency, bandwidth, and the way they're used will vary."

Who would think of doing this??? How about testing all these "efficient" new 180W / 200W / 300W GPUs at just 115W and seeing how they compare to the number 1 GPU on Steam over the last 5 years? Yeah no, let's NOT do that!
Sounds like a really boring test, which is probably why no one else out of the tens of thousands of requests we get each month has asked for it.
 
Who games at 150 FPS anyway? Only a small minority does

That comment really shows that you have no idea what the current market is like. I'm assuming you have no idea that the tens to hundreds of thousands of people who play esports titles (which easily run at 100+ FPS on low-end hardware) all try to run at the highest possible FPS for the sake of input lag reduction, which matters because it can make a difference even on a meager 60Hz display (whether that matters at all for the average casual player's performance is up for debate, of course). Have you not seen what AMD's and NVIDIA's marketing is like for their lower-end cards (e.g. GTX 1650/Ti, RX 5600/5500/6600, etc.)? It's virtually all about showcasing their ability to run these super-popular games at 100+ FPS.

You not caring about that section of the market actually puts YOU in the minority, as far as being the target audience for their prospective market share gains.
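
To put some rough numbers on the input lag point, here's a quick back-of-the-envelope sketch (my own simplified example; it only counts frame time and ignores engine, driver and display latency):

```cpp
// Back-of-the-envelope frame-time math: higher FPS shrinks the time between
// rendered frames, which trims input-to-photon latency even on a 60Hz panel.
// Simplified sketch only -- real latency also includes engine, driver and
// display pipeline delays that are not modelled here.
#include <cstdio>

int main() {
    const double fps_values[] = {60.0, 150.0, 300.0};
    for (double fps : fps_values) {
        const double frame_time_ms = 1000.0 / fps;  // time to produce one frame
        std::printf("%5.0f FPS -> %5.2f ms per frame\n", fps, frame_time_ms);
    }
    // 60 FPS -> ~16.7 ms, 150 FPS -> ~6.7 ms, 300 FPS -> ~3.3 ms: roughly
    // 10+ ms less time between the latest game state and the frame the
    // 60Hz display ends up showing.
    return 0;
}
```

Even though a 60Hz panel still refreshes every ~16.7 ms, at 150+ FPS the frame it scans out was built from much more recent input, which is the whole point of chasing high FPS in esports.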
 
The conclusion doesn't match the numbers. In which games do the results clearly show a bad experience for those with a 4-core/8-thread CPU?
You'll need to quote the part you disagree with, because we never claimed quad-cores show a bad experience. We said for high-end gaming they're out, and if you look at the Battlefield V and Shadow of the Tomb Raider results, for example, that should be obvious.
 
Everyone beefing about the testing hardware is missing the point entirely. It's about how CPU cache impacts performance. As someone who consults with several small businesses on hardware choices, it's interesting to note that the cache hit can be so dramatic, and I now have to pay attention to the cache size of the CPUs I recommend to clients.

Although many small businesses don't buy high-end gaming systems for anyone other than the top brass, they do tend to want the most performance from their minions, which means CPU cache now has to be considered when recommending hardware, as the impact can be dramatic. Too much software today is poorly coded, so more cache is going to help performance.
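
To give a feel for how dramatic that can be, here's a minimal, hypothetical sketch (my own example, nothing to do with the article's benchmarks): the exact same arithmetic done with a cache-friendly versus a cache-hostile memory access pattern.

```cpp
// Minimal cache-locality demo (my own hypothetical example, not from the
// article): the same sum computed row-by-row (sequential, cache- and
// prefetcher-friendly) and column-by-column (large strides that defeat the
// prefetcher). The arithmetic is identical; only the memory access pattern
// changes, yet the strided walk is typically several times slower.
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const int n = 4096;  // 4096 x 4096 ints = 64 MiB, far bigger than any L3
    std::vector<int> m(static_cast<size_t>(n) * n, 1);

    auto time_sum = [&](bool row_major) {
        const auto start = std::chrono::steady_clock::now();
        long long sum = 0;
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                sum += row_major ? m[static_cast<size_t>(i) * n + j]   // sequential walk
                                 : m[static_cast<size_t>(j) * n + i];  // 16 KiB stride
        const auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                            std::chrono::steady_clock::now() - start).count();
        std::printf("%s sum=%lld in %lld ms\n",
                    row_major ? "row-major:   " : "column-major:", sum,
                    static_cast<long long>(ms));
    };

    time_sum(true);   // cache-friendly
    time_sum(false);  // cache-hostile
    return 0;
}
```

Game engines hit the same wall with scattered entity and asset data, which is part of why cache size and latency show up so clearly in gaming benchmarks.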
 
Very interesting analysis, thanks.

So since you decided to briefly revisit the overhead issue, why not add a few tests on older / weaker CPUs when reviewing lower-end GPUs like the 3060, 6600 XT and 3060 Ti?
That surely would be as useful as testing the effect of the x8 lane limitation for owners of PCIe 3 systems (and I'm not being sarcastic here, that was interesting information).

I am sure there are many 8400, 2600, etc. owners who would love to know which GPU upgrade would be worth it for their particular system. Even I, with my 2700X, am not sure which GPU upgrade would give me the best results for the money spent once GPU prices reach an acceptable level.

You should probably go for a Radeon GPU if you're worried about driver overhead
 
My 4770K Haswell can play any game made in the past 5 years, and I definitely do not need a 6-core CPU to do so.

Who games at 150 FPS anyway? Only a small minority does. I don't even have a monitor capable of 100+ FPS, nor do I want to buy one.

Linus Sebastian, using the Steam Survey, pointed out that 43% of the Steam userbase is still using 4-core CPUs. It is obviously absurd to think that game developers would launch games playable only on 6-core CPUs and up, ignoring a huge chunk of the user base who are still on 4C/8T.

As Linus says in the video, the OP tests and machines used concern such a minuscule portion of the global playerbase as to be, in essence, 1% tests for 1% machines.

If you consider for a moment that the GTX 1060, 1050 Ti and GPUs of similar capability still top the Steam Survey lists, you will understand why 150+ FPS gaming concerns, for the most part, the 1 percenters.


Not trying to be an *** here, but the 4770K is pure trash by today's standards. A locked i3 for $80 would be faster than an overclocked i7-4770K, which means this i7 would be classed as below low-end as a gaming CPU... Try something modern on it like AC Valhalla or Cyberpunk and it will choke that CPU in no time...
 
That comment really shows that you have no idea what the current market is like. I'm assuming you have no idea that the tens to hundreds of thousands of people who play esports titles (which easily run at 100+ FPS on low-end hardware) all try to run at the highest possible FPS for the sake of input lag reduction, which matters because it can make a difference even on a meager 60Hz display (whether that matters at all for the average casual player's performance is up for debate, of course). Have you not seen what AMD's and NVIDIA's marketing is like for their lower-end cards (e.g. GTX 1650/Ti, RX 5600/5500/6600, etc.)? It's virtually all about showcasing their ability to run these super-popular games at 100+ FPS.

You not caring about that section of the market actually puts YOU in the minority, as far as being the target audience for their prospective market share gains.
The more I read this guy's comments, the more I wonder if he actually plays games at all, especially those made in the last 5 years.

Like you said, he clearly doesn't realize that there are gamers today who play esports and for whom 150+ FPS is their lifeblood, literally making the difference between life and death in the game.

But then he clearly has never played Shadow of the Tomb Raider either, since he posts a video where the 4-core CPU clearly dips to a 31 FPS minimum and hits 100% CPU load when more NPCs appear. This is exactly what this TechSpot article is talking about. But he lives in denial, claiming "My 4770K Haswell can play any game made in the past 5 years and I definitely do not need a 6 Core CPU to do so."

He also claims: "As for FFXV, if my 4770K cannot run a 2016 game maxed out w/o even trying too hard, I will delete my account leave the world and go take Monastic Vows". He clearly has never played FFXV since, if he did, he would realize that FFXV with all the settings maxed out still punishes even today's systems. I await the day he actually plays FFXV so he can "go take Monastic Vows" once he realizes just what maxed-out FFXV requires (a lot more than his 4-core i7-4770K).

He then insults anyone who has upgraded their system in the last 8 years, saying "hardware companies depend on ppl like you who are not knowledgeable about PC hardware to rip you off and sell you their planned obsolescence hardware which you do not need."

It is pretty clear this guy doesn't play games, but loves feeling he is "better" than others by still running a PC from 8 years ago, loves trolling, and most likely didn't even read this TechSpot article, which shows it's time to upgrade from 4 cores if you actually want to play games from the last 5 years with decent fluidity, without FPS dips caused by 4 cores sitting at 100% load and just no longer cutting it.
 
Not trying to be an *** here, but the 4770K is pure trash by today's standards. A locked i3 for $80 would be faster than an overclocked i7-4770K, which means this i7 would be classed as below low-end as a gaming CPU... Try something modern on it like AC Valhalla or Cyberpunk and it will choke that CPU in no time...
I have an overclocked 4790K, which isn't much faster than a 4770K. "Pure trash" is wildly inaccurate. So far the only game where I've found the CPU to be a limit is Far Cry 5, dropping to 50 fps here and there. I don't play Shadow of the Tomb Raider so I couldn't comment; the game reviews like crap, so it's not a big deal. AC Valhalla is fine, as is Red Dead Redemption; Cyberpunk is not fine, but that's my GPU. Aside from that, pretty much any game runs fine. Valve are putting a quad-core into next year's Steam Deck, so they clearly aren't out of date yet. Unless you have a modern graphics card, there isn't much point upgrading a Haswell Intel quad right now. I could go and buy a new i5 or something and get a boost, but I'm stuck with an RX 480, so there's no point upgrading until I can get a GPU.

Considering how old these Haswell parts are, it's actually very impressive how well they still hold up today. Mine is now going on 7 years; I paid £240 at the time for the top-end SKU on the socket. I miss the times when AMD hid for 10 years and Intel stagnated the market; it was actually a lot cheaper for everyone!
 
You should probably go for a Radeon GPU if you're worried about driver overhead
It'll probably be a 6700 XT due to that. A 3060 Ti would also be good, but I'm not sure if I'd lose performance due to it not having a hardware scheduler. I would like to know for sure.

Either way, there would be extra load on the CPU, but I could live with that if it does not affect performance and it's reflected in the price (i.e. lower).
 
It'll probably be a 6700 XT due to that. A 3060 Ti would also be good, but I'm not sure if I'd lose performance due to it not having a hardware scheduler. I would like to know for sure.

Either way, there would be extra load on the CPU, but I could live with that if it does not affect performance and it's reflected in the price (i.e. lower).
What resolution and refresh rate are you planning to play at with either of those cards?
 
I guess that would mean you don't play Shadow of the Tomb Raider (2018) or Final Fantasy XV (2016). I tried both on my i7-3770K and the games would freeze and lag like crazy. It wasn't until I switched to my 6-core i7-4930K workstation that I realized these two games were constantly using 80%+ of the 6-core CPU, according to Task Manager, while they were running. TechSpot's analysis confirms it was because of having only 4 cores.

It would seem you play very different games than the ones I have played in the last 5 years, and thus have a very different perception of "can play any game in the past 5 years".

I have played Final Fantasy XV and Shadow of the Tomb Raider on my old rig (4c/4t i5-4690 + GTX 1070) on high/very high, and they ran pretty smoothly at 1080p/60 fps. I have played games where that CPU was an obvious bottleneck, but these two weren't among them.
 
I have played Final Fantasy XV and Shadow of the Tomb Raider on my old rig (4c/4t i5-4690 + GTX 1070) on high/very high, and they ran pretty smoothly at 1080p/60 fps. I have played games where that CPU was an obvious bottleneck, but these two weren't among them.
You played these games "maxed out"? The guy specifically said "As for FFXV, if my 4770K cannot run a 2016 game maxed out w/o even trying too hard, I will delete my account leave the world and go take Monastic Vows"

I know FFXV and Shadow of the Tomb Raider can run even on PS4 and Xbox One, but those are with the settings turned way down from "maxed out".
 
FYI, a much lower-end GPU like the GTX 1070 Ti will drastically reduce CPU load.



"And of course, you can’t compare cache capacity of different CPU architectures to determine which is better, just as you can’t do that with cores, because factors such as cache latency, bandwidth, and the way they're used will vary."


Sounds like a really boring test, which is probably why no one else out of the tens of thousands of requests we get each month has asked for it.
True, Steve, BUT I'm not here to be "entertained", and Sausagemeat also liked the idea... so wrong, again. 2 people. But hey Steve, I don't want to fight, you seem like a good guy. Keep catering to folks like you and Faelan (that rock double 8K ultrawide / 240Hz monitors with overclocked, already OC'd from the factory, super silly GPUs like the 3090 Super Ti, or a 6900 watercooled with liquid gold in your case...)

AND NOT to the billions of regular, BORING folks like me, who are just FINE still using 60Hz 1080p monitors (DOWNGRADED from 4K, btw) plus shitty GPUs like the still-amazing 1060 (my 27" HP IPS monitor consumes just 4W while gaming; your gamerz monitor? 450W just to run a base console game designed to run internally at just 60 fps... and at low resolution). Genius!
 
I have an overclocked 4790K, which isn't much faster than a 4770K. "Pure trash" is wildly inaccurate. So far the only game where I've found the CPU to be a limit is Far Cry 5, dropping to 50 fps here and there. I don't play Shadow of the Tomb Raider so I couldn't comment; the game reviews like crap, so it's not a big deal. AC Valhalla is fine, as is Red Dead Redemption; Cyberpunk is not fine, but that's my GPU. Aside from that, pretty much any game runs fine. Valve are putting a quad-core into next year's Steam Deck, so they clearly aren't out of date yet. Unless you have a modern graphics card, there isn't much point upgrading a Haswell Intel quad right now. I could go and buy a new i5 or something and get a boost, but I'm stuck with an RX 480, so there's no point upgrading until I can get a GPU.

Considering how old these Haswell parts are, it's actually very impressive how well they still hold up today. Mine is now going on 7 years; I paid £240 at the time for the top-end SKU on the socket. I miss the times when AMD hid for 10 years and Intel stagnated the market; it was actually a lot cheaper for everyone!

Shadow of the Tomb Raider is actually a good game, I really liked it, but Rise of the Tomb Raider was better in my opinion. As for the performance of the 4790K or any other Haswell CPU in 2021, if all you aim for is 1080p/60 then maybe that's fine, but a lot of people are moving away from that, especially from 60Hz gaming.
 
Very interesting analysis, thanks.

So since you decided to briefly revisit the overhead issue, why not add a few tests on older / weaker CPUs when reviewing lower-end GPUs like the 3060, 6600 XT and 3060 Ti?
That surely would be as useful as testing the effect of the x8 lane limitation for owners of PCIe 3 systems (and I'm not being sarcastic here, that was interesting information).

I am sure there are many 8400, 2600, etc. owners who would love to know which GPU upgrade would be worth it for their particular system. Even I, with my 2700X, am not sure which GPU upgrade would give me the best results for the money spent once GPU prices reach an acceptable level.
but that would be SUPER BORING... the target audience only uses high-end CPUs and only cares about da mAX FPZ!
 
Shadow of the Tomb Raider is actually a good game, I really liked it, but Rise of the Tomb Raider was better in my opinion. As for the performance of the 4790K or any other Haswell CPU in 2021, if all you aim for is 1080p/60 then maybe that's fine, but a lot of people are moving away from that, especially from 60Hz gaming.
Well, I'm ready to move on from 1080p/60, but there's absolutely no point buying a new monitor and CPU if you can't get a graphics card. You can get a 3060 where I live, for £650, which is absurd. I can get a 6600 XT for around MSRP, but I'm never going Radeon again. Also, its ray tracing is much weaker and it doesn't have DLSS, and I want those. If I could get a 3080 or 3070 at MSRP I'd be building away!

As for Rise of the Tomb Raider, I thought it was average. Maybe I'm like the gaming equivalent of a boomer, but the modern Tomb Raiders don't have the character and soul of the old-school ones on the PlayStation.
 
Btw, the ORIGINAL Tomb Raider from 1996 had more charm and personality (and tits) on a Voodoo 1 than all this linear, boring cinematic shit Crystal Dynamics pushes on the "best" GPU; they are all soulless zzzz bore fests. But people are like, ooh, the reflection in the puddle, I'm gonna need an 850W GPU for that.
 
Well, I'm ready to move on from 1080p/60, but there's absolutely no point buying a new monitor and CPU if you can't get a graphics card. You can get a 3060 where I live, for £650, which is absurd. I can get a 6600 XT for around MSRP, but I'm never going Radeon again. Also, its ray tracing is much weaker and it doesn't have DLSS, and I want those. If I could get a 3080 or 3070 at MSRP I'd be building away!

As for Rise of the Tomb Raider, I thought it was average. Maybe I'm like the gaming equivalent of a boomer, but the modern Tomb Raiders don't have the character and soul of the old-school ones on the PlayStation.

What did Radeons do to you? 😅 And what are you trying to move to? 1440p high refresh or 1080p high refresh? Because as a 3080 owner who uses a 1440p screen at 32", I can tell you DLSS is just about usable at the Quality setting; anything else is not useful, in my opinion.
 
Btw, the ORIGINAL Tomb Raider from 1996 had more charm and personality (and tits) on a Voodoo 1 than all this linear, boring cinematic shit Crystal Dynamics pushes on the "best" GPU; they are all soulless zzzz bore fests. But people are like, ooh, the reflection in the puddle, I'm gonna need an 850W GPU for that.

I'm 30 years old but I still prefer the new ones over the old ones, so much more action 😅 It's even better than Uncharted, I think 😜
 
True, Steve, BUT I'm not here to be "entertained", and Sausagemeat also liked the idea... so wrong, again. 2 people. But hey Steve, I don't want to fight, you seem like a good guy. Keep catering to folks like you and Faelan (that rock double 8K ultrawide / 240Hz monitors with overclocked, already OC'd from the factory, super silly GPUs like the 3090 Super Ti, or a 6900 watercooled with liquid gold in your case...)

AND NOT to the billions of regular, BORING folks like me, who are just FINE still using 60Hz 1080p monitors (DOWNGRADED from 4K, btw) plus shitty GPUs like the still-amazing 1060 (my 27" HP IPS monitor consumes just 4W while gaming; your gamerz monitor? 450W just to run a base console game designed to run internally at just 60 fps... and at low resolution). Genius!
You can save all the power by turning off your PC.
 
What did Radeons do to you? 😅 And what are you trying to move to? 1440p high refresh or 1080p high refresh? Because as a 3080 owner who uses a 1440p screen at 32", I can tell you DLSS is just about usable at the Quality setting; anything else is not useful, in my opinion.
I'm currently using an RX 480 and the software is atrocious; so many issues and bugs that my friends on Nvidia, who play the same games, just don't get. It has been awful from the beginning, really. Radeons were good once upon a time; my R9 280X was quite decent. But these days people should avoid them like the plague. AMD just don't care about you once they have your money.

My brother has a 1440p monitor and an RTX 2080 Super on a 32" screen, and DLSS looks amazing; in fact, in Control and Death Stranding it looks better than native. That was most likely with the Quality setting; why would a 3080 owner use anything less at 1440p? Besides, I haven't bought my monitor yet, and 27" is most likely what I'll get, as my desk is a little small. I'm so bloody jealous of his RTX. Minecraft RTX is the biggest deal for me; we play loads of Minecraft, and the guys on the server with RTX parts keep posting screenshots in the Discord chat that look so good!
 
I'm currently using an RX 480 and the software is atrocious; so many issues and bugs that my friends on Nvidia, who play the same games, just don't get. It has been awful from the beginning, really. Radeons were good once upon a time; my R9 280X was quite decent. But these days people should avoid them like the plague. AMD just don't care about you once they have your money.

My brother has a 1440p monitor and an RTX 2080 Super on a 32" screen, and DLSS looks amazing; in fact, in Control and Death Stranding it looks better than native. That was most likely with the Quality setting; why would a 3080 owner use anything less at 1440p? Besides, I haven't bought my monitor yet, and 27" is most likely what I'll get, as my desk is a little small. I'm so bloody jealous of his RTX. Minecraft RTX is the biggest deal for me; we play loads of Minecraft, and the guys on the server with RTX parts keep posting screenshots in the Discord chat that look so good!

To be honest, I used Radeons for 8 years before I got my 3080. The R9 280X was fine, the R9 390X was good too, the Vega 64 did give me some problems, but I got the most problems with the Radeon VII. My wife used an RX 5700 XT for a year and that was just hell. I think AMD dropped the ball on the GCN drivers later in their life cycle, since they had to move all their resources to the CPU department and the first RDNA, but from what I'm hearing it's much better now, and going forward I'm expecting them to be good too. I would have had no problem buying a Radeon GPU if they were in stock when I was looking for one, but there were zero, so I picked up the 3080. It is a good GPU, but I feel like that 10GB of memory won't last me very long. My wife is using a 3070 on a 1440p 27", and although DLSS is much more usable there, as the picture is sharper in general, I think giving that card only 8GB was a total joke.

One thing I will add is that I have a friend who used the 480 and then upgraded to a 5700 XT and had zero issues. He and my wife both played RDR2, and he completed the game, but my wife could not get more than 30 minutes out of it without it crapping out. It was really strange: he had a 2700X, she is using a 3700X; he's got a B450 board, she is on X570. There was something really weird with that card 😅 😅
 
To be honest, I used Radeons for 8 years before I got my 3080. The R9 280X was fine, the R9 390X was good too, the Vega 64 did give me some problems, but I got the most problems with the Radeon VII. My wife used an RX 5700 XT for a year and that was just hell. I think AMD dropped the ball on the GCN drivers later in their life cycle, since they had to move all their resources to the CPU department and the first RDNA, but from what I'm hearing it's much better now, and going forward I'm expecting them to be good too. I would have had no problem buying a Radeon GPU if they were in stock when I was looking for one, but there were zero, so I picked up the 3080. It is a good GPU, but I feel like that 10GB of memory won't last me very long. My wife is using a 3070 on a 1440p 27", and although DLSS is much more usable there, as the picture is sharper in general, I think giving that card only 8GB was a total joke.

One thing I will add is that I have a friend who used the 480 and then upgraded to a 5700 XT and had zero issues. He and my wife both played RDR2, and he completed the game, but my wife could not get more than 30 minutes out of it without it crapping out. It was really strange: he had a 2700X, she is using a 3700X; he's got a B450 board, she is on X570. There was something really weird with that card 😅 😅
I'm just avoiding Radeon for the time being. It used to be that Radeons were always cheaper than their Nvidia competitors; my 280X was a lot cheaper than the GTX 770, and that made the inferior software experience worth it. Back then the driver issues were minimal (just some issues with boost clocks in some games, and flickering). But now Radeon cards cost pretty much the same, if not more, the issues have got worse, and you miss out on ray tracing and DLSS.

I actually have a small morsel of hope that Intel's cards might be decent. If they can give me 3070 performance for £400-£500, then I'll snap that up. Might just grab one of their Alder Lake CPUs to feed it at the same time, along with some DDR5.
 