Making a Fast Quad-Core Gaming CPU

This is why my previous i7-980X (12 MB of cache at 4.3 GHz on all cores, 6 cores/12 threads, triple-channel memory with 12 GB of 2 GHz Dominator C7) lasted a decade. I was able to play Metro Exodus at 3440x1440 on Extreme with Hairworks and tessellation on, as well as Shadow of the Tomb Raider maxed out.
 
So why don't the old X99 parts like the 5960X, which has 20 MB of L3 cache, go as fast as a 10600K then?
IPC, once you remove the cache bottleneck that is being shown. It seems that once you have enough cache and at least six cores, the IPC or clock speed becomes the bottleneck in current-gen titles going forward when comparing the CPUs you mentioned.
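To put that in rough numbers, here's a minimal sketch of the usual first-order approximation, performance ≈ IPC x clock. The clocks are approximate boost clocks, and the IPC figures are purely illustrative assumptions, not measured values from the article:

```python
# Rough first-order model: single-thread gaming performance ~ IPC x clock speed.
# Clocks are approximate boost clocks; the IPC figures are illustrative
# assumptions, not measurements.
cpus = {
    "i7-5960X (Haswell-E)":   {"ipc": 1.00, "clock_ghz": 3.5},
    "i5-10600K (Comet Lake)": {"ipc": 1.10, "clock_ghz": 4.8},
}

base = cpus["i7-5960X (Haswell-E)"]["ipc"] * cpus["i7-5960X (Haswell-E)"]["clock_ghz"]
for name, c in cpus.items():
    print(f"{name}: ~{c['ipc'] * c['clock_ghz'] / base:.2f}x relative single-thread estimate")
```

Under those assumptions the newer six-core part lands roughly 50% ahead per thread, which is why cache alone doesn't close the gap.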
 
The conclusion doesn't match the numbers. In which games does it clearly show a bad experience for those with a 4c/8t CPU?
It's not supposed to show you a bad experience; it shows you data, and you can extrapolate from it and conclude what works best for you, especially if you are in the market shopping to future-proof your next build. Do you buy a current-gen quad-core CPU, or is six cores the new minimum for future-proofing your rig? While cache size is important, cache size is usually directly proportional to the core count.
The article is not intended to trigger owners of 10 generations of stagnant Intel quad-core CPUs, at least I think it's not.
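To illustrate the cache-to-core-count point, here's a quick sketch using the L3 sizes of a few 10th-gen Intel desktop parts (specs quoted from memory, so treat them as approximate):

```python
# L3 cache vs. core count for some 10th-gen Intel desktop parts
# (published specs, listed from memory, so treat as approximate).
parts = {
    "i3-10100":  {"cores": 4,  "l3_mb": 6},
    "i5-10600K": {"cores": 6,  "l3_mb": 12},
    "i7-10700K": {"cores": 8,  "l3_mb": 16},
    "i9-10900K": {"cores": 10, "l3_mb": 20},
}

for name, p in parts.items():
    print(f"{name}: {p['l3_mb'] / p['cores']:.1f} MB of L3 per core")
```

Roughly 1.5-2 MB of L3 per core across the stack, so more cores usually means more cache comes along for the ride.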
 
Each to their own. Personally, I feel 60 Hz gaming is a pretty terrible experience compared to what you can get at 120 Hz+. When I jump from my gaming PC (144 Hz) to my work PC (60 Hz), even Windows feels broken.
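To put that in frame-time terms, a quick sketch of the arithmetic (nothing measured here, just the refresh intervals):

```python
# Frame interval at a given refresh rate: 1000 ms divided by the refresh rate in Hz.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
```

Each frame sits on screen less than half as long at 120 Hz+ as it does at 60 Hz, which is why the difference is so obvious even on the desktop.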

We know for a fact that most gamers who read/watch our content are now aiming for high refresh rate gaming, which makes sense given how much better the experience is.

If you're complaining about the use of the 6900 XT and RTX 3090 for CPU testing, I'm afraid you're missing the point. Obviously the idea here is to test CPU performance, and while you might not be looking at 6900 XT levels of performance with a CPU right now, you might be in a few years' time.


OK Steve... how about testing Radeon Chill at 1080p/60 fps? https://www.amd.com/es/technologies/radeon-chill It should be a fairly simple thing to do :)
 
Seeing this result, Intel should ditch AVX-512 on consumer processors and integrated graphics on all desktop processors.
The transistors would be more useful spent on cache.
 
If you are buying a gaming PC in 2021, of course you are going to be looking at a 6- or 8-core (or more) part. But the idea that 4c/8t gaming is obsolete or incapable of delivering a good gaming experience today is absolute nonsense.
I have an i7-7700 (non-K) currently paired with an RTX 3060 Ti, and there has not been a single game I have played that gives me a poor experience. Does every single game max out my 165 Hz 1440p monitor? No, but with G-Sync enabled I don't notice or care.
 
This is on a Steam library of over 250 games, including Cyberpunk, Metro Exodus Enhanced Edition and many other super demanding games. My monitor is the MSI 274QRF-QD. Thanks to TechSpot for their review; it's why I picked it.
 
I was referring to their interpretation of the data. Anyway, I'm not a triggered quad-core Intel owner, as I currently run a 5800X. However, nothing shown in those tests has any trouble running well on an Intel quad core.
 
Not only were they FAT, loud bastards, but you also took a nice shower of EMF radiation with every use. Nice! But hey, CRT is capable of perfectly reproducing any resolution instantly...

Unlike the fragile and expensive OLED, which is STILL a fixed-pixel display, one that looks like CRAP if you don't use the native resolution. Yet it's low power with amazing color reproduction, and it bends!

Anyway, rant over. Looking forward to your undervolting exploration of the 6700 XT vs. 3060 Ti; for me it would be interesting. Hell, I might even get one if you "magically" manage to produce a Radeon/RTX GPU at 2x my retro 1060 at just 120 W. AMD/NVIDIA couldn't, so far...
I call BS on OLED "not looking good at anything but native." I have a 65" LG CX 4K/120 Hz OLED, and before I got a 3080 I ran it with a 2080 Ti at 1440p/120 and it looked marvelous. The built-in scaling is quite impressive, and it was kinda hard to see the jump in image quality once I did move up to native 4K.
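For reference, a rough sketch of the pixel math behind that upscale (just the two resolutions mentioned above):

```python
# Pixel counts for the two resolutions mentioned above.
native_4k = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440         # 3,686,400 pixels

print(f"4K has {native_4k / qhd:.2f}x the pixels of 1440p")  # 2.25x
print(f"per-axis scale factor: {3840 / 2560:.2f}x")          # 1.50x, non-integer
```

The non-integer 1.5x factor is exactly the case where the TV's scaler has to do real work, so a good built-in scaler matters.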

Take your FUD elsewhere... Please!
 
Fragile much? It has a good scaler, hurray (which introduces issues). I'm talking about PC monitors and native resolutions here, not scaling, FSR or DLSS 2.
 
Not trying to be an *** here, but the 4770K is pure trash by today's standards. A locked i3 for $80 would be faster than an overclocked i7-4770K, which means this i7 would be classed as below low-end as a gaming CPU... Try something modern on it like AC Valhalla or Cyberpunk and it will choke that CPU in no time...

Sorry, no, at least not where Cyberpunk is concerned. My main system when I played C2077 was an i7-4790K @ 4.6 GHz that would boost to 4.8. My GPU was a 2080S driving three 60 Hz 1920x1200 monitors with a bezel-corrected 6060x1200 surround resolution. My graphics settings were high, with both DLSS and ray tracing set around medium.
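For anyone wondering where 6060x1200 comes from, here's the bezel-correction arithmetic implied by those numbers (assuming 1920-wide panels; the per-bezel split is just what falls out of the figures above):

```python
# Surround width = three panels side by side plus compensation pixels for the two bezels.
panel_width = 1920
panels = 3
surround_width = 6060   # bezel-corrected width reported above

bezel_compensation = surround_width - panels * panel_width
print(f"total bezel compensation: {bezel_compensation} px")       # 300 px
print(f"per bezel gap: {bezel_compensation // (panels - 1)} px")  # 150 px each
```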

While the game had/has numerous bugs and glitches, it ran smooth as butter for me. Same with Outriders at similar settings. No idea what AC would do, since I never played it. Yes, I know my monitors are old as dirt with a 60 Hz refresh, but other than that they're fantastic IPS panels that I just can't give up, even if it limits me to 60 fps.

I've even played some modern but less demanding games like Wasteland 3 on an older secondary system consisting of an i7-875K @ 3.7 GHz and a 980 GPU driving a 1440p monitor, and they ran fine with high GPU settings. I wouldn't use anything less than a 4c/8t processor, but saying a more modern locked i3 is better? Not in my experience, if you OC the older CPU.

That said, I have upgraded to an i5-10600KF @ 5 GHz for my main gaming system, which still rocks the 2080S and the three old monitors, and I've noticed a slight improvement, but not enough to make me regret waiting so long to upgrade.

In the end, all the benchmarks in the world are only indicators of the potential end-user experience, and blanket performance statements are prone to personal biases more than anything else. For example, my idle resource use sits at about 1-2% on the i7-4790K and I have 32 GB of memory. So unlike someone who might be running Twitch, a number of game launchers, Discord, etc., my games are seldom starved for resources since the game is the only thing running, and that makes a massive difference in performance.
 
Obviously you wouldn't see that much of an improvement when you are GPU-bottlenecked and only trying to play at 60 fps. A Core i3-10105F that costs £74.99, paired with DDR4-3200, is most likely better than any mainstream Haswell CPU.
 
Again, you're speculating based on your personal biases, not any real-world testing. More importantly, as I said, it's end-user experience, not benchmark numbers, that really matters. You could have a system with triple the fps that stutters like crazy. Is that the system you'll go with because of the numbers? Really?

Edited to add: As well, you say I'm GPU-limited, so why do I need a better CPU in that case? So your argument is moot...
 
And you're assuming I didn't have any experience with these i7s...
 
A 67% increase in cache resulting in an 18% improvement in an absolute best-case scenario does not scream "cache is the biggest factor" to me. Maybe there is no single big factor. I'm obviously not a hardware engineer, but I think there are variables being missed beyond cores and cache for why a CPU over twice as powerful can't get twice the results.
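Putting numbers on that intuition, here's the scaling ratio implied by those two figures (just the percentages quoted above, nothing measured):

```python
# How much of the cache increase showed up as performance, using the
# percentages quoted above (67% more cache -> 18% more performance).
cache_increase = 0.67
perf_increase = 0.18

print(f"performance gained per unit of extra cache: {perf_increase / cache_increase:.0%}")  # ~27%
```

Only about a quarter of the extra cache translates into frame rate even in the best case, which supports the idea that something other than cache is doing most of the work.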
 