Making a Fast Quad-Core Gaming CPU

You played these games "maxed out"? The guy specifically said, "As for FFXV, if my 4770K cannot run a 2016 game maxed out w/o even trying too hard, I will delete my account, leave the world and go take Monastic Vows."

I know FFXV and Shadow of the Tomb Raider can run on even PS4 and Xbox One, but those are with the settings turned way down from "maxed out".

Not maxed out, but not "turned way down" either - for me that would be everything on high at most, or worse. In most games I use custom settings with everything maxed out except shadows and AA, which I turn down a level or two, and I disable subsurface scattering when that's available. I remember this being the case for FFXV and SOTR. (These are the settings I often still use now with an i7-10700K + RTX 2060 Super.)

And on Tomb Raider, since everyone is giving opinions... my favorites are the 7th gen trilogy (Legend + Anniversary + Underworld). The new ones are okay.
 
I'm just avoiding Radeon for the time being. It used to be that Radeons were always cheaper than their Nvidia competitors; my 280X was a lot cheaper than the GTX 770, and that made the inferior software experience worth it. Back then the driver issues were minimal (just some issues with boost clocks in some games and flickering). But now Radeon cards cost pretty much the same if not more, the issues have got worse, and you miss out on ray tracing and DLSS.

I actually have a small morsel of hope that Intel's cards might be decent. If they can give me 3070 performance for £400-£500 then I'll snap that up. Might just grab one of their Alder Lake CPUs to feed it at the same time, along with some DDR5.

I do agree with you that Radeons should be cheaper, but AMD have tried that in the past and nVidia outsold them anyway, so hopefully their new tactic pays off in the long run and we have more choice of good (not cheap) products. I also hope Intel surprises us all, but they aren't known for being reasonable with their prices, plus their drivers might be even worse than AMD's... 😅 You really want to jump on first-gen DDR5?
 
Are you mad? You think it’s more important that AMD beats its competitor than delivers lower prices?
 

Well, if RTG wants to stay in business they need to start making AMD money. As a consumer, of course I don't want to pay more, but in the past you were getting an average Radeon or a better-quality GeForce, and because of that Radeons were seen as the cheap alternative to GeForce. Lisa wants to change that, but that of course will come at a cost to us 🤷‍♂️
 
I couldn’t care less if any of these companies go out of business and absolutely don’t want to pay more to save one. Competition should benefit the consumer, not the corporation. If one of these companies died another would pop up. And let’s face it, it would be difficult for a company to make worse products than AMD!

Besides, we have competition coming from Intel anyway, we don’t need Radeon anymore.
 
I’m currently using an RX 480 and the software is atrocious, so many issues and bugs that my friends on Nvidia, who play the same games, just don’t get. It has been awful from the beginning really. Radeons were good, once upon a time. My R9 280X was quite decent. But these days people should avoid them like the plague. AMD just don’t care about you once they have your money.

My brother has a 1440p monitor and an RTX 2080 Super on a 32" screen and DLSS looks amazing; in fact, in Control and Death Stranding it looks better than native. This was with the quality setting most likely - why would a 3080 owner use anything less at 1440p? Besides, I haven’t bought my monitor yet and 27" is most likely what I'll get, my desk is a little small. I’m so bloody jealous of his RTX. Minecraft RTX is the biggest deal for me; we play loads of Minecraft and the guys on the server with RTX parts keep posting screenshots in the Discord chat that look so good!
I'm currently using an RX 570 and I don't have any issues. I even prefer their software to Nvidia's, built-in OC and all... Catalyst used to be atrocious tho. Guess it depends on the card; most people reporting issues were on the RX 5700 XT. But I think a lot of it was exaggerated.
 

You are so wrong!! If competing with nVidia was so easy we would already have more companies doing it. Intel can afford it, and if they want to stay relevant they had no choice but to get into the GPU game, and can you even imagine the prices we would get if nVidia was alone? 😂 😂
 
I’m not wrong. But I can’t be bothered to argue.

I get it, you like AMD. I don’t, I’m not buying their garbage GPUs again and I don’t care if they go out of business.
 

You are talking to a guy who is running an Intel + nVidia rig 😅 😅 I mean, I do like AMD, I would probably prefer to have an AMD + AMD machine, but I am happy with what I currently have and won't be changing it for a while :p👌
 
Quite interesting. I read almost all the posts and will try to summarize:

- most games today are very badly optimized and need gigantic amounts of CPU, GPU and RAM

- even so, most gamers pay "whatever it takes", so CPUs and GPUs are also not being optimized much anymore; they get some tiny updates and the rest is "smaller nodes, more heat and more of the same transistors", so more performance and much more energy consumed

In the end, most gamers don't care that their gaming desktop pulls over 1000W and heats the room like a heater in the middle of winter... just as "miners" don't care about the energy/pollution/costs of virtual currencies that may disappear from one day to the next...
 
and responsible = boring
 
True Steve, BUT I'm not here to be "entertained", and Sausagemeat also liked the idea... so wrong, again. 2 people. But hey Steve, I don't want to fight, you seem like a good guy; keep catering to folks like you and Faelan (that rock dual 8K ultrawide/240Hz monitors with overclocked - on top of the factory OC - super silly GPUs like the 3090 Super Ti, or a 6900 watercooled with liquid gold in your case...)

AND NOT to the billions of regular - BORING - folks like me, that are just FINE still using 60Hz and 1080p monitors (DOWNGRADED from 4K btw) + shitty GPUs like the still amazing 1060 (my 27" HP IPS monitor consumes just 4W gaming; your gamerz monitor? 450W just to run a base console game designed to function internally at just 60 fps... and low resolution) genius!
Each to their own. Personally, I feel 60 Hz gaming is a pretty terrible experience compared to what you can get at 120 Hz+. When I jump from my gaming PC (144 Hz) to my work PC (60 Hz) even Windows feels broken.

We know for a fact that most gamers who read/watch our content are now aiming for high refresh rate gaming which makes sense given how much better the experience is.

If you're complaining about the use of the 6900 XT and RTX 3090 for CPU testing, I'm afraid you're missing the point. Obviously the idea here is to test CPU performance and while you might not be looking at 6900 XT levels of performance with a CPU right now, you might be in a few years time.
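
To put rough numbers on the refresh-rate point (a back-of-the-envelope sketch of my own, not a benchmark from the article), the per-frame budget shrinks quickly as the refresh rate climbs:

```python
# Frame-time math for common refresh rates (illustrative only).
for hz in (60, 120, 144, 240):
    frame_time_ms = 1000 / hz  # how long each refresh stays on screen
    print(f"{hz:>3} Hz -> {frame_time_ms:5.2f} ms per frame")

# 60 Hz  -> 16.67 ms per frame
# 144 Hz ->  6.94 ms per frame
# A missed frame at 60 Hz costs ~17 ms extra; at 144 Hz only ~7 ms,
# which is a big part of why high refresh feels so much smoother.
```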
 
Techspot is testing whether the CPU is the bottleneck. If you use a weak GPU, how do you know whether it is the CPU or the GPU that is the bottleneck? How do you make sure the GPU isn't the bottleneck? Use the fastest Nvidia or AMD graphics card you can. Hence the RTX 3090 and the 6900 XT for this article.
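
That logic is easy to sketch if you model the delivered frame rate as being capped by whichever component is slower (a toy model, not Techspot's actual test code; the fps numbers below are made up for illustration):

```python
# Toy bottleneck model: delivered fps is capped by the slower of CPU and GPU.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

fast_gpu = 300   # hypothetical fps an RTX 3090 / 6900 XT could push at 1080p
weak_gpu = 90    # hypothetical fps a mid-range card could push

for cpu_fps in (80, 120, 160):
    print(cpu_fps, delivered_fps(cpu_fps, fast_gpu), delivered_fps(cpu_fps, weak_gpu))

# With the fast GPU the result tracks the CPU (80, 120, 160) - you're measuring
# the CPU. With the weak GPU everything above 90 reads as 90, so the CPU
# differences are hidden behind the GPU bottleneck.
```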
 
After all these years and countless explanations, it's amazing how this one still comes up every time :D But yes, you are 100% correct. I'm not sure why these guys want a hard GPU bottleneck in their CPU testing, I don't see how that's helpful.
 
Let's agree to disagree. I know what today's "faster screens" can do... and to me it's clear all current LCD technology is pretty terrible, whether it's 60 or 120 - from a speed point of view - (they can't touch a good pro CRT or even a mediocre one).

How does this technology (LCD) stand out? Substantially lower consumption, weight and occupied space, and that is why I adopted it (prioritizing all those advantages); but even the fastest LCD is still a turtle on skates.

Edit: like I said, Ryzen 5600X user; for "GAmigz" I think I'm covered for a few years, even with 6 cores (considering consoles use something similar to a 1600 with 8MB cache). I know why Steve uses these monsters of GPUs; I would like to see how they fare at a low power envelope (the 6700 XT actually, 230+W with 300W spikes, is also terrible).
 
What about OLED? Anyway, you're entitled to that opinion, as strange as it might be :D

Just know you're in the minority on this one, not the majority. So that might help explain why we do things the way we do. Also, I had a pro-grade CRT and compared to modern LCDs it sucked; just the noise alone is laughable compared to what we have today (not to mention the size).
 
Not only were they FAT, loud bastards, but you also took a nice shower of EMF radiation with every use. Nice! But hey, CRT is capable of perfectly reproducing any resolution instantly...

Unlike the fragile and expensive OLED, which is STILL a fixed-pixel display, one that looks like CRAP if you don't use the native resolution. Yet: low power and amazing color reproduction, and it bends!

Anyway, rant over. Looking forward to your undervolting exploration on the 6700 XT vs 3060 Ti; for me it would be interesting. Hell, I might even get one if you "magically" manage to produce a Radeon/RTX GPU 2x my retro 1060 at just 120W. AMD/NVIDIA couldn't, so far...
 
“Terrible experience” is a bit much mate. I definitely agree that 120hz+ or whatever is considerably better, but I think a lot of people can enjoy a game at 60 just fine. For most of my life as a PC gamer, particularly when I was a lot younger, 60hz was the goal. I think going backwards from a 120hz to a 60hz monitor is more noticeable than moving up to it; you’re just a victim of having access to a lot of modern expensive products!

I’m on 60hz now, and I’d have to spend astronomical money on a GPU, CPU and monitor to get above 60. It’s definitely not worth it with the inflated prices. If prices come down again I’d get into it, but right now I’m just putting up with 1080p60 and it’s not the end of the world. Good games are still good games at 60hz, and it doesn’t feel choppy to me like 30hz does - it’s what I’m used to I guess.

Also agree with you about CRTs, I’d take a good modern VA over an old CRT, for image quality and footprint. Not really sure why people love them so much. I loved my 19” CRT in 2004 but we’ve come a long way since then.
 
Going from high-refresh rate to 60 Hz is at the very least a terrible transition. I know, I do it daily and it sucks. I'll have a solution soon, but for now it's a jarring experience.
 
Hilarious. One of the reasons why I turn off "motion blur" in games is that my LCD sort of already has the same effect of mild blur when moving, included from the factory (my bro's super gamer 144Hz LCD does the same btw).

I'm blasting aliens with no problem at 60 FPS (and 75W), but hey, I'm not a pro gamer. Edit: yes, I have a good response time, and that's all that matters - TO ME - xP, for non-competitive FPS/gaming.

How's your Spanish, Steve? This guy is one of mine; he doesn't notice A SINGLE DIFFERENCE between 60 and 144 (gaming).
Translation:

"[I] joined that community of users who, after trying the experience, were left with a strange feeling: that despite what they say, the step from 60 Hz to 144 Hz did not mean anything to me."
 
I find motion blur to be very different from game to game, but generally I leave it on; I think in most modern games it’s a good thing. But we have had some poor implementations in the past.
 
To me, motion blur, chromatic aberration and film grain = same fate: OFF. Edit: found a guy that does this with a 6700 XT, even uses a 120W BIOS. But he's a MINER :(
 
Played Tomb Raider 2018 on my i7-2600K @ 5.1GHz just fine, but if memory serves me right it was in DX12 mode where it worked the best. Also played the latest PC version of Final Fantasy and it also worked decently enough on the system.
The problem with your system is that DDR3 heavily bottlenecks gameplay. There are a few videos out there where people with an overclocked i5-2500K and an RX 480/580 in The Witcher 3 changed the memory from 1600 to 2133MHz and saw double-digit fps gains. Still an impressive overclock though; 5.1GHz is unheard of from Sandy Bridge.
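
For a bit of context on why the memory bump helps (rough theoretical math on my part, not numbers from those videos): dual-channel DDR3 bandwidth scales directly with the transfer rate, so going from 1600 to 2133 is roughly a third more peak bandwidth for the CPU to work with.

```python
# Theoretical peak bandwidth for dual-channel DDR3 (64-bit = 8 bytes per channel).
def ddr3_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000  # GB/s

print(ddr3_bandwidth_gbs(1600))  # 25.6 GB/s
print(ddr3_bandwidth_gbs(2133))  # ~34.1 GB/s, about 33% more
```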
 