A Decade Later: Does the Q6600 Still Have Game in 2017?

Locked at 30FPS, Far Cry 5, The Division 2, and Ghost Recon Wildlands are all playable on medium settings on my Q9650 @ 3.2GHz paired with a 780 Ti. Bump up a generation and my i7 920 @ 4.0GHz, i7 960 @ 4.3GHz, and i7 980X @ 4.5GHz are still great for 1080p/1440p high-to-ultra gaming paired with an R9 Nano/1060 3GB/980 Ti/1070 Ti.
 
I wonder how my i7 920 would do with similar specs and tests...
It would do very well. Testing with my i7 920 @ 3.8GHz and an R9 Nano as well as a 980 Ti, it is still fantastic for 99% of games (the 1% being the last two Assassin's Creed games, which use instructions not supported by this CPU, so they run at roughly 30FPS).
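For anyone wondering whether their own chip supports the instruction sets a given game needs, the kernel exposes the CPU's feature flags. Here's a minimal sketch, assuming a Linux box where `/proc/cpuinfo` is available; the sample text below is a made-up Nehalem-style flags line, not output from a real i7 920:

```python
def cpu_flags(cpuinfo_text: str) -> set:
    """Parse the 'flags' line from /proc/cpuinfo-style text into a set."""
    for line in cpuinfo_text.splitlines():
        if line.lower().startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

# On a real Linux machine you would feed it the actual file:
#   flags = cpu_flags(open("/proc/cpuinfo").read())
# Hypothetical Nehalem-era flags line: SSE4.2 is there, AVX is not.
sample = "flags\t\t: fpu sse sse2 ssse3 sse4_1 sse4_2 popcnt"
flags = cpu_flags(sample)
print("sse4_2" in flags)  # True
print("avx" in flags)     # False
```

If a title really does require an instruction set the CPU lacks (AVX being the usual suspect on pre-Sandy Bridge chips), no amount of overclocking will bring it back.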
 
LGA775 was a legendary platform. I had an E6600 and then got a cherry-picked Q6700 sample, which was the model above the Q6600. I overclocked the bejesus out of them. Good times. Especially when you could buy a super-budget Pentium E2140 at 1.6GHz and easily make it run 3.2GHz+, performing like a chip five times its price...

I quickly moved on to LGA1366 and an i7 920 in late 2008 though, because that was a genuine leap. It was the best part of 30 percent faster clock-for-clock than the Core 2 Quads, with 8 threads and an integrated DDR3 memory controller with three channels, meaning huge memory bandwidth.
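The bandwidth point is easy to put numbers on. A quick back-of-envelope sketch, assuming X58's officially supported triple-channel DDR3-1066 versus a typical Core 2 setup with dual-channel DDR2-800 (both with the standard 64-bit, i.e. 8-byte, channel width):

```python
def peak_bandwidth_gbs(channels: int, mts: int, bus_bytes: int = 8) -> float:
    """Theoretical peak memory bandwidth in GB/s (1 GB = 1e9 bytes):
    channels * bytes per transfer * megatransfers per second."""
    return channels * bus_bytes * mts * 1e6 / 1e9

# X58 / i7 920: triple-channel DDR3-1066
print(peak_bandwidth_gbs(3, 1066))  # ~25.6 GB/s
# Core 2 era: dual-channel DDR2-800
print(peak_bandwidth_gbs(2, 800))   # 12.8 GB/s
```

Roughly double the theoretical peak, before you even factor in the memory controller moving on-die and cutting latency.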

I kept that machine around until last year, still doing odd jobs, until the X58 board sadly failed. Eight years and tens of thousands of hours of solid running ain't bad for the stress I put it under! I bet that platform and chip hold up a lot better than the Core 2 today because of the aforementioned modern features. I know that if you overclocked the old i7 920 it was still competitive, albeit hot and power hungry. Test one at, say, 3.6GHz and insert the results here please!!

I know it's been quite a while, but just in case you're still interested, I ran these over the past few days:

[Attached image: HRlSaww.png — benchmark results]
 
I have an old i5-2500K paired with a new GTX 1070 and 8GB of DDR3, and I can still play most new titles with all the bells and whistles turned on at 1080p resolution.

Thinking about making the leap to Kaby Lake, but I'm rather reluctant to shell out for a new i5, motherboard, and DDR4 considering the cost of memory.

I'd wait for DDR5. It's not so long away, a year or so? Your 2500K will really have got its money's worth by then, and the difference should be more noticeable. Plus new processors and graphics cards... PCIe 4.0 should be common on both AMD and Intel by then. It gives you time to save, and you'll feel the difference more. Nothing worse than upgrading and spending your pennies expecting it to be so much better, then realising it didn't need to be done and you could've waited. I would have thought a 2500K wouldn't be up to it by now, but after seeing this article: wow. I'd want to "run it into the ground" at this point. Your choice though.
 
Nothing is using DDR4 memory speeds yet, so no big jump there besides power draw. Same goes for PCIe 4.0: nothing fully utilises PCIe 3.0 yet. So yes, you could wait, or just buy right now as it's better bang per buck. Also less power usage on the CPU side.
 