A Decade Later: Does the Q6600 Still Have Game in 2017?

I have an old i5-2500K paired with a new GTX 1070 and 8GB of DDR3, and I can still play most new titles with all the bells and whistles turned on at 1080p.

Thinking about making the leap to Kaby Lake, but I'm rather reluctant to shell out for a new i5, motherboard and DDR4, considering the cost of memory.
 
My advice would be to wait and see what AMD brings to the table with Ryzen. It's only a few months away now.
 
That would be my thinking also. I kind of prefer Intel all the same, considering AMD's contribution to the CPU market over the last few cycles has been rather piss poor by comparison.
 
Sadly they made a bad bet and had to live with it for the past five years. It doesn't look like they've made the same mistake again; Ryzen looks to be the real deal. Anyway, we'll know for sure soon ;)
 
I'd call Intel's recent contribution to the desktop market piss poor.
Let's all hope Ryzen can wake Intel up a little.
 
I just recently stopped using a Q9450. It was pretty good in all but games from the last 2-3 years (and unusually CPU-dependent games like GTA IV and Skyrim), but the real limiter was the 4GB of DDR2 RAM, IMO. It's still sitting in a box and I don't know what I'll do with it.
 
Even though it may not seem like it, DDR2 @ 800 MHz will slow modern games down significantly compared to running faster DDR3 on the same CPU in a DDR3 motherboard. You probably can't push the CPU clock as high as you would with DDR2-1066, though, because you don't want to go above a 400 MHz FSB on DDR3: the memory will be forced past 1600 MHz, which probably won't be stable on a Core 2 Quad.

Seeing CPU utilization at 100% on all cores can fool you into thinking the RAM isn't what's slowing you down.
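To put rough numbers on that FSB/memory relationship, here's a quick sketch. The 2:1 DRAM:FSB divider is an assumption for illustration; actual available ratios vary by board.

```python
# Sketch of how DDR3 speed is tied to the FSB on a Core 2 board.
# Assumption: the board can't drop below a 2:1 DRAM:FSB divider, which
# is what forces the RAM past 1600 MT/s once the FSB exceeds 400 MHz.

def ddr3_rate(fsb_base_mhz, dram_fsb_ratio=2.0):
    """Effective DDR3 transfer rate in MT/s for a given base FSB clock."""
    memory_clock = fsb_base_mhz * dram_fsb_ratio  # actual DRAM clock in MHz
    return memory_clock * 2                       # DDR: two transfers per clock

print(ddr3_rate(266))  # stock Q6600 (1066 MT/s FSB) -> 1064.0
print(ddr3_rate(400))  # 400 MHz FSB overclock       -> 1600.0
print(ddr3_rate(420))  # past a 400 MHz FSB the RAM exceeds DDR3-1600
```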
 
I retired my QX8200 when BF4 was released, as it couldn't cope (the general FPS was fine, but stuttering and random freezes made playing impossible). I replaced it with an i7-3770 that I have no plans to replace so far. The Core 2 Quad is now my media player and casual Steam gaming machine (running Linux Mint now, as it was originally on Vista).
 
Great article. I'd rather read articles such as this than reviews of overpriced $400+ hardware.

I have an overclocked 2500K @ 4 GHz paired with a factory-overclocked GTX 1060 3GB; there isn't a game that pairing can't handle.

I even have a stock Phenom II X4 955 paired with a Crucial SSD that works great as a PC for my wife; it handles web surfing and picture uploads with no issues.

Chasing synthetic performance gains is simply a waste of money.
 
The most cost-effective solution would be to throw a 3770K into the same board you've got there and call it a day for another few years. The 3770K is still a hell of a chip, not too far behind Skylake/Kaby Lake.

https://youtu.be/xhuC8Tf9i3I
 
Like everyone else who read this article, I came to the obvious conclusion that Intel has been at nearly a full stop since the 2500K. What was missing from the benchmarks, and would have slammed this point home, would have been taking the 2500K to 4.5 GHz (feel free to bump the 6700K for comparison too); at that point you are GPU-bound at any playable resolution.

Memory advances are technically great, but the real-world performance difference between DDR3 and DDR4 is nothing.

Even with AMD's Ryzen coming out, I don't know if there's any reason to upgrade from an i5-3570K at 4.4 GHz. GPUs are the new CPUs, and those are bound to hit the same fate soon too. They keep getting bigger and bigger, but there are limits and we are getting close.
 
I had a Q6200 w/5850 Crossfire, but I had to give it up for a Core i7-3770; it got so bad that the CPU load would hit nearly 100% while running full screen. Sadly, it was time to retire that LGA775 rig.
 
You know it's old tech, and you assume new tech is better even when the clock speeds look similar, just because it's new. Yet you don't realize how much things have changed until you see the benchmarks; even at a similar clock speed, the architectural improvements go a long way.
 
I think I got about 4-5 years out of my Q6600, might have been 6. I still have it, actually, in the original anti-static case it came in, in my desk drawer. I'm planning on building an HTPC/media server and dropping it in there.

For now, my 3770K is still trucking along. It probably won't get replaced until a year after Ryzen's release (let AMD shake out, and see what Intel's response is).
 
Wow for the i5-2500K + GTX 1060 gaming results!
Is it okay to pair an i3-3240 with a GTX 1060 for 1080p gaming? I don't want that thing called a 'bottleneck'...
 
Nice article. Can't believe the Q6600 is already 10 years old; damn, time is flying. The Q6600 was my first quad core and it was awesome. Weird to see an i3 blowing the Q6600 out of the water. :D
 
My spouse still uses my old Dell Inspiron 530 Q6600 (upgraded with an SSD).

Nine years old and it's still a great home PC.
 