A Decade Later: Does the Q6600 Still Have Game in 2017?

I can pop an old Q9400 (closely related to the Q6600), something like an HD 7770, and 8GB of RAM into an OptiPlex 745 for less than 100 dollars and have a perfectly legitimate gaming machine for... well, less than 100 dollars. That's a bargain no matter how you look at it.
 
All this testing and we didn't get any overclocked results from the Q6600?
Can it run stable @ 3.5GHz?
Clock speeds must be somewhat close for true comparison testing IMO.

Also, I would love to see the results of every generation i7 @ 4.0GHz for these same benchmarks.
 
Cool article. I'm not terribly surprised by the outcome, however; a 10-year-old CPU is always going to be a 10-year-old CPU. It was a great chip while I had it and overclocked easily to 3.2GHz, but when I sold it to a friend and moved up to the i7 920, the difference was night and day. That chip was as much a turning point in CPU technology as the Q6600 was in its day. Like others have said, it would have been nice to see it in the article as well, to show what the next step could get you compared to the latest and greatest.

Perhaps a seven-generations-of-the-i7 article is coming? That would be really cool to see, because believe it or not the i7 has already been around for 8 years as well. It's really amazing to think that Q4'06 to Q4'08 was as long as the Core 2 Quad series was king, a short 2 years compared to how long the i7s have been top dog with no sign of that changing. I'm really hoping Ryzen has what it takes to mix things up in the industry; it's been too long since anything has shown us what the future of CPUs can actually do.
 
I can pop an old Q9400 (closely related to the Q6600), something like an HD 7770, and 8GB of RAM into an OptiPlex 745 for less than 100 dollars and have a perfectly legitimate gaming machine for... well, less than 100 dollars. That's a bargain no matter how you look at it.

If that's your idea of a "legitimate" gaming machine, that's your opinion, and it also depends on what games you're playing on it. But don't call it "legitimate"; call it a value or budget build (a bargain, like you yourself mention). Even at that, you can build something far more efficient that will pay for itself in electricity savings.
 
Interesting read

I have 2 LGA 775 rigs with me and they are still running to this day

One rig is powered by the QX6800, which was a 1,000-dollar CPU in its heyday and now cost me only 50 dollars to get. I paired it with the Gigabyte EP45-UD3P mobo, 8GB of DDR2 800MHz RAM, and a GTX 950 2GB GPU. The CPU is OCed to 3.2GHz from 2.93GHz.

At 1080p Medium in games like Battlefield 1, I get at least 30fps. If my OC were a lot more aggressive, I could probably extract a few more fps, but the board is a hand-me-down and my cooler is still up to the job. This is a 130W TDP CPU, and even the famous Cooler Master Hyper 212X is only keeping it in the 60-70 degree range during high loads.

The other is the Q6600 HTPC, paired with a G41 board and 4GB of DDR3 1333MHz from old HP PCs. With the DDR3, I get a stable 3GHz OC on the Q6600 without tweaking any voltages, and it's still on the stock Intel cooler that came with the Q6600 (the one with the copper center). HDMI output was solved with an HD 5450 1GB DDR3, and it dual-boots Windows 10 and Windows XP. This is my retro gaming PC for games like MechWarrior 4 and Need for Speed Underground.
If 30fps is playable to you, more power to you. I wouldn't play a game where I only got 30fps, especially when I have a 120Hz monitor. The minimum for me to be happy is 60 frames. I do realize, though, that it's nice still being able to use older hardware.
 
"We set out to discover if the decade old Core 2 Quad Q6600 could cut the mustard in 2017"

And you did that by testing everything on Ultra?

It would have been much more interesting to see how it fared with the silly graphics options turned off: whether it could run the games smoothly while still having them look as intended, using medium or similar settings.
"silly graphics options"?

Graphics, along with the mouse & keyboard (for some), are the only reasons people game on PC. Who decided that games "look as intended" at medium settings? The more logical assumption is that their intended look is at max settings, with lower settings being offered to accommodate slower rigs. Otherwise, why would game designers even offer higher settings?
 
This whole article is funny. Speaking as someone who had a Q6600 until just last year, after using it for 2 years: I ran Overwatch, Rocket League, and other games just fine. Overwatch I played at 1600x900 on nearly max settings and got between 39-60fps; Rocket League at 1600x900 maxed was 50fps. My specs were an Intel Q6600 2.4GHz, 4GB DDR2 memory, Windows 7 64-bit, and a PNY GTX 550 Ti.

The main problem with this article is that the new cards are too powerful for the quad, creating a massive bottleneck. So it's basically a pointless, inaccurate article.

*Sigh* That IS the point. Anyone thinking that a new GPU is all they really need to bring a 10-year-old PC into the modern world now knows that their old CPU, while fine for surfing, won't cut it for gaming. There's nothing pointless or inaccurate about it.
 
OH MY GOD! In 2011 the 2500K platform cost, in EU currencies, was half of what the 6700K costs today. Even in US$, the 6700K costs $339.99 vs $216 for the 2011 2500K. It's not a stalemate, it's regression! Forget slowdown, it's a miracle the PC market hasn't halted dead already! Shouldn't some regulatory body look into possible monopoly practices by Intel?
Slow down there, Sparky! The 6700K in this test is an i7, not an i5. Two different beasts.

The current i5 is the Kaby Lake i5-7600K and it sells for $250. That's about a 15% increase 6 years later.
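A quick back-of-the-envelope check on that figure, taking the prices quoted in this thread at face value ($216 for the 2500K, $250 for the 7600K; exact street prices vary by retailer):

($250 − $216) / $216 ≈ 0.157, or roughly a 15-16% increase over those six years.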
 
All this testing and we didn't get any overclocked results from the Q6600?
Can it run stable @ 3.5GHz?
Clock speeds must be somewhat close for true comparison testing IMO.

Also, I would love to see the results of every generation i7 @ 4.0GHz for these same benchmarks.

It ran stable at up to 3.1GHz and the results were shown in every graph in the article...
 
So it is "basically a pointless inaccurate" to see if the Q6600 can tackle the latest games using modern graphics cards? That is somehow inaccurate...

The point of the article wasn't to see if the Q6600 can handle low-end games using entry-level settings. If it were, we would have tested at 1600x900 with medium to low quality settings.

Q6600 (2007) + 1060/1070 (2017) = bottleneck. What did you expect was going to happen?
The headline reads: "Does the Q6600 Still Have Game in 2017?"
To answer that question, yes it does, but only if you use the right setup. And that setup is not a 1060 or 1070 card; a 550 Ti or 750 Ti would be the best to use with a Q6600, or even a GTX 970, which interestingly can max out Doom at ultra settings with no problems on a Q6600. So to me it's a pointless, inaccurate article. It contradicts what the headline reads.

Then at "We set out to discover if the decade old Core 2 Quad Q6600 could cut the mustard in 2017, and the answer is a resounding no. " Well it can, as I mention if your setup is right with the right card and right drivers and you game at high or max settings at a lower resolution 1600x900 or 1920x1080.
A) The tests WERE run at 1080p, as indicated at the top of every test.

B) By "the right card and the right drivers" you mean OLD ones, which would defeat the purpose of the article.

C) No, you can't max out DOOM using a Q6600, period... unless you mean the original Doom from 1993.

Why don't you just cut to the chase: do you use a Q6600, and find this article offensive? Why else would you defend a ten year old gaming dinosaur so vigorously? Do you think calling the article pointless makes you sound smart?
 
Cool article. I'm not terribly surprised by the outcome, however; a 10-year-old CPU is always going to be a 10-year-old CPU. It was a great chip while I had it and overclocked easily to 3.2GHz, but when I sold it to a friend and moved up to the i7 920, the difference was night and day. That chip was as much a turning point in CPU technology as the Q6600 was in its day. Like others have said, it would have been nice to see it in the article as well, to show what the next step could get you compared to the latest and greatest.

Perhaps a seven-generations-of-the-i7 article is coming? That would be really cool to see, because believe it or not the i7 has already been around for 8 years as well. It's really amazing to think that Q4'06 to Q4'08 was as long as the Core 2 Quad series was king, a short 2 years compared to how long the i7s have been top dog with no sign of that changing. I'm really hoping Ryzen has what it takes to mix things up in the industry; it's been too long since anything has shown us what the future of CPUs can actually do.
Ryzen isn't going to do anything that the current i7 can't; it'll just supposedly be cheaper. Early "leaks" show it hanging with the i7 in synthetic benchmarks and the i5 in gaming. It'll be nice to see some competition again, but performance-wise it's not going to be breaking any new ground.
 
OH MY GOD! In 2011 the 2500K platform cost, in EU currencies, was half of what the 6700K costs today. Even in US$, the 6700K costs $339.99 vs $216 for the 2011 2500K. It's not a stalemate, it's regression! Forget slowdown, it's a miracle the PC market hasn't halted dead already! Shouldn't some regulatory body look into possible monopoly practices by Intel?
Slow down there, Sparky! The 6700K in this test is an i7, not an i5. Two different beasts.

The current i5 is the Kaby Lake i5-7600K and it sells for $250. That's about a 15% increase 6 years later.
Wait, what? You lost me there! The i7-6700K Skylake in the test cost $339 at launch and $320 nowadays. Why did you jump to the $250 i5-7600K Kaby Lake, if it's not in the test?
 
Wait, what? You lost me there! The i7-6700K Skylake in the test cost $339 at launch and $320 nowadays. Why did you jump to the $250 i5-7600K Kaby Lake, if it's not in the test?
Sorry. Since you were comparing the cost of the 2500K (i5) and the 6700K (i7) and saying how things have regressed, I thought you believed both models were in the i5 family and that prices had skyrocketed. I was then pointing out that there isn't a large price difference between the latest i5 and the 2500K.
 
It doesn't. I sold mine four years ago, and back then it was already noticeably slower than current Intel quad models. I noticed the difference in games, apps and overall OS performance. If you still use one, I think your only excuse can be that you only do web browsing and mostly old apps/games. As slow as CPU progress has been lately, the difference is still tremendous.
 
It doesn't. I sold mine four years ago, and back then it was already noticeably slower than current Intel quad models. I noticed the difference in games, apps and overall OS performance. If you still use one, I think your only excuse can be that you only do web browsing and mostly old apps/games. As slow as CPU progress has been lately, the difference is still tremendous.

Yeah, I replaced someone's Q6600 machine that was breaking down with a second-hand i3-4150 dual core + Z97, and it comprehensively duffs up the old rig even with half the cores. Sips power, too.

If you still have old hardware and it does the job then run it into the ground. But replacing it with something vastly better all this time later is really easy and not expensive.

Not to mention the board upgrade going from that time period is tremendous: native USB 3, SATA 3, NVMe and the like.
 
It was fun to see this, Steve; thanks for doing it. I'd call the results of Gears of War 4 @ 1080p Ultra (GTX 1070) impressive to say the least:

i5-2500K Avg/Min = 84/62
i7-6800K Avg/Min = 118/100

I'm guessing that Gears actually utilizes hyperthreading; otherwise the whole "an i7 is overkill for a gaming rig" mantra is bogus, because a 35fps+ gap is no joke. But again, it would be cool to see the difference between the i5-2500K and the shiny new i5-7600K. I'm thinking THAT difference is <10 fps?
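For what it's worth, working only from the numbers quoted above, the gap comes out to 118 − 84 = 34fps on the averages and 100 − 62 = 38fps on the minimums, so "35fps+" is roughly right for the minimums and just shy of it on the averages.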

Think I'll splurge for an i7 for my next rig. New games may start using HT more often. I guess we'll see what the next few years bring.
 
Interesting article, yet... I can't help feeling it's almost like most political stories that get tilted and molded to fit one particular side or view. The reviewer only tested on Ultra. While these test systems were set up, perhaps high, medium, and low quality settings should have been tested as well. Doing so would show the audience what compromises they are making in order to achieve playable frame rates in these games. Additionally, there are other games that the C2Q & C2D still handle just fine. Speaking of Overwatch, my son plays on a C2D E8400 paired with a 750 Ti at 1080p. Is it on Ultra? Certainly not, but it is playable and he enjoys it just the same.
 
Interesting article, yet... I can't help feeling it's almost like most political stories that get tilted and molded to fit one particular side or view. The reviewer only tested on Ultra. While these test systems were set up, perhaps high, medium, and low quality settings should have been tested as well. Doing so would show the audience what compromises they are making in order to achieve playable frame rates in these games. Additionally, there are other games that the C2Q & C2D still handle just fine. Speaking of Overwatch, my son plays on a C2D E8400 paired with a 750 Ti at 1080p. Is it on Ultra? Certainly not, but it is playable and he enjoys it just the same.

Did you read the article or did you just look at the graphs?
 
Interesting read

I have 2 LGA 775 rigs with me and they are still running to this day

One rig is powered by the QX6800, which was a 1,000-dollar CPU in its heyday and now cost me only 50 dollars to get. I paired it with the Gigabyte EP45-UD3P mobo, 8GB of DDR2 800MHz RAM, and a GTX 950 2GB GPU. The CPU is OCed to 3.2GHz from 2.93GHz.

At 1080p Medium in games like Battlefield 1, I get at least 30fps. If my OC were a lot more aggressive, I could probably extract a few more fps, but the board is a hand-me-down and my cooler is still up to the job. This is a 130W TDP CPU, and even the famous Cooler Master Hyper 212X is only keeping it in the 60-70 degree range during high loads.

The other is the Q6600 HTPC, paired with a G41 board and 4GB of DDR3 1333MHz from old HP PCs. With the DDR3, I get a stable 3GHz OC on the Q6600 without tweaking any voltages, and it's still on the stock Intel cooler that came with the Q6600 (the one with the copper center). HDMI output was solved with an HD 5450 1GB DDR3, and it dual-boots Windows 10 and Windows XP. This is my retro gaming PC for games like MechWarrior 4 and Need for Speed Underground.
If 30fps is playable to you, more power to you. I wouldn't play a game where I only got 30fps, especially when I have a 120Hz monitor. The minimum for me to be happy is 60 frames. I do realize, though, that it's nice still being able to use older hardware.


A lot of these parts were hand-me-downs from family and friends. I'm not exactly rich, and even with a job I can barely scrape together enough for food and other expenses, let alone a game console or a nice gaming rig. So I can't complain about my hardware as long as I can play games. I can probably get more fps if I choose 720p Medium.
 
Interesting article, yet... I can't help feeling it's almost like most political stories that get tilted and molded to fit one particular side or view. The reviewer only tested on Ultra. While these test systems were set up, perhaps high, medium, and low quality settings should have been tested as well. Doing so would show the audience what compromises they are making in order to achieve playable frame rates in these games. Additionally, there are other games that the C2Q & C2D still handle just fine. Speaking of Overwatch, my son plays on a C2D E8400 paired with a 750 Ti at 1080p. Is it on Ultra? Certainly not, but it is playable and he enjoys it just the same.

Did you read the article or did you just look at the graphs?

Yeah, I read it twice, and the two places where you call the game unplayable made me laugh. Here's the thing, though: there have been other articles that basically did what you did but kept the expectations reasonable. So to answer the question of your article title, "...does it still have game?", I would say yes; however, you've basically set it up for failure with your testing methodology. Can people game on C2Qs and C2Ds? Yes, they can and certainly do, and I'm pretty sure they're playing more than Minesweeper and Solitaire. Here are links to articles that pair an old C2Q with a GTX 750, and they get playable framerates in BF4, Dirt Rally, Dragon Age: Inquisition, Dying Light, GTA 5, and The Witcher 3.
http://www.hardwaresecrets.com/whic...d-but-high-end-or-entry-level-and-new-part-2/
http://www.hardwaresecrets.com/is-a-high-end-cpu-a-real-need-for-a-gaming-computer/

And for those who like to think that dual channel memory > single channel, and that more RAM > less RAM, here are 2 very interesting articles.
http://www.hardwaresecrets.com/does-dual-channel-memory-make-difference-in-gaming-performance/
http://www.hardwaresecrets.com/does-more-ram-improve-gaming-performance/

How's that saying go? If you judge a fish by its ability to fly....
 
I am running my old Q6600 with a GTX 1050 Ti. It does fine. Obviously I lower settings on the games that are processor hogs. No big deal. If someone runs ultra settings at 1080p on a 10-year-old processor, they get what they get.
 
Interesting article, yet... I can't help feeling it's almost like most political stories that get tilted and molded to fit one particular side or view. The reviewer only tested on Ultra. While these test systems were set up, perhaps high, medium, and low quality settings should have been tested as well. Doing so would show the audience what compromises they are making in order to achieve playable frame rates in these games. Additionally, there are other games that the C2Q & C2D still handle just fine. Speaking of Overwatch, my son plays on a C2D E8400 paired with a 750 Ti at 1080p. Is it on Ultra? Certainly not, but it is playable and he enjoys it just the same.
What motive would Techspot possibly have to make a ten-year-old chip (with extremely limited capabilities in 2017) look worse than it is? If the 6700K is offering 3-6 times the frame rates at Ultra, it will offer roughly the same margins at other settings. Techspot didn't bother doing this because it's a waste of time on a decade-old, irrelevant CPU.

There's nothing to be "tilted and molded" here. There is no "side or view". Facts aren't open to interpretation: The Earth is bigger than the moon... a beam of light is faster than a Porsche...and modern CPUs destroy old ones in all measurable areas at all settings and resolutions.
 
It ran stable at up to 3.1GHz and the results were shown in every graph in the article...
Wow, I glanced over the charts and didn't notice it there; sorry about that, Steve.
Some reader I am.
That being said, I truly wonder how it would compare at similar clock speeds. I know this chip can't handle it, but maybe the Q8400 would get a little closer? God, I would love to see these same benchmarks at 3.5GHz to 3.8GHz... got any dry ice hanging around?
 