A Decade Later: Does the Q6600 Still Have Game in 2017?

This is as true a statement as it is a naive one. Testing both a 1080 and a 750 Ti at the same Ultra settings will give you an objective performance comparison between the cards, but it won't tell the relatively poor Indian, Chinese or Eastern European 750 Ti owner whether he should invest in a new game. To him, such a test looks like one performed by a sellout Intel/nVidia wh*re of a journalist. So he goes to YT to look for "GTA 5 on a Q6600 and 750 Ti", and then it turns out this whole article might be a scam sponsored by Intel.

If you checked on YouTube you would just find a heap of videos like this...

I don't see the settings for either video, but I know for a fact that my economically challenged younger brother got two playthroughs out of a secondhand Witcher 3 on my old Q9550 2.83GHz and HD 6850 at minimum/medium settings @ 720p at some 24-32fps, so a very "consolish" experience, and then he successfully sold the game on to a third person, the whole experience costing him some 10 bucks. Your head would probably explode trying to play the game like that, but he's an old fan of Sapkowski's book saga and was happy as a child (which he technically is...)
 
So it's "pointless and inaccurate" to see if the Q6600 can tackle the latest games using modern graphics cards? How exactly is that inaccurate...

The point of the article wasn't to see if the Q6600 can handle low-end games using entry-level settings. If it were, we would have tested at 1600x900 with medium to low quality settings.

Q6600 2007 + 1060/1070 2017 = Bottleneck. What did you expect was going to happen?
The headline reads: "Does the Q6600 Still Have Game in 2017?"
To answer that question: yes, it does, but only if you use the right setup. And that setup is not a 1060 or 1070 card; a 550 Ti or 750 Ti would be the best card to use with a Q6600, or even a GTX 970, which interestingly can max out Doom at ultra settings with no problems on a Q6600. So to me it's a pointless, inaccurate article. It contradicts what the headline reads.

Then there's "We set out to discover if the decade-old Core 2 Quad Q6600 could cut the mustard in 2017, and the answer is a resounding no." Well, it can, as I mentioned, if your setup is right, with the right card and the right drivers, and you game at high or max settings at a lower resolution like 1600x900 or 1920x1080.
 
LGA775 was a legendary platform. I had an E6600 and then got a cherry-picked Q6700 sample, the model above this one. I overclocked the bejesus out of them. Good times. Especially when you could buy a super-budget Pentium E2140 at 1.6GHz and easily make it run at 3.2GHz+, performing like a chip five times its price...

I quickly moved on to LGA1366 and an i7 920 in late 2008 though, because that was a genuine leap. It was the best part of 30 percent faster clock for clock than the Core 2 Quads, with 8 threads and an integrated DDR3 memory controller with three channels, meaning huge memory bandwidth.
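
For a rough sense of how big that bandwidth leap was: peak bandwidth is just channels × transfer rate × bus width. Here's a quick back-of-the-envelope sketch using the standard published figures (theoretical peaks only; real-world throughput is lower):

```python
# Theoretical peak memory bandwidth: channels * transfer rate (MT/s) * bus width.
def peak_bandwidth_gbs(channels: int, mega_transfers: int, bus_bytes: int = 8) -> float:
    """Peak in GB/s; each DDR channel is 64 bits = 8 bytes wide."""
    return channels * mega_transfers * 1e6 * bus_bytes / 1e9

# i7 920: triple-channel DDR3-1066 on an integrated memory controller.
print(peak_bandwidth_gbs(3, 1066))  # ~25.6 GB/s

# Core 2 Quad: all memory traffic funnels through the 1066 MT/s, 64-bit
# front-side bus, which caps bandwidth no matter how fast the DIMMs are.
print(peak_bandwidth_gbs(1, 1066))  # ~8.5 GB/s ceiling at the FSB
```

Roughly a 3x jump in theoretical peak, which is where the "huge memory bandwidth" point comes from.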

I kept that machine around until last year, still doing odd jobs, until the X58 board sadly failed. Eight years and tens of thousands of hours of solid running ain't bad for the stress I put it under! I bet that platform and chip hold up a lot better than the Core 2 today because of the aforementioned modern features. I know that if you overclocked the old i7 920 it was still competitive, albeit hot and power hungry. Test one at, say, 3.6GHz and insert the results here please!!
 
2.83GHz Intel Q9550 @ stock
Gigabyte GA-EP45-UD3P rev. 1.1, BIOS rev. F9
8GB (4x2GB) 1066MHz Corsair Dominator & G.Skill DDR2 RAM
6GB EVGA GTX 1060 primary, 2GB EVGA GTX 550 Ti for PhysX only
320GB, 500GB, 1TB Western Digital HDDs
16x Sony DVD+/-RW, Mr. Floppy
Sound Blaster X-Fi
Antec TPQ 850-watt PSU
Win7 Pro x64
Built in June 2009, been going strong ever since.
Plays everything currently at Ultra detail settings @ 1080p; DOOM runs at 49-110+ FPS.
Load times on some games like BF1 can get a little lengthy, but the game runs like a champ with very few hiccups/stutters even on a 64-player server. An SSD would probably help the load times.
Total War: Rome II, The Witcher 3, Fallout 4, RB6 Siege + 4K texture pack, HITMAN, Killing Floor 2
 
The Q9550 has that glorious 12MB cache. It originally had 4GB of RAM and a 1GB EVGA GeForce 9800 as the primary card, with another of the same exact card for PhysX only. Seems like a waste, but it typically pulled better minimum FPS, and anything running PhysX in-game was buttery smooth, even more so now with the 550 Ti running that duty.
 
When I saw the benchmarks on the i5 2500K six years ago, I knew it was a rare gem that appears once every few tech generations. I'm still running that chip happily today and won't be upgrading again for the next few years.

I'll see your 2500K and raise you one generation. I'm running an i7 920 @ 3.8, no worries. There was a big jump from Core 2 Quad to Nehalem, then baby steps ever since. And as I suspected, it's better to keep a 2500K and get a GTX 1070 than to get a 6700K + GTX 1060. Less money too.
 
I still run two systems with a Core 2 Quad Q6600. I use them for League of Legends, but any games beyond that run at about 30 fps no matter what graphics card I put in them. One of the systems has a GTX 580, and even though it's an old GPU, the CPU still bottlenecks it. I use the Q6600 because they're only $13-$15.
 
Q6600 2007 + 1060/1070 2017 = Bottleneck. What did you expect was going to happen?

Naturally we expected a serious CPU bottleneck to occur. What we wanted to show was how the Q6600 compares using modern GPUs in modern games to more modern dual-core and quad-core processors. Looking at the feedback so far it seems like 99.9% of the readership had no trouble understanding this and were very interested in the results.

The headline reads: "Does the Q6600 Still Have Game in 2017?"
To answer that question: yes, it does, but only if you use the right setup. And that setup is not a 1060 or 1070 card; a 550 Ti or 750 Ti would be the best card to use with a Q6600, or even a GTX 970, which interestingly can max out Doom at ultra settings with no problems on a Q6600. So to me it's a pointless, inaccurate article. It contradicts what the headline reads.

Right… so let me try and wrap my head around this logic. The Q6600 can indeed play modern games if you use a slower graphics card, like the 550 Ti or 750 Ti. The GTX 1060 is just too fast. Makes sense. So can gamers expect those minimums in Battlefield 1, Gears of War 4 and Total War: Warhammer to increase with a much lower-end graphics card? Before you answer with something ignorant, please be aware that even with the lowest possible quality settings enabled, the minimum frame rates shown in this article didn't improve with either the 1060 or the 1070.

Then there's "We set out to discover if the decade-old Core 2 Quad Q6600 could cut the mustard in 2017, and the answer is a resounding no." Well, it can, as I mentioned, if your setup is right, with the right card and the right drivers, and you game at high or max settings at a lower resolution like 1600x900 or 1920x1080.

Again please let me know the correct drivers and settings for playable performance in say… Battlefield 1 with a Q6600 :D

Really, it's like you didn't even read the article, as we addressed many of the points you raised.

“Pairing the Q6600 with either the GeForce GTX 1060 or GTX 1070 is a bad idea. Even the GTX 1050 and RX 460 seem like overkill. Given how much faster the Haswell dual-core Pentium processor was, we see no reason why anyone would bother with a Core 2 series system.”

“It's still possible to play games such as Rocket League and Dota 2 on a Core 2 Quad, so casual gamers will still find a use for them; however, that won't require a powerful GPU, so the combination is still not justified. Given the excessive power consumption and other shortcomings, we would suggest trying to pick up a more modern secondhand PC. Also remember that Kaby Lake Pentium processors now feature Hyper-Threading, essentially making them lower-clocked Core i3 processors, so there's even less incentive to use older quad-cores.”
 
Fwiw, "got game" is the cliché.

Thanks for finding an excellent method of restating what a goldmine the 2500K still is. The best money, bar none, I ever spent. I'll repeat myself, like another poster above, and remind you that a 'K runs @ no less than 4GHz on every enthusiast-grade or better mobo.

With a brainless, factory-cool, one-button OC offering itself for real-world comparison, it's a fair guess that 'average' Joes were able to get that performance value. Mine has run a 'light' 4GHz OC for 5 years, 24/7, and is still bangin'. (And saving CPU money by skipping the endless tick-tock cycles goes a long way toward paying for a 1070, even a 1080, IMHO.) Put another way, have you heard of a 2500K that was unable to attain a 4GHz / >20% OC at install on a competitive mobo?
 
When I saw the benchmarks on the i5 2500K six years ago, I knew it was a rare gem that appears once every few tech generations. I'm still running that chip happily today and won't be upgrading again for the next few years.
My main PC is an i5-2500K as well (using it to write this now!). It's conservatively overclocked in terms of voltage (and kept below 60C under load), as I intend to keep it a long time, and in that configuration it's a dead match for the stock i7-7700K (as reported by PassMark) in both overall and single-core scores (~12k overall, ~2600 single-core).

Yes, I'm comparing an overclocked CPU to a stock-clocked one, but how many games in the next few years will be written to need more CPU power than the stock i7-7700K can provide? I'm guessing none of them, and if that's true, my i5 should do just fine.
 
That's unplayable. Anything under 100 FPS is laggy in a competitive shooter.
Come on! First it was 30, then it was 60, now it's 100. What will it be in four years, 150? Anything over 60 is not laggy, I don't care what type of game you're playing. If you're experiencing lag at 60 FPS, your problem doesn't have anything to do with frame rate. It's more likely ping causing issues than FPS.
 
So to me it's a pointless, inaccurate article. It contradicts what the headline reads.
You're funny! How you choose to interpret the headline is up to you, but others could claim your choice to interpret it as "Can you game with a low-end graphics card and the Q6600 in 2017?" is just as inaccurate. Stop trolling.

On another note, I'm still running my i7 2600K @ 4.7GHz and will probably put the next graphics card I buy in it too, unless there's evidence that it will bottleneck it (currently 2 x AMD 7970 GHz Editions in CrossFire).
 
I'm not trolling. I read the headline and answered the question: does the Q6600 still have game in 2017? And the answer is yes. Obviously, if you're using a 2007 CPU and a brand new 2017 GPU, things will not go smoothly no matter what settings you run the game at. That's really the only point I was making.

Besides that, @CodeCmdr, some people like to game at extreme settings, higher than 1920x1080, at 100fps. That's them, not me. I enjoy playing my games just fine at 1366x768 or 1600x900, and at 1920x1080 high to max, and a solid 30 to 60fps is just fine.
 
I'm not trolling. I read the headline and answered the question: does the Q6600 still have game in 2017? And the answer is yes. Obviously, if you're using a 2007 CPU and a brand new 2017 GPU, things will not go smoothly no matter what settings you run the game at. That's really the only point I was making.

Besides that, @CodeCmdr, some people like to game at extreme settings, higher than 1920x1080, at 100fps. That's them, not me. I enjoy playing my games just fine at 1366x768 or 1600x900, and at 1920x1080 high to max, and a solid 30 to 60fps is just fine.

If this article were just a headline, then I guess your comments would be justified, somehow?

Anyway, big John has stepped in and acknowledged your tap; you can walk away safely now :D
 
Interesting read

I have two LGA 775 rigs with me and they are still running to this day.

One rig is powered by the QX6800, which was a $1,000 CPU back in its heyday and now only cost me $50 to get. I paired it with a Gigabyte EP45-UD3P mobo, 8GB of DDR2 800MHz RAM and a 2GB GTX 950 GPU, overclocked to 3.2GHz from 2.93GHz.

At 1080p medium in games like Battlefield 1, I get at least 30fps. If my OC were a lot more aggressive, I could probably extract a few more fps, but the board is a hand-me-down. My cooler is still up to the job: this is a 130W TDP CPU, and even the famous Cooler Master Hyper 212X only keeps it at 60-70 degrees during high loads.

The other is the Q6600 HTPC, paired with a G41 board and 4GB of DDR3 1333MHz from old HP PCs. With the DDR3, I get a stable 3GHz OC on the Q6600 without tweaking any voltages, and it's still on the stock Intel cooler that came with the Q6600, the one with the copper center. HDMI output was solved with an HD 5450 1GB DDR3, and it dual-boots Windows 10 and Windows XP. This is my retro gaming PC for games like MechWarrior 4 and Need for Speed: Underground.
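
For anyone wondering why 3GHz is the classic "free" Q6600 overclock: the arithmetic is just core clock = FSB × multiplier, and the Q6600's multiplier is locked at 9x, so all the headroom comes from the FSB. A minimal sketch using the standard published figures (nominal clocks, rounded):

```python
# Core 2-era clocks: core_clock = FSB (MHz) * multiplier.
def core_clock_ghz(fsb_mhz: float, multiplier: float) -> float:
    return fsb_mhz * multiplier / 1000

print(core_clock_ghz(266, 9))   # ~2.4 GHz - Q6600 stock (266 MHz FSB, locked 9x)
print(core_clock_ghz(333, 9))   # ~3.0 GHz - the easy OC: raise the FSB to 333 MHz
# The QX6800 above is an Extreme Edition with an unlocked multiplier,
# so 2.93 GHz (11 x 266) can reach ~3.2 GHz (12 x 266) without touching the FSB.
print(core_clock_ghz(266, 12))  # ~3.2 GHz
```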
 
I'm not trolling. I read the headline and answered the question: does the Q6600 still have game in 2017? And the answer is yes. Obviously, if you're using a 2007 CPU and a brand new 2017 GPU, things will not go smoothly no matter what settings you run the game at. That's really the only point I was making.

Besides that, @CodeCmdr, some people like to game at extreme settings, higher than 1920x1080, at 100fps. That's them, not me. I enjoy playing my games just fine at 1366x768 or 1600x900, and at 1920x1080 high to max, and a solid 30 to 60fps is just fine.
Have game/got game: to me this means "Is it a great performer?" or "Is it an impressive performer?", and the answer in general seems to be no. It's OK performance, and you "can game" with it at lower resolutions happily. Hell, I play some games on a Surface Pro 3 with its Intel HD 5000 on-die graphics at low settings at times, so I'm not against playing things on non-top-end systems and dropping the graphics quality. I just think calling the article inaccurate is pretty much, well, inaccurate.
 
Still using my PC tower with the Q6600 in it. It does struggle: I can't play any 4K video on it (even YouTube). I've maxed out the RAM to 4.25GB, put in a better graphics card and installed Windows 10 64-bit, and it still won't play it. It does everything else I throw at it, but it's slow; disc rendering runs the CPU at 100%.
But it still works! I'm gonna look into a custom PC this year though.

How is watching videos at 1080p/1920x1080? If you're trying to watch videos at 4K, remember the Q6600 came out in 2007, just saying. But you shouldn't have any problems watching videos at 1080p, and IMO I don't see any freaking difference between that and 4K; I have seen video at 4K and I see no difference. I hardly see any "major" difference between 720p and 1080p either.

Watching 1080p video is not a problem (even high bitrate or Blu-rays), but 4K video just freezes the screen (whether downloaded or streaming) while the audio keeps playing. Ironically, my three-year-old £100 Advent Vega Note 7 tablet can play 4K with no issues. That's progress!
 
I ran Overwatch ... just fine. I played Overwatch at 1600x900 with nearly max settings and got between 39 and 60fps.

That's unplayable. Anything under 100 FPS is laggy in a competitive shooter.

Completely depends on your monitor. If its refresh rate is limited to 60Hz, then 100FPS will be just as "laggy" on your screen as 60FPS.

Of course, "laggy" has much more to do with your connection (ping) than with how many frames your monitor can show. You could be rocking 144FPS on a 144Hz screen...but if your ping zips up to 500ms, your pretty display won't prevent you from lagging badly.
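
To put rough numbers on that: frame times and refresh intervals are just the reciprocal of the rate, and both are tiny next to a bad ping. A quick sketch (no game-specific assumptions, just the arithmetic):

```python
# Interval in milliseconds between frames/refreshes at a given rate.
def interval_ms(rate_hz: float) -> float:
    return 1000 / rate_hz

print(interval_ms(60))   # ~16.7 ms between frames at 60 FPS
print(interval_ms(100))  # 10.0 ms at 100 FPS
# A 60 Hz panel only shows a new image every ~16.7 ms, so frames rendered
# faster than that are never all displayed.
ping_ms = 500  # a bad connection
print(ping_ms / interval_ms(100))  # a 500 ms ping spans ~50 frames at 100 FPS
```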
 
So you want to take load off the GPU?

I presume some of the settings also have a CPU or memory bandwidth cost, so if it means higher overall framerates then absolutely.

Anyone who has kept a Q6600 this long, and I'd probably be one of them if mine hadn't been struck by lightning, isn't interested in maximum graphical quality but in having the game run smoothly enough not to distract from the gameplay.

If it can run at the native resolution of a common screen (1920x1080/1200) using graphics settings that don't completely change the look of the game, then it's perfectly acceptable. The ultra benchmarks are interesting, but doubts about the CPU/memory cost of those settings mean that, as practical advice, the article's conclusion is uncertain.
 
So you want to take load off the GPU?

I presume some of the settings also have a CPU or memory bandwidth cost, so if it means higher overall framerates then absolutely.

Anyone who has kept a Q6600 this long, and I'd probably be one of them if mine hadn't been struck by lightning, isn't interested in maximum graphical quality but in having the game run smoothly enough not to distract from the gameplay.

If it can run at the native resolution of a common screen (1920x1080/1200) using graphics settings that don't completely change the look of the game, then it's perfectly acceptable. The ultra benchmarks are interesting, but doubts about the CPU/memory cost of those settings mean that, as practical advice, the article's conclusion is uncertain.
I had a QX6850 (stock-clocked for the last 4-5 years of its life) right up until last summer. Paired with a GTX 970, it handled even the newest titles just fine at 1080p (it stuttered a bit over GameStream though...)
 