A Decade Later: Does the Q6600 Still Have Game in 2017?

I am running my old Q6600 with a GTX 1050 Ti. It does fine. Obviously I lower settings on the games that are processor hogs. No big deal. If someone runs ultra settings at 1080p on a 10-year-old processor, they get what they get.
I'm glad your Q6600 still works for you, but the title is "Does the Q6600 Still Have Game in 2017?", not "Does the Q6600 Still Work in 2017?".

This test was done out of curiosity and fun; no one expected the Q6600 to keep up. It was just interesting to see how badly it would be trounced. Of course it would do better with lower settings, lower resolutions, and on older games, but the point was to see if it could hang with today's tech, not to replicate a scenario from 2007.
 
I never understood all the love for the Q6600. At the time, for most applications you were better off with a fast dual-core. The i7-920 came out a year and a half later with an onboard memory controller instead of the Q6600's hamstrung off-die one. A Nehalem quad running at a reasonable 3.4 GHz won't look very dated.
 
My i5-750 was pushing BF1 at 1080p with an AMD R7 360 on low at around 30 fps, with the occasional dip.

I tried putting an RX 480 in and updated the drivers, but on low I got around 14 fps with lots of stalls.

So I upgraded... all of it.

But hey, it lasted since 2009.
 
I never understood all the love for the Q6600. At the time, for most applications you were better off with a fast dual-core. The i7-920 came out a year and a half later with an onboard memory controller instead of the Q6600's hamstrung off-die one. A Nehalem quad running at a reasonable 3.4 GHz won't look very dated.
You buy what's available sometimes. Though, running my Q9550, I'm curious how it would do overclocked to 4 GHz!
 
I'm sorry, but this is ridiculous. Two games? Also, why does modern gaming always have to be about hard-to-run stuff? The VAST majority of games coming out do not require higher-end hardware to run and look nice. Not everyone cares about the latest AAA games.
 
I wonder how much better my old collector's/sentimental, then-overkill rig would perform compared to this setup. Here are the specs for that old PC, which I still have and will keep as my retro machine; it was my first overkill rig, and it put me in debt for nearly two years afterwards, but it was sure worth it at the time, lol. The only upgrade since then is an enterprise-class SSD, a 960GB Toshiba HK3R2 running in IDE mode. Even without TRIM, this SSD will last a LONG time thanks to its far superior write endurance over consumer-class SSDs; I bought a few of them cheap locally from a business closing its doors for good. I still have the original WD Raptor HDDs I bought back then, which I ran in RAID 0 for the best performance available in the age before SSDs.

Too bad my old 30-inch 2560x1600 Dell UltraSharp finally gave out. I may try replacing its boards someday to keep everything original to the same era (aside from the SSD). It sure was sweet at the time to max out modded Oblivion and be among the few thousand gamers on Earth who could play Crysis at decent FPS, nearly maxed out, when it first came out, lol. I may try both of my GTX 980 Tis in SLI and see what it can do, but that may just be too much for the old beast.

CPU: Intel Core 2 Extreme QX6700 w/Zalman CNPS9700 NT @ 3.3 GHz
Mobo: EVGA nForce 680i SLI A1
RAM: 8GB Corsair Dominator DDR2 1066 (4x2GB sticks)
GPU(s): XFX 8800 GTX in SLI
PSU: Enermax Galaxy 1000W
Sound: Creative Sound Blaster X-Fi Platinum Fatal1ty Champion
OS: XP Pro/7 Ultimate x64 on a 960GB Toshiba HK3R2 in IDE Mode
Case: Cooler Master Stacker 830 Nvidia Edition
KB/M: Logitech G15/G5 (original version of G15)
 
My brother-in-law is still running my old Q6600 system with a GTX 750 for Heroes of the Storm and general computing. It manages 60 fps on medium settings most of the time but sometimes struggles along at 20-30. I'll need to tell him not to bother sticking in a new GPU and to invest in a new CPU instead. Thanks :)

I have the same config, and play The Witcher 3 and Rise of the Tomb Raider all right ;)
(but I still need to test Mad Max and Arkham Knight)
 
I have the same config, and play The Witcher 3 and Rise of the Tomb Raider all right ;)
(but I still need to test Mad Max and Arkham Knight)
Good luck running Arkham Knight, as it's rather demanding. That being said, my i5-2500K and GTX 770 did OK with said title.
 
I have the same config, and play The Witcher 3 and Rise of the Tomb Raider all right ;)
(but I still need to test Mad Max and Arkham Knight)
My brother-in-law informed me recently that he upgraded it to a Q9650. I doubt it would have made all that much difference to the ageing machine :)
 
I can't say too much about the Q6600, since I don't have one. Considering its age, I think it is amazing what it CAN do in 2017. However, I see a lot of people recommending it for budget builds, and that just leaves me wondering why... I run a Q9550, and even that sometimes struggles with a title or two.

That being said, my Q9550 with 8GB of DDR2 RAM and a 1050 Ti can run the latest DOOM at 1080p Ultra at around 50 to 60 FPS, rarely dropping into the 40s. Not as good as a modern CPU, but definitely a playable experience for someone on a tight budget. The only game I've tried so far that doesn't run well on my setup is Hellblade. It was playable, but not the smoothest experience. Luckily, I found the game boring anyway, so it wasn't a huge blow to me.

After taxes, I'll probably rebuild my system with a new mobo, RAM, and probably that Pentium everybody raves about for modern budget builds. I'll keep my 1050 Ti and PSU, since the whole point of buying those was to upgrade in the future. My Q9550 was just my foot in the door, and is definitely enough to keep me satisfied for a little while.

My brother-in-law informed me recently that he upgraded it to a Q9650. I doubt it would have made all that much difference to the ageing machine :)

It very well could have. I built my wife a cheapo computer (all used parts) so she could play Elder Scrolls Online. The idea was for it to play this single game better than her laptop could. Originally it ran a Q8200 @ 2.3 GHz, as that is what was on the mobo when we bought it. It performed well enough to play the game, but it would stutter significantly whenever things needed to be loaded quickly (i.e., when her character was sprinting). So, just for the hell of it, I threw in a Xeon X3323 @ 2.5 GHz (the whole 771-to-775 mod) that I had lying around. Now the game doesn't stutter at all and plays smooth as silk.

I doubt the Q9650 would make it much more capable of playing the latest and greatest titles (maybe a small performance increase). But he may have noticed a difference in the games he plays more frequently. I guess you'd just have to ask him whether it made any really significant difference, if you haven't done so already.
 
This is actually "shocking", and TBH I didn't expect the review to be that negative. I had my old Q6600 stored for "emergencies" and never planned to use it again. But what happened is that my wife's board smoked up, and the only alternative we had was her Win10 laptop (with an i3, meaning a relatively modern CPU, and 8GB of memory) but some low/mid-range Radeon. The result: she can NOT play even halfway demanding games, like SWTOR, on the laptop.

So I dug out the ancient Q6600 system (which I run overclocked to 3.5 GHz), and on that board with a GTX 970 she can play SWTOR and whatever else she plays without problems. In other words: in THIS CASE, even an ancient CPU is "OK", at least until I can replace the board. I even have a MUCH older system still serving as my HTPC, with some Athlon 64 (lol, I know). I can tell you I could throw a modern GPU in it (say a GTX 970, GTX 660, etc.) and still play less demanding games like WoW halfway "OK".

I would not in any way recommend that anyone TODAY build a Q6600 system (that would be absurd), most importantly because i3/i5/i7 systems use MUCH less power, especially at idle and during things like movie watching. But it's not THAT bad... and watching my wife playing right now, there is no "obvious" difference to her between the i5-3570K and the very old Q6600. I'd rather put together an emergency system with an old CPU and a modern GPU than a system with a new CPU and some poor GPU. The first will definitely be better for gaming; the second would be a stutter fest. I am actually surprised the Q6600 still does an "OK" job. I had totally forgotten about this and at first thought it would not be worth it at all to build a temporary system for my wife from it.
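For anyone curious what that idle-power gap actually costs, here is a minimal back-of-the-envelope sketch in Python; the wattages, daily hours, and electricity price are illustrative assumptions, not measurements from the article or this thread.

# Back-of-the-envelope idle power cost comparison. Every figure
# below is an assumed illustration, not a measured value.
IDLE_WATTS_OLD = 110    # assumed: Q6600-era system at idle
IDLE_WATTS_NEW = 40     # assumed: modern i3/i5/i7 system at idle
HOURS_PER_DAY = 8       # assumed daily on-time
PRICE_PER_KWH = 0.12    # assumed price in USD per kWh

def annual_cost(watts: float) -> float:
    """Yearly electricity cost for a constant draw at the given watts."""
    kwh_per_year = watts * HOURS_PER_DAY * 365 / 1000
    return kwh_per_year * PRICE_PER_KWH

old, new = annual_cost(IDLE_WATTS_OLD), annual_cost(IDLE_WATTS_NEW)
print(f"old: ${old:.2f}/yr, new: ${new:.2f}/yr, saved: ${old - new:.2f}/yr")

Under these assumptions the gap is roughly $25 a year, which is real money over a system's life but hardly decisive for a temporary emergency build.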
 
This test is somewhat flawed, because someone still using a Q6600 for PC gaming isn't flipping everything to 1080p ultra. Benchmarking at 1080p medium/low would have made more sense. A buddy of mine still uses a Q6600 for PC gaming, running Killing Floor 2, Payday 2, and Doom 2016, but he's certainly not putting them on prohibitive settings.
 
I wonder how my i7-920 would do with similar specs and tests...
Quite well, especially if overclocked. I have an i7-960 @ 4.1 GHz paired with a 980 Ti that keeps up with my Haswell and Skylake systems equipped with a 980 Ti and a 1070. It might lose a few fps here and there, but in practice it is unnoticeable.
 
This whole article is funny, coming from someone who had a Q6600 until last year; I had it for 2 years. I ran Overwatch, Rocket League, and other games just fine. Overwatch I played at 1600x900 on nearly max settings and got between 39 and 60 fps; Rocket League at 1600x900 maxed, 50 fps. My specs were: Intel Q6600 @ 2.4 GHz, 4GB DDR2 memory, Win7 64-bit, PNY GTX 550 Ti.

That's funny, 'cause I had Blizzard refund my money on Overwatch when my Q9550 wouldn't play it at all. During the beta, the game ran well. But sometime after the beta and before full release, they dropped support for C2Q processors. Everything else on my machine met the requirements but the C2Q. They were nice enough to refund my money since it had only been 24 hours since I purchased it. Long story short, I'm having a hard time seeing your Q6600 playing Overwatch at all, no matter the other specs of the machine.
 
Like everyone else who read this article, I came to the obvious conclusion that Intel has been at nearly a full stop since the 2500K. What was missing from the benchmarks, and what would have slammed this point home, would have been taking the 2500K to 4.5 GHz (feel free to bump the 6700K for comparison too); you'd see that at that point you are GPU-bound at any playable resolution.

Memory advances are technically great but the performance difference between DDR3 and DDR4 in the real world is nothing.

Even with AMD's Ryzen coming out, I don't know if there's any reason to upgrade from an i5-3570K at 4.4 GHz. GPUs are the new CPUs, and those are bound to hit the same fate soon too. They keep getting bigger and bigger, but there are limits and we are getting close.

It's been that way since long before the 2500K; even Bloomfield can keep up just fine.
 
This whole article is funny, coming from someone who had a Q6600 until last year; I had it for 2 years. I ran Overwatch, Rocket League, and other games just fine. Overwatch I played at 1600x900 on nearly max settings and got between 39 and 60 fps; Rocket League at 1600x900 maxed, 50 fps. My specs were: Intel Q6600 @ 2.4 GHz, 4GB DDR2 memory, Win7 64-bit, PNY GTX 550 Ti.

That's funny, 'cause I had Blizzard refund my money on Overwatch when my Q9550 wouldn't play it at all. During the beta, the game ran well. But sometime after the beta and before full release, they dropped support for C2Q processors. Everything else on my machine met the requirements but the C2Q. They were nice enough to refund my money since it had only been 24 hours since I purchased it. Long story short, I'm having a hard time seeing your Q6600 playing Overwatch at all, no matter the other specs of the machine.
Were you playing at 1600x900?
 
I have an old i5-2500K paired with a new GTX 1070 and 8GB of DDR3, and I can still play most new titles with all the bells and whistles turned on at 1080p resolution.

Thinking about making the leap to Kaby Lake, but I'm rather reluctant to shell out for a new i5, motherboard, and DDR4 considering the cost of memory.

Not worth it.
 
These results are NOT TRUE whatsoever. I have an EVGA 680i SLI motherboard with 4GB of RAM, a Q6600 (not even overclocked), and an ASUS GTX 1050 Ti OC Dual with a 27-inch 1920x1080 monitor. It can play DOOM (2016) at ultra settings, everything maxed out, at a locked 60 FPS. Many other newer titles also play very well on this rig. So I would like to dismiss these results as fudged...
 
These results are NOT TRUE whatsoever. I have an EVGA 680i SLI motherboard with 4GB of RAM, a Q6600 (not even overclocked), and an ASUS GTX 1050 Ti OC Dual with a 27-inch 1920x1080 monitor. It can play DOOM (2016) at ultra settings, everything maxed out, at a locked 60 FPS. Many other newer titles also play very well on this rig. So I would like to dismiss these results as fudged...

Post validated benchmarks, or it didn't happen.
 
Post validated benchmarks, or it didn't happen.
That's only 4GB of graphics memory and what I assume to be 4GB of DDR2, combined with a 2.4 GHz CPU. Whilst said specs may well be capable of running some games at 1080p with most everything maxed, I would like to see the benchmarks too. ;)
 
OK. Anyone can YouTube a Q6600 and a 1050 Ti and see that it plays newer games quite well. Of course, not everything will be flawless. But the problem here is that every system will act a little, or a lot, differently. I saw a video on Gamers Nexus that showed a G4560 and a 1080 Ti where the frame rates were just so low and choppy that they couldn't figure out why it was happening. But as soon as they matched the G4560 with a more reasonable card like a 1060, the frame rates were pretty good: 60 fps average. So what I think is going on is that maybe the system or board they are using to pair the Q6600 with a 1060 or 1070 is causing the same results.
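For what it's worth, a toy frame-time model makes pairings like these easier to reason about: each frame needs some CPU work and some GPU work, and the slower of the two sets the frame rate. The per-frame millisecond figures below are made up for illustration; real games are messier, and this model can't explain an outright regression like that G4560/1080 Ti case (that points at drivers, the board, or background load), but it does show why a stronger GPU stops adding fps once the CPU is the ceiling.

# Toy bottleneck model with illustrative, made-up per-frame times.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate fps when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=25.0, gpu_ms=12.0))  # slow CPU, mid-range GPU: ~40 fps
print(fps(cpu_ms=25.0, gpu_ms=6.0))   # slow CPU, flagship GPU: still ~40 fps
print(fps(cpu_ms=7.0, gpu_ms=12.0))   # fast CPU: ~83 fps, now GPU-bound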
 