The Most Memorable Overclocking-Friendly CPUs

This is one of the things I dislike in the way you test Intel and AMD CPUs. For example, the FX lineup is meant to be tested at 4GHz+, while Intel is more of a 3GHz+ affair. Yet you test them both in a range from 2.5GHz to 4.5GHz. The thing is, the current top AMD CPU, the FX-9590, operates at 4.7GHz stock and boosts up to 5GHz, but this is nowhere to be seen in the frequency-scaling breakdown, since your chart tops out at 4.5GHz.

By all means, please shift the gears by 500MHz for both Intel and AMD, and test CPUs at frequencies from 3GHz to 5GHz. At least it would show everything AMD's top performer can do, even if it's slower than Intel.

Where games favor Intel, it shows. Where games favor single-core performance, it shows again, because Intel has separate cache for each core whereas AMD shares cache between every two cores. I don't expect miracles; I know the reality of things. I'd just like to get the clearest picture of AMD's now three-year-old CPU performance at the highest possible frequency.

I enjoy the way you often state how much you dislike the way we do things and then make ridiculous comments such as "Intel is meant to be tested at 3GHz+ and AMD at 4GHz+". How do you come up with this stuff?

Then, as dividebyzero has pointed out, your entire rant is unjust, as we do test the FX-9590 at its default 4.7GHz with a 5.0GHz boost.

You are confusing, or simply not understanding, the purpose of the CPU scaling tests. Adding another 500MHz, or even 1GHz (if we could), does nothing to improve the scaling picture these tests provide; 2.5GHz to 4.5GHz tells the story we need to know.
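To illustrate the point about scaling sweeps, here is a toy sketch. The fps figures below are made up for illustration and are not real benchmark data; the idea is that a 2.5GHz to 4.5GHz sweep already establishes the per-GHz gain, and another 500MHz would simply extend the same trend line rather than reveal anything new.

```python
# Toy CPU frequency-scaling illustration (hypothetical fps numbers,
# NOT real benchmark results). The sweep establishes the per-GHz gain,
# so a higher test point just extends the same trend.

freqs = [2.5, 3.0, 3.5, 4.0, 4.5]   # GHz test points, as in the article
fps   = [52, 60, 68, 76, 84]        # hypothetical results, linear on purpose

# Least-squares slope: extra fps gained per extra GHz
n = len(freqs)
mean_f = sum(freqs) / n
mean_p = sum(fps) / n
slope = sum((f - mean_f) * (p - mean_p) for f, p in zip(freqs, fps)) / \
        sum((f - mean_f) ** 2 for f in freqs)

print(f"scaling: {slope:.1f} fps per GHz")

# Extrapolating the established trend to a hypothetical 5.0GHz point
predicted_5ghz = mean_p + slope * (5.0 - mean_f)
print(f"extrapolated 5.0GHz result: {predicted_5ghz:.0f} fps")
```

With perfectly linear toy data the extrapolated 5.0GHz point falls exactly on the existing line, which is the reviewer's argument: the extra data point adds nothing the sweep hadn't already shown.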


But that's just it: you clearly don't want to see the 'clearest picture'. Every time you are given the facts in black and white, comments like this crop up.

Where do games favour Intel, exactly? And where do games favour single-core performance? When was the last time we saw a AAA title use only a single core?

I am only writing this in reply to one of your offbeat comments for the trillionth time. Intel delivers superior per-core performance to AMD, and this is why we see Intel offering a distinct advantage in any game that uses anywhere from 1 to 8 threads. Even in a game such as GTA V, which does use all 8 threads of an FX processor, the four-threaded Core i3 is often faster, as those two dedicated cores are just that much more efficient.
 
The Q6600 was my first CPU and that thing is still kicking in a buddy's rig after nearly a decade of abuse! Great read, and the comments are great!
 
What always gets me going in these flashbacks, though, is the pricing, which goes to show how far things have come. I've been hoping for a shakeup in the market to get things speeding along again (looking at you, Zen), but I have trouble being optimistic. I digress...

And that brings me to what I have today: an i7-3770K that I'm proud to say hit the 5GHz mark, and I will go no further than that. In hitting 5GHz, the automatic fan speed controller on my motherboard broke. Luckily, when it broke it got stuck in the 100%-on position; adjusting the fan speed manually in the BIOS results in no change. I don't run an overclock on it at all now, though. The thing is so fast that I don't NEED to OC it. If it starts to show its age in a few years, I'll toy with the idea of running a 24/7 OC.
I must tip my hat to you! The furthest I could reach with my 3770K (which I'm still using) was 4.8GHz. Even at that point, the heat output was climbing so sharply that I couldn't find it realistic or fun to mess with anymore, and I decided to stick with 4.5GHz. Then again, I have a no-name stepping and lack the mighty Costa Rica print, so I think I did fine.

Looking back, it's become so much easier to overclock. I personally don't miss fiddling with uncore and QPI voltages individually anymore.
 
Whatever happened to Nvidia's nForce chipsets and motherboards? They just sort of vanished one day?
 
In a word...Intel.
Nvidia were cruising along until the mostly good (but hot) 680i SLI. Intel decided to change the VRD spec on Penryn (Wolfdale/Yorkfield) at the last possible moment and basically kicked Nvidia's enthusiast chipset sales into the dumpster. BIOS updates were supposed to provide eventual support, but it was a very mixed bag (I had an EVGA 680i SLI and an EVGA 790i SLI FTW at various stages).
Intel then started making noises about not continuing Nvidia's Intel FSB chipset licensing once Intel changed from FSB to QPI/DMI. Nvidia launched what was basically a warmed-over 650i/680i SLI as the 750i/780i SLI, and played out the string with the 790i/790i Ultra SLI knowing that their chipset days were numbered. IGP and AMD chipsets didn't warrant the R&D for Nvidia to continue making chipsets.
Intel made good on its threat to deny Nvidia QPI/DMI access. Cue lawsuits. Intel and Nvidia settled: Intel got assurances that Nvidia wouldn't pursue x86 processors either by hardware implementation or emulation, plus access to Nvidia's chipset IP (since a lot of Intel legacy boards used Nvidia chipsets, as does Intel's own graphics), and Nvidia got access to Intel's non-x86 IP and received $1.5 billion.

It was basically the last phase of an Intel operation that started when Intel began making their own decent chipsets in the Pentium era and destroyed Acer Labs (ALi) and OPTi's chipset business, and eventually killed VIA, SiS, and Nvidia on Intel platforms.
 
That's a shame. I remember having an Nvidia chipset years ago, around when Doom 3 launched, and it was quite good and reliable. Imagine if Nvidia had continued making chipsets; I'd put money on them being pretty epic today.
 
Hard to say, I think. Once Intel got the IGP onto the CPU package with Clarkdale (and then on-die with Sandy Bridge), I think Intel would have squeezed out Nvidia's low end. As for the high end, Intel pretty much caught up on performance by the time P35/X38 launched, and Nvidia's chipsets did have an annoying habit of occasionally corrupting data with their SATA controller.
Having said that, Nvidia's chipsets were king for memory overclocking thanks to the independent clock rates rather than being limited by FSB divider ratios - although they ran damn hot. Those northbridge fans weren't there for show. One thing is certain, if Nvidia had remained in the enthusiast chipset business, Intel wouldn't be charging motherboard vendors $50 for non-existent X79/X99 "chipsets" as they currently manage to do.
 
What, no mention of the K6-2 300?! I took mine to 502MHz with only a modest increase in voltage while keeping the stock fan/heatsink. I really do wish I still had it.
 
I seem to recall running my Pentium 90 a bit higher than normal (100MHz?), but the P166 was my first properly overclocked chip (though nothing that extreme).
 
The best I have ever taken a processor was my AMD Athlon 700MHz, which I took to 1GHz with no additional cooling (using a goldfinger device). My next best overclock was probably the E6850, which I took to about 4.5GHz stable on air cooling alone.
 
I ran my Celeron 300A on an Abit BH6 @ 504 MHz for quite a long time using a Whopper Celery Sandwich HSF and it was rock solid the whole time. One day I decided to show someone how easy it was to set up the overclock and set my speed back to base... after that I was never able to hit 504 stable again and left it at 450. After that I ran most stuff at stock speeds till I built a Dual 733 SMP machine that I ran at 800+ (don't remember exactly what speed - just that 733 was the fastest available chip at the time and I was gonna go faster). That machine was also rock solid and served as both a gaming machine and my local gaming server... till I built my Dual 933 SMP machine which I also OC'd a bit but nothing like the 300A. Next thing I OC'd was an Athlon 2600+ but it didn't get much increase and then after that an E8400 which I only gave a tiny boost. Next was an i7 930 which I ran at 3.8 GHz and is still running at that speed in my daughter's machine. I'm currently still running an i7 2600K @ 4.6 GHz and it's been rock solid the entire time I've had it.
The very first computer I owned was an 8-bit, Z80 3.5MHz (not GHz), 16KB-memory Sinclair ZX Spectrum. It needed a C60 cassette tape to load software at boot. I don't think it was overclockable; in fact, I know it wasn't, since overclocking wasn't a thing in those days.
Anyway, the only CPU I ran overclocked for a long period was my i5 760, up from 2.8GHz to 3.5GHz, which it still runs at today on a Hyper 212+ cooler. My wife uses the PC; it still has a GTX 580 installed and is still capable of very respectable 1080p gaming at quite high settings.
 
There is a processor nobody talks about at all: the Athlon 5350, which comes in at 2.05GHz stock. Well, every single one I've tested (about half a dozen to date) reaches 2.4GHz (+17%) on the stock cooler at stock voltage, and 2.7GHz (+32%) still on that stock, horribly tiny cooler. Other people have reached close to 3GHz (~50%!) on air cooling.
The GPU, again on every single one, reaches a 40~45% overclock on stock voltage and cooler.
 
The best overclocking CPU I've had so far was the Intel Celeron E3200 2.4GHz: it went to 3.2GHz on stock voltage and to 4.2GHz with some adjustments. That's a 75% OC, and it made it better than some Pentium dual-cores back then.
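For what it's worth, the percentage claims in this thread check out. A quick sketch, using the stock and overclocked frequencies quoted in the comments:

```python
def oc_percent(stock_ghz: float, oc_ghz: float) -> float:
    """Overclock headroom as a percentage gain over the stock clock."""
    return (oc_ghz / stock_ghz - 1) * 100

# Figures quoted elsewhere in this thread:
chips = [
    ("Celeron 300A (guaranteed)", 0.300, 0.450),  # the famous 50% OC
    ("Celeron E3200",             2.4,   4.2),    # claimed 75% OC
    ("Athlon 5350",               2.05,  2.4),    # claimed +17% on stock cooling
]

for name, stock, oc in chips:
    print(f"{name}: {oc_percent(stock, oc):.0f}% OC")
```

Running it confirms 50%, 75%, and 17% respectively, matching the numbers claimed above.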
 
AMD Athlon Tbird Tbred - Legendary OC.
And my Intel C2Q Q6600 was fantastic too.
Still running the i7 2600K from 2011 in my current desktop.
 
Sadly... I was never able to overclock my 386SX-16. I did get the math co-processor, though, and an additional 2MB of RAM that was literally a foot long and fit in an ISA slot. Don't forget the sweet 256-color VGA graphics!
 
The newish 4790K chips are very well known for their overclockability, but even with water cooling I don't think you are going to see more than 20% (4.0GHz to 4.8GHz).
 
Socket 370 Celeron (Mendocino) in the plastic PGA (PPGA) package, taken from 466MHz to 1GHz via a voltage-pin mod. In the late '90s it was freaking awesome xD
 
The 300A must be at the top of the list for guaranteed overclockability. In today's era, where whole databases are constructed to extrapolate likely "golden sample" candidates from production batch numbers, it's a far cry from the days when any 300A could produce a 50% OC.

And (not that anyone cares) the 386/25s were the same: anyone submitting a PO for a 386/33 was laughed off the IT floor, as every 386/25 was a /33 just by moving a jumper.
 
Intel Celeron 300A, Intel 486DX2-40, Intel Pentium III 500E, Intel Core 2 Duo E6600, and now an i5 4690K. I've never been tempted to try an AMD processor.
 
I want to add a Barton chip: the AMD Athlon XP 3200+, up to 3300MHz from 2200MHz at 1.8V, on an Abit motherboard with the nForce2 chipset and Crucial Ballistix RAM, 233MHz FSB x 15 multiplier, stable 24/7 on air with a huge copper heatsink and oversized fan. It still works occasionally to this day, but with the case in a horizontal position because the heatsink is too heavy.
 
I made a custom phase-change unit just to tame my 1700+ Tbred, paired with the godlike Abit NF7. I remember the overclock being around 2.6-2.8GHz, which was crazy fast for that architecture.
 
I remember when the shop I bought the Celeron 300A from told me how easy it was to overclock it to 450MHz.
Man, those were the days: tweaking and tweaking, modifying the .ini files and the bat/sys files.
Now I just turn it on and use it... like a coffee pot or toaster oven.
 