CPU overclockers hit a ceiling, and it happened about 17 years ago

Shawn Knight

Staff member
The big picture: Intel recently invited two leading overclockers to its headquarters to discuss achieving the world's first 9 GHz CPU overclock. The duo of Pieter-Jan Plaisier (SkatterBencher) and Jon Sandström (ElmorLabs) also looked back at CPU frequency overclocking milestones along the way, highlighting an obvious wall that was hit roughly 17 years ago.

Record-high CPU overclocks feel like a pretty common occurrence, but according to the data, that is far from the case. A chart accompanying the overclockers' presentation highlights the first chip to reach 1 GHz – AMD's legendary Athlon 650 MHz – way back in October 1999.

It would take about 14 more months for overclocking enthusiasts to breach the 2 GHz barrier, and just another nine months to crack 3 GHz. The 4 GHz mark was crossed in early 2002 and roughly 16 months later, the first 5 GHz overclock was achieved.

With the odds in their favor (and Intel Pentium 4 chips in their motherboards), overclockers did not look back. The world's first 6 GHz overclock came in May 2004, the 7 GHz wall came down in August 2005, and 8 GHz was achieved in early 2007. Then, it all came to a screeching halt.

It wouldn't be until the end of 2022 – more than 15 years later – that the 9 GHz barrier finally came down. The current world record, 9,117 MHz set by Elmor with a Core i9-14900KS, could very well be the new ceiling for the foreseeable future. And that elusive 10 GHz mark? Well, good luck with that.
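The plateau is easier to appreciate with the intervals laid out as data. Here is a minimal sketch using the milestone dates quoted above; where the article only gives a season ("early 2002", "end of 2022"), the specific months below are approximations, not sourced figures:

```python
from datetime import date

# Frequency milestones (GHz) with approximate dates from the article.
milestones = [
    (1, date(1999, 10, 1)),  # AMD Athlon 650 pushed to 1 GHz
    (2, date(2000, 12, 1)),  # ~14 months later
    (3, date(2001, 9, 1)),   # ~9 months after that
    (4, date(2002, 2, 1)),   # early 2002 (month approximated)
    (5, date(2003, 6, 1)),   # ~16 months later
    (6, date(2004, 5, 1)),   # May 2004
    (7, date(2005, 8, 1)),   # August 2005
    (8, date(2007, 1, 1)),   # early 2007 (month approximated)
    (9, date(2022, 12, 1)),  # end of 2022 (month approximated)
]

# Months elapsed between each consecutive pair of milestones.
for (ghz_a, d_a), (ghz_b, d_b) in zip(milestones, milestones[1:]):
    months = (d_b.year - d_a.year) * 12 + (d_b.month - d_a.month)
    print(f"{ghz_a} GHz -> {ghz_b} GHz: {months} months")
```

Every gap through 8 GHz fits inside a year and a half; the final gap is roughly 16 years, which is the wall the presentation highlights.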

That's not to say that CPUs have not improved since the mid-2000s. Today's chips are way faster and more efficient than CPUs of yesteryear, and base clock speeds on air cooling are significantly higher than what you would get with a retail chip a decade and a half ago. But in terms of sheer clock speed on a single core with liquid nitrogen or liquid helium cooling, it seems the ceiling has not budged much.

 
We could easily design a chip that could break 10, even 20 GHz. The thing is, the chip would be basically useless for anything. Oh, wait, I forgot, these chips are cooled with exotic sub-zero cooling and it doesn't matter.

Can we start working on photonics? I want chips in the terahertz range.
 
Pentium E2160.

What a chip that was. Nearly doubling the clocks was pretty easy, and you'd have a budget CPU matching up to higher-end SKUs at three times the price. Matter of fact, all the Conroe parts and derivatives were stupendous, looking back.
 
Wow, this is crazy! I thought CPU overclocking was still going strong, but apparently we hit a wall over a decade ago. Makes me wonder what other tech plateaus we might not even realize we've hit. Gotta give props to those overclockers who just smashed the 9 GHz barrier though, that's nuts!
 
Pentium E2160.

What a chip that was. Nearly doubling the clocks was pretty easy, and you'd have a budget CPU matching up to higher-end SKUs at three times the price. Matter of fact, all the Conroe parts and derivatives were stupendous, looking back.
I haven't heard the name "Conroe" in easily 5 years, maybe 10. But those things took the CPU market not only by storm, but by complete surprise. I still received Tiger Direct catalogs back then.
 
Peak frequency matters very little when the chip becomes very inefficient and hot; it's like saying engines should rev to high RPM instead of running at low RPM with much better combustion.

At the moment the goal is to improve efficiency. Intel's Pentium 4 was highly overclockable but highly inefficient.
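The inefficiency this comment points at follows from the standard CMOS dynamic power relation, P ≈ C·V²·f: higher clocks usually demand more voltage, so power grows much faster than frequency. A rough sketch; the capacitance and voltage figures below are made-up illustrative numbers, not measurements of any real chip:

```python
def dynamic_power(capacitance_f, voltage_v, freq_hz):
    """Approximate CMOS switching power: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * freq_hz

# Illustrative (not measured) figures: a hypothetical chip at a
# stock point versus an aggressive overclock needing extra voltage.
C = 1e-9  # effective switched capacitance in farads (made up)
stock = dynamic_power(C, 1.00, 4.0e9)  # 4.0 GHz at 1.00 V
oc    = dynamic_power(C, 1.45, 5.2e9)  # 5.2 GHz at 1.45 V

print(f"stock: {stock:.1f} W, OC: {oc:.1f} W")
print(f"+{5.2 / 4 - 1:.0%} frequency for +{oc / stock - 1:.0%} power")
```

With these toy numbers, a 30% frequency bump costs well over double the power, which is the RPM-vs-combustion trade-off the comment describes.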
 
OCing Sandy Bridge and Ivy Bridge was a lotta fun, and the FX series was a blast to play with. Nothing quite like hitting 5.6 GHz on an FX-8150 in high school.

Watercooling and OCing Fermi was also a barrel of monkeys. Pushing over 900 MHz on a GTX 590 SLI setup was absolutely nuts.
It's been dead to me since "The Pencil Trick".
https://www.techspot.com/articles-info/904/images/2014-10-13_12-45-21-j.webp

And the last time I pushed a GPU was when I put a heatsink/fan combo on a passively cooled GeForce2 MX.

It hasn't been worth it for two decades for the average person. Here's twice the heat and noise for an extra 5%. No thanks.
Did you just suck at overclocking? Because a decade after the GeForce2, gains of 45-60% were not unheard of; the Core 2 lineup is a basic example. Even 20-30% was not unusual for GPUs if you put on a big enough cooler. More recently, Maxwell and Pascal were overclocking champs. The "double the power for 5%" thing is a much more recent phenomenon, mostly because modern parts do 97% of the OCing for you already.
 
Mr. Shawn Knight,
The graph should use a logarithmic scale on the vertical axis, and it should go back to 1981 with the launch of the PC.

And those frequencies weren't reached on equal footing either. It's simply a bad graph all round.
 
We could easily design a chip that could break 10, even 20 GHz. The thing is, the chip would be basically useless for anything. Oh, wait, I forgot, these chips are cooled with exotic sub-zero cooling and it doesn't matter.

Can we start working on photonics? I want chips in the terahertz range.
Photonic chips are being worked on. Speed in hertz isn't the issue with them; it's information density. https://phys.org/news/2022-06-world-ultra-fast-photonic-processor-polarization.html
 
I wonder when I'll have a CPU that hits 6 GHz. I went from an i5-2500K to a 7800X3D. That was quite the jump! Will a gaming CPU go that far within the next 10 years?
 
We could easily design a chip that could break 10, even 20 GHz. The thing is, the chip would be basically useless for anything.

lel :) The Pentium 4 WAS useless for anything :) It's always the useless chips that the sub-zero captains are most interested in :)
 
Yeah, it was fun for a short while. I decided I wasn't going to spend time and money on overclocking as a sport.
 
Remember when Intel touted NetBurst as an architecture capable of reaching 10 GHz? LOL. 25 years on, they're having talks about reaching 9 GHz...
 
Mr. Shawn Knight,
The graph should use a logarithmic scale on the vertical axis, and it should go back to 1981 with the launch of the PC.

And those frequencies weren't reached on equal footing either. It's simply a bad graph all round.

The graph was released by the overclockers, not the author of the article. The graph is pretty good at conveying the message it wants to convey, in my opinion. I agree that a log scale graph would be interesting but is it going to convey this message even more clearly? I doubt it.
 
Most "overclocking" is automatic these days: if you have the thermal and power envelope headroom, the CPU will push for higher clocks. Unfortunately this will always impact the life of the chip (as Intel has recently found out with motherboard partners allowing unlimited power).

My current CPU is overclocked via the Intel tool that boosts the "coolest" cores 200 MHz above the lower boost the other cores get (so two cores clock to 4.5 GHz and the rest to 4.3 GHz during the boost envelope, with a drop to 3.6, or an even lower 0.8-1.1, or "parked" during idle times). I can extend the boost envelopes even further, but temps get too close to 100 degrees for my liking.

The days of playing with FSB clocks, the CPU multiplier, the northbridge multiplier, the HyperTransport multiplier, and the memory frequency are generally long gone, as OEMs have automated much of this to wring as much performance from a chip as possible. Not saying you can't tweak to get a bit more, but you are unlikely to get the gains we used to see (as much as being a higher class of CPU entirely).
 
I think for the average PC user now, undervolting is the new overclocking. Not for those people who go for records though, obviously.
I've overclocked every CPU/GPU I've owned for as long as I can remember, but now my attention is more focused on undervolting and seeing what I can get out of the hardware for the least power and heat possible. I've done away with the custom water loops too, which I started building around 2006/2007. Air cooling only these days; it's much easier to clean out the PC with a good Noctua air cooler than it ever was with water.
 
I'd say grab an FX blueprint, stretch the pipeline even longer, bake it on the latest node that can take a beating, and try again.
 
Very interesting that they're getting to the point where uncertainty about the true clock rate is a concern. You can't simply take bclk x multiplier at these speeds. Kind of sounds like it's entering the quantum realm.
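That bclk x multiplier convention also explains why the uncertainty matters so much at record speeds: any error in the measured base clock is amplified by the multiplier. A toy illustration; the multiplier, base clock, and error figures below are hypothetical, not the record run's actual settings:

```python
def reported_clock_mhz(bclk_mhz, multiplier):
    """CPU frequency as conventionally reported: base clock x multiplier."""
    return bclk_mhz * multiplier

# Hypothetical figures for a ~9.1 GHz run at a 91x multiplier.
mult = 91
bclk = 100.2   # measured base clock in MHz (made up)
error = 0.05   # assumed +/- uncertainty in the BCLK reading, in MHz

nominal = reported_clock_mhz(bclk, mult)
spread = reported_clock_mhz(error, mult)  # the error scales by the multiplier too

print(f"reported: {nominal:.1f} MHz, +/- {spread:.2f} MHz")
```

With these toy numbers, a 0.05 MHz wobble in the base clock reading becomes a several-MHz uncertainty in the headline figure, which is why validating records at this level gets tricky.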
 
[...] if you have the thermal and power envelope headroom the CPU will push for higher clocks. Unfortunately this will always impact the life of the chip (as intel has recently found out with motherboard partners allowing unlimited power).

I think 'always' is obviously false. The issue with the motherboard partners was, clearly, a failure of the motherboard partners, not the chip; but regardless of that, it doesn't support the 'always' claim.

It's all a matter of degree. The lifecycle of an enthusiast's use of a CPU is limited more by the desire for the shiny object ahead, rather than the CPU 'wearing out'. Non-enthusiasts don't overclock.
 
Most "overclocking" is automatic these days: if you have the thermal and power envelope headroom, the CPU will push for higher clocks. Unfortunately this will always impact the life of the chip (as Intel has recently found out with motherboard partners allowing unlimited power).

My current CPU is overclocked via the Intel tool that boosts the "coolest" cores 200 MHz above the lower boost the other cores get (so two cores clock to 4.5 GHz and the rest to 4.3 GHz during the boost envelope, with a drop to 3.6, or an even lower 0.8-1.1, or "parked" during idle times). I can extend the boost envelopes even further, but temps get too close to 100 degrees for my liking.

The days of playing with FSB clocks, the CPU multiplier, the northbridge multiplier, the HyperTransport multiplier, and the memory frequency are generally long gone, as OEMs have automated much of this to wring as much performance from a chip as possible. Not saying you can't tweak to get a bit more, but you are unlikely to get the gains we used to see (as much as being a higher class of CPU entirely).

The life span of a CPU is determined by many things. Too much heat will shorten it. Too much voltage can hurt it as well as cause too much heat. With that said, I had my i7 2600K at 5.1 GHz and 1.45 V from 2011 to 2021, and just before I upgraded to a new platform, I took it up to 5.2 GHz at 1.46 V. It ran everything I threw at it pretty easily. When I upgraded to a Ryzen 5900X system I gave the i7 2600K and the mainboard to a buddy. Because he only has air cooling, not water cooling like I had, I set it at 5.0 GHz and 1.40 V for him, and it's rock stable. Basically, if you can control the heat and keep the CPU cool enough, it will last for a very long time.

Unlike what Intel did with 14th gen: they threw too much voltage at the cores and made the heat hard to control, which caused them to start going bad a lot faster than they should have. But we have hit a wall, and as stated, you're right: CPUs and GPUs basically OC themselves these days. As long as you can stay within a heat range and power draw range, they ramp up nicely. I set the power curve on my CPU way into the negative range, along with other settings in the BIOS, and it ramps up to 5 GHz but never stays there for very long, unlike my old i7 2600K, which would stay at 5.2 GHz constantly when maxed out. Same with my 7900 XTX: undervolt it and set it to max out at a 3 GHz core, and it bumps up to 3 GHz a lot but never stays there for long, then drops to 2950-2975 MHz, then bumps to 3 GHz again. If a game maxes out the GPU with high core usage, it drops to 2850-2875 MHz. Either way, it runs a lot higher than stock, and all I did was undervolt it and max out the power limit. It's pretty easy to OC these days, but is it really overclocking any more lol.



 