Intel Core i7-7700K overclocked past 7GHz barrier

No, it's vanity and almost a sickness. I know people who will build a computer for their dad using a cheap video card (like a GT 610, with less than 10 GB/s of memory bandwidth) that will only be used for surfing and movies. The guy will secretly overclock dad's video card, which does nothing but add heat, draw more power, and increase the risk of instability, with ZERO gain in anything his dad does. What do you call this? It's not vanity, it's insanity. They overclock when there is no reason to; they can't help it. It's like an alcoholic: it's a sickness to some. They would overclock a digital alarm clock to speed up time (in their mind).
I call it, "children who need some discipline introduced into their lives". Something which is sorely lacking these dayz. <(note the ironic spelling of "days").
 
Just think if this liquid nitrogen cooling hit the home power-user set. Once a week or so, you'd need a pumper truck delivering the stuff to your door.

While this may be hailed as a "great technological leap forward" by the hardcore gaming set, it would set the rest of the world back going on a hundred years, to before widespread mechanical refrigeration became available... although I suppose, who isn't nostalgic for the "iceman, yo iceman" call at the crack of dawn every Friday?
No, it's vanity and almost a sickness. I know people who will build a computer for their dad using a cheap video card (like a GT 610, with less than 10 GB/s of memory bandwidth) that will only be used for surfing and movies. The guy will secretly overclock dad's video card, which does nothing but add heat, draw more power, and increase the risk of instability, with ZERO gain in anything his dad does. What do you call this? It's not vanity, it's insanity. They overclock when there is no reason to; they can't help it. It's like an alcoholic: it's a sickness to some. They would overclock a digital alarm clock to speed up time (in their mind).
A lot of these exotic cooling methods are being used in the D-Wave quantum computer and other quantum computer designs. One of the reasons these cooling methods are so successful at enabling high clock speeds is that they give the chips near-superconducting properties. If you make it too cold, though, you get electron leakage between transistors, which is why they use liquid nitrogen instead of liquid helium. Electron leakage is a problem that needs to be solved for us to shrink chips further and stabilize quantum computers for longer periods of time.

These problems won't be solved if we don't push the boundaries of what is possible, just like every other new technology that has been developed. While we were still using vacuum tubes, making processors out of transistors was pushing the boundaries. Bill Gates is famously (and probably apocryphally) quoted as saying "640K ought to be enough for anybody". Now look at us: by pushing software boundaries we now have supercomputers making use of TiBs of memory (1 TiB = 1,024 GiB). The Titan supercomputer uses 693 TiB of memory.
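As a quick sanity check on the units, here's a throwaway Python sketch (the 693 TiB figure is taken from the post above, not independently verified):

```python
# 1 TiB is 1024^4 bytes, i.e. 1024 GiB -- not 1024^4 gigabytes.
TIB = 1024 ** 4
GIB = 1024 ** 3

print(TIB // GIB)   # 1024 -- there are 1024 GiB in a TiB
print(693 * TIB)    # Titan's claimed memory in bytes (~762 trillion)
```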

The idea that this is just vanity is absurd.
 
If you make it too cold then you have electron leakage between transistors, which is why they use liquid nitrogen instead of liquid helium...
See, you learn something new every day. I thought they were using nitrogen because there's a "helium shortage". I suppose we'll just have to wait till the sun makes more, then go there and get it.

BTW, why are you quoting me in this post anyway? It should be readily apparent to a person of your stature that I was just breaking ballz anyway. :D
 
Very excited to see where the CPU race goes in the next 2-3 years, with AMD Ryzen on the horizon and rumors that Intel is revamping its architecture.
 
If you can't leave all the cores and features on, this "achievement" stands for nothing.
HT has been doing wonders for Windows OS and gaming performance since 2009, and I wouldn't game without it on.
The point of the overclock was simply to see if 7GHz could be achieved for the FUN of it. He never said it was usable as a daily driver.

"HT has been doing wonders for ... Gaming performance since 2009".

No, not really. I suppose if you're flogging a two-core chip HT can help, but it has been well documented that HT does nothing for quad-core i5s and i7s as far as gaming goes. In fact, many people turn HT off when gaming, as it has been shown to hinder performance; it's all over the internet. That's why it is widely recommended to go with the i5 in a gaming rig and spend the extra $100 (for the i7) elsewhere.
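Since HT on/off keeps coming up, here's a tiny sketch of how you can tell whether SMT is active from the core/thread counts a system reports (the `smt_active` helper is my own illustration, not from any tool; on a real box the counts could come from something like the third-party psutil package):

```python
def smt_active(physical_cores: int, logical_cpus: int) -> bool:
    """SMT/Hyper-Threading exposes more hardware threads than physical cores."""
    return logical_cpus > physical_cores

# An i7-2600K reports 4 cores / 8 threads; an i5-2500K reports 4 / 4.
print(smt_active(4, 8))  # True  -> HT on
print(smt_active(4, 4))  # False -> HT off (or a chip without HT)
```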
 
Always found these extreme overclocks to be an exercise in vanity. 7GHz would be great...if it were usable. But this type of thing is about as useful as stripping a Lamborghini down to the wheels and motor, putting a spoiler in front of the driver, and proclaiming, "Faster than a Bugatti!"
"But this type of thing is about as useful as stripping a Lamborghini down to the wheels and motor, putting a spoiler in front of the driver, and proclaiming, "Faster than a Bugatti!" "

What's wrong with that? Do people climb Mount Everest for any useful purpose? No- they do it because they want to, and because it's there. This guy was just seeing if he could crack the 7GHz mark using any compromise necessary. No reason to make it a negative thing.
 
I wouldn't go as far as saying it's vanity. These people are pushing the limits of what's currently technologically possible; who knows if these exotic cooling methods will prove useful in the future.

No, it's vanity and almost a sickness. I know people who will build a computer for their dad using a cheap video card (like a GT 610, with less than 10 GB/s of memory bandwidth) that will only be used for surfing and movies. The guy will secretly overclock dad's video card, which does nothing but add heat, draw more power, and increase the risk of instability, with ZERO gain in anything his dad does. What do you call this? It's not vanity, it's insanity. They overclock when there is no reason to; they can't help it. It's like an alcoholic: it's a sickness to some. They would overclock a digital alarm clock to speed up time (in their mind).
So it makes you angry that some guy- who you don't know- chose to overclock his chip with compromises just to see how far it could go? Do you walk around looking for reasons to be offended?
 
So it makes you angry that some guy- who you don't know- chose to overclock his chip with compromises just to see how far it could go? Do you walk around looking for reasons to be offended?
The fact that they do it doesn't bother me. The thought that they are likely doing it on our dime does.
 
"But this type of thing is about as useful as stripping a Lamborghini down to the wheels and motor, putting a spoiler in front of the driver, and proclaiming, "Faster than a Bugatti!" "

What's wrong with that? Do people climb Mount Everest for any useful purpose? No- they do it because they want to, and because it's there. This guy was just seeing if he could crack the 7GHz mark using any compromise necessary. No reason to make it a negative thing.

I gave you the reason. From a tech perspective, this isn't useful to the vast majority of chip users.

If someone wants to OC a chip to 7GHz, then more power to them.

That doesn't create a condition wherein someone else can't or shouldn't say they don't see the point.

Personally, I think climbing Everest is worth it for most people. Others think it is way too much risk for validating one's ego. This "needless negativity" is an excellent way to start a conversation and learn about other people's model of the world.

Saying, "wow, that's really cool" will get me a couple likes from overclockers.

Saying it's a waste of time will get me different perspectives.

One is more useful than the other.
 
See, you learn something new every day. I thought they were using nitrogen because there's a "helium shortage". I suppose we'll just have to wait till the sun makes more, then go there and get it.

BTW, why are you quoting me in this post anyway? It should be readily apparent to a person of your stature that I was just breaking ballz anyway. :D
Didn't mean to double quote, but the information I posted is still relevant. People also don't consider that helium is a major byproduct in nuclear reactors, in the form of alpha particles (helium nuclei) looking for electrons. There isn't a major helium shortage, just a major shortage of helium collection in nuclear reactors.
 
Last edited:
There isn't a major helium shortage, just a major shortage of helium collection in nuclear reactors.
Yeah well, we had helium balloons before we had nuclear reactors. Maybe there's more of an, "excess demand" issue at play here.... :suspicious:
 
The point of the overclock was simply to see if 7GHz could be achieved for the FUN of it. He never said it was usable as a daily driver.

"HT has been doing wonders for ... Gaming performance since 2009".

No, not really. I suppose if you're flogging a two-core chip HT can help, but it has been well documented that HT does nothing for quad-core i5s and i7s as far as gaming goes. In fact, many people turn HT off when gaming, as it has been shown to hinder performance; it's all over the internet. That's why it is widely recommended to go with the i5 in a gaming rig and spend the extra $100 (for the i7) elsewhere.

HT has been doing wonders for years, which is why an i3 keeps up with an i5 in many titles when a game needs 4 cores.
i7s have always ruled the gaming charts and always will; i5s are a good bargain.

HT hurt a few FPS in VERY few titles (think 4 or 5 out of 30 tested).
 
A lot of these exotic cooling methods are being used in the D-Wave quantum computer and other quantum computer designs. One of the reasons these cooling methods are so successful at enabling high clock speeds is that they give the chips near-superconducting properties. If you make it too cold, though, you get electron leakage between transistors, which is why they use liquid nitrogen instead of liquid helium. Electron leakage is a problem that needs to be solved for us to shrink chips further and stabilize quantum computers for longer periods of time.

These problems won't be solved if we don't push the boundaries of what is possible, just like every other new technology that has been developed. While we were still using vacuum tubes, making processors out of transistors was pushing the boundaries. Bill Gates is famously (and probably apocryphally) quoted as saying "640K ought to be enough for anybody". Now look at us: by pushing software boundaries we now have supercomputers making use of TiBs of memory (1 TiB = 1,024 GiB). The Titan supercomputer uses 693 TiB of memory.

The idea that this is just vanity is absurd.

It's not absurd, it's the truth; deal with it. Don't change the subject to quantum computing. You have a sickness, get help.
 
It's not absurd, it's the truth; deal with it. Don't change the subject to quantum computing. You have a sickness, get help.
Look, my Prince @yRaz, the infidel newcomers just sassed you something awful. It's heresy, blasphemy I tell you... :eek:
 
Look, my Prince @yRaz, the infidel newcomers just sassed you something awful. It's heresy, blasphemy I tell you... :eek:
It's fine, they don't realize I've already got all their banking information off that "meet horny singles near you" pop up they clicked on.
 
I gave you the reason. From a tech perspective, this isn't useful to the vast majority of chip users.

If someone wants to OC a chip to 7GHz, then more power to them.

That doesn't create a condition wherein someone else can't or shouldn't say they don't see the point.

Personally, I think climbing Everest is worth it for most people. Others think it is way too much risk for validating one's ego. This "needless negativity" is an excellent way to start a conversation and learn about other people's model of the world.

Saying, "wow, that's really cool" will get me a couple likes from overclockers.

Saying it's a waste of time will get me different perspectives.

One is more useful than the other.
Some random dude oc-ing a chip doesn't need to benefit anyone else, and it's his time to waste. This isn't about you.

"This "needless negativity" is an excellent way to start a conversation and learn about other people's model of the world."
"Saying it's a waste of time will get me different perspectives."

OK... are you working on a homework assignment or something?
 
HT has been doing wonders for years, which is why an i3 keeps up with an i5 in many titles when a game needs 4 cores.
i7s have always ruled the gaming charts and always will; i5s are a good bargain.

HT hurt a few FPS in VERY few titles (think 4 or 5 out of 30 tested).

Those performance differences are tiny (~6%) relative to the large i7 price premium (50-100%), and in the Fallout 3 graph, the 6 fps difference between the 2500K and the 2600K is at least in part due to the slightly higher clock speed, not HT. I would hardly describe those differences as "doing wonders".
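To put numbers on that trade-off, here's a quick back-of-the-envelope (the fps and price figures below are made-up placeholders for illustration, not taken from any chart):

```python
# Hypothetical numbers: a ~6 fps gap on ~94 fps vs. a ~$115 price gap.
i5_fps, i7_fps = 94, 100
i5_price, i7_price = 215, 330

perf_gain = (i7_fps - i5_fps) / i5_fps * 100            # ~6.4% faster
price_premium = (i7_price - i5_price) / i5_price * 100  # ~53.5% pricier
print(f"perf gain {perf_gain:.1f}% vs price premium {price_premium:.1f}%")
```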

If you're into graphs, here are a few from Techspot's recent review of the i7-6700k and i7-7700k, which paint a very different picture:
https://www.techspot.com/review/1299-intel-core-kaby-lake-desktop/page8.html

[charts: iGPU_01.png, iGPU_02.png, iGPU_03.png]


To be fair, I'm also including the one graph where there was a notable difference, but as both the i5 and i7 were over 200 fps it's a moot point.
[chart: dGPU_02.png]
 
Those performance differences are tiny (~6%) relative to the large i7 price premium (50-100%), and in the Fallout 3 graph, the 6 fps difference between the 2500K and the 2600K is at least in part due to the slightly higher clock speed, not HT. I would hardly describe those differences as "doing wonders".
A dual-core CPU keeping up with and matching a quad-core CPU is not doing wonders?
Are you looking at the same thing I am!? The Core i3 is neck and neck with the i5.
And in the Fallout graph, the i7-870 at a 2.93GHz clock speed is beating the i5-2500K at 3.3GHz.

If you're into graphs, here are a few from Techspot's recent review of the i7-6700k and i7-7700k, which paint a very different picture:
Those graphs display exactly what I said.
In any game that uses 4 cores, the i3 will match or just about match the i5, due to HT giving it 4 'logical' cores.
If an i5 keeps up with an i7, it's because the game is limited to utilizing only a certain amount of CPU power, not because the i5 is as good. When a game is programmed to use more power, the i7 will kick its tail, like in Gears, Fallout, and many other games.


I love the 920 beating the 750 at the same clock speed here, and how in both graphs the 6-core i7 is untouched, even at the same or lower clock speed.




The i5 is a bargain.
But in many cases the i7 will kick its ***, and not just the 6-core version.
i7s are also better for running multiple apps at once; you can overwhelm the i5's cores by asking a PC to do more than game while you're gaming. i5s are very nice chips, but you still get what you pay for.
 
A dual-core CPU keeping up with and matching a quad-core CPU is not doing wonders?
Are you looking at the same thing I am!? The Core i3 is neck and neck with the i5.
And in the Fallout graph, the i7-870 at a 2.93GHz clock speed is beating the i5-2500K at 3.3GHz.


Those graphs display exactly what I said.
In any game that uses 4 cores, the i3 will match or just about match the i5, due to HT giving it 4 'logical' cores.
If an i5 keeps up with an i7, it's because the game is limited to utilizing only a certain amount of CPU power, not because the i5 is as good. When a game is programmed to use more power, the i7 will kick its tail, like in Gears, Fallout, and many other games.


I love the 920 beating the 750 at the same clock speed here, and how in both graphs the 6-core i7 is untouched, even at the same or lower clock speed.




The i5 is a bargain.
But in many cases the i7 will kick its ***, and not just the 6-core version.
i7s are also better for running multiple apps at once; you can overwhelm the i5's cores by asking a PC to do more than game while you're gaming. i5s are very nice chips, but you still get what you pay for.
Well, the i7-3960X is a $1000 chip, so it shouldn't even be part of this debate. Moving down to the i7-2700K, a more realistic matchup for the i5-2500K, I see a 3 fps difference in Skyrim and a 2 fps difference in Borderlands... not exactly what I'd call an a** kicking. In fact, in real-world gaming that's a tie. For a $100+ premium, the i7 is decidedly not a case of getting what you pay for.

"i7s are also better for running multiple apps at once; you can overwhelm the i5's cores by asking a PC to do more than game while you're gaming"

Absolutely- I've already conceded that the i7 is better for multitasking. But we're focusing strictly on gaming performance here, and aside from the i7X it's basically a wash. For every chart you post showing the i7 winning, I can find another chart showing the i5 winning:
[charts: Crysis-3-1080p-Stock-Clocks.png, GTA-V-1080p-Overclocked.png]
 
Well, the i7-3960X is a $1000 chip, so it shouldn't even be part of this debate. Moving down to the i7-2700K, a more realistic matchup for the i5-2500K, I see a 3 fps difference in Skyrim and a 2 fps difference in Borderlands... not exactly what I'd call an a** kicking. In fact, in real-world gaming that's a tie. For a $100+ premium, the i7 is decidedly not a case of getting what you pay for.

"i7s are also better for running multiple apps at once; you can overwhelm the i5's cores by asking a PC to do more than game while you're gaming"

Absolutely- I've already conceded that the i7 is better for multitasking. But we're focusing strictly on gaming performance here, and aside from the i7X it's basically a wash. For every chart you post showing the i7 winning, I can find another chart showing the i5 winning:

Broadwell and Skylake are more SMT-focused, so when you disable HTT the CPU shows its true per-core power. I can confirm that on the i3-6100, which runs very well in 4-threaded scenarios without losing as much as older models did: older models lost 60-70% per thread going from 2 to 4 threads, so the overall gain was a minimal 20-30%.
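One way to read that scaling claim as arithmetic (a sketch of my own; `extra_thread_eff` is my label for how much of a full core each SMT thread delivers, not a measured figure):

```python
def smt_gain(extra_thread_eff: float) -> float:
    """Overall speedup from enabling SMT on a 2-core chip (2 -> 4 threads),
    if each of the 2 extra threads delivers only a fraction of a full core."""
    base = 2.0                           # two physical cores
    total = base + 2 * extra_thread_eff  # plus two partial SMT threads
    return total / base - 1

print(f"{smt_gain(0.30):.0%}")  # extra threads at 30% of a core -> 30% gain
print(f"{smt_gain(0.20):.0%}")  # at 20% of a core -> 20% gain
```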

I'm not interested in Kaby Lake or Skylake, since you must pay $180+ for an i5 with under a 3.0GHz base clock just to get DDR4 support. I'm waiting for Zen and hope it has Haswell/Broadwell IPC in single-threaded scenarios.

Good day, everyone.
 