How We Test: CPU Benchmarks, Misconceptions Explained

If the argument was that there is no use for 1080p results at all, then this article was a big waste of time, because of course there is. But 1080p alone to represent them all? NO.
WOW, you'll just pull anything out of the air to try to sound smart, eh? Bottom line is, you're clearly the hahahanoob here. Admit you're in over your head... and losing street cred by the minute. You're either unteachable or trolling.

Hey - let's change the subject and debate gravity! That's open to your interpretation as well - right? :laughing:
 
A more powerful CPU is not an absolute thing. Remember the Pentium G3258? Overclocked, it was considered even faster than the i7-4790K when looking at benchmarks. However, it only has 2 cores. Guess what happens when benchmarking games that utilize more than 2 cores? Is it expected to perform better in the future too?

Information might be relevant, but like I stated above, there is a difference between a review and a future prediction. If it's a future prediction, then everything should be considered. If it's a review of today, then it's OK to concentrate on today's performance. What Techspot seems to do, according to the article, is to partially (but only partially) predict what the future will be.

This summarizes my point:

Techspot is trying to say CPU A is better in the future, but at the same time refuses to say which CPU is the better buy for the future. What? :confused:

Leaving the GPU out of the equation, I disagree. I played quite a lot of a game that uses all available CPU cores but puts minimal stress on the GPU. Because it utilizes all available cores, it will slow down the computer IF it has all cores available. Solution? Give it only a few cores.

You do want to keep at least two cores free for other applications. With a quad-core Intel, you can give it only 2 cores. With Piledriver you could give it six out of eight. Now it's 6 FX cores vs 2 Ivy Bridge cores. I doubt Ivy Bridge would be the clear winner. That is gaming with basically no GPU stress.
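As an aside, here is a minimal sketch of the "give it only a few cores" idea: pinning a running game to a subset of cores so the rest of the system stays responsive. It assumes the psutil package is installed; the process name and core list are placeholders for illustration, not anything from the article.

```python
# Sketch: restrict a running game to a few cores (Windows/Linux), assuming psutil.
import psutil

GAME_NAME = "game.exe"      # placeholder process name
GAME_CORES = [2, 3, 4, 5]   # cores handed to the game; the rest stay free

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_NAME:
        proc.cpu_affinity(GAME_CORES)   # same effect as Task Manager's "Set affinity"
        print(f"Pinned PID {proc.pid} to cores {GAME_CORES}")
```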

Like the Pentium G3258 will be a future-proof gaming CPU because it performed well when it launched?

And if Ivy Bridge is only a dual-core part, then what? You seem to assume it's the quad core that was more expensive than the 8-core Piledriver.

I agree that today we mostly have enough cores for gaming, but again, dual cores were not enough for the future, no matter what benchmarks at that time said. Just like with single core vs dual core when dual-core CPUs came out: single-core CPUs usually dominated benchmarks, but in reality dual cores were much more usable and very soon the faster ones. A good example where benchmarks didn't tell anything about the future.
You seem incredibly obsessed with the future. Are you and hahahanoobs the same person with dual accounts?
 
You seem incredibly obsessed with the future. Are you and hahahanoobs the same person with dual accounts?
He's been like this the whole time he's been on TechSpot. HardReset and I go way back. The days of arguing on the Nvidia 780 review comment section :cool:
 
Lol it blows my mind people still argue this, great article.
computers aren’t cars, and you’re still not understanding it lol.
WOW, you'll just pull anything out of the air to try to sound smart, eh? Bottom line is, you're clearly the hahahanoob here. Admit you're in over your head... and losing street cred by the minute. You're either unteachable or trolling.

Hey - let's change the subject and debate gravity! That's open to your interpretation as well - right? :laughing:
You seem incredibly obsessed with the future. Are you and hahahanoobs the same person with dual accounts?
He's been like this the whole time he's been on TechSpot. HardReset and I go way back. The days of arguing on the Nvidia 780 review comment section :cool:
As usual, when trolls run out of arguments, they turn discussion to personal things. Nothing new here.
 
As usual, when trolls run out of arguments, they turn discussion to personal things. Nothing new here.
Nothing personal here, it's your usual argument across the literal years. You seem to fall back on some obsession with the future even though no one is talking about the future.

This article couldn't be clearer, but for some reason you simply will not believe that reviewers are not trying to tell the future. I don't know why you obsess over it; you can't explain where you're getting your future predictions from, and nobody here is trying to predict the future. But it's all you seem to talk about.
 
WOW, you'll just pull anything out of the air to try to sound smart, eh? Bottom line is, you're clearly the hahahanoob here. Admit you're in over your head... and losing street cred by the minute. You're either unteachable or trolling.

Hey - let's change the subject and debate gravity! That's open to your interpretation as well - right? :laughing:
I think you're the one that's confused!
 
I think you're the one that's confused!
You do not test the CPU when doing a GPU test. You do not test the GPU when doing a CPU test. You do NOT want one affecting the other when parsing the performance difference of said test.


It is that^ simple...
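To put a toy model on that point (the numbers below are made up for illustration, not taken from the article): the delivered frame rate is capped by whichever of the CPU or GPU takes longer per frame, so a GPU bottleneck hides CPU differences that a strong GPU at 1080p would expose.

```python
# Toy model: frame rate is limited by the slower of CPU and GPU per frame.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_a_ms, cpu_b_ms = 5.0, 8.0   # hypothetical frame times: CPU A is ~60% faster

# GPU-bound (weak GPU / high resolution): both land at ~60 fps, gap invisible.
print(fps(cpu_a_ms, 16.7), fps(cpu_b_ms, 16.7))
# CPU-bound (strong GPU / 1080p): 200 fps vs 125 fps, the real CPU gap shows.
print(fps(cpu_a_ms, 3.0), fps(cpu_b_ms, 3.0))
```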
 
To whoever at TechSpot deleted all the comments, was it really considered "personal"?

We didn't deviate from the article in the discussion either. It stayed civil, and we were just waiting on a response from HardReset.
 
Usually forum moderators and admins suffer from "god syndrome", but here they are quite balanced, I have to say. I've only had posts deleted here when I was out of control...

I just saw the YT video for this article, and Steven was quite upset, tired and sick of explaining this subject. But at least he was not about to cry like the time Nvidia refused to give him cards for reviews.
 
The sad thing is, no matter how many times you explain it, the knuckle-dragging m0r0ns will still reply with "why would you test low-end CPUs with a 4090" and "who uses a 4090 at 1080p". Steve's got one hell of an uphill fight to make these people see the light.
But at least now you can easily link this article instead of explaining the whole thing :)
 
No, it's not reinforced at all.
Let's break it down and start with what you said:
'You can't predict the future'. 100% agreed! So the argument (detecting in 2017 that the 8700k would be the significantly better CPU over time, then buying it) is completely invalidated. It's a luck game: you can get a good-lifespan CPU, like the legendary 4790K, that will last you 5, 6 or 7 good years, or a real dud, 'Krappy Lake', that needs upgrading very fast.

Secondly, if you don't upgrade your GPU soon after your CPU purchase, your experience won't be a better one for years to come. As a matter of fact, it will be basically the same gaming experience, nothing changed. Again, we are talking about a 3% difference between the 8700k and the 2600x with a GTX 1070 @ 1080p on High settings. You won't be able to tell which is which by playing both rigs without an FPS counter on.

Yet Steve's article suggests that, in retrospect, the 8700k aged far better than the 2600x (which it has, but who could have known in 2017?), and that you could have detected and calculated those probabilities by deep diving into the 1080p benchmarks (which you possibly can't, of course, like you said already). This is the flaw in the argument. Hope I made my point clearer.
What? It was painfully obvious that the 8700k would last way longer than the 2600x. What the heck are you talking about, really?
 
From the article's conclusion:
"How We Test, Explained
Hopefully by this point you have a more complete understanding why testing CPU performance with a strong GPU bottleneck is a bad idea, and significantly more misleading than the well established testing methods used by almost all tech media outlets."

Talk about kicking yourself in the nuts, or is my English just bad today?
 
What? It was painfully obvious that the 8700k would last way longer than the 2600x. What the heck are you talking about, really?

No, it was painfully obvious that the 8700k was a one-time LGA1151 investment... while the 2600x could be replaced with a 3600... or a 5600... or a 5800X3D using the same mobo.

I went from an 1800X to a 5800X3D... 5 years later.



That is why many are laughing at all the lemmings who are building Intel 13800k rigs right now... they haven't learned from their mistakes.
 
My God, so much stupid sh*t going on around here these days. So many clueless nerds itching for a fight, yet unwilling to listen and learn.

Please go back to GameSpot; this is a site for educated grownups with actual tech knowledge. If I hear one more negative comment on a CPU tested with an RTX 4090 at 1080p I'm going to puke.
 
I'd like to see an 8-core vs 16-core game review, to see how certain game engines and developers use multithreading, cores, etc.
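For what it's worth, the kind of thing such a review would probe can be sketched with a simple scaling test: run the same CPU-bound work with more and more worker processes and see where the speedup flattens. The workload below is an arbitrary stand-in, not any real game engine.

```python
# Sketch of a core-scaling test: fixed total work split across more workers.
import time
from multiprocessing import Pool

def work(_):
    return sum(i * i for i in range(2_000_000))   # arbitrary CPU-bound task

if __name__ == "__main__":
    for workers in (1, 2, 4, 8, 16):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(work, range(32))              # 32 chunks of identical work
        print(f"{workers:>2} workers: {time.perf_counter() - start:.2f}s")
```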
 