Thanks for the article.
Just to point out a flaw in the argument:
Back in the day, the differences between the i7-8700K and R5 2600X with a GTX 1070 were small. You measured a 3% difference at 1080p Ultra in AC:O back then.
You pointed out this was half the difference seen with a 1080 Ti, a card Nvidia claimed in 2017 to be the fastest gaming graphics card money could buy.
I understand your point: what was then a narrow difference in gaming performance between two CPUs could become a widening gap with modern GPUs over the years to come. So lower resolutions are chosen to inform the reader.
The flaw: how could one calculate the degree of the future gap between those two CPUs back then, without today's modern GPUs and without today's modern game engines? It's not possible. The gap could stay narrow, or it could evolve into a serious difference. There are lots of variables there. This imho undermines the whole 8700K/2600X argument in the article.
Now I kinda get this: if anyone is asking for different resolutions, going down to 720p would be it. But again, no monitors sold today, or for the past what, eight years, have been 720p.

Another point: Testing in 1080p Ultra does not help your cause (the 3% difference). 1080p medium or low or even 720p would have shown a greater gap between those two CPUs.
No, I like this writing style more than the normal boring clinical style. Too often, and especially in the west, media are terrified of calling an egg an egg. This article is for people who can't grasp incredibly basic testing methodology that's used by all reviewers for a clearly good reason. If you take offence to any of the writing, well... I think that says way more about yourself than it does the article.

Last point, about the writing style: I feel it's not the smartest thing to talk about 'misconceptions' while referring to a portion of your reader base. Obviously you are fighting 'misconceptions' in your own reader base, otherwise it would be pointless to write that emotionally. You don't get that reaction while browsing the comments section of Quora. Yes, maybe some of your readers got things wrong. Maybe, as a writer I guess, one can get very upset when digging through some of the comments, like in this recent TechSpot article:
https://www.techspot.com/review/2615-ryzen-7600-vs-ryzen-5600/
And yes, people sometimes lack experience. Sometimes they are rude. Both can happen simultaneously. But as a professional tech blog your writers should stay above that. Don't start to mansplain benchmarks.
For instance, please do not write: "When People Don't Understand CPU Benchmarks, Point Them Here". That's the actual subtitle of this article. Just think about that sentence for a minute.
I liked TechSpot's writing style in general up to this point, but this article was not up to your standards imho. It felt like reading a tweet with graphs.
Just pointing these things out for you to reconsider; of course none of this is meant as an attack, and it is only my personal view. Other views may and will differ.
I partially agree with what you wrote in the article. The problem here is that you don't say in your reviews that you ran the benchmarks with the following things in mind:
- You expect that people will upgrade their GPU at least once
- You expect that people will upgrade just the GPU, not the CPU
- You expect that current GPUs behave the same way future GPUs will (today's top-end GPU = top-end GPU three years later)
- You expect that there will be no major improvements in game engines (like core usage; not very long ago you didn't need more than 4 cores for gaming, etc.)
If any of these don't apply, your benchmarks are pretty much screwed. Let's take the first one. You assume people will upgrade their GPU. Then came both COVID and the crypto boom, and GPU prices skyrocketed. Now, if you made your benchmarks a few years ago expecting that people would upgrade GPUs as in a "normal situation"... 🤦♂️
Where did they say any of that? They didn't expect anything. Simply put, the 8700K was faster than the 2700X when they were reviewed. Had they reviewed using higher resolutions you wouldn't have known the 8700K was faster, which proves using higher resolutions is pointless.
I say 3% faster or even 5% faster is nothing more than "faster".

If we were to compare the 8700K and 2600X at 1080p with the Ultra High preset using the GTX 1070, we'd find that both CPUs are capable of around 70 fps and that's certainly very playable performance for such a title. This would have led us to believe that the 8700K was just 3% faster than the 2600X, less than half the margin we found when using the GeForce GTX 1080 Ti back in 2017.
There is a big difference between a CPU review and CPU future-predicting. A review may apply solely to today's situation. Future-predicting MUST also consider things like CPU upgrade paths. That's the problem. The CPU review expects people to upgrade GPUs but not CPUs. Err, what?

Then showing how each CPU runs today's games on modern GPUs goes to show how the gap can widen over time, making it even more important to show which CPU is faster at the time of review, and it reinforces why using the highest-end GPU at review time is invaluable.
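A quick way to picture the bottleneck effect being argued over here is to model the delivered frame rate as the minimum of what the CPU and the GPU can each push. The sketch below is only an illustration of that reasoning, not real benchmark data: the ~70 fps and ~3% figures echo the article passage quoted above, while every other number (the hypothetical CPU and GPU caps) is invented for the example.

```python
# Toy model: the delivered frame rate is capped by whichever component is
# slower.  All caps below are hypothetical, chosen only to mirror the
# "~70 fps, 3% gap" scenario discussed above.

def delivered_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Frames per second actually seen: the slower limit wins."""
    return min(cpu_cap, gpu_cap)

cpu_caps = {"faster CPU (8700K-like)": 75.0, "slower CPU (2600X-like)": 70.0}
gpu_caps = {"mid-range GPU (1070-class)": 72.0, "high-end GPU (1080 Ti-class)": 110.0}

for gpu_name, gpu_cap in gpu_caps.items():
    fast = delivered_fps(cpu_caps["faster CPU (8700K-like)"], gpu_cap)
    slow = delivered_fps(cpu_caps["slower CPU (2600X-like)"], gpu_cap)
    print(f"{gpu_name}: {fast:.0f} vs {slow:.0f} fps -> gap {(fast / slow - 1) * 100:.1f}%")

# Mid-range GPU: both CPUs sit near the ~70 fps GPU limit, so the measured
# gap collapses to roughly 3% (about 2 fps).  High-end GPU: the CPU caps are
# exposed and the full ~7% difference shows up.
```

None of this predicts anything about future hardware; it only shows why a GPU-limited test can hide a CPU difference that a faster GPU later exposes.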
Another point: Testing in 1080p Ultra does not help your cause (the 3% difference). 1080p medium or low or even 720p would have shown a greater gap between those two CPUs.
To some people (myself included) 3% is basically margin of error, or just not noticeable. The examples given here are way more than that back in the original reviews. That's why lower resolutions are used: if one CPU is considerably better than the other, it will show up in CPU-bound games.

I say 3% faster or even 5% faster is nothing more than "faster".
If your current PC is CPU bottlenecked, then yeah, it would benefit from a better CPU. Can your GPU handle a higher resolution? No? Then the CPU won't make a difference. What are you struggling to understand here?

Would it still be faster if the user keeps the same GPU or uses higher resolutions?
No one said predicting the future, you've put that into your own head. Reviewers have never expected people to upgrade anything; that's up to the end user to decide after reading a review of said product.

There is a big difference between a CPU review and CPU future-predicting. A review may apply solely to today's situation. Future-predicting MUST also consider things like CPU upgrade paths. That's the problem. The CPU review expects people to upgrade GPUs but not CPUs. Err, what?
So that you get differences years later, when you've upgraded the GPU to something that didn't exist back then? That's predicting the future, not reviewing the product for today.

To some people (myself included) 3% is basically margin of error, or just not noticeable. The examples given here are way more than that back in the original reviews. That's why lower resolutions are used: if one CPU is considerably better than the other, it will show up in CPU-bound games.
When there is no difference, reviews must somehow manufacture that difference using unrealistic scenarios. Again, that is not a review, that is more like predicting the future.

If your current PC is CPU bottlenecked, then yeah, it would benefit from a better CPU. Can your GPU handle a higher resolution? No? Then the CPU won't make a difference. What are you struggling to understand here?
In the article we are discussing, they are saying exactly that. They expect users to upgrade the GPU and write articles with that in mind. Predicting the future, that's what they try to do.

No one said predicting the future, you've put that into your own head. Reviewers have never expected people to upgrade anything; that's up to the end user to decide after reading a review of said product.
If you read a review, but take it as "predicting the future", you will never understand any review on the internet because you're taking it for something it's not.
No, it doesn't. You can say product A performs better today than product B, but if both products have certain strengths and weaknesses (they always do), future performance depends on how those strengths and weaknesses play out in future software. You need prediction for that. If you take a product that has a very big strength, but one that does not apply to everything (like the Ryzen 5800X3D), it's very hard to predict how it will perform in the future. In cases where the large victim cache does not help at all, it's even slower than the regular 5800X, but when it helps, it's much faster than the 5800X.

You and me have been on TechSpot a long time, we've had this discussion many times now. You seem to believe some kind of "future proofing" exists, which isn't true. You buy the best-performing product you can at the time, and because it does perform better (2700X vs 8700K) it'll continue to perform better into the future. That's not future proofing, that's not prediction, that's just how the world works.
You just stated one major problem. Reviews try to predict the future but fail miserably because there might be something coming in the future they have no idea about. When TechSpot reviewed the 2700X and predicted future GPU performance, they also should have considered the 5800X3D as a future upgrade for the 2700X's platform. You might say they had no idea about the 5800X3D because it didn't exist at the time, but that doesn't change the fact that they failed to predict the future. If predicting the future is so hard, why do they try to do it? Why not concentrate on performance today, because that can actually be done?

I know you love AMD, so I'll give you an example where AMD destroyed the competition in its day: the 5800X3D was either slightly slower or slightly faster than Intel's best (the 12900K) but was essentially half the price whilst using considerably less power. It's also aged really well, still able to hang with the latest and greatest. The review didn't know that would be the case a year later; it's just that much better than anything else you can slot into the AM4 platform, which they got across in the review.
You can't calculate the future gap, no one can predict the future. However, by at least showing a gap, you know which one is the better CPU, and even if that gap stays narrow, you still bought the better CPU and you are still getting a better experience. How does this do anything to the 2600X vs 8700K argument other than reinforce it?
Never mind, you definitely are someone who simply will never understand what a review is, and you will never be satisfied with any review on the planet.

Reviews try to predict the future but fail miserably because there might be something coming in the future they have no idea about.
It's a luck game? So you reckon there was a chance the much slower 2700X would be faster than the much faster 8700K in the future?

No, it's not reinforced at all.
Let's break it down and start with what you said:
'You can't predict the future'. 100% agreed! So the argument (to detect in 2017 that the 8700K would be the significantly better CPU over time, and then buy it) is completely invalidated. It's a luck game: you can get a good-lifespan CPU, like the legendary 4790K that will last you 5, 6 or 7 good years, or a real dud 'Krappy Lake' that needs replacing very fast.
Steve never used the 1070, where did you get that information? They used the 1080 Ti (the best GPU at the time) to review the CPUs. If your GPU is already your bottleneck, of course a CPU upgrade won't do anything. That's basic logic, what are you struggling with here?

Secondly, if you don't upgrade your GPU soon after your CPU purchase, your experience won't be a better one for years to come. As a matter of fact it will be basically the same gaming experience, nothing changed. Again, we are talking about a 3% difference between the 8700K and 2600X with the GTX 1070 at 1080p on high settings. You won't be able to tell which is which by playing both rigs without an FPS counter on.
Suggests where? Where does he suggest you can predict the future? The 8700K and 2600X were used because they're more extreme examples. Precisely because they couldn't have predicted how much better the 8700K would age, it goes to show how important it is to use lower resolutions, the best GPU and CPU-bound games to really show the performance difference between CPUs. If one CPU performs considerably better today (2017) than another, it will continue to be that way until the end of time. We could do this test in the year 2953 and the 8700K will still be the better processor.

Yet Steve's article suggests that, in retrospect, the 8700K aged far better than the 2600X (which it has, but who could have known that in 2017?), and that you could detect and calculate those probabilities by deep-diving into the 1080p benchmarks (which you possibly can't, of course, as you already said). This is the flaw in the argument. Hope I made my point clearer.
The comment section validates the article entirely, thank you to everyone!
A more powerful CPU is not an absolute thing. Remember the Pentium G3258? Overclocked, it was considered even faster than the i7-4790K when looking at benchmarks. However, it only has 2 cores. Guess what happens when benchmarking games that utilize more than 2 cores? Is it expected to perform better in the future too?

Not really; the review is answering one specific question: which CPU is more powerful. Even if you never change the hardware ever, the reviews still show that CPU A is faster than CPU B when the GPU is removed from the equation. That may or may not matter to the end user, but it's still important to know for those users who WILL do GPU upgrades down the road, as it informs them which CPU would be expected to perform better down the line.
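The Pentium G3258 point above is easy to put into rough numbers. The sketch below is purely illustrative (every figure in it is invented and the chip labels are only loose stand-ins): it shows why a chip with fewer, faster cores can win a lightly threaded benchmark yet lose once a game spreads work across more threads than it has cores.

```python
# Toy throughput model: relative performance is per-core speed times the
# number of cores the game can actually keep busy.  All numbers are invented
# for illustration only.

def relative_perf(per_core_speed: float, cores: int, game_threads: int) -> float:
    """Crude estimate: scaling stops once the game runs out of cores."""
    return per_core_speed * min(cores, game_threads)

chips = {
    "fast dual-core (overclocked G3258-style)": (60.0, 2),
    "slower-per-core quad-core (4790K-style)": (50.0, 4),
}

for threads in (2, 4):
    print(f"game using {threads} threads:")
    for name, (speed, cores) in chips.items():
        print(f"  {name}: {relative_perf(speed, cores, threads):.0f}")

# With a 2-thread game the overclocked dual-core looks great (120 vs 100);
# once the game wants 4 threads the quad-core pulls well ahead (120 vs 200).
```

Which side of that crossover a given game sits on is exactly the kind of thing that changes as engines start using more threads.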
The information might be relevant, but like I stated above, there is a difference between a review and future prediction. If it's future prediction, then everything should be considered. If it's today's review, then it's OK to concentrate on performance today. What TechSpot seems to do, according to the article, is to partially (but only partially) predict what the future will be.

Which is how most people upgrade; most people build a platform, then do a couple of GPU upgrades until the platform is no longer viable. Obviously, if you are planning on replacing the CPU in the short term, the need to know which CPU will perform better in the long term is N/A. But the information provided is quite relevant for the majority of users.
TechSpot is trying to say CPU A is better for the future, but at the same time refuses to say which CPU is the better buy for the future. What?

Of course, we're not saying the 8700K was the better purchase, as it was a more expensive CPU and the Ryzen 5 was a better value offering...
Leaving the GPU out of the equation, I disagree. I played quite a lot of a game that uses all available CPU cores but puts minimal stress on the GPU. Because it utilizes all available cores, it will slow the computer down IF it has all cores available. Solution? Give it only a few cores (see the sketch further below for one way to do that).

They typically will, barring a complete re-thinking of how rendering is done. Simple example: an Ivy Bridge is still going to perform better than Bulldozer in gaming, despite the fact we've gone from a strictly linear non-threaded graphical pipeline (DX9/OpenGL) to a fully multithreaded one with ray tracing support (DX12/Vulkan). One CPU is simply better than the other, despite the massive amount of changes done to GPU rendering over the past decade.
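For the "give it only a few cores" workaround mentioned just above, this is roughly how one could pin a process to a subset of cores from Python on Linux. It is a minimal sketch under a few assumptions: os.sched_setaffinity is Linux-only, the core numbers are arbitrary, and "./the_game" is a placeholder path, not a real program.

```python
import os
import subprocess

# Restrict this Python process, and anything it launches, to cores 0-3.
# os.sched_setaffinity is Linux-only; on Windows you'd reach for something
# like `start /affinity` or Task Manager's affinity setting instead.
os.sched_setaffinity(0, {0, 1, 2, 3})

# Launch the game from here so it inherits the restricted core set.
# "./the_game" is a placeholder, not a real executable.
subprocess.run(["./the_game"])
```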
Like the Pentium G3258 will be a future-proof gaming CPU because it performed well when launched?

Obviously, if these fundamental assumptions change, then all bets are off. But in the short term (10+ years) they are safe assumptions to make.
And if the Ivy Bridge is only a dual-core part, then what? You seem to assume it's the quad-core that was more expensive than the 8-core Piledriver.

See my above point on Ivy Bridge v Bulldozer.
Now yes, there is the argument in the longish term that as programs get better at using more cores, CPUs with more cores will tend to perform better; the classic example would be the E8600 vs Q6600 debate. But we're basically at the point where adding more cores produces minimal benefit outside of specific desktop software that scales basically to infinity; games in particular aren't going to scale in a way where continuing to throw cores at them is going to affect performance [barring cases where other applications, like streaming apps, are also running; that would actually be a *very* good test case to start testing with...].
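The "more cores stop helping" point is essentially Amdahl's law. Here is a minimal sketch, assuming a hypothetical game where only about 60% of the per-frame work can actually run in parallel; that fraction is made up purely to show how quickly the curve flattens.

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the fraction
# of the work that can run in parallel.  p = 0.6 is an invented figure used
# only to illustrate the shape of the curve.

def speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.6
for cores in (1, 2, 4, 6, 8, 16, 32):
    print(f"{cores:>2} cores: {speedup(p, cores):.2f}x")

# The curve climbs quickly at first (1.00x, 1.43x, 1.82x, 2.00x...) but is
# already past 2.1x at 8 cores, against a hard ceiling of 1 / (1 - p) = 2.5x,
# which is why piling on cores shows diminishing returns for such a workload.
```

The streaming-while-gaming case mentioned in the brackets above is exactly the situation where the parallel fraction effectively rises, which is when extra cores start to pay off again.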
As a car reviewer, are you only going to review how well a Jeep does offroad? Only test how well sports cars go in a straight line?
You want to remove bottlenecks. Cool. Do it. 1080p. But....
I think what initiated this was the 1080p-ONLY testing results with "overkill" hardware. One comment in the first image in this article is complaining about exactly that, and that's what I had a big problem with. Those are all the people that have no use for 1080p-ONLY results. Those are the keywords here: 1080p ONLY.
It's a review, right? Test the more unlikely configuration to show best performance, but also test the more likely scenario, especially for that type of CPU and GPU combo, by upping the resolution, my man! There is no need for guesswork. Tim doesn't even touch 1080p monitors in his MUB reviews. We're getting excited about higher refresh rates almost weekly. OLED. HDR. 4K. 8K. 1440p+ is just where the industry is going. Omitting it just doesn't make sense.
Aside from that, I also had the idea of a tech site eventually throwing a popular mainstream build into the mix (included in the review or as a separate video series): say an i3/i5/R5 + 16GB 3200MHz CL16 / DDR5 6000MHz paired with new low- to high-end GPUs, and vice versa. Rigs closer to what the majority actually have. I feel it would help a lot of people with upgrading, in addition to the standard reviews.
If the argument was that there was no use for 1080p results at all, then this article was a big waste of time, because of course there is. But 1080p only, to represent them all? NO.

Computers aren't cars, and you're still not understanding it lol.