How We Test: CPU Benchmarks, Misconceptions Explained

All CPU tests should be done with GT 730 and GPUs with a Celeron G1820. Truth is, all hardware performs the same, big tech is lying to us!
 
Man, I never would have expected Steve to get this salty over YouTube comments; they're probably the lowest of the low when it comes to comments... well, maybe matched by some commenters on TechSpot as well, mind you.

TechSpot's CPU benchmarks are what keep me from upgrading CPUs in most cases, as they show the longevity of hardware for gaming better than anywhere else. It's also why I find the concept of upgrading CPUs on any given platform completely pointless, as the need to do so seems completely unrealistic. An i7 920 lasted me from 2009 until 2016, a 5960X from 2016 to 2022, and my current platform will easily go another 5-6 years. But I digress.

Simply put, if people want GPU benchmarks they can find those separately from CPU benchmarks, which allows both datasets to be more complete than trying to do everything at once and keeps the articles from being overwhelming. But, alas, you'll never be able to please everyone, right?
 
I have a couple of things I like to test personally.

The more data the better! Of course, time is limited, so I go to the extremes instead.

One way you can test is absolute minimum settings at the lowest resolution allowed. This really homes in on what the absolute limit of the CPU is. Yes, this means 480p or 720p testing. Sure, it's unrealistic, but that's kind of the point.

1080p is closer to realistic, but I would modify it to use minimum settings. If you're running into a hard framerate limit you want to overcome, you're going to drop practically every setting but resolution if possible. It is useful to see 1080p, 1440p and 4K minimum-settings performance to see what the game/CPU can do if you want those frames like I do.

If there's something to adopt, it would be switching to minimum settings to emphasize that isolation even more.
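To make the isolation argument concrete, here's a minimal sketch of the bottleneck model that low-resolution, low-setting testing relies on. All the frame-rate figures are made up for illustration; they aren't TechSpot measurements.

```python
# Minimal sketch: the frame rate you see is roughly capped by the slower
# of the CPU limit and the GPU limit. All numbers are hypothetical.

def observed_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """The displayed frame rate can't exceed the slower of the two limits."""
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_a = 160  # hypothetical fps a faster CPU could drive with an unlimited GPU
cpu_b = 120  # hypothetical fps a slower CPU could drive

gpu_limits = {"4K Ultra": 70, "1440p Ultra": 110, "1080p Medium": 200}

for scenario, gpu_fps in gpu_limits.items():
    a, b = observed_fps(cpu_a, gpu_fps), observed_fps(cpu_b, gpu_fps)
    print(f"{scenario:>13}: CPU A {a} fps, CPU B {b} fps, gap {100 * (a - b) / b:.0f}%")

# 4K Ultra:      70 vs  70 fps -> 0% gap  (GPU-bound, CPU difference hidden)
# 1440p Ultra:  110 vs 110 fps -> 0% gap  (still GPU-bound)
# 1080p Medium: 160 vs 120 fps -> 33% gap (CPU-bound, the real CPU difference shows)
```

The only scenario that tells you anything about the CPUs themselves is the one where the GPU limit sits above both CPU limits, which is exactly what dropping resolution and settings is meant to achieve.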
 
This article is fantastic! It really helps to point out the lack of articles discussing how to build a balanced system. I would love to see you guys throw something together that explains how to determine when it's appropriate to purchase a new GPU vs upgrading to a new CPU, or when it might be a better consideration to move up in monitor resolution to keep that old CPU relevant.

I have an 8700K and would love to upgrade my 2070 Super. If I were going to stick with 1440p, it's not clear to me what GPU I should consider the maximum that this setup can handle. I know that I could move to 4K and a better GPU at the same time, but even that would have its limits. What can we as consumers do to determine the finality of our current build?
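One rough way to approach that question is to combine two published numbers: the CPU-limited frame rate from a low-resolution CPU review and a candidate GPU's frame rate at your resolution from a GPU review. Here's a minimal sketch of that comparison; the GPU names and every fps figure are hypothetical placeholders, not review data.

```python
# Rough sanity check for a GPU upgrade behind an existing CPU.
# Both inputs are hypothetical numbers standing in for published review results.

cpu_limit_fps = 110  # fps your CPU manages in low-resolution "CPU review" tests of your games

candidate_gpus = {   # fps each candidate GPU delivers at YOUR resolution in GPU reviews
    "GPU X": 90,
    "GPU Y": 150,
}

for gpu, gpu_fps in candidate_gpus.items():
    if gpu_fps <= cpu_limit_fps:
        print(f"{gpu}: ~{gpu_fps} fps, still GPU-bound -> the CPU isn't holding it back yet")
    else:
        print(f"{gpu}: capped near ~{cpu_limit_fps} fps by the CPU -> part of the GPU goes unused at this resolution")
```

In other words, the CPU review's low-resolution number is the ceiling you're buying a GPU to reach; once a candidate GPU clears that ceiling at your resolution, the extra GPU power mostly buys headroom for higher resolutions or future games.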
 
I partially agree with what you wrote in the article. The problem here is that you don't state in your reviews that you ran the benchmarks with the following things in mind:

- You expect that people will upgrade their GPU at least once
- You expect that people will upgrade just the GPU, not the CPU
- You expect that current GPUs behave the same way future GPUs will (today's top-end GPU = top-end GPU three years later)
- You expect that there will be no major improvements in game engines (like core usage; not very long ago you didn't need more than four cores for gaming, etc.)

If any of these don't apply, your benchmarks are pretty much screwed. Let's take the first one. You assume people will upgrade their GPU. Then came both COVID and the crypto boom, and GPU prices skyrocketed. Now, if you made your benchmarks a few years ago expecting that people would upgrade their GPU as in a "normal situation"... 🤦‍♂️
 
Thanks for the article.

Just to point out a flaw in the argument:

Back in the day the differences were small between the i7-8700K and R5 2600X with the GTX 1070. You then measured a 3% difference at 1080p Ultra in AC:O.
You pointed out this was half of the difference measured with a 1080 Ti, a card Nvidia in 2017 claimed to be the fastest gaming graphics card money could buy.

I understand your point: what was then a narrow difference in gaming performance between two CPUs could become a widening gap with modern GPUs over the years to come. So lower resolutions are chosen to inform the reader.

The flaw: how could one calculate the size of the future gap between those two CPUs back then, without today's modern GPUs and without today's modern game engines? It's not possible. The gap could stay narrow, or it could evolve into a serious difference. There are lots of variables there. This, imho, invalidates the whole 8700K/2600X argument in the article.
You can't calculate the future gap; no one can predict the future. However, by at least showing a gap, you know which one is the better CPU, and even if that gap stays narrow, you still bought the better CPU and you're still having a better experience. How does this do anything to the 2600X vs 8700K argument other than reinforce it?
Another point: testing at 1080p Ultra does not help your cause (the 3% difference). 1080p medium or low, or even 720p, would have shown a greater gap between those two CPUs.
Now I kinda get this: if anyone is asking for different resolutions, going down to 720p would be it. But again, no monitors sold today, or for the past what, eight years, have been 720p.
Yes, it would have shown a greater gap, and the fact that you knew this without having to run the test shows the 1080p results did their job and told you one CPU is faster than the other.
Last point, about the writing style: I feel it's not the smartest thing to talk about 'misconceptions' while referring to a portion of your reader base. Obviously you are fighting 'misconceptions' in your own reader base, otherwise it would be pointless to write that emotionally. You don't get that reaction while browsing the comments section of Quora. Yes, maybe some of your readers got things wrong. Maybe, as a writer I guess, one can get very upset when digging through some of the comments, like in this recent TechSpot article:
https://www.techspot.com/review/2615-ryzen-7600-vs-ryzen-5600/
And yes, people sometimes lack experience. Sometimes they are rude. Both can happen simultaneously. But as a professional tech blog, your writers should stay above that. Don't start to mansplain benchmarks.


For instance, please do not write: "When People Don't Understand CPU Benchmarks, Point Them Here". That's the actual subtitle of this article. Just think about that sentence for a minute.
I liked TechSpot's writing style in general up to this point, but this article was not up to your standards, imho. It felt like reading a tweet with graphs.

Just pointing these things out for reconsideration; of course no harm is meant, and this is my personal view. Other views may and will differ.
No, I like this writing style more than the normal boring clinical style. Too often, and especially in the West, media are terrified of calling an egg an egg. This article is for people who can't grasp the incredibly basic testing methodology that's used by all reviewers for a clearly good reason. If you take offence at any of the writing, well... I think that says way more about you than it does about the article.
 
I partially agree with what you wrote in the article. The problem here is that you don't state in your reviews that you ran the benchmarks with the following things in mind:

- You expect that people will upgrade their GPU at least once
- You expect that people will upgrade just the GPU, not the CPU
- You expect that current GPUs behave the same way future GPUs will (today's top-end GPU = top-end GPU three years later)
- You expect that there will be no major improvements in game engines (like core usage; not very long ago you didn't need more than four cores for gaming, etc.)

If any of these don't apply, your benchmarks are pretty much screwed. Let's take the first one. You assume people will upgrade their GPU. Then came both COVID and the crypto boom, and GPU prices skyrocketed. Now, if you made your benchmarks a few years ago expecting that people would upgrade their GPU as in a "normal situation"... 🤦‍♂️
Where did they say any of that? They didn't expect anything. Simply put, the 8700K was faster than the 2700X when they were reviewed. Had they reviewed using higher resolutions, you wouldn't have known the 8700K was faster, which proves using higher resolutions is pointless.

Then showing how each CPU runs today's games on modern GPUs goes to show how the gap can widen over time, making it even more important to show which CPU is faster at the time of review, and it reinforces why using the highest-end GPU at the time of review is invaluable.
 
Where did they say any of that? They didn't expect anything. Simply put, the 8700K was faster than the 2700X when they were reviewed. Had they reviewed using higher resolutions, you wouldn't have known the 8700K was faster, which proves using higher resolutions is pointless.
If we were to compare the 8700K and 2600X at 1080p with the Ultra High preset using the GTX 1070, we'd find that both CPUs are capable of around 70 fps and that's certainly very playable performance for such a title. This would have led us to believe that the 8700K was just 3% faster than the 2600X, less than half the margin we found when using the GeForce GTX 1080 Ti back in 2017.
I say 3% faster or even 5% faster is nothing more than "faster".

Would it still be faster if the user kept the same GPU or ran at higher resolutions?
Then showing how each CPU runs today's games on modern GPUs goes to show how the gap can widen over time, making it even more important to show which CPU is faster at the time of review, and it reinforces why using the highest-end GPU at the time of review is invaluable.
There is a big difference between a CPU review and CPU future prediction. A review may apply solely to today's situation. Future prediction MUST also consider things like CPU upgrade paths. That's the problem: the CPU review expects people to upgrade GPUs but not CPUs. Err, what? :confused:
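To put rough numbers on the quoted 8700K vs 2600X example, here's a back-of-envelope sketch of how a GPU cap compresses a real CPU gap. The CPU-limit figures are invented so the output mirrors the article's ~3% and larger 1080 Ti margins; they are not measured data.

```python
# How a GPU cap compresses a real CPU gap (illustrative numbers only).
cpu_limits = {"8700K": 75, "2600X": 70}            # fps each CPU could drive if never GPU-bound
gpu_limits = {"GTX 1070": 72, "GTX 1080 Ti": 130}  # fps each GPU can render at 1080p Ultra

for gpu, gpu_fps in gpu_limits.items():
    shown = {cpu: min(cpu_fps, gpu_fps) for cpu, cpu_fps in cpu_limits.items()}
    gap = 100 * (shown["8700K"] - shown["2600X"]) / shown["2600X"]
    print(f"{gpu}: {shown['8700K']} vs {shown['2600X']} fps -> ~{gap:.0f}% gap")

# GTX 1070:    72 vs 70 fps -> ~3% gap (the GPU cap hides most of the CPU difference)
# GTX 1080 Ti: 75 vs 70 fps -> ~7% gap (the faster GPU exposes the actual CPU limit)
```

Whether that measured gap stays at a few percent or grows with future GPUs is a separate question; the point is that the slower GPU makes the two CPUs look nearly identical even though they aren't.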
 
Another point: testing at 1080p Ultra does not help your cause (the 3% difference). 1080p medium or low, or even 720p, would have shown a greater gap between those two CPUs.

It would if the render pipeline were handled entirely by the GPU, but it's not. Developers have reduced CPU usage in the pipeline by a substantial amount over the years, but they have yet to reduce it to zero and most likely never will. Running the CPU benchmark with the graphics settings on Ultra simply ensures that the CPU is being fully loaded, by things like the physics and AI engines, as well as the render pipeline.
 
I say 3% faster or even 5% faster is nothing more than "faster".
To some people (myself included) 3% is basically margin of error, or just not noticeable. The examples given here showed way more than that back in the original reviews. That's why lower resolutions are used: if one CPU is considerably better than the other, it will show up in CPU-bound games.
Would it still be faster if the user kept the same GPU or ran at higher resolutions?
If your current PC is CPU-bottlenecked, then yeah, it would benefit from a better CPU. Can your GPU handle a higher resolution? No? Then the CPU won't make a difference. What are you struggling to understand here?
There is a big difference between a CPU review and CPU future prediction. A review may apply solely to today's situation. Future prediction MUST also consider things like CPU upgrade paths. That's the problem: the CPU review expects people to upgrade GPUs but not CPUs. Err, what? :confused:
No one said anything about predicting the future; you've put that into your own head. Reviewers have never expected people to upgrade anything; that's up to the end user to decide after reading a review of said product.

If you read a review, but take it as "predicting the future", you will never understand any review on the internet because you're taking it for something it's not.

You and I have been on TechSpot a long time and we've had this discussion many times now. You seem to believe some kind of "future proofing" exists, which isn't true. You buy the best-performing product you can at the time, and because it does perform better (2700X vs 8700K), it'll continue to perform better into the future. That's not future proofing, that's not prediction, that's just how the world works.

I know you love AMD, so I'll give you an example where AMD destroyed the competition of its day: the 5800X3D was either slightly slower or slightly faster than Intel's best (the 12900K) but was essentially half the price whilst using considerably less power. It's also aged really well, still able to hang with the latest and greatest. The review didn't know that would be the case a year later; it's just that much better than anything else you can slot into the AM4 platform, which they got across in the review.
 
To some people (myself included) 3% is basically margin of error, or just not noticeable. The examples given here showed way more than that back in the original reviews. That's why lower resolutions are used: if one CPU is considerably better than the other, it will show up in CPU-bound games.
So that you get differences years later, once you've upgraded the GPU to something that didn't exist back then? That's predicting the future, not reviewing the product for today.
If your current PC is CPU-bottlenecked, then yeah, it would benefit from a better CPU. Can your GPU handle a higher resolution? No? Then the CPU won't make a difference. What are you struggling to understand here?
When there is no difference, reviews must somehow manufacture that difference using unrealistic scenarios. Again, that is not a review; that is more like predicting the future.
No one said anything about predicting the future; you've put that into your own head. Reviewers have never expected people to upgrade anything; that's up to the end user to decide after reading a review of said product.

If you read a review, but take it as "predicting the future", you will never understand any review on the internet because you're taking it for something it's not.
In the article we are discussing, they are saying exactly that. They expect users to upgrade their GPU and write articles with that in mind. Predicting the future, that's what they're trying to do.
You and I have been on TechSpot a long time and we've had this discussion many times now. You seem to believe some kind of "future proofing" exists, which isn't true. You buy the best-performing product you can at the time, and because it does perform better (2700X vs 8700K), it'll continue to perform better into the future. That's not future proofing, that's not prediction, that's just how the world works.
No, it doesn't. You can say product A performs better today than product B, but if both products have certain strengths and weaknesses (they always do), future performance depends on how those strengths and weaknesses play out in future software. You will need a prediction for that. If you take a product that has one very big strength that does not apply to everything (like the Ryzen 5800X3D), it's very hard to predict how it will perform in the future. In cases where the large victim cache does not help at all, it's even slower than the regular 5800X, but when it helps, it's much faster than the 5800X.

Now, make your prediction: how will the 5800X3D perform against today's Intel CPUs in, say, two years? Better or worse than today? You WILL need a prediction for that.
I know you love AMD, so I'll give you an example where AMD destroyed the competition of its day: the 5800X3D was either slightly slower or slightly faster than Intel's best (the 12900K) but was essentially half the price whilst using considerably less power. It's also aged really well, still able to hang with the latest and greatest. The review didn't know that would be the case a year later; it's just that much better than anything else you can slot into the AM4 platform, which they got across in the review.
You just named one major problem. Reviews try to predict the future but fail miserably, because there might be something coming in the future they have no idea about. When TechSpot reviewed the 2700X and predicted future GPU performance, they also should have considered the 5800X3D as a future upgrade for the 2700X's platform. You might say they had no idea about the 5800X3D because it didn't exist at the time, but that doesn't change the fact that they failed to predict the future. If predicting the future is so hard, why do they try to do it? Why not concentrate on today's performance, since that can actually be done?

"We cannot predict future but we still make our reviews with future predictions in mind" :D
 
You can't calculate the future gap; no one can predict the future. However, by at least showing a gap, you know which one is the better CPU, and even if that gap stays narrow, you still bought the better CPU and you're still having a better experience. How does this do anything to the 2600X vs 8700K argument other than reinforce it?

No, it's not reinforced at all.
Let's break it down and start with what you said:
'You can't predict the future.' 100% agreed! So the argument (detecting in 2017 that the 8700K would be the significantly better CPU over time, then buying it) is completely invalidated. It's a luck game: you can get a long-lived CPU, like the legendary 4790K that will last you 5, 6 or 7 good years, or a real dud, 'Krappy Lake', that needs replacing very fast.

Secondly, if you don't upgrade your GPU soon after your CPU purchase, your experience won't be a better one for years to come. As a matter of fact it will be basically the same gaming experience; nothing changes. Again, we are talking about a 3% difference between the 8700K and 2600X with the GTX 1070 at 1080p on high settings. You won't be able to tell which is which by playing both rigs without an FPS counter on.

Yet Steve's article suggests that, in retrospect, the 8700K aged far better than the 2600X (which it has, but who could have known that in 2017?), and that you could detect and calculate those probabilities by deep-diving into the 1080p benchmarks (which, of course, you can't, as you said already). This is the flaw in the argument. I hope I made my point clearer.
 
Reviews try to predict the future but fail miserably, because there might be something coming in the future they have no idea about.
Never mind, you definitely are someone who simply will never understand what a review is and you will never be satisfied with any review on the planet.

Unless it outright lies, of course. I'm sure if a reviewer put up a review that simply read "the 7600X is faster than anything else right now and will be for the next 10 years", you'd be ecstatic.
 
No, it's not reinforced at all.
Let's break it down and start with what you said:
'You can't predict the future.' 100% agreed! So the argument (detecting in 2017 that the 8700K would be the significantly better CPU over time, then buying it) is completely invalidated. It's a luck game: you can get a long-lived CPU, like the legendary 4790K that will last you 5, 6 or 7 good years, or a real dud, 'Krappy Lake', that needs replacing very fast.
It's a luck game? So you reckon there was a chance the much slower 2700X would be faster than the much faster 8700K in the future?
Secondly, if you don't upgrade your GPU soon after your CPU purchase, your experience won't be a better one for years to come. As a matter of fact it will be basically the same gaming experience; nothing changes. Again, we are talking about a 3% difference between the 8700K and 2600X with the GTX 1070 at 1080p on high settings. You won't be able to tell which is which by playing both rigs without an FPS counter on.
Steve never used the 1070; where did you get that information? They used the 1080 Ti (the best GPU at the time) to review the CPUs. If your GPU is already your bottleneck, of course a CPU upgrade won't do anything. That's basic logic; what are you struggling with here?
Yet Steve's article suggests that, in retrospect, the 8700K aged far better than the 2600X (which it has, but who could have known that in 2017?), and that you could detect and calculate those probabilities by deep-diving into the 1080p benchmarks (which, of course, you can't, as you said already). This is the flaw in the argument. I hope I made my point clearer.
Suggests where? Where does he suggest you can predict the future? The 8700K and 2600X were used because they're more extreme examples. Precisely because they couldn't have predicted how much better the 8700K would age, it goes to show how important it is to use lower resolutions, the best GPU and CPU-bound games to really show the performance difference between CPUs. If one CPU performs considerably better today (2017) than another, it will continue to be that way until the end of time. We could do this test in the year 2953 and the 8700K would still be the better processor.

The point of the article is to show that if you try to measure in any other way, as lots of comments suggest (using a 1070, for example, or running at higher resolutions), you are providing false information, as both CPUs would appear to be the same performance-wise when they just aren't.
 
Despite all this, some a-holes still insist on seeing 4K GPU benchmarks in CPU measurements.

To tell the truth, the comments section shouldn't be full of debates questioning the principle when the article is the explanation itself. Empty cans rattle the most.

Looks like Steven's effort is still falling on deaf ears.
 
The comment section validates the article entirely, thank you to everyone :D

This is one of the best tech articles I've ever read. Even though I've read CPU reviews as intended, seeing the absolute max fps like that really helps.

Definitely gained a new reader; looking forward to the upcoming 7950X3D review.

Would AMD Smart Access Memory be a useful feature for a CPU review, or is that more suited to a GPU review?
 
Face the FACTS, folks. Preventing the GPU from doing the heavy lifting as much as possible - which requires lower resolutions and settings to achieve - is the best way to isolate CPU performance. This isn't rocket science. It's not a buffet. It is isolating ONE component and evaluating its performance, relative to others.

CPU reviews are exactly that: CPU reviews. They are not a "system" review, nor are they meant to be recommendations on which CPU and GPU combos are best paired for a given resolution.

If you need recommendations on pairing CPUs and GPUs, there are hundreds of articles on the subject.

School those noobs, Steve!
 
I partially agree with what you wrote in the article. The problem here is that you don't state in your reviews that you ran the benchmarks with the following things in mind:

Let's go through these, shall we?

- You expect that people will upgrade their GPU at least once

Not really; the review is answering one specific question: which CPU is more powerful. Even if you never change the hardware at all, the reviews still show that CPU A is faster than CPU B when the GPU is removed from the equation. That may or may not matter to the end user, but it's still important to know for those users who WILL do GPU upgrades down the road, as it informs them which CPU would be expected to perform better down the line.

- You expect that people will upgrade just the GPU, not the CPU

Which is how most people upgrade; most people build a platform, then do a couple of GPU upgrades until the platform is no longer viable. Obviously, if you are planning on replacing the CPU in the short term, the need to know which CPU will perform better in the long term is N/A. But the information provided is quite relevant for the majority of users.

- You expect that current GPUs behave the same way future GPUs will (today's top-end GPU = top-end GPU three years later)

They typically will, barring a complete rethinking of how rendering is done. Simple example: an Ivy Bridge is still going to perform better than Bulldozer in gaming, despite the fact we've gone from a strictly linear non-threaded graphics pipeline (DX9/OpenGL) to a fully multithreaded one with ray tracing support (DX12/Vulkan). One CPU is simply better than the other, despite the massive amount of changes made to GPU rendering over the past decade.

Obviously, if these fundamental assumptions change, then all bets are off. But in the short term (10+ years) they are safe assumptions to make.

- You expect that there will be no major improvements in game engines (like core usage; not very long ago you didn't need more than four cores for gaming, etc.)

See my above point on Ivy Bridge vs Bulldozer.

Now yes, there is the argument in the longer term that, as programs get better at using more cores, CPUs with more cores will tend to perform better; a classic example would be the E8600 vs Q6600 debate. But we're basically at the point where adding more cores produces minimal benefit outside of specific desktop software that scales practically to infinity; games in particular aren't going to scale in a way where continuing to throw cores at them affects performance [barring cases where other applications, like streaming apps, are also running; that would actually be a *very* good test case to start testing with...].
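The diminishing-returns point about core counts can be put in rough numbers with Amdahl's law. The parallel fractions below are assumptions chosen for illustration, not measurements of any real game engine.

```python
# Amdahl's law: overall speedup is limited by the serial share of the work.
# The parallel fractions are illustrative assumptions, not engine measurements.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup over one core for a given parallelizable fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.50, 0.80):              # assumed parallelizable share of a frame's CPU work
    for cores in (2, 4, 8, 16):
        print(f"p={p:.0%}, {cores:>2} cores: {amdahl_speedup(p, cores):.2f}x")

# p=50%:  2 cores 1.33x, 4 cores 1.60x, 8 cores 1.78x, 16 cores 1.88x
# p=80%:  2 cores 1.67x, 4 cores 2.50x, 8 cores 3.33x, 16 cores 4.00x
# Each doubling of the core count buys noticeably less than the one before it.
```

That curve is the shape behind the E8600 vs Q6600 style debates: extra cores keep helping only while the workload's parallel share (plus background tasks like streaming) is large enough to use them.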
 
Not really; the review is answering one specific question: which CPU is more powerful. Even if you never change the hardware at all, the reviews still show that CPU A is faster than CPU B when the GPU is removed from the equation. That may or may not matter to the end user, but it's still important to know for those users who WILL do GPU upgrades down the road, as it informs them which CPU would be expected to perform better down the line.
"More powerful CPU" is not an absolute thing. Remember the Pentium G3258? Overclocked, it was considered even faster than the i7-4790K if you looked at benchmarks. However, it only has two cores. Guess what happens when benchmarking games that utilize more than two cores? Is it expected to perform better in the future too?
Which is how most people upgrade; most people build a platform, then do a couple of GPU upgrades until the platform is no longer viable. Obviously, if you are planning on replacing the CPU in the short term, the need to know which CPU will perform better in the long term is N/A. But the information provided is quite relevant for the majority of users.
The information might be relevant, but like I stated above, there is a difference between a review and future prediction. If it's future prediction, then everything should be considered. If it's a review for today, then it's OK to concentrate on today's performance. What TechSpot seems to do, according to the article, is partially (but only partially) predict what the future will be.

This summarizes my point:
Of course, we're not saying the 8700K was the better purchase, as it was a more expensive CPU and the Ryzen 5 was a better value offering...
TechSpot is trying to say CPU A is better for the future, but at the same time refuses to say which CPU is the better buy for the future. What? :confused:
They typically will, barring a complete rethinking of how rendering is done. Simple example: an Ivy Bridge is still going to perform better than Bulldozer in gaming, despite the fact we've gone from a strictly linear non-threaded graphics pipeline (DX9/OpenGL) to a fully multithreaded one with ray tracing support (DX12/Vulkan). One CPU is simply better than the other, despite the massive amount of changes made to GPU rendering over the past decade.
Leaving the GPU out of the equation, I disagree. I played quite a lot of a game that uses all available CPU cores but puts minimal stress on the GPU. Because it utilizes all available cores, it will slow down the computer IF it has all cores available. Solution? Give it only a few cores.

You do want to leave at least two cores for other applications. With a quad-core Intel, you can give it only two cores. With Piledriver you could give it six out of eight. Now it's six FX cores vs two Ivy Bridge cores. I doubt Ivy Bridge would be the clear winner. That is gaming basically without any GPU stress.
Obviously, if these fundamental assumptions change, then all bets are off. But in the short term (10+ years) they are safe assumptions to make.
Like the Pentium G3258 being a future-proof gaming CPU because it performed well when it launched?
See my above point on Ivy Bridge vs Bulldozer.

Now yes, there is the argument in the longer term that, as programs get better at using more cores, CPUs with more cores will tend to perform better; a classic example would be the E8600 vs Q6600 debate. But we're basically at the point where adding more cores produces minimal benefit outside of specific desktop software that scales practically to infinity; games in particular aren't going to scale in a way where continuing to throw cores at them affects performance [barring cases where other applications, like streaming apps, are also running; that would actually be a *very* good test case to start testing with...].
And if the Ivy Bridge is only a dual-core part, then what? You seem to assume it's a quad-core that was more expensive than the 8-core Piledriver.

I agree that today we mostly have enough cores for gaming, but again, dual cores were not enough for the future, no matter what benchmarks at the time said. Just like with single core vs dual core when dual-core CPUs came out: single-core CPUs usually dominated benchmarks, but in reality dual cores were much more usable and very soon the faster option. A good example of benchmarks not telling us anything about the future.
 
As a car reviewer, are you only going to review how well a Jeep does off-road? Only test how well sports cars go in a straight line?

You want to remove bottlenecks. Cool. Do it. 1080p. But....
I think what initiated this was the 1080p-ONLY testing results with "overkill" hardware. One comment in the first image in this article is complaining about exactly that, and that's what I had a big problem with. Those are all the people who have no use for 1080p-ONLY results. Those are the keywords here: 1080p ONLY.

It's a review, right? Test the more unlikely configuration to show best performance, but also test the more likely scenario, especially for that type of CPU and GPU combo, by upping the resolution, my man! There is no need for guesswork. Tim doesn't even touch 1080p monitors in his MUB reviews. We're getting excited about higher refresh rates almost weekly. OLED. HDR. 4K. 8K. 1440p+ is just where the industry is going. Omitting it just doesn't make sense.

Aside from that, I also had the idea of a tech site eventually throwing a popular mainstream build into the mix (included in reviews or as a separate video series), with say an i3/i5/R5 + 16GB 3200MHz CL16 / DDR5 6000MHz paired with new low- to high-end GPUs, and vice versa? Rigs closer to what the majority actually have. I feel it would help a lot of people with upgrading, in addition to the standard reviews.


The thing is, TechSpot does MANY reviews across different criteria. That ONE review cannot have all the data. That is why you have to review EVERY product that makes up your (someone's) particular system.

That component review is NOT how it will perform in your rig at home... it's how it performs in relation to the other components tested HERE.

Your scores will vary.
 
As a car reviewer, are you only going to review how well a Jeep does off-road? Only test how well sports cars go in a straight line?

You want to remove bottlenecks. Cool. Do it. 1080p. But....
I think what initiated this was the 1080p-ONLY testing results with "overkill" hardware. One comment in the first image in this article is complaining about exactly that, and that's what I had a big problem with. Those are all the people who have no use for 1080p-ONLY results. Those are the keywords here: 1080p ONLY.

It's a review, right? Test the more unlikely configuration to show best performance, but also test the more likely scenario, especially for that type of CPU and GPU combo, by upping the resolution, my man! There is no need for guesswork. Tim doesn't even touch 1080p monitors in his MUB reviews. We're getting excited about higher refresh rates almost weekly. OLED. HDR. 4K. 8K. 1440p+ is just where the industry is going. Omitting it just doesn't make sense.

Aside from that, I also had the idea of a tech site eventually throwing a popular mainstream build into the mix (included in reviews or as a separate video series), with say an i3/i5/R5 + 16GB 3200MHz CL16 / DDR5 6000MHz paired with new low- to high-end GPUs, and vice versa? Rigs closer to what the majority actually have. I feel it would help a lot of people with upgrading, in addition to the standard reviews.

Computers aren't cars, and you're still not understanding it, lol.
 
Computers aren't cars, and you're still not understanding it, lol.
If the argument were that there was no use for 1080p results at all, then this article would be a big waste of time, because of course there is. But 1080p only, to represent them all? NO.
 