The Best Entry Level Gaming CPU: Athlon 200GE vs. Pentium G5400 vs. Ryzen 3 2200G

Normally I like these reviews, but this one is a bit strange. When dealing with very low-end stuff, I personally think it's way more useful to add 1080p Medium and High benchmarks and a mid-range dGPU instead of over-benchmarking 1440p / 4K on 2080 Tis. In many games, "High" is typically the best-optimised preset, whilst Ultra is often "let's see how much stupid sh*t like Chromatic Aberration, crippling 64x Tessellation and HairWorks we can cram in for the sake of it". Literally no one buys these chips to pair with a $1,500 dGPU and then argue over 19 vs 25 vs 33 min fps in Hitman. (Yes, I know you need to eliminate bottlenecks, but for low-end budget builds, target-market relevance often becomes more important, and last year's budget "scaling" comparison was definitely a lot more useful as a buyer's guide for the vast majority with 1050/1060-class dGPUs.)

Same goes for the choice of games; those of us who actually buy these chips do so for older / indie games, or at least a wider mix. E.g., I grabbed a Pentium G4560 for just £39, threw in a £129 1050 Ti for a retro rig build, and am getting 270fps in Portal 2 and 120fps in Bioshock Infinite / DX:HR / Dishonored / Skyrim / The Witness / Divinity: Original Sin / Talos Principle, etc., with a GPU costing 1/10th of the review's 2080 Ti. A GTX 1060 bumps those 120fps games up to nearer 180-200fps. For those who buy a premium 4K monitor specifically for the heaviest, newest and worst-optimised $60 AAAs, I honestly think everyone already knows to aim for an i5-8400 / R5 2600 minimum rather than skimp on that last $50.
 

The disclaimer you're looking for is right above the Benchmarks title. This is not a PC build guide; it's a CPU comparison review, and you want to remove all the other possible bottlenecks. That's how it's done and that's how it will continue to be. It's the other way around for GPU comparisons: regardless of GPU tier, the best CPU for gaming is used, with demanding game settings to push the limits of each CPU/GPU tested. There's no point in 720p lowest settings when you can hit an FPS wall imposed by the game itself across all the compared test subjects and, because of that, not be able to observe any difference between them.
 
The disclaimer you're looking for is right above the Benchmarks title. This is not a PC build guide; it's a CPU comparison review and you want to remove all the other possible bottlenecks.
Yes, and I already mentioned that I understood bottleneck elimination in my post. My main point was that 1080p Med vs High vs Ultra would have added a lot more useful data points for budget gamers, as did last year's wider mix of heavy / light games. It may well be "just" a budget CPU comparison instead of a budget build, but in both cases there are some things people simply don't do in the real world. My suggestion is that a lot of people who buy low-end do so for reasons other than wanting really bad fps on enthusiast settings in new games. They are increasingly skipping over "Ultra-only" tech site reviews that end up arguing over unplayably low frame rates, and heading straight for more common-sense, side-by-side, time-synced comparisons that help them optimise and avoid that in the first place.

Example for GTA V on a 1050 Ti: one of these presets is disproportionately less well optimised than the others (and the other three are disproportionately more likely to be used by budget gamers running the same hardware being reviewed). Think about who the target market is, and what and how they actually play in real life.
 
So, "I read the disclaimer and continue to ignore it so I can complain about how the benchmarks were done and how they were not 100% what I wanted to see".
 
So, "I read the disclaimer and continue to ignore it so I can complain about how the benchmarks were done and how they were not 100% what I wanted to see".
For those who are struggling with reading comprehension, here's what I'm saying in fewer words: if I were limited to a $100 CPU, rather than mess about with 18-22 min fps, I'd rather turn down the settings. That's called "common sense". Now, if review A only benchmarks Ultra, whilst review B benchmarks all presets, then I'll read review B, because the whole dataset of review A becomes pointless. The review is accurate for what it does (including the disclaimer); it's just far less helpful data than many other reviews. Kind of like how enthusiasts are wrapped up in their own epeen with "Can it run Crysis?", whilst everyone else just rolls their eyes, nudges the settings down a notch and struggles to see the difference. Your language skills aren't THAT bad guys... ;)
 
If you live in the US, Micro Center's had the 2200G on sale for $79.99 for months. It used to be that price was only available in-store, but they apparently dropped that policy this year and you can get it online. At that price, there doesn't seem much point in even looking at the other two chips.
 
I get you, believe me, but like @EEatGDL tried to explain, this is not meant to serve as a build guide that explains the exact frame rates you can get from a budget PC using any of these three processors. You can infer some of that, and you can look back at our individual reviews and previous tests for that, too.

But when we run this kind of comparison we prefer to remove all bottlenecks and... actually, we wrote a full article explaining why :).

How We Test: CPU Gaming Benchmarks
or: How I Learned to Stop Worrying and Benchmark using the GTX 1080 Ti
A highlight quote from that article: "When testing new CPUs we have two main goals in mind: #1 to work out how it performs right now, and #2 how ‘future-proof’ is it. Will it still be serving you well in a year's time, for example?"

I think we'll be bumping that next week for #tbt *nerd*
 
I get you, believe me, but like @EEatGDL tried to explain this is not meant to serve as a build guide that explains the exact frame rates you can get from a budget PC using either of these three processors.
Sure, I understand that, Julio. My comment was intended as positive feedback rather than negative, but whereas a top-end GPU eliminates GPU bottlenecks, presets obviously impact the CPU as well. Budget gamers make a lot more use of them, yet few tech sites benchmark them (including supporting "build" articles). It's simply been noticeable that a lot of the newer, faster-growing YouTube reviewers are focusing on optimising overall playability, and increasingly becoming the "go to" place for "How will X CPU + Y GPU work on Z game, and what can I do to improve that?" all in one place. Perhaps an idea for a future article? Likewise, the suggestion of a wider choice of "middleweight" games is simply a reflection of what most people are actually playing and streaming.
 
It's OK if some presets use more CPU than others, as long as all the CPUs were tested with the same presets. The CPU with the better $-to-fps ratio will be the same.
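That $-to-fps argument can be sketched in a few lines. All prices and frame rates below are invented purely for illustration (they are not figures from the review):

```python
# Hypothetical CPUs with made-up prices (USD) and made-up average fps,
# all assumed to be measured under the SAME preset for every chip.
cpus = {
    "Athlon 200GE": {"price": 60, "fps": 50},
    "Pentium G5400": {"price": 75, "fps": 55},
    "Ryzen 3 2200G": {"price": 100, "fps": 70},
}

def fps_per_dollar(entry):
    """Value metric: frames per second delivered per dollar spent."""
    return entry["fps"] / entry["price"]

# Rank chips from best to worst value.
ranked = sorted(cpus, key=lambda name: fps_per_dollar(cpus[name]), reverse=True)
for name in ranked:
    print(f"{name}: {fps_per_dollar(cpus[name]):.3f} fps/$")
```

Whether the ordering actually survives a preset change depends on how each chip's fps scales with that preset, which is precisely what the thread is debating.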
 
... and am getting 270fps in Portal 2 and 120fps in Bioshock Infinite / DX:HR / Dishonored / Skyrim / The Witness / Divinity: Original Sin / Talos Principle, etc., with a GPU costing 1/10th of the review's 2080 Ti. ...
I've never understood this xxx-fps fetish, as it makes zero sense; thus, testing it likewise makes zero sense. I also have a 1050 Ti, and you need to be playing at 720p or low settings to get 120 fps. Really enjoyable.
 
I have to agree with BSim500 on this. Comparing with a high end card is of academic interest, but it's not all that useful for buyers looking to optimise their low end build.

Think of it from the point of view of someone in a country where salaries are a tenth of what they are in the West. For such a person, $40 is like $400 to you. Those $40 may be the difference between having an SSD or not, or between a GeForce 1050 2GB and a GeForce 1050 Ti 4GB.

When making such buying decisions, if the GPU is already a bottleneck, then spending $40 extra on the CPU is like throwing it away. Without benchmarks that take into consideration the actual hardware a budget user will have, and game settings relevant to such hardware, it will be hard for such a user to make a good buying decision.
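The budget trade-off being described can be illustrated with a tiny enumeration. All component prices here are invented for the sake of the example, not real market prices:

```python
# Invented prices (USD) for illustration only; real prices vary by market.
budget = 180  # total available for CPU + GPU

cpus = {"Athlon 200GE": 60, "Ryzen 3 2200G": 100}
gpus = {"GTX 1050 2GB": 80, "GTX 1050 Ti 4GB": 120}

# Enumerate every CPU+GPU pairing that fits within the budget.
combos = [
    (cpu, gpu, cpu_price + gpu_price)
    for cpu, cpu_price in cpus.items()
    for gpu, gpu_price in gpus.items()
    if cpu_price + gpu_price <= budget
]

for cpu, gpu, total in combos:
    print(f"{cpu} + {gpu}: ${total}")
```

With these made-up numbers, the cheaper CPU frees up exactly enough money to step up the GPU, which is why a reader needs benchmarks of both pairings to decide where the $40 does the most good.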

It's possible that a 2200G would still be considerably faster than a 200GE with a low end card and lower settings, but it's also possible that it won't. Which is why this benchmark isn't really helpful in that case.

So yes, this article is of some general interest, but it's not of much practical help.
 

It makes sense; however, everyday practice proves it wrong. Saving money on the CPU is something you will regret in two or three years. Those who saved the money and bought dual cores (Pentiums, i3s) instead of quad cores are now pushed to upgrade, as four cores are a must nowadays; many games prove this. And just three years ago, many Pentiums and i3s were sold. But today, they are out of breath in games like BF1 and others that use more threads.
Granted, the situation was completely different then: CPU progress, thanks to Intel, was steady as *** and it looked like four cores would be with us for another 20 years. But getting more on the CPU is more worthwhile, as the CPU is more complicated (depending on platform/socket) to change.
 
CPU is more worthy, as it is more complicated (depends on platform/socket) to change.

For AMD, at least, I think it's rather easy. If a 200GE is enough now, but won't be in 2 years, it shouldn't be a problem to buy a new low end CPU then. I think there's a good chance that by 2020 we'd have low end AMD CPUs with more cores.

As for the point itself, it depends on when you plan to upgrade the GPU. Unupgraded, the GPU will likely remain a bottleneck for future games, even if they want more cores. I think that's the case even today -- that is, games which need more cores typically also need more GPU power.

I think that when optimising over time, as long as you can use existing board for a future CPU upgrade, buying a somewhat better GPU is more useful than buying a somewhat better CPU, because the GPU is the more expensive component.

(If going for Intel, it's possible that indeed it's worth paying more for the CPU up front, because upgrades will be harder.)

Of course, this all hinges on the assumption that a 200GE is enough when paired with, say, a GeForce 1050 Ti and running at a suitable quality for decent FPS. It may be that the 2200G is still significantly better even in this configuration, or even that a 2200G + 1050 is better than a 200GE + 1050 Ti. Who can say? And that's what's missing from this article.
 
If you live in the US, Micro Center's had the 2200G on sale for $79.99 for months. It used to be that price was only available in-store, but they apparently dropped that policy this year and you can get it online. At that price, there doesn't seem much point in even looking at the other two chips.

I was literally just about to say that.
Don't forget they also give you another 30 bucks off if you buy a motherboard too. I bought my son a 2200G and a B350 Tomahawk for 130 bucks total several months ago. I feel bad for people who don't have one near them.
 

Who is this review meant to serve, though? People who want to know the theoretical maximum performance of entry-level CPUs for curiosity's sake only? People who would actually consider buying them? If buyers, how do they translate this into useful information? How much does that theoretical peak performance with a 2080 Ti actually translate into real performance when I get the mid-range GPU I might actually consider? Am I just supposed to think "higher, better, more future-proof" and that's it? If a GPU review with an 8700K + GTX 1050 matches the performance of a CPU review of a 200GE + RTX 2080 Ti, does that mean the 200GE + GTX 1050 is a well-matched combo? Are there varying driver overheads or settings issues with higher-end cards that could skew the results? Is it possible to have the necessary information in one review, so I don't have to go searching through several reviews just to interpret the data?
 
Without repeating myself, or the article about how we test (and why), I'll paraphrase this specific article's conclusion...

Who is this review meant to serve though?

If you want to spend $100 or less on a CPU for gaming, the Intel Pentium is crap at its current price, which is well over the MSRP.

If you can spend the full $100, buy the 2200G, that's your best bet at this price point.

If you can't spend that much (an extra $40), then you're not adding a dGPU and only require the basics; for that, the Athlon is pretty decent for sure.
 
Nice to see these reviews on TechSpot; you don't see them on other sites often. But I think a GTX 1050 would have been a better choice here, along with integrated graphics. Yes, we already know the results, but people new to computer hardware, who only know that "Intel is better than AMD", will easily assume that this is also the case for integrated graphics. A few words in the review will probably get lost in the "fast-forward, looking only at charts" way most people "read" reviews.
 
None of the FPS scores you have here are realistic. A person spending £1,500 on a graphics card is not the same person spending £300 on an entire entry-level system. It's a bit like saying, "here are three skateboards, so we put a Porsche engine on them because 'reasons'".

I want to know the specifics of what FPS each chip is capable of, not the 2080 Ti. The fact is that nowhere have you made any attempt to explain that the onboard Vega graphics cores on both of the AMD chips are MILES BETTER than Intel's integrated graphics. I'm currently in the market for a cheap setup to build a new HTPC, but this review is telling me "don't bother, the 200GE is rubbish", when it is in fact perfectly acceptable for watching Blu-rays and streaming services, and will run Flash games and Steam games just fine, at a price point that makes sense. There's no point making disclaimers about the review after the fact; or was the title of the article just clickbait?
 
If I were gaming on a budget, I would rather have a quad-core i3 than any of the budget parts here. The i3s are cheaper in the UK than the prices quoted in this article, and are clearly superior for gaming.
 

With no dGPU added? I don't think so.
 