Core i5-8400 vs. Overclocked Ryzen 5 1600

@Theinsanegamer
This article was testing cpus that are available RIGHT NOW, with games that are also available RIGHT NOW, not games 5 years from now.
Really? Because those 720p results try to draw conclusions about 5 years from now, using games that are available today. Isn't that supposed to be the reasoning behind testing at 720p? That's what I am saying. You can't draw conclusions about what a processor will be capable of in 2-3 years when that processor comes with the minimum number of cores considered necessary today for avoiding bottlenecks. The 8400 will start dropping down the charts pretty fast from next year, when Intel starts selling 8-core mainstream processors. And while it will probably never fall to Ryzen levels, which have their own limitations in gaming, the difference will be much smaller than what is shown here.

@BSim500
The review is already unfair when 720p gets all the attention, for the reasons I already posted. And even if we agree that Civ VI is an anomaly, you can't start finding excuses for why Intel is losing in that benchmark when, in the previous benchmark with the same GPU and the same API, you didn't feel the need to post a list of possible reasons for the results.

@dirtyferret
We just started seeing the first 6-core Intel processors. 4 cores are dead and everyone knows it, probably even you. Your second chart shows what I am talking about: just compare the Pentium with the i3 results. Huge difference in the minimum FPS.

720p results "can" show the future performance of a processor, but it is never guaranteed, nor is future performance something you can guarantee. In essence, 720p shows the maximum potential of a processor, so long as it isn't bottlenecked by cores, instructions, or cache. It's very possible that games in the future will rely on more cores and essentially make the IPC conclusions at 720p pointless, because these processors will be bottlenecked by their core count. It's also possible games start using a newer instruction set, and all CPUs without it are forced onto a slower code path.

In any event, 720p benchmarks are completely fine.
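To illustrate the point (a rough sketch with made-up numbers, not figures from the article): what you measure is roughly the lower of the CPU's frame-rate ceiling and the GPU's, so dropping to 720p lifts the GPU ceiling out of the way and lets the CPU ceiling show through.

```python
# Illustrative sketch of the bottleneck model behind low-resolution CPU testing.
# All numbers are hypothetical; they are not taken from the article's benchmarks.

def observed_fps(cpu_fps_limit, gpu_fps_limit):
    """The slower of the two limits dictates what you actually measure."""
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_a, cpu_b = 160, 120          # hypothetical CPU-bound frame rates
gpu_720p, gpu_1440p = 300, 110   # hypothetical GPU ceilings at each resolution

# At 720p the GPU ceiling sits far above both CPUs, so the CPU gap is visible.
print(observed_fps(cpu_a, gpu_720p), observed_fps(cpu_b, gpu_720p))    # 160 120
# At 1440p the GPU ceiling masks the gap; both CPUs read the same.
print(observed_fps(cpu_a, gpu_1440p), observed_fps(cpu_b, gpu_1440p))  # 110 110
```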
 
Actually a very poor review. The test should have had 16GB of memory in all three machines. Some game titles benefit significantly from added memory, even going from 16GB to 32GB. Since the machines did not have equal memory, the results are complete garbage. I expect that if all three machines had 16GB of memory, the FPS results for the Intel machines would be much lower and they would all be in the same ballpark.

Cheers, thanks for the light-hearted humor. Just yesterday I was benchmarking with any one of these games and I was like, why the hell are they using over 16GB of memory? Now we know this is normal and these titles benefit significantly from added memory, even going from 16GB to 32GB.
 
No reviewer who takes their job seriously will test CPUs at 4K; I don't have the words to describe how dumb that is. As for the low resolution testing, either get over it or go away; we're all sick of reading this nonsense.
I have no problem with low resolution testing. Actually, I agree that it's the only way to test a CPU. The problems I have are with the conclusions drawn from the testing.
Then why don't you draw your own conclusions, instead of whining about Steve's? You obviously feel that you're his intellectual equal on this subject, after all.
 
Boohoo, I'll hafta' hope that the R7 1700 will perform on par with the R5 1600.. poor me.. (per my last comment).
Thanks Steve, your patience is exemplary as always.
(Suggest you boilerplate the last several paragraphs with variables from the article titles.. it won't help the my-way-or-die commenters, but might save you time and calluses, lol)
 
You know, either way I would be happy to own a Ryzen 5 1600 or an Intel i5-8400. Beautiful new CPUs that rock everyone's gaming world.
 
Another good review again, Steve. 1% mins (in addition to avg fps) plus multiple resolutions should really be the standard for all tech sites.

Here's an idea for a future budget article: by how much do Ryzen and Coffee Lake differ on older games? I'm thinking pre-2014 stuff like the Bioshock Trilogy, Deus Ex: Human Revolution, Half-Life 2, Portal 2, Oblivion / Skyrim, etc. Hell, it's that time of year when I get more of an urge to fire up Amnesia: The Dark Descent (2010), SOMA (2015) or have a blast at FEAR 1 (2005) again than to struggle to maintain interest in some recent 2017 titles. A lot of us play a wider mix of old & new games than most mainstream tech sites tend to represent, and I haven't seen a single site do something like an R3 1300X vs i3-8100 comparison for older games outside of the usual "12-game bubble" (BF1, Civ VI, GTA V, Hitman, Overwatch, Tomb Raider, Witcher 3, etc).

In theory "all old games will run fine on modern CPUs"; in practice, some open-world titles like Morrowind / Oblivion / Operation Flashpoint with large draw distances can absolutely bring a modern CPU to its knees by maxing out only 1-2 cores. In theory Intel would win on IPC (based on Cinebench 1T scores), but the question is by how much those synthetics scale to older 1-4 core games in actual practice. It would be interesting to do something different and pick a few oldies to test as part of a budget gaming article series.

Actually this has already been documented in YouTube videos and on some websites. Ryzen does pretty badly, as most older games were designed to run on a single core with high IPC and high clock speeds. Don't take my word for it, just google it.
 
At 2560x1080 with a GTX 1070, which of those two CPUs would you recommend? My PC is only for gaming.
I'm going for the R5 1600 for the socket longevity; AM4 will be compatible with Zen+ (coming in Feb) and will live on up to 2020, but Intel changes sockets every 2 generations or 2 years (sometimes less).

Greetings

Everyone is talking about longevity this and that, but how often do you update your platform? For real, unless you're a hardcore enthusiast you probably do it every 3 to 4 years, and a new socket will be out by then, so it makes for a very poor excuse. No matter where you end up going, do yourself a favor and don't cheap out on the motherboard, and by that I mean a B350. If you want to brag so much about "longevity", then why buy the cheapest? Don't you want to keep it around for 4 years? Wouldn't an X370 or Z370 provide the best features and quality components? I simply don't get it. Say what you may about the prices, I think of it as an investment, and anyway nothing new is on the immediate horizon: no DDR5, no PCIe 4.0, not much of anything, so what you get now is probably what's going to stick with you for another 3 years.
 

If it can play new games, it will def play old games.

Why waste time telling us what we should already know?
 
More or less, that's the idea, but when I write it, I am an AMD fanboy hijacking the comments. When you write it, well, that's different.

720p can give an indication, but only based on today's games, APIs, programming habits, processors on the market, developers' target groups, what hardware companies are trying to push, whatever. And that's the problem. All these things change, and all the conclusions made today can fall apart later. Especially when we are talking about a processor that will be the absolute minimum for a mainstream system tomorrow, just like a quad core was yesterday. Who buys a quad-core Kaby Lake today when there are Ryzen 5 and the 8400 at the same price? Only someone who doesn't read tech news. By the time Intel is able to ramp up production of Coffee Lake, quad-core i5s will be the modern i3s. But 6 months ago a quad-core Kaby Lake looked super future proof for gaming.

The problem with 720p is that, while you explain how the results are not a guarantee and give a number of reasons why things could change in a way that makes today's 720p results pointless in the future, the articles are written as if 720p is the alpha and the omega. And that's the problem. 1080p, 1440p and 2160p results look more like results you check AFTER buying your system, because, you know, GPU bottleneck, just to see where your new system stands. On the other hand, 720p seems more like results used to push "future proof" sales, with articles giving extra significance to those numbers.

Let me say something here. Intel processors ARE the way to go for pure gaming. But the 720p results are used to exaggerate their advantage in current or older games and also to create the impression that that advantage will remain unchanged in the future and in future games. Does anyone really believe that people reading those 720p results don't expect their new 8400 to keep killing a Ryzen 1600 in 2-3 years, just as it does in those results? Let's not forget that the only example where Ryzen wins is dismissed as an anomaly.
 

I’d just like to point out that this is a very narrow-sighted view of the matter. High refresh rate gaming is becoming very popular, and a big part of enabling high fps performance rests on the CPU and memory speed. Most high refresh rate gamers sacrifice quality in order to achieve greater performance.

I’m not talking about resolution but rather quality settings. Generally speaking, those with 1080p 144 - 240 Hz monitors will game at medium or even low quality settings to achieve high frame rates. They do this for a number of reasons, namely to gain a competitive edge, whether that's the ability to see things faster or reduced input lag.

For these games, the low resolution testing where the GPU bottleneck is reduced or removed is important to work out just how many frames the CPU will allow under the right conditions.

So a big part of this testing isn’t even about the future, it’s about right now.
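A rough bit of arithmetic to illustrate (the frame times below are hypothetical, not benchmark data): a refresh-rate target is just a per-frame time budget, and the CPU-bound frame time exposed by low-resolution testing tells you whether a chip can fit inside it.

```python
# Rough frame-time budget arithmetic for high refresh rate targets.
# Purely illustrative; the CPU frame time below is a made-up example.

for target_hz in (60, 144, 240):
    budget_ms = 1000 / target_hz
    print(f"{target_hz} Hz -> {budget_ms:.1f} ms per frame budget")

# A CPU that needs 8 ms of game/driver work per frame caps out near 125 fps
# no matter how fast the GPU is, which is what low-resolution testing exposes.
cpu_frame_ms = 8.0
print(f"CPU-bound ceiling: {1000 / cpu_frame_ms:.0f} fps")
```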

Finally, I'm not sure if you had trouble following the thread, but I never said you did any hijacking and I certainly didn't call you a fanboy.
 
No, you didn't call me anything, but when two or three readers post a different opinion and people start talking about fanboys and hijackers, I believe most people will assume who the fanboys are or who hijacked the thread. If I am wrong, my bad, apologies.

And while mine is probably a narrow-sighted view of the matter (no sarcasm here, to be clear), I do think that the conclusions drawn from the 720p testing lead to a very specific and narrow conclusion that is easily applied to everything, from 144fps gaming to 30fps gaming. Because in the end, processor A that beats processor B by 30% today at 720p will also beat it by 30% tomorrow. It's extremely easy to come to that conclusion based on the 720p testing, excluding that anomaly of course (a little sarcasm here).

Anyway, thanks for the review and thanks for taking the time to answer my posts. Let's stop it here, so you can continue with whatever else you have to do. News and tests never stop, especially in the middle of the week.
 

If it beats it by 30% today, it will certainly beat it by 30% tomorrow. In fact, you are almost guaranteed that in a year's time it will still be 30% faster, in 2 years probably still 30% faster, and quite possibly in 3 years' time, though it starts to get a bit blurry at that point. So if you're building a PC today with the realistic intention of keeping it for 2-3 years, then you are without question smart to base that decision on what you see today.

Today Vega gets trampled by Pascal in most games; the same will probably be true in a year's time, maybe 2 years, but I think beyond 3 years Vega might be the superior architecture. So should you buy it today in the hope that in 3+ years it will have paid off? Or in 2 years will you have a new GPU anyway?
 
We have a different opinion on the first paragraph. Your own testing from 2015 shows that things aren't as you say, but never mind, those tests were wrong I guess. You have improved, you said. But I will insist: this is not 2011. We don't have 6 more years of quad cores in front of us.

As for Vega, you don't buy a Vega today, unless you are going to combine it with a FreeSync monitor and save yourself $150-$200 with that combination compared to a 1070 Ti and a G-Sync monitor. In any other case, you just don't buy a Vega. Vega cannot be upgraded and drivers are not going to save it.
 

I'd like to come back to the Vega discussion in 3-4 years ;)

I feel like we've had a swing and a miss on the other points so I'm happy to leave it as an agree to disagree situation.
 
If it can play new games, it will def play old games.
Because "all old games will run at 60fps if new ones do" is a complete fallacy at odds with observable reality (usually repeated by those who don't actually play such games). Examples: Morrowind (with the MXE Graphical Extender that significantly increases view distance) or Oblivion (with mods like Unique Landscapes, especially around the "Aspen Wood" area) will stutter a lot more than Doom (2016) or Skyrim (2011) on the same modern CPU unless fed enough high 1T performance. Other games such as Neverwinter Nights with large mods, or even Age of Empires 2 HD with its unit cap raised significantly from the classic 200 to over 1,000 on giant maps, can totally saturate one core. So no, you cannot blindly extrapolate guesses about the performance of older games based purely on release year.
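For illustration only (the 90% serial fraction below is an assumed number, not measured from any of these games), Amdahl's law shows why an engine that hangs off one main thread barely benefits from extra cores:

```python
# A back-of-the-envelope Amdahl's law sketch of why extra cores don't rescue
# older engines built around one or two threads. Numbers are hypothetical.

def speedup(parallel_fraction, cores):
    """Amdahl's law: the serial part of the work caps the gain from more cores."""
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

# An old engine where ~90% of each frame is a single-threaded main loop:
for cores in (2, 4, 6, 8):
    print(cores, "cores ->", round(speedup(0.10, cores), 2), "x")
# The speedup barely passes ~1.1x, so single-thread speed (IPC x clock) decides
# the frame rate, not the core count.
```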
 
Then why don't you draw your own conclusions, instead of whining about Steve's? You obviously feel that you're his intellectual equal on this subject, after all.
I never whined about his conclusions, actually. I was arguing with a member of the forum, not Steve. And I did draw my own conclusions, which I shared (hey, that's what the forum is for, right?), and then I got called an AMD fanboy even though I never used the word AMD once. Go figure..
 
Once again the Civ V results are upside down. It has been shown on other tech sites that Intel chips complete the turns, in other words the CPU load of the game, faster than Ryzen. You get better FPS with Ryzen because little is going on while the game is waiting for the turn to process. This is not in any way indicative of DX12 performance in the future. Steve, if you could do an article explaining your position on this rather bizarre statement, I would very much appreciate it.

Also LOL at the usual AMD fanboys claiming that 720p testing is irrelevant. The results are clear as day. Removing the GPU bottleneck exposes the true performance differences of these chips in current games, and this is, and always has been, the best way of determining the longevity of a chip in gaming performance. It's not a new practice and has only recently been criticised, probably because the AMD fans don't like it.

Finally, Intel is faster at gaming, web, MS Office and Photoshop, and therefore better for productivity in most cases. Only if a user has a specifically demanding niche application like video editing or encoding is Ryzen the stronger productivity machine. This is because "productivity" for 9 users out of 10 means MS Office or web apps.

I understand we need a competitor in the marketplace. But lying about Ryzen (not aimed at the reviewer; with the exception of Civ, his reviews are truthful) and claiming it's more competitive than it really is, is not the solution. It's bad for consumers and the market in general.

I am personally disappointed. Zen arrived after 5 years of hype and for me it failed to deliver. It never beat Intel at gaming, web, MS Office and Photoshop, the mainstays of the vast majority of home users, not to mention it excludes an integrated GPU, something most productivity users would appreciate as most don't need powerful 3D acceleration. This new 8th gen takes that a step further and nearly catches up to AMD in the one niche area that it can actually perform in.

Ryzen is done. You would be a fool to opt for the platform with very few exceptions today, and even more so in 2-3 months when the cheaper Intel boards arrive. What a massive shame for the tech world in general; AMD have let us all down. Again.

*Edit*
Can we also look at the fact that we are assuming a 4GHz overclock for the 1600? Not everyone will get this or will be able to do this; some tech reviewers have stated their sample won't go above 3.8. I know it's a small thing, but at least the Intel performance here is pretty much guaranteed out of the box. I mean, if you bought a 1600, took it home, added a beefy, expensive power supply/cooler/mobo to overclock it and only achieved 3.8, you would be pretty frustrated. By contrast you can spend less on all the components of an Intel build and still get this performance. Also, do these locked Intel chips have base clock overclocking like Sky/Kaby Lake? If they do, then you know that while you won't get loads more out of it, it's possible to get a bit more.
 
Also LOL at the Usual AMD fanboys claiming that 720p testing is irrelevant.

Nobody said that. They are irrelevant when it comes to future gaming performance. Also, LOL at the usual Intel fanboys. See what I did there? Calling each other fanboys isn't really productive and it isn't an argument. If you actually have a valid point then you can criticize the argument, not the one who is making it. If I'm right, I'm right, regardless of whether I'm a fanboy or not.
 
If you read my comment again and add "to future gaming performance" to the quote you pulled from it, then you will grasp what I was saying. It's not difficult, mate.

Anyway, 720p testing is, in my opinion, actually only relevant to future performance, as most people won't be gaming at 720p today with these chips. If you think current 720p gaming performance is more relevant to today than to the future, then you really haven't got a handle on how these systems work. Of course it's not perfect. But if you could provide all the tech reviewers out there with a more accurate method of determining which chips will perform better when you add a more powerful graphics card to them in, say, 3 years' time, then I'm sure they would all be very grateful.

Oh, and don't start banging on about games changing towards being more sensitive to core count than IPC. It will be both. Games require a minimum number of cores to run, and the performance is determined by how fast those cores go. And you are exceptionally delusional if you believe any game released in the next 4-5 years will require an 8-core CPU. I firmly believe that with the progression of modern APIs, CPU requirements for games will begin to stagnate. We are already seeing i3s perform more than fast enough for most gamers' needs when 5 years ago they weren't. CPU load is reduced in both Vulkan and DX12 compared to DX11 in the same titles. And with modern consoles shipping with what are effectively tablet processors, it's hard to see mainstream gaming becoming more dependent on CPU power. This is a great thing for gamers, as they won't need to buy expensive multithreaded CPUs just to get the most out of their graphics cards.
 
Real world gaming, played on my own gear, says that either platform slays all of the games that I like (shooters mostly). I have 1080p screens and one 4K.
I have some pretty high-end Intel boxes here, and two Ryzen. The last one I bought is a Ryzen 5 1500X/B350 ROG Strix setup that I'll put together this weekend.
Slight variations in performance don't even register on my radar anymore. I think the key is to have good GPUs to use.
 

Once again you don't know what you're talking about. The Civ results are not upside down, this test has nothing to do with AI turns.
 

Yep. Unless you are going high refresh rate or are a pro gamer, you don't have to spend much to get a CPU that can handle all modern games when paired with a competent GPU.

The only thing I'm waiting for right now is for AMD to release low-end Ryzen processors. As it stands, CPUs under $100 haven't benefited much from Ryzen's release and Intel's aggressive push-back, because AMD haven't pressured them in that price-sensitive area yet. I'm waiting for a $60-$80 quad or triple core and a <$50 triple or dual core.
 

I wouldn't hold my breath. Triple cores do not make a lot of sense on the Ryzen architecture, and the only dual-core Ryzen that has appeared on the marketing slides is a Ryzen APU, which will most likely retail for over $50, considering that the Bristol Ridge APUs and Athlons for the AM4 platform currently sell for ~$60 or more (at Newegg, at least).
 
Intel made a nice move under $80 with the Kaby Lake Pentium and then regretted it because it was selling better than it should have.
AMD can't do much before the AM3+, FM2+ and Bristol Ridge stock goes away. They don't have the money anyway to push too many new models into the market at once. I only wonder whether they will create single-CCX Ryzen dies for that market, or whether they will just do what they were doing on the FM2 platform: disable the GPU part of an APU and call the final product "Athlon".
 