AMD Ryzen Review: Ryzen 7 1800X & 1700X Put to the Test

I'm considering building a new rig with one of these new CPUs to replace my aging Intel 2500K system.

Can anyone recommend which Ryzen CPU will be best to purchase that offers similar capability's to my current i5?

Obviously I will also have to get a new mainboard and DDR4 + aftermarket cooler also.

Just looking for some suggestions really.


Wait for the R5 line, then decide if you want a 6/12 for more multithreaded power, a 4/8 for somewhat better multithreaded power, or a 4/4. They will all be better than a stock 2500K, and still a bit better when compared OC'd vs OC'd. The main gains you are getting are in multithreaded power though. You might want to hold off on upgrading until games really start to use those cores more (like BF1 and WD2). By that time Zen will have ironed out most of its problems - though some will stick, since its architecture is a hybrid between desktop and workstation, i.e. a mix between the mainstream i7 and the Xeons - and you can make a better choice. Not only that, Intel is rumoured to release a 6/12 in the mainstream segment with its next release, which should land this year to early next year, so you could also wait for that.

Anyway, point is: if you upgrade, upgrade to more cores while also raising your single-threaded performance. Ryzen does the former great, the latter not so much - for now. So it's best to wait it out. If you really need a small boost in gaming performance, you can always get the 7700K. I personally find that a bit of a waste (paying 600-700 dollars or more for CPU/mobo/RAM for a 30-40% CPU improvement on paper that rarely shows in practice, since games are usually GPU bound), but that's my opinion.
 
Thanks for the advice. Truth is I have been waiting for Ryzen to materialize before building my new rig, but looking at the initial results I think I may indeed hold off a while and see how prices settle. I love my current i5; paired with my 1070 it pretty much still cuts the mustard for gaming at 1080p with all the bells and whistles on. And like I said, it overclocks to around 4-4.5 GHz on air alone.
 
Your "advice" is about 6 years out of date. There certainly were initial issues on Intel Pentium 4 CPU's with HT performance regressions in a large number of apps that continued for 1-2 generations, but that stuff has long been sorted. Turning an i7 back into an i5 (or an i3 back into a 2C/2T Pentium) certainly doesn't "get the best out of it in most games". HT is best left enabled unless you want to cherry pick the 1-in-1000 obscure badly written piece of software that screws up core affinity then try and falsely hold up that outlier as "the average".

HT may be new on Ryzen, but it's not like we have to "guess" its impact when almost every major game over the past several years has been benchmarked on both i5 and i7's (including this article where an i7-6700K is "slower" than an i5-6600K exactly 0% of the time). Likewise, there have been reviews in the past of someone taking an i7, i5, i3 and Pentium, adjusting all clocks to the same frequency and visibly seeing the effect of toggling HT on/off on the same CPU with same clock rate, cache size and physical cores. Gee a 26-43% improvement, what a horrible "regression". Same with video encoding, file compression, etc, typically a consistent 20-30% faster with HT.

Sure, there are cases where a few rare games will run a tiny bit slower (typically the negative impact is around 1/6th the size of the positive impact of leaving it enabled), and plenty of cases where games won't run faster simply because they aren't very well threaded, but to effectively claim "Because Ryzen's HT implementation is a little flaky at the moment in some benchmarks, all i7's must also run slower than i5's in most games and in general usage" is a stupidly false extrapolation.

I don't think I need more proof than this http://www.hardware.fr/articles/956-7/impact-smt-ht.html

As I already explained, one of the drawbacks of every SMT design (vs non-SMT or CMT designs) is that with some software, keeping SMT enabled means lower performance. The only way to get around that is a perfectly working system, either at the hardware or compiler level, where the "more important" thread always gets higher priority for execution. That kind of system does not exist. And that also means that in many cases, especially in games where some threads are more important than others, enabling SMT will result in lower performance.
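If anyone wants to see that effect first-hand, here's a rough sketch (Python, Linux-only; it assumes logical CPUs 0 and 1 are SMT siblings while 0 and 2 sit on different physical cores - that mapping varies by CPU and OS, so treat it as an illustration, not a benchmark). It times the same busy loop on two processes pinned either to one physical core's two logical CPUs or to two separate cores; the sibling pair should take noticeably longer because the two threads share the core's execution resources:

import os
import time
from multiprocessing import Process

def busy_work(cpu, n=20_000_000):
    os.sched_setaffinity(0, {cpu})   # pin this process to one logical CPU (Linux only)
    x = 0
    for i in range(n):               # plain integer loop as a stand-in CPU-bound workload
        x += i * i

def run_pair(cpus):
    procs = [Process(target=busy_work, args=(c,)) for c in cpus]
    start = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - start

if __name__ == "__main__":
    # Assumed topology: CPUs 0 and 1 share a physical core, 0 and 2 do not.
    print("SMT siblings (same core):", run_pair([0, 1]))
    print("separate physical cores :", run_pair([0, 2]))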
 
Again... This is an 8C/16T CPU... It is not tailored for gaming specifically, but is more HEDT workstation centric. If gaming is all that matters, go laugh at $1500 6950X owners for losing to 7700K's...
This rings hollow when it was AMD that touted Ryzen's gaming performance in games like Battlefield 1.
 
All I see there is how a bunch of applications consistently gain 20-40% whilst a few games lose 2% overall. Who the hell is going to throw away +40% file compression or +25% video encoding performance just because Witcher 3 got 111.6fps instead of 112.2fps or worst case, F1 2016 gets 131.4fps instead of 139fps? Oh how utterly unplayable that must be...

I suspect the real reason you're suddenly arguing against HT (that's been around years) is because as your own link shows, the loss on Ryzen from HT is significantly larger (ie, same Witcher 3 falls from 101fps to 91fps or F1 falls from 116fps to 103fps), and if you can't "talk up AMD", then you need to "talk down Intel".

Given AMD also design the Jaguar CPU's for consoles and know full well the core assignment issues that arise from that ("8 core" consoles = 2x quad-cores separated over a latency expensive bus that forces game devs to try and force threads to the same cluster of 4), this is something they could accommodate for, ie, if consoles physical cores are 0-7, they could have designed Ryzen's core assignment (as the OS sees it) to have 0-7 physical then 8-15 virtual instead of 0/1, 2/3, etc, pairs. If they didn't do that, then that's a serious design flaw / oversight.
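For what it's worth, you don't have to guess how a given chip ends up enumerated - here's a quick sketch (Python, Linux-only, just reading the kernel's sysfs topology files, so purely illustrative) that prints which logical CPUs the OS considers siblings of the same physical core, whatever numbering scheme the vendor chose:

import glob

def smt_sibling_groups():
    groups = set()
    # each thread_siblings_list file contains something like "0-1" or "0,8"
    for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list"):
        with open(path) as f:
            groups.add(f.read().strip())
    return sorted(groups)

if __name__ == "__main__":
    for group in smt_sibling_groups():
        print("logical CPUs on one physical core:", group)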
 
Great work Steve! Thumbs up!

I think this is a great comeback by AMD. Look at where they were.
Yeah, the gaming performance was not mind-blowing, but they are back in the game. I can hardly wait to see the 4-core chips reviewed.

Overall the pricing is great, performance is good, value is good.

Competition is revived. Good for all of us.
 
Turning an i7 back into an i5 (or an i3 back into a 2C/2T Pentium) certainly doesn't "get the best out of it in most games". HT is best left enabled unless you want to cherry pick the 1-in-1000 obscure badly written piece of software that screws up core affinity then try and falsely hold up that outlier as "the average".
This is very true; HT has been helping in games for 5+ years now, since about 2010 give or take.

I suspect the real reason you're suddenly arguing against HT (that's been around years) is because as your own link shows, the loss on Ryzen from HT is significantly larger (ie, same Witcher 3 falls from 101fps to 91fps or F1 falls from 116fps to 103fps), and if you can't "talk up AMD", then you need to "talk down Intel".
Hardreset lacks objectivity, as do many brand loyalists.
AMD has done well with Ryzen overall; he should be happy.
It should also improve some over time as the kinks are worked out.
 
This rings hollow when it was AMD that touted Ryzen's gaming performance in games like Battlefield 1.

AMD demoed Battlefield 1 on 4K resolution, not 1080p. Problem?

All I see there is how a bunch of applications consistently gain 20-40% whilst a few games lose 2% overall. Who the hell is going to throw away +40% file compression or +25% video encoding performance just because Witcher 3 got 111.6fps instead of 112.2fps or worst case, F1 2016 gets 131.4fps instead of 139fps? Oh how utterly unplayable that must be...

A bunch of applications get a performance boost because when running those applications, thread priority is not important, only throughput.

Most games tested (where thread priority is important) were slower with HT on vs off. Just as expected. Quite far from just 1 in 1000. Even if the difference is small, it is still there.

I suspect the real reason you're suddenly arguing against HT (that's been around years) is because as your own link shows, the loss on Ryzen from HT is significantly larger (ie, same Witcher 3 falls from 101fps to 91fps or F1 falls from 116fps to 103fps), and if you can't "talk up AMD", then you need to "talk down Intel".

I have been saying that SMT lowers performance in some cases since 2001.

Given that Ryzen is a new product, Windows 10 probably has scheduling problems with it, and that explains why AMD suffers more in games. We have already seen that AMD's SMT is better than Intel's when it works.

Given AMD also design the Jaguar CPU's for consoles and know full well the core assignment issues that arise from that ("8 core" consoles = 2x quad-cores separated over a latency expensive bus that forces game devs to try and force threads to the same cluster of 4), this is something they could accommodate for, ie, if consoles physical cores are 0-7, they could have designed Ryzen's core assignment (as the OS sees it) to have 0-7 physical then 8-15 virtual instead of 0/1, 2/3, etc, pairs. If they didn't do that, then that's a serious design flaw / oversight.

There is no SMT on the Jaguar CPU.

What you say about core assignment is an operating-system-level problem, not a CPU-level problem. It applies to Intel too.
 
Underwhelmed.
Anyone with an i7-3770K has to be smiling. That is one good ole dog that still hangs with the newest in games and is much better on power draw than the new AMDs.

AMD has improved but I'm sticking with Intel.

I'm running Core i7-3770s in my boxes and I'm smiling. Looking at the results, it seems to be a capable chip considering it's a third-generation Core processor. Guess I'll be hanging onto it for a little while longer.
 
Despite my earlier reply, I'm still impressed with what AMD's engineers were able to accomplish with the Ryzen processors. They offer good performance across the board at decent prices; of course, these results also showcase the strength of Intel's Core lineup, particularly when it comes to games. Perhaps AMD's engineers will be able to refine their Ryzen CPUs in the near future and make them truly competitive.
 
From what I've heard, Ryzen 5 will be clocked very similarly to Ryzen 7 parts. There will just be fewer cores.

But they will overclock much better due to having fewer cores. Just look at 4.8 GHz FX-4350's.

Also, idk about you, but I have seen several websites say gaming with Zen feels smoother than with the 7700K. I would want to look into that before I bought anything...
 
Most games tested (where thread priority is important) were slower with HT on vs off. Just as expected. Quite far from just 1 in 1000. Even if the difference is small, it is still there.
Again, you can't extrapolate from one CPU to another from one iffy site that didn't even test min fps. Eg, Watch Dogs 2 gains +29% from HT according to GamersNexus (113fps vs 87fps) and a +23% boost for BF1's min fps. Also no mention of the +45% gains in Crysis 3 (and a multitude of other games), or the whopping +83% i7 vs i5 boost in min fps in this very article, eg, BF1 4C/8T = 148fps vs 4C/4T = 81fps where barely 5% of that difference is cache and 95% HT. The reason you won't see much gain from HT on 8/16 CPU's is simply because most games aren't that well threaded, nor are they likely to be given they're typically written for consoles (with only 6 "accessible" cores).
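For anyone checking that last figure, it's simple percent change; here is a one-liner using the BF1 minimum fps numbers quoted above:

# percent gain = (new / old - 1) * 100
fps_4c4t, fps_4c8t = 81, 148
print(f"+{(fps_4c8t / fps_4c4t - 1) * 100:.0f}%")   # prints +83%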

The bottom line is, the "worst case loss" of performance from HT has long shrunk from Pentium 4 era -10-20% to an imperceptible 2% average, with many "negative scaling" games being well below 1% or having zero difference, whilst having the cores made wider from Haswell onwards has significantly boosted the gain particularly in games. A CPU feature which boosts +20-40% in many apps, and yet loses only 0.5-5% in a few games (but gains up to 45% avg fps in other games) is an obvious net gain for the technology overall, and is a far cry from what you claimed originally "disabling HT on BIOS is needed if wanting to get best performance on every software".

You can keep pushing the "HT makes games run slower" thing. People who actually own an i7 instead of an i5 for 120-144Hz gaming know better as this very Techspot article clearly demonstrates.
 
This rings hollow when it was AMD that touted Ryzen's gaming performance in games like Battlefield 1.
It actually does perform well in BF1. Do you know what the issue is with BF1? DX12. In DX12, programming is closer to the metal. Since the code is optimized for either AMD's old architecture or Intel's, Ryzen performs badly; it has to process code that was simply not made for it. Pick DX11 and see the performance boost. See here:

Enjoy;
https://www.computerbase.de/2017-03.../#diagramm-battlefield-1-dx11-multiplayer-fps
 
You can keep pushing the "HT makes games run slower" thing. People who actually own an i7 instead of an i5 for 120-144Hz gaming know better as this very Techspot article clearly demonstrates.
You've handily won this debate, no reason to keep going just let him punch himself out.
I used to engage with people over this topic on Techspot, Overclock.net and others, but when 1 or 2 people try to throw shade on HT, 4 or 5 more people who share my view counter with inarguable evidence we have all known for years.
Some people are just going to view things how they want, regardless of the data presented.
HT is the main reason many chips, especially i3's, perform as well as they do in games.

This thread is wayyy off topic.
Cheers to AMD for making some stellar chips in Ryzen, they are performing great.
 
Again, you can't extrapolate from one CPU to another from one iffy site that didn't even test min fps. Eg, Watch Dogs 2 gains +29% from HT according to GamersNexus (113fps vs 87fps) and a +23% boost for BF1's min fps. Also no mention of the +45% gains in Crysis 3 (and a multitude of other games), or the whopping +83% i7 vs i5 boost in min fps in this very article, eg, BF1 4C/8T = 148fps vs 4C/4T = 81fps where barely 5% of that difference is cache and 95% HT. The reason you won't see much gain from HT on 8/16 CPU's is simply because most games aren't that well threaded, nor are they likely to be given they're typically written for consoles (with only 6 "accessible" cores).

Only quad cores were tested in those pics, not octa cores.

The bottom line is, the "worst case loss" of performance from HT has long shrunk from Pentium 4 era -10-20% to an imperceptible 2% average, with many "negative scaling" games being well below 1% or having zero difference, whilst having the cores made wider from Haswell onwards has significantly boosted the gain particularly in games. A CPU feature which boosts +20-40% in many apps, and yet loses only 0.5-5% in a few games (but gains up to 45% avg fps in other games) is an obvious net gain for the technology overall, and is a far cry from what you claimed originally "disabling HT on BIOS is needed if wanting to get best performance on every software".

So you admit that HT actually gives lower performance in some cases. That's exactly what I have been saying. The overall gain from HT is positive, but it does not help every time.

My claim is still valid. For best performance you must sometimes keep HT on and sometimes disable it in the BIOS; otherwise performance will be worse.

You can keep pushing the "HT makes games run slower" thing. People who actually own an i7 instead of an i5 for 120-144Hz gaming know better as this very Techspot article clearly demonstrates.

I have no problem saying that, because it's true. No matter how hard you try, unless you can prove the following fact wrong, this discussion is pointless.

All Intel CPUs slow down in some cases with HT on vs HT off. That is because when using SMT, the CPU core must choose which of two (or more; an SMT core is not limited to 2 threads) threads gets highest priority. A wrong guess means a less important thread gets executed first, and that means lower performance.

You've handily won this debate, no reason to keep going just let him punch himself out.

I already provided facts about how SMT works. A few benchmarks do not change that.

It just seems some people are too ignorant to accept the truth.
 
AMD demoed Battlefield 1 on 4K resolution, not 1080p. Problem?
Yes and Gamer's Nexus and PCGamer both alluded to why that is an issue - GPU bottle-necking in 4k gaming - hence why this very site tested at 1080p instead. The 4k test was purposefully misleading and as GPUs improve the older Intel CPUs will start to outperform the Ryzen CPUs.

Kind of like "fine wine," eh?
 
Yes and Gamer's Nexus and PCGamer both alluded to why that is an issue - GPU bottle-necking in 4k gaming - hence why this very site tested at 1080p instead. The 4k test was purposefully misleading and as GPUs improve the older Intel CPUs will start to outperform the Ryzen CPUs.

Kind of like "fine wine," eh?

As GPUs improve, games will also support more threads and will be better optimized for Ryzen. And then Ryzen will outperform current Intel quad-core CPUs.

Their logic is flawed.
 
As GPUs improve, games will also support more threads and will be better optimized for Ryzen. And then Ryzen will outperform current Intel quad-core CPUs.

Their logic is flawed.
No the logic is not flawed because what I stated is the current state of the industry. You have stated a speculative opinion - there is hope that games will be able to utilize more cores/thread in the future but nothing concrete.

If I were to speculate as well I'd say that Ryzen would need more market penetration for your scenario to take place. Current surveys put the Intel gaming market share at or above 75% and the Nvidia market share 2.5x more than AMDs. It's great that AMD is getting Bethesda to "optimize" for Ryzen and Vega but companies aren't going to alienate the majority of gamers.
 
Reality check on PC gaming: 95%+ of PC gamers have a quad-core or dual-core CPU (46% have a dual core, most likely with HT) and only one in three has a CPU clocked at over 3 GHz. Granted, a lot of these people play on their laptops, but companies that sell video games - video games being a business and not a public resource!!! - still have to make their games scalable to them, otherwise they are just selling to a small niche of people and not being business prudent.

There are still some Phenom II X6 owners (now add the FX-6 and FX-8 owners as well) out there waiting for the day when video games will need more than four cores - you have a long wait, gentlemen and maybe a lady or two. Until that time, CPUs are still the Robin to the video card's Batman.

http://store.steampowered.com/hwsurvey
 
I don't understand why this gaming test is only at Full HD; what about 4K resolution? We buy this CPU for future 4K, not for Full HD. I hope a 4K test is coming soon so we can see the complete fight.
 
No the logic is not flawed because what I stated is the current state of the industry. You have stated a speculative opinion - there is hope that games will be able to utilize more cores/thread in the future but nothing concrete.

They have absolutely no evidence that Intel CPUs will perform better than current Ryzens once GPUs get better. They are just speculating. So they build a test scenario that is based on speculation about the future.

I have seen many examples where CPU A is faster at lower resolutions but CPU B is faster when the GPU is clearly the bottleneck.

If I were to speculate as well I'd say that Ryzen would need more market penetration for your scenario to take place. Current surveys put the Intel gaming market share at or above 75% and the Nvidia market share 2.5x more than AMDs. It's great that AMD is getting Bethesda to "optimize" for Ryzen and Vega but companies aren't going to alienate the majority of gamers.

Q3/2016 desktop GPU market share:

Intel 70.9%
Nvidia 16.1%
AMD 13.0%

Does not seem 2.5x to me.

Reality check on PC gaming: 95%+ of PC gamers have a quad-core or dual-core CPU (46% have a dual core, most likely with HT) and only one in three has a CPU clocked at over 3 GHz. Granted, a lot of these people play on their laptops, but companies that sell video games - video games being a business and not a public resource!!! - still have to make their games scalable to them, otherwise they are just selling to a small niche of people and not being business prudent.

There are still some Phenom II X6 owners (now add the FX-6 and FX-8 owners as well) out there waiting for the day when video games will need more than four cores - you have a long wait, gentlemen and maybe a lady or two. Until that time, CPUs are still the Robin to the video card's Batman.

http://store.steampowered.com/hwsurvey

How about consoles? I'd say most have an octa-core CPU.
 