AMD Ryzen Review: Ryzen 7 1800X & 1700X Put to the Test

The AMD Jaguar CPU? It's a tablet CPU. It would be like comparing the cargo-hauling ability of eight Honda Civics to that of two 24 ft box trucks.

But it's octa-core. And since its single-core performance is very poor compared to Ryzen or Skylake, console games must use many cores to get anything out of it.
 
But it's octa-core. And since its single-core performance is very poor compared to Ryzen or Skylake, console games must use many cores to get anything out of it.

So what is your point? That it takes eight cores of a Jaguar CPU, with an OS dedicated to gaming, to offer performance similar to a modern desktop CPU? I think most people agree on that.
 
@HardReset is back baby! Woop! Woop! Once again telling us AMD is "Future Proof" even though the last time he told us that he was completely wrong and now he wants to prove it all over again...

How did the "Future Proof" FX-8370 fare in the review? Looking forward to your response as always :)
 
So what is your point? That it takes eight cores of a Jaguar CPU, with an OS dedicated to gaming, to offer performance similar to a modern desktop CPU? I think most people agree on that.

The point is that when console games (must) use many threads, their PC ports will also use them.

This is quite evident when comparing today's console ports to those from the X0 era (the X0 had a three-core CPU).

@HardReset is back baby! Woop! Woop! Once again telling us AMD is "Future Proof" even though the last time he told us that he was completely wrong and now he wants to prove it all over again...

How did the "Future Proof" FX-8370 fare in the review? Looking forward to your response as always :)

Pretty well. Considering how much I paid for the FX years ago, I still consider it a better buy than any Intel was at that time.
 
The point is that when console games (must) use many threads, their PC ports will also use them.

Completely incorrect as usual; pay attention and I will educate you. Modern console ports run with ease on modern dual-core CPUs and even modern quasi-dual-cores (like the FX-4). The fact (not fanboy hyperbole) is that modern desktop CPUs are not taxed as much as their console equivalents when it comes to gaming, so you need fewer cores. The industry is not designed to push the CPU the way it pushes the GPU; the GPU is easier to replace, and no company markets "great AI" on a game when "great graphics" is more cost-effective and sells more copies. The console industry will need a large jump in CPU power for the next consoles in order to push hex-core-plus desktop CPUs on PC gamers (I wouldn't hold your breath).

I realize you are an AMD fanboy and feel the need to justify your purchase (I honestly don't care), but that does not give you the right to post false theories when the facts* prove you wrong.


* noun - something that actually exists; reality; truth:
 
They have absolutely no evidence that Intel CPUs will perform better than current Ryzens when GPUs get better. They are just speculating. So they build a test scenario based on speculation about the future.

I have seen many examples where CPU A is faster at lower resolutions but CPU B is faster when the GPU is clearly the bottleneck.
That's what the 1080p tests show: evidence that the Intel CPUs outperform the Ryzen CPUs when the GPU is not the limiting factor.

Q3/2016 desktop GPU market share:

Intel 70.9%
Nvidia 16.1%
AMD 13.0%

Does not seem 2.5x to me.
Where's that from? "Desktop" GPU market share is not what's being discussed here (context is key); here's Anandtech's graph of the same time period:
[Image: jpr_q3_2016_amd_vs_nvda_SHARES_575px.png]

70.9% is 2.44 times greater than 29.1%.

The latest Steam survey (February 2017), which accounts for usage, has Nvidia at 59.8% to AMD's 23.62% (2.53 times more usage). Intel is listed at 16.2% but is not "discrete", hence the Anandtech graph above. Throwing APUs into the PC gaming conversation is a red herring.
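For anyone who wants to sanity-check the ratios quoted in this exchange, here's a quick sketch (figures copied from the posts above; Python used purely for illustration):

```python
# Figures as quoted in the thread: discrete share per the JPR Q3/2016
# graph, and the February 2017 Steam Hardware Survey.
jpr_nvidia, jpr_amd = 70.9, 29.1
steam_nvidia, steam_amd = 59.8, 23.62

print(round(jpr_nvidia / jpr_amd, 2))      # 2.44
print(round(steam_nvidia / steam_amd, 2))  # 2.53
```

Both ratios land around 2.5x, which is the "Nvidia outsells AMD roughly 2.5 to 1" claim being argued over.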
 
You know you're having a slow work day when you read through all the comments on a Techspot AMD related topic...

The review was great, clearly AM4 has to mature a little before any solid conclusions can be made.

One thing I will argue with, however, is the claim that the lower-core-count chips will be able to overclock better; I don't see this being the case. From what I gather from the reviews currently available, the chips are hitting a wall. This could again be due to the immaturity of the platform, but it appears more like an architectural limitation. And seeing as the 4c/8t CPUs are simply half of the 8c/16t CPUs, I don't see that single module being able to clock any higher than its dual-module counterpart. Historically, the higher-binned CPUs end up in the more expensive price brackets; AMD is no stranger to this and has done it in the past, and on rare occasions you can even unlock cores. Guess we'll have to wait and see, however.
 
Completely incorrect as usual; pay attention and I will educate you. Modern console ports run with ease on modern dual-core CPUs and even modern quasi-dual-cores (like the FX-4). The fact (not fanboy hyperbole) is that modern desktop CPUs are not taxed as much as their console equivalents when it comes to gaming, so you need fewer cores. The industry is not designed to push the CPU the way it pushes the GPU; the GPU is easier to replace, and no company markets "great AI" on a game when "great graphics" is more cost-effective and sells more copies. The console industry will need a large jump in CPU power for the next consoles in order to push hex-core-plus desktop CPUs on PC gamers (I wouldn't hold your breath).

So what you are trying to say is that if a console game is made to use 6 threads, when it's ported to PC it's made to use only 2 threads because that's enough? "(y)"

Read again what I wrote and try to understand it.

That's what the 1080p tests show: evidence that the Intel CPUs outperform the Ryzen CPUs when the GPU is not the limiting factor.


Where's that from? "Desktop" GPU market share is not what's being discussed here (context is key); here's Anandtech's graph of the same time period:
[Image: jpr_q3_2016_amd_vs_nvda_SHARES_575px.png]

70.9% is 2.44 times greater than 29.1%.

The latest Steam survey (February 2017), which accounts for usage, has Nvidia at 59.8% to AMD's 23.62% (2.53 times more usage). Intel is listed at 16.2% but is not "discrete", hence the Anandtech graph above. Throwing APUs into the PC gaming conversation is a red herring.

My stats are desktop graphics card market share.

The key word is Discrete. Again, you are not understanding that graph. Let me illustrate:

If any of the following is sold, it counts as a "shipped discrete graphics card": http://pcpartpicker.com/products/video-card/#sort=a8&page=1&X=0,5153

If any of the following is sold, it does NOT count as a "shipped discrete graphics card":
http://pcpartpicker.com/products/cpu/#sort=d7&s=22,20,21,25,29&X=9503,341046

For gaming, any of those APUs is miles better than any of Nvidia's low-end cards. Remember also that low-end cards sell in far greater numbers than high-end cards.

So basically: when AMD releases a high-performing APU (coming August?), AMD's share of the graphics card market AND its share of so-called "gamer-capable GPUs" goes up, BUT its discrete graphics card share goes down.
 
So what you are trying to say is that if a console game is made to use 6 threads, when it's ported to PC it's made to use only 2 threads because that's enough? "(y)"

Read again what I wrote and try to understand it.

No, that's not what I said at all. I stated that the desktop CPU hardware needed to run a console port depends not on the number of cores but on the performance of those cores.
 
I don't understand the disappointment here. In gaming benchmarks the 1800X is generally within 10% of the 7700K, and in productivity work it's almost on par with the 6900K, sometimes falling slightly behind. As I see it, this is a fresh design that needs some software/BIOS tweaks and updates; there are some obvious anomalies. Let's give it some time and see what Ryzen 5 and 3 can do before jumping to conclusions.
 
No, that's not what I said at all. I stated that the desktop CPU hardware needed to run a console port depends not on the number of cores but on the performance of those cores.

That's true. But. Publishers often complain that PC versions do not sell enough, that the PC version needs more tweaking than the console version, etc. Now, if the console version of a game supports 6 threads, then when porting it to PC (with basically the same CPU instruction set) the PC port supports 6 threads with minimal extra work. If the X1 and PS4 had a CPU with, say, 3 cores, it's pretty much certain that many modern console ports would only support 3 threads on PC.

AMD was the only choice for the new consoles, because no other company had a fast enough CPU+GPU solution available. AMD could have offered a quad core with stronger cores but decided to offer an octa core for exactly that reason: better threading support in console games will mean better threading support in PC games too.
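The argument above (that thread-level parallelism baked into a console title tends to carry over to the PC port) can be sketched in a few lines. This is a hypothetical illustration, not code from any real engine; the `make_worker_pool` helper and its 6-thread budget are invented for the example:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def make_worker_pool(console_thread_budget: int = 6) -> ThreadPoolExecutor:
    """Size a job-system pool the way a console port plausibly might:
    the game was tuned for a fixed thread budget on the console, and the
    PC build simply clamps that budget to the host's hardware threads."""
    hw_threads = os.cpu_count() or 1
    return ThreadPoolExecutor(max_workers=min(console_thread_budget, hw_threads))

pool = make_worker_pool()
# Jobs written against the 6-thread console layout run unchanged on PC.
results = list(pool.map(lambda n: n * n, range(8)))
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point of the sketch: the threading work (splitting the game into jobs) was already paid for on the console side, so the PC port inherits it almost for free.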
 
Publishers often complain that PC versions do not sell enough, that the PC version needs more tweaking than the console version, etc. Now, if the console version of a game supports 6 threads, then when porting it to PC (with basically the same CPU instruction set) the PC port supports 6 threads with minimal extra work. If the X1 and PS4 had a CPU with, say, 3 cores, it's pretty much certain that many modern console ports would only support 3 threads on PC.

Publishers complain because they don't want to spend money optimizing a game for the PC that will sell just a fraction of console sales and also get pirated like crazy if it does sell well. Remember, all Xboxes and all PlayStations have the same configuration; if you own an Xbox and I own the same model, they are exactly the same. PCs are all different: numerous CPUs from Intel and AMD, numerous GPUs from Nvidia, AMD, and Intel, all with different drivers. It's not that easy to bring a console game to PCs.

AMD was only choice for new consoles, because no other company had fast enough CPU+GPU solution available. AMD could have offered quad core with stronger cores but decided to offer octa core just for that reason: better threading support for console games will mean better threading support for PC games also.

I'm not sure what their other choices were; there is an IBM plant not that far from me that supplied Xbox 360 CPU chips. I'm sure supply, cost, and power efficiency (cooling is a major issue with consoles, as is low power draw) played more into their selection than PC games did. Sony could give a rat's *** about PC gamers, and Microsoft is not far behind them (both companies would prefer you buy a console).
 
Publishers complain because they don't want to spend money optimizing a game for the PC that will sell just a fraction of console sales and also get pirated like crazy if it does sell well. Remember, all Xboxes and all PlayStations have the same configuration; if you own an Xbox and I own the same model, they are exactly the same. PCs are all different: numerous CPUs from Intel and AMD, numerous GPUs from Nvidia, AMD, and Intel, all with different drivers. It's not that easy to bring a console game to PCs.

Right, it's not easy to bring a console game to PC. A major complaint about threading is the amount of work it requires. But if the console game is already threaded (for the console), that saves a lot of work for the PC version.

I'm not sure what their other choices were; there is an IBM plant not that far from me that supplied Xbox 360 CPU chips. I'm sure supply, cost, and power efficiency (cooling is a major issue with consoles, as is low power draw) played more into their selection than PC games did. Sony could give a rat's *** about PC gamers, and Microsoft is not far behind them (both companies would prefer you buy a console).

IBM supplied only the CPU for the X0; the GPU was from AMD. Power draw was indeed the main reason for selecting an "APU", and AMD was the only choice. Sony and Microsoft don't really care about PC gamers or threading in PC games; this way (an octa core in the consoles), AMD forced them to do something about it.
 
AMD forced them to do something about it.

AMD did not force them to do anything. Sony & MS had a demand with a boatload of money attached, and AMD happily accepted. Sony Interactive is part of a conglomerate generating 70 billion in gross sales; MS does almost 90 billion. AMD does 4 billion in gross. They force very little when their counterparts do over 12x their gross sales, and they need every dollar they can get.
 
My stats are desktop graphics card market share.

The key word is Discrete. Again, you are not understanding that graph. Let me illustrate:

If any of the following is sold, it counts as a "shipped discrete graphics card": http://pcpartpicker.com/products/video-card/#sort=a8&page=1&X=0,5153

If any of the following is sold, it does NOT count as a "shipped discrete graphics card":
http://pcpartpicker.com/products/cpu/#sort=d7&s=22,20,21,25,29&X=9503,341046

For gaming, any of those APUs is miles better than any of Nvidia's low-end cards. Remember also that low-end cards sell in far greater numbers than high-end cards.

So basically: when AMD releases a high-performing APU (coming August?), AMD's share of the graphics card market AND its share of so-called "gamer-capable GPUs" goes up, BUT its discrete graphics card share goes down.

Including Intel processors with integrated graphics (that's why Intel has a 71% "market share" in your "stats") and claiming it's representative of the gaming market misses the point entirely. While discrete GPU market share leaves out the AMD APUs, it compares the more comparable market in which AMD and Nvidia compete. That's why in other metrics the Intel/Nvidia lead is many times greater; take, say, February's Steam Survey:

[Image: original-d660ebe0d0095cc982272c64542a5b9f.png]


Though it's possible Steam is collecting non-gaming data in their survey, Steam's metrics are more applicable when you consider that your (non-linked) metrics capture "all desktop" market share, including desktops that never game.

Claiming that the top-of-the-line A10-7890K is "miles better than any of Nvidia's low-end cards" is not accurate (setting the slider to sub-$50 only gets you old cards). The A10-7890K retails for $150; for that price you can pick up a Pentium G4560 and a GTX 1050 ($105+$60), which will actually be miles better than any AMD APU (even with a 10% price premium). Here's what Tom's had to say about it:
AMD’s A10-7890K is an interesting APU...if you're a match for its target audience. Unfortunately, that's a fairly narrow slice of the market. Gamers wanting even a little bit more speed will find themselves disappointed. On an extreme budget, this is a good choice for online and browser-based games.
That's not the 4k BF1 crowd.
 
Well.. I would rather wait for ryzen 7 2800x in 2018. :/
My goodness, you AMD fans certainly are a patient and optimistic bunch. After a few years of hype, Ryzen is finally here, and you're talking about going back to the waiting game for 2018? I guess today's Ryzen disappointed you, huh?
 
I don't understand the disappointment here. In gaming benchmarks the 1800X is generally within 10% of the 7700K, and in productivity work it's almost on par with the 6900K, sometimes falling slightly behind. As I see it, this is a fresh design that needs some software/BIOS tweaks and updates; there are some obvious anomalies. Let's give it some time and see what Ryzen 5 and 3 can do before jumping to conclusions.

I think people just wanted it to be the "be-all, end-all" chip. But with the hype train come unrealistic expectations. The launch of Bulldozer didn't help, either.

But again, I'll reiterate: this is nowhere near the fiasco that Bulldozer was. It's in line with what I expected, and bravo to AMD for bringing their IPC back to competitive with Intel. Simple fact and a no-brainer: anybody who was eyeballing the 6800K or even the 6900K needs to consider Ryzen first. It's downright stupid if you don't.

For pure gaming-only builds, nobody's holding a gun to their head saying they can't buy the 7700K. (Again, that's why I'm sticking with my 6700K, since I have a 165Hz monitor.) The irony in that, though: throughout the years, even since the 2600K, people always touted "if all you do is game, then go with the 4-core i5", yet now that AMD offers chips that greatly exceed the i5s and also have 16 threads, they act like AMD robbed them of their manhood or something.

Now I'm really curious about the 4-core Ryzen chips, though; I wonder if they will be more overclockable, to at least 4.2-4.5GHz. If that happens, then wow, that's going to be the real bargain chip for gamers.
 
The irony in that, though: throughout the years, even since the 2600K, people always touted "if all you do is game, then go with the 4-core i5", yet now that AMD offers chips that greatly exceed the i5s and also have 16 threads, they act like AMD robbed them of their manhood or something.

Now I'm really curious about the 4-core Ryzen chips, though; I wonder if they will be more overclockable, to at least 4.2-4.5GHz. If that happens, then wow, that's going to be the real bargain chip for gamers.
How many i5s cost $400? If you go back to the gaming benchmarks, the $400 1700X didn't "greatly exceed" the $242 i5-7600K. In fact, the $500 1800X didn't greatly exceed the $329 i7-7700K. The argument for the i5s has always been a value argument; none of the Ryzen chips in this review offer good value for their prices (the 1700 was not reviewed).

Take PCWorld's comparison of the $300 1700 against the 5-year-old i5-3570K:
...AMD’s CPUs actually whomp Intel’s chips in multithreaded productivity tasks—and for a fraction of the price of comparable 8-core Core processors. The Ryzen 7 1700 is damned disruptive, and not a dud whatsoever.

No, it’s not a dud—unless you’re looking to replace a 5-year-old, quad-core Intel Core i5 chip for mainstream gaming at the most popular display resolution. There, the Ryzen 7 1700 can stumble, and stumble hard.
So the $300 Ryzen CPU struggles to compete with a 5-year-old i5 in gaming at 1080p. That's bad for AMD.
 
Why wait for a quad-core Ryzen when the 8-core can't even beat a 3770K in games, and it eats power?
I think people who are into video editing or 3ds Max might want one. But let's face it, Intel quads with HT do pretty much everything most people want, plus they are better at gaming.
Oh man, Ryzen has less than 9% lower per-clock gaming performance than Kaby Lake, and most games use 4 threads or fewer. If a 4c/8t Ryzen boosts between 4.20-4.50GHz (like the i5 7600K/i7 7700K) at 80% of the price while delivering 90%+ of the fps, which is the better option? No point in being a fanboy.

The Ryzen 1700/1800X isn't for gaming; it's for rendering/workstation use, like the i7 6800/6900 series but with better prices.
 
Now wait for BIOS/Windows updates to bring better performance, or even wait for the 4c/6c parts for better gaming/overclocking performance per price.
 
Pretty well. Considering how much I paid for the FX years ago, I still consider it a better buy than any Intel was at that time.
Considering CPUs that are older than the FX-8370 (the Core i7 2600K and Core i5 2500K) are destroying it across the board, are you certain you made the right decision? If you break down how much that CPU has cost over the years, including power consumption, I'm pretty certain the Intels were the better buy, and they still perform admirably to this day.

Or have you gone full delusional mode, and you're about to tell me that in 100 years' time, when they can use all the cores, it's going to blow my Intel Core i7 away?
 
How many i5s cost $400? If you go back to the gaming benchmarks, the $400 1700X didn't "greatly exceed" the $242 i5-7600K. In fact, the $500 1800X didn't greatly exceed the $329 i7-7700K. The argument for the i5s has always been a value argument; none of the Ryzen chips in this review offer good value for their prices (the 1700 was not reviewed).

Take PCWorld's comparison of the $300 1700 against the 5-year-old i5-3570K:

So the $300 Ryzen CPU struggles to compete with a 5-year-old i5 in gaming at 1080p. That's bad for AMD.
While I agree with what you say, I couldn't help mentioning that 3570K users (or even 2500K users in this case) find little reason to upgrade to the most recent Intel CPUs either, when gaming is the concern. Some would argue that Ryzen 7 holds extreme value for anything other than gaming as well, but I won't discuss that here since the subject is gaming. It's also better to keep in mind that we have not yet seen how Ryzen 5 performs.
 
AMD did not force them to do anything. Sony & MS had a demand with a boatload of money attached, and AMD happily accepted. Sony Interactive is part of a conglomerate generating 70 billion in gross sales; MS does almost 90 billion. AMD does 4 billion in gross. They force very little when their counterparts do over 12x their gross sales, and they need every dollar they can get.

Sony/MS didn't have another choice. So if AMD offers an octa-core APU and Sony/MS want a quad-core APU, Sony/MS must either accept it or get a much worse deal elsewhere.

Including Intel processors with integrated graphics (that's why Intel has a 71% "market share" in your "stats") and claiming it's representative of the gaming market misses the point entirely. While discrete GPU market share leaves out the AMD APUs, it compares the more comparable market in which AMD and Nvidia compete. That's why in other metrics the Intel/Nvidia lead is many times greater; take, say, February's Steam Survey:

[Image: original-d660ebe0d0095cc982272c64542a5b9f.png]

My stats are not missing the point; they show that looking at discrete cards only is very misleading. Remember that those market share statistics are about quantity, not quality. Most machines shipped have only Intel integrated graphics. Most of the graphics cards Nvidia supplies are ultra-cheap cards for OEM machines, worse than most AMD APUs.

According to the Steam survey, a large majority of people use an AMD CPU or an Intel CPU with no integrated graphics, so the Steam statistics are somehow tweaked. They probably only list the "main" graphics adapter.

Though it's possible Steam is collecting non-gaming data in their survey, Steam's metrics are more applicable when you consider that your (non-linked) metrics capture "all desktop" market share, including desktops that never game.

Claiming that the top-of-the-line A10-7890K is "miles better than any of Nvidia's low-end cards" is not accurate (setting the slider to sub-$50 only gets you old cards). The A10-7890K retails for $150; for that price you can pick up a Pentium G4560 and a GTX 1050 ($105+$60), which will actually be miles better than any AMD APU (even with a 10% price premium). Here's what Tom's had to say about it:

That's not the 4k BF1 crowd.

My point was that the majority of discrete graphics card sales are OEM crap for non-gaming machines. AMD's discrete card share will drop rapidly when its APUs get better.

I set the slider to $50 because the majority of Nvidia's desktop sales are cards cheaper than $50. Basically, AMD's $100 APU is better for games than the majority of Nvidia's GPUs sold. Again, we are talking about quantity.

The Pentium G4560 was only recently released; before that, all Pentiums were dual core. Many modern games require 4 threads, so the previous Pentiums couldn't even launch them. Perhaps that's the reason Intel launched a 4-thread Pentium. So compare the APU to a dual-core Pentium or i3, and the comparison is much fairer (as of Q3/2016).
 
How many i5's cost $400? If you go back to the gaming benchmarks the $400 1700X didn't "greatly exceed" the $242 i5-7600k. In fact the $500 1800X didn't greatly exceed the $329 i7-7700k. The argument for the i5's has always been a value argument; none of the Ryzen chips in this review offer good value for their prices (the 1700 was not reviewed).

Take PCWorld's comparison of the $300 1700 against the 5 year old i5-3570k:

So the $300 Ryzen CPU struggles to compete with a 5 year old i5 in gaming at 1080p. That's bad for AMD.

The $1000 Intel CPU struggles to compete with a 5-year-old i5 in gaming at 1080p. That's what I call an ultimate disaster for Intel :cool:

Considering CPUs that are older than the FX-8370 (the Core i7 2600K and Core i5 2500K) are destroying it across the board, are you certain you made the right decision? If you break down how much that CPU has cost over the years, including power consumption, I'm pretty certain the Intels were the better buy, and they still perform admirably to this day.

Or have you gone full delusional mode, and you're about to tell me that in 100 years' time, when they can use all the cores, it's going to blow my Intel Core i7 away?

I'm still sure. I do many more things than just pure gaming.
 