AMD Ryzen Review: Ryzen 7 1800X & 1700X Put to the Test

I'm sure if you use a little super glue and duct tape, you can squeeze that 1080 in. And yes, by all means, do us all a favor and buy an Xbox, because obviously you have no use for a Personal Computer. Typical n00b kiddo who thinks a PC is only used for gaming and surfing.
After looking at your PC's specs, I now understand your anger and jealousy. How long does it take to upload your comments at 56Kbps anyway?
 
About what was expected from Ryzen. I would revisit performance in about a month or so to see if any of the kinks can be worked out. Gamers should wait for the 4- and 6-core CPUs.
 
Those PC specs were there when you were soiling your pampers, or Huggies. You should probably go back to playing your CS:GO at 4K before you strain something you don't use very often. In a couple of weeks, after nagging mom and dad, you can come back and brag about your shiny new 1080 Ti.
 
This is another Bulldozer hype train all over again.

AMD had YEARS to study Intel's CPUs when making Ryzen.

lol you are delusional.

What people seem to be forgetting is that this is a first-gen product being compared to the 7th-gen Intel 7700K.

This processor has a lot of potential and will get better.

If you were expecting an 8-core processor to be faster than a quad in most games that don't use that many threads, I have a beachfront property to sell you in Alaska. The 7700K also beats all of Intel's 8-core chips!

Years to study Intel's designs?

Do you know how big Intel is and how much money they make compared to AMD?

We are all lucky AMD is still around.

Pretty much what I expect from "gamers". Damn children and your tunnel vision and limited views.
 
I don't know what power consumption numbers you saw, but the ones I saw put it on-par with the 6900K in both power consumption and performance. I'm guessing everyone calling it a bad performance-per-watt choice must be comparing it to Intel quad-cores, mixing up stress power consumption numbers with gaming performance numbers.
Ah yes, looking back I did mistake the 6900K for a quad-core Intel. My bad!
 
Those PC specs were there when you were soiling your pampers, or Huggies. You should probably go back to playing your CS:GO at 4K before you strain something you don't use very often. In a couple of weeks, after nagging mom and dad, you can come back and brag about your shiny new 1080 Ti.
"Those PC specs were there when you were soiling your pampers, or Huggies"

That's not something to brag about. A 49-year-old man rockin' a rig from 2002? Can anyone say s-u-c-c-e-s-s? Secondly, it's hilarious that someone with a setup weaker than what third-world schoolchildren use is talking smack to anyone about tech. Are you drunk?! Or do you just think it's "neat" to talk to a genuine city slicker?

I think you should be the one to get an Xbox- it is literally ten times more powerful than what you have now. Maybe all your farm animals will pitch in and get you a gift card to Cletuses elektroniks store down the way.
 
I will put this here for people wondering about the gaming results:

We had a lot of trouble benchmarking games with Ryzen. It seems we weren't the only ones, and many other reviewers have reported strangely low performance here. Our initial Asus board was plagued with bugs, and we saw some gains simply by switching to a Gigabyte or Asrock board. This really isn't the sort of behavior you'd expect, and AMD even acknowledged there were some issues with some Asus boards.

While we are pretty confident in our application test results, there could be some unresolved early issues with Ryzen and AM4 boards that are leading to strangely low gaming performance. We're not 100% sure what is going on there; Steve and I spent a while discussing what could be up, and we ended up more confused than anything else.

So if you're a gamer who's looking at our gaming results and thinking "that's disappointing", there could be an unresolved story here.

Of course, one possible conclusion is simply that Ryzen isn't that amazing in games, but we're just not fully sure that would still be the case if all the hardware were working correctly.

EDIT: Don't get your hopes up about a potential fix. The results we achieved could be it, and you should make any buying decisions accordingly at this stage. The best thing may be to wait a few weeks just to make sure ;)

Most likely the problem stems from the way SMT is being used, i.e. 2 actual cores and 2 hyperthreads being used instead of 4 actual cores. If you could test with SMT off in games, you'd find a pretty good increase in FPS.
 
Most likely the problem stems from the way SMT is being used, i.e. 2 actual cores and 2 hyperthreads being used instead of 4 actual cores. If you could test with SMT off in games, you'd find a pretty good increase in FPS.
So now people have to go into the BIOS and disable SMT every time they want to fire up a game, then go back in and turn it on when they're done? Does that really sound practical to you?
 
So now people have to go into the BIOS and disable SMT every time they want to fire up a game, then go back in and turn it on when they're done? Does that really sound practical to you?

Practical? No, it does not sound practical, but maybe AMD will update the microcode to only use hyper-threads once all actual cores are exhausted.
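
In the meantime, a per-game workaround is at least imaginable without a BIOS toggle. Here's a rough sketch (my own illustration, not anything AMD or a motherboard vendor ships) that uses Python's psutil to pin a process to one logical CPU per physical core; it assumes SMT siblings are enumerated as adjacent pairs (0/1, 2/3, ...), which is common on Windows but should be verified on your own system.

```python
# Rough illustration only: pin a process to one logical CPU per physical core,
# approximating "SMT off" for that process without touching the BIOS.
# Assumes SMT siblings are adjacent pairs (0/1, 2/3, ...), which is typical
# on Windows but not guaranteed on every platform.
import psutil

def pin_to_physical_cores(pid):
    physical = psutil.cpu_count(logical=False)   # e.g. 8 on a Ryzen 7 1800X
    logical = psutil.cpu_count(logical=True)     # e.g. 16 with SMT enabled
    if not physical or logical != physical * 2:
        return  # SMT off or unusual topology; leave affinity untouched
    first_siblings = list(range(0, logical, 2))  # 0, 2, 4, ... one per core
    psutil.Process(pid).cpu_affinity(first_siblings)

# e.g. pin_to_physical_cores(1234) where 1234 is the game's process ID
```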
 
Funny how some are saying to turn off SMT on Ryzen, but no one is saying to turn off HT on Intel's Core chips.

Now there have always been gamers making claims about HT being better off or on. Even when BF1 launched there were people saying this.
I have messed with HT off and on over the years with various games. I have never noticed a single difference in performance.

I am running an i7-3770K with an EVGA 1070 FTW. It plays BF1 on Ultra and R6 Siege on max. No issues that I've seen that would be related to the CPU.
 
What I am interested in knowing is why Ryzen seems to be "smoother" than an i7. Even if the hiccup is rare, why is there even a hiccup at all? Is HT causing it, or something else altogether? That I wouldn't mind knowing.
As I have said, I haven't noticed a difference with HT on or off, or even a performance issue or any anomalies.

So for me, I'd think that Ryzen has something helping it achieve this smoother effect, rather than something an i7 could turn off or on.
 
Looking at the gaming results, I think I may just stick with Intel, considering the potential drop in price that Ryzen may bring about.

Think I might just wait another month until conclusive reports materialize.

I would just wait for AMD's quad cores if price is an issue. Most likely they will clock to 4.2GHz+, and they will cost half as much as an i7.

Again, keep in mind that the 8-core is meant to compete with Intel's 6-10 core CPUs, and it does well. But 4-core Kaby Lake will undoubtedly still have a gaming advantage (most of the time, not always).
 
Most likely the problem stems from the way SMT is being used, i.e. 2 actual cores and 2 hyperthreads being used instead of 4 actual cores. If you could test with SMT off in games, you'd find a pretty good increase in FPS.

This was not the issue. A game like Watch Dogs 2 uses a full 16 threads and it still lags behind Intel's 16-thread CPUs considerably, despite posting near-equal application performance results.
 
Looking at the gaming results, I think I may just stick with Intel, considering the potential drop in price that Ryzen may bring about.

Think I might just wait another month until conclusive reports materialize.

I would just wait for AMD's quad cores if price is an issue. Most likely they will clock to 4.2GHz+, and they will cost half as much as an i7.

Again, keep in mind that the 8-core is meant to compete with Intel's 6-10 core CPUs, and it does well. But 4-core Kaby Lake will undoubtedly still have a gaming advantage (most of the time, not always).

From what I've heard, Ryzen 5 will be clocked very similarly to Ryzen 7 parts. There will just be fewer cores.
 
Let me be the one crashing the party for Intel fanboys: Intel has to do better; its supremacy is long gone. The future is in multicore, multithreaded and heterogeneous computing, that's where software companies are heading, and AMD is better suited for that. @Steve: You should revise the power consumption figures or check your testing hardware.
 
This review seems to be skewed since Intel's 6900K had twice the memory of the Ryzen 1800X: 32GB of RAM vs 16GB. Unless the AMD chipset disallowed the extra memory, a redo is in order to determine the truth.
 
Again I'd argue that's not the case, as PCGamer.com indicates this is an issue with Ryzen, not Core i7s and i3s. I would concede that perhaps a driver does this in Windows automatically (a solution the author I linked suggests), but AMD's fix sent to reviewers, of disabling SMT in the BIOS, is not a fix - that's a bug.

As I already proved, that's the case with every SMT CPU unless you use software tweaks that, for example, first load all physical cores and only after that the logical cores, effectively disabling SMT in many cases. So that is an issue with all Core ix CPUs with SMT too. The difference between SMT on/off may not be the same, but it is still there.

So that's not a bug, it's just how SMT is supposed to work. CMT (Bulldozer's module design) has at least that advantage: on Bulldozer's CMT design, two integer threads on one module are never slower than one integer thread on one module.
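
Purely as an illustration of what such a "physical cores first" tweak would need to know, here's a small Linux-only sketch (not any shipping scheduler or vendor tool) that reads the SMT sibling topology the kernel exposes under sysfs:

```python
# Linux-only sketch: discover which logical CPUs are SMT siblings by reading
# the topology files the kernel exposes. A "physical cores first" policy
# would place one thread per group before using any second sibling.
import glob

def smt_sibling_groups():
    groups = set()
    paths = glob.glob("/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list")
    for path in paths:
        with open(path) as f:
            groups.add(f.read().strip())  # e.g. "0,8" or "0-1" per sibling pair
    return sorted(groups)

print(smt_sibling_groups())
```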

So now people have to go into the BIOS and disable SMT every time they want to fire up a game, then go back in and turn it on when they're done? Does that really sound practical to you?

Intel users should have been doing that since Pentium 4 days (2001) if they wanted the best gaming performance ;)

This was not the issue. A game like Watch Dogs 2 uses a full 16 threads and it still lags behind Intel's 16-thread CPUs considerably, despite posting near-equal application performance results.

There's only a 16% difference in minimum frame rates between 4C/8T (7700K) and 8C/16T (6900K) in Watch Dogs 2. Average frame rates are the same. So even if Watch Dogs 2 "uses" 16 threads, the optimization is terrible.

And it indeed does not use 16 threads, as said in the article:

The last game I had time to test was Watch Dogs 2, in which the CPU utilization hovered between 70 and 80% utilization on the Ryzen processors with an even load across all 16 threads.

That's more like 10-12 threads at most.
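
For what it's worth, here's the back-of-the-envelope math behind that estimate (a rough sketch, simply treating the reported average utilization as evenly spread across the hardware threads):

```python
# Rough sketch of the estimate above: if all 16 hardware threads sit at an
# even 70-80% load, the total work is equivalent to this many fully busy threads.
logical_threads = 16
for utilization in (0.70, 0.80):
    busy_equivalent = logical_threads * utilization
    print(f"{utilization:.0%} across {logical_threads} threads ≈ {busy_equivalent:.1f} busy threads")
# prints roughly 11.2 and 12.8, i.e. the game keeps about a dozen threads' worth
# of work going rather than saturating all 16
```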
 
Just as I expected. Falls short of the pre-release hype it created.

My only reason for building a new desktop is to play games, and my aging 4790K still crushes these two CPUs. Good to know that I can still cruise along with it.

In all fairness, I did a lot of reading and wanted to get the best AMD CPU when I was building my 4790K rig, but in the end it didn't feel worth it given the heat and efficiency issues of AMD CPUs, and I was disappointed with the Fury card's hype, which fell short of expectations too.
 
Just as I expected. Falls short of the pre-release hype it created.

My only reason for building a new desktop is to play games, and my aging 4790K still crushes these two CPUs. Good to know that I can still cruise along with it.

In all fairness, I did a lot of reading and wanted to get the best AMD CPU when I was building my 4790K rig, but in the end it didn't feel worth it given the heat and efficiency issues of AMD CPUs, and I was disappointed with the Fury card's hype, which fell short of expectations too.

Ryzen delivered what the hype promised. It was not promised to be a CPU for low-res gaming only; it was promised to challenge the i7-6900K at a much lower price. The i7-6900K is not meant for low-res gaming only, btw.
 
Can you benchmark games with fewer cores enabled? Like 4 cores only, for example.
 
So yes, that (disabling HT in the BIOS) is SOP when using Intel CPUs if you want to get the best performance in every piece of software.

In most games, disabling SMT means higher performance.
Your "advice" is about 6 years out of date. There certainly were initial issues on Intel Pentium 4 CPU's with HT performance regressions in a large number of apps that continued for 1-2 generations, but that stuff has long been sorted. Turning an i7 back into an i5 (or an i3 back into a 2C/2T Pentium) certainly doesn't "get the best out of it in most games". HT is best left enabled unless you want to cherry pick the 1-in-1000 obscure badly written piece of software that screws up core affinity then try and falsely hold up that outlier as "the average".

HT may be new on Ryzen, but it's not like we have to "guess" its impact when almost every major game over the past several years has been benchmarked on both i5 and i7's (including this article where an i7-6700K is "slower" than an i5-6600K exactly 0% of the time). Likewise, there have been reviews in the past of someone taking an i7, i5, i3 and Pentium, adjusting all clocks to the same frequency and visibly seeing the effect of toggling HT on/off on the same CPU with same clock rate, cache size and physical cores. Gee a 26-43% improvement, what a horrible "regression". Same with video encoding, file compression, etc, typically a consistent 20-30% faster with HT.http://techbuyersguru.com/sites/def...eGamersBench/HaswellGaming/crysis3haswell.png

Sure there are cases where a few rare games will run a tiny bit slower (typically 1/6th negative impact of the far higher positive impact of leaving it enabled), and plenty of cases where games won't run faster simply due to not being very well threaded, but to effectively claim "Because Ryzen's HT implementation is a little flaky at the moment in some benchmarks, therefore on most games and in general usage so too must all i7's run slower than i5's" is a stupidly false extrapolation.
 
Just as I expected. Falls short of the pre-release hype it created.

My only reason for building a new desktop is to play games, and my aging 4790K still crushes these two CPUs. Good to know that I can still cruise along with it.

In all fairness, I did a lot of reading and wanted to get the best AMD CPU when I was building my 4790K rig, but in the end it didn't feel worth it given the heat and efficiency issues of AMD CPUs, and I was disappointed with the Fury card's hype, which fell short of expectations too.
Again... This is an 8C/16T CPU... It is not tailored for gaming specifically, but is more HEDT/workstation centric. If gaming is all that matters, go laugh at $1,500 6950X owners for losing to 7700Ks...
 
A much better showing than the highly disappointing Bulldozer family, but not exactly a "wow" factor chip. I'm more interested in the Ryzen 3 & 5 chips, but as of now there still seems to be little reason to upgrade from my OC'd i5-2500K @ 4GHz.

I think the upgrade would be small; the same goes for the 7700K route though. Only if you need more threads would the 1700 make any sense. The 1700 at 3.9GHz will beat your i5-2500K at gaming though, and it will destroy it (and the 7700K) in multithreaded applications (Adobe, encoding, streaming, etc.).

BUT, I am running a Phenom II X6 @ 3.7GHz, and it is keeping me happy for now. I will wait until AMD irons out the bugs and the software gets optimized, and then I'll consider the 1500X. If I am right, the 1500X will have MUCH higher gaming performance than the R7 series because it will only have 1 core complex (CCX, I believe) instead of 2 (which is currently bottlenecking the R7; AMD needs to address this through software and OS optimization).

Stay tuned though, Ryzen does have some fine-wine qualities in it, as you can see from the synthetic benchmarks. I bet in a month or two, Ryzen will be equal to or better than the 6900K in a lot of games (not all), and so will also be better than the 7700K @ 5.1GHz in very well multithreaded games. Wait and see ;)
 