Simulating AMD Ryzen 5 1600X, 1500X Gaming Performance

These benchmark articles always make me wonder what a Ryzen would do with another AMD product like the RX 480 for a GPU. AMD processor, AMD chipset, AMD GPU - it would be interesting to see if there is any optimization compared to always running Nvidia GPUs in these tests. Past AMD APUs and GPUs had some interesting dynamics at times with hybrid CrossFire and such, so it makes me curious.

And, let's be real here - price vs. performance is potentially one of the main appeals of Ryzen chips, so why are the affordable AMD graphics options always apparently locked out of benchmarking, when they would fall right in with the budget-conscious crowd?
Because none of the current AMD graphics cards are strong enough to remove the GPU bottleneck and accurately determine CPU performance. Look at Ryzen performance in 4K benchmarks - the gap to Intel is negligible because of the GPU bottleneck.
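To make that concrete: the frame rate you actually measure is roughly whichever is lower, the rate the CPU can prepare frames or the rate the GPU can render them, so once the GPU is the limit every CPU scores about the same. A rough sketch of that idea with made-up numbers (none of these figures come from any review):

Code:
# Toy illustration of why a GPU bottleneck hides CPU differences.
# All numbers are invented for the example, not benchmark results.

def observed_fps(cpu_fps_limit, gpu_fps_limit):
    """The measured frame rate can't exceed either component's limit."""
    return min(cpu_fps_limit, gpu_fps_limit)

cpus = {"CPU A": 160, "CPU B": 120}  # frames per second each CPU could feed the GPU

# Fast GPU at 1080p: the CPU is the limit, so the gap between CPUs shows up.
for name, cpu_limit in cpus.items():
    print(name, "- CPU-bound:", observed_fps(cpu_limit, 200), "fps")

# Weaker GPU, or the same GPU at 4K: the GPU is the limit, so both CPUs tie.
for name, cpu_limit in cpus.items():
    print(name, "- GPU-bound:", observed_fps(cpu_limit, 90), "fps")

That's why CPU reviews pair every chip with the fastest GPU available and drop the resolution to 1080p or lower.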
 
Moreover, overclocking does not help a lot in the long run; at most it will tide you over a couple of months before you wish to upgrade to a better CPU.

Wait, what? You are joking, right? Or is this what you've become accustomed to by using AMD CPUs? Must be, because people running i7s with a decent overclock can go YEARS, yes YEARS, without the need to upgrade CPUs. Hell, I nearly got a whole decade out of my first-gen i7 platform; to this day that very chip holds its overclock at 4.0+ GHz and is still viable in every game I throw at it.
 
Funny, since looking at the Tom's Hardware review, it shows the i5 holding its own against the multi-core 1700 or beating it in practically every game, but whatever "alternative facts" float your AMD fanboy boat.

http://www.tomshardware.com/reviews/amd-ryzen-vs-intel-kaby-lake-gaming,4977-5.html
Funny, since looking at the Tom's Hardware review, it shows the i5 holding its own against the multi-core 1700 or beating it in practically every game, but whatever "alternative facts" float your AMD fanboy boat.

http://www.tomshardware.com/reviews/amd-ryzen-vs-intel-kaby-lake-gaming,4977-5.html
In response to me mentioning how misleading framerate tests are in CPU game benchmarking, you link an article showing CPU framerates to back up your claim. You need to go out and build 3-4 systems and test for yourself, or just keep using outdated hardware. Either way, using any CPU with fewer than 8 threads today is not optimal (check out Digital Foundry's videos); they basically dismiss i5s as gaming CPUs outright at this point. In my own testing I've seen the same results. 6c/12t CPUs should be the minimum purchase as of April 11th, regardless of which manufacturer you choose to go with.
 
Because none of the current AMD graphics cards are strong enough to remove the GPU bottleneck and accurately determine CPU performance. Look at Ryzen performance in 4K benchmarks - the gap to Intel is negligible because of the GPU bottleneck.
That's funny, considering all we ever hear regarding AMD's cards is that they're much more CPU-bottlenecked than Nvidia's.
 
In response to me mentioning how misleading framerate tests are in CPU game benchmarking, you link an article showing CPU framerates to back up your claim. You need to go out and build 3-4 systems and test for yourself, or just keep using outdated hardware. Either way, using any CPU with fewer than 8 threads today is not optimal (check out Digital Foundry's videos); they basically dismiss i5s as gaming CPUs outright at this point. In my own testing I've seen the same results. 6c/12t CPUs should be the minimum purchase as of April 11th, regardless of which manufacturer you choose to go with.

You are right and have convinced me of the error of my ways. I should stop reading these "fake review" websites like AnandTech, TechSpot, TechPowerUp, Tom's Hardware, etc., as they are all just printing reviews Intel wants us to read and believe. The real reviews are on the "alternative review" sites that post true facts like AMD APUs showing better performance than Nvidia 1080 Ti cards. I've seen these reviews on the internet, therefore they must be true! Quick, someone pass me a tinfoil hat!
 
You are right and have convinced me of the error of my ways. I should stop reading these "fake review" websites like AnandTech, TechSpot, TechPowerUp, Tom's Hardware, etc., as they are all just printing reviews Intel wants us to read and believe. The real reviews are on the "alternative review" sites that post true facts like AMD APUs showing better performance than Nvidia 1080 Ti cards. I've seen these reviews on the internet, therefore they must be true! Quick, someone pass me a tinfoil hat!
Do you read the articles on those sites or just look at the pictures? Steve said at the end of his 1800X review that the Ryzen CPU produced less stutter and a smoother experience compared to a 7700K. Not sure how many people need to tell you this, but seeing is believing. That's why I said to test it for yourself.
 
Gaming benchmarks are kind of "isolated" tests, meaning they usually don't run much stuff in the background in order to get a consistent result across different hardware. They usually even do a fresh OS install, or so I've read in many benchmark articles. Frame consistency has been a long-running discussion topic and I'm not that much into the tech details, BUT I've read a lot of complaints from people using i5s (check Reddit, for instance) that they get stuttering in games; some say they stop/disable background tasks to prevent this. Now, I think quad cores may not be showing their limit if you only run the game and nothing more, but for example you wouldn't like to see your CPU at 100% usage on all cores if you want to record/stream your progress OR if you need to run some background stuff while you enjoy your game. May not be relevant for the average Joe, though.
 
That's funny, considering all we ever hear regarding AMD's cards is that they're much more CPU-bottlenecked than Nvidia's.
That was about the driver overhead in DX11. They kinda fixed that in the past year with multiple big driver updates (albeit it's still not at Nvidia's level).
FYI, it doesn't really matter how much of the CPU a GPU uses; all that matters is the final numbers.
 
Literally nobody buys the 7350K though, lol - it's the most irrelevant CPU in the universe.

As for the 1500X being remarkable value, I don't get it. The 7600K is like $200 in many places now, right? I have seen it. So the 1500X is not going to be faster than that even with the extra threads, it seems. Nothing much remarkable about a CPU $10 cheaper being there or thereabouts of one that has been around a little while now. Ryzen won't change the quad-core scene much.

Nope, it's the 1600 that will be the best part in the whole lineup for the price. Six cores for the price of a mainstream i5 is more like it. Previously a 6-core setup was expensive; the cheapest Intel 6-core and a board to support it have always been something like $600.

Now you'll be able to get a good 6-core setup for as little as ~$350.
The 1500X is considered a remarkable value because you can OC it on a cheap B350 chipset and you also get the big Wraith Spire cooler ---> but this is only if you are thinking about gaming on a tight budget. The 1600 is definitely the sweet spot and worth the extra $30.

Are the lower-end Z170/Z270 boards that expensive, though? I mean, you can get them for like $100 these days. It's like $10 here or there versus a B350, which is really an inferior chipset anyway, isn't it? Nothing much to shout about.

So I can't really say remarkable at all. Good, potentially. Remarkable? Not like a 1600 would be, given the price differential I pointed out against previous Intel 6-core setups.

7600K for $200? Well, yes, only in some places; in most places it's still above $220, and you MUST get a cooler for the 7600K since it doesn't come with a stock cooler, and to achieve 4.8GHz or above you'll need at least a $30 cooler. With the 1500X or 1600 you can achieve 3.9GHz with the Wraith Spire stock cooler (look at the Bitwit video regarding the Wraith Spire). As for the motherboard, B350 boards start around $80 while Z170/Z270 is still above $100.
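Just to tally up the platform costs being thrown around in this thread (treat these as rough, hypothetical prices - they vary by region and by week):

Code:
# Rough platform-cost tally using the approximate prices quoted in this thread.
# All prices are hypothetical/regional and change over time.

builds = {
    "i5-7600K + cooler + Z270":      {"cpu": 220, "cooler": 30, "board": 100},
    "Ryzen 5 1500X + Wraith + B350": {"cpu": 190, "cooler": 0,  "board": 80},
    "Ryzen 5 1600 + Wraith + B350":  {"cpu": 220, "cooler": 0,  "board": 80},
}

for name, parts in builds.items():
    total = sum(parts.values())
    detail = ", ".join(f"{part} ${price}" for part, price in parts.items())
    print(f"{name}: ${total}  ({detail})")

On those numbers the AMD platforms land roughly $50-80 cheaper once the cooler is factored in, which is the whole point being argued here.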
 
That was about the driver overhead in DX11. They kinda fixed that in the past year with multiple big driver updates (albeit it's still not at Nvidia's level).
FYI, it doesn't really matter how much of the CPU a GPU uses; all that matters is the final numbers.
Not really... Only Polaris has a lowered CPU overhead. Anything that's Fiji and earlier still has massive CPU bottlenecks. And if we're testing CPUs, it still baffles me why no one tested Ryzen with a Fury X card.
 
Wait, what? You are joking, right? Or is this what you've become accustomed to by using AMD CPUs? Must be, because people running i7s with a decent overclock can go YEARS, yes YEARS, without the need to upgrade CPUs. Hell, I nearly got a whole decade out of my first-gen i7 platform; to this day that very chip holds its overclock at 4.0+ GHz and is still viable in every game I throw at it.
Oh c'mon... I know there are only a few games that really benefit from OC. Most games, like Witcher 3 or Tomb Raider, will gain 1 or 2 fps even after you OC your CPU by 1GHz. Intel or AMD does not matter. It's just not worth it to me. Better to upgrade your CPU and GPU every 2-3 years.
 
Not really... Only Polaris has a lowered CPU overhead. Anything that's Fiji and earlier still has massive CPU bottlenecks. And if we're testing CPUs, it still baffles me why no one tested Ryzen with a Fury X card.
Sorry, but you have no idea what you are talking about, dude. You need to do a bit more research.
 
I don't regret my new 7700K.
Nor should you (especially if you grabbed it from Microcenter when it was $299.99 right before Ryzen launched).

To me, focusing on value as the sole determining factor for a purchase overlooks too many other criteria. I purchased the i5-3570K not because I wanted to overclock but because I wanted the best i5 in the lineup. It has served me well for 4+ years and stays cool with the 212 Evo on top.

With video cards I stick to Nvidia because they're more efficient, so they run cooler and quieter at similar performance to the AMD parts. In the micro-ATX case I use, this allows me to stay below 60 degrees C on both with near-silent operation (both OC'ed). I paid a premium for better performance (1060 6GB over RX 480, MSI Gamer X) and thermals, but way overpaid if value were my sole criterion.

Keeping my wife happy by her not hearing the fans, and myself happy with excellent performance, is well worth the $100 or more I spent.
 
Do you read the articles on those sites or just look at the pictures? Steve said at the end of his 1800X review that the Ryzen CPU produced less stutter and a smoother experience compared to a 7700K. Not sure how many people need to tell you this, but seeing is believing. That's why I said to test it for yourself.

True, he said that for "some games", but let's display the entire line, shall we?

Performance was smooth with the Ryzen processors while every now and then the quad-core 7700K had a small hiccup. These were rare but it was something I didn't notice when using the 1800X and 1700X. But as smooth as the experience was, it doesn't change the fact that gamers running a high refresh rate monitor may be better served by a higher clocked Core i7-6700K or 7700K.
 
Nor should you (especially if you grabbed it from Microcenter when it was $299.99 right before Ryzen launched).

To me, focusing on value as the sole determining factor for a purchase overlooks too many other criteria. I purchased the i5-3570K not because I wanted to overclock but because I wanted the best i5 in the lineup. It has served me well for 4+ years and stays cool with the 212 Evo on top.

With video cards I stick to Nvidia because they're more efficient, so they run cooler and quieter at similar performance to the AMD parts. In the micro-ATX case I use, this allows me to stay below 60 degrees C on both with near-silent operation (both OC'ed). I paid a premium for better performance (1060 6GB over RX 480, MSI Gamer X) and thermals, but way overpaid if value were my sole criterion.

Keeping my wife happy by her not hearing the fans, and myself happy with excellent performance, is well worth the $100 or more I spent.

I actually bought the 1700X (@3.9GHz), but in gaming it was virtually identical to my 2500K @ 4.6GHz with 2133MHz memory. Yes, multitasking was on a different level and Windows was a bit faster, and that is nice, but I wanted a bigger improvement and I don't do a lot of heavy multitasking.

Then I got the 7700K on Monday. WHOA. Big difference. Windows is MUCH faster than on the 2500K and still much better than on the 1700X.
 
Look how much I have no idea what I'm talking about:

What you said: "A Fury X at 1080p likely causes even more CPU bottleneck than even a GTX 1070"
What the video discovered: Nvidia has a problem with Ryzen and DX12 in Rise of the Tomb Raider most likely because of a driver issue/bottleneck

Seriously, dude. WTF do you want from me? What more can I say when you try to argue with me by posting something that clearly has nothing to do with your statement? This is literally you proving that I was right.
 
Memory speed matters a lot for Ryzen;


...actually no

We are happy to report that you can save some money by choosing a slower DDR4-2133 or DDR4-2666 memory, at least until DDR4-3200 or higher memory becomes more affordable. You lose practically no performance to slower memory on the Ryzen platform, when averaged across our CPU tests. The fastest memory configuration in our bench, DDR4-3200 CL14, is about 3.1 percent faster than the slowest DDR4-2133 configuration. In specific tests, the differences in performance can be larger than the average. WinRAR handles a 1.5 GB compression job 5 seconds faster on DDR4-3200 than DDR4-2133, for example.

https://www.techpowerup.com/reviews/AMD/Ryzen_Memory_Analysis/13.html
 
...actually no

We are happy to report that you can save some money by choosing a slower DDR4-2133 or DDR4-2666 memory, at least until DDR4-3200 or higher memory becomes more affordable. You lose practically no performance to slower memory on the Ryzen platform, when averaged across our CPU tests. The fastest memory configuration in our bench, DDR4-3200 CL14, is about 3.1 percent faster than the slowest DDR4-2133 configuration. In specific tests, the differences in performance can be larger than the average. WinRAR handles a 1.5 GB compression job 5 seconds faster on DDR4-3200 than DDR4-2133, for example.

https://www.techpowerup.com/reviews/AMD/Ryzen_Memory_Analysis/13.html
The productivity results are as I expected them to be, but you should ignore those gaming benchmarks. They only did the average FPS, which is not what I wanted; they should have shown the percentiles too (1% and 0.1% lows). People are reporting some really positive results there. I wonder what the min FPS numbers were for them.
But even with just the averages, we still see some games getting ~10 extra FPS. DDR4-2666 is the minimum I would recommend based on these benchmarks (at least until 3200 becomes cheaper).
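For anyone unsure what those percentile figures are: you log every frame's render time, take the slowest 1% (or 0.1%) of frames, and report the frame rate implied by those. One common way to compute it - a quick sketch with synthetic data, not taken from any benchmarking tool:

Code:
# Sketch of deriving 1% / 0.1% low FPS from a frame-time log (times in milliseconds).

def percentile_low_fps(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (0.01 for the 1% lows)."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    count = max(1, int(len(worst) * fraction))
    avg_ms = sum(worst[:count]) / count
    return 1000.0 / avg_ms

# Synthetic run: mostly ~10 ms frames (100 fps) with a handful of 30 ms stutters.
frame_times = [10.0] * 990 + [30.0] * 10

print("average FPS :", round(len(frame_times) * 1000.0 / sum(frame_times), 1))
print("1% low FPS  :", round(percentile_low_fps(frame_times, 0.01), 1))
print("0.1% low FPS:", round(percentile_low_fps(frame_times, 0.001), 1))

The average still reads ~98 fps while the 1% lows sit around 33 fps, which is exactly the kind of stutter an average-only chart hides.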
 
What you said: "A Fury X at 1080p likely causes even more CPU bottleneck than even a GTX 1070"
What the video discovered: Nvidia has a problem with Ryzen and DX12 in Rise of the Tomb Raider most likely because of a driver issue/bottleneck

Seriously, dude. WTF do you want from me? What more can I say when you try to argue with me by posting something that clearly has nothing to do with your statement? This is literally you proving that I was right.
First, I want you to calm down. Arguing with emotional outbursts is not healthy for anyone.

Second, if people weren't too busy trashing AMD cards by making up excuses that they are not fast enough, and instead simply took the time to benchmark a Fury card at 1080p, the driver issue with Nvidia+Ryzen would have come to light. Now Ryzen is unfairly being trashed because no one in the press decided to test with AMD cards.

Thirdly, yes, it has to do with my statement. Or are you going to tell me that RX 480s in CrossFire are less of a hit on a CPU than a GTX 1070/GTX 1080? What about a Fury X vs. RX 480s in CrossFire? Don't think the Fury X can tax CPUs enough at 1080p? That was your argument, wasn't it? I'm quite sure that it can, considering it's been notorious for doing that exact thing, as I already mentioned. But whatever.

Bottom line is that the fastest AMD card should have been used for at least a single Ryzen test, even if it is not in the performance league of a GTX 1080. Being able to produce higher framerates does not necessarily mean that a card is using more CPU. A slower card can use more CPU if its front end and driver are different.
 
I am desperate to see some HandBrake results, as the latest version runs very slowly on a 3.3GHz i5-2500. Where should my upgrade money go? The R5 1600 looks very nice on paper, but how would the loss of AVX2/SSE3 affect encode speed? Or should I just keep what I've got, encode slowly, and spend the dosh on more hard disks (10TB)? Mingle with the shingle (SMR)? HAMR time... or getting by on helium.
 
7600K for $200? Well, yes, only in some places; in most places it's still above $220, and you MUST get a cooler for the 7600K since it doesn't come with a stock cooler, and to achieve 4.8GHz or above you'll need at least a $30 cooler. With the 1500X or 1600 you can achieve 3.9GHz with the Wraith Spire stock cooler (look at the Bitwit video regarding the Wraith Spire). As for the motherboard, B350 boards start around $80 while Z170/Z270 is still above $100.

It's $200 now and will probably drop even further when those parts launch. A decent B350 board is still $90, and I wouldn't call Z170/Z270 vastly superior, but again, it IS technically better than B350 for $100. You pay a bit more, and you get a bit more.

You do realise that if we factor in overclocking, the 7600K is likely to beat up the 1500 and be totally worth the extra money you spend on a cooler for it? It seems any Ryzen really struggles to exceed 4GHz, but most 7600Ks knock on the door of 5GHz on air.

Like I said, 'remarkable' isn't the word I would choose. It's 'decent' value or 'good' value at best. Nothing remarkable whatsoever about a chip and platform that's a little cheaper than an existing one and kind of matches it, or loses by a bit.

The 1600 is genuinely 'remarkable' value. The quad cores look nowhere near as interesting for their price based on the simulated performance here.
 