Simulating AMD Ryzen 3 1200, 1300 Performance

Julio Franco

Very interesting. It's quite nice to see AMD competitive again, but I'm more eager to see these in APU form. For workstations it's already a done deal, but for TV-PCs and budget gaming rigs, and more importantly for mobile, I'm enthusiastic that these chips can change the landscape again.

I run an A8-5600K for my TV-PC, attached to a GTX 750 Ti and a (dying but beautiful) Panasonic plasma. It's prone to overheating and kinda sucks, but it still knocks the socks off the consoles and even works well enough for content-creation work in Maya and Photoshop. That's compared to a full Bulldozer 8120 rig for hardcore gaming and two Piledriver 8350s I use for rendering work. It's simply far more enjoyable to sit in front of my TV with a decent (if dated) graphics card, a pair of DS4 controllers, and my pals. You can't game in the living room on a 20-whatever-inch monitor and have a good time.

So while I'm looking forward to Ryzen 1700s in my workstations, they're already pretty fast (24 cores @ 5 GHz), and I'm far more excited about replacing this TV-PC's innards. I don't need the APU setup, but given the current landscape it seems like Ryzen 3 APUs could make a serious dent in the mobile world. $300 laptops that don't suck? $500 laptops that can actually compete? We'll see.
 
Great review. I missed out on the G4560 when it was in stock at Microcenter for $60-65 and have been regretting it since (I was going to use it to rebuild my Plex/media server).

Should the Ryzen 3 performance estimates turn out to be as accurate as your Ryzen 5 estimates, it looks like I'll retire my 3570K to server duty and get a Ryzen 5 1600 for my new desktop, unless Intel's Cannon Lake or another 10nm chip offers something better.
 
Very good review. Here is my reservation about Ryzen for gaming (not general use). Looking at the i7-7700K vs. 7800X review: https://www.techspot.com/review/1445-core-i7-7800x-vs-7700k/

you see a 4/8 (core/thread) CPU delivering the same or better gaming performance than its 6/12 counterpart. Granted, the architectures are slightly different, but both CPUs were overclocked to relatively similar clock speeds for the test.

Now look at the Ryzen 1500X and 1600 performance:
https://www.techspot.com/article/1381-ryzen-1600x-vs-1600/
https://www.techspot.com/review/1379-and-ryzen-5-1600x-1500x/
Once again it's 4/8 vs. 6/12 cores at similar clock speeds, yet here the 1600 displays a clear and distinct gaming advantage, unlike in the Intel test above.
 
The gaming tests didn't surprise me. The 1800X struggles at 1080p, which saw the biggest gains on Steam of any resolution.

Ryzen is for the 20%, not the 80% AMD needed.
 
The gaming tests didn't surprise me. The 1800X struggles at 1080p, which saw the biggest gains on Steam of any resolution.

Ryzen is for the 20%, not the 80% AMD needed.

What logic :D 90% of Steam users have a much weaker CPU than the 1800X, and those who have an 1800X don't play at 1080p...
 
The gaming tests didn't surprise me. The 1800X struggles at 1080p, which saw the biggest gains on Steam of any resolution.

Ryzen is for the 20%, not the 80% AMD needed.

I couldn't agree more. An 1800X can't push a 1080, let alone a 1080 Ti, to its max. What happens a year or two down the road when a 2060 or 3060 comes out with gaming horsepower comparable to a current 1080 Ti? It means a CPU that can't push a 1080 Ti can never take advantage of a 2080 Ti or 3080 Ti. I would rather have the faster Intel cores, knowing they have the horses to drive 144+ FPS at 1080p, and future-generation video cards with way more power can and will push that 144 at 1440p or 4K.

Everyone's screaming "Thank god for AMD!" The only area where they are competitive is the content-creation space. I recently sold my overclocked 2600K PC and bought a 1600 system. All I can say is that my attempt at an "upgrade" is anything but. I am seeing lower minimum and lower maximum frame rates. Pretty disappointing.

If you have the cash and you're gaming, no matter the resolution, get a 7700K or 7600K, overclock it to 4.8 GHz or so, and hold on for the next three years or so, because you'll have the best you can buy, and the baddest GPUs over the next couple of generations will blow you away with that kind of CPU power pushing them.
 
The gaming tests didn't surprise me. The 1800X struggles at 1080p, which saw the biggest gains on Steam of any resolution.

Ryzen is for the 20%, not the 80% AMD needed.

What logic :D 90% of Steam users have a much weaker CPU than the 1800X, and those who have an 1800X don't play at 1080p...

And those 1800X gamers are going to be pissed when they upgrade their high-end GPU in a few years and see marginal gains, because their 1800X's low single-core speed has been holding them back this whole time.
 
I couldn't agree more. An 1800X can't push a 1080, let alone a 1080 Ti, to its max. What happens a year or two down the road when a 2060 or 3060 comes out with gaming horsepower comparable to a current 1080 Ti? It means a CPU that can't push a 1080 Ti can never take advantage of a 2080 Ti or 3080 Ti. I would rather have the faster Intel cores, knowing they have the horses to drive 144+ FPS at 1080p, and future-generation video cards with way more power can and will push that 144 at 1440p or 4K.

Everyone's screaming "Thank god for AMD!" The only area where they are competitive is the content-creation space. I recently sold my overclocked 2600K PC and bought a 1600 system. All I can say is that my attempt at an "upgrade" is anything but. I am seeing lower minimum and lower maximum frame rates. Pretty disappointing.

If you have the cash and you're gaming, no matter the resolution, get a 7700K or 7600K, overclock it to 4.8 GHz or so, and hold on for the next three years or so, because you'll have the best you can buy, and the baddest GPUs over the next couple of generations will blow you away with that kind of CPU power pushing them.

I almost had myself convinced a 1600X would be worth it. I had it all in my Newegg wishlist, until I kept seeing it fail at 1080p compared to the competition. Everyone wants to say drivers and patches and optimizations will fix it, but if it's just going to be the same excuse as the previous generation and the generation before that, then I'm not going to bother. I need four fast cores, not eight slow ones. AMD missed the mark.

And those 1800X gamers are going to be pissed when they upgrade their high-end GPU in a few years and see marginal gains, because their 1800X's low single-core speed has been holding them back this whole time.

They'll be playing a lot of Cinebench though!
 
I couldn't agree more. An 1800X can't push a 1080, let alone a 1080 Ti, to its max. What happens a year or two down the road when a 2060 or 3060 comes out with gaming horsepower comparable to a current 1080 Ti? It means a CPU that can't push a 1080 Ti can never take advantage of a 2080 Ti or 3080 Ti. I would rather have the faster Intel cores, knowing they have the horses to drive 144+ FPS at 1080p, and future-generation video cards with way more power can and will push that 144 at 1440p or 4K.

What happens? Almost nothing. Any CPU that is fast enough for a 1080 will be fast enough for an imaginary "2080". But future games will use more cores, and when future games use more cores than the 7700K has to offer, it will be slow.

Everyone's screaming "Thank god for AMD!" The only area where they are competitive is the content-creation space. I recently sold my overclocked 2600K PC and bought a 1600 system. All I can say is that my attempt at an "upgrade" is anything but. I am seeing lower minimum and lower maximum frame rates. Pretty disappointing.

If you have the cash and you're gaming, no matter the resolution, get a 7700K or 7600K, overclock it to 4.8 GHz or so, and hold on for the next three years or so, because you'll have the best you can buy, and the baddest GPUs over the next couple of generations will blow you away with that kind of CPU power pushing them.

You are playing crappy games then.

The 7700K won't last until next year; its cores are already maxed out in games, so it will be a bad choice for future games. It's also a bad choice for real-world gaming. Benchmarks are not real-world gaming. The 7600K is already useless.

And those 1800x gamers are going to be pissed when they upgrade their high end gpu in a few years and see marginal gains because their 1800x's low single core speed has been holding them back this whole time.

Who really cares about single-core performance? It's 2017! Ryzen has more than enough single-core performance. Well, perhaps not if you're using a $700 video card at 640×480 or some such.

Of course, it's not enough for those who can't do anything but look for longer bars on benchmark charts...

I almost had myself convinced a 1600X would be worth it. I had it all in my Newegg wishlist, until I kept seeing it fail at 1080p compared to the competition. Everyone wants to say drivers and patches and optimizations will fix it, but if it's just going to be the same excuse as the previous generation and the generation before that, then I'm not going to bother. I need four fast cores, not eight slow ones. AMD missed the mark.

They'll be playing a lot of Cinebench though!

If you are gaming at 1080p and are NOT GPU limited but CPU limited, then your settings are wrong. Simple. Any Ryzen is fast enough for all games at 1080p. If not, then the game is total crap or the video card settings are wrong.

Last year Intel users played Cinebench a lot. Not anymore, now that Intel is slower. That is called fanboyism.
 
What happens? Almost nothing. Any CPU that is fast enough for a 1080 will be fast enough for an imaginary "2080". But future games will use more cores, and when future games use more cores than the 7700K has to offer, it will be slow.



You are playing crappy games then.

The 7700K won't last until next year; its cores are already maxed out in games, so it will be a bad choice for future games. It's also a bad choice for real-world gaming. Benchmarks are not real-world gaming. The 7600K is already useless.



Who really cares about single-core performance? It's 2017! Ryzen has more than enough single-core performance. Well, perhaps not if you're using a $700 video card at 640×480 or some such.

Of course, it's not enough for those who can't do anything but look for longer bars on benchmark charts...



If you are gaming at 1080p and are NOT GPU limited but CPU limited, then your settings are wrong. Simple. Any Ryzen is fast enough for all games at 1080p. If not, then the game is total crap or the video card settings are wrong.

Last year Intel users played Cinebench a lot. Not anymore, now that Intel is slower. That is called fanboyism.

Ever play a game called GTA V? Ever run the in-game benchmark five-plus times on a 4.5 GHz 2600K with 2133 MHz RAM, and then another five-plus times on a 1600 with 3200 MHz RAM? I know these two systems inside and out now.

When you say any CPU that is fast enough for a 1080 will be fast enough for a 2080: well, the Ryzen CPUs aren't fast enough to stretch the legs of a 1080 right now vs. a 7700K at ANY resolution. WHY? Because whether you understand it or not, single-core performance DOES matter. Multiply the single-core performance by how many physical cores are on a chip, add the ability of Hyper-Threading, and you have any modern-day CPU.

AMD has been throwing more cores at us for years, in their GPU architecture and in their CPUs even back to the FX series. Their formula is to throw more cores at a workload, not fewer, faster ones. There are multiple ways to solve any problem; it's just that Intel's and Nvidia's solutions have been, and still are, more efficient.

How else do you explain a 7700k besting an 1800x in gaming workloads?

And I wouldn't hold your breath on the 7600K being dead and the 7700K not making it to the end of the year... they both top the charts in everything but synthetic multi-core benchmarks. A STOCK 7600K outperforms my brand-new 1600 overclocked to 3.8 GHz, and it only has 4 physical cores vs. my 6/12 1600.

Don't be dumb and hit me with the fanboyism either. Everyone knows GPU architecture grows much more quickly than CPU. So the idea of a 3060 that outperforms a current 1080 Ti is a very real prospect, and one we could see in as little as two years from now.
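The "multiply single-core speed by core count plus Hyper-Threading" idea from the post above can be sketched as a toy model. All numbers here are illustrative assumptions, not measured figures; the ~30% SMT bonus is a common rule of thumb, not a spec.

```python
# Toy model: aggregate throughput from per-core speed, core count,
# and a fractional bonus for SMT/Hyper-Threading. The smt_gain of 0.3
# (~30%) is an assumed rule-of-thumb value, not a measurement.

def multithread_throughput(single_core: float, cores: int, smt_gain: float = 0.3) -> float:
    """Estimate aggregate throughput for a CPU with SMT enabled."""
    return single_core * cores * (1 + smt_gain)

# Hypothetical normalized single-core scores: 1.0 for a 7700K-class chip,
# 0.85 for a first-generation Ryzen core.
intel_4c8t = multithread_throughput(1.0, 4)    # 4 fast cores + HT
ryzen_6c12t = multithread_throughput(0.85, 6)  # 6 slower cores + SMT

# The 6-core part wins on aggregate throughput despite slower cores,
# which is why it can lead in rendering yet trail in games that lean
# on only one or two threads.
assert ryzen_6c12t > intel_4c8t
```

Under these assumed numbers the model captures both sides of the argument: more slower cores win when the workload scales across threads, fewer faster cores win when it does not.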
 
Ever play a game called GTA V? Ever run the in-game benchmark five-plus times on a 4.5 GHz 2600K with 2133 MHz RAM, and then another five-plus times on a 1600 with 3200 MHz RAM? I know these two systems inside and out now.

When you say any CPU that is fast enough for a 1080 will be fast enough for a 2080: well, the Ryzen CPUs aren't fast enough to stretch the legs of a 1080 right now vs. a 7700K at ANY resolution. WHY? Because whether you understand it or not, single-core performance DOES matter. Multiply the single-core performance by how many physical cores are on a chip, add the ability of Hyper-Threading, and you have any modern-day CPU.

GTA V is a years-old DX11 game; nobody cares.

At any resolution? When the GPU becomes the bottleneck, CPU differences are minimal. Simple. Single-core performance matters much less with DX12/Vulkan software than with very old DirectX 11 stuff. Let's face it, more cores for the same price is always better; if not immediately, then after some time.

AMD has been throwing more cores at us for years, in their GPU architecture and in their CPUs even back to the FX series. Their formula is to throw more cores at a workload, not fewer, faster ones. There are multiple ways to solve any problem; it's just that Intel's and Nvidia's solutions have been, and still are, more efficient.

How else do you explain a 7700k besting an 1800x in gaming workloads?

GPUs are all about adding more cores; that applies to Nvidia as much as to AMD. AMD's solution is much more power efficient; Intel might have an advantage in some marginal cases because their manufacturing tech allows higher clock speeds.

How do I explain?

1. Most games are crappy software.
2. Very few games are optimized for Ryzen, or for Intel's new mesh CPUs (Skylake-X and so on).
3. Benchmarking is not gaming.
4. When there is background software, the 7700K chokes, as it's already maxed out in many games.
5. In gaming the GPU should always be maxed out, and in that case CPU differences are minimal. I always laugh at those "GTX 1080 Ti at 1080p" benchmarks.
6. The 1800X is an 8-core CPU, so that's about the same as asking why Intel's $2000 CPU loses to the 7700K in gaming. Those 8+ core CPUs are not made for gaming alone with no background tasks. Simple.

And I wouldnt hold your breath on the 7600k being dead and the 7700k not making to he end of the year...they both top the charts in everything but synthetic multi-core task benchmarks. A STOCK 7600k out performs my brand new 1600 overclocked to 3.8 and it only has 4 physical cores vs my 6/12 1600.

Dont be dumb and hit me with the fanboyism either. Eveyone knows gpu architecture gros much more quickly than cpu. So the idea of a 3060 that outperforms a current 1080ti is a very real prospect and one we could see in as little as two years from now.

You are probably using very lightly threaded software then, with very little running in the background. In that case, I wonder why you bought more than 4 cores anyway. If your workload is lightly threaded, that doesn't make 6/8-core CPUs bad; it just means they are not suitable for your use. I wouldn't trade an 8-core Ryzen for even a 5.5 GHz 7700K; it just doesn't have enough cores or processing power.

In two years, Nvidia will surely have a faster card than the GTX 1080 Ti.
 
When you say any CPU that is fast enough for a 1080 will be fast enough for a 2080: well, the Ryzen CPUs aren't fast enough to stretch the legs of a 1080 right now vs. a 7700K at ANY resolution. WHY? Because whether you understand it or not, single-core performance DOES matter. Multiply the single-core performance by how many physical cores are on a chip, add the ability of Hyper-Threading, and you have any modern-day CPU.

GTA V is a years-old DX11 game; nobody cares.

At any resolution? When the GPU becomes the bottleneck, CPU differences are minimal. Simple. Single-core performance matters much less with DX12/Vulkan software than with very old DirectX 11 stuff. Let's face it, more cores for the same price is always better; if not immediately, then after some time.



GPUs are all about adding more cores; that applies to Nvidia as much as to AMD. AMD's solution is much more power efficient; Intel might have an advantage in some marginal cases because their manufacturing tech allows higher clock speeds.

How do I explain?

1. Most games are crappy software.
2. Very few games are optimized for Ryzen, or for Intel's new mesh CPUs (Skylake-X and so on).
3. Benchmarking is not gaming.
4. When there is background software, the 7700K chokes, as it's already maxed out in many games.
5. In gaming the GPU should always be maxed out, and in that case CPU differences are minimal. I always laugh at those "GTX 1080 Ti at 1080p" benchmarks.
6. The 1800X is an 8-core CPU, so that's about the same as asking why Intel's $2000 CPU loses to the 7700K in gaming. Those 8+ core CPUs are not made for gaming alone with no background tasks. Simple.



You are probably using very lightly threaded software then, with very little running in the background. In that case, I wonder why you bought more than 4 cores anyway. If your workload is lightly threaded, that doesn't make 6/8-core CPUs bad; it just means they are not suitable for your use. I wouldn't trade an 8-core Ryzen for even a 5.5 GHz 7700K; it just doesn't have enough cores or processing power.

In two years, Nvidia will surely have a faster card than the GTX 1080 Ti.

More cores for the same price is not always better; ask the FX-8350 and i3-4130 guys which of them had higher FPS over the last few years.

The point I'm making is that these current-gen Ryzens are OK for now, but in a few years more affordable GPU power will make them look like poor investments, unless AMD can find a way to get manageable higher clocks and increased IPC out of Ryzen 2 and 3. If I could get this 1600 to 4.3-4.5 GHz I'd be happy as hell.

The reason they do 1080p tests with a 1080 Ti is to show CPU discrepancies.
 
More cores for the same price is not always better; ask the FX-8350 and i3-4130 guys which of them had higher FPS over the last few years.

The point I'm making is that these current-gen Ryzens are OK for now, but in a few years more affordable GPU power will make them look like poor investments, unless AMD can find a way to get manageable higher clocks and increased IPC out of Ryzen 2 and 3. If I could get this 1600 to 4.3-4.5 GHz I'd be happy as hell.

The reason they do 1080p tests with a 1080 Ti is to show CPU discrepancies.

The latest game tests show the FX-8150 is indeed faster than the i5-2500K in current games. It took longer than expected, but it happened.

DX12/Vulkan games will use cores better, so that problem gets solved that way. Unless GlobalFoundries screws up their 7nm process, Ryzen 2 will clock very high, as the architecture allows very high clock speeds. The current 14nm LPP process's "sweet spot" is around 2.5 GHz, and still AMD managed to get 3.6 GHz on 8 cores with a 4 GHz turbo. Past around 4 GHz the process limits things heavily, but on the other hand 14nm LPP is very good for servers. Let's hope 7nm is better.

That 1080p testing still makes no sense, as games are almost always GPU limited. In real life, most gamers adjust settings as high as the GPU allows; if the GPU limits in that case, then CPU differences are minimal, and that should just be accepted. A CPU limit in today's games rarely tells you anything about CPU limits in future games. Also, nobody pairs a dual-core CPU with a $700 video card; that's just an unrealistic scenario. For those reasons, low-resolution tests make no sense at all and can be skipped.

A similar scenario would be game loading times, HDD vs. SSD. In many games there is virtually no difference, and that should just be accepted. But because "an HDD cannot be as fast as an SSD", putting some dummy load in the background makes the SSD look much faster. Sure, but if the question was game loading times, HDD vs. SSD, that extra dummy load makes the test crappy.
 
The latest game tests show the FX-8150 is indeed faster than the i5-2500K in current games. It took longer than expected, but it happened.

DX12/Vulkan games will use cores better, so that problem gets solved that way. Unless GlobalFoundries screws up their 7nm process, Ryzen 2 will clock very high, as the architecture allows very high clock speeds. The current 14nm LPP process's "sweet spot" is around 2.5 GHz, and still AMD managed to get 3.6 GHz on 8 cores with a 4 GHz turbo. Past around 4 GHz the process limits things heavily, but on the other hand 14nm LPP is very good for servers. Let's hope 7nm is better.

That 1080p testing still makes no sense, as games are almost always GPU limited. In real life, most gamers adjust settings as high as the GPU allows; if the GPU limits in that case, then CPU differences are minimal, and that should just be accepted. A CPU limit in today's games rarely tells you anything about CPU limits in future games. Also, nobody pairs a dual-core CPU with a $700 video card; that's just an unrealistic scenario. For those reasons, low-resolution tests make no sense at all and can be skipped.

A similar scenario would be game loading times, HDD vs. SSD. In many games there is virtually no difference, and that should just be accepted. But because "an HDD cannot be as fast as an SSD", putting some dummy load in the background makes the SSD look much faster. Sure, but if the question was game loading times, HDD vs. SSD, that extra dummy load makes the test crappy.

The reason they do the tests at 1080p is to remove the GPU bottleneck... If they did 4K tests with a 1050 Ti, all the CPUs would perform exactly the same.

What I'm saying, and what I said from the start, is that with this data we can see the Ryzen parts cannot fully utilize a 1080 or 1080 Ti in some heavily CPU-intensive games. So in the future, once these mid-level Ryzen owners (like myself) upgrade their current 970/960/1060/1070 or 480/580/470/390X to a next-gen 3060 or RX 780, which will be extremely affordable, their perceived gains will not match reality, unless of course they increase their resolution at the same time. I game on a 46" 1080p "monitor" and a 120" 1080p projector. A true 4K projector is not something I care to drop $20k on at the moment, plus a supporting card. But in the future that GPU power will be more affordable, and my CPU will be holding back my max FPS.
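The bottleneck argument both sides keep circling can be stated very compactly: delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain. A minimal sketch, with all FPS figures illustrative rather than taken from any review:

```python
# Minimal sketch of why reviewers test CPUs at 1080p with a top-end GPU:
# delivered FPS is capped by whichever component is slower.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is limited by the slower of the two components."""
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 160.0, 120.0   # hypothetical CPU-side frame-rate ceilings

# At 4K the GPU can only manage ~60 FPS, so both CPUs look identical:
assert delivered_fps(cpu_a, 60.0) == delivered_fps(cpu_b, 60.0) == 60.0

# At 1080p the GPU ceiling rises to ~200 FPS and the CPU gap appears:
assert delivered_fps(cpu_a, 200.0) == 160.0
assert delivered_fps(cpu_b, 200.0) == 120.0
```

Low-resolution testing raises the GPU ceiling so the CPU term becomes the minimum; the dispute in this thread is only over whether that hidden headroom will ever matter to a real user.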
 
The reason they do the tests at 1080p is to remove the GPU bottleneck... If they did 4K tests with a 1050 Ti, all the CPUs would perform exactly the same.

So what's the problem? If all CPUs perform the same with a 1050 Ti at 4K, then all CPUs are equally fast. And that is a problem because?

What I'm saying, and what I said from the start, is that with this data we can see the Ryzen parts cannot fully utilize a 1080 or 1080 Ti in some heavily CPU-intensive games. So in the future, once these mid-level Ryzen owners (like myself) upgrade their current 970/960/1060/1070 or 480/580/470/390X to a next-gen 3060 or RX 780, which will be extremely affordable, their perceived gains will not match reality, unless of course they increase their resolution at the same time. I game on a 46" 1080p "monitor" and a 120" 1080p projector. A true 4K projector is not something I care to drop $20k on at the moment, plus a supporting card. But in the future that GPU power will be more affordable, and my CPU will be holding back my max FPS.

I doubt that Ryzen cannot fully utilize a 1080 in games. Just using high enough graphics settings would mean Ryzen is as fast as the Intel CPUs, so eventually almost all games are GPU bottlenecked (except some with ultra-crappy engines, like StarCraft 2 or World of Tanks).

So, in the future...

Yeah, I heard exactly the same thing during the FX 8-core vs. i5 4-core debate. Back then the FX was slower than the i5; now the FX is faster. Again, future games will use cores better than today's games, so you would want more lower-clocked cores rather than fewer higher-clocked ones for the future. What CPU is good TODAY is absolutely no indication of which CPU will be good in a couple of years. Also, DX12/Vulkan adoption means more and more games will utilize many more cores, and the DX9/DX10/DX11 one-thread limitation will be mostly gone.

Let's put it this way: every time there has been a discussion about "should I get fewer higher-clocked cores or more lower-clocked cores", the latter option has proved better for the future, although "reviews" said otherwise. I don't believe this time is different.
 
So what's the problem? If all CPUs perform the same with a 1050 Ti at 4K, then all CPUs are equally fast. And that is a problem because?



I doubt that Ryzen cannot fully utilize a 1080 in games. Just using high enough graphics settings would mean Ryzen is as fast as the Intel CPUs, so eventually almost all games are GPU bottlenecked (except some with ultra-crappy engines, like StarCraft 2 or World of Tanks).

So, in the future...

Yeah, I heard exactly the same thing during the FX 8-core vs. i5 4-core debate. Back then the FX was slower than the i5; now the FX is faster. Again, future games will use cores better than today's games, so you would want more lower-clocked cores rather than fewer higher-clocked ones for the future. What CPU is good TODAY is absolutely no indication of which CPU will be good in a couple of years. Also, DX12/Vulkan adoption means more and more games will utilize many more cores, and the DX9/DX10/DX11 one-thread limitation will be mostly gone.

Let's put it this way: every time there has been a discussion about "should I get fewer higher-clocked cores or more lower-clocked cores", the latter option has proved better for the future, although "reviews" said otherwise. I don't believe this time is different.

The problem is that when you're not using a 1050 Ti, you're leaving performance on the table. If you want to lock your FPS at 1440p/144 FPS, you need a high-horsepower GPU and CPU. Intel is the fastest for gaming, period. Now, if you're streaming and creating content at the same time, that is where these Ryzens are taking hold.
 
The problem is that when you're not using a 1050 Ti, you're leaving performance on the table. If you want to lock your FPS at 1440p/144 FPS, you need a high-horsepower GPU and CPU. Intel is the fastest for gaming, period. Now, if you're streaming and creating content at the same time, that is where these Ryzens are taking hold.

Quite a few games were tested at 1440p here: https://www.techpowerup.com/reviews/MSI/GTX_1080_Ti_Gaming_X/6.html

Only one of them averaged 144 FPS with a 1080 Ti.

Again, quite a few games were tested at 1440p, Ryzen vs. 7700K: https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/10.html

As you can see, the differences are very small. The only game that shows even some difference is Fallout 4, and remembering that Fallout 4's engine dates back to 1997, nobody really cares.

Remembering that those results are from March, today's results are even better. So I really wouldn't be concerned about Ryzen's CPU power over the next couple of years.
 
Quite a few games were tested at 1440p here: https://www.techpowerup.com/reviews/MSI/GTX_1080_Ti_Gaming_X/6.html

Only one of them averaged 144 FPS with a 1080 Ti.

Again, quite a few games were tested at 1440p, Ryzen vs. 7700K: https://www.techpowerup.com/reviews/AMD/Ryzen_7_1800X/10.html

As you can see, the differences are very small. The only game that shows even some difference is Fallout 4, and remembering that Fallout 4's engine dates back to 1997, nobody really cares.

Remembering that those results are from March, today's results are even better. So I really wouldn't be concerned about Ryzen's CPU power over the next couple of years.

Your tables show a 12% difference at 1080p and a 5% difference at 1440p. You would see more of a difference with a 1080 Ti versus these tests with a 1080. ALSO, the 1800X you referenced was boosted all the way to 4.0 GHz, while the 7700K runs at a stock 4.2 GHz when using 4+ cores. Overclock the Intel as far as it will go (like the 1800X), plus swap in a 1080 Ti, and the spread increases further.
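The percentage gaps being argued over are plain relative-difference arithmetic. A quick sketch, with FPS figures that are illustrative only and not taken from the TechPowerUp tables:

```python
# Relative-difference arithmetic behind "12% at 1080p, 5% at 1440p" claims.
# The FPS inputs below are made-up examples, not review data.

def percent_lead(fast_fps: float, slow_fps: float) -> float:
    """How much faster the first result is, as a percentage of the second."""
    return (fast_fps - slow_fps) / slow_fps * 100

# A 12%-class gap at 1080p vs. a 5%-class gap at 1440p might look like:
print(round(percent_lead(134.4, 120.0), 1))  # 12.0
print(round(percent_lead(105.0, 100.0), 1))  # 5.0
```

Note the gap shrinks at the higher resolution because the GPU soaks up more of the frame time, which is exactly why both posters keep citing different resolutions.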
 
Your tables show a 12% difference at 1080p and a 5% difference at 1440p. You would see more of a difference with a 1080 Ti versus these tests with a 1080. ALSO, the 1800X you referenced was boosted all the way to 4.0 GHz, while the 7700K runs at a stock 4.2 GHz when using 4+ cores. Overclock the Intel as far as it will go (like the 1800X), plus swap in a 1080 Ti, and the spread increases further.

As expected, the difference is larger at low resolutions. There was also an 1800X result without overclocking. Ryzen also gets a boost when used with 3200 MHz memory. Even if there is some difference, I'd gladly swap a 10% difference for double the number of cores. That difference will also turn around in the future, when the 1800X will be faster.
 