AMD Ryzen 5000 launch: "Fastest gaming CPU," higher clocks, higher prices

There is no need to list CPUs that support SSE. First, it can be easily checked, and since the compiler creates an additional non-SSE codepath when SSE is used, there is no harm in creating an SSE codepath for a CPU that doesn't even support it.
There are several fundamental errors here. First of all, if one attempts to execute SSE code on a chip that doesn't support it, there is indeed a great deal of harm done -- a hard stop from an illegal instruction error, in fact. That's the entire reason compilers must emit code to check for these extended instruction sets. That "non-SSE codepath" you mention only gets executed if this check fails.

Now, how does one "easily check" for the presence of SSE? You execute a cpuid opcode and check for a particular bit in the EDX register it returns. But here's the problem. That bit was defined by Intel, and it was defined only when Intel created the new instructions. Since the bit was meaningless before this time, some preexisting non-Intel CPUs were inconsistent in their handling of it. Which is why, when the original SSE(1) instructions were introduced, some programs (compiled on certain non-Intel compilers and run on certain non-Intel CPUs) would fault. There was a somewhat similar early compiler issue between MMX and AMD's 3DNow instruction set, which I won't get into, but the critical point is that the only way to be 100% sure was to validate the feature bit against the ManufacturerID string and, if necessary, against the highest calling parameter as well -- essentially the "class" of the CPU within its manufacturer.

With me so far? Now, compilers that are interested in the highest possible performance on all possible CPUs will validate all possible combinations known to support the instruction set in question. An Intel compiler written by Intel-paid developers isn't (or wasn't then, at least) going to go to such great lengths to ensure it would emit proper code. If the SSE (and later, the SSE2) bit was set and the CPU was "GenuineIntel", take the SSE path. Otherwise, play it safe. You'll rightly point out that it's very little work to optimize for AMD as well. But when this code was first written, this wasn't possible, nor even beneficial (AMD, after all, did not support the instructions then). Could Intel have spent a trivial amount of time to update their compiler later? Sure. But again, why would they? Why spend money to help a competitor? Why should they be forced to? In closing, I'll point out that Intel's compiler was hardly the only one around, even then. There were plenty of compilers that quickly adapted to AMD's processors and optimized accordingly.
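For anyone curious what that check actually looks like in practice, here's a minimal sketch using GCC/Clang's <cpuid.h> (leaf numbers and bit positions are from the architecture manuals; the vendor-string gate at the end is just my illustration of the dispatch pattern being argued about, not anyone's actual compiler output):

Code:
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];

    /* Leaf 0: highest basic leaf, plus the 12-byte vendor string in EBX:EDX:ECX. */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';

    /* Leaf 1: feature flags. SSE is EDX bit 25, SSE2 is EDX bit 26. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;
    int has_sse  = (edx >> 25) & 1;
    int has_sse2 = (edx >> 26) & 1;

    printf("vendor=%s  SSE=%d  SSE2=%d\n", vendor, has_sse, has_sse2);

    /* The dispatch pattern under discussion: gating the fast path on the
       vendor string as well as on the feature bit. */
    if (has_sse2 && strcmp(vendor, "GenuineIntel") == 0)
        puts("would take the SSE2 path");
    else
        puts("would take the generic path");
    return 0;
}

Dropping that strcmp and trusting the feature bit alone is exactly the change the later posts argue should have been made.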
 
If that's acceptable practice to you, which you seem to be indicating, it says quite a lot about you as a person.
I believe that alleging character defects in anyone who disagrees with you says more about you than it does about me.

Do read all of the source text I linked to understand the depth of Intel's long history of fudging benchmarks... If the Principled Technologies scandal had been a one-off, then maybe I'd be inclined to see it as an "accident". Lol, but that's not reality. As I said, you seem as gullible as Intel expects you to be.
I'm fairly confident I have more experience with Intel's actions here than you, dating all the way back to the mid-80s and the benchmarking controversy between their supercomputer and Thinking Machines. And you seem perpetually willing to intentionally misinterpret me. I've said a dozen times that Intel's commissioning of a benchmark that favored them was no "accident". Why do you keep pretending I think it was? Benchmarks are always biased. Always. There are untold quintillions of possible ways to select, configure, and run a benchmark, and each and every one impacts the end result. A neutral reviewer will try to minimize it, but it cannot be eliminated. ... which is why every benchmark ever run generates so many heated remarks in the comments section. Instead of trying to criminalize the process, just settle for a little understanding instead.
 
I said it in a comment on another post about Endymio

"there is no point discussing with you"

There is joy in being right, and it's a joy being proved wrong or enlightened by someone else's erudite commentary.

Just seeing 2 of his posts, I could discern he had extremely rigid views - if you enjoy talking with flat-earthers and climate deniers, go for it - they are also rigid, take nothing on board, and recite the same arguments ad infinitum.

I know he likes the leader of the free world - I remember walking in on some friends watching The Apprentice - it was obvious within 1 minute of listening to him that he was a fraud and a conman - he had all the classic tells - there are conmen who hide those tells - but I think it's like those Nigerian scammers - make it so obvious that only the gullible buy in.
 
Yes, there are plenty of AMD fanboys that think that everything the company makes is made of gold.... but this time it looks like they may be right!
+1 to THIS. I may only use AMD CPUs and Radeon cards but it's not because I'm a fanboy of anyone, it's because I hate Intel and nVidia. Not their products, the companies themselves. Fanboys of both sides make me want to give them a good smack because Intel fanboys are ignorant (because they've never owned AMD) and AMD fanboys have brainwashed themselves.

I'll tell ya, I HATED Polaris and Vega. I flatly REFUSED to buy that crap. I stuck with my R9 Fury until I bought my 5700 XT back in August (it was like $90CAD cheaper than the general going rate so I couldn't say no). When bulldozer first came out, I was waiting for it and had an AM3 motherboard all ready to go. The results were so bad that I bought a Phenom II X4 965 instead. Then a bit later, when the FX-8350 came out, I bought that. I was actually considering the i5-2500K but it was like $400 for the CPU alone and I already had a 990FX motherboard. So I waited and Tiger Direct dropped the price of the FX-8350 to $170CAD and I bought it. It did me fine for five years but I never claimed that it was better than what Intel had. At the same time, the only people who talk about "suffering" with the FX-8350 are people who never actually had one.

It's ok to like one brand over another for whatever reason as long as a grip on reality is maintained. (y) (Y)
 
I think there have been enough personal comments and remarks. Please stick to the issues and leave personal attacks out of it. Thank you.
 
So Nvidia increases prices by $50-100, you lose it.

AMD does something similar: it's reasonable, trust us!

Just lol

AMD is moving toward pricing parity; they need to make money to remain competitive. Intel isn't lying around doing nothing, and AMD needs the money for R&D. They can't afford to make pennies on their CPUs and expect to remain competitive -- that's what happened last time. AMD undercut the Pentium 4 and Pentium D when they didn't need to, and then didn't have the money to have Phenom ready, or where it needed to be, to compete with Core 2. AMD has learned from that mistake: they realized there was no reason the X2 3800 needed to be priced with the Pentium D 820, or the Athlon 64 3200 under the price of a Pentium 4 3.0, and they don't appear to want to repeat that mistake. I applaud them. Do I like price increases? No, but I understand that to fund development you need them.

+1 to THIS. I may only use AMD CPUs and Radeon cards but it's not because I'm a fanboy of anyone, it's because I hate Intel and nVidia. Not their products, the companies themselves. Fanboys of both sides make me want to give them a good smack because Intel fanboys are ignorant (because they've never owned AMD) and AMD fanboys have brainwashed themselves.

I'll tell ya, I HATED Polaris and Vega. I flatly REFUSED to buy that crap. I stuck with my R9 Fury until I bought my 5700 XT back in August (it was like $90CAD cheaper than the general going rate so I couldn't say no). When bulldozer first came out, I was waiting for it and had an AM3 motherboard all ready to go. The results were so bad that I bought a Phenom II X4 965 instead. Then a bit later, when the FX-8350 came out, I bought that. I was actually considering the i5-2500K but it was like $400 for the CPU alone and I already had a 990FX motherboard. So I waited and Tiger Direct dropped the price of the FX-8350 to $170CAD and I bought it. It did me fine for five years but I never claimed that it was better than what Intel had. At the same time, the only people who talk about "suffering" with the FX-8350 are people who never actually had one.

It's ok to like one brand over another for whatever reason as long as a grip on reality is maintained. (y) (Y)

I'll agree I didn't have the 8350, but I did have a 6300 for several years. Was it blisteringly fast? No, it wasn't, but at release it was a better CPU than a Pentium dual-core in everything and an i3 in everything but gaming. Sure, by 2015 the Pentium G was faster in everything, but by then the 6300 was already 3 years old and I didn't feel so bad, plus they were regularly sold new for about $80, which was more Celeron territory by then. I don't know of anyone who actually suffered from the FX, unless you bought an FX-4100 or 4300 -- but that was a 100 dollar chip in 2012, you knew what you were buying and I don't feel bad for you.
 
... Instead of trying to criminalize the process, just settle for a little understanding instead.

Paying someone to take a very, almost exaggerated, non-neutral stance in benchmarks is, in reality, what criminalizes the process. Not the people that recognize it as such.

There's a massive difference between picking what could be called "neutral" unpaid review benchmarks to cast the best light on your product, and paying someone to make it not neutral. These are not the same, and any suggestion that they are, or any attempt to obfuscate that distinction, is disingenuous at the very best.
 
...

I'll agree I didn't have the 8350, but I did have a 6300 for several years. Was it blisteringly fast? No, it wasn't, but at release it was a better CPU than a Pentium dual-core in everything and an i3 in everything but gaming. Sure, by 2015 the Pentium G was faster in everything, but by then the 6300 was already 3 years old and I didn't feel so bad, plus they were regularly sold new for about $80, which was more Celeron territory by then. I don't know of anyone who actually suffered from the FX, unless you bought an FX-4100 or 4300 -- but that was a 100 dollar chip in 2012, you knew what you were buying and I don't feel bad for you.
I also have a nostalgia-driven preference for AMD, but I avoided the Bulldozer series like the plague. I had an old Phenom II 965 at home and all my office CPUs were Intel all throughout that period.

On video cards, I wouldn't have an issue with the 5700 series and contemplated ponying up when they launched, but a friend of mine gave me his Sapphire 580, a retired mining card, to replace some old card I had.

I didn't have much hope for it (being a retired mining card and all) but I think this one is a gem: it clocks well over 1500 MHz stable with 2500 MHz RAM and it stays well under 80°C, without sucking a ton of power (faster than a 590 by a reasonable margin). But again ... I think I just got lucky. Time to upgrade though ... waiting for the RTX 3060 and RDNA 2 to see what is on offer there.
 
Well that's that. For the first time in history, one CPU maker has the other one beat in EVERY SINGLE METRIC. When AMD was at its worst, it still had better performance/price than Intel despite losing in everything else. Now AMD has Intel beat in single and multi-thread performance, power efficiency, gaming and price over all four platforms (mobile, desktop, HEDT and server) and at the same time, AMD STILL has the better performance/price ratio because despite being better in every way, AMD CPUs are STILL cheaper to buy (even with the $50 increase) and you don't have to pay extra for a decent cooler!

I never thought that I'd see this day come (and I've been using a home PC regularly since 1986) but I sure am glad that it has. The little silicon manufacturer that was almost bankrupt five years ago has stormed back and pummeled Intel into the dirt. No one can describe AMD as "Not as good as Intel but cheaper" any more because now that gaming's gone, there is literally nothing for them to grasp at but straws. Now people will have to describe AMD as "Better than Intel but cheaper" which is exactly how you win a market.

Intel won't die and Intel won't stop so this competition will go on for a long time which will be good for ALL of us! :heart_eyes:

On the ATi side, that little 3-game teaser shows that RDNA 2 is NOT going to be a letdown by any means. Steve Walton measured an average FPS of 72 in Gears 5 with the RTX 3080 at 4K Ultra and as far as I'm concerned, if Steve says it, take it to the bank!
4K_Gears.png

AMD showed that ATi's new card (and I found it odd that they didn't say which one) beats the RTX 3080 in Gears 5 by 1fps! Now, I'm not delusional. I know that this is not what anyone would call an actual win but it does show where ATi was aiming when they came out with RDNA 2 and they DEFINITELY hit the target! Normally, I wouldn't take much stock in a company's own benchmark but this is a simple display of the average frame rate in a specific game at a specific resolution using a specific graphics preset. You can't fake that.
EDIT: It has now been revealed that ATi was using the built-in benchmark like (apparently) everyone else so that only adds to its validity.
2020-10-08-image-24-j.webp


I tell ya, 2020 has been a terrible year because of SARS-CoV-2 but we can still consider ourselves lucky because never before in our beloved industry has competition ever been so healthy. When GPU competition was healthy, CPU competition wasn't. When CPU competition was healthy, GPU competition wasn't. Now, FINALLY, we have both at the same time! When I think of how long I've waited for this to happen (decades), I get all giddy! :heart_eyes:

I would say that 2020 is the greatest year that the PC industry has ever seen despite all of the current hardships, and we are all lucky to be living in it.
You really miss ATi, huh?
 
Thinking about Ryzen 5000 pricing a bit more, the 6C (Ryzen 5600X) price increase is the one that stands out as a bit irritating. The 5900X and 5950X price increases are imho a non-issue considering the tier and % increase. The 5800X can be argued, but again that depends on performance over the alternatives, and in that regard it should still be good value.

For me the important point would be whether AMD continues to offer Ryzen 2 CPUs or not. If they remain available at least in the 4C to 8C range, customers can choose whether they want to go for value with a 3600 / 3700X, or if they want the best and go for Ryzen 5000. That means there would be a choice depending on personal preference.
 
An i7-10700 has 8 cores but is priced similarly to the 6-core 5600X.
An i9-10850K has 10 cores but is priced similarly to the 8-core 5800X.

Unless you need PCIe 4.0 or more than 10 cores, Intel seems to be a better value proposition. More cores and an iGPU as well for around the same money.
 
An i7-10700 has 8 cores and costs about the same as the 6-core 5600X.
An i9-10850K has 10 cores and costs about the same as the 8-core 5800X.

Unless you need PCIe 4.0 or more than 10 cores, Intel seems to be a better value proposition. More cores and an iGPU as well for around the same money.

Ryzen 9 5900X (12C) MSRP is $ 549
Core i9-10900K (10C) is listed @ Intel as available from $549; Best Buy has it for $530.

Ryzen 7 5800X (8C) MSRP is $ 449
Core i7-10700K (8C) is listed @ Intel as available from $399, Best Buy has it for $379.

For 12C vs. 10C there is price parity; 8C vs. 8C, Intel is indeed cheaper.

Of course, if you compare budget to top of the line like you did, it looks a bit different but then we may as well include Ryzen 2 in the comparison.

Using Best Buy prices where possible (to stay consistent), we have the following situation:
Ryzen 9 5900X - $549
Core i9-10900K - $530
Core i9-10850K - $486
Ryzen 9 3900X - $459 (includes HSF)

Ryzen 7 5800X - $449
Core i7-10700K - $379
Core i7-10700 - $319
Ryzen 7 3700X - $309 (includes HSF)

Ryzen 5 5600X - $299 (includes HSF)
Core i5-10600K - $279
Ryzen 5 3600 - $199 (includes HSF)
Core i5-10400 - $189

What we do not know yet are the actual prices in November - Ryzen 3 may be more expensive than MSRP, could be the same, or cheaper. Also, Ryzen 2 and the Intel CPUs could see a price reduction.

The final piece of the puzzle is the actual performance in different tasks, perf / watt and in the end perf /$ plus actual platform features.

I personally would not underestimate Ryzen's IO advantage (even ignoring PCIe 3 vs. 4) going forward. Just look at the IO via the CPU vs. what needs to go via the chipset.
 
There are several fundamental errors here. First of all, if one attempts to execute SSE code on a chip that doesn't support it, there is indeed a great deal of harm done -- a hard stop from an illegal instruction error, in fact. That's the entire reason compilers must emit code to check for these extended instruction sets. That "non-SSE codepath" you mention only gets executed if this check fails.

When an SSE codepath is created, a non-SSE codepath is also created. So providing an SSE codepath does no harm even when the CPU does not support it.

Now, how does one "easily check" for the presence of SSE? You execute a cpuid opcode and check for a particular bit in the EDX register it returns. But here's the problem. That bit was defined by Intel, and it was defined only when Intel created the new instructions. Since the bit was meaningless before this time, some preexisting non-Intel CPUs were inconsistent in their handling of it. Which is why, when the original SSE(1) instructions were introduced, some programs (compiled on certain non-Intel compilers and run on certain non-Intel CPUs) would fault. There was a somewhat similar early compiler issue between MMX and AMD's 3DNow instruction set, which I won't get into, but the critical point is that the only way to be 100% sure was to validate the feature bit against the ManufacturerID string and, if necessary, against the highest calling parameter as well -- essentially the "class" of the CPU within its manufacturer.

With me so far? Now, compilers that are interested in the highest possible performance on all possible CPUs will validate all possible combinations known to support the instruction set in question. An Intel compiler written by Intel-paid developers isn't (or wasn't then, at least) going to go to such great lengths to ensure it would emit proper code. If the SSE (and later, the SSE2) bit was set and the CPU was "GenuineIntel", take the SSE path. Otherwise, play it safe. You'll rightly point out that it's very little work to optimize for AMD as well. But when this code was first written, this wasn't possible, nor even beneficial (AMD, after all, did not support the instructions then). Could Intel have spent a trivial amount of time to update their compiler later? Sure. But again, why would they? Why spend money to help a competitor? Why should they be forced to? In closing, I'll point out that Intel's compiler was hardly the only one around, even then. There were plenty of compilers that quickly adapted to AMD's processors and optimized accordingly.

It's still easy to determine what instructions a CPU can execute. So limiting SSE code to Intel only was limiting performance on CPUs other than Intel's -- not optimizing for Intel, but de-optimizing for everyone else. That is why Intel is not allowed to do so any more. In fact, Intel put MORE work into gimping other CPUs than it would have taken them to make the compiler work as it should have. As said, Intel can no longer legally do that kind of thing.

Intel did not make their compiler work ONLY on Intel CPUs, so it was certain that software compiled with the Intel compiler would be used on CPUs other than Intel's too. That's why this "there were other compilers too" excuse does not apply here.

There is not even a need to "optimize" for other CPUs. Just let them use the SSE codepaths. This is a good example: just changing the CPUID vendor string to "GenuineIntel" gives much more performance: https://arstechnica.com/gadgets/2008/07/atom-nano-review/6/

It's always easy to come up with excuses, but those practices were determined to be illegal, so they really were just excuses.
 
There's a massive difference between picking what could be called "neutral" unpaid review benchmarks to cast the best light on your product and paying someone to make it not neutral.
You don't think the benchmarks that AMD provided for this very article on the 5000 series were chosen to cast them in the best light? And that some AMD employee was paid to do just that? The link you provided to the i9 benchmarks contains a very apt statement: "The bottom line is this: always wait for the independent reviews. This was a study clearly commissioned to present Intel's competitive advantage and to make its new product look good...."

Which is almost exactly what TechSpot here says about these most recent AMD benchmarks: "In the end, we'll have to wait for independent benchmarks to judge the value proposition that AMD is showing. Like all company-produced benchmarks, we suspect there has been some level of cherry-picking..."

Don't have unreasonable expectations that a company will be unbiased about its own products. If you find yourself getting too emotionally invested in the process: take a deep breath, and have a "Zen" moment. Pun intended.

When an SSE codepath is created, a non-SSE codepath is also created. So providing an SSE codepath does no harm even when the CPU does not support it.
Perhaps there's a language barrier here; the situation really isn't that complicated. Yes, the compiler emits two code paths. Merely providing an SSE path is harmless, yes. But executing that path on a non-SSE processor results in a fault. If this were not true, there would be no need to test the cpuid in the first place. The test must take place, and it must be accurate, else the resultant code will fail on the wrong processor.
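To make the two-codepath point concrete, here's a rough sketch of the pattern (the function names and the hardcoded flag in main are mine, purely for illustration): both versions sit in the binary, but only the one picked by the runtime check ever executes, which is why that check has to be correct.

Code:
#include <stdio.h>

/* Plain C fallback: runs on anything. */
static void sum_scalar(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

#ifdef __SSE__
#include <xmmintrin.h>
/* SSE version: executing this on a pre-SSE CPU raises an illegal-instruction fault. */
static void sum_sse(const float *a, const float *b, float *out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4)
        _mm_storeu_ps(out + i,
                      _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    for (; i < n; ++i)
        out[i] = a[i] + b[i];
}
#endif

typedef void (*sum_fn)(const float *, const float *, float *, int);

/* The runtime selection; cpu_has_sse would come from the CPUID query shown earlier. */
static sum_fn pick_sum(int cpu_has_sse) {
#ifdef __SSE__
    if (cpu_has_sse)
        return sum_sse;
#endif
    (void)cpu_has_sse;
    return sum_scalar;
}

int main(void) {
    float a[4] = {1, 2, 3, 4}, b[4] = {4, 3, 2, 1}, out[4];
    pick_sum(1)(a, b, out, 4);   /* '1' hardcoded just for the demo */
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}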
 
Perhaps there's a language barrier here; the situation really isn't that complicated. Yes, the compiler emits two code paths. Merely providing an SSE path is harmless, yes. But executing that path on a non-SSE processor results in a fault. If this were not true, there would be no need to test the cpuid in the first place. The test must take place, and it must be accurate, else the resultant code will fail on the wrong processor.

If only there were a more reliable, forward-looking method to determine whether a CPU supports certain features, like a feature flag, instead of querying the make and model...
 
If only there were a more reliable, forward-looking method to determine whether a CPU supports certain features, like a feature flag, instead of querying the make and model...
I recognize the sarcasm, but read my original post. Feature flags exist, but they are not perfectly reliable when the processor was designed and produced before the flag existed. These flags were not predefined by any open-standards committee -- it was Intel who initially created them, leaving most undefined for future expansion. AMD in general followed Intel's lead, but in some areas diverged, such as 3DNow. To test for the presence of 3DNow, you use the AMD-defined feature flags, not Intel's.
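For reference, that AMD-defined check lives in the extended CPUID leaves rather than Intel's basic ones. A rough sketch (leaf and bit numbers per AMD's documentation; not production detection code):

Code:
#include <cpuid.h>
#include <stdio.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    /* AMD-defined extended leaf 0x80000001: 3DNow! is EDX bit 31,
       extended 3DNow! is EDX bit 30. Fails on CPUs without extended leaves. */
    if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
        puts("extended CPUID leaves not supported");
        return 0;
    }
    printf("3DNow!: %u  Extended 3DNow!: %u\n",
           (edx >> 31) & 1, (edx >> 30) & 1);
    return 0;
}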

I don't know of any early AMD processors that were inconsistent in their handling of the bits that later became designated SSE flags, but some of the VIAs were. There are hundreds of other special cases for various processors and processor features as well, which is why compilers go through such contortions to emit optimized code that runs on all possible CPUs.

To reiterate: it would have been a trivial amount of extra work for Intel's compiler to support SSE(2) on AMD. But not supporting it was not "extra work done to deoptimize for AMD".
 
Lol - that just proves my point! Intel advanced so slowly, holding the entire industry back, that you didn't need to upgrade! They did little, then less, and finally almost nothing.

I call getting a lousy dual-core in my $1100 XPS 13 laptops being milked. If Intel hadn't used every means, fair and often foul, to monopolize the market and hamstring AMD, I could have twice the performance! Multiply that experience by about a billion, and Intel has a lot to answer for.

Intel will be back, which is good, but in the meantime we desperately need AMD to keep hitting home runs and build up enough mind-share and money to survive that return. Otherwise we'll be back to ten-year stretches of mediocrity. Which may be fine with you, but not with me.
I had a Pentium 3 tower CPU which I bought for $406. I gave up on it after struggling with it a lot for 4 years, and later dismantled it and gave it away in parts, i.e. HDD, RAM, etc., because it was obviously useless and there was no room to upgrade to a different processor on the same motherboard. Long story short, I later turned to a laptop, a $270 HP Pavilion dv6 with a 2nd-gen i5 and both ATI and Intel graphics. Honestly, it made me feel like I had gotten a good deal because it performed better than my tower. AMD was unheard of. All products in the market here (Uganda, Africa) are old Intel products; it's like a dump for discarded Intel parts, exorbitantly priced. Recently, in need of a better laptop for programming studies, when I saw the ASUS ROG Zephyrus G14 with the Ryzen 9 4900HS, I was impressed. It all led me, inexorably, here.
So pissed at Intel for not being equally innovative, and for actively preventing more passionate companies from penetrating the market. I'm never going to have Intel as my first choice ever again (unless otherwise), whether in terms of mobile gadgets or tower PCs. I have felt the stagnation of the computer world much more potently here, albeit in oblivion. I'm happy that AMD has rekindled my enthusiasm for computers. ☺
 
Finding it funny that not even AMD could get hold of an RTX 3090 graphics card for this testing.
 
After reading this, I am of the opinion that if you have a Zen 2 processor, there would be very little gain in upgrading to Zen 3 unless you are moving up in levels (i.e. from a 3900X to a 5950X). Save the money.
 
And they could even launch a 64 MB, dual-CCD eight-core CPU down the road. Not sure how that would turn out performance-wise for gaming, but for non-gaming applications that could give a nice boost and make it easier to cool at the same time.
In previous generations, they never did anything like that in the consumer space, but they did have EPYC parts like that, where there were different amounts of cache for the same number of cores through distributing the cores among CCXes differently. For example, the EPYC 7232P, 7252, and 7262, with 36, 68, and 132 MB of cache respectively, although they all had 8 cores.
 
AMD price gouging hard. The 6 core model has literally doubled in price!

Of course they have the performance advantage. And because of this I imagine I will be buying one of these juicy looking new CPUs to replace my ageing but brilliant 4790k.

Lmao to all the *****s who genuinely believed AMD would keep their low prices once they had the upper hand over Intel. These two companies are just as bad as each other.
AMD price gouging hard. The 6 core model has literally doubled in price!

Of course they have the performance advantage. And because of this I imagine I will be buying one of these juicy looking new CPUs to replace my ageing but brilliant 4790k.

Lmao to all the *****s who genuinely believed AMD would keep their low prices once they had the upper hand over Intel. These two companies are just as bad as each other.

$50 more is not double ($299 vs. $249 for the 3600X), but kudos for the QUICK MATH though.

Personally, I am not happy they have put up their prices, which makes their chips now more expensive than Intel's. I am happy with the 3000 series and will wait a year or two for the prices to drop.
 
So the 5600X is 20% more expensive at launch than the 3600X was. That is just too much in my opinion. I bet the value will be much worse than the 3600X's was.
This release is a disappointment to me. Yeah, I get that the whole point is that they have finally caught up with Intel in gaming. Cool, and what do we have now? $300 mid-range CPUs for gaming. Well, thanks!
 
Looks like Intel's 10th gen will be at or below MSRP soon after AMD's launch -- the budget gaming option, lol. The 10600KF has a $237 MSRP for those willing to OC. Gamers Nexus shows you how to do it in like 5 min.

Maybe we'll get lucky and the 5600 non-X comes in at $200.
 
Finally, Goodbye Intel.

From my current i7-8700K to Ryzen 3. Been so long since I've wanted to break away from the Intel monopoly.

I laugh at those who still insist Intel is king for gaming, where a benchmark lead of a few FPS doesn't matter a sh!t anymore.

I hate monopoly and corporate culture.

An all-rounder is always gonna be better than a gaming-only CPU.

The rest of us have better things to do in life than just wasting life in gaming.
 
If the RDNA 2 launch goes as I expect and AMD has only teased the RX 6800 XT instead of the RX 6900 XT(X), then the RX 6900 XT(X) will definitely beat nVidia this generation. We already have Ryzen 5000 beating Intel COMPLETELY so...

If all of this takes place, I hereby give October 2020 the title of "Red October"!
 