AMD is smashing Intel in retail desktop CPU sales

This is hardly surprising. The current generation of Intel processors is generally uncompetitive on both performance and power consumption; they only sell well when Intel slashes prices. On top of that, Intel shot itself in the foot by saying that Rocket Lake would be a 6-to-8-month product, replaced by Alder Lake by the end of the year. So it is likely that even Intel supporters are skipping Rocket Lake, since the socket is a dead end.
 
This is hardly surprising. The current generation of Intel processors is generally uncompetitive on both performance and power consumption; they only sell well when Intel slashes prices. On top of that, Intel shot itself in the foot by saying that Rocket Lake would be a 6-to-8-month product, replaced by Alder Lake by the end of the year. So it is likely that even Intel supporters are skipping Rocket Lake, since the socket is a dead end.
AM4 is a dead end too, then. Who cares about this? If a CPU can't last 4-5 years, it sucked from the beginning. Pointless for me. AMD users ramble about this all the time, especially Ryzen 1000 and 2000 buyers (mediocre CPUs), who now have the option to upgrade to the 3000 series. WHY THE NEED after only 1-2 years, if your CPU was GOOD when you bought it? Hint: it was not. It sucked. Hence the price. GloFo's 12nm was and is a terrible node, far worse than Intel's 14nm.

Most 300 and 400 series boards did not even get firmware to support the 5000 series, or lacked the VRMs/power delivery to run Ryzen 5000 chips at peak clocks. Some did not even have room for the new firmware in ROM, and new board revisions were released with bigger ROM chips... There are tons of threads with people having issues with old boards and a newer chip.

I have NEVER re-used and will NEVER re-use a motherboard. I buy a system, use it for 4-5 years and then replace it. The GPU gets replaced every 2 years.

No "Intel Supporters" have any reason to upgrade to Rocket Lake.

An 8700K or 9900K at 5 GHz will perform identically to the new chips at the same clock speeds in 99% of workloads, especially gaming. People with those chips have zero reason to upgrade, and power draw is not high in regular gaming.

Only in synthetic burn-ins will it hit high numbers; you never see that in real-world use cases. Hell, the Ryzen 5800X can get really hot too; is that a problem? For some it is, if your cooling sucks. The 5900X and 5950X can get burning hot even at stock. Who cares if a CPU uses 150 or 250 watts if performance is good? Get a good cooler and you will never notice. GPUs use like double or even triple that.


My 9900K at 5.2 GHz hits like 125 watts in gaming on average, locked at 5.2 with no AVX offset. Absolutely smashing the Ryzen 3000 series; only an overclocked 5000 series chip comes close in performance, and I have had this chip for 3 years now, almost 4 in another 3 months... An awesome chip that I have zero reason to replace for another 1-2 years, when true next-gen stuff is out and DDR5 and the next-gen platforms have matured. Ryzen 1000 and 2000 are considered pure trash for gaming, which is why I don't mention them. Even a 5-year-old Intel chip will demolish them in gaming.

For gaming and emulation, Intel is still king. I do tons of emulation, and AMD hardware is simply too wonky for me. Most emulators are heavily optimized for Intel + Nvidia, and this is a fact. Visit emulation forums if you are in doubt.

I tried my Ryzen 3600 (at 4.2 GHz with 3200/C14 memory) for gaming and it was a LOT slower in all games and emulators than my 9900K. It's now in my server, which was the reason it was bought to begin with.
 
If the 11th gen is any indication of how things are going at Intel, "Alder Lake" may become their tombstone caption.

Why would 11th gen be "any" indication, never mind the total indication, of how things are at Intel? Being a multi-hundred-billion-dollar company with a long history of market domination and gigantic teams of very smart people just might also be an indication of how things are at Intel.

My last 2 CPUs have been AMD: a $50 Athlon II X2 250 in 2011, and a 5800X this year, lest anyone think I am some Intel fanboy. I have owned both companies' products going back to the 80s.

Saying 11th gen might plausibly foreshadow their tombstone seems a little dramatic, emotional, and histrionic.

 
Why would 11th gen be "any" indication, never mind the total indication, of how things are at Intel? Being a multi-hundred-billion-dollar company with a long history of market domination and gigantic teams of very smart people just might also be an indication of how things are at Intel.
You confuse "domination" with "monopoly". The latter translates into stagnation and price fixing, which is what Intel has been practicing for many years and is all they are good at now. And those gigantic teams of smart people are long gone.
 
AMD is doing great in the CPU segment; it's good to see.


The 6900 XT is pretty much nowhere to be seen, and the 6800 series has low availability too. Ray tracing performance is mediocre as well.

AMD has never been able to deliver on both fronts at the same time, so I'm not really surprised.


Not touching AMD in my gaming rig yet, and I probably won't. It's simply too wonky overall and lacks the emulation optimizations that I need and use often. I use tons of emulators, for the Switch, PS3 and other consoles, and AMD hardware is simply not good for this.

Maybe in 3-5 years if they keep it up, but I don't expect them to. AMD has had some good runs from time to time, but Intel and Nvidia always came back and slapped them. The new Intel CEO is doing a great job; Bob Swan was a terrible CEO, and I'm glad he got booted.

The situation is not exclusive to AMD, though; the Nvidia 3080 & 3090 were also hard to find initially. As for the 6900 XT being nowhere to be found: you can find them in stock in a lot of stores here in Sydney, Australia. Ditto for the 6800 XT.

As for emulation, my Ryzen 5900X handles & plays every emulator just as well as my mate's Intel setup.

Sure, Intel & Nvidia did slap them hard, but you've got to give it to AMD. The deplorable performance of the Bulldozer CPUs almost put them out of business. The fact that a much smaller company with far less money than its rivals was able to turn things around over these past few years, with their Ryzen lineup & their GPUs becoming quite competitive again, is something to be praised.
 
I'm amazed 15% are buying Intel. They are the equivalent of anti-vaxxers.

Alder Lake is now apparently going to use more power than Rocket Lake; so much for 10nm and big.LITTLE helping out.
When the alternative is a 3300X or 1600 AF, a 10400F starts looking pretty attractive for the same amount of money, especially since there are still no lower-mid-range or low-end Zen 3 CPUs.
 
AMD is doing great in the CPU segment; it's good to see.

It just sucks that their GPUs are not up to par and have terrible availability.

In terms of availability (not exactly their fault right now, though), ray tracing, and the FSR vs. DLSS commentary, sure, there's a reasonable point to be made that AMD is behind Nvidia with their current crop of GPUs. But in terms of raw performance and framerate, the 6800 XT goes absolutely toe to toe with the 3080 (same price point) and beats it in more than just a small number of games...
 
When the alternative is a 3300X or 1600 AF, a 10400F starts looking pretty attractive for the same amount of money, especially since there are still no lower-mid-range or low-end Zen 3 CPUs.
This. Frankly, I think AMD would put the final nail in Intel's enthusiast-level coffin if they released a Ryzen 3 5500 series CPU.
 
I tried my Ryzen 3600 (at 4.2 GHz with 3200/C14 memory) for gaming and it was a LOT slower in all games and emulators than my 9900K. It's now in my server, which was the reason it was bought to begin with.
Good job comparing a Ryzen 3600 with an i9-9900K. Amazing!

Because they are really in the same tier and compete with each other, right?

Why don't we do a Ryzen 3800XT vs. i9-9900K comparison and see if the difference is still that big, or if Intel loses in some areas and doesn't win everything by a mile...

Ignorance is bliss, they say; I call it stupidity.
 
You confuse "domination" with "monopoly". The latter translates into stagnation and price fixing, which is what Intel has been practicing for many years and is all they are good at now. And those gigantic teams of smart people are long gone.
No, I don't confuse that. A monopoly means complete market share, as opposed to domination, which is a fair assessment of what Intel was in the 2010s. Or at least through the early-to-mid 2010s...

And lol, no, Intel did not lose all their smart people. Do you have any idea how many people work there, and who they are?

Jim Keller himself has said they have many smart people there.

I think I will take the word of a guy who worked there recently, and who is a celebrity in the chip design game, over yours, especially since you don't seem to be stating anything accurate whatsoever.
 
Who cares if a CPU uses 150 or 250 watts if performance is good? Get a good cooler and you will never notice. GPUs use like double or even triple that.
Lots of people, me included, for many reasons: the heat generated in the room, the electric bill, and the desire to run the power supply unit at peak efficiency, which is at around 50-55% of its max rating.

And what percentage of GPUs use twice or triple 150-250 watts? Like 2% of the market? I play at 1440p, and my GPU uses 125 watts, 145 if I OC it. It's a 1660 Super, which is far more representative of the market as a whole. Most people have cards like a 1660 Super, 1650 Super, 1060, 1070, or others in similar price and wattage ranges.

And even if many more people used GPUs that burned 300-400 watts, it in no way follows that you would not care about the wattage of the CPU.

If anything, it would be all the MORE reason to want that CPU wattage to be on the lower side.
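
To put rough numbers on the electric-bill and PSU-efficiency points, here's a minimal back-of-envelope sketch in Python. Every input (electricity price, daily hours, component wattages) is an assumption for illustration, not a measurement:

```python
# Back-of-envelope sketch; all inputs are illustrative assumptions.
CPU_DELTA_W = 100      # assumed extra CPU draw: 250 W vs. 150 W under load
HOURS_PER_DAY = 4      # assumed daily hours at that load
PRICE_PER_KWH = 0.30   # assumed electricity price in $/kWh (varies by region)

extra_kwh_per_year = CPU_DELTA_W / 1000 * HOURS_PER_DAY * 365
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH
print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year, "
      f"~${extra_cost_per_year:.0f}/year")

# PSU sizing: many units hit peak efficiency around 50-55% of rated output.
system_load_w = 125 + 145 + 80  # assumed CPU + GPU + rest-of-system draw
target_fraction = 0.525         # middle of the 50-55% band
print(f"System load ~{system_load_w} W -> "
      f"~{system_load_w / target_fraction:.0f} W PSU for peak efficiency")
```

On these assumptions, the 100 W CPU delta works out to roughly $40-45 a year, small but not nothing, and the extra heat ends up in the room either way.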
 
I cannot say I am surprised by this.
Why anyone would think their level of surprise is relevant to an article like this, I will never really understand. No one is surprised unless they have been away from tech news for over 3 years. I'm not surprised, for example, that Intel has had trouble stepping down to the next node, because it has been in the news for several years. But then, no one is surprised by that either.

I'm also not (insert emotion) by it.

It is far more informative to relay what emotion one does feel about a thing, rather than the emotions one does not feel and would not be expected to feel.

The Ryzen news is pleasing, if only because we get more competition.
 
AM4 is a dead end too, then. Who cares about this? If a CPU can't last 4-5 years, it sucked from the beginning. Pointless for me. AMD users ramble about this all the time, especially Ryzen 1000 and 2000 buyers (mediocre CPUs), who now have the option to upgrade to the 3000 series. WHY THE NEED after only 1-2 years, if your CPU was GOOD when you bought it? Hint: it was not. It sucked. Hence the price. GloFo's 12nm was and is a terrible node, far worse than Intel's 14nm.

Most 300 and 400 series boards did not even get firmware to support the 5000 series, or lacked the VRMs/power delivery to run Ryzen 5000 chips at peak clocks. Some did not even have room for the new firmware in ROM, and new board revisions were released with bigger ROM chips... There are tons of threads with people having issues with old boards and a newer chip.

I have NEVER re-used and will NEVER re-use a motherboard. I buy a system, use it for 4-5 years and then replace it. The GPU gets replaced every 2 years.

No "Intel Supporters" have any reason to upgrade to Rocket Lake.

An 8700K or 9900K at 5 GHz will perform identically to the new chips at the same clock speeds in 99% of workloads, especially gaming. People with those chips have zero reason to upgrade, and power draw is not high in regular gaming.

Only in synthetic burn-ins will it hit high numbers; you never see that in real-world use cases. Hell, the Ryzen 5800X can get really hot too; is that a problem? For some it is, if your cooling sucks. The 5900X and 5950X can get burning hot even at stock. Who cares if a CPU uses 150 or 250 watts if performance is good? Get a good cooler and you will never notice. GPUs use like double or even triple that.


My 9900K at 5.2 GHz hits like 125 watts in gaming on average, locked at 5.2 with no AVX offset. Absolutely smashing the Ryzen 3000 series; only an overclocked 5000 series chip comes close in performance, and I have had this chip for 3 years now, almost 4 in another 3 months... An awesome chip that I have zero reason to replace for another 1-2 years, when true next-gen stuff is out and DDR5 and the next-gen platforms have matured. Ryzen 1000 and 2000 are considered pure trash for gaming, which is why I don't mention them. Even a 5-year-old Intel chip will demolish them in gaming.

For gaming and emulation, Intel is still king. I do tons of emulation, and AMD hardware is simply too wonky for me. Most emulators are heavily optimized for Intel + Nvidia, and this is a fact. Visit emulation forums if you are in doubt.

I tried my Ryzen 3600 (at 4.2 GHz with 3200/C14 memory) for gaming and it was a LOT slower in all games and emulators than my 9900K. It's now in my server, which was the reason it was bought to begin with.
Stop talking crap, please. Ryzens are superior CPUs, and there's no doubt about it.
 
Stop talking crap, please. Ryzens are superior CPUs, and there's no doubt about it.

So all of a sudden Ryzen 1000 & 2000 are garbage? I need to call my brother at work today and tell him his Ryzen 2000 series is useless garbage now.

Now I've got to take his PC to a local recycling center on Monday, since the town he lives in is closed on weekends. What CPU does he need to play the games that the Ryzen supposedly can't handle?
 
I'm not surprised; the 7nm 5000 series is a better buy than Intel's 14nm 10th or 11th gen. The 3000 series was good too, but not quite so clear-cut, as gamers were still better off with Intel. The 1000 and 2000 series Ryzens were quite poor in hindsight, in particular the 2000 series, which competed with Intel's 8th gen.

But Intel's 14nm parts are very close, and you do wonder whether Intel can get their fabs in order, or use TSMC, to make much faster chips than Ryzen 5000. If 14nm is that close, then 10nm should be a lot faster.

Still, it is nice to see the blue moon that is AMD beating Intel again. The last time that happened, I was a reckless teenager!

 
But Intel's 14nm parts are very close, and you do wonder whether Intel can get their fabs in order, or use TSMC, to make much faster chips than Ryzen 5000. If 14nm is that close, then 10nm should be a lot faster.
No. 14nm+++++++ was made for high clocks, no matter the power consumption.

10nm is more about low power consumption. It's possible Intel's 10nm will never clock faster than 14nm+++++++.
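
There is a standard first-order reason the trade-off works that way: dynamic CPU power goes roughly as P = C·V²·f, and near a node's frequency ceiling the voltage itself has to rise with clock speed, so power grows much faster than linearly. Here is a minimal sketch of that textbook model; the constants and the linear V-f assumption are made up for illustration, not measurements of any actual Intel node:

```python
# First-order dynamic power model: P = C * V^2 * f.
# Assumes voltage must rise roughly linearly with frequency near the
# node's limit, so power ends up scaling close to f^3. Constants are
# invented for illustration only.
def dynamic_power(freq_ghz, v_base=1.0, f_base=4.0, v_per_ghz=0.1, c=30.0):
    """Relative dynamic power at a given clock under an assumed V-f curve."""
    voltage = v_base + v_per_ghz * (freq_ghz - f_base)
    return c * voltage**2 * freq_ghz

for f in (4.0, 4.5, 5.0, 5.3):
    print(f"{f:.1f} GHz -> {dynamic_power(f):6.1f} (arbitrary units)")
```

On this toy curve, going from 4.0 to 5.3 GHz (about a 33% clock bump) costs roughly 70% more power, which is the shape of the high-clocks-at-any-cost behavior being described.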
 
Mindfactory is one (1) store in Germany. Statistically, an N of one (1) store has no valid significance; it is essentially an anecdotal report. That doesn't smash much.
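
To make the statistics point concrete: a huge unit count from one retailer pins down that retailer's mix very precisely, yet it remains a sample of one store. A quick normal-approximation sketch (the unit counts are hypothetical):

```python
import math

# Hypothetical week of CPU unit sales at a single retailer.
amd_units, total_units = 17_000, 20_000  # made-up numbers for illustration
p = amd_units / total_units

# 95% normal-approximation confidence interval for THIS store's AMD share.
margin = 1.96 * math.sqrt(p * (1 - p) / total_units)
print(f"AMD share at this store: {p:.1%} +/- {margin:.2%}")

# The interval is tiny because n (units) is large, but the store itself is
# a single cluster: nothing here generalizes to other retailers, OEM and
# prebuilt sales, or the market as a whole.
```

In other words, the uncertainty that matters is selection bias across stores, not sampling noise within one store.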
 
Fixed that a bit for you. :)
They are not the biggest d!cks in the industry for no reason. They remind me of the tale of the scorpion.

They will backstab and screw anyone and everyone, especially their partners and customers.
Actually, it is the "Tail" of the scorpion.
 
What makes you think things are different in other shops? Amazon and Newegg tell a similar story.

Also, those are actual sales stats, not any Steam Survey-type BS.
My comment is on this story found above on TechSpot.com, not about speculation regarding other stores in other countries. These numbers, to be fornicating redundant, are not found in THIS story. Besides, AMD holds 22% of the market versus Intel; not exactly "knocking the king off the throne."
As a personal anecdote, I only own AMD CPUs, with a 3900X in my main system.
 
My comment is on this story found above on TechSpot.com, not about speculation regarding other stores in other countries. These numbers, to be fornicating redundant, are not found in THIS story. Besides, AMD holds 22% of the market versus Intel; not exactly "knocking the king off the throne."
As a personal anecdote, I only own AMD CPUs, with a 3900X in my main system.

22% of the market, yes. But Intel used to have 98% of the server market; now AMD has around 15%. And AMD is doing it with better CPUs, not just lower prices. Intel can still rely on old and loyal customers, but not forever.

AMD beating Intel so heavily in retail sales without competing on price is pretty impressive. AMD also has the best CPU in every major category; Intel has some niches where it can compete. Basically, AMD does everything better than Intel, and that's enough to "knock the king off the throne." The market is just too stupid to realize that. Yet.
 
22% of the market, yes. But Intel used to have 98% of the server market; now AMD has around 15%. And AMD is doing it with better CPUs, not just lower prices. Intel can still rely on old and loyal customers, but not forever.

AMD beating Intel so heavily in retail sales without competing on price is pretty impressive. AMD also has the best CPU in every major category; Intel has some niches where it can compete. Basically, AMD does everything better than Intel, and that's enough to "knock the king off the throne." The market is just too stupid to realize that. Yet.
"My comment in on this story found above in TechSpot.com; not about speculations in other stores in other countries. These numbers, to be fornicating redundant, are not found in THIS story. Besides, AMD hold 22% of the market versus Intel -- not exactly 'knocking the king off the throne.' "
Having the best CPU in every category means little when you Have 22% Market Share; again, not smashing much!
Well-read users don't run corporations that have been buying Intel for so many years that they would not recognize AMD if they tripped over it, and it reached up and bit them on the leg.
Of course, AMD could price the 5000 series like the 3000 series, too.
 
"My comment in on this story found above in TechSpot.com; not about speculations in other stores in other countries. These numbers, to be fornicating redundant, are not found in THIS story. Besides, AMD hold 22% of the market versus Intel -- not exactly 'knocking the king off the throne.' "
Having the best CPU in every category means little when you Have 22% Market Share; again, not smashing much!
Well-read users don't run corporations that have been buying Intel for so many years that they would not recognize AMD if they tripped over it, and it reached up and bit them on the leg.
Of course, AMD could price the 5000 series like the 3000 series, too.
22% overall market share means quite a lot, because it proves Intel cannot keep selling long term just because it's Intel. AMD has better products everywhere and continuously eats into Intel's market share; that momentum is what's smashing, and the overall share by itself doesn't mean much. If you look at overall GPU market share, Intel has around 60% versus roughly 20% each for Nvidia and AMD, thanks to integrated graphics. So is Intel smashing AMD and Nvidia on the GPU side? 🤔
 