Nvidia reaches historic 92% GPU market share, leaves AMD and Intel far behind

What? Nvidia makes mobile cards... what are you smoking? Also, AMD's APUs are an ENTIRELY different beast than their discrete cards. Their APUs are also trash compared to Intel's. AMD is literally doing EVERYTHING wrong, as they always have.

They should just spin off their graphics cards into a separate company and divest it from their infrastructure. Let ATI exist again so we can get some real GPU competition.
Most of Nvidia discrete sales are mobile GPUs. What's wrong with AMD APUs?
Your timeline is way off. There have been times when AMD had more market share than Nvidia, and many times when they have been ahead on performance and/or efficiency.

Things only started to get weird when Moore's Law came to an end: certain parts of chips can no longer be shrunk down, while the remaining parts get closer and closer to the atomic limit.

Now a huge factor in which chips are better isn't which design is better but who gets priority access to TSMC's best nodes. That's a big part of why Apple and Nvidia hardware stays ahead: they have deeper pockets to buy priority access.
Looking at discrete share, the last time AMD had more share was 2005. That is, about 20 years ago.

Nvidia does not use any "priority access nodes" like Apple does. Nvidia made the RTX 3000 series on the inferior Samsung 8nm process. Currently Nvidia uses a custom 5nm TSMC process that is unlikely to be much better than the standard 5nm AMD is using.


lol - I know you wear red-tinted glasses, but try and come back to reality a bit. If AMD could match Nvidia’s high-end GPU, they’d sell out in minutes… people are desperate for high-end GPUs and they sell out instantly.
Yeah right. AMD has plenty of professionals thinking about these things. Of course you know better. Even if they did sell out in minutes, would it still be profitable? Again, professionals make these calculations, so the answer is very probably no.
Even AMD isn't so dumb as to say, "we don't want any of that money"... the reason they don't sell in the high end is that they can't make one. And no, one with a 600W power envelope doesn't count.

That's overall - because AMD fails so spectacularly at selling laptop CPUs... in the desktop market, they are much higher.
AMD has calculated that designing and manufacturing high end chips is not profitable. Simple as that.

And why does AMD fail to sell laptop CPUs? Because Intel laptop CPUs are better? No, because stupid people want Intel, no matter how bad the products actually are. So no, an AMD high-end GPU would probably not sell out instantly.
 
This seems more like a fanboy-coping mechanism than actual facts.

I have had Intel CPUs for most of my life, not because of "brand loyalty", the name, or the color "blue"... Every 3-4 years I bought a new PC, and Intel CPUs simply beat AMD in gaming performance based on a multitude of benchmarks from various sources. The only time I made a bad choice was when I got a first-generation i5 that people said would be perfectly fine for multitasking, but the machine simply wasn't performant enough; when I switched to an i7 I had zero issues.

My new computer is coming in a few weeks, and for the first time in about 10-15 years it will have an AMD CPU inside. Why? Because it's currently the most performant in gaming.

People that stubbornly remain loyal to some company that doesn't care a single bit about them are pathetic in my opinion. The only thing I'm loyal to is performance, and for that matter I couldn't care less what logo is on it; I would sport a CPU or GPU with a swastika on it if it was the most performant for gaming.
Problem is: your logic only applies to ........ you. You see, despite Intel being behind AMD on servers for 6 years straight, Intel still has 72% server share. Why? Because of brand loyalty.

Intel desktop share is 72% vs 28% for AMD. Why? Because of brand loyalty.

I doubt you buy many CPUs. Even if YOU buy the CPU with the most performance, the majority of the market does not.

So, to summarize the comments: as usual it's all circumstantial, it has nothing at all to do with the poor Radeon user experience vs GeForce, it's just the AI/crypto/whatever fad and not GAMErZ anyway, and we just need to have faith in the opinion pieces of the TechTubers and Mindfactory instead.

Brace for more crushing disappointment.
You summarized it well. That's the truth. If you disagree, care to explain why Intel still dominates the CPU market despite AMD having better products in every category? And AMD CPUs even last, unlike Intel's Raptor Lakes that burn out like flies...
 
Looking at discrete share, the last time AMD had more share was 2005. That is, about 20 years ago.
It wasn't until 2014 that AMD's market share started to dip substantially. Before then they consistently had about 40% of the market. The timing of the shift also has a lot to do with the first big bitcoin mining craze, the first time mining had a big impact on the GPU market.

[Image: jpr_q2_2016_amd_vs_nvda_SHARE.png (JPR chart: AMD vs Nvidia discrete GPU market share through Q2 2016)]


Nvidia does have a node advantage. RTX 4000 was a node ahead of RDNA3 and RTX 5000 is a node ahead of RDNA4.

RDNA3 was AMD's first attempt at chiplets, and RDNA2 was generally faster than RTX 3000 in standard raster workloads.
 
Problem is: your logic only applies to ........ you. You see, despite Intel being behind AMD on servers for 6 years straight, Intel still has 72% server share. Why? Because of brand loyalty.

Intel desktop share is 72% vs 28% for AMD. Why? Because of brand loyalty.

I doubt you buy many CPUs. Even if YOU buy the CPU with the most performance, the majority of the market does not.

In your first part you're talking about server CPUs, and in the second part you suddenly shift to DESKTOP CPUs... and in most of your posts you're talking about consumer CPUs/GPUs.

So are we talking about companies or consumers when you refer to the 'market'? Or do you just cherry-pick whichever, whenever, to make your point? Because I was purely talking about brand loyalty from a consumer's standpoint, and given your previous posts you were not shy about showing off your red-tinted glasses.

What I know about gamers (=consumers) is that they care about GAMING performance (FPS), how easy their hardware is to cool, and how stable it is.

They will simply get the most performant hardware that fits their budget, which typically rules out any CPU above 1000 bucks - which coincidentally includes the whole bunch of AMD Ryzen Threadripper PRO chips currently priced at 5000-15000 USD/EUR. Also because those are meant for workloads other than gaming, obviously.

In terms of stability, people are shifting over to AMD CPUs because of the (12900)/13900/14900 degrading-chips fiasco. AMD has had similar issues in the past: some 15 years ago with chips that ran extremely hot, in the past few years with Infinity Fabric stuttering, and even more recently, to get the best performance out of AMD chips you practically need to purchase a Process Lasso license to make sure games automatically use the cache cores properly, because the chipset drivers don't activate them for every single game.

When you're talking about companies? They typically take whatever they can get the cheapest contracts for, even if the performance is not the very best. I worked at a multinational company that cheaped out on laptops because, well... it cost them a few million less. Those cheap laptops caused a whole bunch of delays; developers were unable to do their jobs because the laptops were too weak for their intended purpose, and it ended up costing more in the long run.

Brand loyalty for companies is keeping $$$ in their pockets. If you have ever joined a company IT meeting about hardware purchases, you know exactly what gets discussed. The most common talking point is the PRICE TAG and how the company can keep it DOWN.
 
It wasn't until 2014 that AMD's market share started to dip substantially. Before then they consistently had about 40% of the market. The timing of the shift also has a lot to do with the first big bitcoin mining craze, the first time mining had a big impact on the GPU market.
Having a clearly better product (around 2010 onward) only resulted in "around 40%" share. Meaning an "equally good product" would mean somewhere around 25-30% share. So why should AMD even bother? It should be around 70% with a clearly better product and 50% with an equal product.
[Image: jpr_q2_2016_amd_vs_nvda_SHARE.png (JPR chart: AMD vs Nvidia discrete GPU market share through Q2 2016)]


Nvidia does have a node advantage. RTX 4000 was a node ahead of RDNA3 and RTX 5000 is a node ahead of RDNA4.

RDNA3 was AMD's first attempt at chiplets, and RDNA2 was generally faster than RTX 3000 in standard raster workloads.
RTX 5000 is using a custom TSMC 5nm process, RDNA4 is using standard TSMC 5nm. There might be a small advantage because of the custom process, but again, small. Apple clearly has an advantage, a pricey one.

RDNA2 also had some problems that AMD could have avoided if its priority had been consumer-class GPUs. Unsurprisingly, the priority was elsewhere.
In your first part you're talking about server CPUs, and in the second part you suddenly shift to DESKTOP CPUs... and in most of your posts you're talking about consumer CPUs/GPUs.

So are we talking about companies or consumers when you refer to the 'market'? Or do you just cherry-pick whichever, whenever, to make your point? Because I was purely talking about brand loyalty from a consumer's standpoint, and given your previous posts you were not shy about showing off your red-tinted glasses.
Both examples are valid. Pick ANY x86 CPU class (consumer, server, mobile) and you see clear customer bias for Intel.

And as shown previously, even when AMD had the better GPU, Nvidia still sold more. All these CPU and GPU sales tell the same story: brand loyalty exists, and it's not for AMD.
What I know about gamers (=consumers) is that they care about GAMING performance (FPS), how easy their hardware is to cool, and how stable it is.

They will simply get the most performant hardware that fits their budget, which typically rules out any CPU above 1000 bucks - which coincidentally includes the whole bunch of AMD Ryzen Threadripper PRO chips currently priced at 5000-15000 USD/EUR. Also because those are meant for workloads other than gaming, obviously.

In terms of stability, people are shifting over to AMD CPUs because of the (12900)/13900/14900 degrading-chips fiasco. AMD has had similar issues in the past: some 15 years ago with chips that ran extremely hot, in the past few years with Infinity Fabric stuttering, and even more recently, to get the best performance out of AMD chips you practically need to purchase a Process Lasso license to make sure games automatically use the cache cores properly, because the chipset drivers don't activate them for every single game.
Most consumers don't care at all about FPS. Again, you are talking simply from your own POV. Just look at overall GPU share. Intel dominates there. You think those machines with integrated trash are for high-FPS gaming?

AMD consumer-class Ryzens with chiplets are not really even meant for gaming. That is because 1. most people just don't care about high FPS and 2. the chiplet design is very cheap to manufacture. AMD could make monolithic Ryzens for high-FPS gaming, but the market is so small there is not enough profit to be made.
When you're talking about companies? They typically take whatever they can get the cheapest contracts for, even if the performance is not the very best. I worked at a multinational company that cheaped out on laptops because, well... it cost them a few million less. Those cheap laptops caused a whole bunch of delays; developers were unable to do their jobs because the laptops were too weak for their intended purpose, and it ended up costing more in the long run.

Brand loyalty for companies is keeping $$$ in their pockets. If you have ever joined a company IT meeting about hardware purchases, you know exactly what gets discussed. The most common talking point is the PRICE TAG and how the company can keep it DOWN.
Oh OK, then AMD should have outsold Intel by a wide margin on laptops, as AMD laptops are generally cheaper. Again, you have one example, but the big picture is different.

Intel must be offering very heavy discounts on server CPUs to match AMD pricing, and I doubt they are doing that. Looking at how rapidly Intel is losing share, even more so. The only question is how Intel still has more share despite AMD being much better everywhere. Because stupid companies just refuse to admit AMD has better CPUs.
 
Having a clearly better product (around 2010 onward) only resulted in "around 40%" share. Meaning an "equally good product" would mean somewhere around 25-30% share. So why should AMD even bother? It should be around 70% with a clearly better product and 50% with an equal product.

RTX 5000 is using a custom TSMC 5nm process, RDNA4 is using standard TSMC 5nm. There might be a small advantage because of the custom process, but again, small. Apple clearly has an advantage, a pricey one.

RDNA2 also had some problems that AMD could have avoided if its priority had been consumer-class GPUs. Unsurprisingly, the priority was elsewhere.

Both examples are valid. Pick ANY x86 CPU class (consumer, server, mobile) and you see clear customer bias for Intel.

And as shown previously, even when AMD had the better GPU, Nvidia still sold more. All these CPU and GPU sales tell the same story: brand loyalty exists, and it's not for AMD.

Most consumers don't care at all about FPS. Again, you are talking simply from your own POV. Just look at overall GPU share. Intel dominates there. You think those machines with integrated trash are for high-FPS gaming?

AMD consumer-class Ryzens with chiplets are not really even meant for gaming. That is because 1. most people just don't care about high FPS and 2. the chiplet design is very cheap to manufacture. AMD could make monolithic Ryzens for high-FPS gaming, but the market is so small there is not enough profit to be made.

Oh OK, then AMD should have outsold Intel by a wide margin on laptops, as AMD laptops are generally cheaper. Again, you have one example, but the big picture is different.

Intel must be offering very heavy discounts on server CPUs to match AMD pricing, and I doubt they are doing that. Looking at how rapidly Intel is losing share, even more so. The only question is how Intel still has more share despite AMD being much better everywhere. Because stupid companies just refuse to admit AMD has better CPUs.
Considering AMD absorbed ATI as it failed, maintaining 40% for years was pretty good.

Anyway, the point wasn't which is better. It's just that AMD had high market share up until the first bitcoin mining craze that more or less derailed things.

Please don't write me another 2000 word essay that says almost nothing.
 
Considering AMD absorbed ATI as it failed, maintaining 40% for years was pretty good.

Anyway, the point wasn't which is better. It's just that AMD had high market share up until the first bitcoin mining craze that more or less derailed things.

Please don't write me another 2000 word essay that says almost nothing.
AMD's market share was already going downhill before any major GPU crypto mining happened, so that explanation is pretty poor.
 
Some good news for RDNA 4 customers: the FSR 4 games list just expanded to 65 games.
 
AMD seems to be happy selling the volume of Radeons it does (8% or whatever) right now, or it would focus on producing more, since there clearly is demand, whereas Nvidia is looking to cut production. Neither AMD nor Nvidia is in any way loyal to or focused on the gaming market (right now at least). PC gaming often has been, and perhaps continues to be, viewed as a dying market, or at least one about to move to cloud services such as GeForce Now and the like, where a dGPU is much less of a needed thing.
 
The 7900 XTX is a powerful card; I'd not change it for anything this gen (or maybe even the next - I'm still holding onto a 6950 XT!). Thing is, though, consumer sentiment towards the card (and the 9070 XT) is poor.

Tell a casual gamer the XTX has performance parity with the RTX 4080 / RTX 5070 Ti and they won't believe you. Herein lies AMD's problem trying to sell anything to consumers.
Not sure what you mean here... The 9070 XT has raster parity with the 5070 Ti and essentially smokes the 7900 XTX in ray tracing.
I would say the 9070 XT is a very decent gaming card and also a very decent value, at least in Canada, where one at around 10-15% over MSRP is usually available nowadays.

Sure, the content creation / AI value is significantly lower, no question about it. I would love to see Adobe or the CAD software companies adopt a less CUDA-centric approach; I'm sure AMD would perform better in this respect, like it does in some optimized gaming titles. Wishful thinking, I know.
 
AMD's market share was already going downhill before any major GPU crypto mining happened, so that explanation is pretty poor.

Hardly.

I'm not trying to explain anything other than that AMD consistently did pretty well in the GPU market before mining got big. That is a fact.

You need to have your eyes checked and take another look at that graph. No company would be sad to have ~40% of a market except maybe Nvidia.
 
Hardly.

I'm not trying to explain anything other than that AMD consistently did pretty well in the GPU market before mining got big. That is a fact.

You need to have your eyes checked and take another look at that graph. No company would be sad to have ~40% of a market except maybe Nvidia.
AMD's share had been declining since 2005, despite some better quarters. The first Bitcoin mining craze started in 2013. AMD's share dipped in Q3 2014, well after the first Bitcoin mining craze ended, and never recovered.

Having a clearly better product and only around 40% market share. Happy with that? No. Because an equal product would then give much less than 40%. GPU development takes money, and if the results are that bad, no wonder AMD invested elsewhere.

Just like Intel. They could have done much better on discrete GPUs. But developing better GPUs costs money. Will that money ever come back? Intel is balancing development costs against expected revenue.
 
Not sure what you mean here... The 9070 XT has raster parity with the 5070 Ti and essentially smokes the 7900 XTX in ray tracing.
I would say the 9070 XT is a very decent gaming card and also a very decent value, at least in Canada, where one at around 10-15% over MSRP is usually available nowadays.
You and I both know the Radeons are good value and performance; my point was that the average gamer (who has been marketed to and likely won't bother with research) doesn't know.

They identify Nvidia as good, Radeon as bad - at least more often than not anyway. AMD makes good products for the most part, but they should fire their marketing team, given that Nvidia has most of the mindshare among buyers.
 
AMD's share had been declining since 2005, despite some better quarters. The first Bitcoin mining craze started in 2013. AMD's share dipped in Q3 2014, well after the first Bitcoin mining craze ended, and never recovered.

Having a clearly better product and only around 40% market share. Happy with that? No. Because an equal product would then give much less than 40%. GPU development takes money, and if the results are that bad, no wonder AMD invested elsewhere.

Just like Intel. They could have done much better on discrete GPUs. But developing better GPUs costs money. Will that money ever come back? Intel is balancing development costs against expected revenue.
They didn't have clearly better products. They often did have better performance and/or efficiency, but they also had a lot of driver issues.

Anyway, your rants have little to do with the points I've already made.
 
They didn't have clearly better products. They often did have better performance and/or efficiency, but they also had a lot of driver issues.

Anyway, your rants have little to do with the points I've already made.
Like the GTX 480 was better than AMD's offerings? It wasn't. Driver issues are an old, untrue story. Remember that famous 5000 series black screen "bug"? Even Techspot was not able to reproduce it. It was just Nvidia propaganda.

What actually are your points? You claimed Nvidia has a node advantage, which it basically does not. You also said AMD's decline started in 2014 because of Bitcoin and/or node development. Those have very little to do with it. We know that Zen development started in 2012, and AMD has confirmed that Zen took resources away from GPU development.
 
+1 here kirby, AMD always was and still is a hardware-only company, which is why they like doing consoles so much. Look at the software they make: they only copy from others, with absolutely nothing of their own, while if you look at Nvidia, they have TONS of software. Obviously CUDA is their biggest, with gigabytes of code, but it is far, far from the only one. Consoles are an easier market for them because they PROACTIVELY TRY TO AVOID making software. Did you ever ask yourself why they dropped their fglrx Linux driver and open-sourced their drivers as radeonhd and amdgpu (the AMD Linux kernel drivers)? Do you really think it is only because they want to save money on software development and let free open source developers do it instead? Or... is it also because they are bad at it? History proves they are bad at it. I am 100% sure that if it were possible for them to open-source the Windows drivers, they would have done it by now :)
 
Most consumers don't care at all about FPS. Again, you are talking simply from your own POV. Just look at overall GPU share. Intel dominates there. You think those machines with integrated trash are for high-FPS gaming?
I'm pretty sure I made a clear distinction for gamers in my post, who are obviously also part of the consumer group; maybe don't misread or misconstrue it to make some bogus 'gotcha' moment?

AMD consumer-class Ryzens with chiplets are not really even meant for gaming. That is because 1. most people just don't care about high FPS and 2. the chiplet design is very cheap to manufacture. AMD could make monolithic Ryzens for high-FPS gaming, but the market is so small there is not enough profit to be made.
I find it funny how you simultaneously say people are *****s because they don't know what they're buying, and at the same time claim they buy Intel out of some perceived brand-loyalty bias (your POV).

You do realize that if people have no clue what they're buying, they'll just buy whatever, right? And not because of some bias perceived by someone with obvious red-tinted glasses (you)? If they don't care about performance at all, it doesn't matter what they're buying, because anything will do. For 99% of people, a Chromebook is enough for what they do with computers: browsing, looking at e-mails and porn.

Those that DO care about FPS, which are typically gamers, typically buy what is most performant, at least according to benchmarks; some only look at one source like GamersNexus, some at dozens of sources, while others look at bad sources to make their choices. But in the end, gamers care about FPS.

Intel must be offering very heavy discounts on server CPUs to match AMD pricing, and I doubt they are doing that. Looking at how rapidly Intel is losing share, even more so. The only question is how Intel still has more share despite AMD being much better everywhere. Because stupid companies just refuse to admit AMD has better CPUs.
In one post you're complaining about some perceived bias by companies because they keep buying Intel CPUs, but simultaneously you say Intel is rapidly losing share. What point are you even trying to make? Because if Intel is rapidly losing share like you claim, there is no bias; companies are buying the CHEAPEST option available... because, like you said, they don't care about performance :).

So you're basically confirming my previous statement that companies simply pick what saves them the most money, which may currently be AMD but previously was Intel. Do you even read what you post?
 
I'm pretty sure I made a clear distinction for gamers in my post, who are obviously also part of the consumer group; maybe don't misread or misconstrue it to make some bogus 'gotcha' moment?
Gamers are a very small subgroup of consumers. When talking about CPU market shares, gamers are such a small group that we mostly don't need to consider them.
I find it funny how you simultaneously say people are *****s because they don't know what they're buying, and at the same time claim they buy Intel out of some perceived brand-loyalty bias (your POV).

You do realize that if people have no clue what they're buying, they'll just buy whatever, right? And not because of some bias perceived by someone with obvious red-tinted glasses (you)? If they don't care about performance at all, it doesn't matter what they're buying, because anything will do. For 99% of people, a Chromebook is enough for what they do with computers: browsing, looking at e-mails and porn.
Intel brand loyalty is very clear on servers. On the consumer market, OEMs are loyal to Intel, and because Intel has more availability, most CPUs sold are naturally Intel.

Some buy Intel because Intel. Some buy whatever is available, and then it matters what is available, so Intel. AMD beats Intel on retail sales; most intelligent people (=those who know what they are buying) prefer AMD. Sadly the retail market is much smaller than the OEM market. Chromebooks have their own problems, like depending more heavily on a working internet connection.
Those that DO care about FPS, which are typically gamers, typically buy what is most performant, at least according to benchmarks; some only look at one source like GamersNexus, some at dozens of sources, while others look at bad sources to make their choices. But in the end, gamers care about FPS.

In one post you're complaining about some perceived bias by companies because they keep buying Intel CPUs, but simultaneously you say Intel is rapidly losing share. What point are you even trying to make? Because if Intel is rapidly losing share like you claim, there is no bias; companies are buying the CHEAPEST option available... because, like you said, they don't care about performance :).

So you're basically confirming my previous statement that companies simply pick what saves them the most money, which may currently be AMD but previously was Intel. Do you even read what you post?
Gamers again are such a small group that when talking about overall market share, they barely make a difference.

There is still bias, because Intel should have lost much more market share if it was just about product quality and/or price. Intel is not the cheapest option either in most cases.

AMD has always been the cheaper choice and still is. People are not buying Intel for price, not for performance, not for power consumption... For what? Exactly, brand loyalty.
 
+1 here kirby, AMD always was and still is a hardware-only company, which is why they like doing consoles so much. Look at the software they make: they only copy from others, with absolutely nothing of their own, while if you look at Nvidia, they have TONS of software. Obviously CUDA is their biggest, with gigabytes of code, but it is far, far from the only one. Consoles are an easier market for them because they PROACTIVELY TRY TO AVOID making software. Did you ever ask yourself why they dropped their fglrx Linux driver and open-sourced their drivers as radeonhd and amdgpu (the AMD Linux kernel drivers)? Do you really think it is only because they want to save money on software development and let free open source developers do it instead? Or... is it also because they are bad at it? History proves they are bad at it. I am 100% sure that if it were possible for them to open-source the Windows drivers, they would have done it by now :)
Makes sense. The Lossless Scaling app, which costs $6, uses 2 GPUs to improve latency in frame generation. Meanwhile, a decade later, AMD still can't get its integrated and discrete GPUs to work together.
 
Like the GTX 480 was better than AMD's offerings? It wasn't. Driver issues are an old, untrue story. Remember that famous 5000 series black screen "bug"? Even Techspot was not able to reproduce it. It was just Nvidia propaganda.

What actually are your points? You claimed Nvidia has a node advantage, which it basically does not. You also said AMD's decline started in 2014 because of Bitcoin and/or node development. Those have very little to do with it. We know that Zen development started in 2012, and AMD has confirmed that Zen took resources away from GPU development.
It's not fun to debate someone who has the facts completely wrong and won't shut up about it.

I've stated my point several times. I'm sorry if your reading comprehension skills are too low to get it.

My point is that before mining took over in 2014, AMD was doing fine in the GPU market. Not as well as Nvidia, but certainly not in trouble. They are behind now for several reasons, the big one being that Nvidia invested in AI much earlier than anyone else, and now AMD is playing catch-up. Nvidia does have a node advantage today, and has for the last few years, which is a factor, but not as big a factor as AI.

The rate at which AMD is catching up now is quick, and they are nearing feature parity with Nvidia while offering more RAM and better performance per dollar. There is no reason why they can't succeed in the GPU market at some point in the future.

I agree that Nvidia is the default choice for most people who don't follow hardware closely, but today there are tons of youtube channels that talk about this stuff in simple terms and lately they are all saying AMD is crushing it at the most common price points.


Stop quoting me about this or I am just going to put you on ignore. The points I am making about the past are just numbers; they are not up for debate. The other points I'm making are speculative. You don't have to agree with them, but I am not going to keep repeating myself over and over until I word it in a way you can understand. If you don't get my point by now, I'm sorry, that's on you.
 
Nvidia does have a node advantage today, and has for the last few years, which is a factor, but not as big a factor as AI.
OK, the Nvidia RTX 3000 series used the Samsung 8nm process vs the AMD RX 6000 series on TSMC 7nm. FYI, TSMC 7nm is superior to Samsung 8nm. So Nvidia didn't have a node advantage on the RTX 3000 series, far from it.
 