Nvidia now holds 80% of the dedicated GPU market

midian182

In a nutshell: Nvidia's upcoming Ampere consumer cards will help tighten its grip on the dedicated GPU industry, an area that it already dominates. According to a new report, team green has seen its dGPU market share jump from 75% in Q1 2020 to 80% in the second quarter.

Jon Peddie Research's Q2 2020 "Market Watch" report shows that desktop graphics add-in board shipments increased 6.55% from Q1 2020. Intel doesn't (yet) have any product in this field, so Nvidia's five-point quarterly gain has come at AMD's expense, whose share drops to 20%.

It's a different story when it comes to overall graphics, which Intel rules because of the integrated GPUs it offers in many of its CPUs. Chipzilla takes 64% of this sector—down three points—while Nvidia's share jumped from 16% to 19%, and AMD was up one point to 18% QoQ.

It was good news for both gaming giants, as AMD's overall unit shipments were up 8.4% QoQ while Nvidia's jumped 17.8%. Intel, however, was down 2.7%.

Jon Peddie notes that the second quarter is usually down compared to Q1, but the increase in people working from home and gaming—both a result of the pandemic—meant this quarter was up.

"The pandemic has been disruptive and had varying effects on the market. Some sales that might have occurred in Q3 (such as notebooks for the back-to-school season) have been pulled in to Q2 while desktop sales declined. Intel's manufacturing challenges have also negatively affected desktop sales," wrote Jon Peddie, President of JPR.

"We believe the stay at home orders have continued to increase demand in spite of the record-setting unemployment levels. As economies open up, consumer confidence will be an important metric to watch."

Ampere already looks exciting, but Nvidia will have to contend with AMD's Big Navi release this year, while 2021 brings the launch of Intel's Xe HPG gaming GPUs. More competition for Jensen Huang's firm, but more choice for consumers.


 
This isn’t surprising considering how awful Radeon has been of late. They really need to fix their drivers. Hopefully this diminishing market share will push them to stop fobbing users off and actually fix their issues.
 

I honestly think AMD doesn't care anymore. I've been in this game for a long time, and even when they had good drivers and good cards, Nvidia always outsold them. At this point I think they're tired of losing money in this market, and I'm not surprised: they bring out a good card at a good price, Nvidia drops its prices, everyone thanks AMD and then goes and buys a GeForce. It's not exactly a good business to be in.
 
We are talking about a monopoly here, for all intents and purposes. This is why top-of-the-line consumer GPUs have reached and exceeded €1,200.

AMD (ATI) maintains only a token presence in the discrete GPU market.
 

Facepalm. Nvidia's integrated graphics share is pretty much zero. So the more integrated graphics chips AMD sells (which are often faster than Nvidia's low-end discrete offerings), the less discrete market share AMD gains.


You do understand it's all about the number of chips, not speed. A $40 graphics card adds as much market share as an RTX 2080 Ti does.

So basically, if an OEM machine ships with a GT 710 while AMD sells a Ryzen 7 4700G (much faster than the GT 710), Nvidia gains discrete market share while AMD does not.

That is why AMD has lost much of its discrete market share since it released APUs. And that is why discrete card market share does not matter at all.
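A minimal sketch of that counting rule, with invented shipment numbers (every figure below is made up purely for illustration): each unit counts once toward market share regardless of price, which is also why unit share and revenue share can point in opposite directions.

```python
# Unit market share counts boxes shipped, not dollars or frames per second.
# All prices and volumes below are invented for illustration.
shipments = [
    ("Nvidia", "GT 710",        40, 1_000_000),  # (vendor, card, price $, units)
    ("Nvidia", "RTX 2080 Ti", 1200,   100_000),
    ("AMD",    "RX 5700 XT",   400,   500_000),
]

total_units = sum(units for *_, units in shipments)
total_rev = sum(price * units for _, _, price, units in shipments)
for vendor in ("Nvidia", "AMD"):
    units = sum(u for v, _, _, u in shipments if v == vendor)
    rev = sum(p * u for v, _, p, u in shipments if v == vendor)
    print(f"{vendor}: {units / total_units:.1%} unit share, "
          f"{rev / total_rev:.1%} revenue share")
```

In this made-up split, Nvidia takes roughly 69% of units while AMD takes the majority of the revenue, which is exactly the distinction the unit-share headline hides.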
 

Yeah, I agree that AMD probably has a higher market share than Nvidia when it comes to integrated graphics. What's your point? This article is about discrete graphics, and Nvidia doesn't make APUs; and let's face it, if they did, they'd probably be better than AMD's at playing games.

Also, I use AMD's integrated graphics, the Vega 8 in my 3500U laptop. It's better than Intel's stuff, but it's still rubbish really; it can't run much. Skyrim runs at 40 fps, so a nine-year-old game can't even hit 60 fps. You won't get a very good experience in general. And everyone knows AMD's driver support for discrete GPUs is bad; well, it's even worse for integrated. I think I've had two updates in the 13 months I've owned that laptop, which is awful really. But yeah, sure, I only got about 25 fps in Skyrim on the i5-8250U in my previous laptop.
 
Well, any "in your face" comment made against AMD fans would get deleted so quickly here; I wonder how this article got greenlighted ;)
It's not bad to call something that is crap, crap.
The Radeon VII was garbage, and so was the Vega architecture for gamers: nice compute performance for the price, but for gaming you can't even remotely argue.
It's not about liking a particular company; a smart person will buy whatever product gives them the most value and power within their budget.
Sometimes that's Nvidia, sometimes it's AMD or Intel.
Being a fanboy or loyal to a particular brand is for the braindead consumer.
 

Yup, Nvidia doesn't want to kill off RTG; otherwise Nvidia would be subjected to antitrust laws.
But you gotta commend Nvidia for being a relentless innovator despite holding the GPU monopoly, while Intel's incompetence just got everyone laughing...


Same goes for supporting a brand just to keep the "duopoly" status quo. How many times have I heard people say you've gotta buy an AMD GPU to keep GPU prices down? Yeah, it doesn't really work that way, since Nvidia has already taken great care to keep RTG alive; that means jacking up prices so that RTG can maintain a healthy profit margin too.
 

The point is that the more APUs AMD makes, the fewer discrete chips AMD sells. So who cares, and why? The problem with this article is that it really tells us nothing important, but some seem to think it does.

With what CPU (and integrated graphics) would Nvidia run games better than AMD?


The gaming experience on the 3500U is not very good in absolute terms, but for the price, it is.

For driver updates: "AMD recommends OEM-provided drivers which are customized and validated for their system-specific features and optimizations. "

That's a problem with the OEM stuff, not with AMD.


How could Nvidia "kill" RTG when AMD has practically a monopoly on console GPUs and AMD's integrated graphics are far ahead of Intel's? Not to mention Nvidia doesn't even have integrated graphics :D
 

No and no, it's not a monopoly premium. Nvidia charges like 1,200 euros because Nvidia's top chips are freaking expensive to make.

The Radeon RX 5700 XT die is only around 18 × 14.5 mm (estimated, ~251 mm² total), which gives 208 dies per wafer. With a 0.09/cm² defect density, that makes about 165 fully working dies. Assuming a $10K wafer cost, that's roughly $60 per chip.

The RTX 2080 Ti die is about 31 × 25 mm (~775 mm²), which makes (according to a die-per-wafer calculator) only 63 dies per wafer. Assuming the same 0.09/cm² defect density, that yields about 33 fully working dies. Assuming an $8K wafer cost (custom process), that's roughly $242 per chip.

So basically, the RTX 2080 Ti die is about four times more expensive to make than the Radeon 5700 XT die. No wonder AMD didn't want to waste its already limited 7 nm capacity on something similar.
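A rough sketch of that arithmetic for anyone who wants to poke at it. The dies-per-wafer counts and wafer prices are the figures above; the Murphy yield model, the per-cm² reading of the 0.09 defect density, and the 775 mm² (31 × 25 mm) TU102 area are assumptions (the published TU102 figure is closer to 754 mm²).

```python
import math

def good_dies(die_area_mm2, dies_per_wafer, d0=0.09):
    """Murphy yield model; d0 is the assumed defect density per cm^2."""
    a = die_area_mm2 / 100 * d0             # expected killer defects per die
    murphy = ((1 - math.exp(-a)) / a) ** 2  # fraction of dies that survive
    return dies_per_wafer * murphy

for name, area_mm2, dies_per_wafer, wafer_cost in [
    ("Navi 10 (RX 5700 XT)", 251, 208, 10_000),
    ("TU102 (RTX 2080 Ti)",  775,  63,  8_000),
]:
    good = good_dies(area_mm2, dies_per_wafer)
    print(f"{name}: ~{good:.0f} good dies/wafer -> ~${wafer_cost / good:.0f} per die")
```

That prints roughly 166 good dies at ~$60 and 33 good dies at ~$245, within a few dollars of the figures above; the exact numbers shift with the yield model and edge losses you assume.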
 
Ty, you get it. Also, there is complexity in the lithography that determines rejection rates and such; TSMC has a calculator where, based on die size, you can estimate the split of "cherry" versus "average" dies and reject the faulty ones. Typically, companies use laser shearing to cut away parts of a die to help recoup some of that cost on lower-tier cards. Nvidia and AMD both rebrand chips made later, when costs are lower and the process has fewer rejects; sometimes they drop prices, other times they rebrand them with better performance tuning.

People have to realize GPU costs will only drop when monolithic dies stop being used; a chiplet or MCM configuration makes production much cheaper, and I wouldn't be surprised if, after the PCIe 5.0 generation is released and adopted, that design overtakes the monolithic approach. Otherwise, AMD makes the smart choice of cutting cost and complexity with simpler, smaller designs, while Nvidia goes complex and large.
I do find it weird, though, that AMD is finally separating its pro architecture from its gaming architecture (RDNA versus CDNA) while Nvidia is taking the opposite approach and merging the two. I wonder how these strategies will fare for both companies.
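A toy sketch of that chiplet cost argument, reusing the same assumed Murphy yield model as the die-cost post above. The 600 mm² monolithic die and the 4 × 150 mm² chiplets are invented sizes, and packaging/interconnect overhead, which is decidedly not free, is ignored.

```python
import math

def murphy_yield(area_mm2, d0=0.09):        # d0: assumed defects per cm^2
    a = area_mm2 / 100 * d0
    return ((1 - math.exp(-a)) / a) ** 2

# Silicon spent per working product. Defective chiplets are binned out
# individually before packaging, so one defect wastes 150 mm^2, not 600.
mono_mm2    = 600 / murphy_yield(600)
chiplet_mm2 = 4 * 150 / murphy_yield(150)
print(f"monolithic 600 mm^2: ~{mono_mm2:.0f} mm^2 of wafer per good product")
print(f"4 x 150 mm^2 chiplets: ~{chiplet_mm2:.0f} mm^2 of wafer per good product")
```

Roughly 30% less silicon per working product in this toy case; that is the cost lever MCM designs pull, with the packaging overhead then eating into the margin.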
 
To be fair, I already gave up; I switched to Nvidia back in April after seven years of Radeon.
AMD makes the drivers for my Asus ZenBook. I understand that you are a big fan of AMD and won't have them blamed for their own failings, but the two driver updates I have received for my 3500U came from AMD. You aren't seriously suggesting Asus should make them, are you? You're aware they can't; they don't have access to AMD's inner workings, and no one does. AMD blaming the laptop manufacturer is a very anti-consumer thing to do. But then, AMD isn't exactly consumer-friendly when it comes to Radeon.

This article confirms what most of us already knew.
Fewer and fewer people are paying money for AMD graphics cards. That's what happens when you have a bad product, and at the moment Radeon is a festering heap of garbage. APU sales are not relevant to the discrete market; they can't play modern games.
 

That is true. Still, Nvidia's GPU is way too huge to make cheaply, even with very good yields and reuse of partially working chips. A better-performing process also doesn't actually drive manufacturing costs down.


Monolithic dies are not the problem. The problem is Nvidia wasting transistors on not-so-useful features. Just looking at transistor count, the RTX 2080 Ti has 55% more transistors. Is it 55% faster too? Usually, no.

MCM designs (a chiplet is MCM too) have some problems when used for GPUs, and I highly doubt PCIe 5.0 will solve those. In that timeframe, however, there may be innovations, especially in packaging technology, that solve many of them.

AMD finally has enough money to do that. We actually don't know Nvidia's plans in detail when it comes to anything below the high-end cards.


You don't understand how OEM markets work. AMD is NOT responsible for making drivers for OEM devices/APUs/graphics cards; the OEM manufacturer is. It's all about money. AMD will gladly make driver updates for OEM devices, even once a month, IF the OEM manufacturer pays AMD for it. Guess how gladly they want to pay AMD for driver updates. They don't, so guess what follows from that.

So AMD is NOT responsible for making drivers for the 3500U; the OEM manufacturer is. You should be glad AMD offers two driver updates for free; AMD does not have to offer any. Blaming AMD for offering "only" two driver updates when AMD is not obliged to offer ANY?


It basically confirms that AMD's APUs are selling so well there is less need for discrete AMD cards. Again, not all discrete cards are faster than integrated solutions. So this article confirms nothing you imagine it does.
 
Transistor count alone really doesn't paint a useful picture. GP102 (as used in the 1080 Ti) has 48% more transistors than the 980 Ti's GPU and, depending on the game and resolution, is as much as 100% faster. The likes of the RTX 2060, which has 18% fewer transistors than the 980 Ti, is notably faster than it, and it has those not-so-useful features too.
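A quick ratio check on that point. The transistor counts below are the public GM200/GP102 figures (8.0 and 11.8 billion); the performance multiplier is the best-case ballpark claimed above, not a measurement.

```python
# Perf-per-transistor is not remotely constant across architectures.
gm200_t, gp102_t = 8.0e9, 11.8e9  # GTX 980 Ti vs GTX 1080 Ti transistor counts
perf_gain = 1.00                  # "as much as 100% faster" (best case, per above)

t_ratio = gp102_t / gm200_t
print(f"transistors: +{t_ratio - 1:.0%}")            # ~ +48%
print(f"performance: +{perf_gain:.0%} (best case)")
print(f"perf per transistor: x{(1 + perf_gain) / t_ratio:.2f}")
```

A +48% transistor budget buying up to +100% performance is the counterexample to reading transistor counts as a speed gauge.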
 

Nvidia doesn't make APUs yet; I wouldn't be surprised if that becomes the green hivemind's next prey.
 
I'm not an expert, but I think Nvidia's dominion over the dedicated GPU market has pushed prices up too much, to the extent that PC gaming is turning ridiculously expensive, especially in my country. Soon I'll have to drop PC gaming...

Well, the one good thing is that 2013/14 GPUs can still manage low/medium 1080p in modern games, and 2014/15 cards high to very high at 1080p/1440p. The need for more graphics power has decreased greatly. This isn't like the '90s or early 2000s.
 

In case you haven't noticed, it's because the focus is on high refresh rates now. 60 fps is no longer considered the benchmark standard; it's treated more like the minimum when GPUs are pushed. There are some outliers like Control or Microsoft Flight Simulator, but if your card can't push over 60 fps on high settings, it gets dismissed as trash, based on marketing fluff.

But the real issue is that GPU progress has slowed down quite a bit. It's been four years since the GTX 1080 launched, and its rough equivalents, the RX 5600 XT and the non-Super RTX 2060, aren't dirt cheap; in fact, the RTX 2060 is very highly priced for a so-called 60-class card. Compare the GTX 680 with the GTX 1060: the GTX 1060 is miles faster and was cheaper.
 

Except "most" gamers really don't care; any way you cut it, 4K and/or high refresh is a vocal minority. Definitely agree on card progress, though; if we could have custom drivers/Vulkan, even a GTX 580 3GB could hang at 1080p low.
 
Well, I hope my 1080 Ti survives this upcoming console-generation change for at least another three years, since buying something better could break the bank.
 

They need to take the top spot; it's the only way perceptions of the difference will change. Otherwise AMD will always be seen as the cheaper, weaker alternative.
 

I'm sure it will.
 

You know, if you're smart about when you buy and when you sell, you can continually upgrade without spending all that much.

For example, I paid about $1,400 for 1080 Tis in SLI, used them for a year, and sold them for enough to pocket $250 and get a 2080 Ti. I then used that for almost two years and just sold it recently for 88% of what I originally paid, so about $1,250 of that original $1,400 has come back to me, while I've also had the best gaming you could get for three years.

Now, for what will basically be $150-250 out of pocket (or about $1,550-1,650 total), I'll have the best gaming experience for another year or two, on top of the previous three.

~$1,600 for the best in gaming over five years is, to me, a great price to pay.
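For what it's worth, that cash flow roughly pencils out. A back-of-envelope sketch using the approximate figures above; the next card's price is my assumption, not a known number.

```python
initial_outlay = 1400  # two GTX 1080 Tis in SLI (figure from the post above)
cash_returned  = 1250  # total recovered across both resales (post's figure)
next_card      = 1500  # assumed price of the next flagship

out_of_pocket = next_card - cash_returned
total_spend   = initial_outlay + out_of_pocket
print(f"~${out_of_pocket} out of pocket now, ~${total_spend} total over ~5 years")
```

With a $1,400-1,500 flagship, that works out to $150-250 out of pocket and $1,550-1,650 all-in, matching the ranges above.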
 


The only thing I'm a fanboy of is the highest performance you can get. I'll go wherever that is, and since 2013 that's ONLY been Nvidia.

What AMD needs is an actual top-tier card that can and does beat Nvidia's best while it's actually relevant, not years after it's been available.

They'll continue to lose market share until they can prove to more than just budget gamers that they are a force to be reckoned with.

I rocked ATI/AMD Radeon from 1999 to 2012, but they haven't really competed since.

Owning the bottom or even the middle of the market does very little to change your image; people will still take cards from the "top brand" even when the underdog is actually the better choice in their price bracket.
 