Intel has lost all of its dedicated GPU market share

I'm not surprised that nVidia has 88% of the GPU market share; what I find surprising is that 12% of the market chooses the less capable cards that AMD puts out.
Price/performance is AMD's thing in the GPU market (the nVidia "DLSS" and CUDA tax is not attractive to everyone). My previous card was a 1080Ti, and a 970 before that; I now game on a 7800XT. Ray tracing is not a primary concern when you have 2000 games available and only a small percentage have any RT at all. The additional VRAM you get with AMD at each price point is also useful for playing with LLMs (my workstation has an nVidia RTX GPU, but with only 12 GB of VRAM you can't load an entire model into memory, so I end up using my personal machine). Without AMD the 1080Ti would never have seen the light of day (nVidia's panic over the forthcoming Vega 64 caused them to release that slightly cut-down Titan as the GOAT 1080Ti), and whilst the Vega 64 was a disappointment (it traded blows with the normal 1080), nVidia didn't know that at the time.
The surprise was intel getting market share to begin with, as their prime market would have been the budget-conscious gamer - but as most of them are still on PCIe 3.0 systems that don't support Resizable BAR, the discrete intel cards are a non-starter.
 
Battlemage is going to compete in low to mid-end market just fine, if price is right.

That's a big 'if'. It's always the question whether a company can set the right price, and I have my doubts that Intel could compete well. Most likely Battlemage will still be a bigger chip than AMD's performance-equivalent chip, so Intel will have lower margins to begin with. Drivers will continue to be a struggle. Intel will likely price low, but AMD has also stated its intention to win market share, which would imply more aggressive pricing than we've seen so far. So Intel has an uphill struggle ahead of it.

The surprise was intel getting market share to begin with
Thinking about it, my guess is that most sales were OEM systems. Some are shipping with Arc cards, and I'm sure Intel incentivised the OEMs to create them. They probably didn't sell well, so OEMs moved back to NVIDIA.
 
I don't care what JPR's numbers are, I'm still seeing them sell where I live and in fair numbers. Can't trust numbers that don't match what's happening in the real world. JPR's numbers can't be trusted.
 
I don't care what JPR's numbers are, I'm still seeing them sell where I live and in fair numbers. Can't trust numbers that don't match what's happening in the real world. JPR's numbers can't be trusted.
Where is that and what's the price of these cards vs. AMD ones? Because in most places the Arc cards just don't make sense in terms of performance for the price.
 
The surprise was intel getting market share to begin with, as their prime market would have been the budget-conscious gamer - but as most of them are still on PCIe 3.0 systems that don't support Resizable BAR, the discrete intel cards are a non-starter.
PCI-E 3 supports Resizable Bar / Smart Access Memory just fine?
I have three AMD B450 systems and all of them support Resizable Bar.
iirc it was part of the PCI-E specs for ages but just wasn't used until AMD finally did and NVIDIA soon followed. Intel does rely on it far more heavily as Arc performance with it disabled is abysmal.

I'm not surprised that nVidia has 88% of the GPU market share; what I find surprising is that 12% of the market chooses the less capable cards that AMD puts out.
I picked up the Radeon RX 6700 XT for the same price as the RTX 3060 12GB went for. The card is so much more powerful that even in some raytracing titles performance is about on par. Without Raytracing it's clearly the better card - my partner has the RTX 3060 which at the time of purchase was the better deal so performance is really easy to compare and I'm definitely not unhappy with my choice.

In the low-to-mid range their offerings are definitely not less capable, and if the price is right (not always the case) they're great buys imo.

No, AMD is hurt by their abysmal software. Their drivers are just as bad as Intel's and have NEVER not been that way. They absolutely ruined ATI when they bought it, and they've been limping it along ever since. I work in IT and have for over 20 years. 99.9% of the time, the issue is AMD's software/firmware. It is not stable. Even if they somehow perform better from time to time, you can't rely on their stability. Any of my customers that use AMD get switched over or they get dropped as a customer. I refuse to deal with AMD's incompetence.
Is this due to any software in particular that's a bad match? The ITers I know have never had a bias against Radeon unless CUDA was required. In my personal experience I have never had any major problems either (not more than with NVIDIA at least) and I've been using both sides for like 24 years now under both Linux and Windows. (Linux went from NVIDIA being the preferred choice to AMD being the preferred choice)

The only person I know that has sworn to never buy AMD again had a Vega card with the black screen of death and used the system professionally so downtime was a big no-no.
And yes the black screen of death was bad and took them too long to fix. But it's not like NVIDIA is free from screw ups like that either, hell - GPUs desoldering themselves (was it the 8000, 9000 series?) cost them Apple as a customer forever.

Definitely wouldn't put them in the same league as Intel where hundreds of games refused to launch or crashed when they first launched (much improved now, but they're still behind).

---

Oh I'll add to my previous post that NVIDIA didn't just play dirty in software with Gameworks, they also 'de-incentivized' their partners to sell Radeon products. XFX became a victim of this, they used to be NVIDIA only but then started selling Radeon products. NVIDIA wasn't having that and nowadays XFX is AMD exclusive. Bet their partners have some regrets playing along now that NVIDIA is putting in more efforts at selling their founders editions in greater numbers.

If that's too long ago for you younger readers: in 2018 NVIDIA had to pull their "NVIDIA Partner Program". There are tons of rumors about similar behavior if you listen to channels like Moore's Law is Dead etc. that supposedly have industry contacts (yes, rumors - but they seem credible).
 
It remains unclear whether Team Blue plans to step into the enthusiast tier…
I'd say it's pretty clear they can't. Maybe in five years, unless they're bought by Qualcomm in the meantime.
If it were easy, there would be more than two companies on the entire planet making the best graphics cards.
 
Nvidia has 88%
AMD has 12%
Intel has 0%
Intel sales are non-zero. Low, but more than zero. Amazon has one at #40 on its GPU sales list. The thing now is, what happens next? Do Intel GPU sales head closer to zero, or reverse course? We'll see in a few months.
 
Intel sales are non-zero. Low, but more than zero. Amazon has one at #40 on its GPU sales list.
It's probably rounded down, meaning it's under 0.5%. Which I think is consistent with #40. And the card at #40 is the A380, which isn't a gaming card at all. The top 100 doesn't include any other Arc card, so the A770, A750 and A580 just aren't selling, or are selling in very low quantities.

All in all, I think this is an indication that the stat is correct.
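The rounding point is simple arithmetic. A quick sketch with made-up raw figures (the 87.8/11.8/0.4 split is purely illustrative, not JPR's actual data) shows how a sub-0.5% Intel share ends up reported as 0%:

```python
# Hypothetical raw market-share percentages, for illustration only.
shares = {"NVIDIA": 87.8, "AMD": 11.8, "Intel": 0.4}

# Rounding to whole percentages reproduces the reported 88 / 12 / 0 split.
rounded = {vendor: round(pct) for vendor, pct in shares.items()}
print(rounded)  # {'NVIDIA': 88, 'AMD': 12, 'Intel': 0}
```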
 
Where is that and what's the price of these cards vs. AMD ones? Because in most places the Arc cards just don't make sense in terms of performance for the price.
Oh yeah, I'm going to dox myself, just for you...:rolleyes:

I'm in North America - that's as much info as I'll state.

All in all, I think this is an indication that the stat is correct.
You can believe that if you want. It's wrong, but it's a free world.
 
That's a big 'if'. It's always the question whether a company can set the right price, and I have my doubts that Intel could compete well. Most likely Battlemage will still be a bigger chip than AMD's performance-equivalent chip, so Intel will have lower margins to begin with. Drivers will continue to be a struggle. Intel will likely price low, but AMD has also stated its intention to win market share, which would imply more aggressive pricing than we've seen so far. So Intel has an uphill struggle ahead of it.


Thinking about it, my guess is that most sales were OEM systems. Some are shipping with Arc cards, and I'm sure Intel incentivised the OEMs to create them. They probably didn't sell well, so OEMs moved back to NVIDIA.

Intel's drivers are not perfect, but neither are AMD's. Nvidia is clearly better when looking at the overall picture. I came from a 6800XT to a 4090 and play all kinds of games, including betas, alphas, early access, less popular titles, and even do emulation; there's no doubt at all that Nvidia is just better in terms of drivers and support.

Intel Xe2 looks very good. Have a look at Lunar Lake performance and efficiency. I could easily see Intel competing in low to mid-end going forward. It's not like Radeon 8000 is going to be fast anyway. Intel's 1st-gen GPU launch did mature very well, all things considered.

AMD never did a 1st-gen GPU launch; they bought ATi instead. The easy way to do it.

Intel XeSS is better than FSR too - true for many games.
Intel does a lot of things right actually. And I would rather have 3 companies in the GPU market than 2. There is only one in high-end tho...

Poor drivers and worse features are the reason why AMD is cheaper than Nvidia to begin with. All Intel has to do is price their GPUs accordingly.

Intel might be hitting Radeon 8600/8700 performance level with Battlemage and then AMD will feel the heat for sure. Especially when Intel starts making their GPUs on 18A and below, cutting out the middleman (TSMC). They can price them very aggressively at this point.

AMD is stuck with TSMC, and TSMC has raised prices many times over the last 5 years.
 
I really think that this card (which performs similarly to the 1080Ti) would sell like hotcakes today if it just doubled the GDDR6 VRAM to 36GB. It would only cost an extra $30 (that was the price of 16GB of GDDR6 a year ago), which is a small price to pay for the ability to run inference on 70B LLMs at ~4 t/s at Q3. I guess the question on everyone's mind is why Intel made this rookie mistake?

It's a bit of a shame, but it looks like they will make the same mistake again with Battlemage. Instead of going for at least 256GB of VRAM, they're probably going for 48GB. I'm sure they think that's more than enough for graphics - the other cards have less - but these aren't just graphics cards anymore. They're AI inference cards now. You can play any game at 60Hz at 1080p with just a 3060, so why buy an Intel card instead of an Nvidia one, which also has CUDA? What's the competitive advantage?
A card at the 1080Ti level could never utilize 36GB of vram. Its output is insufficient to ever be able to use that much. And 36GB is an odd number to throw out there. Did you mean 32GB?
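For what it's worth, the capacity side of the 70B-at-Q3 claim can be sketched with back-of-the-envelope arithmetic. The ~3.5 bits/weight figure and the flat runtime overhead below are illustrative assumptions, not measurements:

```python
def vram_needed_gb(params_billions: float, bits_per_weight: float,
                   overhead_gb: float = 2.0) -> float:
    """Quantized weight size plus a flat allowance for KV cache and runtime buffers."""
    # params (1e9) * bits / 8 bits-per-byte / 1e9 bytes-per-GB simplifies to:
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb

# A 70B model at ~3.5 bits/weight (a typical Q3 quantization mix):
print(vram_needed_gb(70, 3.5))  # 32.625
```

Under those assumptions the total lands just above 32GB, which is where an odd capacity like 36GB would make a real difference while 24GB or 16GB clearly wouldn't fit the model at all.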
 
AMD and Intel should just focus on 100W gaming laptop dGPUs.
They sell much more than desktop ones.
If both companies did that there would be an oversupply of said chips. Neither company would do well. So, they historically supply a mix projected to be needed for various uses, including IOT, cars and other uses.
 
AMD and Intel should just focus on 100W gaming laptop dGPUs.
They sell much more than desktop ones.
A SOC like Apple's M-line / Qualcomm's Windows ambitions would be the better move imo.
Saves power and allows them to sideline NVIDIA entirely, as they're the odd one out without an x86 license. Plus they get to charge more, because more would be coming straight from them.

Avoiding competing with Nvidia directly is the smart thing to do.
 
A SOC like Apple's M-line / Qualcomm's Windows ambitions would be the better move imo.
Saves power and allows them to sideline NVIDIA entirely, as they're the odd one out without an x86 license. Plus they get to charge more, because more would be coming straight from them.

Avoiding competing with Nvidia directly is the smart thing to do.
A SOC with a big on-die iGPU like Apple Silicon Pro/Max will need separate manufacturing from the U-series APUs.
That will increase the cost, which reduces competitiveness.

Meanwhile an on-package iGPU like the upcoming Strix Halo can only use regular RAM,
hence won't be competitive against midrange mobile GeForce 60/70 with dedicated GDDR.

AMD/Intel can use the same midrange dGPU chip for both mobile and desktop dGPUs.
 
PCI-E 3 supports Resizable Bar / Smart Access Memory just fine?
I have three AMD B450 systems and all of them support Resizable Bar.
iirc it was part of the PCI-E specs for ages but just wasn't used until AMD finally did and NVIDIA soon followed. Intel does rely on it far more heavily as Arc performance with it disabled is abysmal.

No, it does not.

I own a PCIe 3.0 based system (Asus Maximus VI Hero Z87 Lynx Point / i7-4770K Haswell), and Resizable BAR / "Smart Access Memory" is not only NOT supported AT ALL, it can't even be enabled through a hack, as Intel went out of their way to make sure that these two features can NEVER be enabled no matter what.

Better be more careful next time with gross generalizations.
 
No, it does not.

I own a PCIe 3.0 based system (Asus Maximus VI Hero Z87 Lynx Point / i7-4770K Haswell), and Resizable BAR / "Smart Access Memory" is not only NOT supported AT ALL, it can't even be enabled through a hack, as Intel went out of their way to make sure that these two features can NEVER be enabled no matter what.

Better be more careful next time with gross generalizations.
fadingfool pointed at PCI-E 3 being the culprit. It is not.
I made no such claim as "All PCI-E 3.0 systems support Resizable BAR/SAM" so I'm not sure where you get gross generalizations from, I gave specific examples.

The limitation with your system is the CPU, not the PCI-E version. Resizable BAR requires:
  • A modern GPU [NVIDIA: RTX 3000 series (or later), AMD: RX 6000 series (or later), Intel - all cards)]
  • A modern CPU [*¹Intel 10th gen+, *²AMD Ryzen 5000 series or later]
  • A BIOS version that has the support for it and has it enabled

*¹ I've seen multiple people confirm that some Intel 9th Gen CPUs are capable of enabling it on the right motherboard/BIOS version
*² I've seen multiple people claim the Ryzen 3000 series works a-okay as well - might be down to the BIOS version/chipset.
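On Linux you can check whether a card currently has Resizable BAR active from `lspci -vv` output. A sketch assuming the usual "Physical Resizable BAR" capability wording (the `SAMPLE` text here is illustrative, not captured from a real card; exact wording can vary by lspci version):

```python
import re

# Illustrative excerpt of `lspci -vv` output for a GPU with ReBAR enabled.
SAMPLE = """\
Capabilities: [200 v1] Physical Resizable BAR
        BAR 0: current size: 16GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB 16GB
"""

def rebar_active(lspci_vv: str) -> bool:
    # With ReBAR enabled the BAR maps (roughly) the whole VRAM, so the current
    # size is reported in gigabytes; 256MB is the legacy default without ReBAR.
    m = re.search(r"current size:\s*(\d+)([MG])B", lspci_vv)
    return bool(m) and m.group(2) == "G"

print(rebar_active(SAMPLE))                        # True
print(rebar_active("BAR 0: current size: 256MB"))  # False
```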
 
a soc with big on die igpu like apple silicon pro/max will need separate manufacturing from U series apu.
that will increase the cost which reduces competitiveness.

meanwhile on package igpu like upcoming strix halo can only use regular ram,
hence wont be competitive againts midrange mobile geforce 60/70 with dedicated gddr.

amd/intel can use the same midrange dgpu chip for both mobile/desktop dgpu.
It will increase the cost of the soc itself, but it will also mean all those components are coming straight from Intel or AMD. No middleman that needs a margin and they can simply charge more.

It would indeed require separate manufacturing with increased costs but that could (easily?) be offset by charging more and having a more premium product. The way I see it they can segment the market into:
APU: Simple desktops and laptops
SOC: Premium mid to high end laptops and gaming handhelds
Dedicated graphics: Datacenter and mid to high end desktop (although they seem to have given up on high end desktop)

A SOC should:
* Greatly reduce power draw
** This completely counters Qualcomm's one advantage in the laptop market (better to squash them whilst they're still small in this market)
** Allow for much longer battery life compared to anything with NVIDIA graphics, as NVIDIA can't do a SOC (at least not x86)
** Allow for much higher performance due to higher bandwidth

In AMD's case I'd also be curious to see if they could do something smart with X3D memory / Infinity Cache.
 