At least one analyst believes Intel should shut down or sell off its GPU division

Shawn Knight

TL;DR: Intel currently finds itself between a rock and a hard place in the GPU market, and one well-known analyst believes it might be time for the chipmaker to cut its losses and move on.

Jon Peddie of Jon Peddie Research laid out the scenario in a recent editorial highlighting Intel CEO Pat Gelsinger's willingness to jettison projects that aren't working out. The chipmaker announced plans in late 2020 to sell its NAND business to SK Hynix for $9 billion, and it no longer operates its aerial drone business. Last month, Intel said it would be shutting down its Optane memory business.

Since rejoining Intel, Gelsinger has scrapped six business units, saving the company around $1.5 billion in costs and losses in the process.

Intel's dedicated GPU ambitions got off to a hot start with the surprise hiring of Raja Koduri as the division's lead architect. Intel landed other big names including former Nvidia engineer Tom Petersen, but thus far, there's been very little to show for it.

As Peddie highlights, Intel has reported a $2.1 billion loss since the first quarter of 2021, when it started sharing information on its dedicated GPU group. In fact, the research firm believes Intel has invested more than that, with the actual figure perhaps closer to $3.5 billion.

Intel's dGPU intentions were noble but according to Peddie, the results thus far in the consumer space "have been an embarrassment."

Our own Steven Walton recently got his hands on an Intel Arc A380 – the company's most entry-level offering – imported from China. Predictably, the card didn't blow anyone's socks off, and Steven recommends against buying an Intel Arc GPU in its current state.

Frankly, it would be extremely disappointing if Intel didn't push forward. Intel's first generation of discrete GPUs was never going to be profitable, the execution was never going to be flawless or even smooth, and it was always going to take 3 or 4 generations before they started to get on their feet. We'd assumed Intel knew this and had budgeted for it, but we'll soon see if this goes one way or the other. -- Steven Walton

Most believed Intel's first GPUs would have driver optimization issues, and some of that has indeed materialized. More recently, however, we've been hearing that the Arc series may be suffering from fundamental hardware issues that can't be fixed by tweaking drivers.

Falling graphics card prices and improved availability from AMD and Nvidia aren't going to help Intel's efforts, either. That trend is likely to continue as long as crypto miners keep selling off their equipment amid local crackdowns and cooling coin values.

As for Peddie, he believes Intel should "probably" find a partner and sell off the group, but gives it a 50-50 shot of going either way at this point.


 
Intel will never prioritize performance on graphics chipsets because that means higher-end components and therefore higher cost, and Intel is all about the integrated sector, where value is king. I still think their discrete GPUs were meant to be nothing but cheaper alternatives for crypto mining... if you really listened to Intel's talk, the innuendo was strong. Too bad crypto went through the floor just before their first-gen products were ready to ramp up production. It's even worse that, despite being a chipmaker themselves, Intel is nearly as hobbled by shortages as everyone else in tech. In hindsight, selling off all those subsidiaries making the cheap but crucial support chipsets seems like a bad move, but somehow, despite lots of red flags, nobody seemed to predict the chip crunch.
 
Intel should put 2 and 2 together and realize their GPUs will never be as good as the competition's, while their CPUs still have a chance to become as bad as their GPUs if they persist in this vein.

 
They should not quit. I would be embarrassed to quit and tell the world I can't compete with AMD. That won't look good. They should double down and get their cards up to par for the second generation. Just look at it as a long-term investment.

Though I don't know where the market for an A380 is. If you don't play games then integrated graphics are totally fine. Nobody needs a discrete card that isn't much better than what they already have in the CPU and can't really run games well. I don't know why they are making this card. They should just focus on gaming cards.
 
There simply hasn't been enough time for the seeds they've planted to bear fruit yet. Absolutely shut it down if it doesn't stand a chance, but it's simply too early to tell, I reckon. There's still so much scope for them to improve, and so much internal expertise to build on (plus maybe a few more external hires if they want to poach the competition).

I don't really give a sh*t whether they succeed or not, but I would like more competition in the space.
 
If there's an Intel open mouth, you can be sure that Intel will stick their foot squarely in it.
 
I wonder if this was ever a serious attempt at a dGPU, or whether they just wanted to exploit the perceived lack of supply to compensate for a poor-quality product.
 
Intel will never prioritize performance on graphics chipsets because that means higher-end components and therefore higher cost, and Intel is all about the integrated sector, where value is king. I still think their discrete GPUs were meant to be nothing but cheaper alternatives for crypto mining... if you really listened to Intel's talk, the innuendo was strong. Too bad crypto went through the floor just before their first-gen products were ready to ramp up production. It's even worse that, despite being a chipmaker themselves, Intel is nearly as hobbled by shortages as everyone else in tech. In hindsight, selling off all those subsidiaries making the cheap but crucial support chipsets seems like a bad move, but somehow, despite lots of red flags, nobody seemed to predict the chip crunch.
You conveniently forget that higher-end products always carry the higher margins.
 
When Jim Keller came to AMD to help with Zen, he told them to literally dump everything they were working on, and they survived – albeit after a five-year hiatus.

Polaris was rebranded for 3 years.

Intel survived on 14nm longer than it should have.

Intel still has at least another year before I count them out. If you really want competition, and understood this most likely wasn't going to be a slam dunk, you shouldn't be counting them out this soon either.
 
Hope they continue. Yeah, their cards will probably NEVER approach those of Nvidia or AMD, but there are people like me who aren't the "gamer/crypto/benchmark" type and just need a good-quality card for their computer. Not for high-end games, but for photography, watching videos, typical use – people who don't want just a VERY low-end built-in graphics chip.
If they do that, and continue to work on it and make it better, they can profit off of it and build for the future.
 
I've never been an Intel fan; however, this is Intel's first (serious) attempt at the dGPU market. I've never tried an Arc GPU, but from the reviews it's not that bad for a first effort. They should continue to improve the drivers, and I know they will get it right. The market is there for a third player. I see a lot of potential for Arc GPUs within Intel's portfolio in the long run (once they get it right).
 
Thing is, with stuff like NAND you have strong competition across several companies, and aerial drones aren't really an Intel thing. There are like five other companies in the NAND market, and others could still move in. Someone in China could bolt together some drones and change that segment quickly – they already did. The discrete GPU market? It's like Mount Everest. Only massive chip designers could possibly do it, or even try to do it from scratch. The investment, accumulated knowledge, and technology portfolio required to make it work are both extremely wide and deep.

That's why Intel must continue. For the foreseeable future the competition will be limited and known, and the profits are potentially still there, with a long future ahead in AI acceleration, VR, and everything beyond just games.

They can take a few hits early on; anyone would have to. It was not a secret. Manage expectations, keep moving the division forward, and see where it is in three years.
 
If Intel completely separated this division and ran it as a standalone company without upper-management interference, I think they could be much more competitive and could give the other Big Two a run for their money... but over the years Intel has proven much too anal to make such a bold move, so it looks like we're stuck with them until they decide to get out of that market.
 
Thing is, with stuff like NAND you have strong competition across several companies, and aerial drones aren't really an Intel thing. There are like five other companies in the NAND market, and others could still move in. Someone in China could bolt together some drones and change that segment quickly – they already did. The discrete GPU market? It's like Mount Everest. Only massive chip designers could possibly do it, or even try to do it from scratch. The investment, accumulated knowledge, and technology portfolio required to make it work are both extremely wide and deep.

That's why Intel must continue. For the foreseeable future the competition will be limited and known, and the profits are potentially still there, with a long future ahead in AI acceleration, VR, and everything beyond just games.

They can take a few hits early on; anyone would have to. It was not a secret. Manage expectations, keep moving the division forward, and see where it is in three years.
I still think people are surprised that their first attempt wasn't all that great. The fact that they have a product AT ALL is amazing.
 
There will always be naysayers. I think Intel knows what's best for them. I feel they have gone a long way to deliver their first dGPU (even though it's mostly benchmarks for now). There are merits to having an Intel CPU and GPU in the same machine, as hardware integration will be better.
 
I hate Intel's practices and won't buy their products for a long, long time. Still, I think it would be very bad for the market to have AMD and Nvidia alone. Any honest person would acknowledge that Intel was never going to succeed with its first generation; stopping at the first try would be the most stupid thing in the world. It could take them three generations to be able to really compete. Intel has the money, and it needs its own CPU/GPU combo to compete with AMD – and with Nvidia the day Nvidia has its own CPUs.
 
They shouldn't give up so easily. How could they succeed if they give up after just the first try? That's exactly what the other companies want: for Intel to fail and quit right away. They don't want anyone else getting a chunk of the pie. They're as greedy as the others.
 
How ridiculous is this? I'd be very surprised if Intel didn't expect their first generation of GPUs to be less than stellar – especially for gaming. GPU drivers are hugely complex, and it will take a few generations to really nail them. Hardware-wise they seem to be in the right ballpark, but lessons need to be learned from the software/driver launch. Surely they expected this, didn't they? This has to be a long-term project. Getting GPUs working right is important for Intel not just for gaming, but because GPUs are becoming an ever more important part of all sorts of workloads: ML, the dreaded crypto, particle simulations, and so on.
 
I'm not too fond of Intel, but it would be best for everyone if they tried harder next generation. Everything I've heard about Intel's GPUs has made me think the effort is half-assed.
 
Quitting now would be dumb. They JUST got their product out there. It takes years for a GPU architecture to come to fruition, and that's at companies with thousands of patents and experience going back decades.

Intel should give it a few more attempts before throwing in the towel. No duh it's lost money; it just came out!
I still think people are surprised that their first attempt wasn't all that great. The fact that they have a product AT ALL is amazing.
People assume that money = speed, and that if you don't succeed immediately it's a total failure. Extreme short-term thinking.
They should not quit. I would be embarrassed to quit and tell the world I can't compete with AMD. That won't look good. They should double down and get their cards up to par for the second generation. Just look at it as a long-term investment.

Though I don't know where the market for an A380 is. If you don't play games then integrated graphics are totally fine. Nobody needs a discrete card that isn't much better than what they already have in the CPU and can't really run games well. I don't know why they are making this card. They should just focus on gaming cards.
Believe it or not, there is quite a large market between "run PS2 games at 720p potato quality" and "ZOMG $5,000 settings-porn PC". A card like the A380 exists in the same market as the 6400/6500, the 1650, the RTX 3050, etc. These cards sell in high volume. Believe it or not, even the garbage 6400 often has 1% lows equivalent to or slightly higher than the average FPS the integrated Vega 8 manages, and for budget builders there are millions of PCs out there with 8th-11th gen Intel or 1st-3rd gen Ryzen chips, with even weaker iGPUs, that could use a nice boost.

The A380 only pulls 75W and, on a hardware level, is 40-50% faster than the 6400, putting it a good step above the 1650 in terms of performance. The fact that this rarely translates to gameplay screams driver issues; if it worked right, it would be a major shot in the arm for sub-75W gaming, where many get their start. We've been stuck on subpar 4GB GPUs for too long in that space. Crucially, the A380 can be made low-profile for OEM PC upgrades, where the current "best" option is the crippled 6400 or the three-year-old 1650, if you can find it.
 
Intel getting stronger in the dGPU business has several advantages for Intel and for consumers:

- competition against AMD and Nvidia, which heats up the fight for better and cheaper products

- Intel mirroring its dGPU tech onto iGPUs, which gives us much better performance from mobile chips

That alone is a win-win situation for Intel. If they stay with just CPUs, they will end up in the low-end segment. No one needs a strong CPU with a weak GPU, as nowadays most things need to be GPU-accelerated.
 
It sure amuses me that all these "experts" or "analysts" make all these predictions. Intel has to start somewhere, just like Nvidia and AMD did with graphics chips. Let's see how things play out, and where their second-generation Arc chips and onward go. I wish Intel nothing but success.
 
It sure amuses me that all these "experts" or "analysts" make all these predictions. Intel has to start somewhere,
Unknown small companies have to start somewhere;
Intel is a huge, rich company that has been making iGPUs for many, many years. If Intel had done its homework, it could have derived dGPUs from its iGPUs, with decent drivers, right on the first try. Instead, Intel slept under a tree for years and got slapped all at once by AMD (which builds its iGPUs, complete with ray tracing, out of its dGPU designs, alongside great CPUs) and by Apple (which only started designing its own CPUs and GPUs decades later), not to mention that mobile CPUs and OSes started supporting technologies like hardware video codecs much sooner than Intel or Microsoft did.

To sum up: Intel and Microsoft slept under a tree for many years and watched competing companies take over; now they're just pulling strings with partners to manage the sinking ship...
 