Intel says integrating RAM into Lunar Lake SoC was a mistake, might abandon desktop GPUs again

Why? Financial hardships. Difficult decisions are coming soon at Intel.
Nvidia is making more money than Intel without making x86 CPUs. The GPU market is absolutely not something Intel should leave if they plan on making money again.
Because it is a money pit, it is a luxury that Intel cannot afford anymore.
CPUs are also a "money pit", and that's clearly not working for them either. The GPU market is huge and growing; it's not a "luxury" to be in. If Intel leaves it, that's a massive market they're just handing to the competition.
Desktop GPUs have lower profit margins than desktop and laptop CPUs.
Moreover, Intel Arc has low market share.
4090s go for over £1.5k, while the most expensive desktop CPU Intel sells is about £500. Yes, I'm aware most people buy a mid-range GPU; there are still pretty high margins in it. My point is that if Intel really put some effort in, they could make more money in the GPU market than in the CPU market. Just look at Nvidia, making far more money than Intel while selling no CPUs at all.
In theory gaming GPUs are a large market, but in practice it will be hard to win any part of it unless Intel is willing to undercut its competitors in a major way. Consumer GPUs have lower margins than other products. They are big chips, and if Intel is behind in performance per mm² (as was the case with Alchemist), the margins are even lower.

Until its GPUs are successful, Intel will lose money on them, and if it takes several generations to get to break even, then it will be quite a few years of losses. Intel is already losing money. Adding more loss isn't what it wants right now.
But they absolutely could catch up in performance over the next generation or two. Their GPUs are actually amazing in certain respects: the ray-tracing performance is really good (better than AMD's), XeSS is better than FSR, and this was only Intel's first try.

Some of Intel's biggest issues have come from their inability to look further than their own nose. The GPU market isn't going anywhere and is only growing. They openly admit they regret not buying Nvidia many moons ago. They've spent the money and R&D on producing dedicated GPUs, better drivers, etc., and all that money and effort goes out the window because they've reported their first financial loss in nearly 30 years? I'd argue it's a bigger loss to throw out everything they've already worked for and achieved just to save a little bit of money in the short term.
 
I switched to console this year, but let's HOPE Intel will not kill their GPU department. More competition is better for gamers, since Nvidia is focusing mostly on AI chips and AMD is the only good option left for normal gamers.
 
No. It's just a rumor based on speculation. It's all a bunch of "what ifs". There's nothing concrete about it; it's the true example of bait, and you're hooked hard.

At one point the writer says the GPU division is on the chopping block, with no quote or anything to back that up. Then they say Battlemage is coming in late 2024, before Nvidia and AMD release anything, and that its success or failure "might" decide the fate of Celestial.
It's really just "what if all this different stuff happens then Intel might do this or that".
No, if you haven't noticed, Intel got rid of all the divisions and technologies that were making a loss. It's not sustainable under current conditions to keep throwing billions down the drain.
 
I for one like the idea of having inexpensive laptops with integrated graphics powerful enough for 4K gaming at 60 fps. From what I've heard, it's possible to get there with AI upscaling (DLSS-style tech) rather than brute-force GPU hardware. That would make tablets extremely powerful.
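To put rough numbers on why upscaling gets you there without brute force, here's a back-of-the-envelope sketch (the 1080p internal resolution and 2x-per-axis upscale are illustrative assumptions, not anything Intel has specified):

```python
# Shading cost scales roughly with the number of pixels actually rendered,
# so rendering internally at 1080p and AI-upscaling to 4K cuts the GPU's
# per-frame shading work by about 4x (minus the upscaler's own overhead).
native_4k = 3840 * 2160   # ~8.3 million output pixels per frame
internal  = 1920 * 1080   # assumed 1080p internal render, upscaled 2x per axis

savings = native_4k / internal
print(f"4K output pixels: {native_4k:,}, internal pixels: {internal:,}")
print(f"Roughly {savings:.0f}x less shading work per frame before upscaling overhead")
```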
 
How many people would really miss Intel's Arc GPUs if they disappeared tonight?
Nvidia is so far ahead of the competition that even their lowest 5000-series GPU will be way ahead of anything Intel is offering, while the 5090 will be way ahead of anything from anyone else.

I personally liked the idea of Intel going in the direction of bringing powerful integrated graphics to the lower-end market. Make every laptop capable of 4K gaming at 60 fps, even if you need AI to do it.
 

If Intel gives up on discrete GPUs, they give up on having a long-term future, as this technology is not just for building triangle renderers.

That isn't true: Nvidia telling us they aren't a graphics company anymore is all the proof you need that discrete graphics aren't the same profit center they once were relative to other areas.

Intel isn't saying they're giving up on graphics, just giving up on a market where, after a couple of years, they have about zero percent market share.

As others have said, Intel simply cannot afford the investment resources discrete GPUs require at this point and they have abandoned discrete graphics before.

They have to fix their core business areas and they need to have a frank assessment as to what has gone wrong. It’s never been clear beyond bafflegab answers as to what has gone wrong in so many areas all around the same period of time.

It's simply shocking
 
Obviously incorrect, since Intel is the one with the largest market share in every market where x86 is used.
Sure, they've been loosing market share to AMD, but even at this rate it would take several years for AMD to gain a majority of the market.
Well, by staying relevant I meant they need to compete, and what you said just proves it: they have been losing (btw, one "o" in losing, bro) market share, and the rate has increased over the last few years. They're still dominant, but my point was not about who has more market share; it's that with less revenue they can't afford to throw money into discrete GPUs, where they are at least 3-4 years behind the competition in development.
 
Intel's valued at $100bn at the current share price and that is becoming more of a bargain for other big tech to take them over. Wonder how the market would react if Microsoft, Apple or Amazon acquired Intel.
 
Well, by staying relevant I meant they need to compete, and what you said just proves it: they have been losing (btw, one "o" in losing, bro) market share, and the rate has increased over the last few years. They're still dominant, but my point was not about who has more market share; it's that with less revenue they can't afford to throw money into discrete GPUs, where they are at least 3-4 years behind the competition in development.
All it takes (a big deal, of course) is Intel coming out with a product (mainly a data center product) that is ahead of AMD for a couple of gens; then the tables turn and we have Intel ahead and AMD with a smaller and decreasing share, which is a worse situation than now.
My point is that, in the end, a smaller Intel and a more balanced market is good for the consumer.
 
NOPE, putting RAM in the SoC is not a mistake; in fact it's a good move because it reduces power consumption. Components outside the SoC need interface logic running at voltages usually at least double those of internal components. Intel is only being modest, admitting to something it doesn't need to, because the market isn't yet buying in the numbers one would have expected. Until the elections are over we will see some imbalance; after that, fingers crossed, the money under the CHIPS Act will be released once deliveries start.
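A rough sketch of the voltage point, using the usual dynamic-power relation P ≈ C·V²·f; the capacitance, voltage, and frequency figures below are made-up illustrative values, not actual Lunar Lake or LPDDR5X specs:

```python
# Dynamic power of a signal line scales with capacitance * voltage^2 * frequency.
# Off-package DRAM has to drive long board traces (more capacitance) at a higher
# I/O voltage; RAM on the SoC package shortens the wires and lowers the swing.
def dynamic_power_watts(c_farads, v_volts, f_hz):
    return c_farads * v_volts**2 * f_hz

f = 4e9  # illustrative toggle rate

off_package = dynamic_power_watts(c_farads=2e-12, v_volts=1.1, f_hz=f)
on_package  = dynamic_power_watts(c_farads=1e-12, v_volts=0.5, f_hz=f)

print(f"off-package: ~{off_package*1e3:.1f} mW per line")
print(f"on-package:  ~{on_package*1e3:.1f} mW per line "
      f"(~{off_package/on_package:.0f}x less)")
```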
 
Intel has already lost a lot of money on the GPU department. The only way to make it back is to keep pushing.
That's the sunk cost fallacy. The decision should be made based on future costs and revenue, not what happened until now.

Considering that consumer GPUs are the least profitable market of those Intel has (consumer and server CPUs, server AI accelerators), it's a natural one to cut if Intel needs to cut something.
 
RIP Arc Alchemist, you came in hot, and left… very quietly. If Battlemage doesn’t deliver, Intel might just become a permanent spectator in the GPU war (if that) between Team Red and Team Green.
 
No, if you haven't noticed, Intel got rid of all the divisions and technologies that were making a loss. It's not sustainable under current conditions to keep throwing billions down the drain.
No? The GPU division doesn't exist anymore? Where'd you hear that? Not from Intel? Right, because it was a speculative statement you made based on "what ifs".
"Intel got rid of all the divisions and technologies": what exactly have they gotten rid of if they're coming out with a second generation of GPUs? The only thing that comes to mind is Mobileye, the self-driving car technology division, which Intel still owns over 95% of and which is still operational.

Speculating is fine; spreading speculation as a matter of fact is careless and ignorant.
 
Intel's valued at $100bn at the current share price and that is becoming more of a bargain for other big tech to take them over. Wonder how the market would react if Microsoft, Apple or Amazon acquired Intel.

Why would they consider taking over a failing company with significant problems? Their businesses are operating perfectly well without Intel.
 
So Intel still needs a good integrated graphics solution. Iris Xe was okay, but it could have been a lot better. I think Intel might try to go the way AMD is going with its Z-series chips. Intel could make very compelling SoCs with what they've done with Arc so far.

The other thing is that most people just don't really care that much about chasing high-end performance in competitive games. There is an outspoken part of the gaming community that thinks everyone needs a 4090 and has to play at 4K 240fps with ray tracing. These people are wrong.

Most people want to relax for a few hours after work with a game that comforts them. I'm not going to tell people not to be interested in competitive gaming; if that's what they like to do, great, but they are a minority of gamers. My wife likes to play The Sims on her laptop a few hours a week; it has an i5-11300H with Intel integrated graphics, and the game runs fine.

Since we are part of a community of tech enthusiasts, it's easy to forget those people exist, but there are plenty of people who play games and don't even identify as gamers. There are a lot of people who have been playing the same 2-3 games for years who probably don't even have Steam installed.

The reason I'm trying to drive this point home so hard is that the people who like playing those same 2-3 games probably notice that Intel badge on their laptop without having any idea what it means. But when they need to buy a new laptop, they'll think, "Hey, I had a great ownership experience with my last computer and it had *insert brand* sticker on it, I should buy that one." Intel is looking to create brand recognition and brand loyalty.

Moving Arc from a dedicated GPU to an integrated solution, especially with AMD having a difficult time supplying enough mobile parts, is a good move. It lets Intel keep developing the architecture without having to completely write everything off as a loss. On top of that, integrated graphics have gotten really good these last few years; Intel's Iris Xe and AMD's 680M/780M chips are VERY capable. With the price of graphics cards these last few generations, you have to REALLY want to be gaming to pick a dGPU over anything with a competent iGPU.

I've already decided my next laptop isn't going to have a dGPU. AMD's 780M or 890M would be fantastic for anything I'd do away from my PC, even on a business trip for a week or two. Intel can already match that level of performance with Arc, and they've made some really significant improvements to Arc's drivers.

Saying all that, I doubt Intel will drop desktop GPUs. This sounds like some short-sighted idea their CEO had after spending the last several years running the company into the ground. There is absolutely no reason Intel should be in this position. They are full of talented people, and I think they'll come back from the last few years just fine if they get rid of Pat Gelsinger.
 
It's an arms race between high end hardware and aggressive game software developers with normal gamers as collateral damage.
 
It's an arms race between high end hardware and aggressive game software developers with normal gamers as collateral damage.
The high-end market doesn't exist the way the comments section would have you believe. Do you know why there are only 3 games that actually use real ray tracing? Because ray-tracing hardware is so expensive that it doesn't make sense for companies to spend money developing their games around it. High-end GPUs make up a fraction of the market; who has the best GPU should be irrelevant, but fanbois and gamers think you need to buy the midrange card from whoever has the winning flagship.

The fanbois created the GPU mess we're in and now the rest of us have to deal with it.
 
Yup, I would *NEVER* buy a system with soldered RAM (if it were something like a Chromebook, which really is going to be used with ChromeOS and not Linux... well, fine... but otherwise, no). Sure, 32GB sounds OK -- but not having the option to expand? Secondly, too many systems are like "here's the 4GB model. Oh, you want 16 or 32GB? That's the one that's $1,000 more expensive." I.e., they won't let you order a cheap one with more RAM in the sockets. So I ordered one with 4GB and dropped a 16GB stick in myself. With it soldered on, you don't have that option.

Also, consider that this is shared video RAM. So you now have systems with quite capable integrated GPUs (really, I have an 11th-gen Intel CPU -- Tiger Lake -- and at least in Linux the drivers are VERY good). But that means when your game is taking RAM *and* VRAM out of your system RAM, even 20GB is tight. I would not want to buy a 16GB system and then find out I can't bump it up to 32GB later in a case like this.
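To make the shared-memory squeeze concrete, here's a quick illustrative tally; the figures are assumptions for the sake of the arithmetic, not measurements (actual usage depends on the OS, the game, and the driver):

```python
# With an iGPU, "VRAM" is just a slice of system RAM, so the game's working
# set and its video memory come out of the same soldered pool.
total_ram_gb   = 16
os_and_apps_gb = 4   # OS, browser, background apps (assumed)
game_ram_gb    = 8   # the game's own working set (assumed)
game_vram_gb   = 4   # textures/framebuffers carved out of system RAM (assumed)

headroom = total_ram_gb - (os_and_apps_gb + game_ram_gb + game_vram_gb)
print(f"Headroom on a {total_ram_gb}GB soldered system: {headroom}GB")  # prints 0GB
```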
 
Why would they consider taking over a failing company with significant problems? Their businesses are operating perfectly well without Intel.
Because it's a good deal, highly undervalued at the current stock price. You get a lot for that money. Intel is not a dead company.
 
I won't claim they soldered RAM into the CPU package purely for profit; it pays to have RAM that close to the CPU. I'm just saying it's very possible they don't have much to show right now that it's a great improvement in speed. It seems like, in times of trouble, they are just looking for any way to make more money. If that RAM goes bad, so is the CPU, and all the water and electricity spent on making the CPU chip alone is wasted. It's a waste, just like with any other company that solders everything into large blocks, where you lose one tiny capacitor and everything else follows.
It is more convenient for them this way, but I believe there is still a way to create things that can be fixed or modified using easily swappable parts. It is pure hypocrisy when they boast about how many solar panels they've installed on their factories while producing many tons of unneeded waste.
 
Do low-power mobile users even need more than 32GB?! Not many people would upgrade if they already have 32GB.
RAM can go bad fast, then you can throw your PC in the trash can if it's soldered.

I would also add that this chip is not for "low-power mobile users". The lowest-TDP chip in the line is 17W TDP with a 37W boost, and the others are 30W TDP with a 37W boost. It's an 8-core chip (4 P-cores and 4 E-cores).

If you want a low-power system, you'd get something with a Celeron N in it (the Celeron Ns of years past were, IMHO, terrible... but I was surprised to see the 5000-series Celeron Ns get the same or slightly higher performance per clock as a Coffee Lake (8th-gen) desktop CPU, so not too shabby).

Even there, Intel specs them as topping out at 16GB, but I've seen vendors now shipping Celeron N5100s with 32GB. The reality is that some software is RAM-intensive without being excessively CPU-intensive; maybe the user wants to open some memory-hungry virtual machines, for instance.

On the flip side, this takes away the option (at the low end) of some vendor equipping a system with 8GB to save a few bucks. I can't speak for Windows, but Ubuntu runs perfectly fine in 8GB for typical desktop usage. To be honest, as cheap as RAM is, I'm not going to shed a tear over this... but I'm sure a few of the bean counters at the system vendors did.
 