> Why? Financial hardships. Difficult decisions are coming soon at Intel.
Nvidia are making more money than Intel without making x86 CPUs at all. The GPU market is absolutely not something Intel should leave if they plan on making money again.
> Because it is a money pit, it is a luxury that Intel cannot afford anymore.
CPUs are also a "money pit", and that's clearly not working for them either. The GPU market is huge and growing; it's not a "luxury" to be in. If Intel leave it, that's a massive market they're just leaving to the competition.
> Desktop GPUs have lower profit margins compared to desktop and laptop CPUs. Moreover, Intel Arc has a low market share.
4090s go for over £1.5k, while the most expensive desktop CPU Intel does is £500. Yes, I'm aware most people buy a mid-range GPU; there are still pretty high margins in it. My point, though, is that if Intel really put some effort in, they could make more money in the GPU market than they could in the CPU market. Just look at Nvidia, making far more money than Intel whilst selling no CPUs at all.
> In theory gaming GPUs are a large market, but in practice it will be hard to win any part of it unless Intel is willing to undercut its competitors in a major way. Consumer GPUs have lower margins than other products. They are big chips, and if Intel is behind in performance per mm² (as was the case with Alchemist), then the margins are even lower.
> Until its GPUs are successful, Intel will lose money on them, and if it takes several generations to get to break-even, then it will be quite a few years of losses. Intel is already losing money. Adding more loss isn't what it wants right now.
But they absolutely could catch up in performance over the next generation or two. Their GPUs are actually amazing in certain aspects: the ray-tracing performance is really good, better than AMD's first generation, XeSS is better than FSR, and these are Intel's first try.
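The margin argument quoted above can be made concrete with some back-of-the-envelope arithmetic. The sketch below is purely illustrative and assumes made-up numbers (wafer cost, die areas, board costs, prices); none of them come from the thread or from Intel. It only shows the mechanism: a bigger die yields fewer chips per wafer, so the silicon cost per unit rises, and if the finished card also has to be priced against faster competition, the gross margin gets squeezed from both ends.

```python
import math

# All figures below are made-up placeholders, chosen only to illustrate the
# shape of the argument (big die + price pressure -> thin margin); they are
# not real Intel, Nvidia, or foundry numbers.
WAFER_COST = 10_000        # assumed cost of one finished 300 mm wafer, in dollars
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2: float) -> int:
    """Very rough gross dies per wafer: wafer area / die area,
    ignoring edge loss and defect yield."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def gross_margin(selling_price: float, die_area_mm2: float, other_costs: float) -> float:
    """Gross margin fraction for one chip/card: (price - cost) / price.
    other_costs stands in for memory, board, cooler, packaging, etc."""
    die_cost = WAFER_COST / dies_per_wafer(die_area_mm2)
    return (selling_price - (die_cost + other_costs)) / selling_price

# A smallish CPU die sold at a healthy price vs. a big GPU die that has to be
# priced low because it trails the competition in performance per mm².
print(f"Small CPU-style die: {gross_margin(450, 200, 30):.0%} gross margin")
print(f"Big GPU-style die:   {gross_margin(250, 400, 120):.0%} gross margin")
```

With these placeholder numbers the small die clears roughly 87% gross margin while the big, price-constrained die is left with under 30%, which is the shape of the "big chips, behind in performance per mm², even lower margins" point, whatever the real figures are.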
> No. It's just a rumor based on speculation. It's all a bunch of "what ifs". There's nothing concrete about it: a true example of bait, and you're hooked hard.
> At one point the writer says the GPU division is on the chopping block, with no quote or anything to back that up. Then they say Battlemage is coming in late 2024, before Nvidia and AMD release anything, and that its success or failure "might" decide the fate of Celestial.
> It's really just "what if all this different stuff happens, then Intel might do this or that".
No, if you haven't noticed, Intel got rid of all the divisions and technologies that were making a loss. It's not sustainable under current conditions to keep throwing billions down the drain.
If Intel give up on discrete GPUs, they give up on having a long-term future, as this technology is not just for building triangle renderers.
> Obviously incorrect, since Intel is the one with the largest market share in all markets where they use x86. Sure, they been loosing market share to AMD, but even at this rate it will take several years before AMD could gain the majority of the market share. All it takes (a big deal, of course) is Intel coming out with a product (mainly a DC product) that is ahead of AMD for a couple of generations; then the table turns and we have Intel ahead and AMD with a smaller and decreasing share, which is a worse situation than now.
Well, by staying relevant I meant they need to compete, and what you said just proves it: they have been losing (btw, one "o" in losing, bro) market share, and the rate has increased over the last few years. They're still dominant, but my point was not about who has more market share; it's that with lower revenue they can't afford to throw money into discrete GPUs, where they are at least 3-4 years behind the competition in development.
It's just click bait.
> Intel has lost a lot of money on the GPU department already. The only way to make it back is to keep pushing.
That's the sunk cost fallacy. The decision should be made based on future costs and revenue, not on what has happened so far.
> No, if you haven't noticed, Intel got rid of all the divisions and technologies that were making a loss. It's not sustainable under current conditions to keep throwing billions down the drain.
No? The GPU division doesn't exist anymore? Where'd you hear that? Not from Intel? Right, because it was a speculative statement you made based on "what ifs".
Intel is valued at $100bn at the current share price, which makes it more and more of a bargain for other big tech companies to take over. I wonder how the market would react if Microsoft, Apple or Amazon acquired Intel.
> So Intel still needs a good integrated graphics solution. Iris Xe was okay, but it could have been a lot better. I think Intel might try to go the way AMD is going with its Z-series chips; Intel could make some very compelling SoCs with what they've done with Arc so far.
> The other thing is that many people just don't really care that much about chasing high-end performance in competitive games. There is an outspoken part of the gaming community that thinks everyone needs to have a 4090 and play at 4K 240 Hz with ray tracing. These people are wrong.
> Most people want to relax for a few hours after work with a game that comforts them. I'm not going to tell people not to be interested in competitive gaming; if that's what they like to do, great, but they are a minority of gamers. My wife likes to play The Sims on her laptop a few hours a week. It has an i5-11300H with Intel integrated graphics, and the game runs fine.
> Since we are part of a community of tech enthusiasts, it's easy to forget those people exist, but there are plenty of people who play games and don't even identify as gamers. There are a lot of people who have been playing the same 2-3 games for years and probably don't even have Steam installed.
> The reason I'm trying to drive this point home so hard is that the people who keep playing those same 2-3 games probably notice the Intel badge on their laptop without having any idea what it means. However, when they need to buy a new laptop they'll think, "Hey, I had a great ownership experience with my last computer and it had an *insert brand* sticker on it, I should buy that one." Intel is looking to create brand recognition and brand loyalty. Moving Arc from a dedicated GPU to an integrated solution, especially with AMD having a difficult time providing enough mobile parts, is a good move. It allows them to keep developing the architecture while not having to completely write everything off as a loss. On top of that, integrated graphics have been getting really good these last few years. Intel's Iris Xe and AMD's 680M/780M chips are VERY capable. With the price of graphics cards these last few generations, you have to REALLY want to be gaming to get a dGPU over anything with a competent iGPU. I've already decided that my next laptop isn't going to have a dGPU; AMD's 780M or 890M would be fantastic for anything I'd do away from my PC, even on a business trip for a week or two. Intel can already match that level of performance with Arc, and they've made some really significant improvements to Arc's drivers.
> Saying all that, I doubt Intel will drop desktop GPUs. This sounds like some short-sighted idea their CEO had after spending the last several years running the company into the ground. There is absolutely no reason Intel should be in this position. They are filled with tons of talented people, and I think they'll come back from the last few years just fine if they get rid of Pat Gelsinger.
It's an arms race between high-end hardware and aggressive game software developers, with normal gamers as collateral damage.
The high-end market doesn't exist the way the comments section would have you believe. Do you know why there are only three games that actually use real ray tracing? Because ray-tracing hardware is so expensive that it doesn't make sense for companies to spend money developing their games for it. High-end GPUs make up a fraction of the market, and who has the best GPU should be irrelevant, but fanboys and gamers think you need to buy the mid-range card from whoever has the winning flagship.
> Why would they consider taking over a failing company with significant problems? Their businesses are operating perfectly well without Intel.
Because it is a good deal, highly undervalued at the current stock price. You get a lot for that money. Intel is not a dead company.
> Why? Financial hardships. Difficult decisions are coming soon at Intel.
I see the future of Intel! It's going to be bought and renamed ChipX by its new owner.
> Do low-power mobile users even need more than 32GB? Not many people would upgrade if they already have 32GB.
RAM can go bad fast, and then you can throw your PC in the trash can if it's soldered.