Intel's comeback plan: Panther Lake in 2025, Nova Lake in 2026, says CEO Lip-Bu Tan

Skye Jacobs

Something to look forward to: Lip-Bu Tan has joined Intel as CEO at a pivotal moment, as the semiconductor giant faces significant challenges in regaining its competitive edge. In his first letter to shareholders, Tan laid out a clear vision for Intel's future, emphasizing a dual focus on product innovation and operational efficiency. His message was unflinching: Intel must simplify its operations, reduce costs, and deliver on its promises if it is to win back lost ground.

At the core of Tan's strategy is Intel's product roadmap, particularly the upcoming Panther Lake and Nova Lake processors, both of which rely on the company's 18A process node. The 18A node represents a major technological leap for Intel, incorporating innovations such as RibbonFET transistor architecture and PowerVia backside power delivery.

These advancements promise up to 15% better performance per watt and 30% higher chip density compared to earlier nodes. Intel's Arizona facility is gearing up to manufacture the 18A process at high volume, with production expected to ramp up later this year, according to Tan.

However, challenges remain. Reports of low yield rates – estimated between 20% and 30% – have raised concerns about Intel's ability to meet production targets.

Despite these hurdles, Tan has expressed confidence in the technology's readiness, stating that it will enhance Intel's competitiveness. He noted that early customer projects using the 18A process are nearing completion, with tape-outs anticipated by mid-2025.

Panther Lake will be the first major product to leverage the 18A process and is scheduled for release in late 2025. These processors will feature a hybrid architecture combining performance cores (P-cores), efficiency cores (E-cores), and potentially low-power efficiency cores (LPE cores), delivering a total of 16 cores and 16 threads. Panther Lake will also include integrated GPU cores to boost AI capabilities, although specific performance metrics have yet to be disclosed.

Following Panther Lake, Intel plans to launch Nova Lake CPUs in 2026. Nova Lake is expected to push the envelope even further, with early reports suggesting it could feature up to 52 cores using the Coyote Cove and Arctic Wolf architectures. The product will likely leverage a mix of Intel's internal manufacturing and TSMC's advanced nodes to improve yield and ensure supply chain resilience.

Intel is also preparing to make waves in the data center market with its next-generation Xeon processors. The Clearwater Forest series, slated for release in the first half of 2026, will rely exclusively on E-cores and will be the first server product built on the 18A process node. These processors are expected to showcase advancements in compute chiplets and packaging technologies, including Foveros Direct.

While Intel's CPU roadmap appears strong, its GPU division faces more uncertainty. The company has reportedly canceled its high-end Arc Battlemage BMG-G31 GPUs, leaving only mid-range models like the Arc B580 in its lineup. This move raises questions about Intel's long-term commitment to competing with AMD and Nvidia in the discrete GPU market.

Additionally, recent updates have made little mention of Intel's next-generation Xe3 "Celestial" GPUs, fueling speculation about their future as standalone products.

 
Still using hybrid architecture pretty much means Intel is already lost. For big companies it's just so hard to admit something is fundamentally wrong and start from scratch.
 
Hybrid architecture makes sense, but after half a decade of hybrid CPUs, Windows' scheduler is still struggling. Apple seems to have it working better, probably because they control the software more.
 
Yeah let's just trust this opinion over that of Apple, Nvidia, Intel, Qualcomm, basically all ARM chips, and countless other mega corps all using big+little architectures.
None of those big+little architectures are actually targeting high performance. Another problem is that Intel really didn't think through their hybrid architecture. Unlike AMD, Intel's solution was just a panicked backup plan that failed miserably. Scheduler problems, software problems, etc. followed because there were two different architectures on the same CPU package.

FYI, AMD also uses a big-little architecture, but only on low(er)-power APUs. AMD has server CPUs with small cores, but their IPC is still the same as on the bigger cores. That is something Intel refuses to do.

My opinion is highlighted by the fact that Intel is using low-performance cores to promote its first server CPU on the new, supposedly groundbreaking, process node. Good luck with that.

Hybrid architecture makes sense, but after half a decade of hybrid CPUs, Windows' scheduler is still struggling. Apple seems to have it working better, probably because they control the software more.
It only makes sense with AMD's type of solution, where the small cores are architecturally the same (from the software's point of view) as the bigger ones.
 
Intel’s fortunes rise or fall with their foundries… they’ve lagged behind since 14nm and have been playing catch-up ever since. On the off chance that 18A is actually successful (I’ll believe it when I see it), they could make a comeback… wouldn’t hold your breath though…
 
Aside from scheduling, what good are 52 cores, be they big, little, medium, or super-size? There's only so much to be gained by offloading background tasks, and outside of some intense productivity software, nothing takes advantage of more than 6-8 cores now, if that.

Intel stayed in the game as long as they did due to single-core IPC and higher clock speeds. That's why AMD didn't go for higher core counts in their consumer CPUs. They worked on increasing IPC and cache to improve their single-core performance. Looks like Intel may have given up on this to play the highest-core-count and Cinebench-score marketing game.
 
Aside from scheduling, what good are 52 cores, be they big, little, medium, or super-size? There's only so much to be gained by offloading background tasks, and outside of some intense productivity software, nothing takes advantage of more than 6-8 cores now, if that.

Intel stayed in the game as long as they did due to single-core IPC and higher clock speeds. That's why AMD didn't go for higher core counts in their consumer CPUs. They worked on increasing IPC and cache to improve their single-core performance. Looks like Intel may have given up on this to play the highest-core-count and Cinebench-score marketing game.
For games, maybe… But lots of productivity apps will use as many cores as you give them… Threadrippers have up to 96 cores, and they are handy…
 
For games, maybe… But lots of productivity apps will use as many cores as you give them… Threadrippers have up to 96 cores, and they are handy…

I have a Threadripper, and aside from photo/video editing, for just about every app I have (and I have software from the 1990s through today) the cores sit idle. I repeat: aside from specialty/video/animation software, the extra cores in the vast majority of consumer Core-whatever CPUs are pointless.

If what you were saying were true, the 9950X3D would have been released first and sales would far outstrip the 9800X3D, because there are countless ways to use 16 cores and 32 threads.

Even if Word, Excel, PowerPoint, etc. could use all of the cores, what would be the point? As for personal AI, Microsoft goes to the cloud even if you have a cherished NPU in your system. Don't really see that taking off yet, either.

If you're using it to compete with Threadripper, instead of using repurposed Xeon chips, fine. But while profitable, the workstation market doesn't move as many CPUs.
Panther/Nova Lake are their consumer CPUs to take on Ryzen, not Epyc.
 
I have a Threadripper, and aside from photo/video editing, for just about every app I have (and I have software from the 1990s through today) the cores sit idle. I repeat: aside from specialty/video/animation software, the extra cores in the vast majority of consumer Core-whatever CPUs are pointless.

If what you were saying were true, the 9950X3D would have been released first and sales would far outstrip the 9800X3D, because there are countless ways to use 16 cores and 32 threads.

Even if Word, Excel, PowerPoint, etc. could use all of the cores, what would be the point? As for personal AI, Microsoft goes to the cloud even if you have a cherished NPU in your system. Don't really see that taking off yet, either.
I have a Threadripper 7980X… and while you are correct that most software doesn’t utilize 64 cores, some does… I remember my Intel 5960X with an “insane” 8 cores that everyone said no one would ever need…
That’s the thing about progress… 10 years from now, we’ll probably be running 500 cores…
 
This looks pretty bleak. Panther Lake is just a mobile refresh on a smaller node, and Nova Lake is coming too late and will have to compete with Zen 6, which on paper looks to be a monster.
 
I find it hilarious that everyone is flaming Intel so hard because they released one generation of CPUs that isn't competing at the high end, and everyone seems to think it means their demise. Nobody remembers when AMD sucked, and for how long they sucked? Lol?

Also, if the 13th/14th gen fiasco hadn't happened, I highly doubt there would be this much heat. And let's not forget, none of those Intel CPUs exploded lol. Do any Intel CPUs explode? Not that I've seen. Do any AMD CPUs explode? Yeah, still happening to this day, even with a 5700X3D, which was shown to us by TechYesCity.

Intel CPUs don't explode bro, so obviously AMD is still messing up.

Let's also not forget Intel deleted hyperthreading on the Core Ultra series. Go turn off SMT on your AMD CPUs right now and see if they're still as fast. They won't be.

Nothing is happening to Intel and they'll be fine.
 
Hybrid architecture makes sense, but after half a decade of hybrid CPUs, Windows' scheduler is still struggling. Apple seems to have it working better, probably because they control the software more.
Intel sh*t the bed with their hybrid architecture when they decided to allow for different instruction set capabilities between big and little cores. It is telling that both AMD and ARM avoided that, and instead reduced size by reducing the internal structures that implement the ISA, rather than cutting features of the ISA from the core itself.

Had Intel kept AVX-512 support in their little cores such as Gracemont, but implemented a double- or quad-pumped design (128- or 64-bit-wide arithmetic units) to reduce die size, then developer support for AVX-512 would be far higher and scheduler design much simpler. Instead they effectively nuked AVX-512 and then backtracked with AVX10, which is basically an about-turn on the whole ISA fragmentation mistake they made in the first place with AVX-512.
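To make that fragmentation concrete, here is a minimal sketch (my own illustration, not anything from Intel or the article) of the runtime dispatch developers are left maintaining once AVX-512 can no longer be assumed. It assumes GCC or Clang on x86-64, and the kernel function names are purely hypothetical:

```c
#include <stdio.h>

/* Placeholders standing in for real SIMD kernels. */
static void kernel_avx512(void)   { puts("AVX-512 path"); }
static void kernel_fallback(void) { puts("SSE2/scalar fallback"); }

int main(void) {
    __builtin_cpu_init();   /* populate the CPU feature flags (GCC/Clang builtin) */

    /* Hybrid Core parts report AVX-512 as unsupported for the whole package,
     * because the E-cores lack it, so this check never takes the fast path
     * there even though the P-cores physically have the hardware. */
    if (__builtin_cpu_supports("avx512f"))
        kernel_avx512();
    else
        kernel_fallback();
    return 0;
}
```

Multiply that pattern across every SIMD-heavy library and you get the adoption problem described above.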
 
Still using hybrid architecture pretty much means Intel is already lost. For big companies it's just so hard to admit something is fundamentally wrong and start from scratch.
It's super easy to criticize a massive company's direction from a position outside of the company. Rock climbing is always easier when you are on the ground.

I find it hilarious that everyone is flaming Intel so hard because they released one generation of CPUs that isn't competing at the high end, and everyone seems to think it means their demise. Nobody remembers when AMD sucked, and for how long they sucked? Lol?

Let's also not forget Intel deleted hyperthreading on the Core Ultra series. Go turn off SMT on your AMD CPUs right now and see if they're still as fast. They won't be.
Your fanboying is showing. When AMD made low-performance CPUs, they didn't charge high-performance prices. I don't care whose name is on the box; I care about performance and value. My current 5700X3D, which I got for $180, has been great. Nothing Intel makes can beat that price to performance. Whoever makes the best CPU when I want to upgrade will get my money; I don't care if the box is blue or red, and you shouldn't either. You're ignoring the issues with Intel's last three generations of CPUs. That's not a one-generation issue; it's been three generations. Why would anyone buy an Intel CPU right now when they aren't competitive in price or performance? What use case is the best one for Intel?

"Exploding" and burning out are pretty much the same thing since the end result is a CPU that no longer works and both companies have had issues with CPU's failing and every computer component company has allowed bad parts to be sold to consumers, it's just how that business works and it's unavoidable without having to charge significantly more.

Why would anyone turn off a feature that would make their CPUs slower? What kind of point are you trying to make? AMD CPUs are faster than Intel's right now in most cases.

Intel’s fortunes rise or fall with their foundries… they’ve lagged behind since 14nm and have been playing catch-up ever since. On the off chance that 18A is actually successful (I’ll believe it when I see it), they could make a comeback… wouldn’t hold your breath though…

Their foundry has held them back, but I don't think it's fair to criticize them for it, since making CPUs is a lot more difficult than designing them. If it were easy, no one would be contracting TSMC to make them.
 
It's super easy to criticize a massive company's direction from a position outside of the company. Rock climbing is always easier when you are on the ground.
It's very easy when we compare hybrid solutions:

AMD

- Supports AVX-512
- Same IPC and instruction set on both big and small cores
- No compatibility problems and/or program crashes just because two different core types are used

Intel

- Had to ditch AVX-512 support completely because the "small" cores do not support it.
- IPC, internal architecture, and instruction set support differ between the big and small cores
- Serious compatibility and other issues in programs, because old software does not understand that there can be two different architectures on the same package
- Background tasks easily land on E-cores, which makes using virtual machines essentially pointless

Intel has tried to fix SOME of the problems above, but many still remain. I saw these coming even before Intel launched hybrid CPUs. Like I've said many times, the hybrid-core push was purely a panic solution. AMD thought through what a hybrid solution should be; Intel didn't.
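As a rough illustration of the scheduling complaint above, here is a minimal sketch (my own example, assuming Linux/glibc; the idea that logical CPUs 0-7 are the P-cores is an assumption for illustration only) of how an application or user can pin a thread to chosen cores so the OS stops migrating it:

```c
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void) {
    cpu_set_t mask;
    CPU_ZERO(&mask);
    for (int cpu = 0; cpu < 8; cpu++)   /* hypothetical P-core IDs 0-7 */
        CPU_SET(cpu, &mask);

    /* pid 0 = the calling thread; the kernel will now run it only on the listed CPUs */
    if (sched_setaffinity(0, sizeof(mask), &mask) != 0) {
        perror("sched_setaffinity");
        return 1;
    }
    puts("pinned to logical CPUs 0-7");
    return 0;
}
```

It works, but having to hand-place threads like this (or rely on Windows' Thread Director hints) is exactly the extra burden the hybrid design pushes onto software.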
 
Their foundry has held them back, but I don't think it's fair to criticize them for it, since making CPUs is a lot more difficult than designing them. If it were easy, no one would be contracting TSMC to make them.
Hard/easy is irrelevant… they have been trying to make both work for a while - and to be fair, it worked for many years - but it hasn't been working recently and it's killing them.

If something isn’t working, you’ve got to fix it… Intel claims they have - we’ll have to wait and see, but given how difficult it is, it would take a miracle…
 
Your fanboying is showing.

Why would anyone turn off a feature that would make their CPUs slower? What kind of point are you trying to make? AMD CPUs are faster than Intel's right now in most cases.
I don't really see how my fanboying is showing by making accurate points about what AMD CPUs have been documented to do. When it comes to chips I've bought on my own, I've only owned a 4690K and a 12400F, and before I had my 12400F I had a 5600G with no GPU, playing games like Dead by Daylight at 720p low, so I'm preeeety sure I know AMD is capable of making good CPUs.

My hyperthreading/SMT point doesn't need to be explained, but ok. "A hardware feature that allows a single processor core to handle multiple threads of execution concurrently." It affects performance in all scenarios. Intel deleted it. Of course there's a chance they're gonna be behind in performance vs their last gen, let alone the competition that has it enabled. Lol?

My criticisms of AMD have nothing to do with loving Intel, as opposed to AMD being a disappointing red-headed stepchild. AMD was garbage for a long time until Ryzen came along. Then they continued to neglect the graphics division compared to Nvidia. The driver issues I experienced with my 5700 XT and 6650 XT were a joke for a very long time. There's still no hardware-accelerated GPU scheduling on anything below the 7000 series, and still no "prefer maximum performance" option in their control panel like they used to have back when they were ATI, yet Nvidia still has it. Wattman reads my clocks completely wrong compared to what they're supposed to be, so I choose the minimal install. Multi-plane overlays are still trash on anything below the 7000 series. The USB issues were never fixed on my AM4 chipset with my 5600G. There's no option to install the graphics driver on its own and the software later, like Nvidia allows. AMD iGPUs do nothing for content creation. And FreeSync is honestly a Dollar Tree version of G-Sync; if/when you downgrade, you'll understand.

I'm not an Intel fanboy, just someone who's been sitting around watching AMD do basically everything wrong for the better part of 20 years while wanting them to succeed, because Nvidia will rule everything if they don't, so... yeah. AMD still isn't fine; 9000X3D chips have burnt up as well. Everything in this section of the industry is honestly a joke, so it's pretty hard to be anyone's "fanboi" right now.
 
Intel’s future sounds like a wild game of code names: Panther Lake, Nova Lake, Coyote Cove, Arctic Wolf… are we designing CPUs or assembling a new Avengers lineup?
 
I don't really see how my fanboying is showing by making accurate points about what AMD CPUs have been documented to do. When it comes to chips I've bought on my own, I've only owned a 4690K and a 12400F, and before I had my 12400F I had a 5600G with no GPU, playing games like Dead by Daylight at 720p low, so I'm preeeety sure I know AMD is capable of making good CPUs.

My hyperthreading/SMT point doesn't need to be explained, but ok. "A hardware feature that allows a single processor core to handle multiple threads of execution concurrently." It affects performance in all scenarios. Intel deleted it. Of course there's a chance they're gonna be behind in performance vs their last gen, let alone the competition that has it enabled. Lol?

My criticisms of AMD have nothing to do with loving Intel, as opposed to AMD being a disappointing red-headed stepchild. AMD was garbage for a long time until Ryzen came along. Then they continued to neglect the graphics division compared to Nvidia. The driver issues I experienced with my 5700 XT and 6650 XT were a joke for a very long time. There's still no hardware-accelerated GPU scheduling on anything below the 7000 series, and still no "prefer maximum performance" option in their control panel like they used to have back when they were ATI, yet Nvidia still has it. Wattman reads my clocks completely wrong compared to what they're supposed to be, so I choose the minimal install. Multi-plane overlays are still trash on anything below the 7000 series. The USB issues were never fixed on my AM4 chipset with my 5600G. There's no option to install the graphics driver on its own and the software later, like Nvidia allows. AMD iGPUs do nothing for content creation. And FreeSync is honestly a Dollar Tree version of G-Sync; if/when you downgrade, you'll understand.

I'm not an Intel fanboy, just someone who's been sitting around watching AMD do basically everything wrong for the better part of 20 years while wanting them to succeed, because Nvidia will rule everything if they don't, so... yeah. AMD still isn't fine; 9000X3D chips have burnt up as well. Everything in this section of the industry is honestly a joke, so it's pretty hard to be anyone's "fanboi" right now.
Let me simplify this for you. Intel sucks at making CPUs right now and AMD doesn't. Not saying this can't change, but Intel has made some very foolish decisions over the past few years and AMD has been playing a perfect hand. I'm not sure why you dragged GPUs into a CPU conversation, but AMD isn't playing in the "fastest GPU" game right now and is still finding a way to stick it to Nvidia.

Your arguments are rambling, incoherent, and obviously based on emotion rather than rational debate.
 
My hyperthreading/SMT point doesn't need to be explained, but ok. "A hardware feature that allows a single processor core to handle multiple threads of execution concurrently." It affects performance in all scenarios. Intel deleted it. Of course there's a chance they're gonna be behind in performance vs their last gen, let alone the competition that has it enabled. Lol?

We know what hyperthreading is… and we all know that it’s a good thing… so… ask yourself, why would Intel get rid of it?

Hmmm… maybe because they realize their architecture won’t work with it enabled? They don’t deserve praise for this… they deserve scorn…
 
I find it hilarious that everyone is flaming Intel so hard because they released one generation of CPUs that isn't competing at the high end, and everyone seems to think it means their demise. Nobody remembers when AMD sucked, and for how long they sucked? Lol?

Also, if the 13th/14th gen fiasco hadn't happened, I highly doubt there would be this much heat. And let's not forget, none of those Intel CPUs exploded lol. Do any Intel CPUs explode? Not that I've seen. Do any AMD CPUs explode? Yeah, still happening to this day, even with a 5700X3D, which was shown to us by TechYesCity.

Intel CPUs don't explode bro, so obviously AMD is still messing up.

Let's also not forget Intel deleted hyperthreading on the Core Ultra series. Go turn off SMT on your AMD CPUs right now and see if they're still as fast. They won't be.

Nothing is happening to Intel and they'll be fine.

Though I'm not a fanboy like you, I can't ever see myself buying a product from a company that invests billions of $$$ in the apartheid regime.

For that reason, I go for AMD every time.
 
Though I'm not a fanboy like you, I can't ever see myself buying a product from a company that invests billions of $$$ in the apartheid regime.

For that reason, I go for AMD every time.
By apartheid regime, I assume you mean the only real democracy in the Middle East as opposed to the religious dictatorships surrounding it?
 
Though I'm not a fanboy like you, I can't ever see myself buying a product from a company that invests billions of $$$ in the apartheid regime.

For that reason, I go for AMD every time.

True. Intel's most advanced manufacturing facilities are in Israel. They recently had to idle their $25 billion Kiryat Gat plant. Understandable, it's just a few miles from Gaza. If the Iranian situation spins out of control, Kiryat Gat may be visited by Mr. Oreshnik. Just saying.

But I don't worry too much about Intel's future; they're just going through an underdog period. Intel fanboys will just have to button their lips and grit their teeth. AMD fans like myself went through a much worse period when Bulldozer came out. The gloom-and-doomers were forecasting the immediate demise of AMD. I owned a couple of 'Bulldozers'. They were slow but rock solid. AMD bounced back, and Intel will too.

 
Kiryat Gat may be visited by Mr. Oreshnik. Just saying.

This made me 😂

By apartheid regime, I assume you mean the only real democracy in the Middle East as opposed to the religious dictatorships surrounding it?

I wasn't going to bother replying, as it's too easy and this isn't the forum for political debate for a so-called "democracy".

I'll just say, as they laugh while they commit a holocaust in front of the world, God is taking account of a day no one can escape. And God is not unjust.
 
This made me 😂



I wasn't going to bother replying, as it's too easy and this isn't the forum for political debate for a so-called "democracy".

I'll just say, as they laugh while they commit a holocaust in front of the world, God is taking account of a day no one can escape. And God is not unjust.
Yeah, while your name gives you away, I’d love to see you do some research and realize that Israel is a far more tolerant state - even to Palestinians - than any of the surrounding nations… ask yourself why, when Israel won the wars in 67 and 73, the Syrians, Jordanians, Lebanese and Egyptians all refused to take any in - and still do to this day.

Israel isn’t committing any holocaust - they’re attempting to destroy Hamas, a terrorist group responsible for reprehensible actions against innocent civilians.

Had Hamas tried that with the US, they’d have been exterminated with EXTREME prejudice - ask Iraq and Afghanistan about that…
 