Intel's 15th-gen Arrow Lake CPUs: How much faster will they be than Alder Lake?

PriyaWalia

Something to look forward to: As Intel's upcoming Arrow Lake CPU approaches launch, rumors suggest it could come with a staggering 45 percent IPC uplift over the already popular 12th-gen Alder Lake family. If that turns out to be the case, it would mean substantial increases in per-core and per-clock performance.

YouTuber Red Gaming Tech claims Arrow Lake could arrive with a 45 percent IPC uplift over the previous generation. The claim relates to the performance cores, also known as P-cores. The new efficiency cores, however, haven't been assigned an IPC improvement figure, even though they will undoubtedly play a significant role in determining overall performance.

Other major factors include clock speeds and core counts. Even though neither has received any official word from Intel, Red Gaming Tech claims Arrow Lake was once planned to offer eight P-cores and 32 E-cores but has since been reduced to eight P-cores and 16 E-cores, the same as Raptor Lake. Even if that were the case, a CPU that at least matches Raptor Lake's clock speeds would still be extremely quick and excel at multi-threaded tasks.

According to the leaker, Arrow Lake will have the same physical package, socket, and pin configuration as its desktop predecessor, Meteor Lake. Thus, while the new Arrow Lake processors might offer a valuable speed boost over current-generation Intel processors, many of their distinguishing characteristics won't be available until Meteor Lake arrives later this year. The MTL-S processor, which will combine silicon from TSMC N3 and Intel 4 on a single chip, will be the first Intel CPU to use disaggregated cores.

As a side note, it's unclear how Arrow Lake will fit into Intel's generational nomenclature. Raptor Lake currently carries the 13th-gen designation, so you might presume Arrow Lake will be 14th-gen. However, a Raptor Lake refresh is anticipated later in the year, and it may receive the 14th-gen designation instead.

Lastly, it's important to keep in mind that unconfirmed information should be taken with a grain of salt until it is officially confirmed by the manufacturer.


 
So... the news is that the 15th gen will be faster than the... 12th gen?! Wow

What I would like to hear:
- 20% faster than the 14th gen
- the GPU is as fast as RTX 2060
- it consumes as much as an Apple A2-class chip.

Now, telling me that it is faster than a three-generations-old chip, with nothing about TDP/power usage, means very little.
 
GPU as fast as an RTX 2060? I hope you don't honestly expect that. Intel has basically put a V8 in a Miata for the last couple of generations, so I don't know how they're going to pull that off.
 
Red Gaming Tech is not a leaker; you can't even call him one, because he is wrong every single time and his info comes out of thin air.
MLID and Raichu's Intel leaks are solid and on point.
 
MLID and Raichu's Intel leaks are solid and on point.
Buddy I hope this is satire.
MLID has been wrong just in the last half year about the Intel Arc cancellation, AMD's expected CPU/GPU performance, and some of the 7000 series having an iGPU, and at the very least his wild claims about the upcoming 7950X3D's performance ("almost 30% faster than the 13900K") are almost certainly wrong. The guy is also well known for deleting his YouTube videos if too many people notice he lied.
 
Buddy I hope this is satire.
That's why I specifically wrote that his Intel leaks are on point most of the time. Regarding Arc, he didn't say it was going to be completely cancelled; he said dGPUs might be cancelled while iGPUs will continue.
 
I'm rather skeptical of that number, but it's possible. There's always better tech coming. In light of this news, though, I'd probably avoid grabbing Intel for gaming purposes until Arrow Lake.

I'm due for a new build so 7800X3D is probably what I'd grab, but if I don't get that I'll probably just wait for Arrow Lake. I would not touch AL or RL right now.
 
There is no reason for a desktop processor to have a combination of weak (supposedly "efficient," according to the advertising) and strong cores. The only reasons are that they're behind in process node and want to keep temperatures from skyrocketing while still allowing momentary spikes in performance similar to the competition's, to confuse consumers they can't otherwise convince, and to exploit Amdahl's law a bit (never mind that performance in parallel workloads doesn't scale in proportion to the available number of processing threads).
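The Amdahl's law point can be made concrete: the speedup from adding threads is bounded by the fraction of the work that stays serial, so piling on cores yields diminishing returns. A minimal sketch (the 10% serial fraction is an illustrative assumption, not a figure from this thread):

```python
def amdahl_speedup(n_threads: int, serial_fraction: float) -> float:
    """Upper bound on speedup with n_threads when serial_fraction
    of the workload cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

# With 10% serial work, no thread count can push the speedup past 10x.
for n in (2, 8, 24, 1_000_000):
    print(f"{n:>9} threads -> {amdahl_speedup(n, 0.10):.2f}x")
```

Note how 24 threads already sit close to the 10x ceiling, which is why doubling E-core counts helps far less than the core-count marketing suggests.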

If the new processor is going to be manufactured on a 3nm process node, they have no practical reason to continue this cheating with the weak cores that we call "efficient" for purely psychological reasons. They are not "efficient"; they are WEAK. Shame, shame, shame. 🙃
 
GPU as fast as an RTX 2060? I hope you don't honestly expect that. Intel has basically put a V8 in a Miata for the last couple of generations, so I don't know how they're going to pull that off.
Most highly demanding apps are 3D apps, and those use Nvidia GPUs to accelerate the process. Most CPUs since the 8th gen are more than enough except for very specific tasks; the newest chips should focus on GPU power and overall power efficiency.
 
Buddy I hope this is satire.
These are some very weak counterarguments against MLID. Yes, Arc was not outright cancelled, but it was reorganized and is barely alive. What do you mean about the 7000 series iGPU? They all have one. The 7950X3D is not out yet, so you can't claim it isn't 30% faster in some edge cases. Care to point out specific videos he deleted?

And Intel. Oh boy, where do I start? Arrow Lake is 2024 at best, assuming Intel can release something on time for once. That means it will be competing against Zen 5, possibly Zen 5 X3D.
 
There is no reason for a desktop processor to have a combination of weak (supposedly "efficient" according to the advertising) and strong cores.
Power efficiency doesn't just matter for mobile.

Also, for humor value, I'll paraphrase Agner Fog about the usefulness of multi-core CPUs: 'Well, I suppose they're useful for running the spyware faster.'

Efficiency cores can run background spyware better than powerful cores. They prevent the spyware from intruding on the user experience, by freeing the power cores, and use less power in the process. Not having to optimize all the spyware code for mobile-only codebases is a boon.
 
Power efficiency doesn't just matter for mobile.

Efficiency cores allow maintenance and low-importance background tasks to run without too much power draw or performance impact.

Those paranoid about spyware may also be happy knowing that these cores consume less. Nowadays most advanced OSes use feedback/telemetry data to know what to improve: consumer usage, performance stats, etc. Some see it as a must for improving the apps; others see it as spyware. The funny thing is, that last group wants the improvements but expects them to be achieved with no data... well, it's hard to please everyone.

I also agree that, beyond that, other data should not be gathered; it should be prohibited and aggressively controlled. A maximum on the resources used for that data gathering should also be established (say, less than 1% CPU), rather than whatever the company thinks is okay...
 