Intel 14th-gen processors may have ray tracing capabilities

Jimmy2x

Posts: 141   +11
Staff
Why it matters: While the 13th-gen Raptor Lake CPU release may be right around the corner, Intel fans are already looking ahead to what the 14th generation will bring to the table. A South Korean tech enthusiast has uncovered information that may shed additional light on Intel's upcoming 14th-gen CPUs. According to Coelacanth's Dream, a graphics compiler found in Intel's GitHub repository appears to include elements supporting on-board ray tracing functionality.

Coelacanth's Dream posted the initial finding late last week, highlighting an earlier commit that added Meteor Lake to the virtual instruction set architecture (vISA) as well as a patch to the Intel Graphics Compiler (IGC). The updates, which show the potential for on-board ray tracing capabilities, would accompany Intel's already anticipated move from the Intel 7 process node to the new Intel 4 node.

According to an article from Tom's Hardware, the architectural difference between the 12th- and 13th-gen CPUs and the upcoming 14th-gen CPUs supports the theory that Meteor Lake will include ray tracing hardware. Alder Lake and Raptor Lake both come equipped with 96 Xe-LP (low power) execution units (EUs). Available information points to Meteor Lake deviating from this design and instead using the Xe-HPG architecture found in Intel's Arc Alchemist graphics cards. The move could bring 14th-gen processors with anywhere from 128 to 192 EUs.

Other changes noted in the GitHub commits indicate that the 14th-gen iGPU architecture may lack the XMX units present in desktop Xe-cores. While never stated explicitly, the absence of dot product accumulate systolic (DPAS) instructions, which rely on Xe Matrix eXtensions (XMX) units for processing, suggests the units may be missing altogether.

The Meteor Lake architecture will mark a departure from Intel's 10nm process (Intel 7) and the move to the company's new 7nm node, Intel 4. The new process node will be Intel's first production run using an extreme ultraviolet (EUV) lithography system and is expected to bring upwards of a 20% performance uplift once released.

Unlike AMD, which has found success by extending their socket life across the Ryzen product line, 14th-gen users will once again have to replace hardware to accommodate a new socket type. While 12th-gen adopters may be able to use their existing motherboards for the upcoming Raptor Lake launch, Meteor Lake adopters will have to make the shift to the new LGA 1851 socket.


 

wiyosaya

Posts: 8,261   +7,611
Interesting, but developers will need to code for this new ray-tracing feature. It makes me wonder if "ray tracing" is the new marketing buzzword.
 

Icysoul

Posts: 45   +17
So their Alchemist cards are at the back of the pack when it comes to gaming performance, but they want to integrate ray tracing into their CPUs...? Err... sure... at least they have money to spend...
 

takaozo

Posts: 409   +626
Never had the curiosity to turn RT on, and I'm not going to with the performance hit it puts on FPS. The one time I tried DLSS it was bullcrap; the image looked dirty.
Maybe in the future, if new hardware can do it better.
But Intel can try. Who knows, maybe their implementation is better if the CPU takes on this task.
What's next, a PCIe RT accelerator like 3dfx?
 

hahahanoobs

Posts: 4,675   +2,654
"Unlike AMD, which has found success by extending their socket life across the Ryzen product line"

AMD needed the market share and revenue. There was also backlash after the attempt to limit Zen 3 to ONE chipset, then two. Don't expect a 4-year socket life from AMD again as long as they are competitive. AMD would love nothing more than to follow Intel's lead.

Again, historically you only needed one Intel socket to compete with three from AMD. Also, using the same chipset lets innovation in motherboards and CPU performance stall. No thanks.
 

kiwigraeme

Posts: 1,305   +953
"Unlike AMD, which has found success by extending their socket life across the Ryzen product line"

AMD needed the market share and revenue. There was also backlash after the attempt to limit Zen 3 to ONE chipset, then two. Don't expect a 4-year socket life from AMD again as long as they are competitive. AMD would love nothing more than to follow Intel's lead.

Again, historically you only needed one Intel socket to compete with three from AMD. Also, using the same chipset lets innovation in motherboards and CPU performance stall. No thanks.

This is all pure speculation - AMD has stated it will try to keep platforms/motherboards going longer - however, given the speed of progress, older MBs will be shunted aside as new standards are implemented in drives, memory, wifi, BT, etc.

Why would AMD love to follow Intel, when the stability of the platform has endeared it to so many people? Pure speculation again.

As for 1 Intel MB equalling 3 from AMD - I suppose your logic is highly selective to a very small subset, ie. those who spent a truckload of money on one CPU to run games only at 1080p even though they bought cutting-edge GPUs - ie. single-core speed in selected games at 1080p.

Yeah, that's how everyone (not) uses a PC (when downgrading a few GPU settings and spending $400 less will give you the FPS needed in competitive gaming).

Your rant is as bad as my Apple rants.
 

rmcrys

Posts: 292   +237
AMD's latest iGPUs already support RT, but it's pointless anyway.

Never had the curiosity to turn RT on, and I'm not going to with the performance hit it puts on FPS. The one time I tried DLSS it was bullcrap; the image looked dirty.

1) RT is fantastic and it is the future; nevertheless, dGPUs already have a lot of work dealing with it, and iGPUs will still take many years of development until they can handle RT in a usable way...

2) DLSS or FSR 2.0 are fantastic if you set them to quality mode. If the image looks bad, you are using them the wrong way (like having hopes too high and trying 4K on a big monitor with the lowest DLSS preset...).

3) For the next few years, RT on iGPUs will only be usable if DLSS/FSR 2.0 is also enabled, and if the iGPU has a lot of fast system memory and enough TDP.

iGPUs are *extremely* important because most very slim laptops or hybrids like the Surface Pro, Dell 13" laptops, etc. need good GPUs for 3D/CAD and to allow playing some games. Why do I have to choose between portability and decent (not great) graphics performance? I want both.

So if iGPUs like the ones from AMD RDNA 2 (soon RDNA 3) and Apple M2 come to a portable device like an SP9 / iPad Pro and the drivers/games allow RT+FSR 2.0, people can have CAD and a good gaming experience. Until now, Intel iGPUs were garbage, and I would have a better gaming experience from a Samsung Tab S8 Ultra or iPad Pro than from a "powerful" Windows laptop...
 

Xausejo

Posts: 15   +3
Intel should stop changing motherboards so often. They should do the same as AMD is doing. People are sick of having to keep changing motherboards; it wastes too much money. No problem if they raise the processor price tag, as long as they don't force a motherboard replacement every two years.