In brief: Compared to traditional rendering methods, real-time path tracing requires less manual lighting work from developers and can produce more realistic lighting and shadows in 3D graphics. However, its computational demands are heavy enough that even high-end GPUs struggle with it. Intel is now looking for ways to make path tracing feasible on lower-end hardware, particularly integrated GPUs.

Intel recently presented research papers outlining how new algorithms and AI techniques could make ray tracing and path tracing efficient enough for mid-range and entry-level graphics cards. If successful, the methods could benefit the company's ubiquitous integrated GPUs and bolster its recent push into the dedicated graphics market.

Ray tracing is typically associated with recent high-end AMD and Nvidia graphics cards. The technique simulates the behavior of light as it interacts with surfaces, producing more accurate reflections, refractions, and shadows. While ray tracing can significantly enhance visual fidelity, it places a heavy computational burden on GPUs, lowering frame rates in real-time applications.
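At its core, a ray tracer estimates the light leaving each visible surface point by averaging random samples of incoming light over the hemisphere above that point; path tracing repeats the process bounce after bounce, which is where the cost explodes. The sketch below is a minimal Python illustration of that Monte Carlo estimate, using a made-up albedo value and a toy analytic sky in place of a real scene; an actual renderer would trace a ray through the scene for every sample instead of evaluating a formula.

```python
import math
import random

# Minimal sketch of the Monte Carlo estimate at the heart of ray/path tracing:
# outgoing light = average over random hemisphere directions of
#                  BRDF * incoming light * cos(angle to the surface normal).
# The "scene" here is a toy analytic sky, so no ray casting is needed.

ALBEDO = 0.5  # hypothetical diffuse reflectance of the surface


def sky_radiance(direction):
    # Toy environment light: brighter toward the zenith, dark at the horizon.
    _, _, z = direction
    return max(z, 0.0)


def sample_hemisphere():
    # Uniform sampling of the unit hemisphere above the normal (0, 0, 1).
    z = random.random()
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)


def estimate_outgoing_radiance(num_samples=10_000):
    brdf = ALBEDO / math.pi        # Lambertian (matte) BRDF
    pdf = 1.0 / (2.0 * math.pi)    # probability density of uniform hemisphere sampling
    total = 0.0
    for _ in range(num_samples):
        d = sample_hemisphere()
        cos_theta = d[2]
        total += brdf * sky_radiance(d) * cos_theta / pdf
    return total / num_samples


if __name__ == "__main__":
    # For this toy setup the exact answer is ALBEDO * 2/3, roughly 0.333.
    print(estimate_outgoing_radiance())
```

The noise in that average is why real-time implementations need so many samples, denoisers, and clever sampling tricks, and why research into doing more with fewer rays matters for cheaper hardware.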

Also read: Path Tracing vs. Ray Tracing, Explained

Path tracing pushes ray tracing's realism, and its computational cost, even further, so it has primarily been experimented with in retro games and titles featuring low-detail graphics, such as Minecraft, Quake, and Descent. In visually elaborate games like Cyberpunk 2077, it can bring even high-end graphics cards, such as the $1,600 GeForce RTX 4090, to their knees without upscaling and frame generation.

Given those performance figures, Intel's claim that path tracing can become viable on cheaper hardware is a bold one. However, previous analysis of Cyberpunk 2077's path-tracing mode suggests that today's implementations leave plenty of room for efficiency gains. As far as real-time rendering is concerned, the technology is still in its infancy.

One of Intel's papers details a method for generating reflections on certain hemispheric surfaces more efficiently than prior state-of-the-art processes. Another paper explains how to render certain materials 50 percent to 500 percent faster by better calculating the light that reaches the camera. Research from Inria in Bordeaux shows how developers could gain greater control over simulating dynamic spectral effects.

A 2023 Eurographics Symposium on Rendering study proposes improved ways of simulating photons for real-time direct illumination, shown in the Quake Arcane Dimensions video above. The research incorporates RTX Direct Illumination, the technology Nvidia uses to sample large numbers of light sources efficiently in its path-tracing pipelines. In May, Intel also demonstrated how neural networks can increase path-tracing efficiency.
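RTX Direct Illumination builds on ReSTIR-style resampling, in which a shader looks at many candidate lights per pixel but keeps only one in a tiny "reservoir," weighted by how much that light would likely contribute. The Python sketch below illustrates that general reservoir-sampling step with made-up light data; it is not the study's algorithm or Nvidia's implementation, just the sampling idea the toolkit is built around.

```python
import random

# Hypothetical point lights as (intensity, distance to the shading point); values are made up.
LIGHTS = [(5.0, 2.0), (1.0, 1.0), (20.0, 10.0), (0.5, 0.5)]


def target_weight(light):
    # Rough, unshadowed contribution estimate: intensity falling off with squared distance.
    intensity, distance = light
    return intensity / (distance * distance)


def reservoir_sample_light(num_candidates=8):
    """Streaming resampled importance sampling: draw candidates uniformly,
    keep one in a single-slot reservoir with probability proportional to its weight."""
    chosen = None
    w_sum = 0.0
    source_pdf = 1.0 / len(LIGHTS)  # candidates are drawn uniformly from the light list
    for _ in range(num_candidates):
        light = random.choice(LIGHTS)
        w = target_weight(light) / source_pdf
        w_sum += w
        if random.random() < w / w_sum:
            chosen = light
    # Unbiased contribution weight for the sample that survived the reservoir.
    contribution_weight = w_sum / (num_candidates * target_weight(chosen))
    return chosen, contribution_weight


if __name__ == "__main__":
    light, weight = reservoir_sample_light()
    print("sampled light:", light, "contribution weight:", round(weight, 3))
```

The appeal for lower-end GPUs is that the expensive work, such as shadow rays and shading, only happens for the one sample each pixel keeps, rather than for every light in the scene.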

The company plans to make its work open source so all GPU vendors can benefit, though end users probably won't see the results for some time. Intel is likely framing the research as helpful for mid-range and integrated graphics hardware because those are the performance brackets its products currently occupy.