Unreal Engine ray tracing stuttering on AMD RDNA 4 GPUs linked to Nvidia-optimized game code

Alfonso Maruccia

What just happened? Ray tracing was supposed to be faster and smoother on AMD's latest GPUs, yet some demanding games still struggle. A closer look now points to the real reason behind the disappointing performance. It's not just hardware limitations at play – engine choices and driver quirks are also shaping the experience.

Radeon RX 9070, Radeon RX 9070 XT, and other RDNA 4-based cards should perform much better than they have. AMD's latest GPU architecture promises improved ray tracing performance, but real-world results tell a different story.

According to user reports, Unreal Engine 4 (UE4) games experience severe stuttering on Radeon 9000-series cards when ray tracing is enabled. Digital Foundry replicated these issues in a recent YouTube video, confirming significant frame rate drops and stuttering in UE4 titles.

The problem is especially severe in Hellblade: Senua's Sacrifice, where stuttering can last several seconds and effectively freeze the game. Digital Foundry and the gaming community speculate that RDNA 4's poor ray tracing performance may result from a hidden AMD driver bug that disrupts shader compilation. However, a more detailed analysis by another YouTuber likely uncovered the true culprit.

The YouTube channel Tech Yes City examined RDNA 4 performance in Hellblade, The Ascent, and other ray tracing-heavy games – some of which stuttered even on Nvidia cards. The channel found that both Hellblade and The Ascent are built on UE4 but use NvRTX, a fork of the engine optimized for Nvidia cards. Developers chose NvRTX over the engine's vendor-agnostic DirectX Raytracing implementation, effectively forcing Radeon 9000 owners to run code that was never tuned for their new GPUs.

Developers likely chose NvRTX because Nvidia markets its "RTX Branch of Unreal Engine" as an easy way to integrate advanced graphics technologies into games. NvRTX includes GeForce-exclusive features such as DLSS, DLAA, Nvidia Image Scaling, and real-time denoisers, along with other AI-based enhancements.
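For readers unfamiliar with the distinction, the vendor-agnostic path mentioned above is the ray tracing support built into stock UE4, which a project typically switches on through a few entries in its DefaultEngine.ini. The sketch below is illustrative only, since exact keys can differ between 4.x releases; NvRTX layers its Nvidia-specific features on top of this same foundation.

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
; DXR requires the DirectX 12 RHI
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
; the skin cache must compile shaders for ray-traced skeletal meshes
r.SkinCache.CompileShaders=True
; enables the engine's standard DirectX Raytracing path
r.RayTracing=True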

Despite RDNA 4's notable improvements, AMD is still playing catch-up with Nvidia in ray tracing. Tech Yes City noted that RT effects in Hellblade and similar titles can tank frame rates regardless of GPU, and few RX 9070 owners are likely enabling the feature anyway. The channel also uncovered a bug in AMD's graphics drivers, an issue that should be far easier to resolve with a software update.

Where is Tim to slam Nvidia for sabotaging AMD's performance?

Let's not act like we didn't know about GameWorks sabotaging AMD performance for a long time. The examples are staggering: The Witcher 3, Cyberpunk, Alan Wake 2, Wukong, Control... name them...
 
It works both ways, but developers cater to the Nvidia crowd every time once the dust settles. I can't remember the game, but there was one that AMD actually got to optimize with all of its supported RDNA 3 features enabled, and it left the Nvidia 40 series in the dust. Needless to say, the outrage from the Nvidia camp was swift and loud about an AMD-sponsored title deliberately crippling Nvidia cards. It turned out the optimizations simply took advantage of features AMD had that Nvidia didn't. A few months later, after some heavy Nvidia optimizations, the lead wasn't as big.

Point is, Nvidia has a history of writing software that deliberately locks out the competition in the name of "improvements", while publishers won't include AMD-specific optimizations for just 10% of the market.
 
Who would have thought that Nvidia was to blame? Nobody ever knew that Nvidia has tentacles all over the gaming industry... Oh, well.
If you listened to Hardware Unboxed, good guy Nvidia would never have done something like that, but bad guy AMD obviously prevented the implementation of DLSS in Star Wars Jedi Outcast!
 
This is also an issue on Intel cards. Unreal Engine has turned into an unoptimized mess.
It is not the engine, but the Nvidia proprietary driver used for Ray Tracing in Unreal Engine 4.

You know, when Nvidia approaches you and asks you to implement RTX in Cyberpunk, Wukong, Alan Wake 2... in exchange for monetary compensation...
 
The engine was a mess even without the proprietary Nvidia nonsense that game developers put into it. The whole reason the Nvidia-specific version exists is that it lets game devs optimize for the most common PC hardware, since Epic can't optimize their own engine.
 
Unreal Engine has been riddled with performance issues of every kind for years and through most of its iterations. It can produce some of the best visuals, but it also has some of the worst performance of any engine that exists. I strongly dislike it for that reason, and sadly, my favorite franchise, Borderlands, is always built on it.
 
It's strange, because Tim Sweeney and Epic's programmers have been, and should be, at id Software's calibre, yet they're the opposite when it comes to efficiency.
 