AMD RDNA 4 leaks tease massive ray tracing upgrades for PC and PlayStation 5 Pro

zohaibahd, Staff
Something to look forward to: A partial data sheet has revealed some new details about the ray tracing capabilities coming to AMD's next-gen RDNA 4 GPU architecture. Headlining the list is something called a "Double Ray Tracing Intersect Engine." The data sheet doesn't delve into any details beyond listing the features, so we can only guess what it means.

From the name, Double Ray Tracing Intersect Engine could mean one of two things: either AMD is doubling the number of dedicated ray tracing units compared to RDNA 3, or these new RT engines can process twice as many ray-triangle intersection calculations per clock cycle. Either way, this could significantly boost RT performance.
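For a sense of what an "intersect engine" actually computes, every ray-triangle test boils down to a short, fixed sequence of vector math. Below is a minimal sketch of the standard Möller–Trumbore algorithm in Python; it is illustrative only, since AMD hasn't disclosed how its hardware implements the test, but it shows the per-ray, per-triangle workload that doubling intersection throughput would accelerate.

```python
# Minimal Moller-Trumbore ray-triangle intersection test.
# Illustrative only: RDNA hardware runs a fixed-function version of this
# kind of math; AMD has not disclosed its actual implementation.

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
    """Return distance t along the ray to the hit point, or None on miss."""
    def sub(a, b):   return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1],
                             a[2]*b[0]-a[0]*b[2],
                             a[0]*b[1]-a[1]*b[0])
    def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, edge2)
    det = dot(edge1, pvec)
    if abs(det) < eps:             # ray parallel to the triangle's plane
        return None
    inv_det = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:         # hit point outside triangle (barycentric u)
        return None
    qvec = cross(tvec, edge1)
    v = dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:     # outside triangle (barycentric v)
        return None
    t = dot(edge2, qvec) * inv_det
    return t if t > eps else None  # hits behind the origin don't count

# Example: a ray fired straight at a triangle sitting in the z=1 plane.
print(ray_triangle_intersect((0, 0, 0), (0, 0, 1),
                             (-1, -1, 1), (1, -1, 1), (0, 1, 1)))  # 1.0
```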

The leak, as shared by @Kepler_L2, hints at some under-the-hood optimizations and efficiency improvements. RDNA 4 will pack 64-byte ray tracing nodes and include a "Ray Tracing Tri Pair Optimization." There are also supposedly some enhancements coming to improve BVH (bounding volume hierarchy) traversal performance. Specifics are again lacking, but it's fair to assume the changes will boost performance while lowering the computational workload.
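The 64-byte node size is worth dwelling on: a node that fits in one aligned 64-byte memory transaction means each BVH traversal step can fetch everything it needs in a single read. As a rough sketch of the idea (the actual RDNA 4 node format is not public, and the layout below is purely hypothetical), here is one way a four-child box node could be packed into 64 bytes by quantizing child bounds relative to the parent box:

```python
import struct

# One hypothetical 64-byte, 4-wide BVH node layout, sized so that a
# traversal step can fetch a whole node in one aligned 64-byte read.
# AMD has not published RDNA 4's actual node format; this is illustrative.
NODE_FMT = "<6f4I24B"  # parent AABB + 4 child indices + quantized child boxes
assert struct.calcsize(NODE_FMT) == 64

def quantize(parent_min, parent_max, box_min, box_max):
    """Map child bounds into 0..255 within the parent box (lossy)."""
    q = []
    for lo, hi, a in zip(parent_min, parent_max, box_min):
        q.append(round(255 * (a - lo) / (hi - lo)))
    for lo, hi, a in zip(parent_min, parent_max, box_max):
        q.append(round(255 * (a - lo) / (hi - lo)))
    return q

def pack_node(parent_min, parent_max, child_ids, child_boxes_q):
    """Pack one node: full-precision parent bounds, 8-bit child bounds."""
    flat_q = [b for box in child_boxes_q for b in box]  # 4 boxes * 6 bytes
    return struct.pack(NODE_FMT, *parent_min, *parent_max,
                       *child_ids, *flat_q)

node = pack_node((0, 0, 0), (10, 10, 10),
                 (1, 2, 3, 4),
                 [quantize((0, 0, 0), (10, 10, 10),
                           (0, 0, 0), (5, 5, 5))] * 4)
print(len(node))  # 64 -> exactly one memory transaction per traversal step
```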

How effective these improvements are will only become clear once real benchmarks show up or, better yet, once we get our hands on the GPUs. But any enhancements are welcome.

AMD has long lagged in ray tracing, and although progress has been made, Nvidia's offerings continue to pull ahead. For instance, despite the impressive rasterization muscle of AMD's latest RDNA 3 flagship, the RX 7900 XTX, even older Nvidia GPUs like the RTX 3090 Ti can outclass it once ray tracing effects get cranked up.

These early RDNA 4 leaks position advanced ray tracing capabilities as a key focus of AMD's next architectural revamp, which makes total sense from a future-proofing standpoint. The improved RT hardware won't just be aimed at high-end gaming PCs, either. It will likely make its way into the inevitable PS5 Pro refresh too, delivering improved visuals on Sony's mid-gen machine. The console is expected to be built mostly around RDNA 3, borrowing from RDNA 4 solely to boost RT performance.

Of course, while mainstream RDNA 4 desktop graphics cards probably won't debut until early 2025, all signs point to an AMD vs Nvidia slugfest unfolding over the next year. Nvidia's own next-gen Blackwell offerings, presumably launching as GeForce RTX 50 series GPUs, are also rumored for a CES 2025 unveiling.

While AMD's RX 8000 series seems to heavily focus on ray tracing, Nvidia isn't resting on its laurels and could once again pull ahead in that department if these improvements are anything short of a generational leap.

 
Could, would, potentially... but we'll see. In the meantime, there's another, and dare I say more important, leak: the best RDNA 4 card won't even match a 7900 XTX.
 
This article makes it sound like there is something "mystical" about improving RT performance. Basically, there isn't. Almost any company with some experience designing microchips could make the fastest RT GPU ever. Just add enough fixed-function hardware and you have an RT monster that leaves AMD and Nvidia cards miles behind. One problem, though: what about everything other than RT performance?

AMD could very easily have far better RT performance than Nvidia. However, any improvement in RT makes the chip bigger, or something else must be sacrificed.

So yeah, RT performance is mostly about chip size and price. It's not that AMD or Nvidia couldn't add much more RT performance. They could, but for obvious reasons they don't.
 
Well, you're right, but the issue with ray tracing has always been that every bit of die area you dedicate to ray tracing hardware is area you can't dedicate to the components that handle raster graphics. The area AMD is spending in RDNA 4 on improved ray tracing hardware could instead have gone to adding more CUs to the chip, or a larger memory bus, for example. That's the trade-off Nvidia made with the RTX 2000 cards (the transistor count increased compared to Pascal, but raster performance didn't improve much, because the new transistors all went to RT and tensor cores), and it seems AMD is choosing a similar trade-off now. They get good improvements in RT, but the improvements in raster won't be as good as they could have been otherwise.
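To put toy numbers on that trade-off (every figure below is invented for illustration; none are real RDNA 4 numbers), a quick sketch shows how a fixed RT-hardware budget directly displaces compute units:

```python
# Toy die-area budget illustrating the RT-vs-raster trade-off.
# Every number here is invented for illustration, not a real RDNA 4 figure.

DIE_AREA_MM2 = 300.0    # hypothetical total die area
FIXED_AREA_MM2 = 120.0  # memory controllers, display, media, I/O, ...
CU_AREA_MM2 = 2.5       # hypothetical area per compute unit

def cu_count(rt_area_mm2):
    """CUs that fit once fixed blocks and RT hardware take their share."""
    return int((DIE_AREA_MM2 - FIXED_AREA_MM2 - rt_area_mm2) / CU_AREA_MM2)

for rt_area in (0.0, 10.0, 20.0, 40.0):
    print(f"RT hardware: {rt_area:5.1f} mm^2 -> {cu_count(rt_area)} CUs")
# RT hardware:   0.0 mm^2 -> 72 CUs
# RT hardware:  10.0 mm^2 -> 68 CUs
# RT hardware:  20.0 mm^2 -> 64 CUs
# RT hardware:  40.0 mm^2 -> 56 CUs
```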
 
We have also seen Nvidia, AMD, and I think Microsoft (Sony?) release patents to improve RT algorithms. We won't be getting true real-time RT on any platform soon, or even in 20 years; we will always get an approximation and/or simulation of it, just like physicists struggle to model real-world particle behavior outside a very small test case.

Plus, people like the wow of hyper-real RT or compressed dynamic contrast (HDR, etc.). You want those puddles to really shine and reflect images, with none of the real-life reasons they may not: wind, floaters.
 
RT is not really a big selling point for me. I'll see what comes along and vote with my wallet. Hopefully it's enough to get me excited again.
 
Great news. Never been a big fan of Nvidia, and AMD works much better on Linux, so seeing progression there is pleasing.
 
Glad to hear AMD is finally doubling down on RT; it's about time. Now, if they can sort FSR out too, they might actually gain some market share instead of losing it.
 