AMD RDNA4 graphics cards may only receive a minor bump in ray tracing performance

Daniel Sims

Rumor mill: Although the last few generations of AMD graphics cards have matched their Nvidia equivalents in rasterization performance and often featured more memory, Team Red has been a step behind in hardware-accelerated ray tracing ever since the technology debuted in consumer hardware. New information about the company's next GPU lineup suggests the situation won't change.

Sources have told popular YouTuber RedGamingTech that AMD's upcoming RDNA 4 graphics cards will only see a roughly 25 percent ray tracing performance uplift over the Radeon RX 7000 series. If Nvidia releases its next lineup – codenamed Blackwell – before the end of the year, it will likely remain uncontested in ray tracing.

Rumors have long indicated that RDNA 4 will not include enthusiast products and will focus entirely on the mid-range and mainstream tiers. However, the latest information has slightly revised the projected performance metrics and die sizes, which could change again before the final hardware ships. What hasn't shifted from previous leaks is AMD's alleged plan to offer two GPUs that can match mid-range RTX 4000 cards at significantly lower prices.

The larger chip, codenamed N48, might measure between 204 and 237 mm², significantly smaller than what Moore's Law is Dead heard in early February. Expected performance might fall slightly below the RX 7900 XT, but with a price tag somewhere in the ballpark of $500. Combined with a projected 25 percent ray tracing performance uplift, these numbers might have a chance of challenging the RTX 4070 Super, but probably not a theoretical mid-range Blackwell card.

Meanwhile, AMD plans a significantly smaller die for N44, the lower-end RDNA 4 component. Its performance could sit between the RX 7600 and 7700 XT while costing consumers less than $300. A price as low as $199 isn't out of the question, but the rumors, if accurate, likely reflect plans that are still subject to change. RDNA 4, Blackwell (presumably to be named RTX 5000), and Intel's upcoming Battlemage series are all expected to launch before the end of 2024.

AMD might be employing a modest strategy with RDNA 4 to conserve resources for RDNA 5, which is expected to play a bigger role in data center and gaming applications. Earlier information suggested that RDNA 5 will utilize a chiplet-based design. Furthermore, Microsoft could be planning to use an RDNA 5 GPU for a next-generation Xbox slated for a 2026 release, but the move would be a radical step, so take the information with a grain of salt.


 
Most will be happy if AMD brings better efficiency and performance in the sub-$500 price range. Obviously, resources must have been focused on the GPU architecture aimed at consoles because Sony and Microsoft are subsidizing the development. RT is a waste of resources anyway.

Not even the 4090 can deliver substantial RT with an adequate framerate without relying on upscaling and all kinds of tricks.
 
I think AMD has just consolidated their position in the CPU market, and now they really need to rebuild R&D on the GPU side. Still, I prefer AMD myself, as RT is not much of a requirement for the games I play, and since I use Linux, AMD just works a bit better there.
 
Most will be happy if AMD brings better efficiency and performance in the sub-$500 price range. Obviously, resources must have been focused on the GPU architecture aimed at consoles because Sony and Microsoft are subsidizing the development. RT is a waste of resources anyway.

Not even the 4090 can deliver substantial RT with an adequate framerate without relying on upscaling and all kinds of tricks.
Well that's not really true. The 4090 is powerful enough to do RT ultra at 60+ FPS at 1080p and mostly at 1440p. It can't quite handle 4K without upscaling, but that's a far cry from "can't deliver an adequate framerate".

RT is going to stay. Maybe not in its overt implementation, but for general lighting it will likely continue to be present moving forward. AMD will need to play catch up.
AMD is giving up. And I can't blame them considering how badly RDNA has fallen behind.
AMD did themselves no favors mispricing the 7900 XT, the 7700 XT, and the 7600 XT.
 
AMD is giving up. And I can't blame them considering how badly RDNA has fallen behind.
Behind what? Only the 4090 soundly beats every RDNA card in all performance metrics. Nvidia's lower-end Ada cards only claim victory at RT-heavy settings, but we don't have RT-only games. All games utilizing RT are raster/RT hybrids, so raster perf will continue to be relevant for years to come.
 
Not surprised. I doubt they will catch up to Nvidia in the near future in that regard. RT is a feature everyone cares about only in benchmarks; in practice hardly anyone uses it, and Nvidia succeeded in making a mountain out of a molehill.
 
Behind what? Only the 4090 soundly beats every RDNA card in all performance metrics. Nvidia's lower-end Ada cards only claim victory at RT-heavy settings, but we don't have RT-only games. All games utilizing RT are raster/RT hybrids, so raster perf will continue to be relevant for years to come.

In what isn't it behind? GPU market share, feature set, efficiency – it's all been a steady slide downhill.
 
In what isn't it behind? GPU market share, feature set, efficiency – it's all been a steady slide downhill.

Besides the whale tier of the 4090, AMD is kicking Nvidia's *** in performance in every segment.

The only titles AMD has problems with are Nvidia-sponsored RTX showcase titles that literally sabotage AMD drivers to cripple performance, for example Cyberpunk and Alan Wake.

Defending Nvidia over the current pricing is literally pure $hilling. The pricing of the entire Ada lineup is a horrendous joke.

The XTX is $900 and barely 15% slower than a 4090 at 2160p with a slight OC, while the 4090 costs double.
 
Defending Nvidia over the current pricing is literally pure $hilling. The pricing of the entire Ada lineup is a horrendous joke.
Not sure where I did that but the market says you are mistaken.

Maybe if RDNA were a better architecture, or priced more appropriately for its deficiencies, AMD would sell more. Their cards only seem to start moving off store shelves after being discounted twice, and even then in homeopathic quantities.
 
The problem for AMD has always been the surrounding architecture and showing that, despite not having Nvidia's tech, they are a worthwhile purchase when comparing dollar to dollar, especially at the higher price points, where it's not just rasterisation and you usually want your GPU to do more than just run a game. The key here is what game devs will do with regard to the way games are built and how much emphasis is put on things like path tracing/ray tracing to crank up the eye candy, alongside relying on DLSS to the max to rush out titles without having to optimise them properly with all the new stuff added. So if more and more games lean that way, then AMD really needs to bolster that side and make sure it catches up. AMD have done good work in the CPU space, and the GPUs are fairly alright, but it feels like something is always a catch-up or a compromise versus Nvidia. No matter what we might say about ray tracing or DLSS or whatever not being needed, if you blow 1k on a GPU, you want it to be the best of the best.
They'll have an advantage in hitting that mid market and hopefully normalising the prices (I highly doubt that will happen though; who would tell the money-hungry investors "hey, we chose to make less profit"?), but it feels like less of a change than it should be, and the AMD cards will probably feel like juggling a compromise again once Nvidia just releases their stuff, inevitably with inflated prices of their own and stupid bottlenecks to really push that segmentation. But still.
 
If the mid-tier 8700XT class is stronger than the 7900XT in raster as everyone is saying, and is 25% stronger in RT than the 7700XT, and uses at most the same power as the 7700XT, and will naturally be well under $500, AMD will have a winner for sure. This would decimate the 4060 Ti, beat the 4070 Ti Super except for RT, and cost $300 less for sure. The 5060 won't be released until well into 2025 and will easily cost $550+ as Nvidia pays TSMC through the nose for a bleeding-edge node.
 
Well that's not really true. The 4090 is powerful enough to do RT ultra at 60+ FPS at 1080p and mostly at 1440p. It can't quite handle 4K without upscaling, but that's a far cry from "can't deliver an adequate framerate".

RT is going to stay. Maybe not in its overt implementation, but for general lighting it will likely continue to be present moving forward. AMD will need to play catch up.
AMD did themselves no favors mispricing the 7900 XT, the 7700 XT, and the 7600 XT.
A voice of reason comment, nice.
 
If the mid-tier 8700XT class is stronger than the 7900XT in raster as everyone is saying, and is 25% stronger in RT than the 7700XT, and uses at most the same power as the 7700XT, and will naturally be well under $500, AMD will have a winner for sure.
For me, it's like a dream come true...
 
Always find the comment sections of any GPU-related article fascinating. So many comments about how great AMD is and how they're "smashing" Nvidia.

Then you look at sales numbers...
 
Always find the comment sections of any GPU-related article fascinating. So many comments about how great AMD is and how they're "smashing" Nvidia.

Then you look at sales numbers...
This is all irrelevant, as Nvidia and AMD only care about corporate customers because that's where the money is now. AI :/
There is no winner. Consumers are only losers in the GPU space.
 
This is all irrelevant, as Nvidia and AMD only care about corporate customers because that's where the money is now. AI :/
There is no winner. Consumers are only losers in the GPU space.
But if you were to look at non-corporate customers, you’d see Nvidia absolutely ruining AMD in terms of sales numbers.

I personally don't understand how it's so bad. I've been building machines with 7600s in them and honestly, AMD are just better value for the money.

I don't really understand why Nvidia gets as many sales as they do; RT doesn't mean anything in the mid-to-lower end of cards as none of them do a great job of it anyway…
 
Most will be happy if AMD brings better efficiency and performance in the sub-$500 price range. Obviously, resources must have been focused on the GPU architecture aimed at consoles because Sony and Microsoft are subsidizing the development. RT is a waste of resources anyway.

Not even the 4090 can deliver substantial RT with an adequate framerate without relying on upscaling and all kinds of tricks.

If I remember correctly, AMD was in talks with a company that specializes in ray tracing. If that is the case, it's possible that they are letting that company handle that side of the tech while they continue to work on the GPU side of things.
 
This is fine. RT should be a low priority; it matters to very few PC gamers. They should instead improve FSR to match DLSS/DLAA if possible, and FSR should be in more games.

Maybe Microsoft DirectSR will help with that.

Also, AFMF needs to be improved massively to compete with DLSS 3 Frame Gen.
 
If the mid-tier 8700XT class is stronger than the 7900XT in raster as everyone is saying, and is 25% stronger in RT than the 7700XT, and uses at most the same power as the 7700XT, and will naturally be well under $500, AMD will have a winner for sure. This would decimate the 4060 Ti, beat the 4070 Ti Super except for RT, and cost $300 less for sure. The 5060 won't be released until well into 2025 and will easily cost $550+ as Nvidia pays TSMC through the nose for a bleeding-edge node.
No one with actual knowledge is saying this.
AMD will have no high-end SKU in the Radeon 8000 series. It will be like the Radeon 5000 series, where the 5700 XT was the fastest SKU. Mid-range stuff, maybe upper mid-range.

The best SKU will have around 7900 XT performance in raster, but will use less power and be cheaper. The 7900 XT will probably still beat it in raw performance, especially when OC'ed.

"8700XT" won't decimate anything we have today. Calm down. You expect too much of a mid tier generation. 4060 Ti and 4070 Ti Super? There is day and night difference in performance between these cards. 4060 Ti is mediocre, 4070 Ti Super is pretty much high-end.

I expect "8700XT" to beat 7800 XT and 4070 Super. I don't expect "8700XT" to beat 4070 Ti Super or 7900XT.

AMD should be fully focused on regaining market share by delivering good performance per dollar without gimmicks. They lost the high-end GPU market long ago. The 4090 can't be touched, and the 5090/5080 come out soon, in around 6-9 months.

AMD will have nothing to compete with the 5090 and 5080, and probably won't have anything that even comes close before 2026+.

AMD's first-gen MCM approach failed pretty hard, really. Let's see if they go back to a monolithic design with the 8000 series.

AMD should forget about RT performance for now and spend all their R&D funds on improving FSR and AFMF to match DLSS/DLAA/Frame Gen and all those other sweet RTX features.

AMD should stop delivering half-assed features like Anti-Lag+, which got people VAC banned. AFMF is a joke as well. Even FSR has way too many issues – shimmering, artifacts, jitter – and is just far worse than DLSS overall.

RTX features are in 500+ games now, with the number rising week by week.
Most gamers today don't just look at raster perf and make a buying decision. They look at the whole picture, and AMD is far behind on features.
 
Rastafarization performance... /jk

Well... My two cents, daydreaming on my own "IMHO"... I don't know if anyone has talked about this, but the low/upper-low range seems to be where the sales numbers are, so if a 7900 scored like a 4060, that's the price tier it would sit in, no matter what FPS-boosting tech like DLSS/FSR they bake in... The perception of image quality/performance may be the reason behind 90% of GPU consumer purchases, because most buyers don't even know DLSS/FSR exist and/or can't tell them apart, and they end up going green/red team because a friend/someone "that knows about computers" told them to go RTX XYZ or RX WZX.

And then there's the low-budget, tech-savvy GPU consumer who can't afford or doesn't want to pay for anything better...

Well, I think I'm looking at this from a business perspective, but it's what every company aims for... These techs for squeezing more perceived performance out of GPUs without increasing raw performance will only get more important in the long run, because companies can profit more by delivering cheaper GPUs, even in the high-end tier, and consumers in every tier will go along with it: if they can hit the desired FPS and image quality, even us techies won't argue whether it's AI, smoke signals, or a sailor's knot... Pure rasterization performance will not be the defining metric.

Just thoughts.

Note: didn't read comments.
 