4 Years of AMD RDNA: Another Zen or a New Bulldozer?

The issue is horrible SW support (driver bugs), and also CUDA being as dominant as it is. That is in part because of NV shenanigans, but also because of how painful the AMD counterpart is (again, SW).
 
RDNA was a major step up from GCN in every way. RDNA 2 was like the good old days: good scaling, fantastic raster, some RT capability. The 6900 XT going toe to toe with the 3090 was a sight to behold.

RDNA 3 is a mixed bag. The power efficiency compared to RDNA 2, for the monolithic chips, is decent. The 7900 XTX is a great chip, and so is the 7900 XT, but AMD's pricing scheme totally screws with their value. Had the 7900 XT been an $800 product and the 7700 XT a $350 product (and had the 7700/7800 come out much earlier), I think RDNA 3 would have been received better. Combine the timing and pricing with efficiency losses compared to Ada and AMD's stagnating RT performance, and RDNA 3 has become somewhat glossed over.

Hopefully they do better with RDNA 4.
The issue is horrible SW support (driver bugs), and also CUDA being as dominant as it is. That is in part because of NV shenanigans, but also because of how painful the AMD counterpart is (again, SW).
I think their biggest problem now is the compiler. RDNA 3 CAN produce major improvements, as seen with MW2, but it almost never happens.
 
"TSMC's N5 process node might have a logic density that's, say, 20% higher than N7"

5nm is up to 50% denser than 7nm...
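
How much denser depends on what you measure: logic tends to shrink far more than SRAM and analog, so whole-die gains land below the headline logic figure. A rough back-of-the-envelope sketch of that mix (the area fractions and scaling factors below are illustrative assumptions, not vendor data):

```python
# Rough sketch: whole-die density gain from per-block scaling factors.
# All numbers are illustrative assumptions, not measured values.

def effective_density_gain(area_fractions, scale_factors):
    """area_fractions: share of the old die used by each block type.
    scale_factors: how much denser each block type gets on the new node.
    Returns the overall density improvement for the whole die."""
    new_area = sum(f / s for f, s in zip(area_fractions, scale_factors))
    return 1.0 / new_area  # old total area is 1.0 by construction

# Hypothetical GPU die: 60% logic, 30% SRAM, 10% analog/IO.
fractions = [0.60, 0.30, 0.10]
# Assumed N7 -> N5 scaling: logic shrinks well, SRAM less, analog barely.
scales = [1.8, 1.3, 1.05]

print(f"Whole-die density gain: {effective_density_gain(fractions, scales):.2f}x")
# With these assumptions the result comes out around 1.5x, even though only
# the logic portion sees the full headline shrink.
```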

About RDNA, I think that overall it was a great advance, but it requires more investment in software. Note that if the capabilities of RDNA 3 were being used properly, the 7900 XTX would be much closer to the 4090. The exclusivity that Nvidia has over GDDR6X doesn't help much either, as AMD needs larger caches to alleviate the bandwidth deficiency. Samsung's faster memories could solve this disadvantage; I hope they are ready in time.
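
To illustrate the cache-versus-bandwidth trade-off with a minimal sketch: a large on-die cache effectively multiplies the bandwidth the memory bus appears to deliver, since hits never touch VRAM. The hit rates and bus figures below are assumptions for illustration, not AMD's published numbers:

```python
# Back-of-the-envelope sketch: how a large last-level cache stands in for raw
# VRAM bandwidth. The hit rates and bus spec below are assumptions.

def effective_bandwidth(vram_bw_gbs, cache_hit_rate):
    """Requests served from on-die cache never touch VRAM, so the bus only
    carries the misses; the bandwidth shaders 'see' scales by 1/(1 - hit rate)."""
    return vram_bw_gbs / (1.0 - cache_hit_rate)

# Hypothetical GDDR6 card: 256-bit bus at 20 Gbps -> 640 GB/s raw.
raw_bw = 640
for hit_rate in (0.0, 0.4, 0.6):
    print(f"hit rate {hit_rate:.0%}: ~{effective_bandwidth(raw_bw, hit_rate):.0f} GB/s effective")
# A 60% hit rate makes a 640 GB/s bus behave more like ~1600 GB/s, which is
# roughly the role a big cache plays against faster (e.g. GDDR6X) memory.
```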

I never cared about RT, I probably never will. In the few games where it actually makes a visible difference, RT makes the 4090 drop to 20-30 fps. So several layers of paraphernalia were created, such as fake frames and upscaling, to pretend that it is running as it should, but it is simply a waste of energy and die space. I think whoever bought Turing believing the promises would agree with this.

 
RDNA 2 gave me a Zen vibe.
But RDNA 1 and RDNA 3 are full-on Bulldozer vibes.
RDNA 1 did not even support DX12 fully. Even the RTX 2000 series, which came out a year earlier, had full DX12 support.
RDNA 3 is a complete mess. The 7900 XTX competes with the 4070 Ti at max settings in 4K.
Not to mention most AMD features are lackluster and full of problems.
AMD has finally started shrinking the Radeon group with lots of layoffs.

AMD will gradually leave the GPU market, as the competition is too much here. You have to innovate and offer stable solutions, which AMD cannot do.

They silently left the laptop GPU market. Next is the high-end GPU market. Then they'll leave fully.

They will make console SoCs and APUs in the future, unless MS and Sony move to Intel or Nvidia.
With the Switch 1 selling more than the Xbox Series X/S and PS5 combined, the future is not that bright.
 
I never cared about RT, I probably never will...
RT is the way forward for games.
You might not like it; interestingly, the people who don't like this fact are the loudest in the comment sections.

There's no getting around the fact that to improve the graphics in games any further, we not only need to find better ways of lighting scenes, we also need to be able to develop games in a much more efficient manner.

RT does both of these. There are multiple interviews across multiple developers who all agree that RT not only increases the realism of the lighting (although we got so good at faking it that the difference can be hard to notice sometimes), it is also much quicker to work with.

Before, you'd put a bulb on the roof, then spend ages getting the lighting right and making sure it interacted with everything, and dynamically if required. With RT? You literally just place a bulb there, choose how bright, how big, the direction (or lack of one) and the colour, and let RT do the rest. No need to bake anything in, no need to fake anything or add bogus light sources, no light bleeding through walls, no fake glow effects; RT just works.
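
For what it's worth, the "place a bulb and let RT do the rest" workflow boils down to the renderer answering a visibility question per shaded point at runtime instead of reading pre-baked data. A toy sketch of direct lighting from one point light; the scene, the hard shadows, and the helper functions are all simplifying assumptions for illustration, not any engine's actual API:

```python
# Toy sketch of direct lighting via a shadow ray: nothing is baked, the light's
# contribution is computed from visibility, a cosine term, and falloff.
# Purely illustrative; real engines trace against full scene geometry (BVHs).
import math

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def length(v): return math.sqrt(dot(v, v))
def normalize(v):
    n = length(v); return [c / n for c in v]

def ray_hits_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a normalized ray, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def shade(point, normal, light_pos, light_intensity, blockers):
    """Direct lighting at 'point': cast a shadow ray toward the light; if any
    blocker is hit first, the point is in shadow. Otherwise apply the Lambert
    cosine term and inverse-square falloff -- no light bleeds through geometry."""
    to_light = sub(light_pos, point)
    dist = length(to_light)
    direction = normalize(to_light)
    for center, radius in blockers:
        t = ray_hits_sphere(point, direction, center, radius)
        if t is not None and t < dist:
            return 0.0                       # occluded: hard shadow
    return light_intensity * max(0.0, dot(normal, direction)) / (dist * dist)

# A floor point under a "bulb" at (0, 4, 0), with a sphere hanging in between.
bulb_pos, bulb_intensity = [0.0, 4.0, 0.0], 40.0
sphere = ([0.0, 2.0, 0.0], 1.0)
print(shade([0.0, 0.0, 0.0], [0.0, 1.0, 0.0], bulb_pos, bulb_intensity, [sphere]))  # shadowed -> 0.0
print(shade([3.0, 0.0, 0.0], [0.0, 1.0, 0.0], bulb_pos, bulb_intensity, [sphere]))  # lit -> ~1.28
```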

Unfortunately, there's this sentiment at the moment that we're heading into a future where everything is "fake", mainly with DLSS and FSR being blamed for faking everything. That's just not true at all; game engines are now giving us the most realistic and "true" frames and pixels we've ever seen, thanks to technologies such as RT.

The problem, however, is that to run these "true" frames at a framerate similar to what we're used to in "older" or "fake" games, the GPU resources required are vastly higher. So to mitigate this, game developers have been using tech like TAA and denoisers to mask imperfections. This has been going on since way before RT, mind you: not only have they had no choice but to "fake" lighting until now, it has been hard enough to run on modern hardware that they've had to find ways to lower the processing burden on GPUs.
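
As a rough illustration of what that temporal accumulation is doing (a minimal sketch of the general idea, not how any specific TAA, DLSS, or denoiser implementation works):

```python
# Minimal sketch of temporal accumulation, the core idea behind TAA and many
# real-time denoisers: blend each new, noisy sample into a running history.
# Illustrative only; real implementations also reproject and reject history.
import random

def accumulate(history, new_sample, alpha=0.1):
    """Exponential moving average: keep (1 - alpha) of the history and take
    alpha of the new frame. Smaller alpha = smoother but slower to react."""
    return (1.0 - alpha) * history + alpha * new_sample

random.seed(0)
true_value = 0.5                                    # 'correct' brightness of one pixel
history = true_value + random.uniform(-0.4, 0.4)    # first frame: one noisy sample

for frame in range(60):
    noisy = true_value + random.uniform(-0.4, 0.4)  # one noisy sample per frame
    history = accumulate(history, noisy)

print(f"accumulated pixel after 60 frames: {history:.3f} (target {true_value})")
# Individual samples bounce around by +/-0.4, but the accumulated value sits
# close to 0.5. The flip side is ghosting: when the scene changes, the stale
# history has to be rejected or reweighted, which is where artifacts come from.
```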

FSR and particularly DLSS could be considered advanced implementations of TAA. Why many people have such a hatred for a technology that is better than what they were already using is beyond me.

DLSS 3.5 Ray Reconstruction is in its first iteration; even so, it has shone a light on the amount of work denoisers are doing in games to mask and blend the scenes you see onscreen.

If RT isn't the future, what is? Continuing to fake everything? To increase fidelity and lighting, do we give development teams 20 years to create a single level that's very well faked?

Anyway, to bring it back to AMD: they will need to compete with Nvidia on RT performance. I do think game engines are not optimising for RDNA. It's been pretty impressive what devs have pulled off on console (the latest Spider-Man on PS5 uses ray tracing in all modes, including the 60 fps mode), but it seems that on desktop, less optimisation has gone into RDNA RT.
 
Most of us couldn't care less who holds the halo GPU performance crown, because we are not willing to pay four figures for a GPU. We care about what we can get within our budgets. We also don't care about ray tracing or path tracing. We are playing games: not producing Hollywood movies. Good enough is good enough, and most rasterized games look pretty terrific. Most of us are not going to sit in front of our PCs with a magnifying glass, analyzing every pixel. We just want games to play smoothly, without glitches, crashes, and lockups. I've been running AMD GPUs for many years now and have never had a problem with a single AMD GPU I've owned, and I've owned several. If you know what you are doing, and keep your PC clean and up to date, AMD drivers work just fine.
 
Most of us couldn't care less who holds the halo GPU performance crown.
We read reviews, make comparisons, and decide what's best for our use cases and what we consider valuable. Anyone who isn't technical and simply asks "what is the best I can get?" will only ever get one answer: "Nvidia".
We also don't care about ray tracing or path tracing. We are playing games: not producing Hollywood movies. Good enough is good enough, and most rasterized games look pretty terrific. Most of us are not going to sit in front of our PCs with a magnifying glass, analyzing every pixel.
So that's it then? That's the end of game graphics progress?
That's it boys, we've done it, that's as far as game graphics will ever come, a load of fake, pretty rasterization that we've been doing for decades...
We just want games to play smoothly, without glitches, crashes, and lockups.
That's really down to the devs and their willingness to optimise, set expectations, and ultimately deliver a good product. Unfortunately, as games have gotten more complicated and difficult to develop, more issues arise. Ironically, given your hatred of RT, RT would make developing games substantially easier...
 
Most of us couldn't care less who holds the halo GPU performance crown, because we are not willing to pay four figures for a GPU...
I'm not so sure about that. There's a reason Nvidia and AMD both release the high-end cards first: people want the best and are willing to drop cash to get it.

I tried AMD and replaced my 2070 Super with a 6800 XT, but their driver/software wasn't up to snuff, so I sent that card back and ended up with a 4080. It just worked, and it has been working and pushing every game with every bell and whistle enabled at 4K.

IMO, if you're on PC you want to see every game in its ultimate form, and that's expensive to do; right now Nvidia is doing it best. If I'm gonna burn more than a grand on one component to play vidya games, then it had better just do it.
 
We read reviews, make comparisons, and decide what's best for our use cases and what we consider valuable. Anyone who isn't technical and simply asks "what is the best I can get?" will only ever get one answer: "Nvidia".

So that's it then? That's the end of game graphics progress?
That's it boys, we've done it, that's as far as game graphics will ever come, a load of fake, pretty rasterization that we've been doing for decades...

That's really down to the devs and their willingness to optimise, set expectations, and ultimately deliver a good product. Unfortunately, as games have gotten more complicated and difficult to develop, more issues arise. Ironically, given your hatred of RT, RT would make developing games substantially easier...
I don't hate RT, but until it matures, runs on a $300 midrange card without cutting the framerate in half, and actually looks better (it often doesn't), I'm not interested. And that day is many years away.
 
I don't hate RT, but until it matures, runs on a $300 midrange card without cutting the framerate in half, and actually looks better (it often doesn't), I'm not interested. And that day is many years away.
Right, but it has to start somewhere. The GPU you're using right now probably blows away the best of the best from 10 years ago.

Rasterization performance might not be improving very quickly anymore, but RT is improving massively every generation. That all trickles down to the lower price segments over time.

Imagine your attitude when we started transitioning from 2D to 3D games: "I'll wait until it's mature and I can get into it for very little money." Would you argue 3D still isn't mature today, since there are plenty of glitches and issues with it?
 
RT is the way forward for games...
Real-time ray tracing (RT) in games is flawed from the beginning, mainly because GPUs lack the performance to effectively handle real RT, and they never will.

We've reached a point where there are no longer sufficient advancements in lithography to enable the progress needed to make RT practical and accessible for all. Manufacturing processes have become so complex that increasing density by just a few percent incurs twice the capital cost, and the caches that consume more and more space on the GPU die no longer yield significant practical improvements.

And in this scenario, Nvidia used its influence to drag the market into the illusion that RT is the future. I tell you, I would trade the ability to run RT on my GPU for 20-30% more performance, or a commensurate discount, any day. People care so much about RT that the Nintendo Switch, with simple graphics, no RT, and low resolution, sells more than all the other consoles combined.

Well... in short, GPUs are going to get more and more expensive in the quest to make RT in games possible.
 
It would be very unrealistic to think that AMD is in a position to compete with NVIDIA in the top GPU segment; they can only keep challenging. I have made up my mind: if the 8800 XT can outperform the 4080 and remain in the 500-700 euro price range, it will be my last GPU for 1080p and 1440p.
 
Fantastic article as usual, Nick. Just a great overview of the evolution of AMD's graphics architecture over time.

Unsurprisingly, I feel the conclusion is too charitable. Since the introduction of RDNA, AMD has witnessed a serious decline in its discrete graphics market share. It may not have been great to begin with, but RDNA managed to effectively cut it in half somehow between 2018 and the present. There is no 4D chess here. That is failure.

AMD's GPU efforts are now largely kept afloat by one and a half major players in the console market. Which brings reliable volume, certainly, but also leaves AMD wholly at the mercy of Sony and Microsoft, with next to no room to improve profit. It is not a great position to be in, as Nvidia for one recalls all too clearly from its own forays in the past.

And sadly, AMD just does not seem to extract much benefit from being in both the console and PC markets. Effort spent on the first somehow never seems to bring much improvement in the dire state of AMD's PC software ecosystem, perpetuating the perception of Radeon as the cheap and nasty also-ran.
 
I'm watching the kids play Spider-Man 2 on the PS5 while I write this. It looks absolutely incredible, a showcase for what AMD GPU hardware can do.

There was huge interest and hype ahead of the 7900 launch, fueled also by enormous frustration with NVIDIA's shenanigans. One gets the sense there's an open goal just waiting for AMD to take it.

Firstly, there's nothing wrong with having a slower product if the price is right, so stop chasing margin. Second, plan to release the full product stack from the outset. The fact that NVIDIA won't is a massive opportunity that goes unexploited. Lastly, get the PC software support in order. Every developer under the sun is working with RDNA because of the consoles. The software experience with this architecture should be incredible across the board.
 
Do we "really" need a upper high end card at a 1500$ pricetag? Or do we need cards that can compete at mid-end for around 600$ and provide a good 4K or excellent 1440p experience?

I know! That's why I bought a 6700XT as a replacement for my RX580. 2.5x as fast with half the power requirement. And on top of that unlocked OC'ing by using MPT (Morepowertools) which is excluded on the 7x00 series completely.

Compute is where the money is at. If AMD manages to design a qualifying compute chip (which is has) and in return also provides software eco system support, they have a winner.

 
Real-time ray tracing (RT) in games is flawed from the beginning, mainly because GPUs lack the performance to effectively handle real RT, and they never will.
It already is. I don't really know how to convince someone like you, but path-traced Cyberpunk is absolutely fantastic. On my 4090, at 1440p, I'm getting around 80 fps and it looks absolutely incredible. I don't think I've stopped and just taken in the views like this since Crysis back in 2007, and that ran way worse on my GPU at the time than Cyberpunk does today.

Also, Spider-Man 2 makes amazing use of ray tracing, and that's console-only and running at 60 fps.

There are some smaller games that make great use of ray tracing, like LEGO Builder's Journey. It's already all possible; I could list a surprising number of games I've played to date that use RT with no problems.
 
I am never going back to Nvidia... simple as that. You get more FPS for what you pay than with the competition, and AMD has the console market.

Starfield is a good example of what to expect in the future. The PS5 Pro is rumored to use RDNA 3 as well; it will be hard for Nvidia to go against an ecosystem it is absent from.
 
It already is. I don't really know how to convince someone like you, but path-traced Cyberpunk is absolutely fantastic...

This is where I am laughing...

Going from 1440p to 2160p is a bigger image fidelity uplift than stupid RT.

Using DLSS is even more ridiculous, because the downgrade in image fidelity is worse than the uplift from RT.

Not to mention that you paid for a 2160p GPU, not a 1440p GPU. At 1440p, the 4090 is barely 15% faster than an XTX.

Lastly, Sony is using AMD hardware and creates the most stunning games in the business. I still can't believe that TLOUP1 on PC is a PS4 game. It looks and runs amazingly. I run the game at 70-75 FPS at 2160p, maxed, on an XTX.

Not to mention that God of War Ragnarok, the best-looking game to date, DOES NOT use ray tracing.

The water in TLOUP2 during the harbor crossing in a storm is absolutely breathtaking.

RTX was a gimmick for Nvidia to paint AMD into a corner, and they succeeded. Jensen is the best at doing those things; however, it was built on a pack of lies.
 
The issue is horrible SW support (driver bugs), and also CUDA being as dominant as it is. That is in part because of NV shenanigans, but also because of how painful the AMD counterpart is (again, SW).
AMD indeed needs to improve the drivers to utilize the dual FPUs.
Nvidia Ampere, the first with dual FPUs, was a gas guzzler too (though Samsung's 8nm node also contributed to the cause).
 
These days I'm reading "The History of the GPU" by Jon Peddie; I'm halfway through the second book. It has helped me fill in some gaps that aren't even covered on Wikipedia.
 
AMD should focus on being the best value-for-money CPU & GPU maker for mainstream consumers by building upon the energy efficiency improvements, chiplet architecture [cost reductions], generous device driver support & features, and competitive pricing of the Zen 1+, Zen 2, and Zen 3 CPUs and the Polaris [Radeon RX 400/500] and RDNA 1 [Radeon RX 5000] GPUs.

The Ryzen 7 2700X & X470 motherboards were hot-selling & highly acclaimed at the time of their release/availability because of their competitive pricing & power efficiency, despite being slower than Intel's Core i9-9900K & Z390. The Radeon RX 480 was also praised for matching the performance of the GeForce GTX 980 while consuming significantly less power, producing significantly less heat, and being more affordable as well at USD 300. The same commendations apply to the Radeon RX 5700 XT, which matched the performance of the GeForce GTX 1080 Ti while consuming less power, producing less heat, and being cheaper as well at USD 400. The Ryzen 7 5800X is currently the new value-king CPU in place of the 2700X because of significant price drops, power efficiency, competitive performance [via 19-20% IPC improvements], and the new Curve Optimizer [automated undervolting process].

For RDNA 4, I am wishing for competitive performance & pricing at around USD 350-500 while consuming 150-200W [with lower cooling requirements as well, which will further reduce costs]. For Zen 5, I am wishing for competitive performance while also drawing less power & producing less heat.

It will surely boost their revenue [economy of scale from sales], because end users will be enticed to upgrade since they will no longer need to replace their existing PSU & cooling systems [case, CPU cooler, auxiliary fans] to accommodate future AMD CPUs & GPUs. Plus, it's also more sustainable for the environment, since these will reduce the demand for fossil fuels/coal and the emission of greenhouse gases, alongside the ongoing transition towards renewable energy sources and nuclear power.
 