It's been four years since AMD launched RDNA, the successor to the venerable GCN graphics architecture. We take a look through the tech and numbers to see just how successful it's been.
https://www.techspot.com/article/2741-four-years-of-amd-rdna/
I think their biggest problem now is the compiler. RDNA 3 CAN produce major improvements, as seen with MW2, but it almost never happens.

The issue is horrible SW support (driver bugs), and also CUDA being as dominant as it is. That is in part because of NV shenanigans, but also because of how painful the AMD counterpart is (again, SW).
Remedy is a trash developer these days. Needing DLSS for 1080p low, SMH.

It's just been announced that RX 5000/GTX 10 series cards won't be supported in Alan Wake 2 because of the lack of mesh shader compatibility.
Alan Wake 2 is one of the first games to require DX12 Ultimate's Mesh Shaders
According to Remedy's Lea-Newin, Alan Wake 2 will require graphics cards that support DX12 Ultimate's Mesh Shaders.
www.dsogaming.com
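For context on that requirement: below is a minimal sketch of how an engine might test for mesh shader support at startup through Direct3D 12's feature query. The device creation and error handling are illustrative only, not Remedy's actual code; RX 5000 and GTX 10 series cards report D3D12_MESH_SHADER_TIER_NOT_SUPPORTED here, which is why they fall below the game's stated requirements.

```cpp
// Minimal sketch: querying Direct3D 12 for mesh shader support.
// Cards without mesh shader tier 1 (e.g. RX 5000 / GTX 10 series)
// fail a check like this.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Default adapter, feature level 12.0 as a baseline.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    bool meshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))) &&
        opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;

    std::puts(meshShaders ? "Mesh shaders supported."
                          : "Mesh shaders NOT supported.");
    return meshShaders ? 0 : 1;
}
```

Built against a recent Windows SDK, this simply reports whether the adapter exposes mesh shader tier 1 or above.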
RT is the way forward for games.

I never cared about RT, I probably never will...
We read reviews and make comparisons, decide on what's best for our use cases and what we consider valuable. Anyone who isn't technical and simply asks "what is the best I can get" will only ever get one answer: "Nvidia".

Most of us couldn't care less who holds the halo GPU performance crown.
So that's it then? That's the end of game graphics progress?

We also don't care about ray tracing or path tracing. We are playing games, not producing Hollywood movies. Good enough is good enough, and most rasterized games look pretty terrific. Most of us are not going to sit in front of our PCs with a magnifying glass, analyzing every pixel.
That's really down to the Devs and their willingness to optimise, set expectation and ultimately deliver a good product. Unfortunately, as games have gotten more complicated and difficult to develop, more issues arise. Ironically for your hatred of RT, RT would make developing games substantially easier...

We just want games to play smoothly, without glitches, crashes, and lockups.
I'm not so sure about that; there's a reason Nvidia and AMD both release the high-end cards first: it's because people want the best and are willing to drop cash to get it.

Most of us couldn't care less who holds the halo GPU performance crown, because we are not willing to pay four figures for a GPU. We care about what we can get within our budgets. We also don't care about ray tracing or path tracing. We are playing games, not producing Hollywood movies. Good enough is good enough, and most rasterized games look pretty terrific. Most of us are not going to sit in front of our PCs with a magnifying glass, analyzing every pixel. We just want games to play smoothly, without glitches, crashes, and lockups. I've been running AMD GPUs for many years now and have never had a problem with a single AMD GPU I've owned, and I've owned several. If you know what you are doing, and keep your PC clean and up to date, AMD drivers work just fine.
I don't hate RT, but until it matures and will run on a $300 midrange card without cutting the framerate in half, and actually looks better - it often doesn't - I'm not interested. And that day is many years away.

We read reviews and make comparisons, decide on what's best for our use cases and what we consider valuable. Anyone who isn't technical and simply asks "what is the best I can get" will only ever get one answer: "Nvidia".
So that's it then? That's the end of game graphics progress?
That's it boys, we've done it, that's as far as game graphics will ever come, a load of fake, pretty rasterization that we've been doing for decades...
That's really down to the Devs and their willingness to optimise, set expectation and ultimately deliver a good product. Unfortunately, as games have gotten more complicated and difficult to develop, more issues arise. Ironically for your hatred of RT, RT would make developing games substantially easier...
Right, but it has to start somewhere; the GPU you're using right now probably blows away the best of the best from 10 years ago.

I don't hate RT, but until it matures and will run on a $300 midrange card without cutting the framerate in half, and actually looks better - it often doesn't - I'm not interested. And that day is many years away.
Real-time ray tracing (RT) in games is flawed from the beginning, mainly because GPUs lack the performance to effectively handle real RT, and they never will.

RT is the way forward for games.
You might not like it; interestingly, the people who don't like this fact are the loudest in the comment sections.
There's no getting around the fact that, to improve the graphics in games any further, we not only need to find better ways of lighting scenes, we also need to be able to develop games in a much more efficient manner.
RT does both of these. There are multiple interviews with developers who all agree that RT not only increases the realism of the lighting (although we got so good at faking it that the difference can sometimes be hard to notice), it is also much quicker to work with.
Before, you'd put a bulb on the roof, then spend ages getting the lighting right and making sure it interacted with everything, dynamically if required. With RT? You literally just place a bulb there, choose how bright, how big, its direction (or lack of one) and colour, and let RT do the rest. No need to bake anything in, no need to fake anything or add bogus light sources, no light bleed through walls, no glowing effects; RT just works.
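To make the "just place a bulb" point concrete, here is a toy, CPU-only sketch (not any engine's actual API) of direct lighting with a shadow ray: the light is nothing more than a position and an intensity, and shadows come from intersecting a ray against scene geometry rather than from baked data. The sphere occluder, the Lambert shading and all names here are illustrative assumptions.

```cpp
// Toy CPU sketch of the "just place a light" idea: direct lighting at a
// point comes from a shadow ray against scene geometry, not baked data.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 norm(Vec3 v) { double l = std::sqrt(dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }

struct PointLight { Vec3 pos; double intensity; };   // all the artist sets
struct Sphere     { Vec3 center; double radius; };   // an occluder

// Does the segment from 'origin' towards 'target' hit the sphere first?
static bool occluded(Vec3 origin, Vec3 target, const Sphere& s) {
    Vec3 d = norm(sub(target, origin));
    Vec3 oc = sub(origin, s.center);
    double b = dot(oc, d);
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b*b - c;
    if (disc < 0) return false;
    double t = -b - std::sqrt(disc);
    double maxT = std::sqrt(dot(sub(target, origin), sub(target, origin)));
    return t > 1e-6 && t < maxT;   // something sits between the point and the light
}

// Direct lighting at a surface point: trace one shadow ray, apply Lambert.
static double shade(Vec3 p, Vec3 n, const PointLight& light, const Sphere& blocker) {
    if (occluded(p, light.pos, blocker)) return 0.0;   // in shadow
    Vec3 toLight = sub(light.pos, p);
    double dist2 = dot(toLight, toLight);
    double cosTheta = dot(n, norm(toLight));
    if (cosTheta < 0) cosTheta = 0;
    return light.intensity * cosTheta / dist2;          // inverse-square falloff
}

int main() {
    PointLight bulb{{0, 4, 0}, 50.0};      // position + brightness, nothing baked
    Sphere blocker{{0, 2, 0}, 1.0};        // move it and the shadow follows automatically
    Vec3 floorPoint{0, 0, 0}, floorNormal{0, 1, 0};
    std::printf("lit = %f\n", shade(floorPoint, floorNormal, bulb, blocker));
    Sphere moved{{5, 2, 0}, 1.0};          // occluder moved aside
    std::printf("lit = %f\n", shade(floorPoint, floorNormal, bulb, moved));
    return 0;
}
```

Nothing about the light needed pre-computation: move the occluder and the second print shows the point lit again, which is the workflow difference the post is describing.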
Unfortunately, there's this sentiment at the moment that we're heading into a future where everything is "fake", with DLSS and FSR mainly being blamed for faking everything. That's just not true at all: game engines are now giving us the most realistic and "true" frames and pixels we've ever seen, thanks to technologies such as RT.
The problem, however, is that to run these "true" frames at a similar framerate to what we're used to in "older" or "fake" games, the GPU resources required are exponentially higher. So to mitigate this, game developers have been using tech like TAA and denoisers to mask imperfections. This has been going on since well before RT, mind you: not only have they had no choice but to "fake" lighting until now, it's also been hard enough to run on modern hardware, so they've had to find ways to lower the processing burden on GPUs.
FSR and particularly DLSS could be considered advanced implementations of TAA. Why many people have such a hatred for a technology that is better than what they were already using is beyond me.
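The "advanced TAA" framing comes down to temporal accumulation: blend each new, jittered or noisy frame into a running history so noise averages out over time. A minimal sketch of that blend is below; the blend factor, the noise model and the loop are illustrative only, and real TAA/DLSS/FSR add reprojection, history clamping and much more on top.

```cpp
// Tiny sketch of temporal accumulation, the core idea behind TAA-style
// techniques: blend each new (noisy) sample into a running history.
// The 0.1 blend factor and the noise model are illustrative only.
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> noise(0.0, 0.2);  // per-frame shading noise

    const double trueValue = 1.0;   // the "ground truth" pixel value
    const double alpha = 0.1;       // weight given to the newest frame
    double history = trueValue + noise(rng);            // first frame

    for (int frame = 1; frame <= 60; ++frame) {
        double current = trueValue + noise(rng);               // this frame's noisy sample
        history = alpha * current + (1.0 - alpha) * history;   // exponential blend
        if (frame % 15 == 0)
            std::printf("frame %2d: accumulated = %.4f (raw sample = %.4f)\n",
                        frame, history, current);
    }
    // The accumulated value sits much closer to 1.0 than any single raw sample.
    return 0;
}
```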
DLSS 3.5 Ray Reconstruction is in its first iteration; even so, it has shone a light on the amount of work denoisers are doing in games to mask and blend the scenes you see onscreen.
If RT isn't the future, what is? Continue to fake everything? To increase fidelity and lighting, give the development teams 20 years to create a single level that's very well faked?
Anyway, to bring it back to AMD: they will need to compete with Nvidia on RT performance, and I do think game engines are not optimising for RDNA. It's been pretty impressive what devs have pulled off on console (the latest Spider-Man on PS5 uses ray tracing in all modes, including the 60fps mode), but it seems less optimisation has gone into RDNA RT on desktop.
Real-time ray tracing (RT) in games is flawed from the beginning, mainly because GPUs lack the performance to effectively handle real RT, and they never will.
It already is. I don't really know how to convince someone like you, but Path Traced Cyberpunk is absolutely fantastic: on my 4090, at 1440p, I'm getting around 80fps, and it looks absolutely incredible. I don't think I've stopped and just taken in the views like this since Crysis back in 2007, and that ran way worse on my GPU at the time than Cyberpunk does today.
Also, Spider-Man 2 makes amazing use of ray tracing, and that's console only and running at 60fps.
There are some smaller games that make great use of ray tracing, like Lego Builder's Journey. It's already all possible; I could list a surprising number of games I've played to date that use RT with no problems.
AMD indeed needs to improve the drivers to utilize the dual FPU.

The issue is horrible SW support (driver bugs), and also CUDA being as dominant as it is. That is in part because of NV shenanigans, but also because of how painful the AMD counterpart is (again, SW).
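On the dual-FPU point: RDNA 3's dual-issue (VOPD) path can issue two FP32 operations at once, but only when the shader compiler finds pairs of instructions with no data dependence between them. The plain C++ below is only a conceptual illustration of the two code shapes involved, not GPU code, and whether the driver's compiler actually pairs such operations is exactly the open question in this thread.

```cpp
// Conceptual sketch only: dual-issue needs pairs of independent FP32 ops.
// This is ordinary CPU code used purely to show the two code shapes.
#include <cstdio>

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];

    // Shape 1: adjacent iterations are independent, so two multiply-adds
    // could, in principle, be co-issued as one dual-issue pair.
    for (int i = 0; i < 8; i += 2) {
        out[i]     = a[i]     * b[i]     + 1.0f;
        out[i + 1] = a[i + 1] * b[i + 1] + 1.0f;   // no dependence on out[i]
    }

    // Shape 2: a serial dependency chain; each step needs the previous
    // result, so there is nothing for a second issue slot to do.
    float acc = 0.0f;
    for (int i = 0; i < 8; ++i)
        acc = acc * b[i] + a[i];

    std::printf("out[7] = %f, acc = %f\n", out[7], acc);
    return 0;
}
```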