AMD's RDNA 4 GPUs bring major encoding and ray tracing upgrades

Scorpus

The big picture: AMD has finally revealed the specifications, pricing, and performance details of the Radeon RX 9070 and Radeon RX 9070 XT graphics cards, with full reviews expected in the coming days. While we await those, we've already discussed the potential performance implications and FSR 4 upscaling, and now we want to provide additional context on the new Radeons' improved encoding quality – an often overlooked aspect of new GPUs (including by us).

Update (Mar 5): TechSpot's Radeon RX 9070 XT review is now live.

AMD's GPU encoders have long been criticized for poor video quality at the formats and bitrates popular for game streaming, leaving Nvidia as the clear choice for anyone who relies on hardware video encoding.

With RDNA 4, AMD claims encoding quality is significantly improved, and the examples they showcased were certainly attention-grabbing. AMD is specifically highlighting 1080p H.264 and HEVC at 6 megabits per second – one of the most commonly used setups – demonstrating a substantial increase in visual quality.
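For context, the setup AMD is highlighting maps to something like the following hardware encode through ffmpeg's AMF encoder – a rough sketch, assuming an ffmpeg build with AMF support, with placeholder file names rather than AMD's actual test procedure:

```python
# Rough sketch of the 1080p / 6 Mbps H.264 scenario AMD is highlighting,
# encoded on an AMD GPU via ffmpeg's h264_amf hardware encoder.
# File names are placeholders; this is illustrative, not AMD's test setup.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "gameplay_capture.mkv",  # high-quality capture as input
    "-vf", "scale=1920:1080",      # 1080p output
    "-c:v", "h264_amf",            # AMD AMF H.264 hardware encoder
    "-b:v", "6M",                  # the 6 Mbps target bitrate
    "gameplay_6mbps.mp4",
], check=True)
```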

Whether this will hold true across a broad variety of scenarios remains to be seen, but historically, AMD's discussions on encoding quality have revolved around supporting new formats like AV1. With RDNA 4, AMD is focusing on tangible improvements in real-world use cases, suggesting they are far more confident in their encoder's quality.

AMD is touting a 25% gain in H.264 low-latency encode quality, an 11% improvement in HEVC, better AV1 encoding with B-frame support, and a 30% boost in encoding performance at 720p. These numbers likely refer to VMAF scores.
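VMAF is computed by comparing the encoded clip frame by frame against the pristine source, so claims like these can be spot-checked at home – a minimal sketch, assuming an ffmpeg build with libvmaf enabled and placeholder file names:

```python
# Minimal VMAF check: score an encoded clip against its lossless source
# using ffmpeg's libvmaf filter. Assumes ffmpeg was built with libvmaf;
# file names are placeholders.
import subprocess

result = subprocess.run(
    [
        "ffmpeg",
        "-i", "gameplay_6mbps.mp4",    # distorted (encoded) clip
        "-i", "gameplay_capture.mkv",  # pristine reference
        "-lavfi", "libvmaf",           # compute VMAF across the pair
        "-f", "null", "-",             # discard the video output
    ],
    capture_output=True,
    text=True,
)
# ffmpeg reports the aggregate VMAF score (0-100 scale) on stderr
print("\n".join(line for line in result.stderr.splitlines() if "VMAF" in line))
```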

Beyond encoding, there are several other notable improvements. The ray tracing core now features two intersection engines instead of one, doubling throughput for ray-box and ray-triangle intersections. A new ray transform block has been introduced, offloading certain aspects of ray tracing from shaders to the RT core.

The BVH (Bounding Volume Hierarchy) is now twice as wide, and numerous other enhancements have been made to the ray tracing implementation – one reason why RDNA 4's ray tracing gains exceed its rasterization improvements.
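To see why a wider BVH helps, consider that the number of levels a ray must traverse shrinks as each node holds more children – a toy illustration with made-up numbers, not AMD's actual BVH parameters:

```python
# Toy illustration: a wider BVH (more child boxes per node) makes the
# tree shallower, so each ray visits fewer nodes on the way to a hit.
# The triangle count and node widths below are hypothetical.
import math

def bvh_depth(num_triangles: int, branching_factor: int) -> int:
    """Approximate depth of a balanced BVH over num_triangles leaves."""
    return math.ceil(math.log(num_triangles, branching_factor))

for width in (4, 8):  # doubling node width cuts traversal depth
    print(f"width {width}: ~{bvh_depth(1_000_000, width)} levels for 1M triangles")
```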

Compute, Memory, and Display Enhancements

The compute engine also includes several optimizations, and the cards use a PCIe 5.0 x16 interface and a 256-bit GDDR6 memory bus. AMD claims enhanced memory compression, and the GPUs are equipped with 16GB of VRAM, which should be sufficient for most modern games.

The display engine, however, is a mixed bag. While it supports DisplayPort 2.1, its capabilities remain unchanged from RDNA 3, with a maximum bandwidth of UHBR 13.5 instead of the full UHBR 20 now used on some 4K 240Hz displays and supported by Nvidia's Blackwell architecture.
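The gap matters at the high end. A back-of-envelope calculation (rough: it ignores blanking overhead, so real requirements run somewhat higher) shows why uncompressed 4K 240Hz 10-bit video exceeds UHBR 13.5 and needs Display Stream Compression there, while fitting within UHBR 20:

```python
# Back-of-envelope DisplayPort bandwidth check (rough; ignores blanking
# intervals, so real requirements are somewhat higher).
ENCODING_EFFICIENCY = 128 / 132  # DP 2.x uses 128b/132b channel coding

def effective_gbps(lane_rate_gbps: float, lanes: int = 4) -> float:
    return lane_rate_gbps * lanes * ENCODING_EFFICIENCY

pixel_rate = 3840 * 2160 * 240        # 4K at 240 Hz
bits_needed = pixel_rate * 30 / 1e9   # 10-bit RGB = 30 bits per pixel

print(f"needed: ~{bits_needed:.1f} Gbps uncompressed")
print(f"UHBR 13.5: ~{effective_gbps(13.5):.1f} Gbps")  # ~52.4 -> needs DSC
print(f"UHBR 20:   ~{effective_gbps(20):.1f} Gbps")    # ~77.6 -> fits
```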

HDMI 2.1b is also included. On a positive note, AMD claims lower idle power consumption for multi-monitor setups, and video frame scheduling can now be offloaded to the GPU.

The Navi 48 die is 357mm² of TSMC 4nm silicon, featuring 53.9 billion transistors. This makes it 5% smaller than Nvidia's Blackwell GB203 used in the RTX 5080 and 5070 Ti, yet it contains 18% more transistors, meaning the design is more densely packed.
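A quick sanity check on the density claim, using the article's Navi 48 figures alongside commonly reported GB203 numbers (the GB203 figures are approximations from public sources, not from AMD):

```python
# Back-of-envelope transistor density comparison. Navi 48 numbers are
# from the article; GB203 numbers are widely reported approximations.
dies = {
    "Navi 48": {"area_mm2": 357, "transistors_b": 53.9},
    "GB203":   {"area_mm2": 378, "transistors_b": 45.6},  # assumed figures
}

for name, d in dies.items():
    density = d["transistors_b"] * 1000 / d["area_mm2"]  # million / mm^2
    print(f"{name}: ~{density:.0f} MTr/mm^2")
# -> Navi 48 at ~151 MTr/mm^2 vs GB203 at ~121 MTr/mm^2: roughly 25% denser
```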

However, Nvidia still holds an advantage in transistor efficiency: based on AMD's RX 9070 XT claims, the RTX 5080 is expected to be about 15% faster in rasterization and possibly over 50% faster in ray tracing, all while using fewer transistors on an only slightly larger die. That said, the TDP of the RTX 5080 is 360W, compared to 304W for the 9070 XT – an 18% higher power draw – though actual power consumption in games may vary. The 9070 XT should be closer to the RTX 5070 Ti with its 300W TDP.

A couple of additional details to round things out: AMD is not producing reference models for the RX 9070 XT or RX 9070, meaning all designs will come from board partners. These partners include ASUS, Gigabyte, PowerColor, Sapphire, XFX, and other familiar names.

Availability is expected to be strong on March 6th, as these cards have reportedly been ready since early January.

Additionally, AMD is releasing a new version of its driver-based frame generation technology, AFMF 2.1, which will be available for the Radeon RX 6000 series and newer GPUs.

This version promises superior image quality and includes Radeon Image Sharpening, which can be applied to any game, video, or application at the driver level. AMD also claims improved quality for this feature. There are also some AI-related features, though nothing particularly groundbreaking.


 
I'm a fan of 3D imaging technology. I even work in it (in a dental imaging capacity). Personally, I love real-time lighting in games. I think it's interesting and it looks awesome. I'm glad AMD are finally beginning to get with the times. I think it will help their sales a lot more than people think.

It won't be long before you can't turn ray tracing off. So, they don't really have a choice if they want to remain competitive.
 
Yeah, just a couple of the many things that aren't working quite as well as on Nvidia's platform. You'd never hear about any of it if you listened to the tiny but loud AMD community, though. They prefer to focus on talking about how much they dislike Nvidia. Over and over, at length.

Meanwhile, Nvidia customers can just assume these things work, and work well. Customer satisfaction is the main reason why they keep gaining market share.

 
Congratulations AMD!!! About time!
Can you also keep a healthy supply of cards closely priced to your MSRP?
Please and thank you?
 
Yeah, just a couple of the many things that aren't working quite as well as on Nvidia's platform. You'd never hear about any of it if you listened to the tiny but loud AMD community, though. They prefer to focus on talking about how much they dislike Nvidia. Over and over, at length.

Meanwhile, Nvidia customers can just assume these things work, and work well. Customer satisfaction is the main reason why they keep gaining market share.
It's the whole tech community – reviewers and everything. From looking at the headlines you would think that Nvidia's 50 series is a bad product. But they will be a huge success for Nvidia, and the people buying them will probably get the best frame rates with the best 3D imaging technology. And I think the driving factor behind the negativity in the tech community is envy. These people are upset because they want the latest Nvidia GPU but they aren't prepared to pay the price of ownership. It's practically entitlement. Like people feel they are owed the best GPU for a low price.

I really do think that there is a big disconnect between the tech communities and the actual people buying these products right now. And personally, I am so done with hearing complaints; I want to hear positive things about these products. I want the community to be a pleasant place again, with excitement and interest in the latest hardware. It's not like you read a car website and all they do is talk miserably about how Ferraris cost too much.
 
It's the whole tech community – reviewers and everything. From looking at the headlines you would think that Nvidia's 50 series is a bad product. But they will be a huge success for Nvidia, and the people buying them will probably get the best frame rates with the best 3D imaging technology. And I think the driving factor behind the negativity in the tech community is envy. These people are upset because they want the latest Nvidia GPU but they aren't prepared to pay the price of ownership. It's practically entitlement. Like people feel they are owed the best GPU for a low price.

I really do think that there is a big disconnect between the tech communities and the actual people buying these products right now. And personally, I am so done with hearing complaints; I want to hear positive things about these products. I want the community to be a pleasant place again, with excitement and interest in the latest hardware. It's not like you read a car website and all they do is talk miserably about how Ferraris cost too much.
Well, when you buy a Ferrari, you'd expect it to have the parts it claims to have, and for it not to try to burn down and kill you. So it's tough to be positive about a company that delivers a product with minor improvements, built like a cost-cutting exercise while costing more. I couldn't care less what brand it is, so long as it's a good product, and the 50 series simply isn't (which makes sense when gaming is now a much smaller proportion of their revenue).
 
It's the whole tech community – reviewers and everything. From looking at the headlines you would think that Nvidia's 50 series is a bad product. But they will be a huge success for Nvidia, and the people buying them will probably get the best frame rates with the best 3D imaging technology. And I think the driving factor behind the negativity in the tech community is envy. These people are upset because they want the latest Nvidia GPU but they aren't prepared to pay the price of ownership. It's practically entitlement. Like people feel they are owed the best GPU for a low price.

I really do think that there is a big disconnect between the tech communities and the actual people buying these products right now. And personally, I am so done with hearing complaints; I want to hear positive things about these products. I want the community to be a pleasant place again, with excitement and interest in the latest hardware. It's not like you read a car website and all they do is talk miserably about how Ferraris cost too much.
I have no problem with your point of view, and this is why AMD is so important.

Not everyone can afford a Ferrari. So if I can have instead a very competitive Toyota...
 
I have no problem with your point of view, and this is why AMD is so important.

Not everyone can afford a Ferrari. So if I can have instead a very competitive Toyota...
I don't know if AMD competing better would force Nvidia to lower its prices. I think it would just allow AMD to charge high prices too. I mean, look at Ryzen: the prices of the CPUs and motherboards are quite high now.

I think the days of getting bleeding-edge hardware for cheap are over, and we have to accept that. I have; I don't plan on buying these new GPUs, but I do like to read about the technology and performance they offer. I'm just bored of it all being negative because it doesn't cost the same as a flagship GPU from 15 years ago did at launch.
 
Well, when you buy a Ferrari, you'd expect it to have the parts it claims to have, and for it not to try to burn down and kill you. So it's tough to be positive about a company that delivers a product with minor improvements, built like a cost-cutting exercise while costing more. I couldn't care less what brand it is, so long as it's a good product, and the 50 series simply isn't (which makes sense when gaming is now a much smaller proportion of their revenue).
I disagree – it looks like a very good product. The benchmarks and testing indicate this very well. And demand is through the roof. I feel that the small batch of 5080s missing ROPs has been overblown a bit. A similar thing happened with the vapor chambers on the 7900 GPUs, and nobody is going around saying that product is bad.

Also, new supercars do have a lot of issues on them. It comes with the territory. Same as how the launch drivers on any new GPU will have problems.
 
I disagree – it looks like a very good product. The benchmarks and testing indicate this very well. And demand is through the roof. I feel that the small batch of 5080s missing ROPs has been overblown a bit. A similar thing happened with the vapor chambers on the 7900 GPUs, and nobody is going around saying that product is bad.

Also, new supercars do have a lot of issues on them. It comes with the territory. Same as how the launch drivers on any new GPU will have problems.

You can claim demand is through the roof, but that's only because they're hard to find. Even Nvidia has reported that inventory is low, and they claim it won't really improve for a few months after launch (so supposedly around April we should see inventory more readily available, if what Nvidia claims is true).

I don't know anyone that's actively looking for an RTX 5000 card, nor anything from AMD for that matter. Most folks already have a high-ish end Ampere/Ada card or RDNA2/3 and feel no need to upgrade this gen, because of the lack of improvement versus last gen plus the blatant price gouging going on right now.

That clearly doesn't mean there isn't demand, but the demand seems lower than expected compared to last gen. When the 4080 (I mean, 4070) dropped, there were dozens and dozens of them on the shelves of my local Micro Center, but no one wanted them, and the glut of inventory sat for months before they started to trickle out. This time around, 5080s are few and far between in inventory from what I've learned from my Micro Center, so the few that come in tend to sell fairly quickly, making it feel and look like demand is high. Unfortunately, with such low inventory it's hard to substantiate the claim that demand is high. If the same number of 5080s hit the shelves as did 4080s, I'd venture to guess they'd sit there for months.
 
It's good AMD is focusing on this other stuff, as maybe a good few people buy for extra features like encoding.

I always find supermarkets in NZ somewhat strange in their data analysis, using myself as N=1.

When they remove a favourite product because sales are poor – especially a premium one – it sometimes feels like really stupid one-dimensional thinking: low sales, shelf space is important, off it goes. Yet those shoppers may be big spenders. A $400 shopper would change supermarkets for the nappies their baby likes, or the pet food their pet prefers.

As more things move to commodity status, those extras count.
But if rasterization is what you want, AMD generally makes sense – unless you want an RTX **90.
 
Meanwhile, Nvidia customers can just assume these things work, and work well. Customer satisfaction is the main reason why they keep gaining market share.
What things?
The power connector? Nope, might melt.
The drivers? Better make sure to download the version that fixes the black screens.
Performance? Double-check you got all your ROPs, and don't play any titles using 32-bit PhysX if you got the latest and 'greatest'.

Oh, you meant video encoding – oh right, that just works... Except that when my gf used it on a GTX 1060, and now uses it on an RTX 3060, she can't decode video at the same time; it stutters and freezes. So no instant replay whilst having a video on the second screen.
 
What things?
The power connector? Nope, might melt.
The drivers? Better make sure to download the version that fixes the black screens.
Performance? Double-check you got all your ROPs, and don't play any titles using 32-bit PhysX if you got the latest and 'greatest'.

Bravo, you hit all the talking points. I bet you watch all the sensationalist YouTube clickbaiters.

What you fail to understand, or acknowledge, is that literally none of these things affect or matter to the vast, vast majority of Nvidia owners. Whereas the structural problems with the AMD software stack affect a lot of their customers. Insofar as you can call a large percentage of the few remaining AMD GPU faithful 'a lot'.
 
Bravo, you hit all the talking points. I bet you watch all the sensationalist YouTube clickbaiters.

What you fail to understand, or acknowledge, is that literally none of these things affect or matter to the vast, vast majority of Nvidia owners. Whereas the structural problems with the AMD software stack affect a lot of their customers. Insofar as you can call a large percentage of the few remaining AMD GPU faithful 'a lot'.

Bravo, you've made two posts now saying AMD has serious problems without pointing out a single specific one. Yet when people point out very specific problems with Nvidia, you just call them irrelevant and cherry-picked.

I'm not saying AMD is perfect, but I haven't run into any of your unspecified problems, while every Nvidia card I've had after my Riva TNT2 has died within two weeks.

So unless you start getting specific, you'll keep seeming like an Nvidia fanboy with no real arguments to me.
 
Yeah. Let's keep it honest. The AMD hive mind is reduced to making up stuff now.

It's almost like the internet encourages polarised views…
I currently own an AMD 7800 XT. Prior to this, over 25 years of gaming, I have owned an even mix of Nvidia and AMD cards, changing every 2-3 years.
Over 25 years I have not had a single significant issue with either manufacturer.
 
Bravo, you hit all the talking points. I bet you watch all the sensationalist YouTube clickbaiters.
If you mean Gamers Nexus, Level1Techs, Hardware Unboxed, Dr Ian Cutress, LTT, Tech Yes City and KitGuru – then yes, I do.
What you fail to understand, or acknowledge, is that literally none of these things affect or matter to the vast, vast majority of Nvidia owners. Whereas the structural problems with the AMD software stack affect a lot of their customers. Insofar as you can call a large percentage of the few remaining AMD GPU faithful 'a lot'.
Just gave you an example of something that's affecting my gf right now. Frustratingly, she keeps having to toggle instant replay on and off depending on whether she wants to watch a video on her RTX 3060.
The video encoding quality on my RX 6700 XT might be worse, but at least it's usable, and I have YouTube playing during the majority of my game time.

The black screens were affecting several of my friends (fixed now, thankfully). Everyone who wants to play, say, the old Arkham games is affected by the PhysX thing. And I'll recommend those games over Suicide Squad any day (sadly I'm one of the few people that tried the Suicide Squad game – what a taint on the legacy).

Thankfully the melting cables don't affect many, but the way NVIDIA handled it is awful. First they claim there's no problem, it's user error. Then they claim they solved it with the RTX 5000 series, just for it to still happen.

It's not like I'm anti-NVIDIA – hell, I told my own girlfriend to get an RTX 3060 when a decently priced one popped up at the end of the mining bubble. I've advised several people to get them. I flip-flop between NVIDIA and AMD myself, although admittedly my last 3 were from AMD.
I'm not a fan of DLSS on full HD/WQHD, and I don't use CUDA, so those features add no value. I'd rather get more VRAM, so with my last choice being between an RTX 3060 Ti (8 GB) and an RX 6700 XT (12 GB), I went with AMD again.

Personally I prefer AMD's software as well. NVIDIA made a big leap forward here with the NVIDIA app but is still missing some features. With NVIDIA I always ended up installing MSI Afterburner for tuning, RivaTuner for monitoring, and the manufacturer's app for fan control. With AMD I just use the Adrenalin software.
I don't know if NVIDIA has added it since I last checked, but another feature I like with AMD is that you can set the instant replay cache to RAM. Saves a lot of write operations to the SSD, prolonging its life.
 
Frustratingly, she keeps having to toggle instant replay on and off depending on whether she wants to watch a video on her RTX 3060.
So sorry, what's the issue she's having? My gf has a 3060 Ti, usually has Plex running on one screen watching a TV show, then on the other screen she'll be playing Hogwarts or Sackboy or a Sims game. She hasn't mentioned any issues to me. I'm just intrigued if she does have an issue and simply hasn't noticed it!
 
New GPU generations are a plus for all gamers when we're in the market for a new gaming system. I think, though, that far too many of us upgrade when we don't really gain anything (if you are stuck on a 1080p monitor, get a better monitor before upgrading – a new card may make for slightly faster FPS, but your monitor won't show it and let you appreciate it).
 
So sorry, what's the issue she's having? My gf has a 3060 Ti, usually has Plex running on one screen watching a TV show, then on the other screen she'll be playing Hogwarts or Sackboy or a Sims game. She hasn't mentioned any issues to me. I'm just intrigued if she does have an issue and simply hasn't noticed it!
Videos here are completely unwatchable whilst playing, for example, Lost Ark or Blade and Soul. Windowed fullscreen game on one monitor, YouTube on the other. Not even super-intensive games, and on low-res 1080p screens.
Tried a Windows reinstall, tried different browsers, upgraded from a 1060 to a 3060, and the problem always remains. It's not just YouTube either; the same thing happens with PotPlayer. The audio keeps playing but the video either gets one frame every few seconds or completely freezes. Disable instant replay and the videos play fine.
My guess is that NVIDIA's media encoder and decoder run on the same part of the chip and it gets bogged down trying to do both simultaneously, but that's just a guess.
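One rough way to test that guess, assuming nvidia-smi is installed, is to watch the encoder/decoder utilization counters while the stutter reproduces:

```python
# Rough sanity check of the "encoder and decoder fight over the same
# block" guess: poll NVENC/NVDEC utilization via nvidia-smi while the
# stutter reproduces. Assumes nvidia-smi is on PATH.
import subprocess
import time

for _ in range(10):
    report = subprocess.run(
        ["nvidia-smi", "-q", "-d", "UTILIZATION"],
        capture_output=True, text=True,
    ).stdout
    # Keep only the encoder/decoder utilization lines from the report
    print([line.strip() for line in report.splitlines()
           if "Encoder" in line or "Decoder" in line])
    time.sleep(1)
```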
 
Looking at fluffy marketing from big tech companies like AMD, Intel, and Nvidia, I would wait and see how it actually performs in independent testing before drawing any conclusions. Every one of them has claimed "greatness" for products that turned out to be duds. Not too long ago, it was Intel's Arrow Lake, which they admitted delivered "unexpectedly" lower performance compared to their internal testing. To this day, I don't believe they have addressed or bridged that gap between actual performance and their expectations. AMD: RDNA 3 was supposed to be a great step up from RDNA 2, and we know that did not really happen. Nvidia: Jensen's famous quote, "RTX 5070 performs like a RTX 4090". Only 3 letters to respond to his bold statement: LOL.

Back to RDNA 4, I feel the product is really like AMD's knee-jerk reaction after getting caught by Intel. AMD knows they can't gun for Nvidia at this point, but Intel's GPUs have already overtaken them in terms of AI upscaling and RT performance, which has become a clear threat to them.
 
AMD: RDNA 3 was supposed to be a great step up from RDNA 2, and we know that did not really happen.
To be fair, going by the rumor channels that have contacts within AMD, it did seem that AMD believed this themselves. Seems to have been an "if we can fix this one thing it'll be great" situation, but they were never able to fix it.

Nvidia: Jensen's famous quote, "RTX 5070 performs like a RTX 4090". Only 3 letters to respond to his bold statement: LOL.
I don't think anyone besides the biggest NVIDIA fans believed that one. Jensen has been sipping the AI Kool-Aid for far too long.

Back to RDNA 4, I feel the product is really like AMD's knee-jerk reaction after getting caught by Intel. AMD knows they can't gun for Nvidia at this point, but Intel's GPUs have already overtaken them in terms of AI upscaling and RT performance, which has become a clear threat to them.
IMO that's just how the market works. Ray tracing until recently was a nice-to-have, not a need-to-have. The first games requiring it, rather than offering it as a toggle, are only now coming out.
Doing it better can be done by dedicating more die space to it – something AMD was reluctant to do, because at the same raster performance level that means a bigger and more expensive chip. Now that games are starting to require it, AMD has simply dedicated more die space to it, and it's likely to have a performance hit more similar to that of NVIDIA*. Wait and see; they haven't made any bold claims yet.

I don't think Intel came into the equation much at all. Their first gen had almost no demand (happens when you sell a gaming card with drivers that might not let you run a decent number of games). The second gen has the opposite problem – Intel isn't meeting demand and is barely producing any.

* Third-party benchmarks are in now; it looks like ray tracing isn't quite at NVIDIA levels yet, but it's not massively behind anymore.
---

AMD's GPU marketing branch has always been incredibly stupid and arrogant (anyone remember "poor Volta"?). I'd even say it's so inept and counterproductive that if they got rid of it entirely, sales wouldn't be hurt one bit. If I were in charge I'd replace the entire branch.

AMD's track record when it comes to presenting numbers and examples, however, is solid. Third-party testers pretty much always confirm the same numbers without doing anything strange (their own numbers have even underreported CPU performance quite a few times).

Intel is the king of creating weird conditions to make themselves look good, and in very dubious ways.
E.g., anyone remember the Principled Technologies scandal – creating an unrealistic, weird environment to run the competition's tests in?
Or not disclosing they were using a water chiller?
Or, one of the most telling signs for me that they tend to fudge numbers: hiring snake oil salesman Ryan Shrout.
NVIDIA tends to lean hard into showing numbers for titles using whatever they optimized their technology for (tessellation, ray tracing – that kind of thing), and in more recent times pushing the numbers by making sure frame generation and DLSS are used. Not as bad as Intel, but it definitely skews the numbers their way.

I agree that it's best to wait for third parties to test things, but I'd put money on it being a big upgrade.
 