AMD's answer to GeForce's Tensor Cores may be coming with next-gen RDNA 3 architecture

Daniel Sims

In context: Tensor Cores have been one of the main advantages of Nvidia's RTX graphics cards, enabling machine learning-based image upscaling that significantly improves performance in some PC games. A recent repository update suggests AMD could bring something similar to its next GPU series.

This week, AMD updated a GitHub repository to add a matrix-based instruction set for its upcoming RDNA 3 graphics cards. It could let them perform AI-based image reconstruction similar to Nvidia's DLSS or Intel's XeSS.

Team red's current reconstruction solution, FidelityFX Super Resolution 2.0 (FSR 2.0), already lightens rendering loads effectively while maintaining image quality, and it does so without AI, which is a double-edged sword: Deep Learning Super Sampling (DLSS) generally delivers better results but requires the Tensor Cores in Nvidia's RTX cards, while FSR runs on a much wider range of hardware.

The repository update could signal a change to that situation. It adds Wave Matrix Multiply-Accumulate (WMMA) instructions to GFX11, a codename for RDNA 3. These matrix operations could enable the kind of machine learning that DLSS and XeSS employ. Known leaker Greymon55 sees the change as confirmation of AI acceleration for FSR 3.0.
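
For readers wondering what a Wave Matrix Multiply-Accumulate instruction actually computes, the core operation is a fused multiply-accumulate over small matrix tiles, roughly D = A×B + C. The NumPy sketch below is purely illustrative; the tile size, data types, and function name are assumptions for the example, not AMD's actual ISA. It only shows why such a primitive maps so directly onto neural-network inference, where layers reduce to exactly this kind of matrix math.

```python
# Illustrative only: a plain NumPy stand-in for the tile-level
# multiply-accumulate (D = A*B + C) that matrix instructions such as
# WMMA expose in hardware. Tile size, dtypes, and the function name
# are assumptions for this sketch, not AMD's ISA.
import numpy as np

def wmma_like_tile(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> np.ndarray:
    """Multiply two tiles and accumulate into a third: D = A @ B + C."""
    return a @ b + c

tile = 16
a = np.random.rand(tile, tile).astype(np.float32)  # e.g. activations
b = np.random.rand(tile, tile).astype(np.float32)  # e.g. weights
c = np.zeros((tile, tile), dtype=np.float32)       # accumulator
d = wmma_like_tile(a, b, c)
print(d.shape)  # (16, 16)
```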

Built on TSMC's 5-nanometer process, RDNA 3 promises to improve performance over AMD's RX 6000 GPUs from 2020. It will feature 50 percent better performance per watt, rearchitected compute units, and a next-generation Infinity Cache. The latest rumors predict the cards will launch between late October and mid-November.


 
I truly hate how all “tech” sites simply say “DLSS is better”, but better how?

How much better?

Better as in FSR 2.0 titles look like a PS1 game next to a current-gen game running at 4K on a 6950 XT?

Seriously, this “free” Nvidia propaganda is tiresome.
 
OT, but is a 1630 review planned, or at least an article on its launch?

You teased the new GPU in May; a follow-up and review would be nice.

If the GTX 1630 is around $190 or under, that will put it up against AMD's RX 6500 XT and RX 6400, two cards we rated poorly due to their PCIe 4.0 x4 interface, no hardware encoding, and no AV1 decode. Nvidia could do well if the GTX 1630 addresses these shortcomings.

Would be nice to see how it performs vs those two cards.
 
Why, because you have lots of Nvidia stock?
Because of too many consecutive years of Radeon playing catch-up. New ownership that could dedicate more time to it, so AMD can focus on what it is actually better at (CPUs), would be best.

Not only did they pay too much for ATI, it took them until 2022 to FINALLY get an iGPU onto mainstream desktop parts (Ryzen 7000) to compete with Intel and get into office PCs, as well as give gamers a backup GPU while they RMA their dGPUs.

Radeon has been a joke since the 9000 series.
 
I don't understand the comments here. AMD GPUs are right up there with, and sometimes faster than, Nvidia cards, and at a cheaper price point. Looking ahead, the AMD cards use less power as well.

I would have gotten a 6800 XT if I didn't already have a capable card in my system.
 
I don't understand the comments here. AMD GPUs are right up there with, and sometimes faster than, Nvidia cards, and at a cheaper price point. Looking ahead, the AMD cards use less power as well.

I would have gotten a 6800 XT if I didn't already have a capable card in my system.
Simple: Nvidia (like Intel) pays huge amounts of money to tech sites, YouTubers, and others just to keep the Nvidia name in print and have all of their proprietary tech praised as magical and superior. Add the huge number of weak-minded drones that blindly follow these sites and 'Tubers, and you get the current situation.

Just look at my previous post: all the sites do the same BS, praise after praise, but never say anything negative about them.
 
Looking at the AMD patent published a couple of years ago and at how FSR 2 operates, all that remains is to add the small neural network described in the patent.

When the patent was published, I implemented the network and tested it on static images (I trained it on almost 1 million Witcher 3 image patches, using only the color information). Although the reconstruction was inferior to a somewhat more complex network I had built before, if in an "FSR 3" AMD feeds in the temporal information and additional buffers (e.g. depth) as inputs, it will improve on the current FSR result. Being a small network, and with these instructions speeding up inference, the performance impact should not be great, keeping the cost-to-benefit ratio similar to that of FSR 2, or better (a rough sketch of a network this size follows after this post).

On the other hand, I think these instructions and the hardware will be similar to what is done in CDNA: they are not dedicated units but the ALUs in the CU optimized for the operation (basically D = A*B + C with matrices and vectors). This operation is useful for algebraic computation and for things like the convolutional layers or nonlinear gather-combination layers of CNNs.
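
To make the post above a bit more concrete, here is a minimal sketch, assuming PyTorch, of the kind of small, color-only reconstruction network being described. The layer count, channel widths, patch size, and 2x scale factor are all illustrative assumptions; this is not the network from AMD's patent nor anything shipped in FSR. The point is only that inference on a network this small reduces to the matrix and convolution multiply-accumulates the new instructions are meant to accelerate.

```python
# A tiny, hypothetical upscaling CNN: RGB patches in, 2x-upscaled RGB out.
# All sizes are arbitrary choices for illustration, not AMD's design.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self, scale: int = 2, channels: int = 3, width: int = 16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            # Expand channels so PixelShuffle can rearrange them into a
            # (scale x scale) larger spatial grid.
            nn.Conv2d(width, channels * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

# Toy usage: a batch of 64x64 RGB patches upscaled to 128x128.
net = TinyUpscaler()
patches = torch.rand(8, 3, 64, 64)
out = net(patches)
print(out.shape)  # torch.Size([8, 3, 128, 128])
```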
 
But FSR 2.0 works on GTX 1080 Tis....
So?

The very article is about how AMD themselves seem to agree that having functionality to accelerate features like these is a good idea. You know, like Nvidia has had for two generations.

Good news for AMD, then. But of course the resident AMD Jehovah community had to go and find something to whine about. Because... not positive ENOUGH. This urge is truly tiresome.
 
This urge is truly tiresome.
These volunteer marketing department types are also consistently more upset about GTX 10/16 and older series users not getting DLSS than the actual owners of said hardware ever were; it's just another 'not good enough' thing to whinge about.
Breaking news! New products come with new features. More at 11.
 
DLSS is one of those features that is a bit of a hack, disguising the inability of cards to render real-time GI, reflections, and refraction at adequate resolution. I really don't like the cloud-based training for these things either... it will need some sort of GPU-as-a-service system to be viable. And it will only work on static games... creative games where you build new and custom things will still have issues. If I were a hardware developer I would focus on RT cores for better and faster GI. Ideally we will end up with photorealistic frames where every object is dynamic... can be changed or destroyed and looks amazing...
 
To tell you the truth, I don't know much of anything about DLSS or FSR; I don't even know if I have them on. Don't really care that much. You stop noticing the tiny little details in a game when you PLAY the game. You just want things to display properly per your settings, with a fast frame rate. AMD does just fine and so does Nvidia. I'm sure for lower 1080p the Intel cards will work OK as well.

This whole argument turns into "well with X I can see 1 more pixel". I don't care.
 