AMD's answer to GeForce's Tensor Cores may be coming with next-gen RDNA3 architecture

I think it is great that some graphic card buyers simply don't care about technology at all. Undemanding customers have helped keep budget choices like AMD around for all of us to enjoy.
:) You don't know me very well. I care a LOT about tech, I just don't beat a dead horse over a pixel. I have one of the fastest PCs money can buy, but not the fastest video card. Still happy with my 2080 Super.
 
So?

The very article is about how AMD themselves seem to agree that having hardware to accelerate features like these is a good idea. You know, like Nvidia has had for two generations.

Good news for AMD, then. But of course the resident AMD Jehovah's Witnesses had to go and find something to whine about. Because... it's not positive ENOUGH. This urge is truly tiresome.

Sad that you think Nvidia created upscaling or machine learning.

All they did was pay game developers large sums of money to use their tech exclusively... otherwise consoles would have had ray-traced Cyberpunk a lot sooner, but again, Nvidia paid for exclusivity like they have for many of their DLSS and RTX titles.

Today though, almost all game developers are on the consoles' RDNA bandwagon, and AMD is leading the industry forward.

Nearly everyone can see this, and I have a slew of EVGA cards at home. I am just not a blind lemming.
 
Sad that you think Nvidia created upscaling or machine learning.

I do?

All they did was pay game developers large sums of money to use their tech exclusively... otherwise consoles would have had ray-traced Cyberpunk a lot sooner, but again, Nvidia paid for exclusivity like they have for many of their DLSS and RTX titles.

Hmm. Well, an interesting opinion, but more relevant to this topic is that they spent some of their transistor budget on acceleration features, and AMD seems to be following suit.

Today though, almost all game developers are on the consoles' RDNA bandwagon, and AMD is leading the industry forward.

Nearly everyone can see this, and I have a slew of EVGA cards at home. I am just not a blind lemming.

Ok. Not sure what any of that has to do with RDNA3 adding matrix multiplication functionality, or the complaint about too much DLSS positivity that I was joking about, but good for you man. Don't be a lemming!
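For anyone unsure what "matrix multiplication functionality" actually means here: tensor-style units accelerate a fused multiply-accumulate over small fixed-size matrix tiles (D = A × B + C), which is the core primitive behind the neural-network inference that DLSS-style upscalers run. A rough numpy sketch of that primitive (the 16×16 tile shape is purely illustrative, not a claim about RDNA3's actual instruction shapes):

```python
import numpy as np

def tile_mma(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> np.ndarray:
    """One matrix-multiply-accumulate step: D = A @ B + C.

    This is the per-instruction primitive a dedicated matrix unit
    executes on small tiles, repeated to build large multiplies.
    """
    return a @ b + c

# Illustrative 16x16 float32 tiles (real hardware tile shapes vary).
rng = np.random.default_rng(0)
a = rng.standard_normal((16, 16)).astype(np.float32)
b = rng.standard_normal((16, 16)).astype(np.float32)
c = np.zeros((16, 16), dtype=np.float32)

d = tile_mma(a, b, c)
print(d.shape)  # (16, 16)
```

Dedicated hardware wins here because it performs the whole tile's multiply-accumulates in one instruction instead of looping over scalar FMAs on the shader cores.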
 
I do?


Hmm. Well, an interesting opinion, but more relevant to this topic is that they spent some of their transistor budget on acceleration features, and AMD seems to be following suit.

Ok. Not sure what any of that has to do with RDNA3 adding matrix multiplication functionality, or the complaint about too much DLSS positivity that I was joking about, but good for you man. Don't be a lemming!

Because it eats at you that the lead moved from Nvidia to AMD... and you are the only one salty over it, instead of accepting it.
 
To tell you the truth, I don't know much of anything about DLSS or FSR; I don't even know if I have them on. Don't really care that much. You stop noticing the tiny little details in a game when you PLAY the game. You just want things to display properly per your settings, and a fast frame rate. AMD does just fine and so does nVidia. I'm sure the Intel cards will work OK at 1080p as well.

This whole argument turns into "well with X I can see 1 more pixel". I don't care.

This is really the key, isn't it? DLSS 2 and FSR 2 do such a good job at upscaling that your 1440p upscaled game looks close enough to native that the most important thing is the extra FPS, not the tiny pixel-peeping differences. Some ghosting weirdness with fast movement can take you out of the game, but then improvements in edge shimmering immerse you more in it.

A win for everyone, and if AMD can design dedicated blocks or adapt the general GPU capability to accelerate these FSR calculations, that's great. Keep developing the current software version for older GPUs and accelerate it with hardware in the RX 7000 series and later.
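To sketch why acceleration helps here: temporal upscalers like DLSS 2 and FSR 2 blend each new low-resolution frame into an accumulated high-resolution history buffer, and the per-pixel blend (plus the learned or heuristic weighting that drives it) is exactly the kind of bulk arithmetic matrix/vector units speed up. A toy version of the accumulation step, with a constant blend factor standing in for the real per-pixel weights:

```python
import numpy as np

def temporal_accumulate(history: np.ndarray, current: np.ndarray,
                        alpha: float = 0.1) -> np.ndarray:
    """Blend the current (reprojected) frame into the history buffer.

    Real upscalers compute alpha per pixel from motion vectors, depth,
    and ML or heuristic confidence weights; a constant stands in here.
    """
    return (1.0 - alpha) * history + alpha * current

# Toy 4x4 "frames": the history converges toward the stable signal.
history = np.zeros((4, 4), dtype=np.float32)
signal = np.ones((4, 4), dtype=np.float32)
for _ in range(50):
    history = temporal_accumulate(history, signal)
print(float(history[0, 0]))  # approaches 1.0
```

This has to run over every output pixel every frame, which is why doing the heavy weighting math on dedicated units rather than the shader cores frees up frame time.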
 
Not anywhere near as tiresome as the ever present laments about uncomfortable truths from the AMD faithful.
Comments like these are hilarious. It's like the CEO of a coal mine that is polluting a whole country complaining that workers enter his office with dirty shoes.
 
I think it is great that some graphic card buyers simply don't care about technology at all. Undemanding customers have helped keep budget choices like AMD around for all of us to enjoy.
The more someone knows about tech, the more likely they are to consider AMD.
Most undemanding customers buy nVidia, because they don't know anything about tech.
 
Some people are into marketing & others are into hardware and using it...

The 3080 Ti has been priced at the 6900 XT's level on Newegg for the last three weeks, because it's slower at Warzone (the main competitive FPS in the world right now) and people want frames...

Many of those streamers who were gifted RTX 3090s by their sponsors have now switched to Radeon cards. And with rumors of Warzone 2.0 using "new-age console technology" (i.e., RDNA), the new game engine will take Warzone (Call of Duty) to new heights this November. With a monster new FPS game hitting and new GPUs landing in October, it's going to be a GPU storm...

I suspect AMD's Radeon RDNA3 will be leading the way.
 
Honestly, I don't care about this any more than I cared about DLSS and/or ray-tracing performance on GeForce cards.
[Attached image: 2022-05-23-image-3-j_1100.webp]

As for FSR vs. DLSS, this image tells me that both are perfectly acceptable to me and one would not provide any noticeable advantage to my gaming experience over the other. I guess that nVidia's image is somewhat better, but it's nothing that would compel me to buy an nVidia product and it's definitely not something that I would even consider paying extra for.

The only thing that looks even the slightest bit different to me in the two images are the shadows on the rocks with the GeForce image having shadows that are a bit darker. This is nothing that I would notice during gameplay because who looks at the shadows? In fact, whenever I want a performance boost, turning the shadows down is the first thing that I do (and it's the first thing that most people I know do as well).

As for ray-tracing, I care even less about that than I do about hardware upscaling. I consider it to be just another way for nVidia to try to justify charging people up the posterior for their products. I also cringe every time I read or hear the "tech press" calling it a "game-changer". The only real game-changer I've seen in the past 15 years was tessellation, and compared to tessellation, ray-tracing is a complete joke.
 
Maybe it's my eyes but the 4K FSR 2.0 image above looks identical to the 4K DLSS image though I can't say I'm particularly impressed with either image.
The shadows are a bit darker in the DLSS image, but other than that, yeah, I can't see a difference, and I don't give a damn about the shadows either. No true FPS gamer cares about shadows, because we can't shoot them and they can't shoot us! :laughing:
 
Your imagination is taking you ever further off-topic.
Ever further from the topic of AMD being reactive against any form of hardware- or instruction-accelerated reconstruction.

At first, DLSS was garbage: too many flaws, especially in motion.

Then an inferior option surfaces, but because it wasn't a TAA derivative it was 'better'.

Now the competitor to DLSS, which works essentially the same way but worse, is A-OK, because it's not made by Nvidia and doesn't need a card that 4 out of 5 PC gamers own. Oh, and tensor cores are useless... only they're not.
 
Ever further from the topic of AMD being reactive against any form of hardware- or instruction-accelerated reconstruction.

At first, DLSS was garbage: too many flaws, especially in motion.

Then an inferior option surfaces, but because it wasn't a TAA derivative it was 'better'.

Now the competitor to DLSS, which works essentially the same way but worse, is A-OK, because it's not made by Nvidia and doesn't need a card that 4 out of 5 PC gamers own. Oh, and tensor cores are useless... only they're not.

But it is not worse. FSR is universal and thus superior for developers, because it's hardware agnostic... and they (meaning the people who develop games) do not have to worry about proprietary hardware, or fracturing their game engine, etc.

FSR is the Industry Standard and what the consoles use. (DLSS does not work on GTX cards, or laptops...)
 
But it is not worse. FSR is universal and thus superior for developers, because it's hardware agnostic... and they (meaning the people who develop games) do not have to worry about proprietary hardware, or fracturing their game engine, etc.

FSR is the Industry Standard and what the consoles use. (DLSS does not work on GTX cards, or laptops...)
Heard it all before, and it has worse image quality. With Streamline, DLSS will continue to be implemented alongside FSR for some time yet; given the inputs are virtually identical and engine plugins exist, coexistence is where FSR's success will lie, at least for now. It has a lot of improving to do before it's going to be accepted as the only reconstruction technique added to PC games.

Some passionate people are really hanging out for DLSS to die. Well, you'll be waiting some time yet; evidently FSR has a lot to learn.
 
Heard it all before, and it has worse image quality. With Streamline, DLSS will continue to be implemented alongside FSR for some time yet; given the inputs are virtually identical and engine plugins exist, coexistence is where FSR's success will lie, at least for now. It has a lot of improving to do before it's going to be accepted as the only reconstruction technique added to PC games.

Some passionate people are really hanging out for DLSS to die. Well, you'll be waiting some time yet; evidently FSR has a lot to learn.
ONLY if NVidia pays them...
Why would a Developer go out of their way, or deviate from the standard?

FSR 2.0 works on all hardware. DLSS doesn't work on consoles, nor in any of the games the developers are working on for consoles. There is ZERO incentive to make DLSS work... other than Nvidia paying them.

Otherwise, FSR 2.0 is free, easy, and it works.
 
Why would a Developer go out of their way, or deviate from the standard?
I won't be surprised when DLSS continues to be included in PC AAA game releases. Perhaps it's because it's already so easy to implement, perhaps it's because Nvidia provides tools (Streamline) to implement multiple reconstruction techniques alongside each other, perhaps "money" does change hands, perhaps it's because tens of millions of AAA gamers possess DLSS-capable hardware, but either way, it will keep happening for the foreseeable future.

This announcement is a clear indication AMD is working on an accelerated FSR (3.0?), so that won't work on consoles either. I'm really not convinced FSR 2.0 will be the lasting go-to (read: only) reconstruction that devs add to games, at least not in its current form, and being free and open source isn't enough to convince me, or devs and the community as a whole, apparently.
 
I won't be surprised when DLSS continues to be included in PC AAA game releases. Perhaps it's because it's already so easy to implement, perhaps it's because Nvidia provides tools (Streamline) to implement multiple reconstruction techniques alongside each other, perhaps "money" does change hands, perhaps it's because tens of millions of AAA gamers possess DLSS-capable hardware, but either way, it will keep happening for the foreseeable future.

This announcement is a clear indication AMD is working on an accelerated FSR (3.0?), so that won't work on consoles either. I'm really not convinced FSR 2.0 will be the lasting go-to (read: only) reconstruction that devs add to games, at least not in its current form, and being free and open source isn't enough to convince me, or devs and the community as a whole, apparently.

Yes, no doubt.
Unfortunately, PC sales pale in comparison to console sales of games. For instance, Warzone is one of the most played PC games, and PC still only makes up about 15% of all Warzone players.

Half of those PC players will be on NVidia cards... which can still use FSR.


The mere fact that you care so much about one technology and base your overall buying decision on it is laughable, when there is an equal choice that offers better frames for all...
 
Yes, no doubt.
Unfortunately, PC sales pale in comparison to console sales of games. For instance, Warzone is one of the most played PC games, and PC still only makes up about 15% of all Warzone players.
That doesn't have as big a bearing on reconstruction inclusion in PC games as you'd like it to.
Half of those PC players will be on NVidia cards... which can still use FSR.
Yes, but since it takes virtually no extra time to also add the superior DLSS, I know what I'll bet on them also including.
The mere fact that you care so much about one technology and base your overall buying decision on it is laughable, when there is an equal choice that offers better frames for all...
For starters, make no mistake: they're not equal, in performance or image quality, and despite it perhaps not looking like it to some eyes, there's quite a gap to bridge in some areas.

Secondly, you assume too much: how much I care for one, and that any of this has altered my buying decision in any meaningful way. Plus, FSR didn't exist at all when I bought my last graphics card.

Don't get me wrong, I am OK with the notion of a free, open-source reconstruction technique that runs on anything being the one that stands the test of time and supersedes all others. I just happen to want it to be excellent, and I need a lot more convincing that it will be FSR 2.0.
 
That doesn't have as big a bearing on reconstruction inclusion in PC games as you'd like it to.

Yes, but since it takes virtually no extra time to also add the superior DLSS, I know what I'll bet on them also including.

For starters, make no mistake: they're not equal, in performance or image quality, and despite it perhaps not looking like it to some eyes, there's quite a gap to bridge in some areas.

Secondly, you assume too much: how much I care for one, and that any of this has altered my buying decision in any meaningful way.

Don't get me wrong, I am OK with the notion of a free, open-source reconstruction technique that runs on anything being the one that stands the test of time and supersedes all others. I just happen to want it to be excellent, and I need a lot more convincing that it will be FSR 2.0.
I don't think you understand... Betamax was superior to VHS. VHS won because it had wide adoption and was the industry standard.

Game developers do not care one bit about DLSS unless Nvidia pays them to care; otherwise, why bother...? (RTX cards can still use FSR, so why would a developer choose to promote RTX cards' proprietary technology...?)

Most people do not care about freeze-framing and comparing each pixel, and to assume most people are bent on DLSS over FSR is a joke. Nobody cares... except people stuck in the past, unable to move on.


Pro tip: in the top tourney last week, none of the players even had G-Sync on, or even anti-aliasing, because they want 200+ frames... Low shadows, low textures, particle effects low, etc.
 
I don't think you understand....
Better than you think. And now we're going around in circles anyway. I'm not going to convince you of anything, it seems, and you're not going to convince me with your brand of arguments and examples, so how about we just agree to disagree?
 
Better than you think. And now we're going around in circles anyway. I'm not going to convince you of anything, it seems, and you're not going to convince me with your brand of arguments and examples, so how about we just agree to disagree?
Correct, you will not convince me that my RTX 2080 with its proprietary hardware is better for games. Because it is not...

That is just marketing.
 