AMD video shows off over 70 current and upcoming games that support FSR

That's where I stopped reading.

Really sad how people can't recognize a biased site.

DF, LTT and some others that can't be mentioned are not trustworthy places to depend on for valid, unbiased information.

Biased or not, DLSS is simply better than FSR, and side-by-side screenshots of identical scenes comparing FSR and DLSS make that evident. Horizon Zero Dawn is a very good example, with the amount of foliage aliasing that gets cleaned up being a particularly excellent demonstration.

It's really sad that people can't address the point of a post, i.e. that DLSS is better, and instead dismiss a factually correct assertion just because they don't like the website that pointed it out. Even if others come to the exact same conclusion, I'm sure you'll find a way to dislike those too.

One could say that's serious bias in itself.
 
I guess it's good that I just picked up this game yesterday and am playing at ultrawide 3440x1440, so I don't need any upscaling: I have a GPU powerful enough to render at native.
Native is generally inferior to DLSS in Horizon Zero Dawn, because the anti-aliasing method built into the game derives from its console roots and those low-end limitations. At that lower quality it unfortunately cannot clean up the image properly, particularly the heavy foliage shimmering. DLSS, however, does.
 
I've been a PC gamer since the early '90s, and I don't remember a time when 30 FPS was considered normal. Hitting 60 FPS or better was always the goal for 3D games.
You're right. I don't remember there being a golden 30 fps standard in the '90s (or before; I grew up on the C64/Amiga). I deliberately oversimplified to make it easier for younger generations to grasp, without being long-winded about it and making it sound like 'when I was young'.

To be honest, I don't remember actual fps numbers being something my RL friends and I (the Internet, or being online in general, wasn't much of a thing in my country until around 2000) paid much attention to. A lot of games didn't even have a way to show fps. It was a case of 'my game feels so sluggish on my 386 and so smooth on your 486, so I guess I'll have to upgrade'. Whatever the actual frame rate was, it definitely wasn't as fluid, except perhaps for sprite-based scrollers.

I would say it started slowly with Quake, but it wasn't until around the release of Unreal Tournament and Counter-Strike that I remember people around me really starting to care about actual frame rate numbers and comparing them with each other. By that point, however, I had just passed my mid-20s and had little interest left in shooters because of this newfangled thing called MMORPGs (I was big into tabletop pen-and-paper RPGs at the time, so it was a perfect fit for me).

I remember WoW feeling so smooth compared to the competition, but even then I remember having to tweak settings because my two-generations-old mid-range card would drop to the low 20s in certain areas. I also remember giving up on beta testing EverQuest 2 because it was a slideshow. I think it was around Wrath of the Lich King in 2008 that I personally started seeing 60 fps as something to strive for, and when I switched from CRT to LCD (a bit later than most, because I had a very nice CRT), hitting either 30 or 60 fps with V-sync became a must, because uneven frame rates looked much worse than on the CRT.

Anyway, I don't know what the experience or standard was like for people playing shooters and other fast-paced competitive games, because I was in my 30s and done with that segment of gaming by then. These days, at almost 50, I'm all about simulators, (MMO)RPGs, strategy games, building games and whatnot. When I want to relax and have a good time, I play Satisfactory, Hearts of Iron 4, Anno 1800, Rimworld or even solo Minecraft of all things, and 60 fps is all I need for that. What I don't feel like playing is CoD MCCCXXXVII.
 
Native is generally inferior to DLSS in Horizon Zero Dawn, because the anti-aliasing method built into the game derives from its console roots and those low-end limitations. At that lower quality it unfortunately cannot clean up the image properly, particularly the heavy foliage shimmering. DLSS, however, does.
I see no need for it. When I play, I'm too busy enjoying the game. I didn't buy a high-end GPU to use upscaling, but to each his own.
 
You're right. I don't remember there being a golden 30 fps standard in the '90s (or before; I grew up on the C64/Amiga). I deliberately oversimplified to make it easier for younger generations to grasp, without being long-winded about it and making it sound like 'when I was young'.

To be honest, I don't remember actual fps numbers being something my RL friends and I (the Internet, or being online in general, wasn't much of a thing in my country until around 2000) paid much attention to. A lot of games didn't even have a way to show fps. It was a case of 'my game feels so sluggish on my 386 and so smooth on your 486, so I guess I'll have to upgrade'. Whatever the actual frame rate was, it definitely wasn't as fluid, except perhaps for sprite-based scrollers.

I would say it started slowly with Quake, but it wasn't until around the release of Unreal Tournament and Counter-Strike that I remember people around me really starting to care about actual frame rate numbers and comparing them with each other. By that point, however, I had just passed my mid-20s and had little interest left in shooters because of this newfangled thing called MMORPGs (I was big into tabletop pen-and-paper RPGs at the time, so it was a perfect fit for me).

I remember WoW feeling so smooth compared to the competition, but even then I remember having to tweak settings because my two-generations-old mid-range card would drop to the low 20s in certain areas. I also remember giving up on beta testing EverQuest 2 because it was a slideshow. I think it was around Wrath of the Lich King in 2008 that I personally started seeing 60 fps as something to strive for, and when I switched from CRT to LCD (a bit later than most, because I had a very nice CRT), hitting either 30 or 60 fps with V-sync became a must, because uneven frame rates looked much worse than on the CRT.

Anyway, I don't know what the experience or standard was like for people playing shooters and other fast-paced competitive games, because I was in my 30s and done with that segment of gaming by then. These days, at almost 50, I'm all about simulators, (MMO)RPGs, strategy games, building games and whatnot. When I want to relax and have a good time, I play Satisfactory, Hearts of Iron 4, Anno 1800, Rimworld or even solo Minecraft of all things, and 60 fps is all I need for that. What I don't feel like playing is CoD MCCCXXXVII.
The game with the most players around your age would be the MMO Eve Online. Give it a try sometime and talk to the people there :D
 
Biased or not, DLSS is simply better than FSR, and side-by-side screenshots of identical scenes comparing FSR and DLSS make that evident. Horizon Zero Dawn is a very good example, with the amount of foliage aliasing that gets cleaned up being a particularly excellent demonstration.

It's really sad that people can't address the point of a post, i.e. that DLSS is better, and instead dismiss a factually correct assertion just because they don't like the website that pointed it out. Even if others come to the exact same conclusion, I'm sure you'll find a way to dislike those too.

One could say that's serious bias in itself.
In the end, you simply missed many of the points on the matter and went straight back to “dlss is betta bro!”

Using your “foliage” example, do you really pause the game just to admire that part, just to be able to say “dlss is bettah yo!”?

I won't continue the back and forth, because I am tired of trying to open the little corporate drones' eyes about biased reporting or anti-consumer practices by both the corporations and the tech sites.

And as usual, this simply reminds me of the slaves who used to fight with each other because they got offended when one said that his master's plantation was bigger than the other's…
 
In the end, you simply missed many of the points on the matter and went straight back to “dlss is betta bro!”

Using your “foliage” example, do you really pause the game just to admire that part, just to be able to say “dlss is bettah yo!”?

I won't continue the back and forth, because I am tired of trying to open the little corporate drones' eyes about biased reporting or anti-consumer practices by both the corporations and the tech sites.

And as usual, this simply reminds me of the slaves who used to fight with each other because they got offended when one said that his master's plantation was bigger than the other's…
One second browsing your previous posts on the forum is enough for anyone to understand exactly where your blind loyalties lie.

I'm less bothered by the brand of my hardware, since I'm not self-deluded enough to think one faceless semiconductor corporation is much more saintly than another.

I just say it as I see it. So, putting your puerile preaching aside: DLSS is simply better. That was my point. If that were written on the communist party's website and you were Joseph McCarthy, they still wouldn't necessarily be wrong about that specific assertion just because you dislike the rest of their ideology.

People/outlets you personally hate can still be right. It takes a bit of maturity to acknowledge that.

Horizon Zero Dawn is merely one very good example of DLSS. No pausing required to see that aliasing is much reduced. It's a basic technical aspect of the game. Moving on.
I see no need for it when I play I'm too busy enjoying the game. I didn't buy a highend gpu to use upscaling but to each his own.
Confusing DLSS with mere upscaling is an unfortunate mistake. It's a rendering tool whose anti-aliasing and texture filtering are in many instances so excellent that the result is superior to a 'native' raw image, especially one with low-quality, console-tier anti-aliasing. Like Horizon Zero Dawn.

Anti-aliasing is a luxury. It is used in the pipeline because a raw, unfiltered 'native' image has finite resolution and is generally not pretty, with a lot of perceptible rough edges. We accepted multisampled and temporal AA because they make selected things look nicer. It is desirable. We don't dismiss it as a 'corruption' of the raw 'native' output, yet that's exactly what it does: it alters certain pixels.

Ideally we could render everything at 4x or 8x native resolution without any selective AA and then supersample it down. It would look fantastic, with every pixel treated. But of course that's impractical, because it's far too expensive in performance terms. So instead you have tools like DLSS that can deliver that quality of broad AA for a fraction of the cost. Work smarter, not harder.
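As a rough sketch of that cost gap (the figures are my own illustrative assumptions: a 3440x1440 target and the commonly cited ~67% per-axis internal resolution for DLSS 'Quality' mode):

```python
# Back-of-the-envelope shaded-pixel cost: native vs. 4x SSAA vs. a
# DLSS-style upscaler. Resolution and scale factors are illustrative
# assumptions, not measured figures.
W, H = 3440, 1440
native = W * H                                    # pixels shaded per frame at native

ssaa_4x = native * 4                              # 4x SSAA: four shaded samples per output pixel
dlss_quality = round(W * 0.67) * round(H * 0.67)  # ~67% per-axis internal render, then upscaled

print(f"native:       {native:>12,}")
print(f"4x SSAA:      {ssaa_4x:>12,}  ({ssaa_4x / native:.2f}x native)")
print(f"DLSS Quality: {dlss_quality:>12,}  ({dlss_quality / native:.2f}x native)")
```

Supersampling quadruples the shading work, while the upscaler shades less than half the native pixel count and reconstructs the rest.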

This has been explained many times, and many times people have demonstrated that they do not understand it. Or they don't want to understand it, because of ulterior motives…

People with a 3090 should still turn it on in HZD for the best image quality. Techniques such as DLSS are your friend regardless of the power of your GPU, just like old-fashioned multisample AA was your friend 20 years ago. Embrace it, because it's definitely not going away. Eventually your favourite brand of cola will also get it right, and you'll be singing its praises. We'll be waiting for your day to arrive; I'll save you a seat.
 
I think this hurts AMD more than it helps.
It leaves little incentive to buy AMD when their best software products are open. In the long term, will we see new versions and fixes quickly, sporadically or not at all? Will devs just stop adding it a year from now?

With 70+ titles already, I'd like to know why AMD made it open so quickly. I used to think this would be best for consoles, but I feel dynamic resolution would affect the visuals.

Kudos to AMD for sharing, but they need more help than Nvidia in the dGPU market, so they should have held onto FSR at least a little while longer. In the short term, Intel only has to beat AMD to make things even harder for them than they already are. I'll be watching.
Ermm... maybe thank God AMD doesn't think like you do.
 
One second browsing your previous posts on the forum is enough for anyone to understand exactly where your blind loyalties lie.

I'm less bothered by the brand of my hardware, since I'm not self-deluded enough to think one faceless semiconductor corporation is much more saintly than another.

I just say it as I see it. So, putting your puerile preaching aside: DLSS is simply better. That was my point. If that were written on the communist party's website and you were Joseph McCarthy, they still wouldn't necessarily be wrong about that specific assertion just because you dislike the rest of their ideology.

People/outlets you personally hate can still be right. It takes a bit of maturity to acknowledge that.

Horizon Zero Dawn is merely one very good example of DLSS. No pausing required to see that aliasing is much reduced. It's a basic technical aspect of the game. Moving on.

Confusing DLSS with mere upscaling is an unfortunate mistake. It's a rendering tool whose anti-aliasing and texture filtering are in many instances so excellent that the result is superior to a 'native' raw image, especially one with low-quality, console-tier anti-aliasing. Like Horizon Zero Dawn.

Anti-aliasing is a luxury. It is used in the pipeline because a raw, unfiltered 'native' image has finite resolution and is generally not pretty, with a lot of perceptible rough edges. We accepted multisampled and temporal AA because they make selected things look nicer. It is desirable. We don't dismiss it as a 'corruption' of the raw 'native' output, yet that's exactly what it does: it alters certain pixels.

Ideally we could render everything at 4x or 8x native resolution without any selective AA and then supersample it down. It would look fantastic, with every pixel treated. But of course that's impractical, because it's far too expensive in performance terms. So instead you have tools like DLSS that can deliver that quality of broad AA for a fraction of the cost. Work smarter, not harder.

This has been explained many times, and many times people have demonstrated that they do not understand it. Or they don't want to understand it, because of ulterior motives…

People with a 3090 should still turn it on in HZD for the best image quality. Techniques such as DLSS are your friend regardless of the power of your GPU, just like old-fashioned multisample AA was your friend 20 years ago. Embrace it, because it's definitely not going away. Eventually your favourite brand of cola will also get it right, and you'll be singing its praises. We'll be waiting for your day to arrive; I'll save you a seat.
You will be waiting a very long time. I'm using a 6800 XT, which will never support DLSS, and I have no interest whatsoever in using FSR when I can push 3440x1440 with this kind of frame rate.

 
You're right. I don't remember there being a golden 30 fps standard in the '90s (or before; I grew up on the C64/Amiga). I deliberately oversimplified to make it easier for younger generations to grasp, without being long-winded about it and making it sound like 'when I was young'.

To be honest, I don't remember actual fps numbers being something my RL friends and I (the Internet, or being online in general, wasn't much of a thing in my country until around 2000) paid much attention to. A lot of games didn't even have a way to show fps. It was a case of 'my game feels so sluggish on my 386 and so smooth on your 486, so I guess I'll have to upgrade'. Whatever the actual frame rate was, it definitely wasn't as fluid, except perhaps for sprite-based scrollers.

I would say it started slowly with Quake, but it wasn't until around the release of Unreal Tournament and Counter-Strike that I remember people around me really starting to care about actual frame rate numbers and comparing them with each other. By that point, however, I had just passed my mid-20s and had little interest left in shooters because of this newfangled thing called MMORPGs (I was big into tabletop pen-and-paper RPGs at the time, so it was a perfect fit for me).

I remember WoW feeling so smooth compared to the competition, but even then I remember having to tweak settings because my two-generations-old mid-range card would drop to the low 20s in certain areas. I also remember giving up on beta testing EverQuest 2 because it was a slideshow. I think it was around Wrath of the Lich King in 2008 that I personally started seeing 60 fps as something to strive for, and when I switched from CRT to LCD (a bit later than most, because I had a very nice CRT), hitting either 30 or 60 fps with V-sync became a must, because uneven frame rates looked much worse than on the CRT.

Anyway, I don't know what the experience or standard was like for people playing shooters and other fast-paced competitive games, because I was in my 30s and done with that segment of gaming by then. These days, at almost 50, I'm all about simulators, (MMO)RPGs, strategy games, building games and whatnot. When I want to relax and have a good time, I play Satisfactory, Hearts of Iron 4, Anno 1800, Rimworld or even solo Minecraft of all things, and 60 fps is all I need for that. What I don't feel like playing is CoD MCCCXXXVII.
That is my exact experience and recollection.

I do remember selecting a higher refresh rate to eliminate flickering, especially outside of games, since otherwise you would get headaches and severe eye strain.

Hell, at times I have a hard time noticing any difference between my Series X and my 6900 XT-powered PC on the same game and TV (LG C9).

Maybe I need to pause the game, zoom in like crazy on some foliage, and then proceed to call it garbage. 😁
 
Tell that to their poor GPU market share, and to losing their 3-year CPU lead in 4 years.

No consistency. Poor software overall.
GPU market share based on the broken Steam Survey?

Intel only has a chance in desktop chips, and that's where AMD is investing the least. Nice to see how "well" Intel handles servers, where AMD just dominates. Not to mention Intel's HEDT line has been dead for years.
 
You will be waiting a very long time. I'm using a 6800 XT, which will never support DLSS, and I have no interest whatsoever in using FSR when I can push 3440x1440 with this kind of frame rate.

By posting a performance graph, you have clearly missed the point, despite painfully extensive explanation on my part.

DLSS isn't needed to make an RTX 3090 run Horizon Zero Dawn smoothly.

It should be used because it looks far better than anything the 'native' anti-aliasing solutions can deliver. It completely eliminates the game's shimmery, aliased look, particularly noticeable in motion, which cannot be anti-aliased by any other current in-engine method.

The side bonus is that it improves performance, but the primary bonus is that it makes the game look better. Which is why everyone wants high-quality anti-aliasing.

If you do not have DLSS enabled, you have a somewhat inferior-looking version of the game. Should knowing that fact ruin your life? Nope. But it's still true, and it should still be acknowledged.
 
I seriously don't get why people are still arguing over petty points when things are simple:
1. DLSS is objectively better quality.
2. FSR now has much broader support because of its open-source nature (the open-source community is doing great things with FSR; see the sketch below).
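To make point 2 concrete: a purely spatial upscaler needs nothing but the finished frame, no motion vectors and no per-game training, which is why the community can bolt it onto nearly anything. A minimal sketch of that upscale-then-sharpen recipe, using Pillow as an illustrative stand-in (this is not AMD's actual EASU/RCAS shader code, and the file name and parameters are hypothetical):

```python
# Sketch of the spatial upscale-then-sharpen recipe that FSR 1.0's two
# passes (EASU upscale, RCAS sharpen) follow in spirit. Pillow stands in
# for the real shaders purely for illustration.
from PIL import Image, ImageFilter

def spatial_upscale(frame: Image.Image, scale: float = 1.5) -> Image.Image:
    """Upscale a single finished frame, then sharpen the result.

    No motion vectors, depth buffers or engine hooks are required,
    which is what makes spatial methods so easy to adopt broadly.
    """
    w, h = frame.size
    up = frame.resize((int(w * scale), int(h * scale)),
                      Image.Resampling.LANCZOS)  # spatial-only resample
    return up.filter(ImageFilter.UnsharpMask(radius=2, percent=80))

# Hypothetical usage on a 960p render target:
# out = spatial_upscale(Image.open("frame_960p.png"), scale=1.5)
```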
 
FSR is currently a failure. I tend to find that DLSS improves image quality in some games, but whenever I've turned on FSR, the image quality takes a much bigger, much more noticeable hit.

I've noticed a few tech outlets have stopped even talking about FSR, and I don't blame them. I have yet to encounter a game where it's actually worth using if you have modern hardware.

I'm sure budget laptop owners and owners of older GPUs appreciate it, but if you can handle a game's frame rate at native, then you don't need to murder your image quality with FSR.

The real clincher is the image quality improvement with DLSS: it offers a level of fidelity and smoothness that you can't get at native, which makes it a good feature for high-end cards to deliver better experiences. The current gulf between DLSS and FSR is enormous.
 
By posting a performance graph, you have clearly missed the point, despite painfully extensive explanation on my part.

DLSS isn't needed to make an RTX 3090 run Horizon Zero Dawn smoothly.

It should be used because it looks far better than anything the 'native' anti-aliasing solutions can deliver. It completely eliminates the game's shimmery, aliased look, particularly noticeable in motion, which cannot be anti-aliased by any other current in-engine method.

The side bonus is that it improves performance, but the primary bonus is that it makes the game look better. Which is why everyone wants high-quality anti-aliasing.

If you do not have DLSS enabled, you have a somewhat inferior-looking version of the game. Should knowing that fact ruin your life? Nope. But it's still true, and it should still be acknowledged.
And you are also missing the point I'm making: I have no interest in upscaling. You sound like you work for Nvidia, dude. And I don't have an NV GPU, so DLSS doesn't mean anything to me, regardless of whether you think it looks better than native, which I also don't agree with.

I don't really care to continue this discussion; believe what you will.
 
And you are also missing the point I'm making: I have no interest in upscaling. You sound like you work for Nvidia, dude. And I don't have an NV GPU, so DLSS doesn't mean anything to me, regardless of whether you think it looks better than native, which I also don't agree with.

I don't really care to continue this discussion; believe what you will.
Lmao, DLSS looks way better and way smoother than normal in loads of games now. I thought RDR2 was dramatically smoother with DLSS on; I guess that's what happens when you get rid of TAA. Also Death Stranding, Control and HZD.

If you had an RTX GPU, you could turn on DLSS and see this for yourself. But if you don't, you won't know what you're missing out on.
 
Lmao, DLSS looks way better and way smoother than normal in loads of games now. I thought RDR2 was dramatically smoother with DLSS on; I guess that's what happens when you get rid of TAA. Also Death Stranding, Control and HZD.

If you had an RTX GPU, you could turn on DLSS and see this for yourself. But if you don't, you won't know what you're missing out on.
Cool story, bro.
 
I've been a PC gamer since the early '90s, and I don't remember a time when 30 FPS was considered normal. Hitting 60 FPS or better was always the goal for 3D games.

You must come from an alternate timeline. When Quake came out in '96, practically nobody could even play it at 60 fps. The whole 60 fps thing, apart from arcade games, wasn't even a thing until the 2000s.
 
You must come from an alternate timeline. When Quake came out in '96, practically nobody could even play it at 60 fps. The whole 60 fps thing, apart from arcade games, wasn't even a thing until the 2000s.
Wrong. The whole 60 FPS thing has always been around. Everybody was striving towards it, because that's what some of the best games on consoles and in arcades were hitting.

When Quake launched, you had the Pentium 200 on the market, which was hitting about 45 FPS in DOS (this is pre-GLQuake), and less than a year later the Pentium II came to market, easily hitting over 60 FPS.

Sometime in 1997 you also got patch 1.08, which improved performance a lot (it moved the P200 closer to 50 FPS, and with a good OC you could hit between 55 and 60 FPS).
 
FSR is currently a failure.
Considering its current adoption, it's way beyond what DLSS can achieve, even if quality-wise it's not better. Let me see you put DLSS into many Unity games, Linux support and even VR games without any intervention from the devs.

An example of an open-source project:

 