AMD video shows off over 70 current and upcoming games that support FSR

Stingy McDuck

Posts: 215   +161
As someone who has always set a custom resolution through the Nvidia control panel, I don't see the point of FSR; it's not even sharper than my GPU or my display doing the scaling. Let's just analyze this:

1. DLSS 1.0 came out, and people said that a lower resolution with a good sharpening filter like CAS was a better way to improve performance.
2. FSR comes out, and people say it's at least as good as DLSS 1.0.
3. FSR has a slight performance penalty while CAS doesn't.

Does that mean CAS > FSR?
 

Stingy McDuck

Posts: 215   +161
Lmao, DLSS looks way better and way smoother than normal in loads of games now. I thought RDR2 was dramatically smoother with DLSS on; I guess that's what happens when you get rid of TAA. Also Death Stranding, Control, HZD.

DLSS is a form of TAA.
 

Sausagemeat

Posts: 1,584   +1,407
Wrong. 60FPS has always been a thing. Everybody was striving towards it because that's what some of the best games on consoles and in arcades were hitting.

When Quake launched you had the Pentium 200 on the market, which was hitting about 45FPS in DOS (this is pre-GLQuake), and in less than a year the Pentium II came to market and was easily hitting over 60FPS.

Sometime in 1997 you also got patch 1.08, which improved performance a lot (it moved the P200 closer to 50FPS, and with a good OC you could hit between 55-60FPS).
60fps was always the target but PC games mostly failed to hit it. Even with new hardware it wouldn't be unusual to see games running at under 60; case in point, the GeForce 6800 Ultra: in most of the titles tested it wasn't hitting 60, and that was the 3090 of its day.

https://www.anandtech.com/show/1293/14

Quake, yes, that community pushed hard for 60fps, but general PC gaming had 60 as a target and most failed to get there. Hardware improved so fast back then, though, and so did games. The vast majority of PC gamers weren't getting 60fps in most games.
 

Sausagemeat

Posts: 1,584   +1,407

AMD's tech may be inferior for now, but it has more potential to be better because it's open, easy to implement and has a wider range of hardware support.
I wouldn't hold your breath. FSR is not like DLSS: it's not based on machine learning and AI. It's closer to running a filter over your game than to something like DLSS, which requires machine learning to produce its algorithm.

Also, it's made by AMD and they are hopeless at software. I mean, they still haven't fixed their two-year-old black screen bug (I had it again literally yesterday and I'm using the latest drivers). I'm not holding out much hope for FSR; in its current form I find the results horrendous and would rather have a lower frame rate and turn it off. But AMD do need to come up with something: the future of 3D gaming will use lots of upscaling tech. Personally I'm just going to buy an Nvidia card next time, my Radeon experience has been dreadful.
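For what it's worth, "running a filter" is a fair description of the shape of FSR 1.0: a spatial upscale pass (EASU) followed by a sharpening pass (RCAS), operating only on the current frame. A crude NumPy sketch of that pipeline shape, with plain bilinear upscaling and an unsharp mask standing in for the real edge-adaptive shaders (purely for illustration, not AMD's actual kernels):

```python
import numpy as np

def upscale_bilinear(img, factor):
    """Bilinear upscale of a 2D grayscale image by an integer factor."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img, amount=0.5):
    """Unsharp mask: add back a fraction of (image - box blur)."""
    pad = np.pad(img, 1, mode='edge')
    blur = sum(pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# Render low, upscale, then sharpen -- the FSR 1.0 pipeline shape.
low = np.random.default_rng(0).random((270, 480))
out = sharpen(upscale_bilinear(low, 2))
print(out.shape)  # (540, 960)
```

The real EASU kernel is edge-adaptive and RCAS is contrast-adaptive, so this says nothing about FSR's quality; the point is only that the whole thing is per-frame image filtering, with no temporal history and no trained model involved.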
 

Puiu

Posts: 5,461   +4,390
TechSpot Elite
60fps was always the target but PC games mostly failed to hit it. Even with new hardware it wouldn't be unusual to see games running at under 60; case in point, the GeForce 6800 Ultra: in most of the titles tested it wasn't hitting 60, and that was the 3090 of its day.

https://www.anandtech.com/show/1293/14

Quake, yes, that community pushed hard for 60fps, but general PC gaming had 60 as a target and most failed to get there. Hardware improved so fast back then, though, and so did games. The vast majority of PC gamers weren't getting 60fps in most games.
"General PC community"? Anyone who joined the PC "community" back then knew what they were doing (mostly nerds like me). You are thinking of people who weren't tech savvy and didn't even know what FPS meant. We still have such people even today, but don't confuse them with the enthusiasts who were pushing bleeding-edge tech.

Older PCs not getting 60FPS doesn't mean that both gamers and devs were not pushing to get 60FPS.

edit: small correction
 
Last edited:

Sausagemeat

Posts: 1,584   +1,407
"General PC community"? Anyone who joined the PC "community" back then knew what they were doing (mostly nerds like me). You are thinking of people who weren't tech savvy and didn't even know what FPS meant. We still have such people even today, but don't confuse them with the enthusiasts who were pushing bleeding-edge tech.

Older PCs not getting 60FPS doesn't mean that both gamers and devs were pushing to get 60FPS.
What an obnoxious thing to claim. How do you fit through doorways with a head that massive?

No, back then 60 was the target, and most rarely hit it. You would need to buy the latest CPU and GPU every year and use them with a lower-res monitor to get 60 all the time, and you practically always needed to lower settings even then. But this was at a time when consoles could be targeting 20fps (N64). We had much lower standards back then. The requirement for much higher frame rates is something that has happened mostly over the last 10 years. Skyrim released in 2011 on the Xbox 360 running at 20-30fps, and despite some complaints it still sold by the bucketload. It's odd, because Skyrim was a far buggier, slower experience than Cyberpunk on release, yet we don't call Skyrim a failure by any means.

Sounds to me like your parents bought you an expensive PC when you were a kid and you are remembering the performance as better than it actually was. You probably had a CRT monitor, and they were very smooth; 45fps on a CRT often feels smoother than 60 on an LCD flat panel, especially an older LCD.
 

Puiu

Posts: 5,461   +4,390
TechSpot Elite
What an obnoxious thing to claim. How do you fit through doorways with a head that massive?

No, back then 60 was the target, and most rarely hit it. You would need to buy the latest CPU and GPU every year and use them with a lower-res monitor to get 60 all the time, and you practically always needed to lower settings even then. But this was at a time when consoles could be targeting 20fps (N64). We had much lower standards back then. The requirement for much higher frame rates is something that has happened mostly over the last 10 years. Skyrim released in 2011 on the Xbox 360 running at 20-30fps, and despite some complaints it still sold by the bucketload. It's odd, because Skyrim was a far buggier, slower experience than Cyberpunk on release, yet we don't call Skyrim a failure by any means.

Sounds to me like your parents bought you an expensive PC when you were a kid and you are remembering the performance as better than it actually was. You probably had a CRT monitor, and they were very smooth; 45fps on a CRT often feels smoother than 60 on an LCD flat panel, especially an older LCD.
"Back then 60 was the target" - this is what I'm trying to say. The argument began with the statement that this wasn't a thing back then, which is completely untrue. I don't like moving goalposts, but I think in this case the discussion went on for too long and people forgot what we were arguing about (aka a misunderstanding).

Using Quake as an example also might have exaggerated things since it was the Crysis of its time.

TL;DR: in the 90s both devs and gamers were striving towards 60FPS, just like today. Insulting me doesn't change this simple fact.
 

Sausagemeat

Posts: 1,584   +1,407
"Back then 60 was the target" - this is what I'm trying to say. The argument began with the statement that this wasn't a thing back then, which is completely untrue. I don't like moving goalposts, but I think in this case the discussion went on for too long and people forgot what we were arguing about (aka a misunderstanding).

Using Quake as an example also might have exaggerated things since it was the Crysis of its time.

TL;DR: in the 90s both devs and gamers were striving towards 60FPS, just like today. Insulting me doesn't change this simple fact.
60 was the target back then just as 240 is the target today (or 144, pick a high refresh rate). In the vast majority of cases users are not running these high refresh rates, though. Checking benchmarks for Doom 3, only the GeForce 6800 line could give you 60fps or more in that game, meaning everyone else had less than 60fps. I had an X800 Pro and I wasn't playing at 60, but I don't remember that being a massive problem.

The difference between now and then is that these days everyone runs at 60fps, even with older hardware (an RTX 2070 from 2018 will easily hit 60 in almost any game).
 

scavengerspc

Posts: 2,345   +2,456
TechSpot Elite
It's not really a win for AMD over Nvidia considering FSR supports Nvidia GPUs. DLSS will make people choose Nvidia over AMD; FSR won't change people's preferences.
And that is exactly why I often tend to favor AMD. It's the same thing with adaptive sync: G-Sync was Nvidia-only, but FreeSync would work regardless of the GPU make, until Nvidia saw the opportunity to capitalize on FreeSync.
 

Solokreep14

Posts: 14   +6
It's not really a win for AMD over Nvidia considering FSR supports Nvidia GPUs. DLSS will make people choose Nvidia over AMD; FSR won't change people's preferences.
Actually you are wrong, because RSR will be driver-implemented and GeForce won't be compatible with it, and RSR will work with all games! DLSS takes per-game developer resources, not just a GeForce, so this year AMD will finally win.
 

Puiu

Posts: 5,461   +4,390
TechSpot Elite
Actually you are wrong, because RSR will be driver-implemented and GeForce won't be compatible with it, and RSR will work with all games! DLSS takes per-game developer resources, not just a GeForce, so this year AMD will finally win.
Let's not exaggerate. Nvidia has plenty of cash to make devs implement DLSS.
 

Nobina

Posts: 3,721   +4,095
Actually you are wrong, because RSR will be driver-implemented and GeForce won't be compatible with it, and RSR will work with all games! DLSS takes per-game developer resources, not just a GeForce, so this year AMD will finally win.
I don't get what you're saying. Nvidia already has Image Scaling, which is pretty much what RSR is going to be.
 

zx128k

Posts: 8   +2
Basically, the Steam hardware survey shows next to no one bought an AMD 6000 series GPU, so FSR is a dead tech. The AMD 6000 series is a flop; it failed to sell to gamers. Nvidia cards did sell to gamers.

Nvidia cards have NIS in the drivers, so they don't need FSR. DLSS has far better image quality, so in any AAA game that has both DLSS and FSR, most people will use DLSS, because AAA gamers have the latest cards.

Any AMD 6000 series feature and tech is dead. The market rejected AMD's latest cards.

Then there is the fact that in coming AAA games the requirement for upscaling is 1080p to 4K. FSR cannot do this without the image quality becoming insanely bad. FSR 1.x is dead.

Everyone will use the built-in TAAU in the likes of Unreal Engine 5 (software, all cards), or XeSS or DLSS for faster hardware support. This way good image quality from 1080p to 4K will be possible, with near-photoreal graphics.

Nvidia won with the 30 series, and every other upscaling method is massively better than FSR by design. FSR won't be a good option in photorealistic AAA games because 1080p to 4K will be a blurred mess. FSR is only good for games where resolutions close to native 4K are possible.

Given the poor 1440p DXR performance, FSR won't be able to run at Quality or Ultra Quality and still have acceptable 4K performance. The whole AMD 6000 series is a dead end.
 

Lionvibez

Posts: 2,619   +2,378
Basically, the Steam hardware survey shows next to no one bought an AMD 6000 series GPU, so FSR is a dead tech. The AMD 6000 series is a flop; it failed to sell to gamers. Nvidia cards did sell to gamers.

Nvidia cards have NIS in the drivers, so they don't need FSR. DLSS has far better image quality, so in any AAA game that has both DLSS and FSR, most people will use DLSS, because AAA gamers have the latest cards.

Any AMD 6000 series feature and tech is dead. The market rejected AMD's latest cards.

Then there is the fact that in coming AAA games the requirement for upscaling is 1080p to 4K. FSR cannot do this without the image quality becoming insanely bad. FSR 1.x is dead.

Everyone will use the built-in TAAU in the likes of Unreal Engine 5 (software, all cards), or XeSS or DLSS for faster hardware support. This way good image quality from 1080p to 4K will be possible, with near-photoreal graphics.

Nvidia won with the 30 series, and every other upscaling method is massively better than FSR by design. FSR won't be a good option in photorealistic AAA games because 1080p to 4K will be a blurred mess. FSR is only good for games where resolutions close to native 4K are possible.

Given the poor 1440p DXR performance, FSR won't be able to run at Quality or Ultra Quality and still have acceptable 4K performance. The whole AMD 6000 series is a dead end.
Really? So how do I have a 6800 XT in my rig? And guess what, I've not been prompted for a Steam survey in months.

So superior that users are making mods to turn off the sharpening in the GoW PC port because it's overdone.

 

zx128k

Posts: 8   +2
https://store.steampowered.com/hwsurvey/videocard/ - only a tiny amount of AMD Radeon RX 6700 XTs appear. It's over for AMD. More people bought the 3090 than all the AMD 6000 series combined. AMD lost, and it's not biased: the AMD 6000 series flopped based on the available data. Gamers did not buy them in numbers anywhere close to Nvidia's.

In World of Warcraft the FSR image quality is on par with bicubic upscaling and looks like crap. AMD's ray tracing does not look any better than raster lighting. FSR causes stuttering and fps drops, so you have to turn it off.

This means that future games will support Nvidia unless AMD pays to add their feature. Nvidia has almost all the market share for upscaling and ray tracing. Games will need to be optimised for Nvidia hardware or face failure.

I will be surprised if this is posted, because the steampowered link shows negatives about AMD.
 

HardReset

Posts: 1,625   +1,274
https://store.steampowered.com/hwsurvey/videocard/ - only a tiny amount of AMD Radeon RX 6700 XTs appear. It's over for AMD. More people bought the 3090 than all the AMD 6000 series combined. AMD lost, and it's not biased: the AMD 6000 series flopped based on the available data. Gamers did not buy them in numbers anywhere close to Nvidia's.
Care to explain wtf the second most popular AMD graphics card is? That is:

AMD Radeon Graphics

FYI, AMD has not released a product named "AMD Radeon Graphics" so far...
 

zx128k

Posts: 8   +2
Care to explain wtf the second most popular AMD graphics card is? That is:

AMD Radeon Graphics

FYI, AMD has not released a product named "AMD Radeon Graphics" so far...
"AMD Radeon Graphics" means all AMD cards that have too low a presence in the data for a more specific named entry. For example, the 3090 has enough numbers for its own entry, so we see the 3090 as its own named entry.

The 6900 XT is not there as its own entry because there are too few in the data, so they are part of the "AMD Radeon Graphics" entry. Thus gamers have rejected the AMD 6000 series and developers should ignore it when developing new software.

FSR in its current form can't upscale from 1080p to 4K; the image quality is just too poor, and Unreal Engine is making that upscale a standard. Because of this, FSR in its current form is DoA. Also, AMD's DXR performance is too poor for future photoreal games. AMD's DXR support uses shaders for some RT processing, while Nvidia has dedicated hardware; AMD uses its texture units for RT processing and you can't use both at the same time.

AMD's RDNA2 chips contain one Ray Accelerator per CU, which is similar to what Nvidia has done with its RT cores. Even though AMD sort of takes the same approach as Nvidia, the comparison between AMD and Nvidia isn't clear cut. The BVH algorithm depends on both ray/box intersection calculations and ray/triangle intersection calculations. AMD's RDNA2 architecture can do four ray/box intersections per CU per clock, or one ray/triangle intersection per CU per clock.

From our understanding, Nvidia's Ampere architecture can do up to two ray/triangle intersections per RT core per clock, plus some additional extras, but it's not clear what the ray/box rate is. In testing, Big Navi RT performance generally doesn't come anywhere close to matching Ampere, though it can usually keep up with Turing RT performance. That's likely due to Ampere's RT cores doing more ray/box and ray/triangle intersections per clock.
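Those per-clock figures translate into a simple back-of-envelope peak, sketched below. The CU count and clock are illustrative, taken from a 6900 XT-class part (80 CUs, roughly 2.25 GHz boost); real sustained throughput sits far below these peaks because of BVH traversal, memory latency and occupancy.

```python
def peak_rate(units, per_clock, clock_ghz):
    """Peak intersection tests per second = units * tests/clock * clock (Hz)."""
    return units * per_clock * clock_ghz * 1e9

# RDNA2 example: 80 CUs at ~2.25 GHz boost, using the per-clock
# figures quoted above (4 ray/box, 1 ray/triangle per CU per clock).
ray_box = peak_rate(80, 4, 2.25)
ray_tri = peak_rate(80, 1, 2.25)

print(f"ray/box:      {ray_box / 1e9:.0f} G tests/s")   # 720 G tests/s
print(f"ray/triangle: {ray_tri / 1e9:.0f} G tests/s")   # 180 G tests/s
```

The 4:1 ratio between box and triangle tests is why the traversal mix matters: a BVH walk does many box tests per triangle test, so these peaks alone don't say which architecture wins in practice.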

This is why drivers won't improve AMD's performance: it's the hardware that is slower, no matter what, and raster is on its way out. In future, current Nvidia cards will perform better in newer AAA RT games than AMD's.

For cards over £1000 or $1364.03 you want them to last as long as possible, and in current DXR games AMD is already on the ropes. When DXR games designed for the performance of current Nvidia hardware appear, AMD's 6000 series will have to be replaced; not even the top cards will have enough performance for decent gameplay. And the more you strip the die down and move down the product line, the more DXR support on AMD becomes next to useless, as the performance is not there. You can't use a DLSS-like feature to go from 1080p to 4K RT in Cyberpunk 2077 on a low-end RTX 2060-class GPU, and AMD can't upscale from 1080p to 4K because the FSR quality is just not there. Basically, DLSS Performance mode is better than all FSR quality modes. You should see the quality drop if you use third-party software to add FSR to Cyberpunk 2077; it's nowhere even close to DLSS for quality.

This is why the Steam hardware survey has no entries for the best AMD cards. Nvidia has a massive performance lead.
 

zx128k

Posts: 8   +2
You could be right, but that would mean even fewer AMD 6000 series GPUs, and they are all in "Other". The Radeon RX 6700 XT is the only RDNA 2 card to register a blip in Steam's main GPU section, making the 6000 series dead at this point. Tom's Hardware stated:

If we look at the DirectX 12 chart, the 6800 XT reaches a 0.11% market share in all; right underneath that is the 6900 XT with 0.10% market share, and the 6800 sits at just 0.05%. For comparison, Nvidia's 3090 sits at 0.42% market share in the same category, the 3080 reaches an even higher 0.99%, and the 3080 Ti already has 0.26% despite being released just a few months ago.

Perhaps more telling, summing all the RX 6000-series GPUs from the DX12 API list gives just 0.43% of the total, while the RTX 30-series manages to claim 7.96%. Based off those figures, Ampere GPUs have outsold RDNA2 GPUs among surveyed Steam users by a ratio of more than 18 to 1.
I wonder why this post needs moderation.
 
Last edited:

HardReset

Posts: 1,625   +1,274
"AMD Radeon Graphics" means all AMD cards that have too low a presence in the data for a more specific named entry. For example, the 3090 has enough numbers for its own entry, so we see the 3090 as its own named entry.
What? How does the Steam survey report Nvidia's "not enough numbers for its own entry" cards? Better to admit the Steam survey is bugged as hell.
You could be right, but that would mean even fewer AMD 6000 series GPUs, and they are all in "Other". The Radeon RX 6700 XT is the only RDNA 2 card to register a blip in Steam's main GPU section, making the 6000 series dead at this point. Tom's Hardware stated:
I have proven many times that the Steam survey has serious bugs. That's why using Steam survey data to predict sales is useless. No actual sales data supports the Steam survey's "opinion" on RDNA2 series sales. That's why the Steam survey is wrong here.
 

HardReset

Posts: 1,625   +1,274
I thought "AMD Radeon Graphics" just meant AMD APUs.
That is exactly how the GPU is described in Windows Device Manager for on-board AMD graphics.
Possibly, but it also applies to some external GPUs. Anyway, that's more than enough to prove the Steam survey does not recognize all AMD Radeon GPUs correctly. It also makes all Steam-survey-based Nvidia vs AMD GPU analysis completely pointless, because there is no way of telling what those unrecognized GPUs are.

Funny how sites like Tom's and TechSpot spend time on analysis based on clearly false data. Intelligent people call it a waste of time...
 

zx128k

Posts: 8   +2
Possibly, but it also applies to some external GPUs. Anyway, that's more than enough to prove the Steam survey does not recognize all AMD Radeon GPUs correctly. It also makes all Steam-survey-based Nvidia vs AMD GPU analysis completely pointless, because there is no way of telling what those unrecognized GPUs are.

Funny how sites like Tom's and TechSpot spend time on analysis based on clearly false data. Intelligent people call it a waste of time...
This is the opposite of proof. Please prove you are right: if it's a conspiracy, prove it; if the data is false, prove it.

The Steam hardware survey covers only gaming PCs, and the target we are looking for is gamers. This is the right data targeting the right PCs. The data is right on point. Nvidia won.
 

HardReset

Posts: 1,625   +1,274
This is the opposite of proof. Please prove you are right: if it's a conspiracy, prove it; if the data is false, prove it.

The Steam hardware survey covers only gaming PCs, and the target we are looking for is gamers. This is the right data targeting the right PCs. The data is right on point. Nvidia won.
It is proof. "AMD Radeon Graphics" may include any Radeon GPU. Now, if it contains even a single Radeon 6000 series GPU, it invalidates all RX 6000 vs Ampere share comparisons in the Steam survey.

And since that mystery "card" is gaining market share...

You can figure out rest.