Nvidia DLSS vs. AMD FSR Performance Compared: Have Reddit Users Exposed Steve?

The lady doth protest too much, methinks.

Outside of 'the community', that is, the forums here, the HUB YouTube comments, and your Discord, a single odd benchmarking choice every once in a while would be perceived as little more than just that. But when arbitrary choices always seem to end up leaning towards one side, a narrative becomes visible. Responding with a snarky article won't prevent that - as you yourself note - so I'm not sure why you even feel the need to try.
 
The market disagrees with you.
For example, consumers chose to buy Freesync monitors instead of expensive GSync ones, which bring barely anything, or nothing, beyond what AMD is offering. Oh wait, Gsync monitors did bring something more than Freesync monitors: noisy coolers and higher prices :)
Actually, the market doesn't disagree with him. People are buying more Nvidia than AMD cards; you can see that in the market share Nvidia commands. From Q3 '21 to Q3 '22, Nvidia gained 5 points of market share, AMD lost about half of its share, dropping from 17% to 8%, and Intel gained a small foothold of around 4-5%.
 
DLSS and FSR add input lag, so they should be disabled anyway.
Only FSR 3 and DLSS 3 increase input lag.

In general, FSR 2 and DLSS 2 lower input lag, because more real FPS = lower input lag (when comparing native resolution to the upscaled result).

The exception is when you are heavily CPU-bottlenecked (like using DLSS at 1080p with a high-end GPU). In that case you won't see a noticeable FPS increase to offset the upscaling overhead.

This is the typical order of input lag (from highest to lowest): native resolution > DLSS 2/FSR 2 upscaled resolution > the lower native resolution that DLSS/FSR use as a base.
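
To put rough numbers on that ordering, here's a minimal back-of-the-envelope sketch in Python. The framerates are hypothetical, and the model assumes input lag scales with roughly one real frame time:

```python
# Toy latency model: input lag tracks the real frame time, so the higher
# real FPS you get from DLSS 2 / FSR 2 upscaling means lower input lag.
# All framerates below are hypothetical examples.

def frame_time_ms(fps: float) -> float:
    """One frame's worth of latency, in milliseconds."""
    return 1000.0 / fps

scenarios = [
    ("Native resolution", 60),
    ("DLSS 2 / FSR 2 upscaled", 90),
    ("Base (internal) resolution, no upscaler", 110),
]

for label, fps in scenarios:
    print(f"{label}: {fps} fps -> ~{frame_time_ms(fps):.1f} ms per frame")

# Frame generation (DLSS 3 / FSR 3) is the exception: interpolated frames
# raise the displayed FPS, but input is still sampled at the real render
# rate (plus buffering), which is why it adds latency instead.
```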
 
I would have been one to say use FSR with AMD and DLSS with Nvidia. However, it's good that you are able to show there is little difference, FPS-wise, between them. I think it's useful to include these in benchmarking, but I also think you have to quantify the benefit beyond just FPS increases. If I could triple FPS but at the expense of a blurry image, I wouldn't do it.

I think these technologies are still in their infancy but getting better. I also think these technologies will help us get better framerates without having to build high-power GPUs that are hard to cool. I think down the road this technology will be like antialiasing, which we all use every day.
 
I'm curious. With many people still using 1st and 2nd generation RTX cards - say the RTX 2070 Super or RTX 3070 / Ti - does this quality difference between DLSS and FSR still hold true? I'm thinking that 1st and 2nd gen DLSS probably gives the same graphics quality, but at a lower framerate. That's just an uneducated guess.
 
And lower sync rates. Freesync can't sync anywhere near as low as Gsync can. There's a reason one is free and Nvidia charges.

Hi, owner of a PG32UGX with the GSYNC premium module here. The screen flickers at low FPS, sub-30 fps. A lower sync rate is not a feature.

This has been known for years and still not fixed.

2014: https://techreport.com/news/27449/g-sync-monitors-flicker-in-some-games—and-heres-why/

2018, with updates from users in 2020:
https://www.nvidia.com/en-us/geforc...rivers/13/265231/g-sync-flicker-fix-must-see/


2021:
https://forums.flightsimulator.com/t/g-sync-causes-subtle-yet-rapid-flickering/366615
 
You are trying hard to defend your integrity, and this was blown out of proportion on Reddit, but your preference towards AMD is quite clear. At this point you could pit DLSS Balanced against FSR Quality (and DLSS sometimes looks even better) and Nvidia's performance would come out ahead… DLSS is a selling point, you can't deny it, and every Nvidia card should be evaluated with DLSS.
 
I'm curious. With many people still using 1st and 2nd generation RTX cards - say the RTX 2070 Super or RTX 3070 / Ti - does this quality difference between DLSS and FSR still hold true? I'm thinking that 1st and 2nd gen DLSS probably gives the same graphics quality, but at a lower framerate. That's just an uneducated guess.
DLSS image quality doesn't change from generation to generation, only performance does.
 
The market disagrees with you.
For example, consumers chose to buy Freesync monitors instead of expensive GSync ones, which bring barely anything, or nothing, beyond what AMD is offering. Oh wait, Gsync monitors did bring something more than Freesync monitors: noisy coolers and higher prices :)
How is that relevant?
 
Competition is what drives prices down, a truth that's hard to swallow for fanboys on either side.
So instead of sticking to your guns, stay calm and try to accept the facts.
You don't have to take my word at face value, just look at the history: compare Intel's generational "advancements" before Ryzen and after.
If you care about good products available at acceptable prices, you need to accept competition and not turn it into a life-and-death fanboy discussion.
TechSpot did a great job once again.
 
I watched Steve's YouTube video about this "controversy" too.
I think Nvidia PR and its fanboy base are making too much noise over nothing.
Regardless, in 2-3 years, Nvidia DLSS will be as obsolete as their defunct GSync.
AMD FSR, and especially the new FSR 3 approach, is better for consumers and game developers.
AMD has driven another nail into Nvidia's DLSS coffin, and Nvidia knows it.
I hope all of this pushes both Nvidia and AMD to lower their video card prices, or to offer more performance per dollar, instead of raising prices by 50-100% for 20-25% more performance.

" in 2-3 years, Nvidia DLSS will be as obsolete as is their defunct GSync."

Everything proprietary will ultimately obsolete. The difference between AMD and NVIDIA is that NVIDIA don't think long term, they want instant gratification like addicts who don't think about the consequences of their actions.
 
Benchmarking is a strange science, because its true purpose for most people isn't to compare relative performance, it's to have the highest score (or lowest, if that's the superior quantifier). So depending on a person's bias, they'll argue for the methodology that shows their preference in the best light. Some manufacturers will even try to influence the scores, because they know some consumers pay more heed to them than they should.

But for me, what matters more in benchmarking is transparency and consistency. I want to know how the numbers were arrived at, and that each new benchmarking session will be faithfully comparable to past ones. So in this case I have no problem with how these benchmarks were arrived at. More importantly, only a fool would base a purchasing decision on a single set of benchmarks.

Which leads me into my secondary topic. Reddit...

I've been a Reddit user for close to 10 years now, and for most of those 10 years I've had a love-hate relationship with the service. I love it enough to actually support it financially. It can have great communities in certain subs, with thoughtful and well-reasoned discussions. Of all the social media platforms, it's my default one. Hell, other than here, it's my only one.

But in the end it is a social media platform, with few restrictions on user discourse. This means that sometimes you'll have to deal with the unsavory side of the web. Some people are quite frankly *****s, but they have the right to be heard like anyone else, as long as they aren't spewing falsehoods. Because of this, it even reaches a point for me where I just stop using it for a few months to let the vitriol drain from my system.

Reddit's biggest selling point is also its biggest Achilles' heel. The purpose behind the voting system is to let the community decide what is and isn't important information: useful information and thoughtful discourse is supposed to percolate upwards, while anything that doesn't fit won't. That's the theory, anyway. The big problem is that the sorting mechanism is also a source of ranking, and many users forget its intended purpose because of that. It's the major reason I never consider Reddit to be anything more than a slightly more enlightened version of Facebook and its brethren.

Because in the end, giving any credence to thoughts expressed on a social media platform, beyond the fact that they're opinions more than fact, is a rabbit hole you shouldn't go down. Any social media site, including Reddit. You should know better. Would you react like this to a **** storm on Facebook? I would hope not. Let the donkeys bray, it's what they do. It's better for a person's sanity to simply ignore them.
 
" in 2-3 years, Nvidia DLSS will be as obsolete as is their defunct GSync."

Everything proprietary will ultimately obsolete. The difference between AMD and NVIDIA is that NVIDIA don't think long term, they want instant gratification like addicts who don't think about the consequences of their actions.

Well, that is certainly an interesting opinion, what with Nvidia's annual research budget sitting pretty at $7.3 billion this year, but sure - who needs research if you can just 'think long term'.

Unless long-term thinking is some euphemism for faithfully copying their competitor's every feature, only about two years later and just a little worse. Which has pretty much been AMD's corporate DNA since, what, the Am9080 back in '75?
 
Thank goodness FSR is open source and not AMD-specific. I got rather poor frame rates in Cyberpunk 2077 on my 11th-gen Intel notebook (with Intel Xe graphics). I don't know if FSR works on Xe in Windows, but it does in Linux. Frankly, Cyberpunk 2077 is a bit much to run on there. It's the catch-22 of integrated graphics: since Cyberpunk 2077 is so CPU-intensive, with default settings the GPU clocks down 200-300 MHz to let the CPU cores run at high speed; clock down the CPU cores and the GPU bumps up to full speed, but then it's CPU-limited instead. FSR boosts the framerate from "unplayable" to "low". (It's by far the most demanding game I've run on it; other games usually hit 60+ FPS on high, or occasionally medium, settings.)
 
Competition is what drives prices down, a truth that's hard to swallow for fanboys on either side.
So instead of sticking to your guns, stay calm and try to accept the facts.
You don't have to take my word at face value, just look at the history: compare Intel's generational "advancements" before Ryzen and after.
If you care about good products available at acceptable prices, you need to accept competition and not turn it into a life-and-death fanboy discussion.
TechSpot did a great job once again.
Competition doesn't seem to be having any impact on pricing. About the only cheap GPUs are Intel's, and for low-end builds they may be OK, but they have nothing for the high end yet. AMD went right along with Nvidia when pricing its cards. They are all too expensive, or rather, they all cost more than we want to pay for that level of performance. Whether the prices are justified is hard to say for sure: there's the cost to build a GPU, market it, sell it, and support it. What's an acceptable margin to make on GPUs? Do we even know what AMD and Nvidia are making? I think that data is obscured by both companies having more than just GPUs to sell.

The current crop of GPUs does bring new levels of performance, at a high price, but the mid- and low-end cards coming will only deliver performance we've already seen in previous-generation cards. If they don't bring that level of performance at a lower price, what's the point?
 
These articles need a TL;DR. Nobody in their right mind would run FSR on RTX cards for an extra 2 fps. Thanks for proving the performance difference, but it's so insignificant that nobody cares; people will just use DLSS as they have been.

And lower sync rates. Freesync can't sync anywhere near as low as Gsync can. There's a reason one is free and Nvidia charges.
Freesync is the same as Gsync when it comes to sync rates, and has been for a very long time now (it's no longer 2016). You can find Freesync monitors that support rates as low as 30 Hz; it just depends on how good the panel is and the price of the monitor.

Here are a few quick examples I found in a few minutes on Google:
Acer KG251QZ
AOC AGON AG352QCX
ASUS MG279Q
Nixeus NX-EDG27
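
For completeness, the usual way a Freesync setup handles FPS below the panel's minimum refresh is low framerate compensation: the driver shows each frame multiple times so the effective refresh stays inside the variable-refresh window. A rough sketch, assuming a hypothetical 48-144 Hz window (the exact window depends on the panel):

```python
# Sketch of low framerate compensation (LFC): when FPS drops below the
# panel's minimum refresh, each frame is repeated so the effective refresh
# stays inside the VRR window. The 48-144 Hz window is a made-up example;
# LFC roughly requires the window's max to be at least 2x its min.
VRR_MIN_HZ = 48
VRR_MAX_HZ = 144

def lfc_refresh(fps: float) -> float:
    multiplier = 1
    while fps * multiplier < VRR_MIN_HZ:
        multiplier += 1  # show each frame one more time
    return fps * multiplier

for fps in (45, 30, 20):
    print(f"{fps} fps -> panel refreshes at {lfc_refresh(fps):.0f} Hz")
```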
 
OK chat, time for a welcome joke, to help us see this debate from another vantage point.

A "fisherman" was selling his fishes in the market. Joe guy asked him the prices.
"Fisherman" said: $100 for 10 fishes. $1000 for 10 fish heads.
Why so expensive fish heads? Asked Joe.
Because fish head contains phosphorus, which makes you smarter. If you eat 10 fish heads, you'd turn into a genius.
Really? Asked Joe. OK, then I buy 10 fish heads.
While eating the 2nd fish head, Joe screamed:
Wait a minute, If I had bought 10 fishes, I should get 10 fish head too for only $100.
You cheated on me.
The "fisherman" answers: see that phosphorus is working as I told you, now you are more intelligent than before.

Nvidia PR is the "fisherman". Who is Joe? Spoiler: the uninformed customer(s).
How many Joes fell into the DLSS 2-3 net?

Back to reality: watching Nvidia's business approach, I propose the "Nvidia conjecture":
the DLSS version number will increase with every "new" generation of Nvidia video cards, and the new version will "work" only on that latest generation.
Spoiler warning: DLSS 4 will come along and "work" only on the next-gen Nvidia 5xxx video cards.
 
Nvidia PR is the "fisherman". Who is Joe? Spoiler: the uninformed customer(s).
How many Joes fell into the DLSS 2-3 net? Spoiler warning: DLSS 4 will come only to the next-gen Nvidia 8xxx video cards.

Who is the other fisherman down the back alley, selling two-day-old off-brand fish that at least some people consider to be just as good?
 
Most of the people screaming are probably those who already bought. If you were unfortunate enough to study marketing, you will have learned about post-purchase behaviour: some people read reviews more after they buy. Strange, but true.

But really, if we were in the market for some $800 card, we'd watch at least 2 or 3 reviews and decide which parts are important to us.

The only thing I'd complain about in Steve's reviews is the frames-per-buck metric. It seems like a great metric for buying, but it counts a useless frame as equal to a critical one - I.e. frame number 1 vs. frame number 357.

We did have the serendipitous 4090 review where 4K gave 144 frames - kind of spot on, but with no buffer.

Taken to the extreme, with a top-speed/price ratio applied, a supercar seems much better value for cruising 3rd Avenue - when a Lime scooter would be enough.
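
To illustrate the frames-per-buck complaint, here's a quick sketch with made-up prices and framerates: the raw metric credits every frame equally, while a capped variant stops crediting frames beyond a target refresh rate, so a halo card's "useless" extra frames no longer inflate its value:

```python
# Raw FPS-per-dollar vs. a capped metric that ignores frames beyond a
# target refresh. All prices and framerates are hypothetical.
cards = {
    "Halo card": (1600, 220),    # (price in $, average FPS)
    "Midrange card": (800, 144),
}
TARGET_FPS = 144  # frames beyond the monitor's refresh add little value

for name, (price, fps) in cards.items():
    raw = 1000 * fps / price
    capped = 1000 * min(fps, TARGET_FPS) / price
    print(f"{name}: raw {raw:.0f} vs. capped {capped:.0f} FPS per $1000")
```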
 
Competition doesn't seem to be having any impact on pricing. About the only cheap GPUs are Intel's, and for low-end builds they may be OK, but they have nothing for the high end yet...
That's a bullseye. AMD and Nvidia already have the process polished. Intel steps up with bargain prices and brings some money and bright people along the way, so if not with the next gen, then with the following one, they will get the formula working. Companies will need to differentiate, so you get innovation on top of the price competition.
There you go, back on the topic.
 
You are all losers, Intel ARC is the future. ... Quadruple irony intended. I don't like the proprietary APIs Nvidia keeps creating. I've used Nvidia for decades, but HairWorks, Gsync, RTX, DLSS, and CUDA are a pain for developers. The problem is that Nvidia is gaining ground with CUDA and RTX while the competition isn't there, and I don't see Nvidia going open source anytime soon. Nvidia has invested much more in APIs, libraries, drivers, and software in general, and for the others to catch up will require big investment. AMD has to make faster silicon and spend billions on the software side at the same time; it won't be easy, but at least Nvidia looks completely out of touch. Maybe like Intel was a few years ago?
In the corporate space, CUDA is miles ahead of OpenCL.
 
I dispute the notion that DLSS looks a lot better anywhere other than at 1080p. At 1440p and 4K they are incredibly close, and the only thing I can clearly see DLSS doing better is thin objects like wire fences, etc.

If AMD made FSR leverage the tensor cores on Nvidia cards for even better performance, even rusted-on Nvidia developers would switch over to FSR.

I hope we ultimately end up with a single open-source upscaling approach, and if Nvidia wants to contribute something positive to that, they are more than welcome.
 