Nvidia DLSS vs. AMD FSR Performance Compared: Have Reddit Users Exposed Steve?

Freesync is the same as Gsync when it comes to sync rates and has been for a
very long time now (it's no longer 2016). You can find Freesync monitors that support rates as low as 30Hz. It just depends on how good the panel is and the price of the monitor.

Here are a few quick examples I found in a few minutes on Google:
Acer KG251QZ
AOC AGON AG352QCX
ASUS MG279Q
Nixeus NX-EDG27
This is true for "regular" Gsync, but Gsync Ultimate goes all the way down to 1Hz. Any dip below 30Hz is still covered.
 
Hi, owner of a PG32UGX here with the Gsync Ultimate module. The screen flickers at low FPS, exclusively below 30 fps. A lower sync rate is not a feature. You must not have used Gsync Ultimate below 30 fps, because you will get screen flicker.

This has been known for years and still not fixed.

2014: https://techreport.com/news/27449/g-sync-monitors-flicker-in-some-games—and-heres-why/

2018 (with updates from users in 2020):
https://www.nvidia.com/en-us/geforc...rivers/13/265231/g-sync-flicker-fix-must-see/


2021:
https://forums.flightsimulator.com/t/g-sync-causes-subtle-yet-rapid-flickering/366615
 
Well, I just tested it myself.

I fired up Deathloop and limited it to 20FPS through RivaTuner. I confirmed it was running at that rate in the onscreen display using RivaTuner/HWiNFO64.

Obviously it didn't look very smooth, BUT there was zero flicker. Perhaps this is more of an issue with specific monitors and their ability to display low fps, rather than the Gsync module itself. So instead of stating "Because you will get screen flicker." it would be more accurate to say you MAY get screen flicker, as I saw none.

Forums are not a reliable source to base an argument on. I searched for this issue and could not find any mention of it from respected (and hence, technically knowledgeable) websites, such as TechSpot, Tom's Hardware, or Display Ninja.

Maybe it's time for a GPU upgrade - who wants to game at under 30FPS anyway?
 
Testing with any upscaler never made sense as the default.
Especially considering that overclocking wasn't entertained, which has a far better argument for inclusion, though I'm not advocating for either. I'm just glad the upscaler issue has been dealt with.
 
The point is, you can't please everyone. The moment someone publishes an article that makes one party look worse than the other, there will be people who cry foul. Visual quality aside, the impression is that DLSS and FSR perform quite close. And back to visuals: real gamers don't pixel peep. Real gamers enjoy the game. I can play 16-bit Chrono Trigger and still think it is a great game despite the ancient graphics.
 
This is true for "regular" Gsync, but Gsync Ultimate goes all the way down to 1Hz. Any dip below 30Hz is still covered.
Any dip below 30Hz is covered by running the screen at a multiple of that framerate. Gsync Ultimate, just like Freesync Premium, is mostly a certification that deals with HDR quality and refresh rates. Freesync Premium also includes LFC (Low Framerate Compensation) for when the FPS drops below the advertised minimum.

The advertised "1Hz" on some monitors is just marketing.
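To illustrate what that compensation looks like, here is a minimal Python sketch of the idea; the 48-144 Hz window and the helper function are illustrative, not any particular monitor's or vendor's actual implementation:

```python
def lfc_refresh_rate(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Pick a panel refresh rate for a given game frame rate.

    Inside the panel's VRR window the refresh simply tracks the frame rate.
    Below the window, each frame is repeated an integer number of times so
    the physical refresh stays within [vrr_min, vrr_max]. The 48-144 Hz
    window is an example, not a specific monitor's spec.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)           # normal VRR: refresh follows the frame rate
    multiplier = 2
    while fps * multiplier < vrr_min:      # repeat frames until we're back in range
        multiplier += 1
    return min(fps * multiplier, vrr_max)


if __name__ == "__main__":
    for fps in (20, 25, 30, 60, 160):
        print(f"{fps:>4} fps -> panel refresh {lfc_refresh_rate(fps):.0f} Hz")
```

So a 20 fps dip on that hypothetical 48-144 Hz panel would be displayed at 60 Hz with each frame shown three times, which is why a "1Hz" claim can be technically defensible in marketing terms even though the panel never physically refreshes that slowly.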
 
I genuinely and sincerely wish I had so little to worry about in life that I could get triggered over a f**king graphics card benchmark. Some people need to get a life. Well done, TechSpot, for calling it out.
People like to have confirmation that they made a good purchase, and they also like to brag about it. I must confess that this happens to me too from time to time :)
 
And it just makes sense. They both render the frames at the same base resolution, so unless there is a large time-cost difference in the actual upscaling process, you'd expect very similar performance, if not necessarily image quality.

The largest part of producing each frame, by far, is still just rendering the frame that gets upscaled. And that frame is the same, is the same, is the same.
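To put rough numbers on that reasoning, here's a tiny sketch; all of the millisecond figures are hypothetical, picked only to show the shape of the arithmetic, not measured from any game:

```python
# Illustrative frame-time arithmetic; every number here is hypothetical.
RENDER_MS = 8.0        # cost of rendering the frame at the shared base resolution
UPSCALE_A_MS = 1.0     # hypothetical upscaler A cost per frame
UPSCALE_B_MS = 1.2     # hypothetical upscaler B cost per frame

fps_a = 1000.0 / (RENDER_MS + UPSCALE_A_MS)
fps_b = 1000.0 / (RENDER_MS + UPSCALE_B_MS)

print(f"Upscaler A: {fps_a:.1f} fps, upscaler B: {fps_b:.1f} fps")
print(f"Upscale cost difference:       {(UPSCALE_B_MS / UPSCALE_A_MS - 1) * 100:.0f}%")
print(f"Delivered frame rate difference: {(fps_a / fps_b - 1) * 100:.1f}%")
```

Even a 20% gap in upscaler cost shrinks to roughly a 2% gap in delivered frame rate once the shared render cost dominates the frame time.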
 
Well, I just tested it myself.

I fired up Deathloop and limited it to 20FPS through RivaTuner. I confirmed it was running at that rate in the onscreen display using RivaTuner/HWiNFO64.

Obviously it didn't look very smooth, BUT there was zero flicker. Perhaps this is more of an issue with specific monitors and their ability to display low fps, rather than the Gsync module itself. So instead of stating "Because you will get screen flicker." it would be more accurate to say you MAY get screen flicker, as I saw none.

Forums are not a reliable source to base an argument on. I searched for this issue and could not find any mention of it from respected (and hence, technically knowledgeable) websites, such as TechSpot, Tom's Hardware, or Display Ninja.

Maybe it's time for a GPU upgrade - who wants to game at under 30FPS anyway?
I don't game at 30 fps.

Usually it's loading screens, like in Total War: Warhammer 3, where the loading screens run at only 6-20 fps with tons of flicker.

Also, you didn't search very hard.

 
I'd like to see an image quality comparison between DLSS Balanced mode and FSR Quality mode.

If DLSS is better than FSR at the same mode, then that means you can drop DLSS to a lower quality mode.
 
I dispute the notion that DLSS looks a lot better anywhere other than at 1080p. At 1440p and 4K they are incredibly close, and the only thing I can clearly see DLSS doing better is thin objects like wire fences, etc.

At 1440p and 4K you can drop DLSS to a lower mode (Balanced or Performance) and still not see a big difference. FSR will not look as good in its lower modes.
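For context on what a mode-for-mode swap means in practice, here's a minimal Python sketch using the per-axis render-scale factors published for both DLSS 2 and FSR 2 quality modes (the helper function and the 4K example are illustrative; individual games can tweak these ratios):

```python
# Per-axis render-scale factors published for DLSS 2 and FSR 2 quality modes.
MODES = {
    "Quality":           1 / 1.5,   # ~66.7% per axis
    "Balanced":          1 / 1.7,   # ~58.8% per axis
    "Performance":       1 / 2.0,   # 50% per axis
    "Ultra Performance": 1 / 3.0,   # ~33.3% per axis
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a given output size."""
    scale = MODES[mode]
    return round(width * scale), round(height * scale)

for mode in MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode:<17} -> {w}x{h}")
```

So comparing DLSS Balanced (roughly 2259x1271 internal at 4K) against FSR Quality (2560x1440 internal) is effectively asking whether DLSS can match FSR while rendering about 20% fewer pixels, which is the comparison the post above is asking for.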
 
I don't game at 30 fps.

Usually it's loading screens, like in Total War: Warhammer 3, where the loading screens run at only 6-20 fps with tons of flicker.

Also, you didn't search very hard.

Again, that's some guy in a forum. Show me a link to a REPUTABLE tech site where the STAFF addresses this "well-known issue" in an actual article.

Besides, I just told you that I tested it myself and saw no flicker. Do you really think you're going to convince me otherwise? Your statement is wrong. One MAY see flickering, but it's not a guarantee as you are implying. I just proved it.
 
Run the Total War: Warhammer 3 campaign benchmark on Ultra at 4K if you want to reproduce it.
Weird how you don't believe end users reporting problems with Nvidia, but I bet you're the first to say forum posts are definitive proof that AMD has bad drivers.
 
Steve, you can let the sarcasm fly; the morons will watch your videos regardless. What are they going to do, actually see a therapist or take out their angst on a hardware reviewer?
 
I love the tone of this article.
Go-get-'em, Steve!

I mean, yes, for Christ's sake.
Why all the friction?
Journalism is a profession, whereas tweeting is merely a hobby. And a useless one at that. It's a Sign o' the Times that so many just stick to opinions, instead of checking for facts.

Which doesn't bode well for AI integration in the foreseeable future... :-(
 
" as games using the added effects have continued to gain ground and become an important aspect of GPU performance. Despite that fact that every time we poll your interest in ray tracing, the data generally comes back looking very bleak for the technology – seems like most of you still don't find the visual upgrade significant enough to warrant the massive performance hit, but that's a separate matter."

Wow!! So you have ventured a ludicrous opinion that your own data tells you is wrong!!!

Well done!! This reads like something taken right from the underbelly of Reddit!!
 
Dead Space FSR probably isn't the best example of FSR these days. The performance and artifacts scream FSR v1. Even RSR looks and runs better.
 
The lady doth protest too much, methinks.

Outside of 'the community', that is, the forums here, HUB YouTube comments and your Discord, a single odd benchmarking choice every once in a while would be perceived as little more than just that. But when arbitrary choices always seem to end up leaning towards one side, a narrative becomes visible. Responding with a snarky article won't prevent that - as you yourself note - so I'm not sure why you even feel you need to try.

Super snarky, right? Especially when the benchmark data does show that there can be significant differences, the conclusion is still "I'm right, but I'll test with Native anyway", lol. I also think there's merit to showing the other side of the coin; take this example.

Warhammer 40,000: Darktide runs slightly better when using DLSS, the RTX 4070 Ti was 4% faster at 1440p and 2% faster at 4K.

Sure, that's true: there's 2% when you measure 95 fps vs 93 fps, the end-result fps with upscaling on. But what's also interesting here is to calculate the difference relative to the input resolution's fps, not starting from zero; then we can see the actual difference in compute time each upscaler is taking, and in this example that difference is about 10%. Of course, there are swings in the other direction too.
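To make that back-of-the-envelope calculation concrete, here's a small sketch. Only the 95 fps and 93 fps figures come from the Darktide example above; the 120 fps input-resolution baseline is an assumed, purely illustrative number:

```python
# Back out the per-frame upscaler cost from delivered fps.
# 95/93 fps are the figures quoted above; the 120 fps input-resolution
# baseline is an assumption chosen only to illustrate the arithmetic.
INPUT_RES_FPS = 120.0          # hypothetical fps rendering at the input resolution, no upscale
DLSS_FPS, FSR_FPS = 95.0, 93.0

render_ms = 1000.0 / INPUT_RES_FPS
dlss_overhead_ms = 1000.0 / DLSS_FPS - render_ms
fsr_overhead_ms = 1000.0 / FSR_FPS - render_ms

print(f"DLSS overhead: {dlss_overhead_ms:.2f} ms, FSR overhead: {fsr_overhead_ms:.2f} ms")
print(f"Delivered fps gap: {(DLSS_FPS / FSR_FPS - 1) * 100:.1f}%")
print(f"Upscale-cost gap:  {(fsr_overhead_ms / dlss_overhead_ms - 1) * 100:.1f}%")
```

With that assumed baseline, a roughly 2% gap in delivered frame rate corresponds to about a 10% gap in per-frame upscaler cost; a different baseline would shift the exact percentage, but the point stands that the output fps difference understates the upscaler-time difference.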

Personally I think Steve wanted to test with FSR only because it brings him physical pain and discomfort every time he admits DLSS looks better.
 