Comparing Nvidia DLSS vs AMD FSR in the same games: Round 1

And mark my words: FSR will surpass DLSS in the number of games it's implemented in very soon. FSR will be in almost every game, but DLSS won't - no matter how much money Nvidia throws at the problem.

Problem is, FSR is not locked to AMD cards. So at the end of the day, RTX GPUs will be even more sought after, because you have access to both DLSS and FSR.
 
That's not the reason he's upset. He is pro-AMD, and in the video the conclusion is that DLSS is better, although in some situations they are very close.
Why would anyone be upset? I think it was the general consensus from the start that DLSS 2 would be better. I think most of us actually expected lower performance.

As long as FSR is good enough - and according to the video it's almost as good at 1440p and 4K in the highest quality mode - all is fine. It neither requires additional transistors nor forces anyone to buy a new card.

Not sure why this has to turn into an e-pen measuring contest.

Thinking in car terms, it's a bit like boasting that an expensive full EFI+turbo conversion kit gives slightly better results than a new spark plug.
 
I would take this as a win for FSR, considering Nvidia is pouring vast amounts of resources into something that, after two generations, is only marginally better than the competition.

How much is it costing them to support DLSS on the backend, all the AI training, etc.? FSR seems vastly more efficient.

The real value of the tensor cores would and should be in applications beyond DLSS. What those are I do not know, but it seems that having these specialist components and only using them to upscale (in gaming GPUs) is selling them short somewhat.
 
Problem is, FSR is not locked to AMD cards. So at the end of the day, RTX GPUs will be even more sought after, because you have access to both DLSS and FSR.
Wrong!

Some people will see this as you say (those that like to pay the Nvidia tax), but some will also see it as I do: "I no longer have to buy an Nvidia GPU to get the same tech. So F U, Jensen, for all your BS and mocking of gamers over the last years - you don't deserve my money anymore!"


I can now easily buy an AMD GPU (or even an Intel DG2) and I'm not "forced" to give money to those scum, because Nvidia has such a small advantage now (and soon it will be even smaller) that it's irrelevant and I don't care.

This is only FSR 1.0. FSR has much more room to grow and get better, whereas DLSS is almost as good as it can be. It doesn't have much more room to grow...
 
Why don't they compare it to DLSS 1.0 and see how much better AMD has done with its first attempt?
No, no, no, no.

These websites cannot do a fair thing like that, especially when the nv!diots are running wild like rabid Apple cult members.

Especially knowing that they would see their beloved deity massacred.
 
Don't worry, the same hypocrites who cry that DLSS is better will use FSR in all those games that have FSR but not DLSS. And they won't even admit it, or say thanks to AMD.
I can already see Nvidia's marketing team relabeling FSR as something they did, like the BS they pulled with FreeSync, calling it "G-Sync Compatible".

Those d!ck$ have no decency in them.
As a GTX 1080 owner, I give kudos to AMD for doing what Nvidia did not want to do for us, their customers.
That is my main issue with giving any money to Nvidia.

They had zero issues burning my @ss as a customer in the past, so I said enough is enough and dumped them.

Unless you have something that ties you to them, like a CUDA-related work tool, I would suggest dumping them.
 
Problem is, FSR is not locked to AMD cards. So at the end of the day, RTX GPUs will be even more sought after, because you have access to both DLSS and FSR.
Nvidia cards could cure world hunger and I wouldn't give them a penny.

Because you can guarantee that it will still come with some BS attached.

Nvidia customers should open their damned eyes and see that DLSS, like everything they release, is simply a lock-in tool, and as consumers we must resist anything that limits our options.

I wonder when we consumers stopped caring about ourselves and started blindly worshiping these stupid corporations?
 
The best part is that even though FSR will surpass the adoption rate of DLSS at some point (soon?), which is very important, just as important is the fact that more and more versions of this tech are coming out: MS has one in the works, Intel, besides adopting FSR, has its own version (or at least an accelerated one), and lots of game engines have it built in - like UE5 and others.

So DLSS is looking more and more redundant and irrelevant as time passes...

Of course you have to have an open mind to see this and not be a blind fanboi drunk on nvidia's BS kool aid.
 
Are you insinuating that TechSpot might be supporting GeForce more than Radeon? :D
You'd have to be crazy to think that... Steve and the boys have spent years now capping on RTX and going out of their way to hide any hint of it being good.
 
The best part is that even though FSR will surpass the adoption rate of DLSS at some point (soon?), which is very important, just as important is the fact that more and more versions of this tech are coming out: MS has one in the works, Intel, besides adopting FSR, has its own version (or at least an accelerated one), and lots of game engines have it built in - like UE5 and others.

So DLSS is looking more and more redundant and irrelevant as time passes...

Of course you have to have an open mind to see this and not be a blind fanboi drunk on nvidia's BS kool aid.
That's fine, and I'm happy to switch over when the built-in / open-source version is superior and is used in more games... Until then I'm not going to go out of my way to pick something inferior if DLSS is available.

G-Sync was in the same boat for a long time, and once the competition caught up enough, they did the right thing and adopted the open standards as well.

I now have a "G-Sync" LG OLED, but it's really just HDMI 2.1 VRR, and I'm happy with its results and don't feel the need for a "real" G-Sync display.
 
Don't worry, the same hypocrites who cry that DLSS is better will use FSR in all those games that have FSR but not DLSS. And they won't even admit it, or say thanks to AMD.

And mark my words: FSR will surpass DLSS in the number of games it's implemented in very soon. FSR will be in almost every game, but DLSS won't - no matter how much money Nvidia throws at the problem.

Can't people see how fast the official and unofficial FSR adoption rate is? DLSS won't be able to keep up at this rate.

As a GTX 1080 owner, I give kudos to AMD for doing what Nvidia did not want to do for us, their customers. GJ, AMD - I'm waiting for GPU prices to come down so I can buy one from you.
Your attitude here is kind of weak. Plenty of us are happy to use FSR when it's either (a) the better choice or (b) the only option.

But that doesn't mean I'm gonna NOT use the superior tech when it's available.

FSR doesn't get an automatic "win" just because it's open source or runs on outdated hardware.

I buy the best tech because I want the best experience, and until AMD can make FSR superior AND available in more games, I'm obviously going to continue to use the tech that's enabled by my purchase of top-tier gear.

I have no problem using the "AMD way" (as well as an AMD GPU) when it's superior.

I was an ATI/AMD Radeon user for 14 years, but I switched to Nvidia for the first time in 2013, when I finally reached a point in my life where I could afford and use "the best" and AMD wasn't competing at that level.

They've yet to reach a point, for me, where they are "the best" (4K/120 with all the bells and whistles of RTX), but if they ever do offer the best GPU, I'll gladly jump back over to their side.

Having the opinion that DLSS, or RTX in general, is "good" or "worth it" doesn't automatically make you some Nvidia fanboy.

Some of us just want the best, can afford the best, and won't settle for less.

 
This will do more to alleviate the inflated prices of video cards than anything else that they could have done. It's very possible that FSR will literally double the lifespan of my RX 5700XT. I can live quite happily with that. :laughing:
 
The guy in the video is boring to watch and listen to.
These are the choices you get with HWU. Timmie Mercury isn't the best orator, but he is clearly the brains behind the operation, whilst Steve is a creepy, obnoxious, smarmy sod with a smoother delivery... He reminds me of the guy from The Expanse who got the sack for allegedly sexting underage girls he was meeting at conventions, and I suspect Steve's career will have a similar ending.

They make excellent charts to pause and flick through though.
 
The guy in the video is boring to watch and listen to.
I've personally never found Tim boring... Hey, maybe that's why he grew the 'stache! :laughing:
Nope. I had a GeForce 256, 2 MX400, 2 Pro, 4 MX440, FX 5900 XT, 670, 980 and 1080. My opinion is strictly based on my perception after watching that guy for 25 minutes. They should put someone more charismatic on next time.
Geralt, I think that maybe it's just this video. Tim and Steve are the two halves of Hardware Unboxed and have been for many a year. Granted, Tim's not nearly as entertaining as Steve but that doesn't make him boring, maybe just boring in comparison. Tim does work very hard to keep a poker face when he does this because he wants to be professional. If you want entertaining though, nobody tops Steve Burke in that department. He's hilarious. Hardware Unboxed calls him "Long-hair Steve".
Fair enough.
First point still stands: make your own 'more entertaining' video, then.
But that's just because I'm of the opinion that a tech analysis like this doesn't need to be 'entertaining'; it's meant to be educational about the differences between these two techs that do similar things, and the benefits/disadvantages of both within the games that support both.
To be fair, Tim's not usually this wooden.
Did you jump into their analysis videos of the various settings in RDR2 or FH4 - the impact each setting makes on system usage (and image quality), and then how to optimize them for lower-end machines - complaining about how he didn't sing and dance enough to keep you entertained as well? Because as much as I found those videos useful and educational, they're far from entertaining. Because entertainment wasn't the point.
You're right, entertainment isn't the point but it probably wouldn't hurt to have a bit of fun with it like Steve Burke does on Gamers Nexus.
This isn't Linus Tech Tips (whom, frankly, I find insufferable).
Tell me about it. I call him the "Tech Justin Bieber" (although Linus has a LOT more personality than Bieber). It's a shame really because back in the day, Linus was REALLY good. Then one day he sold out and became the meme that he currently is.
Weird to see the comments become a referendum on a guy who's been doing videos/content for... forever?

The video was extremely informative and balanced. It would have been nice if the video was converted into a long form review (like what Techspot usually has) but it's better than nothing. Video + Summary = ain't bad.
I couldn't help but think that same thing. This is Tim "The 'Stache" Scheisser, also known simply as 'Stacheisser. He's been doing a fine job of this for many years. It seems odd that some people don't recognise him after all this time but I guess, it's not like he's Andre the Giant or someone like that. :laughing:
So true - you can't even use DLSS on Nvidia's own cards, which aren't even possible to buy in most cases, even more so at MSRP.

Not at MSRP - they don't exist to me.
I have it even easier than you, because Nvidia cards don't exist to me unless I'm not paying a dime for them. :laughing:
Don't worry, the same hypocrites who cry that DLSS is better will use FSR in all those games that have FSR but not DLSS. And they won't even admit it, or say thanks to AMD.
I couldn't agree more. They STILL haven't thanked AMD for Mantle even though DirectX 12 and Vulkan draw on it extensively. In fact, Vulkan really IS Mantle, but what can ya do, eh? :laughing:
I had a GTX 960 2GB and an RX 580 8GB, and now I have an RTX 2060 6GB. AMD graphics cards have bad driver optimization, and after many years they still don't have services like GeForce Experience, PhysX, etc. The video encoder on AMD GPUs has very bad quality versus file size. I have a 1080p monitor; for example, in Marvel's Avengers I play at max quality at 2K resolution with super resolution, using DLSS Balanced at above 60 Hz, and it's really better quality than native 1080p, and I'm really satisfied. I'll never buy an AMD GPU again if AMD doesn't have competitive services and technologies.
This is one of the most confused, unknowledgeable and discombobulated posts that I have ever seen on Techspot. I didn't think that I'd ever see something this insane in my life but you did it on your first post! BRAVO!
 
The premise of the test stops NVIDIA from winning outright because "improve performance without a significant loss to visual quality" is NOT the only objective of the technology. The second objective of the tech is to display resolutions that the device is not capable of directly rendering. AMD's tech is not going to be able to achieve this the way it's doing things.
 
I found this video. This is what really happens when using FSR: comparing native 720p (no FSR) vs 1080p FSR Quality (720p internal) vs 1440p FSR Performance (720p internal).


You can see FSR doesn't add any detail the way DLSS does, because it has no AI hardware or AI algorithm, so comparing these two methods is wrong. FSR is just an upscaling algorithm. Image quality at 1080p in all FSR quality modes is really bad, yet people want to use this option on low-res monitors and cheap graphics cards like the RTX 2060 (I mean the real MSRP, not the current MSRP). It's good to have FSR for old GPUs, but it's really disappointing that the AMD RX 6000 series doesn't have comparable hardware to compete with Nvidia. Plus it has no dedicated ray-tracing hardware, so performance is really bad with ray tracing on, while their hardware is priced like Nvidia's.
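For reference, here is a minimal sketch of what each FSR 1.0 quality mode actually renders internally, using the per-axis scale factors AMD published for FSR 1.0 (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x); the helper function itself is purely illustrative:

```python
# Illustrative sketch: FSR 1.0 internal render resolution per quality mode.
# The per-axis scale factors are the ones AMD published for FSR 1.0.
FSR_FACTORS = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_internal(out_w, out_h, mode):
    """Resolution the game renders at before FSR upscales to (out_w, out_h)."""
    f = FSR_FACTORS[mode]
    return round(out_w / f), round(out_h / f)

print(fsr_internal(1920, 1080, "Quality"))        # (1280, 720): 1080p Quality is a 720p render
print(fsr_internal(2560, 1440, "Performance"))    # (1280, 720): so is 1440p Performance
print(fsr_internal(3840, 2160, "Ultra Quality"))  # (2954, 1662): ~4.9M of 4K's ~8.3M pixels
```

This matches the comparison above: both 1080p Quality and 1440p Performance start from the same 720p internal image.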
 
DLSS Quality mode has better image quality than native resolution (720p, 1080p, 1440p, 4K) in almost every scenario (in Marvel's Avengers the hair is really good, better than native res, but in Red Dead Redemption 2 the hair is bad). With FSR you never get better image quality than native, because FSR renders the game below native resolution and just upscales it with an algorithm. That's good for old GPUs but not good for the RX 6000. I hope AMD introduces truly competitive hardware, or at least an algorithm that makes the image better than native while boosting fps.
 
In a real scenario, when you want to play the latest AAA game like Cyberpunk 2077 with ray tracing, you can't get 4K at 60 fps even with an RTX 3090, so you have to use Performance mode, and in reality Performance mode renders the game at 1080p internally.
Two pictures I used from TechSpot's article:
https://static.techspot.com/articles-info/2165/images/F-14.jpg
https://static.techspot.com/articles-info/2165/images/F-17.jpg
You can compare the fences in the two pictures, and I have to say FSR couldn't produce that detail even in its dreams.
AMD must introduce real technology for the RX 7000 or future GPUs.
Link to the article:
https://www.techspot.com/article/2165-cyberpunk-dlss-ray-tracing-performance/
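To put numbers on the "Performance mode is really a 1080p render" point, here is a quick sanity check assuming the commonly cited 2.0x per-axis scale factor for DLSS Performance mode (the factor is an assumption here, not something from the article):

```python
# Sanity check: what DLSS Performance mode renders internally for 4K output.
# Assumes the commonly cited 2.0x per-axis scale factor for Performance mode.
out_w, out_h = 3840, 2160            # 4K output resolution
scale = 2.0                          # Performance mode, per axis
render_w, render_h = int(out_w / scale), int(out_h / scale)
print(f"Internal render: {render_w}x{render_h}")  # 1920x1080
print(f"Pixels rendered: {render_w * render_h / (out_w * out_h):.0%} of native")  # 25%
```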
 
The guy in the video is boring to watch and listen to.

Dude, we're not here expecting Golden Globe acting awards; it's a PC benchmark site. Just be glad it's not some 800 lb slob with a hairy chest wearing a stained wife-beater shirt. Don't like the guy? Then go somewhere else for reviews.
 
In a real scenario, when you want to play the latest AAA game like Cyberpunk 2077 with ray tracing, you can't get 4K at 60 fps even with an RTX 3090, so you have to use Performance mode, and in reality Performance mode renders the game at 1080p internally.
Two pictures I used from TechSpot's article:
https://static.techspot.com/articles-info/2165/images/F-14.jpg
https://static.techspot.com/articles-info/2165/images/F-17.jpg
You can compare the fences in the two pictures, and I have to say FSR couldn't produce that detail even in its dreams.
AMD must introduce real technology for the RX 7000 or future GPUs.
Link to the article:
https://www.techspot.com/article/2165-cyberpunk-dlss-ray-tracing-performance/

You are missing the point....

All the GTX owners are thanking AMD for FSR. Nobody in the world cares what video card YOU BUY... but the people who make games have noticed FSR and are fully adopting it, because it just works.

DLSS doesn't "just work"... it requires a lot of money from Nvidia, and it takes months of working with the game developer to get it right. Most game developers are making games for the PlayStation 5 and XSX and are NOT making games for Nvidia. That is why Nvidia had to pay CD Projekt Red to use DLSS and RTX in Cyberpunk 2077, and why Cyberjunk was delayed.

 
You are missing the point....

All the GTX owners are thanking AMD for FSR. Nobody in the world cares what video card YOU BUY... but the people who make games have noticed FSR and are fully adopting it, because it just works.

DLSS doesn't "just work"... it requires a lot of money from Nvidia, and it takes months of working with the game developer to get it right. Most game developers are making games for the PlayStation 5 and XSX and are NOT making games for Nvidia. That is why Nvidia had to pay CD Projekt Red to use DLSS and RTX in Cyberpunk 2077, and why Cyberjunk was delayed.
I've been selling PC hardware for 20 years and I love hardware. I've sold more AMD CPUs than Intel ones this year because they have better performance per watt than Intel and are good for gaming. I'm disappointed because FSR doesn't really change anything. I'd love to see AMD do better than Nvidia, so Nvidia is forced to sell its GPUs for less than they do right now.
FSR is a good thing - free and simple for developers to use - but it can't help at resolutions below 1440p. This option doesn't have image quality like DLSS. You can search on YouTube and see that a game FSR renders at 720p doesn't really look better than native 720p. FSR Ultra Quality at 4K renders the game at about 1662p (2954x1662 ≈ 4.9 million pixels, versus about 8.3 million at native 4K). If you play on a 28-32 inch 4K monitor, can you really tell the difference between 2160p and 1662p from 30 cm away, or see much difference in a still image versus video? When you want to find out which CPU is best, do you benchmark at 1080p or 4K to show whether the CPU is the bottleneck?
If FSR is good, why does everybody use Ultra Quality at 1080p, only to show us that FSR is lower-res than native and not as good as native? So what is the difference between native rendering and the same resolution rendered through FSR? What is the benefit if image quality doesn't improve below 1440p?
 
This is my thought:
There are a lot of articles and news telling us AI is the future of computing. Everybody will eventually be forced to buy a new CPU and GPU for gaming. The GTX 1080 Ti is really good today, but when the lowest settings of a game demand something like an Xbox Series X, the 1080 Ti's time will be over - maybe 2 or 3 years from now - and you'll finally have to buy a new graphics card. Maybe you use Adobe software, or some future software that relies on AI, deep learning, or something like that which needs AI-specific hardware. Eventually AMD must develop AI hardware, and at that point it can make a technology like DLSS - but by then DLSS will be 5 or 6 years old and much more evolved. I'm disappointed that AMD isn't developing the hardware it must develop right now; three years from now, people will regret buying GPUs that lack important technologies Nvidia introduced years earlier.
 