Nvidia DLSS 2 vs Intel XeSS vs AMD FSR 2 on Nvidia GeForce and AMD Radeon GPUs

I disagree. I could buy a Mercedes and a 4090 if I wanted to, but there is a mindset to building wealth. Just because you have money doesn't mean you have money to spend.
For example, you buy a 3060 and go crazy with DLSS 2, with ghosting, etc. Better to save some more money and buy a 3070, and have more real native FPS instead of that AI invention. OK, hopefully you understand me better now.
 
okay, but save some money and go with a 3070ti, but wait, if I save some more I can get a 3080, but if I just save some more I can get a 3090. But if I'm going to get a 3090, 3090ti's aren't that much more.....

Make a budget and stick with it or you'll end up broke
 


To me, it is about matching your GPU to your monitor. Why set an artificial amount... instead of an actual goal, or need..?

Why spend $1500 on a monitor and have a $500 gpu..? Or vice-versa...
 
I use a 4K, 65-inch TV as a monitor. I won't post numbers, but I spent way more than $1500 on it.

I bought it with the idea of "future proofing" in mind. I'll have this display for 10 years; video cards are replaceable.
 
Right, that is why I said you have to set a goal or have a need.

So any HDMI 2.1 video card will most likely serve your need... but if your goal is 120 Hz, then a 3080/6900 is your starting point.
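(As a rough sanity check on why 4K at 120 Hz pushes you toward HDMI 2.1, here is a small back-of-envelope sketch. The link throughputs and the 4K120 timing below are approximate, commonly cited figures, not values from this thread.)

```python
# Back-of-envelope check: does 4K @ 120 Hz fit on HDMI 2.0, or does it need HDMI 2.1?
# Assumptions (approximate): the common CTA-861 4K120 timing of 4400 x 2250 total
# pixels (~1188 MHz pixel clock), HDMI 2.0 carrying roughly 14.4 Gbps of video data,
# HDMI 2.1 roughly 42 Gbps.

def video_data_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Uncompressed video data rate for a full timing, in Gbit/s."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

for label, bpp in (("8-bit RGB", 24), ("10-bit RGB", 30)):
    rate = video_data_rate_gbps(4400, 2250, 120, bpp)
    print(f"4K120 {label}: ~{rate:.1f} Gbps "
          f"(fits HDMI 2.0: {rate <= 14.4}, fits HDMI 2.1: {rate <= 42.0})")

# ~28.5 and ~35.6 Gbps: too much for HDMI 2.0 without chroma subsampling or DSC,
# comfortably within HDMI 2.1 -- hence "any HDMI 2.1 card" as the baseline for 4K120.
```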
 
Make a budget and stick with it or you'll end up broke
I am not talking about stingy people, obviously. To them, everything is expensive.
 
Using DLSS and FSR in Quality mode but XeSS in Balanced mode on the first screenshot is borderline dishonest. Sorry, but I don't think it's fair: even if it doesn't matter to most informed people, it misleads a good part of your audience. It's because I have great respect for your website that I take the time to tell you this. Even if you need to scroll down to see the full test, that is still the "front page" image, and a lot of people don't bother scrolling down or reading the article, and you know it. I would have understood weighting the comparison toward FPS for a multiplayer title, but for what is primarily a single-player game, I think image quality comes first. But that's only my humble opinion.

The main reason you enable one of these upscalers is to get a performance gain. What's the point of XeSS Quality mode looking as good as DLSS's or FSR's Quality mode if it's going to be slower? When they perform the same, then they could compare Quality to Quality.
 
Like I said, I'm with you for a multiplayer game; in a single-player game, graphics quality comes first (as long as I get at least 60 FPS). But it's a personal thing.
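(For anyone wondering what those mode names actually change: each preset just picks a lower internal render resolution that gets upscaled to the output. Below is a minimal sketch using the per-axis scale factors the vendors documented around the time of this comparison; treat them as approximate, since later SDK versions have shifted some of them.)

```python
# Rough sketch: internal render resolution implied by each upscaler preset.
# Scale factors are approximate, per-axis ratios documented around the time of
# this comparison (DLSS 2, FSR 2, XeSS 1.0); they are assumptions for illustration.

PRESETS = {
    "DLSS 2 / FSR 2": {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0},
    "XeSS 1.0":       {"Ultra Quality": 1.3, "Quality": 1.5,
                       "Balanced": 1.7, "Performance": 2.0},
}

def render_resolution(out_w, out_h, scale):
    """Internal render target for a given output size and per-axis scale factor."""
    return round(out_w / scale), round(out_h / scale)

out_w, out_h = 3840, 2160  # 4K output
for upscaler, modes in PRESETS.items():
    for mode, scale in modes.items():
        w, h = render_resolution(out_w, out_h, scale)
        print(f"{upscaler:14s} {mode:13s}: {w} x {h}")

# e.g. Quality at 4K renders ~2560 x 1440 internally, Balanced ~2259 x 1271.
# Two presets with the same ratio can still differ in speed, because the
# upscaling pass itself has a different cost on different GPUs.
```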
 
To me, it is about matching your GPU to your monitor. Why set an artificial amount... instead of an actual goal, or need..?

Why spend $1500 on a monitor and have a $500 gpu..? Or vice-versa...
The thing is, how much is too much (even if you can afford it)?

Monitors and TVs are much cheaper than before, while GPUs went up because of mining and stayed there, artificially. If CPU makers jump on the same train, soon you'll need to spend $1500 on each part ($1500 CPU + $1500 GPU + $1500 screen, ...).

People who set goals instead of a price are exactly the customers companies love:
- before, tech evolved so that for the same $500 you would get, say, twice the performance four years later (just an example); likewise, my first 1080p LCD cost the same as my 4K QD-OLED 120 Hz does now;
- now (since mining and scalpers took off a couple of years ago), they give you 100% more performance for a huge price increase.

So setting goals regardless of price is exactly what lets companies keep raising prices.
 

How much is too much...?

It's based on one thing: how much performance do you need. If you play single-player games where you can save progress... and just want eye candy and to be able to say you play with all the glitz turned up... performance really isn't a need, it is a WANT.

If you play competitive e-sports... then glitz doesn't matter and frames do; that is a need/requirement.

So you BUILD your machine for that purpose...



You have stated you play on a TV, so even a Vega 64 can play 4K games on a cheap 4K TV. Coincidentally, you don't even need to buy a new GPU; a used $500 6900/3090 would be a massive upgrade over your 1080 (i.e., the newest video cards would serve zero purpose for you).
 
We & you are talking about games, so you give a wrong message of "need". There is no "need" when playing games but desire / lust 😉 so, if you are buying hardware because you work with 3D, CAD, video editing, etc. then you need it. Gaming (unless you work making or playing games) is not something that people need.
 
I just tested all this tech in Gotham Knights, and XeSS, FSR and DLSS all just make the image "dirty": pixel clarity is lost, plus artefacts and some other strange issues.
Like someone said before, 1:1 pixel density on an LCD is recommended.
If your GPU doesn't have enough horsepower, just move up to another card that can drive native resolution at the desired FPS.
I also tested ray tracing for the first time; it's crap.
Everything looks surreal and oversaturated. Since when is pavement a mirror in real life?
 
We & you are talking about games, so you give a wrong message of "need". There is no "need" when playing games but desire / lust 😉 so, if you are buying hardware because you work with 3D, CAD, video editing, etc. then you need it. Gaming (unless you work making or playing games) is not something that people need.

Some people buy a car just to get to work... some buy a second car to race on the weekend, because it is their hobby...

Some people build a computer for non-competitive games, some people build their rigs for competitive play.

If you work on a computer, that isn't gaming or playing.
 
I understand your point, but my view is that gaming used to be within reach for most people (two or three average European minimum-wage salaries for a gaming rig that would last several years); now it's only for some, or you need credit to afford it...
 
The RTX 4000 series is a good performance upgrade, but unfortunately Nvidia went all in on DLSS 3, which tests showed to be a useless gimmick: it creates too many motion artifacts, enough to steer people away from it.

DLSS 3 is only worth it if you have over 45-50 FPS to start with. At that framerate the fake-frame artefacts are imperceptible unless you're actively hunting for a scenario where you can just about notice them. So if you want 100 FPS smoothness but your GPU can only render 50, it's useful; but if you want 30-60 FPS and your GPU can only manage 15-30, you're going to have a bad time with artefacts and increased latency.

The artefacts are also less noticeable when using a mouse as opposed to a controller because the camera movements are not as smooth so you're less likely to notice the ghosting effects.

In my experience, frame generation is also useful for visually mitigating micro-stutters caused by asset loading because they're being offset by the interpolated frames. They're still noticeable, but not as much.

I've tested it on several games and so far I haven't found one where I'd rather turn DLSS 3 off. It's not groundbreaking, but it's a nice feature when used properly.
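(To put rough numbers on the "needs ~45-50 FPS to start with" point, here is a very simplified sketch. The doubling factor, the one-frame hold-back and the baseline latency are assumptions for illustration, not measured values.)

```python
# Simplified model of DLSS 3-style frame interpolation (assumptions, not measurements):
# the presented frame rate is ~2x the rendered rate, and interpolation adds roughly
# one rendered frame of extra delay, because the newest real frame is held back while
# the generated in-between frame is shown first.

def frame_gen_estimate(rendered_fps, base_latency_ms=30.0):
    """Return (presented_fps, approx_total_latency_ms); base_latency_ms is a
    hypothetical baseline system latency without frame generation."""
    frame_time_ms = 1000.0 / rendered_fps
    presented_fps = 2 * rendered_fps        # one generated frame per real frame
    return presented_fps, base_latency_ms + frame_time_ms

for fps in (20, 30, 50, 60):
    presented, latency = frame_gen_estimate(fps)
    print(f"rendered {fps:3d} fps -> presented ~{presented} fps, ~{latency:.0f} ms latency")

# At 50-60 rendered fps the extra ~17-20 ms is hard to feel; at 20-30 fps the extra
# 33-50 ms lands on top of already sluggish response, and each generated frame has to
# bridge a much bigger gap between real frames, so the artefacts grow as well.
```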
 
Of the three titles I tried (Darktide, A Plague Tale: Requiem and Hogwarts Legacy), the only one that was actually a good experience in terms of latency and visual compromise was Hogwarts Legacy. I believe this is the best implementation so far, which makes me think it will improve as time goes on.
 
I have a 4070 Ti, playing on a 120 Hz TV at 1440p ultra settings with RT on. Not 4K, because from four meters away 4K is a huge waste of FPS. With frame generation I can guarantee 120 FPS in all games with this setup.

The only real issue with frame gen for me was screen tearing. In Portal with RTX, Control and A Plague Tale: Requiem, enabling VSync in the Nvidia Control Panel fixed it without introducing too much latency, but in Cyberpunk 2077 I'm getting 100 ms+ latency with VSync, which is a big deal-breaker. Setting the frame cap to 58 didn't work either for some reason; it also introduced too much latency. I managed to get it under control by forcing DLSS 2 to run in Quality mode, bringing the FPS just slightly under 120 about 90% of the time. I hope Nvidia fixes VSync with frame gen, because it would benefit a lot from it on non-G-Sync-compatible displays.

I've yet to try Darktide and maybe Hogwarts Legacy, although the latter seems to have a very disappointing RT implementation and poor optimization.
I also tested The Witcher 3 with RT. Latency is an extremely big issue with frame gen there, even without VSync. That game has high latency even at lower resolutions without DLSS or frame gen; I think it's just broken, because it also crashes a lot with RT on.
 
Yes, Hogwarts Legacy at native resolution has performance that is all over the place and leaves the hardware underutilized. At 4K ultra settings with RT on, DLSS set to Quality and frame generation enabled, I get an average of 120 FPS with dips into the mid-90s. The game is still pretty and very playable with DLSS 3, although native 4K with Nvidia's DLAA anti-aliasing is slightly superior in image quality at the cost of frame rates varying between 40 and 90 FPS.
Update: it seems that when playing at native resolution, frame generation is initially on by default, which makes sense given the performance I was experiencing.
 