Nvidia DLSS 2 vs Intel XeSS vs AMD FSR 2 on Nvidia GeForce and AMD Radeon GPUs

The RTX 4000 series is a good performance upgrade, but unfortunately Nvidia went all in on DLSS 3, which tests showed to be a useless gimmick: it creates too many motion artifacts, enough to steer people away from it.
 
DLSS 2.0 is still crushing it. Spider-Man really shows up the weaknesses of the other two methods: camera and object movement is very fast throughout the game, and FSR 2 struggles as a result, with a lot of ghosting and blurring.
 
The RTX 4000 series is a good performance upgrade, but unfortunately Nvidia went all in on DLSS 3, which tests showed to be a useless gimmick: it creates too many motion artifacts, enough to steer people away from it.

I distinctly remember folks saying that about DLSS 1.0, but it seems to have developed into something of value in 2.0.
 
Using DLSS and FSR in Quality mode but XeSS in Balanced mode for the first screenshot is borderline dishonest. Sorry, but I don't think it's fair. Even if it doesn't matter to most informed people, it misleads quite a part of your audience, and it's because I have great respect for your website that I take the time to tell you this. Even if you only need to scroll down to see the full test, it's still the "front page" image, and a lot of people don't bother scrolling down or reading the article, and you know it. I would have understood matching modes by FPS for a multiplayer title, but for what is primarily a single-player game, I think image quality comes first. That's only my humble opinion.
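For context on why that matters: going by the per-axis scale factors commonly published for these modes (Quality 1.5x, Balanced 1.7x; exact values can vary by upscaler and version), Balanced works from noticeably fewer pixels than Quality. A quick sketch:

```python
# Why Quality vs. Balanced screenshots aren't apples to apples.
# Assumes the commonly published per-axis scale factors
# (Quality = 1.5x, Balanced = 1.7x); exact values can vary by
# upscaler and version.
SCALE = {"quality": 1.5, "balanced": 1.7}

def internal_resolution(out_w, out_h, mode):
    """Internal resolution the upscaler actually renders at."""
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in ("quality", "balanced"):
    w, h = internal_resolution(3840, 2160, mode)
    share = (w * h) / (3840 * 2160)
    print(f"4K {mode}: {w}x{h} ({share:.0%} of the output pixels)")

# 4K quality: 2560x1440 (44% of the output pixels)
# 4K balanced: 2259x1271 (35% of the output pixels)
```

That's roughly a fifth fewer source pixels for the Balanced shot, which is exactly the kind of gap a front-page comparison image shouldn't hide.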
 
This site has some of the best-looking graphs. The entire layout and color scheme of the pages is beautiful, too. Thanks for the review.
 
I recently picked up a 6700 XT and started playing around with it. A problem I've had, which I haven't seen talked about, is stuttering. Quality mode gives me what looks like an average 50% boost, but the stuttering makes it near unplayable.
 
I recently picked up a 6700 XT and started playing around with it. A problem I've had, which I haven't seen talked about, is stuttering. Quality mode gives me what looks like an average 50% boost, but the stuttering makes it near unplayable.
What game are you talking about? What CPU do you have? Do you know what kind of FPS you are getting?
 
What game are you talking about? What CPU do you have? Do you know what kind of FPS you are getting?
I mainly just play EVE, ESO and AOE2:DE.

ESO is the only one that supports FSR. I have an 1800X. As recorded by AMD's software, it went from an average of 55 FPS to 89 FPS. However, the frame time went from 16 to 22 ms.
 
I mainly just play EVE, ESO and AOE2:DE.

ESO is the only one that supports FSR. I have an 1800X. As recorded by AMD's software, it went from an average of 55 FPS to 89 FPS. However, the frame time went from 16 to 22 ms.
I play EVE, and I have a 6700 XT too, but with a 5600X. Your problem is the 1800X if you're expecting high frame rates; the IPC of 1st-gen Ryzen is roughly on par with Haswell-era CPUs. Also, you can get FSR (at least 1.0) in every game with "Magpie" (free on GitHub) or "Lossless Scaling" (about 3-4€ on Steam).
 
I play EVE, and I have a 6700 XT too, but with a 5600X. Your problem is the 1800X if you're expecting high frame rates; the IPC of 1st-gen Ryzen is roughly on par with Haswell-era CPUs. Also, you can get FSR (at least 1.0) in every game with "Magpie" (free on GitHub) or "Lossless Scaling" (about 3-4€ on Steam).
If the 1800X were the bottleneck, I don't know why it would be very smooth with FSR off. My first thought was the 1800X, too. I decided to uninstall the AMD Adrenalin software, and that seemed to fix my problem. Apparently it acts up when exiting programs and opening them back up; this is a problem common to B350 motherboards running the AMD Adrenalin software. The 1800X still does very well with all the games I play, especially considering I play at 4K. I only got the 6700 XT three days ago, so I'm still experimenting with settings.
 
I mainly just play EVE, ESO and AOE2:DE.

ESO is the only one that supports FSR. I have an 1800X. As recorded by AMD's software, it went from an average of 55 FPS to 89 FPS. However, the frame time went from 16 to 22 ms.
It is not possible to have an average frame time greater than 16.6 ms if your FPS is higher than 60. However you got your frame time numbers, they are not accurate.

The 1800X is likely holding you back. Get a used 3700X on eBay for $150.
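To put numbers on it: average frame time in milliseconds is just 1000 divided by average FPS, so a jump from 55 to 89 FPS should have lowered the frame time, not raised it. A quick check:

```python
# Average frame time (ms) is simply the reciprocal of average FPS.
def avg_frame_time_ms(avg_fps):
    return 1000.0 / avg_fps

print(avg_frame_time_ms(55))  # ~18.2 ms, not 16 ms
print(avg_frame_time_ms(89))  # ~11.2 ms, not 22 ms
```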
 
It is not possible to have an average frame time greater than 16.6 ms if your FPS is higher than 60. However you got your frame time numbers, they are not accurate.

The 1800X is likely holding you back. Get a used 3700X on eBay for $150.
I plan on getting a 5800X3D, as it's compatible with my motherboard.
 
The RTX 4000 series is a good performance upgrade, but unfortunately Nvidia went all in on DLSS 3, which tests showed to be a useless gimmick: it creates too many motion artifacts, enough to steer people away from it.
Some are giving DLSS 3.0 the benefit of the doubt, comparing its launch to DLSS 1.0's, which eventually led to DLSS 2.0 and now 2.4.12.0.
Nvidia needs to improve DLSS 3.0 in three ways:
1) Mitigate the image artifacts and the complaints of image softness versus DLSS 2.0 set to Quality, or versus native resolution. I bet most people would settle for a slight performance hit on generated frames as long as the image is stable with no artifacts.
2) Improve the latency hit (see the quick latency math below).
3) Resolve the G-Sync/VRR issue, or add a hard frame cap at a set maximum, even for generated frames, to mitigate tearing.

I also hope RDNA 3 has something up its sleeve to improve FSR and leapfrog DLSS.
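On point 2, the latency hit has a floor you can estimate with simple arithmetic. If the generated frame is interpolated between two rendered frames, which is how frame generation is generally described, the newest rendered frame has to be held back roughly one rendered-frame interval before the in-between frame can be shown. A rough sketch (just my arithmetic, ignoring the generator's own compute time):

```python
# Rough floor on latency added by interpolation-based frame generation:
# the newest rendered frame waits roughly one rendered-frame interval
# so an in-between frame can be generated and shown first. Ignores the
# generator's own compute time, so real numbers will be a bit higher.
def added_latency_ms(rendered_fps):
    return 1000.0 / rendered_fps

for fps in (40, 60, 120):
    print(f"{fps} rendered FPS -> at least ~{added_latency_ms(fps):.1f} ms extra")

# 40 rendered FPS -> at least ~25.0 ms extra
# 60 rendered FPS -> at least ~16.7 ms extra
# 120 rendered FPS -> at least ~8.3 ms extra
```

The floor shrinks as the rendered frame rate rises, which is why the latency hit is worst at exactly the low frame rates where frame generation is most tempting.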
 
I just can't stand those res tricks. [...] In my case, I prefer to grab a powerful card and that's it.

Of course, not only are cards cheap (sarcasm), but everyone also earns plenty to burn on a graphics card (sarcasm again).

If games can run faster while giving the impression that the actual resolution is higher, I don't understand why not. Everyone would be happy to buy a 4090 if it cost $400 and drew a normal amount of electricity. That's why a lot of people buy a device with integrated RDNA 2 graphics (an Xbox or PS5 console, or a Steam Deck) and call it a day.
 
Of course, not only are cards cheap (sarcasm), but everyone also earns plenty to burn on a graphics card (sarcasm again).

If games can run faster while giving the impression that the actual resolution is higher, I don't understand why not. Everyone would be happy to buy a 4090 if it cost $400 and drew a normal amount of electricity. That's why a lot of people buy a device with integrated RDNA 2 graphics (an Xbox or PS5 console, or a Steam Deck) and call it a day.
In theory, it looks great, but in practice there are artifacts, ghosting and instability. Save some money and buy a better card.
 
I recently picked up a 6700 XT and started playing around with it. A problem I've had, which I haven't seen talked about, is stuttering. Quality mode gives me what looks like an average 50% boost, but the stuttering makes it near unplayable.

Can you please tell me whether you have any other issues with games when you don't use FSR (and with which games), and also what resolution you use? I'm planning to get a used 6750 or 6800 if the new cards are also power hungry and cost an arm and a leg.
I will use it at 1440p.
 
Can you please tell me whether you have any other issues with games when you don't use FSR (and with which games), and also what resolution you use? I'm planning to get a used 6750 or 6800 if the new cards are also power hungry and cost an arm and a leg.
I will use it at 1440p.
I just picked up the card, and I can tell you I'm thrilled with my switch from Nvidia to AMD.

I play lots of older games at 4K, so my gaming situation is fairly unique.

I managed to fix the stuttering issue. It's something to do with AMD's software suite and the B350 motherboard I'm still using, so I simply uninstalled AMD's software suite, and that fixed the problem.
 