Nvidia claims a 50 percent framerate uplift in Monster Hunter: World with DLSS

Again, you fail to comprehend the scope of Deep Learning. It can be explained by the description of the technology and a couple of YouTube videos, but I get it, you're impatient and can't be bothered. It's easier to just say it sucks, because it's not better than a little piece of software AMD cooked up over a weekend.

"scope of Deep Learning"? That's a lame excuse for "it doesn't work well now, but the tech says it shoud work well in the future." So I should now buy a product which can't deliver on one of it's distinctive features for a weak promise of "maybe" in the future?

Seriously, how many people are buying RTX cards for DLSS? Nobody. There are good reasons to buy RTX cards but DLSS is not one of them. It's OK to admit that DLSS sucks but maybe in the future it won't.

LOL on the AMD comment, though. Apparently all it takes for AMD to best Nvidia in a graphical fidelity feature is a weekend of coding as opposed to years of hardware design by Nvidia.
 
Umm, doesn't wine age better over time though? So DLSS may age great and actually be better over time, but isn't that what DLSS was designed to do to begin with?

AMD's sharpening isn't groundbreaking to begin with. Games could do that if they chose to; they just don't.

That's the point. I see nobody claiming AMD did anything groundbreaking. Instead, AMD implemented a feature nobody else is shipping and did it better than Nvidia's competing tech. Lots of people come up with good ideas, but the ones who put the work into actually implementing them reap the rewards, which AMD is doing.

AMD has been smacked around by Intel and Nvidia for years; they aren't doing any smacking. They may get that hand warmed up, but that's only because it'll end up smacking them in the face.

In the past, yes. Articles on this website and AMD's stock performance are data which argue against that opinion nowadays.

Anyone who falls for or takes a chance on any new tech should never feel they got screwed. It's your fault you dove head first into the pool with no water; don't cry about it because you listened to the crowd/hype. Fool me once, shame on you; fool me twice, shame on me. People keep falling for things before they even realize that they fell for it.

I have no issue with the RTX cards and I am fully aware of their tech. I don't worry about all of that; I only care about how the card performs naturally. Yes, you may end up paying for that card because of the new tech; that's just how it goes. Nothing is free in this world.

I agree.
 
Problem is, it actually doesn't. Outside of Nvidia's slides there are no AI improvements in titles. And anyway, it would be easier to train the model separately during development and ship updates than to leave it running on the end user's PC.

Show me these slides.

Problem is, you have no idea how Deep Learning works. NVIDIA is using supercomputers to analyze the images to improve the technology.
AMD comes up with a little piece of software that has already been ported for use with NVIDIA cards.
Did you know AMD's sharpening doesn't work with any DX11 games? Of course not. You don't do any research before you speak.
Do we need to know the full details of how it works when in reality it just doesn't work well? You are better off using sharpening tools, like the one provided by SweetFX, as they give better results.

There has been zero evidence that DLSS will get better in the future. All NVIDIA has done is throw in more sharpening to alleviate the blur that DLSS causes, although it seems that no amount of sharpening will fix the oily effect and loss of detail from trying to upscale.

As for DX11 titles, you don't need AMD to provide the sharpening tool, although they said they are working on DX11 support for it.

On a side-note, it seems someone ported Radeon Image Sharpening (CAS) to ReShade, so it can be used on NVIDIA cards (although it still needs some work). It's not as fast as the driver version and it doesn't have scaling, but it works on all DX versions. This is why being open source is so important for PC gamers.
https://gist.github.com/martymcmodding/30304c4bffa6e2bd2eb59ff8bb09d135
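For anyone wondering what "contrast adaptive sharpening" actually does, here's a rough single-channel sketch of the idea in Python/NumPy. To be clear, this is a simplified illustration of the concept, not AMD's actual HLSL from the gist above (the real CAS also handles RGB, optional scaling, and FP16 packing):

import numpy as np

def adaptive_sharpen(img, strength=0.2):
    # img: float32 single-channel image with values in [0, 1].
    p = np.pad(img, 1, mode="edge")            # pad so every pixel has a cross
    c = p[1:-1, 1:-1]                          # centre tap
    n, s = p[:-2, 1:-1], p[2:, 1:-1]           # north / south taps
    w_, e = p[1:-1, :-2], p[1:-1, 2:]          # west / east taps
    mn = np.minimum.reduce([c, n, s, w_, e])   # local minimum
    mx = np.maximum.reduce([c, n, s, w_, e])   # local maximum
    # Adaptive part: sharpen less where local contrast is already high
    # (mn near 0 or mx near 1), so edges don't ring or clip.
    headroom = np.minimum(mn, 1.0 - mx)
    amount = np.sqrt(np.clip(headroom / np.maximum(mx, 1e-5), 0.0, 1.0))
    w = -amount * strength                     # per-pixel negative lobe
    # Normalised 5-tap kernel: centre plus a negatively weighted cross.
    out = (c + (n + s + w_ + e) * w) / (1.0 + 4.0 * w)
    return np.clip(out, 0.0, 1.0)

The "adaptive" bit is the whole trick: a plain unsharp mask applies one fixed weight everywhere, which is why ordinary sharpeners ring on high-contrast edges.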

Not sure how legit these are, since Reddit users posted them:
https://i.imgur.com/SkfPCbT.jpg
https://imgsli.com/NDU1OA
 
That's the point. I see nobody claiming AMD did anything groundbreaking. Instead, AMD implemented a feature nobody else is shipping and did it better than Nvidia's competing tech. Lots of people come up with good ideas, but the ones who put the work into actually implementing them reap the rewards, which AMD is doing.



In the past, yes. Articles on this website and AMD's stock performance are data which argue against that opinion nowadays.



I agree.
Of course their stock is going up, nowhere else for it to go when you've been at the bottom for so long.
I was saying it wasn't groundbreaking; I didn't say anyone said it was.

I don't use or need anything to process an image better. Maybe if I were dealing with photos or editing, but there is already software that can do that.
I've learned that these companies never implement anything correctly, especially when dealing with gaming. So I don't, and likely won't ever, care what they do unless it's something built into a game's settings, but even then most won't ever use it, and because streamers won't use it, neither will a ton of others.
 
Do we need to know the full details of how it works when in reality it just doesn't work well? You are better off using sharpening tools, like the one provided by SweetFX, as they give better results.

There has been zero evidence that DLSS will get better in the future. All NVIDIA has done is throw in more sharpening to alleviate the blur that DLSS causes, although it seems that no amount of sharpening will fix the oily effect and loss of detail from trying to upscale.

As for DX11 titles, you don't need AMD to provide the sharpening tool, although they said they are working on DX11 support for it.

On a side-note, it seems someone ported Radeon Image Sharpening (CAS) to ReShade, so it can be used on NVIDIA cards (although it still needs some work). It's not as fast as the driver version and it doesn't have scaling, but it works on all DX versions. This is why being open source is so important for PC gamers.
https://gist.github.com/martymcmodding/30304c4bffa6e2bd2eb59ff8bb09d135

Not sure how legit these are, since Reddit users posted them:
https://i.imgur.com/SkfPCbT.jpg
https://imgsli.com/NDU1OA
AMD did NOT say they are working on it. They said only if demand is there. So for now, AMD is doing nothing for DX11 when it comes to Sharpening.
 
This comments section is pathetic. This really isn't Nvidia vs AMD. No one is choosing either brand based on AMD's sharpening tech or Nvidia's DLSS. For me, if you want 4K gaming you spend more and buy a 2080, a 2080 Super or a 2080Ti; Nvidia are the only company right now who can provide a reliable 4K experience, if you have a lot of cash that is. Especially since AMD's only viable 4K option, the Radeon VII, is now discontinued. You don't spend $350-$400 on a card and rely on either of these techs to give you a smooth 4K experience.

DLSS, whilst barely supported, does allow you to get ray tracing at better framerates; this is its primary advantage to me. You can turn both ray tracing and DLSS on and ray tracing is a lot more viable, particularly at 4K. It gives you something AMD can't, in the small number of games that support it. However, AMD's tech works on any DX12 game, so whilst you don't get playable ray tracing, you do get something you can use anywhere.

I would say, however, that Nvidia can probably implement a similar feature to AMD's sharpening tech with software updates, whereas we know we won't see ray tracing or DLSS delivered on AMD cards. This article mentions a sharpening slider added to the game; for all we know this might give you the same results. I'll wait for TechSpot to give us an answer to this.

This is all about what you want and what you're using the cards for. I think if you're buying something like a 2060S or a 5700 then AMD's solution is going to give you more. But if you're buying a higher-end Nvidia card like a 2070S or greater and playing ray tracing supported games then DLSS is going to do more.
 
I can imagine that one might be able to increase the framerate of a game by switching from running it at 4K to running it at 1K; so a feature that substituted upscaling for actually drawing all those extra pixels would allow a game to run faster but look almost as pretty. Thus, I have no reason to doubt that it will work as advertised.
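The arithmetic behind that intuition is easy to check; here's a quick back-of-the-envelope in Python. Note that the pixel ratio is only an upper bound on the speedup, since geometry, shadow, and CPU work don't shrink with resolution:

# Pixels shaded per frame at common resolutions, relative to native 4K.
base = 3840 * 2160  # 8,294,400 pixels at 4K
for name, (w, h) in {"1440p": (2560, 1440), "1080p": (1920, 1080)}.items():
    px = w * h
    print(f"{name}: {px:,} pixels, {base / px:.2f}x fewer than 4K")
# 1440p shades 2.25x fewer pixels than 4K; 1080p shades 4x fewer.
# That headroom is what an upscaler trades for framerate.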
 
AMD did NOT say they are working on it. They said only if demand is there. So for now, AMD is doing nothing for DX11 when it comes to Sharpening.
As if the demand isn't there. AMD basically has a gold mine in front of them, and you are telling me that they won't do DX11 when the community already managed to get CAS working under DX11 after porting it to ReShade?

BTW, open source FTW!
As soon as AMD released the code, people started to work with it. ReShade doesn't have FP16 and Rapid Packed Math support to make it perform better, but it is a good start and can be used in DX11. You can use it with Nvidia GPUs now too. Do you want to know what isn't open source? The crap Nvidia does.
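For context, Rapid Packed Math is about GPUs issuing two half-precision (FP16) operations per FP32 lane, on top of halving the bytes moved. NumPy can only show the storage half of that trade-off, but here's a toy sketch (my own example, not how the shader actually executes):

import numpy as np

a32 = np.linspace(0.0, 1.0, 1_000_000, dtype=np.float32)
a16 = a32.astype(np.float16)   # half the bytes to move around
print(a32.nbytes, a16.nbytes)  # 4000000 vs 2000000
# The cost is precision: fp16 keeps roughly 3 decimal digits,
# which is fine for a sharpening filter's weights and pixel values.
print(np.abs(a32 - a16.astype(np.float32)).max())  # small rounding error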
 
Are you for real here? DLSS drops the render resolution to 2560x1440 then upscales it to 4K; that is where you get these magical performance gains...
Oh well, if the scaling tech is good, there is nothing wrong with using it. I have been using MadVR's more advanced upscaling algorithms when watching 1080p movies on my 75-inch 4K TV with great results; yes, it is very magical. Of course it won't beat native 4K, but why not? I mean, there is no real 4K-capable video card anyway.
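How convincing that kind of upscaling looks really does come down to the algorithm. If you want to see the difference yourself, here's a quick experiment with Python and Pillow (the filenames are hypothetical; any high-resolution image will do):

from PIL import Image

# Downscale a 4K frame to 1080p, then upscale back with two different
# resampling filters to see how much the algorithm matters.
src = Image.open("frame_4k.png")  # hypothetical input file
small = src.resize((1920, 1080), Image.BILINEAR)
small.resize(src.size, Image.NEAREST).save("upscaled_nearest.png")  # blocky
small.resize(src.size, Image.LANCZOS).save("upscaled_lanczos.png")  # far cleaner

The Lanczos result is visibly sharper than nearest-neighbour from the same 1080p data, which is the whole pitch of MadVR-class scalers (and, with a neural network in place of a fixed kernel, of DLSS).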
 
Oh well, if the scaling tech is good, there is nothing wrong with using it. I have been using MadVR's more advanced upscaling algorithms when watching 1080p movies on my 75-inch 4K TV with great results; yes, it is very magical. Of course it won't beat native 4K, but why not? I mean, there is no real 4K-capable video card anyway.
I'd say there is; the 2080Ti will handle basically any game at 4K 60Hz running with max settings (not including ray tracing, of course).
 
Are you for real here? DLSS drops the render resolution to 2560x1440 then upscales it to 4K; that is where you get these magical performance gains...
Oh well, if the scaling tech is good, there is nothing wrong with using it. I have been using MadVR's more advanced upscaling algorithms when watching 1080p movies on my 75-inch 4K TV with great results; yes, it is very magical. Of course it won't beat native 4K, but why not? I mean, there is no real 4K-capable video card anyway.

I don't actually mind DLSS; it's when some people make it out to be some tech wonder that it annoys me. I game at 4K with a Radeon VII and I need to use resolution scaling in games like Division 2 to get 4K 60fps; I drop the resolution to 75% or 85% and OC the card. I wish RIS worked with my card :)
 
I'd say there is; the 2080Ti will handle basically any game at 4K 60Hz running with max settings (not including ray tracing, of course).

Nah, 2080Tis still dip under the 60fps mark in many cases... The only real 4K solution so far is unfortunately NVLink or CrossFire... You can get dual Vega 64s or 1080s for much less than a 2080Ti and stay above 60fps in those games that dip under the 60fps mark on a 2080Ti. But then again, the compatibility troubleshooting and tweaking required are just not worth it, as many have concluded too... That's why I still consider there to be no real 4K gaming video card solution at the moment.
 
Nah, 2080Tis still dip under the 60fps mark in many cases... The only real 4K solution so far is unfortunately NVLink or CrossFire... You can get dual Vega 64s or 1080s for much less than a 2080Ti and stay above 60fps in those games that dip under the 60fps mark on a 2080Ti. But then again, the compatibility troubleshooting and tweaking required are just not worth it, as many have concluded too... That's why I still consider there to be no real 4K gaming video card solution at the moment.
I have a 2080Ti and a 4K monitor; it handles everything I can throw at it (ray tracing off). Which games have you found dip under 60?
 
I have a 2080Ti and a 4K monitor; it handles everything I can throw at it (ray tracing off). Which games have you found dip under 60?

You must be the guy who only looks at maximum frame rate... please, quit it. No one looks at maximum frame rate, OK? Anyone with the ability to Google can tell you are a novice.
 
You must be the guy who only looks at maximum frame rate... please, quit it. No one looks at maximum frame rate, OK? Anyone with the ability to Google can tell you are a novice.
I did just Google it, actually. Practically every review states the opposite of what you have and says the 2080Ti is the first properly capable 4K card. There are a small handful of games that won't run at a consistent 60fps on it, but I haven't played any of them. Ghost Recon Wildlands, for example.

An RX580 can’t run every game out there at 1080p at 60fps. Some of them are games that run at 60fps at 4K on a 2080Ti. So by your logic the RX580 isn't a true 1080p card? And you’re calling me the novice?

People like you give the internet a bad name...
 
What if the in-game AI needs the graphics card for computing... well... AI decisions? Would DLSS convert artificial intelligence into artificial imbecility?
 
I did just Google it, actually. Practically every review states the opposite of what you have and says the 2080Ti is the first properly capable 4K card. There are a small handful of games that won't run at a consistent 60fps on it, but I haven't played any of them. Ghost Recon Wildlands, for example.

An RX580 can’t run every game out there at 1080p at 60fps. Some of them are games that run at 60fps at 4K on a 2080Ti. So by your logic the RX580 isn't a true 1080p card? And you’re calling me the novice?

People like you give the internet a bad name...

If you read my replies again, you would see that I have never denied that the 2080Ti is the "best" single-card 4K solution you can get at the moment, maybe even for months to come; it's just that you could get much better results with an SLI-ish setup, cheaper, but with trade-offs. And we digress: I was saying that quality upscaling algorithms DO MATTER here.
 
If you read my replies again, you would see that I have never denied that the 2080Ti is the "best" single-card 4K solution you can get at the moment, maybe even for months to come; it's just that you could get much better results with an SLI-ish setup, cheaper, but with trade-offs. And we digress: I was saying that quality upscaling algorithms DO MATTER here.
You stated that’s 2080Ti good enough for 4k and the only real 4K solution is a dual card solution. I have done some looking and a 2080Ti plays games at 4K at higher frame rates than an rx580 does at 1080p. I guess you think that’s not a real 1080p solution either huh. Also, if you think dual GPU is the way forward here then you’re definitely a novice. It worked well around 5 years ago, today multi GPU is hopeless. Currently there is no crossfire setup anywhere near as strong as a single 2080Ti and SLI support is limited at best, with many games simply not supporting it. If you have ever ran dual GPU like I have you would know. A single 2080Ti will give you better performance most of the time than dual 2080’s for example. Dual 2080ti’s would offer you more but not always, some games perform worse with two GPUs as devs don’t bother optimising for them any more.
 
You stated that’s 2080Ti good enough for 4k and the only real 4K solution is a dual card solution. I have done some looking and a 2080Ti plays games at 4K at higher frame rates than an rx580 does at 1080p. I guess you think that’s not a real 1080p solution either huh. Also, if you think dual GPU is the way forward here then you’re definitely a novice. It worked well around 5 years ago, today multi GPU is hopeless. Currently there is no crossfire setup anywhere near as strong as a single 2080Ti and SLI support is limited at best, with many games simply not supporting it. If you have ever ran dual GPU like I have you would know. A single 2080Ti will give you better performance most of the time than dual 2080’s for example. Dual 2080ti’s would offer you more but not always, some games perform worse with two GPUs as devs don’t bother optimising for them any more.

Er, that's what I have been saying. I guess you're just a kid who likes to win; OK, you win, yay. The 2080Ti is the best for 4K, better than an SLI setup in all games, yay.

[edit] P.S. There are tons of Vega GPUs on the second-hand market; you can get a pair for as little as 500 dollars and beat a 1300-dollar 2080Ti in all the games you care about, with at least 20% better minimum framerate. It is still the only way to get a minimum above 60fps at 4K for the moment. Oh, I am such an ***** I should have invested in the shiny 2080Ti instead, cry cry.
 
Last edited:
Er, that's what I have been saying. I guess you're just a kid who likes to win; OK, you win, yay. The 2080Ti is the best for 4K, better than an SLI setup in all games, yay.

[edit] P.S. There are tons of Vega GPUs on the second-hand market; you can get a pair for as little as 500 dollars and beat a 1300-dollar 2080Ti in all the games you care about, with at least 20% better minimum framerate. It is still the only way to get a minimum above 60fps at 4K for the moment. Oh, I am such an ***** I should have invested in the shiny 2080Ti instead, cry cry.

You are absolutely clueless. You couldn't be more wrong. Two Vega 64s won't touch a 2080Ti. Have you ever used any of these cards? Or are you just trying to wind people up by coming on here and talking nonsense?
 