Resolution Scaling: The Secret to Playable "4K Gaming"

Raendor

Posts: 23   +17
Yes, but non-native resolution is blurry even at 90%. I tried it in RDR2 and settled for a 50-70 fps window rather than a solid 60 fps minimum with a blurred image.
 

DrSuess

Posts: 197   +179
Raendor said:
Yes, but non-native resolution is blurry even at 90%. I tried it in RDR2 and settled for a 50-70 fps window rather than a solid 60 fps minimum with a blurred image.

Right, what is the point in playing if the graphics look blurry/hazy? If you want to do that, you're better off playing at 1080p while drinking a 12-pack of beer; you'll have more fun playing it that way.
 

Dimitriid

Posts: 2,076   +3,979
DrSuess said:
Right, what is the point in playing if the graphics look blurry/hazy? If you want to do that, you're better off playing at 1080p while drinking a 12-pack of beer; you'll have more fun playing it that way.

Raendor said:
Yes, but non-native resolution is blurry even at 90%. I tried it in RDR2 and settled for a 50-70 fps window rather than a solid 60 fps minimum with a blurred image.

Actually, not always. Here's a use case: I've got a 32" 4K 60 FPS monitor that I got mainly for productivity. I was already on dual screens, and since I sit close to it, this is effectively like going to a 4x 1080p screen array, except I have no bezels and can do things like read windows that span two panels. Great for coding, great for spreadsheets, and off work it's also great for content with decent 4K 60 fps.

Gaming, however: I have a 1070, and it looks like I'm stuck with it for at least another 3 to 6 months while GPU supply increases in the painfully slow way we're seeing. There is literally nothing better I could realistically buy (scalper prices are not an option), so for any newer games I pretty much have to use resolution scaling.

Just running Cyberpunk at 1080p and letting my monitor do the scaling translates into a terrible, terrible image: it's not only blurry, but this monitor really can't do 1080p with proper gamma and brightness; it just gets washed out. With resolution scaling, Cyberpunk 2077 stays at its native 4K, so my monitor keeps its more accurate color settings, and using FidelityFX CAS (no DLSS on the 1070, and I'm not buying a 2060 just to get rid of it in 4 months) I get better results than letting the monitor handle the scaling on its own. Especially because, being variable, it only really drops to the floor I set (50% resolution) when free-roaming in the city. That's playable, but it looks like it has 300% film grain. Once you're in an indoor area it can scale back up and maintain much better-looking results.

Gaming isn't everything, and a lot of people are getting screens other than 1080p and 1440p since they have a lot more uses, so resolution scaling is actually extremely useful.
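The variable behavior described above (resolution dropping toward a user-set floor in heavy scenes and climbing back toward native indoors) can be sketched roughly like this. A minimal illustration only, not any real engine's code; all names and constants here are made up:

```python
# Illustrative dynamic resolution scaler: render scale falls toward a
# user-set floor when frame time misses the target, and rises back
# toward native when there is headroom. Constants are hypothetical.

TARGET_FRAME_MS = 16.7   # 60 FPS frame-time target
MIN_SCALE = 0.50         # user-set floor (50% resolution)
STEP = 0.05              # adjustment per frame

def update_render_scale(scale, frame_ms):
    """Return the new render scale given the last frame's time."""
    if frame_ms > TARGET_FRAME_MS:           # too slow: drop resolution
        scale -= STEP
    elif frame_ms < TARGET_FRAME_MS * 0.9:   # headroom: raise resolution
        scale += STEP
    return max(MIN_SCALE, min(1.0, scale))

def render_resolution(scale, native=(3840, 2160)):
    """Internal render size; the output is still presented at native 4K."""
    return (int(native[0] * scale), int(native[1] * scale))
```

So a heavy city scene keeps pushing the scale down until it hits the 50% floor (an internal 1920x1080 render presented at 4K), while lighter indoor scenes let it recover toward 100%.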
 

nnguy2

Posts: 400   +804
Dimitriid said:
Actually, not always. Here's a use case: I've got a 4k 60FPS monitor, 32" I got this mainly for productivity …

You have a 1070... I thought FidelityFX CAS was only on AMD cards?
 

Crinkles

Posts: 217   +204
DrSuess said:
Right, what is the point in playing if the graphics look blurry/hazy? If you want to do that, you're better off playing at 1080p while drinking a 12-pack of beer; you'll have more fun playing it that way.

With sharpening and film grain reduction, the screen's clarity should be okay. I run everything at 1440p, upscaled to 4K by the monitor, with film grain off and sharpening on. It's not perfect, but upscaled to fit the screen is better. This was a one-time setting; I haven't needed to make adjustments.
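Quick back-of-the-envelope math on why that setup helps: rendering at 1440p and letting the monitor upscale means the GPU shades well under half the pixels of native 4K. A trivial sketch of the arithmetic:

```python
# Pixel-count comparison: internal 1440p render vs. a native 4K frame.
native_4k = 3840 * 2160      # 8,294,400 pixels
render_1440p = 2560 * 1440   # 3,686,400 pixels

ratio = render_1440p / native_4k
print(f"1440p renders {ratio:.0%} of a 4K frame's pixels")  # → 44%
```

That roughly 44% pixel load is where most of the frame-rate headroom comes from; the sharpening pass then masks some of the upscale blur.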
 

Stoly

Posts: 93   +58
I thought so as well, but I was able to enable it and use it without issue: apparently it supports any GPU, so if the game includes it you can use it instead of this guide.

Nvidia also has its own scaling option with filtering in the control panel and in GeForce Experience; you don't need CRU at all.

DLSS 2.x is a far better solution if available.
 

BSim500

Posts: 893   +2,113
Lol; aka don't play in 4k.
Exactly. I'm bewildered at the almost punch-drunk levels of over-hype 4K receives on tech sites relative to the 2% of people using it. If frame rates are poor enough that you have to downscale most games, that's an admission that you bought the wrong monitor and would have been far better off with 1440p or 1440p ultrawide, without the ugly non-native blurring.
 

fluffydroid

Posts: 57   +50
Has anyone tried this in Cyberpunk 2077 with a 1070? I have a 1440p monitor, and the 1070 looks like it would struggle at native resolution. Rendering at 1080p and using this tool to scale up to 1440p would be better.
 

jdwii

Posts: 23   +28
fluffydroid said:
Has anyone tried this in Cyberpunk 2077 with a 1070? I have a 1440p monitor, and the 1070 looks like it would struggle at native resolution. Rendering at 1080p and using this tool to scale up to 1440p would be better.

At medium settings with a GTX 1080 and a 3700X, I get around 40-45 FPS at 1440p.

A 1070 would probably be at 30-35 FPS at medium 1440p.
 

Michael7

Posts: 93   +91
Nice article, but I thought the whole point of high-res monitors was to enjoy the higher resolution with finer details. I get it as a last resort if your GPU struggles in a few games, but if you need to resort to it in all games, it means you either need to upgrade the GPU or should have bought a lower-res monitor in the first place.