Actually, not always. Here's a use case: I've got a 32" 4K 60 FPS monitor. I got it mainly for productivity: I was already on dual screens, and since I sit close to it, it's effectively like a 4x 1080p screen array, except I have no bezels and can do things like read windows that span two panels. Great for coding, great for spreadsheets, and off work it's also great for content at a decent 4K 60 FPS.
Gaming, however... well, I have a 1070, and it looks like I'm stuck with it for at least another 3 to 6 months while GPU supply increases at the painfully slow pace we're seeing. There is literally nothing better I could realistically buy (scalper prices are not an option), so for any newer games I pretty much have to use resolution scaling.
Just running Cyberpunk at 1080p and letting my monitor do the scaling produces a terrible, terrible image: it's not only blurry, but this monitor really can't do 1080p with proper gamma and brightness; everything gets washed out. With in-game resolution scaling, though, Cyberpunk 2077 stays at its native 4K output, so my monitor keeps its more accurate color settings, and using FidelityFX CAS (no DLSS on the 1070, and I'm not buying a 2060 just to get rid of it in 4 months) I get better results than letting the monitor handle scaling on its own. Especially because, being dynamic, it only drops to the floor I set (50% resolution) when free-roaming in the city. That's playable, but it looks like the game was shot with 300% film grain. Once you're in an indoor area it can scale back up, and it maintains much better-looking results.
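For anyone curious how that "drops to 50% in the city, climbs back up indoors" behavior works, here's a minimal toy sketch of a dynamic resolution controller. All names, thresholds, and step sizes are my own illustrations, not CDPR's or AMD's actual implementation:

```python
# Toy dynamic resolution scaling: render at a fraction of native 4K,
# dropping toward a user-set floor (50%) when frame times blow the
# budget, and climbing back toward 100% when the GPU has headroom.
# The output is always upscaled (and sharpened, e.g. with CAS) to
# native 4K, so the monitor never leaves its native mode.

NATIVE_W, NATIVE_H = 3840, 2160
TARGET_MS = 1000 / 60        # 60 FPS frame-time budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0
STEP = 0.05                  # how fast the scale adapts per frame

def adjust_scale(scale, frame_ms):
    """Nudge the render scale based on the last frame's GPU time."""
    if frame_ms > TARGET_MS:            # too slow: render fewer pixels
        scale -= STEP
    elif frame_ms < TARGET_MS * 0.85:   # comfortable headroom: sharpen up
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

def render_resolution(scale):
    """Internal render size; output is still presented at native 4K."""
    return int(NATIVE_W * scale), int(NATIVE_H * scale)

# Heavy city scene: frame times over budget, scale sinks to the floor.
scale = 1.0
for _ in range(20):
    scale = adjust_scale(scale, frame_ms=25.0)
print(render_resolution(scale))   # (1920, 1080) -- the 50% floor

# Light indoor scene: headroom lets it climb back to native.
for _ in range(20):
    scale = adjust_scale(scale, frame_ms=10.0)
print(render_resolution(scale))   # (3840, 2160) -- back at 100%
```

The key point is the clamp: the scale never goes below the floor you set, so the worst case is bounded, and the moment the scene gets cheaper the image sharpens back up on its own.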
Gaming isn't everything, and a lot of people are getting screens other than 1080p and 1440p since they have a lot more uses, so resolution scaling is actually extremely useful.