I have a crazy idea for you, assuming that your display is a standard 60Hz one: try using a big-screen 4K TV. The reason I say this is that televisions seem to have built-in hardware upscaling that works incredibly well. When I bought my 4K TV, I ended up getting a (less than $500 CAD) Haier 55" from Costco because I didn't want a "Smart" TV from Samsung or LG (or anyone else for that matter), partly because of all the stories of them spying on their owners and partly because I was going to have my PC hooked up to it anyway, which would render the "Smart" features moot. My TV has no ethernet port or WiFi (or any other way to reach the web), so it literally can't spy on me.
What I discovered is that, no matter what resolution I set the game (in this case, Far Cry 5) to, the picture I got was 2160p in all its glory. I didn't realise it at first because I had actually tried setting games to 2160p to see how my R9 Fury could handle it. It looked amazing (as I expected) but, as I also expected, some games were more or less unplayable at 4K depending on where in the game map I was. I turned it down from 2160p to 1440p and noticed that it still looked amazing, with much better frame rates. I was a bit mystified, however, because it seemed to look exactly the same as 2160p. I went back and forth a couple of times between 1440p and 2160p to see if I was imagining things. As it turns out, I wasn't.
I confirmed this when I turned the res down to 1080p and it looked the same again. I thought to myself, "This is impossible, because people are always raving about how good 4K looks!" and decided to run a (somewhat) more scientific test. I ran the Far Cry 5 in-game benchmark and put my face less than 1m away from the panel so I could focus on specific graphics-related details like tree/foliage textures, vehicles and water.
I ran the benchmark at 1080p, 1440p and 2160p and used FRAPS to confirm that there was at least a frame-rate difference between the three runs. By doing the test this way, I'd know for certain that the PC side of things was doing EXACTLY what it was supposed to. Well, I can say that the (three, which became six, which became eight, which became ten) benchmark runs all looked 100% identical. Since it was just the benchmark and not actual gameplay, the scene appeared to render smoothly at all resolutions, but FRAPS told a different story.
At 1080p, the benchmark averaged 75fps, while at 1440p it lost about 20fps on average. At 2160p it averaged only 30fps, although it didn't really stutter at all. I had the graphics settings turned up so that I could see the textures in the foliage and the water, and even with my face up close, I couldn't tell one resolution from another. I was puzzled because I couldn't understand why a 1080p rendering blown up to fit a 55" panel looked just as good as a 2160p rendering. I dropped 1440p from my test resolutions, thinking that maybe lowering the resolution gradually was playing tricks on my eyes, but going back and forth (twice) between 1080p and 2160p was completely indistinguishable, despite the fact that I was trying hard to spot anything that wasn't 100% the same.
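For anyone who wants to repeat the FRAPS sanity-check part, here's a rough sketch (not something I actually scripted, and the filenames are made up) of how you could average the frame rate out of the FRAPS frametime CSV logs, assuming the usual format where the second column is the cumulative time in milliseconds:

    # Toy sanity check: average FPS from FRAPS "frametimes" CSV logs.
    # Assumes the usual FRAPS layout: a header row, then "Frame, Time (ms)"
    # where Time is cumulative milliseconds since the capture started.
    # The filenames below are made up - point them at your own logs.
    import csv

    def average_fps(path):
        times_ms = []
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)                      # skip the "Frame, Time (ms)" header
            for row in reader:
                if len(row) >= 2 and row[1].strip():
                    times_ms.append(float(row[1]))
        seconds = (times_ms[-1] - times_ms[0]) / 1000.0
        return (len(times_ms) - 1) / seconds  # frames rendered per second of capture

    for label, log in [("1080p", "fc5_1080p frametimes.csv"),
                       ("1440p", "fc5_1440p frametimes.csv"),
                       ("2160p", "fc5_2160p frametimes.csv")]:
        print(f"{label}: {average_fps(log):.1f} fps average")

If the averages come out clearly different per resolution, you know the GPU really was rendering three different workloads even though the picture on the panel looks the same.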
That's when I realised, "OH YEAH, IT'S A TV!" TVs have to take an HD (or lower-res) signal and scale it up to the panel's native resolution without looking awful, and it would seem that my TV does the same thing with whatever my PC sends it. So with all of these people going on and on about DLSS, I say, just use a goddamn TV instead of a monitor and you'll have it already! There's a limit to this of course, because TVs will never refresh as fast as gaming monitors (there's no need for them to).
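To be clear, I have no idea what the scaler chip in my Haier actually does; that's proprietary hardware doing something much smarter than anything I'd write. But conceptually the job looks like this toy sketch: stretch every incoming frame onto the panel's 3840x2160 grid. 1080p happens to be exactly half of 2160p in both directions, so even the dumbest possible scaler (just doubling every pixel) maps it cleanly, while 1440p needs real interpolation:

    # Toy illustration of what a TV scaler has to do - NOT how a real TV does it.
    # Real scalers use much smarter filtering in dedicated hardware; this just
    # shows that a 1920x1080 frame maps exactly onto a 3840x2160 panel (2x in
    # both directions), so even naive pixel doubling lines up perfectly.
    import numpy as np

    def naive_upscale_2x(frame):
        """Nearest-neighbour 2x upscale: repeat every pixel twice along each axis."""
        return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

    # A fake 1080p RGB frame full of random pixels, just to show the shapes.
    frame_1080p = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
    frame_2160p = naive_upscale_2x(frame_1080p)

    print(frame_1080p.shape)  # (1080, 1920, 3)
    print(frame_2160p.shape)  # (2160, 3840, 3)
    # 1440p -> 2160p is a 1.5x scale, so a scaler has to interpolate new pixels
    # instead of simply repeating them - that's where the clever filtering comes in.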
While there are 240Hz TVs out there, I can't comment on whether or not they can upscale like that at 240Hz, because my TV is only 60Hz and I've never used a 240Hz TV for gaming. Having said that, I don't know of any media that is encoded at 240fps, because that would just make overly huge video files for no reason (unless it was a master copy that doesn't get distributed).
Sure, a high frame rate can (and does) give better results in high-speed gaming like e-Sports, but for watching encoded video media it offers no benefit at all. So it's possible that a 240Hz 4K TV may not upscale well at 240Hz, because there was no foreseeable reason to make it do so; I'm sure they didn't expect someone (like me) to game on a 55" 4K TV.
I have no proof that a 240Hz TV won't upscale well at 240Hz, but I figure that if I were making TVs and wasn't expecting them to be used for gaming, I'd only be concerned with how well they upscale at 60Hz, since spending extra money on more tech to upscale at 240Hz would be a waste in my eyes. Having said that, I've never been able to understand "corporate logic", so I could be 100% wrong. Regardless, if you already own a 4K TV, it's definitely worth a shot, if only for curiosity's sake.