Screen Tearing or Input Lag? To Vsync or Not to Vsync?

At this point you can pick up VRR monitors pretty cheaply. I have an LG 49 Nano 120 Hz television with VRR that I got for $500 on Black Friday. I also have an Acer Predator 32" 1440p 144 Hz monitor that supports VRR (FreeSync), which I got for $250 last year (I saved $150 by buying refurbished). These are paired with the RTX 3080 I picked up in late October at MSRP! I seem to have been very lucky given the status of GPUs today.
 
Good tips, especially for those using TVs.

I wonder sometimes about the push for 4k on consoles. High refresh rate and VRR displays are more impactful imo.
 
I'll be honest with you, on my TV, I can't tell the difference between 1080p and 2160p. I think it's because TVs have built-in upscalers while monitors don't. The manufacturers know very well that there's almost no 4K content in the broadcast market, and at the sizes these TVs come in (50"-70"), 1080p would look awfully grainy. I had been scratching my head about why I couldn't see any difference between 1080p, 1440p and 2160p when running the Far Cry 5 benchmark, and that made me worried about my eyesight, so I did a little digging.

What I found is that if you're using a 4K TV for gaming, you don't need to worry about DLSS, because the TV does a better job of it than the Nvidia solution: it looks EXACTLY the same as if I'm using native 2160p. As I said, I can't tell ANY difference at all, even standing less than 30cm away from the screen and trying to find differences. I seriously doubt that DLSS is that good, and even if it is, if you game on a 4K TV, you might get no benefit from DLSS because your TV already does it for you.

There's a VERY interesting article that I found on TechRadar that explains it far better than I can:

Now, this article talks about underwhelming 4K upscaling which I found puzzling because I've never had a blurry image of any kind on my screen, even when watching old DVD-quality videos like episodes of Babylon-5. My TV is by no means fancy and doesn't have some high-end brand-name on it like Samsung, LG, Sharp or Panasonic. Hell, my TV's brand is probably considered to be bottom-of-the-barrel like RCA, Westinghouse or Polaroid (I still can't believe that there's a Polaroid TV...LOL). Maybe I just really lucked out but my TV's image has always been crystal-clear and as sharp as chipped obsidian. I had just assumed that all TVs were that good.

What I didn't know at the time was that I was ALWAYS gaming at 2160p, even when I had the game set to 1080p, because of upscaling. That's also why I thought that gaming at 1080p looked amazing even at 55". Now, I run my Windows desktop resolution at 720p because at 2160p the icons are far too small to see from across the room, but the desktop background image doesn't change, like, at all. The only thing that setting my desktop resolution lower does is increase the size of the icons and taskbar, nothing else.

Maybe I'm just lucky, but it seems that my TV has always had "DLSS" built in, and it's like what DLSS v5.0 would be: absolutely flawless and without any reduction in frame rate. The latest example came while I was playing AC: Odyssey. I was getting stuttering and it even crashed a couple of times. I was mystified because I'd been playing it since December with no issues like that. I went to check the fps graph in the options menu and it had dropped from an average of 60 (with the odd minimum of 33) to an average of 55 (with frequent minimums of 11!). I wondered what was going on, but then I looked over and the resolution was set to 3840 × 2160 (native) at my usual Ultra settings (I guess the game updated and reset all that without me knowing). So, I immediately set it back to 1080p Ultra. I guess I should be impressed with my RX 5700 XT, because that game was playable for the most part at 4K Ultra.
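For context on why the silent switch to 2160p hit the frame rate so hard: 4K is four times the pixels of 1080p, so per-pixel work roughly quadruples. A quick back-of-the-envelope check (plain Python, just arithmetic, nothing game-specific; the variable names are mine):

    p_1080 = 1920 * 1080    # 2,073,600 pixels per frame
    p_2160 = 3840 * 2160    # 8,294,400 pixels per frame
    print(p_2160 / p_1080)  # 4.0 - roughly four times the shading work per frame

So a card that averages 60 fps at 1080p Ultra having its minimums fall through the floor at native 4K is about what you'd expect.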

When I resumed playing at 1080p, the game still looked exactly the same, except now it was buttery-smooth. So, this game had switched to running at 2160p all by itself and I couldn't tell AT ALL, because it didn't look any better than every other time I played it. The only difference was that it now stuttered from time to time and crashed to the desktop maybe once every couple of days. I honestly thought that there was another program sapping my CPU power, but it's an R5 3600X and, nope, nothing was. I thought maybe it was because I more often put my PC to sleep rather than turning it off, but that didn't help either. In hindsight, I had started to notice my card's fans ramping up from time to time, but I almost never find that bothersome, so I just ignored it.
 
My tips for an experience that looks perfectly Vsync'd but has the lowest input lag:

Use Borderless Fullscreen + cap your fps about 4 fps beneath your monitor's refresh rate, if you have a FreeSync/G-Sync capable display (there's a rough sketch of the idea at the end of this post).

For AMD users: use Anti-Lag at all times
For Nvidia users: Use Reflex in the games that support it.

I've never bothered with the Vsync option in the almost 2 years since I got the 5700 XT.
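To make the fps-cap tip above concrete, here's a rough sketch of what a frame limiter is doing under the hood (plain Python pseudocode of a render loop; in practice you'd just use the in-game limiter, RTSS or the driver's cap, and the 144 Hz figure is only an example):

    import time

    REFRESH_HZ = 144          # example: your display's refresh rate
    CAP_FPS = REFRESH_HZ - 4  # stay a few fps under refresh so VRR stays engaged
    FRAME_TIME = 1.0 / CAP_FPS

    def render_frame():
        pass  # stand-in for the game's actual rendering work

    while True:
        start = time.perf_counter()
        render_frame()
        # Sleep off whatever is left of the frame budget so the game never
        # outruns the display and falls back to Vsync-style waiting (input lag).
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)

The point of capping below the refresh rate is that FreeSync/G-Sync only works while your frame rate stays inside the VRR window; let it hit the ceiling and you're back to tearing or Vsync latency.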
 