I would say keep it at native resolution. I'm not very picky about my gameplay; anything around 30 FPS at 1600x900 is fine by me. But if I change the resolution at all from native, it drives me crazy.
So what about my setup: a Dell 1440p 144Hz monitor with G-Sync? I don't use G-Sync, so let's not worry about that part. Would it be better to lower to 1080p, or would staying at native res be better?
I have an old i7-3770K, 16GB of DDR3, and a 1070.
For me, it looks like the game is running with a soap opera effect, like when TVs have that motion-smoothing option turned on. I just don't like the look of it, but TVs have the option to turn that off; monitors don't, or simply have no way of adjusting it.

Why would you not use G-Sync? I'm running almost the same setup as you: an OC'd i5-3570K, 16GB DDR3, a 1070, and a Dell 1440p 144Hz G-Sync monitor. G-Sync removes stuttering and tearing; it's great. Also, I would stick to 1440p. It looks better, and the 1070 delivers enough frames in the games I play, generally >60, with a few running at >100. Rocket League hits 144 at times and is silky smooth.
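As an aside, part of why a lowered resolution can look soft on an LCD is the scale factor. A minimal sketch (the function name and resolution list are just illustrations, not from anyone's setup above): when the native-to-target ratio is a whole number, each source pixel maps cleanly onto a block of physical pixels; when it's fractional, as with 1080p on a 1440p panel, the image has to be interpolated and blurs.

```python
# Check whether a target resolution divides evenly into a panel's
# native resolution (integer scaling = sharp, fractional = blurry).
def scale_factor(native_height: int, target_height: int) -> float:
    return native_height / target_height

for native, target in [(2160, 1080), (1440, 1080), (1440, 720)]:
    f = scale_factor(native, target)
    print(f"{native}p -> {target}p: scale {f:.3f}, integer scaling: {f.is_integer()}")
```

So 1080p on a 4K panel scales cleanly at 2x, but 1080p on a 1440p panel lands at an awkward 1.333x, which is why some people find it noticeably mushy compared to native.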
Yep. Lots of config files to try on my Orchid Righteous to get 30 FPS. I remember playing Quake at around 16 FPS on my old AMD 5x86 133MHz, at a weird Mode X resolution (yes kids, the timerefresh command actually took a little while to complete back then). It was a revelation to see the game running at around 30 FPS with a K6-2 333MHz at a standard 640x480 resolution.
People use that as an abbreviation. What's the TL;DR?