4K vs 1440p vs 1080p: What Monitor to Buy?

xenonxenonxenon

Posts: 8   +11
For me, 1440p is the sweet spot. You need a very powerful GPU to do 4K and 1080p is too low of a resolution in my opinion. My gaming computer has two 27 inch 1440p 144 Hz IPS panels hooked up to it. Of course this all comes down to which GPU you have as that will truly dictate which resolution monitor you should get.

I was lucky enough to be able to pick up a 3070 from Micro Center. I went there to pick up a 34 inch ultra wide monitor for my work laptop and saw they had GPUs in stock so I picked up both. I am very happy with games at 1440p with my 3070.
 

fadingfool

Posts: 257   +322
1440p 32-inch on my desktop, 55-inch 4K when on the sofa (and as I'm sitting further away, the drop to Medium detail for better frame rates goes unnoticed). Handy that my PC is within HDMI range of the TV when I want some relaxed gaming (a Steam Controller handles the transition).
Still running a 1080 Ti, but as the refresh rate of the TV is only 60 Hz, it can handle 4K Medium for most games I throw at it.
 

Kosmoz

Posts: 579   +1,061
I don't give a flying F about anything above 1080p until I can buy a $500-$600 GPU (real price, not fairy tale MSRP) that can hold me for 4 years at least, at Ultra settings + RT.

I'm not gonna buy a $2000+ GPU that can barely do that now... so when those GPUs exist, like I said above, then I'll consider 1440p, because a $600 GPU for Ultra + RT at 4K is not gonna happen, ever. So 1440p maybe, someday, but until then I'm perfectly content with 1080p Ultra + RT. I actually find it pathetic when people play at 4K Medium; they have a much worse experience than I do.

Going from 1080p to 1440p is not enough of an increase in image quality for me to make it worth the loss in performance with current GPUs, and 4K, even though it is noticeable in image quality, is just criminal on the fps.
So no thanks, I refuse to play "their" game and upgrade to a $1000+ GPU every year, or to downgrade my settings from Ultra to High to Medium in 3 years just to keep the fps above 60 at resolutions above 1080p, because this is the reality.

People are a bunch of sheeple caught in this marketing game and they got the "bug" that they "need" to have the best of the best or they throw a temper tantrum and so they spend money like *****s in this zombie consumerism society, especially on tech and all PC related, but not only. Not me.

edit: I'm talking from a gaming only POV, not work and money making stuff. Just to be clear.
 

mrSister

Posts: 67   +98
I, in fact, bought one of those 1080p AOC 24G monitors last year. Why buy 1080p in this day and age? Because they are inexpensive, because I don't give two f*cks about the number of pixels, and because you need less horsepower to drive them.

Imagine playing at 4K in this day and age and having to pay thousands for a GPU capable of running games at that resolution.
 

Kazkas

Posts: 23   +61
True, but only if you’re 100% a gamer. I do much more on my PC, so the “real estate” of 1440p at 32” is a big plus; nonetheless, I am still using an RX 580 for my occasional gaming...

I don't give a flying F about anything above 1080p until I can buy a $500-$600 GPU (real price, not fairy tale MSRP) that can hold me for 4 years at least, at Ultra settings + RT.

I'm not gonna buy a $2000+ GPU that can barely do that now... so when those GPUs exist, like I said above, then I'll consider 1440p, because a $600 GPU for Ultra + RT at 4K is not gonna happen, ever. So 1440p maybe, someday, but until then I'm perfectly content with 1080p Ultra + RT. I actually find it pathetic when people play at 4K Medium; they have a much worse experience than I do.

Going from 1080p to 1440p is not enough of an increase in image quality for me to make it worth the loss in performance with current GPUs, and 4K, even though it is noticeable in image quality, is just criminal on the fps.
So no thanks, I refuse to play "their" game and upgrade to a $1000+ GPU every year, or to downgrade my settings from Ultra to High to Medium in 3 years just to keep the fps above 60 at resolutions above 1080p, because this is the reality.

People are a bunch of sheeple caught in this marketing game and they got the "bug" that they "need" to have the best of the best or they throw a temper tantrum and so they spend money like *****s in this zombie consumerism society, especially on tech and all PC related, but not only. Not me.
 

Kosmoz

Posts: 579   +1,061
True, but only if you’re 100% a gamer. I do much more on my PC, so the “real estate” of 1440p at 32” is a big plus; nonetheless, I am still using an RX 580 for my occasional gaming...
For anything work related yes, I agree, higher resolutions are important, but only there.

But for gaming, going above 1080p means being a fool and falling for their PR BS and lies in a never-ending game of higher fps and higher resolutions.
 

Irata

Posts: 2,049   +3,494
Still on 1080p and tbh, my 5500XT is actually a pretty good match with the two dirt cheap 1080p60 monitors.

The plan was to upgrade the GPU and primary screen to 1440p and higher refresh, but in the current market that‘s not worth it.

Like you pointed out - graphics card and monitor do need to be matched to get the most out of your system and that isn‘t cheap even under normal circumstances.
 

Ohnooze

Posts: 160   +370
My PC is connected to a 120hz tv that can do native 1440p and 4k. So I just use whatever resolution runs well for the game I'm playing at the time.

Honestly, I have no idea why people use monitors anymore. A good TV set has just as good, if not better, picture quality and is generally a lot larger.

But hey to each his own. I'm just saying for me it makes no sense.
 

Dr Roboto

Posts: 17   +32
IMO, 1440p is the sweet spot. I have been using a 1440p monitor since 2015 and never looked back. It is a substantial upgrade for both gaming and everyday use, home and office. It pairs nicely with my 1070 Ti and I have never had an issue running most games at relatively high graphics settings.

I have now upgraded to a curved ultra-wide at 3840x1600 and will never go back to standard format. I would compare it to upgrading from 1080p to 1440p. The ultra-wide is amazing. I still use my 1070 Ti but it does require some tweaking to play games. I use NVIDIA image scaling set to 77% (2954x1231) and can run Far Cry 6 at 60 Hz with High graphics settings.

To be honest, I don't see the big deal about 120 Hz and higher monitors. I am sure it matters if you are a competitive gamer, but I run at 60 Hz and never have any real issues. I mostly focus on single-player games, so maybe that is the difference. However, I would take the extra resolution over a higher refresh rate any day for my type of gaming. And when it comes to everyday tasks (what I use it for 90% of the time), 60 Hz is plenty, and the extra resolution and screen size are worth it a hundredfold.
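For anyone curious about the 77% figure above: it's just a per-axis linear scale applied to the native resolution. A rough sketch (the exact numbers the driver exposes, like 2954x1231, suggest its internal rounding differs slightly from plain rounding, so treat this as an approximation):

```python
def nis_render_res(native_w, native_h, scale):
    """Approximate the internal render resolution for a given image-scaling factor."""
    return round(native_w * scale), round(native_h * scale)

# 77% of a 3840x1600 ultra-wide
print(nis_render_res(3840, 1600, 0.77))  # (2957, 1232)
```

That's close to the 2954x1231 reported above; the GPU then upscales and sharpens the result back to native.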
 

BobDoleStillAliv

Posts: 27   +32
The size of your screen is a huge factor in what resolution to get. This was mentioned, but not really driven home.

A 32-inch 1080p screen is gonna look bad, while a 32-inch 4K screen will look beautiful. A 15-inch laptop screen will look fine at 1080p; on a 15-inch 4K laptop screen you'll see a difference, but it won't be enormous.
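The underlying variable here is pixel density. A back-of-the-napkin sketch (PPI is just the diagonal pixel count over the diagonal size in inches; viewing distance matters too and isn't modeled here):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    for size_in in (15, 32):
        print(f'{size_in}" {name}: {ppi(w, h, size_in):.0f} PPI')
```

A 32" 1080p panel works out to roughly 69 PPI versus about 138 PPI at 4K, while a 15" laptop already sits near 147 PPI at 1080p, which lines up with the point above.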
 

amghwk

Posts: 1,177   +1,098
1080p is actually good enough. There's no need to squeeze the graphics card dry, making it crawl and strain, just to play at 4K.

Buying a 4K setup just because it's "the thing now" is baseless and wasteful, especially given the top-end graphics card prices needed to play anything reasonably well at 4K.

TVs and normal 4K monitors, on the other hand, are already very affordable.

Just don't fall for the 4K 144Hz or higher "gaming 4K monitors". No 4K-capable card can even drive all games that fast yet.

I myself game at 1440p, because my 5700 XT can handle even the latest games well. I just need a 60 fps minimum framerate for smooth play on my 4K Sony TV and Samsung 4K monitor (I don't game at 4K, but I do watch movies and videos at 4K). I always turn V-Sync on and set the fps limit to 60 wherever possible. I still game at 1080p for certain games, and I don't even mind DOS-game resolutions, since I still play DOS games and other console emulations.
 

Toju Mikie

Posts: 278   +265
I'm also a TV-instead-of-monitor guy. I have a 32" Samsung QLED that I use as a monitor. It's an affordable 4K60 monitor, and the HDR is great for games and movies. It's also connected to a laptop with a 2560x1600 165 Hz screen, so if I want high refresh I can use my laptop screen as well. The only bad thing I would say about the TV is that the input lag is slightly higher with HDR (about 15 ms), but the advantages of using a TV (lots of ports for game consoles, the ability to connect things like a TV antenna, etc.) outweigh the disadvantages for me.
 

Ohnooze

Posts: 160   +370
I don't give a flying F about anything above 1080p until I can buy a $500-$600 GPU (real price, not fairy tale MSRP) that can hold me for 4 years at least, at Ultra settings + RT.

I'm not gonna buy a $2000+ GPU that can barely do that now... so when those GPUs exist, like I said above, then I'll consider 1440p, because a $600 GPU for Ultra + RT at 4K is not gonna happen, ever. So 1440p maybe, someday, but until then I'm perfectly content with 1080p Ultra + RT. I actually find it pathetic when people play at 4K Medium; they have a much worse experience than I do.

Going from 1080p to 1440p is not enough of an increase in image quality for me to make it worth the loss in performance with current GPUs, and 4K, even though it is noticeable in image quality, is just criminal on the fps.
So no thanks, I refuse to play "their" game and upgrade to a $1000+ GPU every year, or to downgrade my settings from Ultra to High to Medium in 3 years just to keep the fps above 60 at resolutions above 1080p, because this is the reality.

People are a bunch of sheeple caught in this marketing game and they got the "bug" that they "need" to have the best of the best or they throw a temper tantrum and so they spend money like *****s in this zombie consumerism society, especially on tech and all PC related, but not only. Not me.

edit: I'm talking from a gaming only POV, not work and money making stuff. Just to be clear.
I find your view interesting. You call people sheeple for wanting the latest tech, and you call them pathetic for choosing resolution over things like ray tracing.
I would argue you're exactly like the people you're trashing. Ray tracing is not worth the frame rate hit it takes, and it IS the latest buzzword tech. I would take 1440p over 1080p with ray tracing absolutely any day of the week. I find 1080p too blurry for me, and no ray tracing is going to change my mind. But hey, to each his own... I like what I like and you like what you like. But to trash people for wanting new tech and then, in the same statement, talk about being an avid ray tracing supporter... the irony is mind-blowing.
 

zaku49

Posts: 49   +51
I went from 1080p to 4K, and it was like wearing glasses for the first time; it's impossible to go back now, everything is so clear. The higher resolution is also beneficial in large-scale FPS games, where it becomes easier to make out players in the distance.
 

Lionvibez

Posts: 2,628   +2,389
I had to make this decision in December, and I went with the LG 34GP83A-B, a 3440x1440 144 Hz ultrawide, to match with a 6800 XT, and I couldn't be happier with the setup. Most of the games I play support ultrawide.

My PC is connected to a 120hz tv that can do native 1440p and 4k. So I just use whatever resolution runs well for the game I'm playing at the time.

Honestly, I have no idea why people use monitors anymore. A good TV set has just as good, if not better, picture quality and is generally a lot larger.

But hey to each his own. I'm just saying for me it makes no sense.
Because the average computer desk doesn't have room for a TV. And not everyone wants to be sitting in front of a 42-inch TV for desktop use.

IMO, 1440p is the sweet spot. I have been using a 1440p monitor since 2015 and never looked back. It is a substantial upgrade for both gaming and everyday use, home and office. It pairs nicely with my 1070 Ti and I have never had an issue running most games at relatively high graphics settings.

I have now upgraded to a curved ultra-wide at 3840x1600 and will never go back to standard format. I would compare it to upgrading from 1080p to 1440p. The ultra-wide is amazing. I still use my 1070 Ti but it does require some tweaking to play games. I use NVIDIA image scaling set to 77% (2954x1231) and can run Far Cry 6 at 60 Hz with High graphics settings.

To be honest, I don't see the big deal about 120 Hz and higher monitors. I am sure it matters if you are a competitive gamer, but I run at 60 Hz and never have any real issues. I mostly focus on single-player games, so maybe that is the difference. However, I would take the extra resolution over a higher refresh rate any day for my type of gaming. And when it comes to everyday tasks (what I use it for 90% of the time), 60 Hz is plenty, and the extra resolution and screen size are worth it a hundredfold.
I was of the same mindset on refresh rate until I picked up my new monitor, which supports FreeSync Premium at 144 Hz. There is a noticeable difference on the desktop and in games. You won't see it until you try it.
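The refresh-rate difference is easy to put in numbers: the time budget per frame shrinks fast as the rate climbs. A quick sketch (plain arithmetic, nothing vendor-specific):

```python
def frame_time_ms(refresh_hz):
    """Milliseconds between frames at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")
```

Going from 60 Hz to 144 Hz cuts the frame interval from about 16.7 ms to about 6.9 ms, which is why the difference shows up even when just dragging windows around the desktop.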
 

Aceseven

Posts: 287   +369
I don't give a flying F about anything above 1080p until I can buy a $500-$600 GPU (real price, not fairy tale MSRP) that can hold me for 4 years at least, at Ultra settings + RT.

I'm not gonna buy a $2000+ GPU that can barely do that now... so when those GPUs exist, like I said above, then I'll consider 1440p, because a $600 GPU for Ultra + RT at 4K is not gonna happen, ever. So 1440p maybe, someday, but until then I'm perfectly content with 1080p Ultra + RT. I actually find it pathetic when people play at 4K Medium; they have a much worse experience than I do.

Going from 1080p to 1440p is not enough of an increase in image quality for me to make it worth the loss in performance with current GPUs, and 4K, even though it is noticeable in image quality, is just criminal on the fps.
So no thanks, I refuse to play "their" game and upgrade to a $1000+ GPU every year, or to downgrade my settings from Ultra to High to Medium in 3 years just to keep the fps above 60 at resolutions above 1080p, because this is the reality.

People are a bunch of sheeple caught in this marketing game and they got the "bug" that they "need" to have the best of the best or they throw a temper tantrum and so they spend money like *****s in this zombie consumerism society, especially on tech and all PC related, but not only. Not me.

edit: I'm talking from a gaming only POV, not work and money making stuff. Just to be clear.
The crazy part is how much power it takes to really push maxed-out 1080p. It's all I've ever gamed at on PC, and so far...

SLI GTX 570s
GTX Titan
GTX 980 Ti - current card

those cards have all been beaten by it. The 980 Ti still wrangles them if you cool off on shadows, though. Honestly, if all devs optimized their games like id does, the current cards wouldn't even need to exist, but these janky games need a lot of brute force to push all that slapdash code.
 

Kosmoz

Posts: 579   +1,061
I find your view interesting. You call people sheeple for wanting the latest tech, and you call them pathetic for choosing resolution over things like ray tracing.
I would argue you're exactly like the people you're trashing. Ray tracing is not worth the frame rate hit it takes, and it IS the latest buzzword tech. I would take 1440p over 1080p with ray tracing absolutely any day of the week. I find 1080p too blurry for me, and no ray tracing is going to change my mind. But hey, to each his own... I like what I like and you like what you like. But to trash people for wanting new tech and then, in the same statement, talk about being an avid ray tracing supporter... the irony is mind-blowing.
Yet you fail to see the reality and reasoning.

The increase in pixels from 1080p to 1440p is nothing; I don't see any "blurry" at all. If TAA makes a game blurry, that's something different, and it does that at all resolutions. There are sharpening filters to fix that issue.

So if the pixel increase from 1080p to 1440p is nothing, but the performance hit is noticeable, then RT on or off makes the bigger impact on image quality.
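For reference, the raw pixel math is easy to check. Whether the extra pixels are visible is subjective, but the render workload scales roughly with the count (a rough sketch that ignores per-frame fixed costs):

```python
# Pixel counts for the three common resolutions, relative to 1080p
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x 1080p)")
```

1440p pushes about 1.78x the pixels of 1080p and 4K exactly 4x, which is roughly how the fps hit tracks in GPU-bound games.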

People who say RT is just a buzzword either did not try it, or if they did, they are so used to the fake lighting and reflections we have had for dozens of years that they can't tell the difference... but that does not mean there is none. It's like being brainwashed into something: you might think it's okay, but it's not.

I guarantee you that if the world you see through your eyes right now were rendered with the fake lighting and reflections used in games before RT, you would feel something is very, very wrong.

The more you play using RT, the more you realize it is the correct way (or the closest yet, until path tracing) to light and reflect worlds and objects in games. It just looks right... there is no better way to say it, and the fake techniques look wrong.

So yes, I'll take the RT hit at 1080p any day, because I can make almost all games play with the max eye candy at over 60 fps (except CP 77), rather than play a downgraded game at 1440p with slightly more pixels, where I can't tell the difference, just the lower fps... yet I can tell the difference between RT on and off. The more you use it, the more you can tell the difference... I have taken dozens of screenshots in games with RT on vs off, and the difference is there and clearly noticeable.

Everyone will come to the same conclusion in a few years. It's not that I have some extra senses or anything; I just unlearned the fake ways I was used to for so many years, and now I can see the difference.

Next-gen GPUs, Lovelace and RDNA3, with at least 2x the performance, will bring RT to a lot more people. We will have this argument less and less going forward. RT is the right way, and then comes path tracing.

P.S. I would not use RT to play under 60 fps, though; that is a NO from me. That's why CP 77 has RT off: you need at least an RTX 3080 with DLSS on for 1080p RT on in that unoptimized game.
 

Ohnooze

Posts: 160   +370
Yet you fail to see the reality and reasoning.

The increase in pixels from 1080p to 1440p is nothing; I don't see any "blurry" at all. If TAA makes a game blurry, that's something different, and it does that at all resolutions. There are sharpening filters to fix that issue.

So if the pixel increase from 1080p to 1440p is nothing, but the performance hit is noticeable, then RT on or off makes the bigger impact on image quality.

People who say RT is just a buzzword either did not try it, or if they did, they are so used to the fake lighting and reflections we have had for dozens of years that they can't tell the difference... but that does not mean there is none. It's like being brainwashed into something: you might think it's okay, but it's not.

I guarantee you that if the world you see through your eyes right now were rendered with the fake lighting and reflections used in games before RT, you would feel something is very, very wrong.

The more you play using RT, the more you realize it is the correct way (or the closest yet, until path tracing) to light and reflect worlds and objects in games. It just looks right... there is no better way to say it, and the fake techniques look wrong.

So yes, I'll take the RT hit at 1080p any day, because I can make almost all games play with the max eye candy at over 60 fps (except CP 77), rather than play a downgraded game at 1440p with slightly more pixels, where I can't tell the difference, just the lower fps... yet I can tell the difference between RT on and off. The more you use it, the more you can tell the difference... I have taken dozens of screenshots in games with RT on vs off, and the difference is there and clearly noticeable.

Everyone will come to the same conclusion in a few years. It's not that I have some extra senses or anything; I just unlearned the fake ways I was used to for so many years, and now I can see the difference.

Next-gen GPUs, Lovelace and RDNA3, with at least 2x the performance, will bring RT to a lot more people. We will have this argument less and less going forward. RT is the right way, and then comes path tracing.

P.S. I would not use RT to play under 60 fps, though; that is a NO from me. That's why CP 77 has RT off: you need at least an RTX 3080 with DLSS on for 1080p RT on in that unoptimized game.
I understand your logic just fine.
It really comes down to two things...what you're used to and personal preference.
Yes, it's a fact that RT is much more realistic and I never argued against that. However, just how much picture quality it adds is somewhat subjective. I personally like RT but don't think the upgrade to the lighting and whatnot is worth taking a resolution or huge frame rate hit. I much prefer 1440p to RT but I'm very used to that resolution now and 1080p is absolutely noticeable to my eyes.
Again, to each his own. We all have preferences, but in the end your chasing ray tracing is no different from someone else chasing a higher resolution. You're a sheeple just like everyone else.

EDIT: I will say that how noticeable the difference between 1080p and 1440p is may vary depending on screen size and whatnot.
 

Lionvibez

Posts: 2,628   +2,389

I understand your logic just fine.
It really comes down to two things...what you're used to and personal preference.
Yes, it's a fact that RT is much more realistic and I never argued against that. However, just how much picture quality it adds is somewhat subjective. I personally like RT but don't think the upgrade to the lighting and whatnot is worth taking a resolution or huge frame rate hit. I much prefer 1440p to RT but I'm very used to that resolution now and 1080p is absolutely noticeable to my eyes.
Again, to each his own. We all have preferences, but in the end your chasing ray tracing is no different from someone else chasing a higher resolution. You're a sheeple just like everyone else.

EDIT: I will say that how noticeable the difference between 1080p and 1440p is may vary depending on screen size and whatnot.

Well said.
 

Kosmoz

Posts: 579   +1,061
I understand your logic just fine.
It really comes down to two things...what you're used to and personal preference.
Yes, it's a fact that RT is much more realistic and I never argued against that. However, just how much picture quality it adds is somewhat subjective. I personally like RT but don't think the upgrade to the lighting and whatnot is worth taking a resolution or huge frame rate hit. I much prefer 1440p to RT but I'm very used to that resolution now and 1080p is absolutely noticeable to my eyes.
Again, to each his own. We all have preferences, but in the end your chasing ray tracing is no different from someone else chasing a higher resolution. You're a sheeple just like everyone else.

EDIT: I will say that how noticeable the difference between 1080p and 1440p is may vary depending on screen size and whatnot.
Ok, let's assume you are right and I'm also a "sheeple". So let's see if I really am.

1. How much did you pay for your GPU?
2. How many years can you run it at 1440p without downgrading graphics settings from Ultra-High to lower than that (I'm being reasonable here since you can't use RT properly at that resolution with current GPUs)?
3. How many fps is the minimum you accept as playable?

My reasoning is: at 1080p I pay less, and less often, for the same (and more often than not better) image quality than those who pay more to play at 1440p and 4K and upgrade more frequently.

I don't play their marketing game of chasing the dangling carrot of non-stop upgrading and paying higher and higher prices for the GPUs that are supposed to deliver the minimum power required for 1440p/4K. Not to mention, my screen is also much cheaper.

For me sheeple are those people, with more money than sense (which does not necessarily mean rich). I don't do that. But hey, since this is all subjective, if it makes you feel better, you can consider me one too.

People really have to understand the difference between a game running at 1440p or 4K with low-resolution assets, like games from just a few years ago, and a game running at 1080p with high-resolution assets (yes, even 4K textures). It's not the same thing.
Take a six-year-old game and run it at 4K and it looks horrible next to any 2020 game at 1080p Ultra. It doesn't even need RT. The assets in newer games are so much better that screen resolution is not the most important factor.

Asset resolution, photogrammetry, skin shading, the AA technique, and all the lighting, shading, visual FX, and post-processing (+RT) - those are much more important and are what make a game look amazing; resolution comes only after that, at the end.

Just look at Ratchet & Clank: Rift Apart, which has one of the best engines (until UE5 games come) and dev teams too (but that's another subject) - look how amazing that game looks in RT mode at 1080p! You would not say it looks blurry at all, or that it is running at 1080p. It's because the quality of everything in that game is at the top level, not because of the resolution.

So yes, I stand by what I said in my first post. I'll take 1080p Ultra+RT any day over 1440p Medium or High with nothing extra. And 4K is a joke with current GPUs, not even worth debating. I'm talking about AAA games, not CS:GO. No one cares about that.
 

Ohnooze

Posts: 160   +370
Ok, let's assume you are right and I'm also a "sheeple". So let's see if I really am.

1. How much did you pay for your GPU?
2. How many years can you run it at 1440p without downgrading graphics settings from Ultra-High to lower than that (I'm being reasonable here since you can't use RT properly at that resolution with current GPUs)?
3. How many fps is the minimum you accept as playable?

So yes, I stand by what I said in my first post. I'll take 1080p Ultra+RT any day over 1440p Medium or High with nothing extra. And 4K is a joke with current GPUs, not even worth debating. I'm talking about AAA games, not CS:GO. No one cares about that.
1. I paid around $500 I think but sold my other card so closer to $300 or so when it was all said and done.
2. I honestly don't know. I guess it just depends on the particular card I bought.
3. I'm good with 60.

Like I said, I understand your argument. I get where you're coming from, and yeah, graphics are more than resolution... no doubt. That said, I value a higher resolution over RT. That doesn't mean I would be happy playing Pac-Man just because it was in 8K.
And yeah I agree that 4k performance is way too demanding.
 

Lionvibez

Posts: 2,628   +2,389
1. I paid around $500 I think but sold my other card so closer to $300 or so when it was all said and done.
2. I honestly don't know. I guess it just depends on the particular card I bought.
3. I'm good with 60.

Like I said, I understand your argument. I get where you're coming from, and yeah, graphics are more than resolution... no doubt. That said, I value a higher resolution over RT. That doesn't mean I would be happy playing Pac-Man just because it was in 8K.
And yeah I agree that 4k performance is way too demanding.

Yup, even the current top-end GPUs, the 3090 and 6900 XT, are only 4K60 cards, and you still have to drop settings in some games to sustain that. Adding RT just means having to use upscaling, and meh, I will take native res over upscaling any day. The next gen of GPUs will really be the ones people want for 4K Ultra-settings performance.