The Best 1080p Gaming Monitors in 2021

Still with the AOC 24G2 recommendation? Is it laziness or what? :p
A few new models have come out that are even better than the AOC 24G2 in my opinion, for example the Acer Nitro XF243Y and LG 24GN600.
 
Maybe re-title this article "The Best of the Worst Monitors of 2021"... Now, 1080p is perfectly fine for office use, but the emphasis on frame rates led me to believe this article was about 'gaming'. And 1080p and gaming in the same sentence... oxymoronic.
 
I wouldn't dismiss 1080p as the main gaming resolution just yet.

Though I do have 4K monitors and TVs, I still appreciate 1080p. Assassin's Creed Odyssey looks absolutely stunning in 1080p. I play it with all detail levels set to max. Beats the crap out of trying to balance compromised detail levels at higher resolutions.

 
Maybe re-title this article "The Best of the Worst Monitors of 2021"... Now, 1080p is perfectly fine for office use, but the emphasis on frame rates led me to believe this article was about 'gaming'. And 1080p and gaming in the same sentence... oxymoronic.
Given how expensive video cards are right now, 1080p is great for gaming.
1080p is the best thing for budget gamers right now.
For some, it is the only way to play, actually.
 
Maybe re-title this article "The Best of the Worst Monitors of 2021"... Now, 1080p is perfectly fine for office use, but the emphasis on frame rates led me to believe this article was about 'gaming'. And 1080p and gaming in the same sentence... oxymoronic.

Given that 240 Hz plus is an absolute requirement for millions of gamers, statements like this one are beyond absurd. I have a 1440p 120 Hz monitor, and while it's nice, its input lag and response time are dogshit compared to my 280 Hz 1080p Asus TUF. If all you play is battle royale, then 240 Hz is a key advantage. We don't spend 20 hours per week aim-training in Kovaak's to run sub-100 fps games. We don't use V-Sync, G-Sync, adaptive sync or any other nonsense, because you don't need it at 200+ FPS.
 
And 1080p and gaming in the same sentence...
...Continues to form 67% of the market, with only 8% on 1440p, 2.4% on 2160p, 2.6% on Ultrawide and the rest using *less* than 1080p?... (link)

I even left 2K behind a few years ago. So, imagine 1K!
Well if "4k" is measured by counting horizontal pixels and is still "4k" despite 3840 falling short of 4000 pixels by 4%, then 1920 width (1080p) literally is 2k to the exact same 4% short of 2000 ratio (and 1440p is 2.6k, with "1k" being more like 1024x768). The "2k monitor" marketing is completely nonsensical enough (even before Ultrawides made it even less relevant) that it's actually easier, more accurate and less confusing just to quote the full resolution instead of the "shorthand". :laughing:
 
1080p? How quaint. I think I once used a 1080p monitor back in the early 90s.

Technically, 4K is either DCI 4K at 4096 x 2160 or UHD at 3840 x 2160; DCI 2K is 2048 x 1080, FHD is 1920 x 1080, QWXGA is 2048 x 1152, QHD is 2560 x 1440 (Q is for quad, as it's 4x the pixels of HD, which is 1280 x 720), WQXGA is 2560 x 1600, and UWQHD is 3440 x 1440 (ultrawide monitors).

There are a ton of named and unnamed resolutions. I think we should only use 4K for 4096 x 2160 and UHD for 3840 x 2160, but really UHD should have been called QFHD, as it's 4x the pixels of FHD, which is 1920 x 1080. Why there was a different naming convention for HD and FHD is stupid.
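The "Q = quad" relationships above are easy to verify with pixel counts (a quick sketch using just the resolutions named in the post):

```python
# Common named resolutions and their dimensions (width, height).
res = {
    "HD":     (1280, 720),
    "FHD":    (1920, 1080),
    "QHD":    (2560, 1440),
    "UHD":    (3840, 2160),
    "DCI 2K": (2048, 1080),
    "DCI 4K": (4096, 2160),
}
pixels = {name: w * h for name, (w, h) in res.items()}

# The "quad" naming really is about total pixel count:
assert pixels["QHD"] == 4 * pixels["HD"]   # QHD = quad-HD
assert pixels["UHD"] == 4 * pixels["FHD"]  # UHD = quad-FHD ("QFHD")
```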
 
...Continues to form 67% of the market, with only 8% on 1440p, 2.4% on 2160p, 2.6% on Ultrawide and the rest using *less* than 1080p?... (link)


Well if "4k" is measured by counting horizontal pixels and is still "4k" despite 3840 falling short of 4000 pixels by 4%, then 1920 width (1080p) literally is 2k to the exact same 4% short of 2000 ratio (and 1440p is 2.6k, with "1k" being more like 1024x768). The "2k monitor" marketing is completely nonsensical enough (even before Ultrawides made it even less relevant) that it's actually easier, more accurate and less confusing just to quote the full resolution instead of the "shorthand". :laughing:
Nope, 1920x1080 is one quarter of 3840x2160. So it is 1K (using round numbers) and not 2K. There is an error in your reasoning. I used 4K, 2K and 1K because, though not exactly accurate, they are simpler to describe than UHD, QXGA, QHD, WQXGA, etc. (what a mess!)
 
Nope, 1920x1080 is one quarter of 3840x2160. So it is 1K (using round numbers) and not 2K. There is an error in your reasoning. I used 4K, 2K and 1K because, though not exactly accurate, they are simpler to describe than UHD, QXGA, QHD, WQXGA, etc. (what a mess!)
No, what you're doing there is the resolution equivalent of saying a 32" monitor is 4x bigger than a 28.4" monitor because you took the former's screen width (27.8"), "compared" it to the latter's height (13.9"), and falsely cross-compared two different things by measuring them differently for no logical reason whatsoever. Your "quadrupling" maths only works when you're actually comparing pixel count vs pixel count, e.g. 8 megapixels (2160p) vs 2 megapixels (1080p) is correct. "If 3840 is 4k then 1920 is 1k" is not accurate at all, as the "4k" label is based specifically on pixel width (3840), not pixel count, and you do have to compare like for like for any label to have meaning. (Imagine the huge mess of shopping for a printer where A3-sized printers were measured across the paper diagonal but A4 printers were mis-sold as A5 by measuring across the paper width, just so those with A3 or larger printers could feel more special about the label...) :joy:

Monitor "k" numbers are as nonsensically useless as marketing labels have ever gotten, far worse than just quoting the resolution like most people do which at least are comparing the same thing. That's why literally no-one has even used "1k / 2k" labels when shopping for a 1080p screen before "4k" was invented. They definitely seem to be some new epeen based labels that 4k owners invented recently to stick on any monitor lower than 4k despite the "1k" and "2k" labels being utter gibberish that don't accurately describe anything about the monitor in question in its own right nor are consistent with how 4k is measured (pixel width alone)... (n) (N)
 
To be honest, I enjoy being "left behind" playing at 1080p. I even bought that AOC last summer because of the ergonomics, and because I needed to reclaim some space on my desk.

In fact, I expect to be playing at 1080p until another resolution takes its spot as the entry-level resolution. I don't need more pixels, and it lets me cheap out on graphics horsepower.
 
To be honest, I enjoy being "left behind" playing at 1080p. I even bought that AOC last summer because of the ergonomics, and because I needed to reclaim some space on my desk.

In fact, I expect to be playing at 1080p until another resolution takes its spot as the entry-level resolution. I don't need more pixels, and it lets me cheap out on graphics horsepower.
Once you play in 4K you cannot return to any lower resolution. I have experienced this first-hand.
 
Given that 240 Hz plus is an absolute requirement for millions of gamers, statements like this one are beyond absurd. I have a 1440p 120 Hz monitor, and while it's nice, its input lag and response time are dogshit compared to my 280 Hz 1080p Asus TUF. If all you play is battle royale, then 240 Hz is a key advantage. We don't spend 20 hours per week aim-training in Kovaak's to run sub-100 fps games. We don't use V-Sync, G-Sync, adaptive sync or any other nonsense, because you don't need it at 200+ FPS.
I play RTS games; a 60 Hz VA monitor is fair enough. I play Genshin Impact; 60 fps is fair enough. I play lots of other games... I play League of Legends, and while I have to admit team fights are cleaner at 120 Hz, I can reach Platinum with a 60 Hz VA monitor as well.
Shooters are one part of gaming, not all of it. Oh, and with many 144/165 Hz monitors you can use strobing, which, depending on its implementation, can noticeably increase the perceived clarity of the frame during fast camera movement, even compared with 240/280 Hz. (I had a Dell TN 165 Hz 1080p with strobing, and it was very fast.) Yes, you would lack some fluidity, since you'd have half the fps compared with 240/280 Hz, but that is expensive in dollars and not always achievable in every game, even with an expensive video card.
 
To be honest, I enjoy being "left behind" playing at 1080p. I even bought that AOC last summer because of the ergonomics, and because I needed to reclaim some space on my desk.

In fact, I expect to be playing at 1080p until another resolution takes its spot as the entry-level resolution. I don't need more pixels, and it lets me cheap out on graphics horsepower.
Yep, I'm staying with 1080p too. I have tried 2K, and my brother is using a 4K 60 Hz, so I know them. Honestly, I don't mind about the ergonomics; we can always buy an arm to position the monitor as we wish... what matters is the monitor itself.
The main problem with resolution and fps is that you have to pair the monitor with your video card, and that stuff costs. So the choice really depends on what you are going to use your PC for.
Anyway, I believe you can be a pro gamer with 1080p at 144/165 Hz too, with a good colour gamut (IPS) and optional strobing that gives you clean camera movement (in games like shooters). IPS gives you better picture quality compared with TN; you would only lack good contrast compared with VA. I really hope we can have affordable OLED monitors in a few years... I'm following JOLED with hope.
 
Maybe re-title this article "The Best of the Worst Monitors of 2021"... Now, 1080p is perfectly fine for office use, but the emphasis on frame rates led me to believe this article was about 'gaming'. And 1080p and gaming in the same sentence... oxymoronic.
1080p is the most widely used resolution for gaming. Cost and FPS mean something when you're on a budget but want to play FPS games... and if you aren't playing FPS games then you aren't really a gamer, are you... you're just a pretender.

1080p = cheaper PSU, graphics card and gaming monitor, while managing to achieve a high FPS average for a more fluid gaming experience.

https://www.amazon.com/Pixio-Radeon-FreeSync-Esports-Monitor/dp/B08TYX8HN6/
Pixio PX248 Prime S 24 inch 165Hz IPS 1ms FHD 1080p AMD Radeon FreeSync Esports IPS Gaming Monitor $179.99
 
1080p is the most widely used resolution for gaming. Cost and FPS mean something when you're on a budget but want to play FPS games... and if you aren't playing FPS games then you aren't really a gamer, are you... you're just a pretender.

1080p = cheaper PSU, graphics card and gaming monitor, while managing to achieve a high FPS average for a more fluid gaming experience.

Nah, 1440p is the new 1080p; time to put that crap resolution out to pasture. 1440p screens are plenty cheap now, and even a GTX 1070 can handle most games at 1440p.

https://www.amazon.com/Pixio-Radeon-FreeSync-Esports-Monitor/dp/B08TYX8HN6/
Pixio PX248 Prime S 24 inch 165Hz IPS 1ms FHD 1080p AMD Radeon FreeSync Esports IPS Gaming Monitor $179.99
 
I bought the AOC 24G2 based on what the review said about it, and I can say it's bloody nice compared to the ViewSonic VA monitor I had. The only thing I dislike is the stupid button positions and size; luckily they also make an app (i-Menu) that allows you to change settings inside Windows.
 