Upgrade Your Monitor, Not Your GPU

Yup, it may look fancy, but it won't help with SPEED.... how many people out there think nothing of spending thousands??
and then there is a new wave of quantum dot monitors coming, very cheap at $2000.. LOLOLOL
 
I don't agree with the logic. The GPU market is awful, and the way to mitigate that is to wait longer between purchases. The monitor market is in an innovation phase where the products keep getting better and cheaper, therefore as long as your monitor works you should wait for the stagnation period. You will get an excellent monitor for a great price.
With so much competition in the display arena, the stagnation period is nowhere in sight, at least as I see it. We have two competitive OLED technologies, WOLED and QD-OLED. The WOLED technology branches into MLA and non-MLA panels, both of which are hitting new peak-brightness highs with every generation. Then there are Chinese brands like TCL and others attempting to improve mini-LED technology. This is all before we actually get micro-LED displays, which are supposed to offer OLED's benefits without the burn-in risk. Then there is micro-OLED, which is currently only in miniature displays for VR; if they can scale that up to larger displays, the PPI will destroy anything else on the market imo. So unlike the graphics card market, where we have a duopoly, the display market looks like it's at the beginning of a golden age.
 
Is this not one of the biggest facepalm moments in TechSpot journalistic history? I know there's a passing mention of it, but any monitor upgrade is pretty well certain to push you into a GPU upgrade; stands to reason.
This. I own 4K monitors with an RTX 3070 and it is consistently a problem for most games these days. They market the cards as "4K ready", meanwhile I can barely get 60 fps on some of the lowest settings. Waiting for the 5000 series now and will most likely grab the flagship model this time around.
 
I agree that gaming at 1440p makes a huge difference, but my monitor (Asus ProArt 27) only supports 60Hz. I'm actually fine with that: network lag is more of an issue most of the time, so a higher refresh rate is a waste of money for me. Keep that in mind when looking at monitors. Yes, 120Hz would be a major improvement if my pings were <20ms all the time, but they vary from <20ms to >1000ms (a full second) and I start rubber banding.

I'd suggest going with a 120Hz refresh rate in the 27-inch range, and using a monitor arm unless you don't have the room for one. They're cheap enough if you want something basic, they allow the use of a larger unit on your desktop, and they open up desk real estate so you can actually use it.
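A quick way to put the refresh-versus-ping tradeoff above into numbers (a rough sketch; the values are illustrative, not measurements):

```python
# Frame interval vs. network latency: why a high or unstable ping can
# swamp the benefit of a faster monitor.

def frame_interval_ms(refresh_hz: float) -> float:
    """Time between displayed frames at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.1f} ms per frame")

# Going 60 -> 120 Hz shaves ~8.3 ms off each frame; a ping that swings
# from under 20 ms to several hundred ms dwarfs that saving.
saving = frame_interval_ms(60) - frame_interval_ms(120)
print(f"60 -> 120 Hz saving: {saving:.1f} ms per frame")
```

The point being: a faster panel helps once latency is already stable; it can't paper over rubber banding.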
 
A compelling argument for why I should get a new monitor over a new video card, except you miscalculated: I want both.
 
Ah yes, third party utilization of the Steam hardware survey for monitor marketing purposes.
 
This. I own 4K monitors with an RTX 3070 and it is consistently a problem for most games these days. They market the cards as "4K ready", meanwhile I can barely get 60 fps on some of the lowest settings. Waiting for the 5000 series now and will most likely grab the flagship model this time around.
And it gets worse with each passing year as we get these stunning new games on Unreal Engine 5 which put even a 4090 on its knees.
I, too, have a 4K monitor (non-gaming stuff mostly) and a 3070. Good for older games, not so much for new ones.
 
The problem with the PC market is that everything is an "it depends." The stuff people have in their PCs is wildly different; what you upgrade is entirely dependent on what you upgraded last and what you need.

If you are new to PC gaming and, like many normal people, probably just got a cheap office computer or bought yours more than 5 years ago, then maybe you should upgrade to a gaming monitor with a high refresh rate if you want action, or something with better colour and detail if you are more into slow, immersive games. But if you already upgraded your monitor in the last couple of years and you are running a 1060, yeah, a GPU is probably where it's at if you do want to upgrade. It is probably not wrong to say there is a bit of a disconnect between what monitor to have in relation to your GPU/CPU, and even for those two you have a better chance of finding a recommendation in a random Reddit post than from any tech outlet.

Honestly, if I am to make an out-there recommendation on what to get, it's probably better speakers or headphones. For my PlayStation, my TV surround sound unit has been having a little trouble recently, so I had to use the TV speakers; the difference is silly and I am using headphones even when gaming alone. Whereas on my desktop, after fiddling around for ages with cheap speakers and stuff like SteelSeries and Logitech (basically "gamer"-oriented junk), I opted for Q Acoustics speakers, wired Audio-Technica headphones that I semi-retired when I decided to go wireless, and a Blue Yeti mic (I had to do presentations on Zoom, so that's a plus). Together with a cheap DAC, I have a great sound setup for everything I do with my desktop. Thing is, save for the speakers, which I got because I wanted high-quality output for music, the stuff I got wasn't expensive but the quality is top notch.
 
The arrival of OLED monitors is the trigger here. It's vastly superior in almost every way. But it hasn't quite trickled down to the mainstream/budget segments yet, so I'd only follow this advice if you have a big war chest for the hobby. I do currently, as I haven't bought any other hardware this year, and have just ordered a new QD-OLED display.

However I'd advise most people to be more sensible than me, and hold onto what they have and wait 6 months to a year.
 
I bought a monitor with better/deeper blacks, and it was night and day while watching movies and playing games that have dark settings. Blacks actually look black and don't have a grey glow like on my previous cheap monitor.
 
As always, it depends what you really want.
I had a 6700 XT GPU paired with a five-year-old 1440p 144Hz monitor.
I was torn between upgrading to a 7800 XT GPU or a 4K monitor.
Conventional wisdom suggests pairing a mid-range 6700 XT with a 4K monitor is a terrible idea.
However, I tend to mainly play older games, so thought I might get away with it, with occasional settings or resolution changes.
I went for an LG UltraGear 4K Gaming Monitor 27GR93U and couldn't be happier. Compared to my previous monitor, games look so much more vibrant, and that's with only 'pretend' HDR400.
Most games I've tried run absolutely fine in 4K. My son is also playing competitive FPSes in 4K, with high frame rates.
I've tested dropping the resolution to 1080p in case I get something much more demanding, and I can honestly say I can barely tell any difference from 4K (1080p divides evenly into 4K, so each 1080p pixel maps onto a clean 2x2 block of panel pixels), so I'll happily do that. Note, mine is 27"; a 32" may look a little more 'pixelated'.
So, in conclusion, I think this is a valid suggestion if you want a way to improve your visual experience.
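The "barely tell any difference" result above has a tidy explanation: the scale factor from 4K down to 1080p is an exact integer, while something like 1440p is not. A minimal sketch of that arithmetic (resolutions only; actual scaler behaviour varies by monitor and driver):

```python
# Scale factor between a panel's native resolution and a rendered resolution.
# An integer factor (e.g. 2.0) allows clean pixel doubling; a fractional
# factor (e.g. 1.5) forces blurrier interpolation.

def scale_factor(native, rendered):
    nw, nh = native
    rw, rh = rendered
    return (nw / rw, nh / rh)

print(scale_factor((3840, 2160), (1920, 1080)))  # (2.0, 2.0): exact 2x2 blocks
print(scale_factor((3840, 2160), (2560, 1440)))  # (1.5, 1.5): non-integer, softer
```

This is why 1080p tends to hold up better on a 4K panel than intermediate resolutions do.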
 
Not so sure about HDR being a "game changer". On my Alienware OLED, colors always look more washed out in HDR mode. As such, I keep HDR off to enjoy the vibrant colors of OLED instead of the washed-out colors of HDR. Proper HDR implementation still has a long way to go imho.
 
Once again...

Upscaling is a gimmick for older cards, to extend their life and easily allow older systems to move up to higher resolutions without losing performance.

That is why so many GTX 1080 fans love AMD's FSR...! But who is buying a new GPU today all geeked up over upscaling...?
 
High refresh rate is great and there's no drawback to that, but going to 4K without upgrading the video card is a big mistake, as you will see a serious frame rate drop in many modern games. Furthermore, games require more VRAM when running at higher resolution, so again, you need a stronger video card.
On top of that, once you commit to 4K you need to be willing to sustain a much higher video card budget for as long as you keep running games at the monitor's native 4K resolution (because that's how the game looks best, and any other resolution would look awful on that monitor).
I could have purchased a 4K monitor but instead, I went for a quality 27" 1440P one for this reason alone.
 
1080p? Luxury! Believe it or not, I'm still running a 1280x1024 monitor, so I run games at 1280x720 windowed. If I want over 60Hz, it DOES support 75Hz though LOL.

I am glad gamers are so demanding though; it's driven the price of a nice monitor (one that "only" supports 120Hz rather than even higher refresh rates) down to like $80.
 
Will a better quality monitor be worth the cost of the upgrade? Yes, in the same way that any well-thought-out upgrade should improve a gamer's experience. Is the upgrade as important as the article makes it out to be? As I've said before, bottlenecks are software related, not hardware. If you play fast-paced multiplayer FPSes like Battlefield or COD, you want really high FPS, to the point where a 200Hz 1080p monitor will serve the end user better than a 75Hz 1440p one. HDR is a major game changer, but to really benefit from the feature you need a top-tier monitor going for $1,000 plus.

In the end everything comes down to what the end user is going to do. I went from an IPS 5ms 1440p 60Hz to an IPS 1ms 1440p 170Hz, and TBH for the games I play I don't really notice any difference. I'm playing single-player games exclusively, except for WoW, but I don't PvP. Beyond that, I mostly play single-player sims, RTS, strategy, and RPGs. And the difference between 60Hz and 120Hz (my default with the new monitor) isn't mind-blowing by any means. The new monitor has the lowest tier of HDR, which is pretty much crap. So while a high-end OLED is intriguing, I can't justify the cost, and my current visuals are fine IMHO. So maybe one day, but not till both reliability and cost improve, I'm afraid. I've got to live in the real world with a real-world budget...
 