Best HDR Gaming Monitors 2022, and What Fake HDR Monitors to Avoid

it's a high dynamic range image after all. This means it needs to support a high level of brightness: HDR images are brighter than SDR...
When you start with a false premise, it taints your conclusions. The "R" in HDR doesn't stand for brightness. Due to how the human eye works, unless you're using a monitor in conditions of high ambient light, the range is far more important than peak brightness. It's only when a monitor's "blacks" are actually grey that you need high brightness to make them appear dark by contrast.
 
When you start with a false premise, it taints your conclusions. The "R" in HDR doesn't stand for brightness. Due to how the human eye works, unless you're using a monitor in conditions of high ambient light, the range is far more important than peak brightness. It's only when a monitor's "blacks" are actually grey that you need high brightness to make them appear dark by contrast.
It is not possible to show a high dynamic range image without many brightness levels to support it. On a 250-nit screen fed an HDR image (HDR-to-SDR tone mapping), many scenes will be blown out or black-crushed, and the overall image will not represent the source, no matter what the external conditions are. Brightness also increases in small steps, and the problem appears when the source's dynamic range levels are finer than the display's 'brightness resolution' (a real term).
If source dynamic range > display brightness resolution, then several different levels of brightness get displayed as one. It is similar to the banding effect you get when you try to display a 14-bit gradient on a 6-bit screen.
So yes, higher maximum brightness translates to higher brightness resolution and a more detailed image that better represents the dynamic range of the source picture.
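A quick toy example of the flattening I mean, with made-up numbers (simple linear quantization, not a real display pipeline):

    # Toy illustration only: quantize a smooth 0-1000 nit ramp to a limited number
    # of display brightness levels and count how many source levels get merged.
    import numpy as np

    source = np.linspace(0.0, 1000.0, 4096)    # pretend source luminance ramp, in nits
    display_steps = 256                        # pretend display "brightness resolution"
    quantized = np.round(source / source.max() * (display_steps - 1))

    merged = len(source) - len(np.unique(quantized))
    print(merged, "of", len(source), "source levels were displayed as a neighbouring level")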
 
Brightness also increases in small steps, and the problem appears when the source's dynamic range levels are finer than the display's 'brightness resolution' (a real term).
If source dynamic range > display brightness resolution, then several different levels of brightness get displayed as one. It is similar to the banding effect,...
What you're referring to is an entirely different concept. The "step of brightness", or bit depth, refers to the number of luminosity steps for each color channel. HDR10, for instance, specifies 10 bits, or 1,024 steps.

Furthermore, you have it almost entirely backwards. The brightness variance between any two steps is the total luminosity divided by the step count: a 10-bit 1000-nit monitor has double the variation of a 10-bit 500-nit monitor. The extra brightness is useful for increasing contrast ratio, not for reducing banding.
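To put rough numbers on that (a simplified linear view, since real HDR transfer curves are nonlinear): 1000 nits / 1024 steps ≈ 0.98 nits per step, versus 500 nits / 1024 steps ≈ 0.49 nits per step, so the brighter panel's steps are coarser, not finer.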
 
I'm holding off on a 4K HDR gaming monitor for the reasons you describe here. I am not going to pay $1,000+ for a monitor; it's just too expensive. Eventually decent 4K HDR high-refresh-rate monitors will be released that are under $600.
 
What you're referring to is an entirely different concept. The "step of brightness", or bit depth, refers to the number of luminosity steps for each color channel. HDR10, for instance, specifies 10 bits, or 1,024 steps.

Furthermore, you have it almost entirely backwards. The brightness variance between any two steps is the total luminosity divided by the step count: a 10-bit 1000-nit monitor has double the variation of a 10-bit 500-nit monitor. The extra brightness is useful for increasing contrast ratio, not for reducing banding.
HDR10 indeed specifies 10 bits per color, which gives 1 million steps per pixel.
No screen can display 1 million brightness levels per pixel, but this is where higher maximum brightness helps a lot. And no, the steps of screen brightness are not related to the screen's total advertised colour count. Brightness steps are not bits or colour depth; they are backlight (or single-diode) energy levels, unrelated to that, and they are not analog.

And I might not have been clear in my reference to banding. A monitor with lower maximum brightness will display similarly luminous pixels at the same brightness much more often than a screen with high maximum brightness. Meaning, the range of the source will not matter, as it will be mapped to what the target display can show, resulting in lossy compression that removes luminance detail; therefore an HDR image converted to SDR and an SDR image will look the same on such a screen.

The displayable range is limited by maximum brightness simply because with more of it you can differentiate more subtle transitions and show more detail before black/white crush happens. Tone-mapping algorithms will obviously try to squeeze the image to look nice, but if there is not much display range to work with, the result will be worse.
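A toy sketch of what I mean by squeezing, with made-up numbers (a naive scale-to-peak, not a real tone-mapping operator):

    # Toy illustration only: map a 0-1000 nit source onto a 250-nit display two ways.
    import numpy as np

    source = np.linspace(0.0, 1000.0, 1024)   # pretend HDR scene luminances, in nits
    peak = 250.0                              # pretend display peak brightness

    clipped = np.minimum(source, peak)        # no tone mapping: highlights burn to flat white
    scaled = source * (peak / source.max())   # naive tone mapping: whole image gets darker

    print("source values clipped to peak:", int(np.sum(source > peak)))
    print("average brightness after scaling:", round(scaled.mean(), 1), "nits")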

But yeah, that is just me talking, and we obviously have our own standpoints. Before continuing, please watch this video:
made by an expert on the topic. I recommend his other videos as well if you're interested in anything related to display image quality. (At 2:00 you can see this 'banding'-like flattening of pixel colours to accommodate a lower brightness range, but please watch it fully.)
 
I'm holding off on a 4K HDR gaming monitor for the reasons you describe here. I am not going to pay $1,000+ for a monitor; it's just too expensive. Eventually decent 4K HDR high-refresh-rate monitors will be released that are under $600.

This is exactly what I was thinking. The pricing is out of my budget. Like you, I'll stick to sub-$600, often well below that for a 34" curved.
 
While my Gigabyte M32U might not be "true HDR" or "good" for that matter, it's by far the best monitor I could afford, and it's a night-and-day difference compared to my previous screen. For a lot of people, these upgrades are massive. I'm very pleased with it; sad it didn't make the list.
 
I'm holding off on a 4K HDR gaming monitor for the reasons you describe here. I am not going to pay $1,000+ for a monitor; it's just too expensive. Eventually decent 4K HDR high-refresh-rate monitors will be released that are under $600.
It's true, but currently the C1 at 48 inches for $797 is the closest you will get to the ideal experience/quality/price. Unfortunately we might have to wait a few more years to get to the figure you mentioned.
The OLED high-refresh-rate gaming experience went from $1,500 to $797 in a matter of 3 years.
 
Before continuing, please watch this video [link] made by an expert on the topic.
You've misinterpreted that video. He shows a PQ curve and very clearly explains (3:46-4:02 in the clip) how values above the curve cause posterization (banding), whereas values below the curve cause the opposite problem -- loss of contrast detail. He then turns down a 650-nit screen to 400 nits and shows that this forces the screen image to always sit below the curve, causing not banding but loss of detail.

In other words, exactly what I said: lower peak brightness means less banding, not more.

Also, as an aside, I should point out the video, although technically accurate, is quite disingenuous. Look at the PQ curve -- he calibrated a 650-nit monitor to display any and all content above 70% brightness at the exact same 400 nit level. Of course that's going to crush detail in bright scenes. But that's not how a monitor is designed -- they should smoothly map brightness levels across their entire dynamic range. He set a monitor to ignore a third of its dynamic range, then compared it to the exact same monitor that was properly calibrated. Is anyone surprised which one won?
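For reference, the curve he plots is the SMPTE ST 2084 "PQ" EOTF, which maps code values to absolute luminance. A rough sketch of it (the constants come from the standard; the example code values and comments are only my own illustration):

    # Rough sketch of the SMPTE ST 2084 (PQ) EOTF; constants are from the standard,
    # the example values below are illustrative only.
    def pq_eotf(n):
        """Map a normalized code value n in [0, 1] to absolute luminance in nits."""
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        x = n ** (1 / m2)
        return 10000.0 * (max(x - c1, 0.0) / (c2 - c3 * x)) ** (1 / m1)

    print(round(pq_eotf(0.65)))   # ~390 nits: still within a 400-nit panel's range
    print(round(pq_eotf(0.75)))   # ~980 nits: far beyond what a 400-nit panel can reproduce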
 
You've misinterpreted that video. He shows a PQ curve and very clearly explains (3:46-4:02 in the clip) how values above the curve cause posterization (banding), whereas values below the curve cause the opposite problem -- loss of contrast detail. He then turns down a 650-nit screen to 400 nits and shows that this forces the screen image to always sit below the curve, causing not banding but loss of detail.

In other words, exactly what I said: lower peak brightness means less banding, not more.

Also, as an aside, I should point out the video, although technically accurate, is quite disingenuous. Look at the PQ curve -- he calibrated a 650-nit monitor to display any and all content above 70% brightness at the exact same 400 nit level. Of course that's going to crush detail in bright scenes. But that's not how a monitor is designed -- they should smoothly map brightness levels across their entire dynamic range. He set a monitor to ignore a third of its dynamic range, then compared it to the exact same monitor that was properly calibrated. Is anyone surprised which one won?
I didn't say lower brightness causes banding. I said it causes a reduction in detail, in a similar way to banding (by flattening neighbouring pixels in terms of brightness level, while real banding flattens neighbouring pixels based on colour depth). And exactly that is visible on a screen with lower maximum brightness, which directly addresses your first post.

And the video is correct -- he told us exactly what he was doing and that he disables tone mapping. Later he enabled tone mapping to let the screen level brightness across the dynamic range, which resulted in an overall darker image (because you need to squeeze the same information into a much lower brightness range) and detail loss anyway (because similarly bright pixels are allotted to fewer brightness steps -- and again, this does not mean colour or pixel depth).

What we saw was two screens, one accurately displaying the image as intended by the content producer, and the second not, only because its brightness was not high enough.
 
HDR10 indeed specifies 10 bits per color, which gives 1 million steps per pixel.
Aiy yai yai, no. If you're talking about luminance, 10 bits is 2^10 = 1,024 steps. If you're talking about color depth, then there are 3 color channels, meaning 30 bits, or 2^30 -- roughly 1 billion steps. Not a million.
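(The arithmetic: 2^10 = 1,024 levels per channel; three channels gives 2^30 = 1,073,741,824, about 1.07 billion combinations -- and those are code values, not nits.)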

And with all due respect, you're still missing the crucial point about that video. He took an OLED TV and recalibrated it to give up more than a third of its dynamic range, which, by definition, drops the contrast ratio by the same amount. Of course the image looks worse -- not because it's "less bright", but because it has less range.

The HDR aspect of those displays at 650 nits, or even 400 nits, is far superior to a non-OLED TV at 1000 nits. Why? Because an OLED displays blacks at about 0.0001 nits, rather than the 1 nit or so of a basic LCD. The range between the darkest black and the brightest white is what matters.
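Rough numbers to illustrate (using the figures above): 650 / 0.0001 = 6,500,000:1 for the OLED, versus 1000 / 1 = 1,000:1 for the basic LCD -- several thousand times the range, despite the lower peak.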

If brighter always meant better, we'd all turn up our monitor brightness to max, and ignore the washed-out results. When higher brightness translates to a higher dynamic range, yes the image improves. But the idea that "if it ain't a thousand nits it ain't HDR" is flatly absurd.
 
Best Below $1,000: Sony Inzone M9 27" -- Price: $1,227

Anyone see a problem here?

I don't even know how they can price it like that given the monitor is still in preorder phase. It releases later in August so you can't actually pick one up yet. Sony.com has it priced at $900 with $13 shipping.
 
I don't even know how they can price it like that given the monitor is still in preorder phase. It releases later in August so you can't actually pick one up yet. Sony.com has it priced at $900 with $13 shipping.
Best Buy and B&H Photo have it on pre-order for $899. One wonders why they didn't link to those prices instead of Amazon. Amazon, FYI, only shows one vendor at $1227 with delivery of 10-20 days, so it's likely the seller has no inventory yet.
 
Best Buy and B&H Photo have it on pre-order for $899. One wonders why they didn't link to those prices instead of Amazon. Amazon, FYI, only shows one vendor at $1227 with delivery of 10-20 days, so it's likely the seller has no inventory yet.
We reviewed the Sony Inzone ahead of release and linked to Amazon as it had it listed at MSRP, but prices there can fluctuate depending on the actual reseller. We have now switched the link to B&H, which is showing $898 as of writing.
 
We reviewed the Sony Inzone ahead of release and linked to Amazon as it had it listed at MSRP, but prices there can fluctuate depending on the actual reseller. We have now switched the link to B&H, which is showing $898 as of writing.
Sounds good. Thanks....
 
Should be a "list of really, really expensive monitors" as expected. I would be looking in the $100 to at most $400 range for a monitor.

Review snobs always gravitate to extremely expensive options, since as reviewers they have access to everything. It's the same issue with car reviews as with electronics or any other reviews; they'll say "only the V8 version of the truck is worth getting" (for an extra $10k).

I mean, I get it, and I know the easy reply: these are HDR monitors, apparently expensive by nature. Somehow I suspect monitors below $1,000 can have some level of HDR support though. Hell, I remember when HDR TVs were new they said we needed 10,000 nits to display true HDR; whatever happened to that? Since it seems physically impossible, over the years reviewers have quietly shifted to "1,000 nits is HDR"... In conclusion, I'd probably be thrilled with a DisplayHDR 400 monitor, and that's probably what I'll end up with.

Same with TVs again: there isn't a magic HDR brightness number. It's incremental, not binary. I'm looking at a TV with wide color gamut, FALD, 20,000:1 contrast, and ~500 nits of brightness. Is that HDR? What about my 1,000+ nit TV from 2018, but with only ~6,000:1 contrast w/FALD and less color gamut IIRC (it was pre-QLED)? Which one is more HDR-y?
 
Bought an MSI MPG321UR-QD, but that hurt my eyeballs, so I exchanged it for an Asus PG32UQX. Eyeballs happy but wallet sad now.
 
In reality, buy what you can afford, since proper HDR on PC is seriously not practical due to the price. While prices of HDR-capable screens are coming down, they are still not wallet-friendly. And people like me, who use their monitor 80% for work and 20% for games and light usage, won't find it meaningful to buy a high-end monitor.
 
It's true, but currently the C1 at 48 inches for $797 is the closest you will get to the ideal experience/quality/price. Unfortunately we might have to wait a few more years to get to the figure you mentioned.
The OLED high-refresh-rate gaming experience went from $1,500 to $797 in a matter of 3 years.
In terms of pricing that seems good. Burn-in, however, is still a thing.
 
I don't care much about HDR. I just want a 27" 1440p or 32" 2160p, 144 Hz, FreeSync, flat QD-OLED for daily use - one that won't cost me a kidney.
 
Agree with both top choices - these were my shortlist when I was replacing my oversized CX 48 in spring this year. I originally opted for the AW3423DW and ordered on the day of release in the UK, but it was another paper launch and wasn't shipping till August. In the end I went with the C2 42, partly because of availability, but also after reading about the sub-pixel layout / text issue (I also use my monitor for working). I'm glad now that I did change my mind, as the C2 is phenomenal. The 4K screen in standard 16:9 aspect ratio with better built-in sound and RGBW pixel layout makes this a better all-round choice in my view. Edge effects can still be present on some text and charts, but they are less obvious than on the CX 48, probably due to the higher dpi. Buying a monitor has always meant weighing competing choices and compromises, and that's before factoring in the price; OLED hasn't changed that. I wouldn't ever go back to LCD now though.
 
I don't care much about HDR. I just want [a] QD-OLED for daily use - one that won't cost me a kidney.
Well, god did give you two for a reason; why not give up one in a good cause?

In any case, since all QD-OLED displays -- and in fact all current 4K OLEDs -- are HDR already, you won't have to choose between them.
 
Should be a "list of really, really expensive monitors" as expected. I would be looking in the $100 to at most $400 range for a monitor.

Review snobs always gravitate to extremely expensive options, since as reviewers they have access to everything. It's the same issue with car reviews as with electronics or any other reviews; they'll say "only the V8 version of the truck is worth getting" (for an extra $10k).

I mean, I get it, and I know the easy reply: these are HDR monitors, apparently expensive by nature. Somehow I suspect monitors below $1,000 can have some level of HDR support though. Hell, I remember when HDR TVs were new they said we needed 10,000 nits to display true HDR; whatever happened to that? Since it seems physically impossible, over the years reviewers have quietly shifted to "1,000 nits is HDR"... In conclusion, I'd probably be thrilled with a DisplayHDR 400 monitor, and that's probably what I'll end up with.

Same with TVs again: there isn't a magic HDR brightness number. It's incremental, not binary. I'm looking at a TV with wide color gamut, FALD, 20,000:1 contrast, and ~500 nits of brightness. Is that HDR? What about my 1,000+ nit TV from 2018, but with only ~6,000:1 contrast w/FALD and less color gamut IIRC (it was pre-QLED)? Which one is more HDR-y?

I am totally with you. My TV is a Panasonic ViERA LX47ET5E 47", now 10 years old, and to me it still looks great even compared to newer ones. Definitely not HDR, but for now it is still an excellent TV.

As a monitor I have an iiyama GB3466WQSU 34". It's a VA panel, and according to the specs it is HDR400, but I don't know if it is actually capable of that, and I don't care.

To me HDR is still a buzzword.
 