A Fair Warning: Avoid most so-called HDR monitors

Scorpus

Why it matters: One of the most annoying aspects of the HDR ecosystem, especially for computer monitors, is the number of products that claim to be "HDR capable" yet don't have the hardware to support HDR properly. We think it's important to arm you with the knowledge to identify a fake HDR product, and to explain why these products are bad and not worth spending money on.

For HDR to provide a genuine image quality improvement over SDR, the hardware needs to be capable of displaying the majority of the HDR signal's huge range – it's a high dynamic range image after all. This means it needs to support a high level of brightness: HDR images are brighter than SDR, often reaching over 1000 nits. It needs to support very high contrast, so that high brightness elements can be displayed alongside deep, rich shadow detail on screen simultaneously. This is crucial, as it gives HDR imagery most of its richness and pop. It needs to support a wide color gamut, allowing for a greater range of colors to be displayed. And it needs to follow HDR encoding systems, like the use of the PQ gamma curve and minimum 10-bit processing.
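To make that last requirement concrete, here is a minimal sketch of the SMPTE ST 2084 (PQ) transfer function that HDR10 signals use, mapping a normalized 10-bit code value to absolute luminance in nits. The constants come from the PQ specification; the sample code values below are just illustrative.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10.
# It maps a normalized signal value (0.0-1.0) to absolute luminance in nits.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal (0..1) to luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# A few 10-bit code values (0..1023); 1023 hits the 10,000-nit PQ ceiling.
for code in (0, 512, 769, 1023):
    print(f"10-bit code {code:4d} -> {pq_eotf(code / 1023):8.1f} nits")
```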

True HDR monitors will target all four areas and deliver a significant improvement over SDR displays.

But often we see fake HDR monitors, which try to fool buyers into thinking they're getting an HDR experience when only a few (or sometimes none) of the key areas are accounted for. Monitor manufacturers are so lazy and deceptive that sometimes "HDR support" extends only as far as accepting HDR10 signal inputs and adjusting gamma, without any of the extended brightness, contrast, or color gamut required to display that signal properly.

This has been exacerbated by standards bodies that are doing a bad job of highlighting to consumers which monitors actually support good quality HDR. One of these standards is DisplayHDR, which is so poorly designed that monitors we would class as "fake HDR" can easily be certified at the lowest tier, DisplayHDR 400. This only serves to benefit display makers, who can market their products as "HDR certified" with the backing of a third party – not consumers who want to find the best HDR products.

How can you tell if a monitor is fake HDR or not?

Our advice is simple: based on what is currently available, you should assume that an HDR monitor is fake, trash tier HDR unless proven otherwise. The vast majority of displays advertising HDR support these days, we would say in excess of 90%, are awful HDR products that you absolutely should not buy for their HDR capabilities.

The DisplayHDR certification system is not trustworthy enough to give you real insights into HDR performance, as we've seen products rated as high as DisplayHDR 1000 that we wouldn't class as true HDR.

You should especially ignore DisplayHDR 400 products. We don't think we've ever seen a good one for HDR content consumption, but ignoring DisplayHDR entirely is a good idea, too.

Beyond this, we'd strongly recommend reading reviews to learn whether a product is a real HDR monitor or not. But if there are no reviews, there are a few things to look out for.

Most OLED displays will be capable of true HDR performance, so OLED is usually a good sign. Also look out for full array local dimming LCDs, often advertised with mini-LED backlights – but make sure you check the zone count. If the zone count isn't advertised, you should be suspicious of that display's performance. If it is advertised, a number in excess of 500 zones is usually pretty good, with around 100 zones the bare minimum. Also look out for brightness specifications: you'll want to see peak brightness in the 600+ nit range.
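As a rough illustration of those rules of thumb, here is a hypothetical helper that scores a spec sheet the way this guide suggests. The function name, its fields, and the exact cut-offs simply restate the thresholds above; it's a sketch, not a formal test.

```python
# Hypothetical sanity check for an HDR monitor spec sheet, encoding the
# rules of thumb from this guide (names, fields, and cut-offs are a sketch).

def looks_like_real_hdr(panel_type: str,
                        dimming_zones: int | None,
                        peak_nits: int | None) -> str:
    """Return a rough verdict: 'promising', 'borderline', or 'suspect'."""
    if peak_nits is None or peak_nits < 600:
        return "suspect"       # want peak brightness in the 600+ nit range
    if panel_type.lower() == "oled":
        return "promising"     # per-pixel dimming: usually a good sign
    if dimming_zones is None:
        return "suspect"       # zone count not advertised: be suspicious
    if dimming_zones >= 500:
        return "promising"     # healthy full array / mini-LED zone count
    if dimming_zones >= 100:
        return "borderline"    # around the bare minimum for local dimming
    return "suspect"           # likely edge-lit or token 'local dimming'

print(looks_like_real_hdr("IPS", None, 400))    # suspect
print(looks_like_real_hdr("IPS", 1152, 1000))   # promising
print(looks_like_real_hdr("OLED", None, 800))   # promising
```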

Merely advertising "dimming" or "local dimming" isn't enough though. You'll want to specifically see terms like "full array" or a high zone count in the spec sheet. This is because some monitor makers like to include edge lit dimming, which can provide "local dimming," but usually with only a handful of massive zones.

Edge lit dimming is not sufficient for a good or true HDR experience as it doesn't allow for acceptable levels of local contrast, and frankly looks bad compared to real HDR. We wouldn't want you to buy an edge lit dimmed HDR panel and be disappointed, so this is our warning.
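To put "a handful of massive zones" into perspective, here is some back-of-the-envelope arithmetic assuming an illustrative 27-inch 16:9 panel, an edge-lit design with 8 column zones, and a 512-zone mini-LED grid; all of those specific numbers are assumptions chosen only to show the scale difference.

```python
# Back-of-the-envelope zone sizes for an illustrative 27" 16:9 panel.
# The zone counts (8 edge-lit column zones vs. a 512-zone mini-LED grid)
# are assumptions chosen only to show the scale difference.
import math

diag_in = 27.0
width_in = diag_in * 16 / math.hypot(16, 9)    # ~23.5 in
height_in = diag_in * 9 / math.hypot(16, 9)    # ~13.2 in
panel_area = width_in * height_in              # ~311 sq in

for name, zones in (("edge-lit, 8 column zones", 8),
                    ("mini-LED FALD, 512 zones", 512)):
    print(f"{name}: ~{panel_area / zones:.1f} sq in per zone")
```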

Luckily, there are some true HDR products that we believe are worth buying. See our full guide for the best HDR gaming monitors.


 
I have a 65" Samsung with HDR and 384 dimming zones. While it was a $5000 TV in 2017 we no longer live in 2017. Also, I don't know if HDR was truly established back then. But my advice to anyone, CHECK THE DIMMING ZONES. This TV was the best of the best back in 2017 but it has MAJOR glow issues. It doesn't matter if the HDR is great or not, glow around dimming zones will annoy the hell out of you. Still trying to get a few more years out of this thing because it's only worth about $500 now
 
I've been warning everyone who asks for my advice on HDR displays for years now. Almost no real HDR monitors exist below the £1000 mark.

A few exist today but it's literally like, three models and they're all close to a grand anyway.

It's always amazed me when I see people stick up for their HDR400 display saying it looks way better in HDR. It categorically doesn't. All you're observing is how bad your last monitor was if that looks good to you.
 
I have avoided those simply because I never found a means or method to verify the claims... for one of the few times, I got it right the first time around.
 
I don't give any fs about HDR. If you create HDR content then yes, it has merits. Otherwise, all HDR is a PR gimmick.

I have HDR400, but I never bought it (a ProArt 4K) for the HDR – I bought it for 100% DCI-P3 and Rec.709. It has an incredible feature... it has buttons on the front face. I know! Such novelty. How come nobody has thought of this since 2010? Stupid joysticks behind the display... I can press a button to switch the color profile and press it again to pick another one – it's glorious! ;)


 
These days, 4K, 5K, and 8K displays should support HDR out of the box – try it out on YouTube with a 4K/5K/8K HDR test video. Remember, you can connect to an HDR screen even if HDR isn't properly supported; it will work with older GPUs and CPUs, just not very well. 2017 was the start of it, so many bad monitors were released around that time – beta testing at a high price. Now, in August 2022, are you getting full HDR, or just fake "HDR ready"?

 
I have to assume demand for HDR is low and/or the tech is still too expensive to go mainstream. HDR games are primarily single player and generally have little replay value, so maybe that's the problem? Maybe the implementation in some games isn't great even with proper HDR hardware? Most new gaming monitors focus on high refresh and pixel response and connecting to consoles so maybe that's where the money is right now?

Personally I'd rather have a crisp, mature OLED with proper brightness, 1ms response, and zero risk of burn-in than proper HDR, especially in its current state, but that won't be for another few years since I recently upgraded to a QD IPS, and I won't pay $800+ for a gaming monitor.
 
I have an LG HDR400 gaming monitor; I turned HDR on only once and never again. I'll wait at least two years to upgrade my monitor, but HDR is definitely not a feature I'm gonna look for.
 
Good advice, Techspot! The standards authority's behaviour is borderline criminal, and it again shows how our consumer laws are mostly a total joke. I'm amazed that in America, of all places, a class action hasn't been launched.
 
Even the DisplayHDR 1000 rated LG 32GQ950 is terrible. It's an edge-lit display (1.5D) at $1,399! It has great color range, though. However, most PC monitors don't have HDR tone mapping either (the Sony Inzone is one exception), so displaying SDR content in HDR often looks terrible – like the Windows desktop when HDR is turned on. PC HDR monitors will display it with absolutely zero adjustment in brightness, contrast, and saturation, so it looks flat, dark, and just plain terrible. Nvidia and AMD offer driver-level adjustments to fix it, as PC monitors often lock out adjustments on the panel itself (baffling!).

For AMD GPUs, I needed +200 contrast, -10 brightness, and +15 saturation on a Dell 3223Q (with the Windows SDR/HDR slider turned all the way down to fine-tune it), and +165 contrast and +10 saturation on the LG 32GQ950 to get the desktop to look decent when HDR is turned on. This is annoying when playing HDR games, as these adjustments are global, so each HDR game needed its own color profile with no adjustments, or else everything was blown out.

Meanwhile, an LG C1 OLED TV displays the Windows desktop correctly when HDR is turned on, though it does need some manual brightness reduction via the Windows SDR/HDR brightness slider to reduce white blowout on the desktop when contrast is maxed out in the TV settings.
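For what it's worth, the fix being described here boils down to mapping SDR content to a chosen "paper white" luminance before re-encoding it with PQ, which is roughly what the Windows SDR/HDR brightness slider adjusts. Below is a hedged sketch of that idea; the 200-nit paper-white default and the function names are assumptions for illustration, not any vendor's actual pipeline.

```python
# Rough sketch of mapping SDR content into an HDR (PQ) signal:
# decode sRGB, scale to a chosen paper-white luminance, re-encode with PQ.
# The 200-nit paper white is an arbitrary example, not a vendor default.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def srgb_to_linear(v: float) -> float:
    """Decode an sRGB value (0..1) to linear light (0..1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def pq_inverse_eotf(nits: float) -> float:
    """Encode an absolute luminance (cd/m^2) as a normalized PQ signal."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def sdr_to_pq(srgb_value: float, paper_white_nits: float = 200.0) -> float:
    """Map an SDR sRGB value (0..1) to a PQ signal at the given paper white."""
    return pq_inverse_eotf(srgb_to_linear(srgb_value) * paper_white_nits)

# SDR white at 200 nits lands well below the PQ maximum of 10,000 nits;
# raising the paper-white setting shifts the whole desktop up the curve.
print(f"SDR white -> PQ {sdr_to_pq(1.0):.3f}")
print(f"SDR 50%   -> PQ {sdr_to_pq(0.5):.3f}")
```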

Where OLED fails vs. LED is in sustained brightness. OLEDs have aggressive auto brightness limiting, while LEDs, with active cooling, can sustain very high brightness even at large window sizes. The LG 32GQ950 did sear my retinas quite a bit.
 
I'm assuming the author is correct that true HDR requires 10 bit color depth.

Here's the catch: Nvidia "Game Ready" drivers only support 8-bit color (16.7 million colors). While many panels advertise 10-bit color depth ("1 billion colors"), you need to install Nvidia "Studio" drivers (5xx.) to take advantage of it. Said drivers can only be used with a GTX 1050 or above. (Yes, I realize that is a low bar these days.)
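For reference, the "16.7 million" and "1 billion" figures are just the per-channel levels cubed across three color channels; a quick check:

```python
# Quick check of the color counts quoted above:
# three channels at 8 or 10 bits per channel.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel -> {levels ** 3:,} colors")
```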
Here are a couple of links describing how many colors humans are capable of seeing:

So, I would argue that true HDR is more heavily dependent on absolute luminosity than anything else. From a photographic perspective, each F-stop cuts the brightness in half. Since full sun is about 16,000 foot-candles, some quick figurin' shows that a monitor with 300 FC yielding a "dynamic contrast ratio" of "3 million to 1", or other such nonsense, isn't truly possible.
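For reference, here is the stop arithmetic that reasoning leans on: one F-stop is a factor of two, so a luminance ratio converts to stops as log2(ratio). The 16,000 and 300 figures are the ones quoted above.

```python
# The stop arithmetic the comment leans on: one F-stop = a factor of two,
# so a ratio of luminances corresponds to log2(ratio) stops.
import math

full_sun_fc = 16_000          # figure quoted above, in foot-candles
monitor_fc = 300              # figure quoted above
claimed_contrast = 3_000_000  # the "3 million to 1" dynamic contrast claim

print(f"Full sun vs. 300 FC monitor: {math.log2(full_sun_fc / monitor_fc):.1f} stops")
print(f"Claimed contrast ratio:      {math.log2(claimed_contrast):.1f} stops")
print(f"Implied black level:         {monitor_fc / claimed_contrast:.6f} FC")
```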

So, FWIW, while the human eye's "F-stop" (the iris) opens up in low light, it may not be possible for human eyesight to distinguish the very lowest light levels that a true HDR monitor would be capable of producing. This would manifest as a perceived "loss of shadow detail": what the monitor thinks is "very dark gray" would appear as black to the viewer. (I realize I'm splitting hairs here.)

The good news is, your house cat has 10 times the low light vision compared to humans. Thus, you can take your cat to your home theater with you, to tell you what you missed. :rolleyes:
 
I have a 65" Samsung with HDR and 384 dimming zones. While it was a $5000 TV in 2017 we no longer live in 2017. Also, I don't know if HDR was truly established back then. But my advice to anyone, CHECK THE DIMMING ZONES. This TV was the best of the best back in 2017 but it has MAJOR glow issues. It doesn't matter if the HDR is great or not, glow around dimming zones will annoy the hell out of you. Still trying to get a few more years out of this thing because it's only worth about $500 now
I also have 65" Samsung and before playing any movie I'm checking if it's HDR to avoid awful picture. When it's HDR I'm playing it through PC so it can't send HDR signal to tv :p
My TV is professionally calibrated and has awesome picture but not in HDR.
 
Yeah, especially that flickering AW3423DW QD-OLED. It's just a lesser HDR 400 monitor. ABL kicks the crap out of it. I have never been this disappointed in a monitor that is both dim and flickering at the same time.
 
The title is rather misleading, as you are not supposed to avoid a monitor just because of this; an HDR400 monitor can still be an excellent SDR monitor... just don't use it in HDR, don't buy it for HDR, and be aware and informed of what you are buying.
 