Until now, it hasn't made sense to write a guide about the best HDR monitors for gaming, but it feels like we're finally starting to see some momentum with some noteworthy contenders.
"it's a high dynamic range image after all. This means it needs to support a high level of brightness: HDR images are brighter than SDR..."

When you start with a false premise, it taints your conclusions. The "R" in HDR doesn't stand for brightness. Due to how the human eye works, unless you're using a monitor in conditions of high ambient light, the range is far more important than peak brightness. It's only when a monitor's "blacks" are actually grey that you need high brightness to make them appear dark by contrast.
It is not possible to show a picture's high dynamic range without many levels of brightness to support it. On a 250-nit screen displaying an HDR image (HDR-to-SDR tone mapping), many scenes will be blown out or black-crushed, and the overall image will not represent the source image, no matter what the external conditions are. Brightness also increases in small steps, and the source's dynamic range levels can be finer than the display's 'brightness resolution' (a real term).
What you're referring to is an entirely different concept. The "step of brightness", or bit depth, refers to the number of luminosity steps for each color channel. HDR10, for instance, specifies 10 bits, or 1,024 steps.
If the source's dynamic range exceeds the display's brightness resolution, then several different levels of brightness get displayed as one. It is similar to the image banding effect,...
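To illustrate the quantization argument in the two comments above, here is a minimal sketch, assuming made-up step counts (a 10-bit-style source collapsed onto a much coarser display scale); it is not a model of any particular monitor.

```python
# Illustrative quantization: neighbouring source brightness levels collapse
# onto the same display level when the display has fewer steps than the source.
def quantize(levels_in: int, levels_out: int) -> list[int]:
    """Map each of levels_in source codes onto the nearest of levels_out display codes."""
    return [round(code * (levels_out - 1) / (levels_in - 1)) for code in range(levels_in)]

source_codes = 1024    # assumed 10-bit source
display_codes = 256    # assumed, much coarser display response

mapped = quantize(source_codes, display_codes)
print(f"{source_codes} source levels -> {len(set(mapped))} distinct display levels")
# Roughly four neighbouring source levels land on each display level:
# the "different levels of brightness displayed as one" effect described above.
```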
HDR10 indeed specifies 10 bits per color, which gives 1 million steps per pixel.
Furthermore, you have it almost entirely backwards. The degree of brightness variance between any two steps is the total luminosity divided by the step count. A 10-bit 1000-nit monitor has double the variation of a 10-bit 500-nit monitor. The extra brightness is useful for increasing contrast ratio, not for reducing banding.
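A quick sketch of the arithmetic in the reply above, using the same simplified linear model (per-step delta = peak luminance divided by the number of code values); real HDR displays follow a PQ curve rather than equal linear steps, so this only illustrates the ratio argument.

```python
# Simplified linear model: per-step brightness delta is peak luminance
# divided by the number of code values.
def step_size_nits(peak_nits: float, bits: int) -> float:
    return peak_nits / 2 ** bits

for peak in (500, 1000):
    print(f"10-bit, {peak}-nit peak: ~{step_size_nits(peak, 10):.3f} nits per step")
# The 1000-nit panel's steps come out twice as large as the 500-nit panel's,
# which is the "double the variation" point made above.
```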
I'm holding out on a 4K HDR gaming monitor for the reasons you describe here. I am not going to pay $1,000+ for a monitor; it's just too expensive. Eventually, decent 4K HDR high-refresh-rate monitors will be released that are under $600.
It's true, but currently the C1 at 48 inches for $797 is the closest you will get to the ideal experience/quality/price. Unfortunately, we might have to wait a few more years to get to the figure you mentioned.
"Before continuing, please watch this video [link] made by an expert on the topic."

You've misinterpreted that video. He shows a PQ curve and very clearly explains (3:46-4:02 in the clip) how values above the curve cause posterization (banding), whereas values below the curve cause the opposite problem -- loss of contrast detail. He then turns down a 650-nit screen to 400 nits and shows that this forces the screen image to stay below the curve at all times, causing not banding but loss of detail.
I didn't say lower brightness causes banding. I said it causes a reduction in detail, in a similar way to banding (by flattening surrounding pixels in terms of brightness level, while real banding flattens surrounding pixels based on colour depth). And exactly that is visible on a screen with lower maximum brightness, which directly addresses your first post.
In other words, exactly what I said: lower peak brightness means less banding, not more.
Also, as an aside, I should point out the video, although technically accurate, is quite disingenuous. Look at the PQ curve -- he calibrated a 650-nit monitor to display any and all content above 70% brightness at the exact same 400-nit level. Of course that's going to crush detail in bright scenes. But that's not how a monitor is designed -- it should smoothly map brightness levels across its entire dynamic range. He set a monitor to ignore a third of its dynamic range, then compared it to the exact same monitor that was properly calibrated. Is anyone surprised which one won?
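For anyone unsure what "crushing detail above 70% brightness" versus "smoothly mapping the dynamic range" looks like in practice, here is a rough sketch contrasting a hard clip with a gentle highlight roll-off. The knee, peak and roll-off formula are illustrative assumptions, not the calibration used in the video.

```python
# Two illustrative ways a display might map scene luminance (nits) to panel output.
def hard_clip(scene_nits: float, clip_at: float = 400.0) -> float:
    """Everything above clip_at is shown at the same level: highlight detail is crushed."""
    return min(scene_nits, clip_at)

def soft_rolloff(scene_nits: float, knee: float = 300.0, peak: float = 650.0) -> float:
    """Below the knee, track the source; above it, compress highlights toward the panel peak."""
    if scene_nits <= knee:
        return scene_nits
    excess = scene_nits - knee
    headroom = peak - knee
    return knee + headroom * (excess / (excess + headroom))  # asymptotically approaches peak

for nits in (100, 450, 700, 1000):
    print(f"{nits:4d}-nit source -> clip: {hard_clip(nits):6.1f}, roll-off: {soft_rolloff(nits):6.1f}")
# The hard clip maps 450-, 700- and 1000-nit highlights to the same 400-nit output,
# while the roll-off keeps them distinguishable, which is the point made above.
```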
Aiy yai yai, no. If you're talking about luminance, 10 bits is 2^10 = 1,024 steps. If you're talking about color depth, then there are 3 color channels, meaning 30 bits, or 2^30 = 1 billion steps. Not a million.
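The counts in that correction follow directly from powers of two; a quick check:

```python
per_channel = 2 ** 10          # luminance steps for one 10-bit channel
per_pixel = (2 ** 10) ** 3     # 2^30 colours for three 10-bit channels
print(per_channel, per_pixel)  # 1024 1073741824 -- about a billion, not a million
```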
Best Below $1,000: Sony Inzone M9 27" -- Price: $1,227
Anyone see a problem here?
"I don't even know how they can price it like that, given the monitor is still in its preorder phase. It releases later in August, so you can't actually pick one up yet. Sony.com has it priced at $900 with $13 shipping."

Best Buy and B&H Photo have it on pre-order for $899. One wonders why they didn't link to those prices instead of Amazon. Amazon, FYI, only shows one vendor at $1,227 with delivery of 10-20 days, so it's likely the seller has no inventory yet.
We reviewed the Sony Inzone ahead of release and linked to Amazon as it had it listed at the MSRP, but prices can fluctuate there based on the actual reseller. We have now replaced it with a link to B&H, which is showing $898 as of writing.
Sounds good. Thanks.
In terms of pricing, that seems good. Burn-in, however, is still a thing.
The OLED high-refresh-rate gaming experience went from $1,500 to $797 in a matter of 3 years.
"I don't care much about HDR. I just want [a] QD-OLED for daily use - one that won't cost me a kidney."

Well, god did give you two for a reason; why not give up one in a good cause?
Should be a "list of really, really expensive monitors", as expected. I would be looking in the $100 to at most $400 range for a monitor.
Review snobs always gravitate to extremely expensive options, since as reviewers they have access to everything. It's the same issue with car reviews as with electronics or any other reviews; they'll say "only the V8 version of the truck is worth getting" (for an extra $10k).
I mean, I get it, and I know the easy reply: these are HDR monitors, apparently expensive by nature. Somehow I suspect monitors below $1,000 can have some level of HDR support, though. Hell, I remember when HDR TVs were new, they said we needed 10,000 nits to display true HDR; whatever happened to that? Since it seems physically impossible, over the years reviewers have quietly shifted to "1,000 nits is HDR"... In conclusion, I'd probably be thrilled with a DisplayHDR 400 monitor, and that's probably what I'll end up with.
Same with TVs again: there isn't a magic HDR brightness number. It's incremental, not binary. I'm looking at a TV with wide color gamut, FALD, 20,000:1 contrast, and ~500 nits of brightness. Is that HDR? What about my 1,000+ nit TV from 2018, but with only ~6,000:1 contrast w/FALD and less color gamut IIRC (it was pre-QLED)? Which one is more HDR-y?