FreeSync 2 in Action: How good is HDR gaming right now?

"SDR" actually looks better in half those shots, LOL:-
https://static.techspot.com/articles-info/1633/images/F-5.jpg
https://static.techspot.com/articles-info/1633/images/F-6.jpg
https://static.techspot.com/articles-info/1633/images/F-9.jpg

Personally, I'd rather they put the effort into coming out with better, sensibly priced OLED monitors than continue to churn out gimmicks that are mostly wasted on backlit TN / MVA / IPS stuff with the same old compromises.

Well, first, it's impossible to view HDR content accurately without an HDR monitor, and most likely you do not have one. Second, those images are 8-bit with a limited color space; they could not display HDR colors, period. Not that it matters either way, as judging the quality of another screen through your own screen has its inherent flaws and is clearly going to be limited by whatever you are viewing it on now.

What you should notice is that the blacks look very black and certain scenes appear more colorful. On an HDR monitor you would still be able to see the detail in the dark and colored areas that these pictures do not show. That's the advantage of increasing the contrast ratio and the range of displayable colors.
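To put rough numbers on the "8-bit can't show HDR" point, here is a minimal sketch of my own (the article gives no such figures, and the ~100-nit mapping for sRGB white is a common convention, not a spec of this monitor): an 8-bit sRGB screenshot tops out around reference white, while a 10-bit PQ (SMPTE ST 2084) HDR signal encodes absolute luminance all the way up to 10,000 nits.

```python
# Rough sketch, my own numbers: compare the peak luminance an 8-bit sRGB
# screenshot can encode with what a 10-bit PQ (SMPTE ST 2084) signal can encode.

def srgb_to_nits(code, peak_nits=100.0):
    """Decode an 8-bit sRGB value to luminance, assuming the common
    convention that sRGB white sits at roughly 100 nits."""
    v = code / 255.0
    linear = v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4
    return linear * peak_nits

def pq_to_nits(code, bits=10):
    """Decode a PQ (ST 2084) code value to absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code / (2 ** bits - 1)
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

if __name__ == "__main__":
    print(f"8-bit sRGB peak : {srgb_to_nits(255):7.1f} nits")   # ~100 nits
    print(f"10-bit PQ peak  : {pq_to_nits(1023):7.1f} nits")    # 10000 nits
    print(f"10-bit PQ mid   : {pq_to_nits(512):7.1f} nits")     # ~97 nits
```

Note that roughly half of the 10-bit PQ code range is spent below ~100 nits, which is why an 8-bit SDR screenshot simply has nowhere to put the extra highlight and shadow information.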
 
No interest in this display as it's 30Hz; otherwise you would have mentioned the refresh rate.

Not interested in AMD-compatible displays, as it's impossible to get a good AMD graphics card for a good price, and I can't see them doing better in the future. Good to know HDR is still pointless; I'll wait for better standards and support. G-Sync HDR displays seem to have stricter requirements, so I'd rather pay the extra to get an actual HDR display, not just something trying to emulate one.
 
I don't own one, but I have indeed seen a friend's HDR monitor side by side with a normal monitor (in person, i.e. not viewing it through another screen) and wasn't that impressed, to be honest. I got more of a "jump" in "deep blacks" going from TN to AMVA. As the article said, for extra contrast it's things like local dimming (on more expensive monitors that just coincidentally happen to be HDR) that make the most visible difference.

More often than not, honest "HDR" gaming comparisons are extremely difficult because enabling "HDR" presets in games often adds a lot of extra artificial filtering on top of the HDR itself to exaggerate the difference. E.g., in the F-6.jpg file linked earlier, the whole "HDR" image looks like the photographic equivalent of taking two pictures with different cameras, adding a strong neutral density filter to only one of them, and then pretending the difference is entirely down to the cameras and not the blatantly obvious additional filtering. This has little to do with "subtle shadow detail": it's obvious the bridge lights are much darker / have had their bloom removed, which is little different from turning the gamma down on everything.

It's very evident in many games that enabling "HDR" in the in-game settings also adds the equivalent of feeding the image through a Reshade / SweetFX filter on top, "recoloring" the scene (as seen in pic F-5's ugly green tint). That is something that can be done on any monitor (your average Fallout 3 "green tint remover" mod showcases the same effect perfectly). And this is why I openly laugh at most laughably skewed "HDR vs non-HDR" gaming 'comparison' shots. Genuine HDR is so incredibly subtle that half the time you can hardly see it under normal, non-blackout lighting conditions. Hence the need to "fake" the comparison by adding a load of non-HDR color reshading, gamma / contrast adjustments, etc. on top. Quite often, the reviewer himself is unaware of what else is going on "under the hood" with the in-game presets.
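Just to show how trivially that kind of "recoloring" can be reproduced on any SDR monitor, here is a throwaway sketch; the gamma and tint numbers are invented for illustration, not measured from any game or from the article's screenshots.

```python
# Toy illustration, not from the article: the "recoloring" described above
# (a gamma drop plus a green tint) is just a per-pixel SDR operation, in the
# spirit of a Reshade / SweetFX post-process -- nothing that needs an HDR panel.

def fake_hdr_look(pixel, gamma=1.3, tint=(0.95, 1.10, 0.95)):
    """Apply a crude gamma + color-tint 'filter' to one 8-bit RGB pixel.
    gamma > 1 darkens midtones; the tint pushes the balance toward green."""
    out = []
    for channel, t in zip(pixel, tint):
        v = (channel / 255.0) ** gamma        # darken ("turn the gamma down")
        v = min(max(v * t, 0.0), 1.0)         # shift the colour balance
        out.append(round(v * 255))
    return tuple(out)

if __name__ == "__main__":
    # A mid-grey "bridge light" pixel comes out darker and slightly green.
    print(fake_hdr_look((180, 180, 180)))     # -> (154, 178, 154)
```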
 
Still haven't seen any VA panels that are not quite a bit slower than similarly specced IPS panels. They are often a good 10% slower overall, but dark transitions on VA can be 50% or more behind IPS. I doubt this panel is any different.

Gimme 200Hz IPS and I wouldn't care about all the extras like FreeSync and G-Sync; they become unnecessary, since in practice tearing is already not an issue on monitors refreshing 144 times per second (see the quick numbers below). At 200Hz they might as well stop making frame-sync hardware such as the proprietary G-Sync module altogether. Since FreeSync is basically free they could still use it, but I don't see the need for it.

EDIT: and that 1080p height... 3440x1440 is better.
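For what it's worth, a quick back-of-the-envelope sketch (my own numbers, not from the article) of why tearing gets harder to notice at higher refresh rates: a torn frame only survives until the next refresh, so the worst-case visibility window shrinks as the refresh rate goes up.

```python
# Worst case, a tear line persists for one refresh interval before the
# next scan-out replaces it. Purely illustrative arithmetic.

for hz in (60, 144, 200):
    interval_ms = 1000.0 / hz
    print(f"{hz:>3} Hz refresh -> tear visible for at most {interval_ms:.1f} ms")
# 60 Hz ~16.7 ms, 144 Hz ~6.9 ms, 200 Hz ~5.0 ms
```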
 

Sounds like the early days of HD, to be honest, and the point about in-game "HDR" presets layering extra filtering on top is a very good one. HDR will take time to mature, but it is very much a step in the right direction.
 
"SDR" actually looks better in half those shots, LOL:-
https://static.techspot.com/articles-info/1633/images/F-5.jpg
https://static.techspot.com/articles-info/1633/images/F-6.jpg
https://static.techspot.com/articles-info/1633/images/F-9.jpg

Personally, I'd rather they put the effort into coming out with better, sensibly priced OLED monitors than continue to churn out gimmicks that are mostly wasted on backlit TN / MVA / IPS stuff with the same old compromises.
Exactly; "SDR" really does look better in half of those shots. Most of the HDR I've seen, on my 65" HDR TV, looks fake and unrealistic.
 