I don't own one, but I have indeed seen a friend's HDR monitor side by side with a normal one (in person, i.e., not viewing it through something else) and wasn't that impressed, to be honest. I got more of a "jump" in "deep blacks" going from TN to AMVA. As the article said, for extra contrast it's features like local dimming (on more expensive monitors that just coincidentally happen to be HDR) that make the most visible difference.
More often than not, honest "HDR" gaming comparisons are extremely difficult because enabling the "HDR" preset in a game often piles a lot of extra artificial filtering on top of the HDR signal itself, precisely to exaggerate the difference. E.g.,
in the F-6.jpg file linked to earlier, the whole "HDR" image looks like the photographic equivalent of taking two pictures with two different cameras, adding a strong neutral-density filter to only one of them, and then pretending the difference is entirely down to the cameras rather than the blatantly obvious extra filtering. This has little to do with "subtle shadow detail": the bridge lights are obviously much darker / have had their bloom removed, which is little different from turning the gamma down on everything.
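To illustrate what I mean by "turning the gamma down": that uniform darkening can be faked on any plain SDR image with a one-line power curve, nothing to do with actual dynamic range. A rough sketch (the filename input.png and the gamma value are just placeholders I picked for illustration):

```python
# Minimal sketch: darken an ordinary SDR image with a plain gamma curve,
# mimicking the "darker = more HDR" look seen in skewed comparison shots.
# "input.png" is a placeholder filename, gamma 1.8 an arbitrary example.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("input.png").convert("RGB")).astype(np.float32) / 255.0

gamma = 1.8  # values > 1.0 pull the midtones down across the whole frame
darkened = np.power(img, gamma)

Image.fromarray((darkened * 255.0).round().astype(np.uint8)).save("darkened.png")
```

The point being: this touches every pixel indiscriminately, which is exactly what those "look at the shadow detail" shots actually show.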
It's very evident in many games that enabling "HDR" in the in-game settings also adds the equivalent of feeding the image through a ReShade / SweetFX filter on top, "recoloring" the scene (as seen in pic F-5's ugly green tint), which is something that can be done on any monitor (your average Fallout 3 "green tint remover" mod demonstrates the same effect perfectly). And this is why I openly laugh at most laughably skewed "HDR vs non-HDR" gaming "comparison" shots. Genuine HDR is so incredibly subtle that half the time you can hardly see it under normal, non-blackout lighting conditions. Hence the need to "fake compare" by adding a load of non-HDR color reshading, gamma / contrast adjustments, etc. on top. Quite often, the reviewer himself is unaware of what else is going on "under the hood" with in-game presets.