Well, there's more to it than simply slapping the "HDR" (high dynamic range) tag on it. I've mostly seen "HDR" attached to 4K TVs.
To qualify as "HDR", I'm guessing a monitor would need about 400 nits or more of peak brightness. That's what provides the "whiter whites and blacker blacks" the HDR tag ostensibly refers to.
AFAIK, 4K TVs can be driven @ 1080p while the TV does the upscaling to 2160p. Monitors probably don't upscale as well, so you'll need native 4K content to take advantage of the monitor's full resolution.
A couple of "IMO" points. 4K makes no sense below about 30" or so.
If you plan on serious AAA gaming, you'll need a very high-end video card to get acceptable frame rates.
Monitors for "professional imaging work" have other considerations. Most importantly, a color gamut that approaches 100% of the sRGB and Adobe RGB color spaces. Personally, I would prefer to have the aspect ratio at 16:10 instead of the standard 16:9.
(1080p = 1200p & 1440p (2K) = 1600p @ 16:10)
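For anyone wondering where those 16:10 numbers come from, it's just the aspect-ratio arithmetic: at the same horizontal resolution, a 16:10 panel has height = width × 10/16. A quick sketch (the function name is just for illustration):

```python
def height_16_10(width: int) -> int:
    """Vertical resolution of a 16:10 panel at the given width."""
    return width * 10 // 16

# 1080p is 1920x1080 (16:9); the same 1920 width at 16:10 gives:
print(height_16_10(1920))  # 1200
# 1440p ("2K") is 2560x1440 (16:9); the same 2560 width at 16:10 gives:
print(height_16_10(2560))  # 1600
```

So you keep the same horizontal resolution and just gain extra vertical pixels.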
Besides, "showrooming" poor starving Best Buy well, just isn't very nice...
Fate is pushing me to upgrade sooner, it seems — my main monitor just went up in flames :"(
Never again will I buy a no-name monitor. My Samsung has worked for over 7 years, and this Chinese piece of crap only lasted 2...