Windows 11 users can now adjust Auto HDR on a per-game basis

Daniel Sims

Staff
In brief: One of Windows 11's most attractive features is the ability to add HDR to over 1,000 DirectX 11 and DX12 games that don't officially support it. Now, Microsoft is giving users more control over Auto HDR and making it easier to access.

Microsoft recently announced an update to the Windows 11 Game Bar that adds an Auto HDR Intensity slider, which remembers its setting for each game. The update also makes the on/off toggle easier to reach. The slider comes with Game Bar version 5.721 and above, and users can update the app through the Microsoft Store.

Games with their own HDR implementations usually include adjustment settings. Until now, however, using Auto HDR with other games in Windows 11 only involved an on/off switch. That one-size-fits-all approach isn't equally effective across the more than 1,000 games Auto HDR supports.

Getting to the slider is simple: press Win+G to bring up the Game Bar overlay, then find it under Settings > Gaming Features > Adjust HDR Intensity. Clicking the adjust button brings up the slider in a separate window, which users can pin to the Game Bar. Auto HDR's on/off toggle sits right above the adjust button.

Microsoft also gave Windows Insiders the ability to switch off Auto HDR notifications, acknowledging that some users found them excessive. The switch is under Settings > System > Notifications. Users can also tell Auto HDR to send notifications without playing sounds or showing banners.


 
But to do so I am forced to use the Xbox Game Bar? Everyone I play with has that stupid thing disabled because it just loves popups at inopportune moments. This should be a Windows 11 feature, NOT an Xbox app-specific setting.
 
They're using "features" as a way to force people into their ecosystem so they can collect and sell your data.
 
I'm so confused, how is "auto-HDR" any different from SDR? From what I understand, HDR has two potential benefits:
  1. Wider color gamut. With ten bits per color, instead of eight, colors can be displayed with finer granularity, reducing color banding. This would require content encoded with 10 bits per color to make a difference, meaning "auto-HDR" couldn't affect this.
  2. Brighter highlights. White colors can appear brighter.

But, for the second one, when displaying SDR content on an HDR display, wouldn't the brightness values be "stretched" across the display's capabilities anyway? Wouldn't SDR content with a color value of r:255, g:255, b:255 (the brightest white in 8-bit color) be converted to r:1023, g:1023, b:1023 (the brightest white in 10-bit color) before being sent to the pixel anyway? Otherwise, a value of 255, 255, 255 would appear as a relatively dark grey on a device whose pixels expect 10-bit color values.

So unless "auto-HDR" is using AI technology to fill in color data that wasn't there to begin with, I fail to see how it could do anything different from displaying SDR content as-is. A comparison could be made to upscaling: you can upscale a 480p image as high as you want, but unless you use an AI to "invent" details, it'll never be more detailed than what's possible at 480p.
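To spell out the distinction I'm getting at, here is a toy sketch (made-up nit values and a made-up curve on my part, not anything Microsoft has published): a plain bit-depth stretch leaves the picture identical, while a real SDR-to-HDR expansion has to decide how far to push highlights into the extra brightness range.

```python
# Toy illustration only: assumed nit levels and an invented expansion curve,
# NOT Microsoft's Auto HDR algorithm.

SDR_PEAK_NITS = 100.0    # common reference level for SDR white (assumption)
HDR_PEAK_NITS = 1000.0   # what an HDR1000-class panel can reach (assumption)

def stretch_8_to_10(value_8bit: int) -> int:
    """Plain bit-depth stretch: 255 maps to 1023; the image itself is unchanged."""
    return round(value_8bit * 1023 / 255)

def toy_sdr_to_hdr_nits(value_8bit: int, knee: float = 0.8) -> float:
    """Invented expansion: midtones stay roughly where SDR put them, while the
    top of the range is remapped toward the HDR peak. Purely illustrative."""
    linear = (value_8bit / 255) ** 2.2                    # rough linear light, 0..1
    if linear <= knee:
        return linear * SDR_PEAK_NITS                     # track the SDR curve
    t = (linear - knee) / (1.0 - knee)                    # 0..1 across the highlights
    return knee * SDR_PEAK_NITS + t * (HDR_PEAK_NITS - knee * SDR_PEAK_NITS)

if __name__ == "__main__":
    for v in (0, 64, 128, 200, 255):
        print(f"8-bit {v:3d} -> 10-bit {stretch_8_to_10(v):4d}, "
              f"toy HDR {toy_sdr_to_hdr_nits(v):6.1f} nits")
```

The stretch column is literally the same picture with finer steps; the second column is the kind of decision an expansion has to make about where the extra brightness goes.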

Also, I have yet to witness the alleged massive improvement to image quality that HDR brings. I have a 4K TCL TV that is capable of displaying HDR, but I couldn't actually tell you that HDR content looked obviously better than SDR content on it. So I guess you need an OLED display to see any benefit?

To me, 4K resolution is a much bigger improvement on image quality than HDR.
 
I can agree with just about everything you said. I see HDR in gaming as near pointless; I only use HDR while watching movies. If I did content editing I might feel differently, but my display has full-array local dimming, so just keeping it on the dynamic setting is a perfect middle ground between SDR and HDR.

IMO, HDR was a stop-gap measure as we moved to better display tech. On my display, going from a dark screen to a bright screen with HDR on literally hurts my eyes. It looks beautiful but can be physically painful to look at.
 
Our displays won't be realistic enough until looking at the sun in a movie will literally blind you, just like in real life! I demand absolute realism!
 
That's where it will work, I suppose. Legends of Raymond 2 will be a bright game overall, Dark Souls 3 will be a dark game, and then some other game will have the high dark-to-bright contrast you dislike. Bias lighting behind the monitor and room lighting can lessen eyestrain a lot. Monitors should have an ambient light sensor - some TVs do.
 
Google is your friend!
https://www.pcmag.com/how-to/testin...n from the Xbox,a Microsoft company blog post.
 
I seriously hope this extreme HDR fad goes the way of 3DTVs. I mean, it's cool, but I don't want to have to wear sunglasses to watch TV. I won't tell people what to buy, but HDR1000 is just too much for me.
Well, I use a 65" 4K Samsung TV as a monitor. Samsung says it peaks at 1,500 nits with a full-screen brightness of 900 nits. Basically anything over 1,000 is painful. With HDR on, it ignores my backlight setting. It does have an ambient light sensor.
 
Yeah, you're supposed to turn down the brightness of a display to match its surroundings in order to reduce eye strain, but it sounds like you don't see much benefit with HDR unless you have the brightness all the way up. Easy to see how that could be uncomfortable.
 
I have search engines blocked on my computer. Among other benefits, my productivity has improved.

Anyway, so I guess it's using machine learning to approximate HDR, then. Wouldn't that cause some kind of CPU or GPU hit that could lower frame rates?


Well, on my display the brightness setting doesn't actually do anything aside from playing with the gamma, which messes with the color accuracy. I do have a backlight setting, which I keep at 60%, and it works great in "dynamic" mode, but turning on HDR completely ignores the backlight setting. And the HDR setting pushes brightness past 100% for short bursts.

It's an absolutely amazing display, but I really don't understand what Samsung was thinking with the controls. HDR is cool if I want to impress people with how good it looks when watching a movie, but for the time I spend in front of it, the HDR setting becomes uncomfortable.
 
Right, sorry, by "brightness" I meant turning down the backlight level.

So if you can't turn down the backlight when using HDR, presumably because you otherwise can't see the benefits as much, then I guess you have to be happy with eye strain if you want to use it.
 
Not for me. I have been using my PC with my 55-inch 4K TV since November, and it's been there since then.
I don't think you're understanding correctly.
You already said as your defense that the slider was always there.
That's irrelevant, so I don't know where you think you're going.
 
4K vs HDR is not a valid comparison of image quality.

Trying to evaluate HDR on a TCL product is probably a poor choice. I'm sure an OLED or higher-tier Samsung device would show a noticeable difference. Even on an HDR400 monitor from ASUS I can see a noticeable difference in RE Village. My Roku leveraging Dolby Vision also shows a significant difference on a Vizio P-Series vs SDR content (same thing as HDR, different spec/standard).

 
I see so many negative comments on HDR in here. One thing I wonder about: do you even have a proper HDR display, like HDR10 (HDR1000) / HDR10+?

Because I have a proper 35" computer monitor with HDR10 (HDR1000), and I notice a vast difference in HDR-supported content vs monitors that don't have HDR or only have HDR400/600. This panel has F.A.L.D. as well, which a lot of these 'trash' HDR400/600 screens don't have.

I owned a 35" 3440x1440 non-HDR monitor before this, and the new one wins in literally every aspect.

One of the first things I tried was opening up games that support HDR, such as Red Dead Redemption 2, and the game looked absolutely beautiful. I waited until the morning for the sun to rise and had to stop looking at it directly because the sun was almost blinding. The world felt way more alive than it did on my previous monitor.

Anything below HDR10 (1000) is you buying an inferior product with a stamp on it so you can brag to your friends that you own an HDR screen. In reality, HDR400/600 only has a higher peak brightness (luminance) and is capable of accepting HDR signals, but it doesn't have improved colors or contrast, resulting more often than not in washed-out images while using HDR.

So before you make negative comments about HDR or even compare anything, make sure you have a proper HDR display first, otherwise your argument is pretty much moot.
 
The reason I said that is because in the past I've seen multiple people say that HDR does more for image quality than increasing the resolution to 4K. In other words, that 1080p HDR looks better than 4K SDR. I disagree, based on my experience, but I haven't experienced HDR on the most expensive displays out there, so my experience is limited.

OLED is much, much more expensive than LCD displays, and thus unobtainable for me and I suspect many others.
 
You got it. 10-bit color adds more steps within a color range, not a higher color value/brightness.

On the PC you can set the color depth as you like, 8, 10, or even 12-bit, but you need a display that can do it and an application that can as well. If the game engine is older and running at 8-bit, that is all you will get. I have not seen a display or app that can do 12-bit, and I don't think you could tell anyway. 10-bit gets rid of banding in the few scenes that had it, so I'm not sure what 12-bit would do for you.
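To put rough numbers on the "more steps" point (back-of-the-envelope math on my part, assuming the same brightness range is simply divided evenly into more code values):

```python
# Back-of-the-envelope only: step size if an assumed 0-100 nit SDR range is
# divided evenly across all code values (linear spacing, ignoring gamma).
SDR_RANGE_NITS = 100.0  # assumed reference range, purely for illustration

for bits in (8, 10, 12):
    steps = 2 ** bits - 1
    print(f"{bits}-bit: {steps} steps, ~{SDR_RANGE_NITS / steps:.3f} nits per step")
```

Finer steps are what smooth out banding in gradients; none of this by itself makes the picture any brighter.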

There is also additional data in an HDR signal that goes along with what the HDR-enabled display can do when in HDR mode.

I felt the same way about HDR until...

I have an LG C9 77" and it's nice for games, with good blacks and color and 4K 120Hz, but the HDR is not really noticeable. You get the HDR or Dolby Vision logo at the top right and just have to trust that a few scenes in a few shows will have a little more detail. I just got a ViewSonic XG321UG and now I see it. In games like God of War the HDR is really nice. I love my LG OLED, but it's like it doesn't even have HDR. The ViewSonic can do up to 1,400 nits, over a bigger part of the screen, and for longer. The LG C9, and even the newest C2 (CXII, C12???) that just came out, still maxes out at around 800 nits, and only in something like a 2% window. For most displays HDR just looks different and washed out. SDR Blu-rays seem more saturated than the UHD movies with HDR.


 