Question: HDR users, I need your help...

B Tank

Active Member
I need some HDR help; I'm baffled by some aspects of it.

Please note, I have the following TV and HDR capable devices:

Hisense 65M7000 (No HDR sources, the native apps don't do HDR)
Xbox One S (Netflix HDR)
Intel NUC 7i3BNH (Windows 10 HDR, Netflix HDR, YouTube HDR, Kodi HDR)

It's worth noting the TV is set up correctly in terms of HDR switching: it correctly goes to its HDR setting when it detects an HDR signal. Whether the HDR picture settings are optimal, however, is a different matter, and I need advice on calibrating those too.

Xbox One S
With HDR enabled in settings, the console auto-switches between HDR and SDR depending on which app is opened, e.g. an HDR-compatible game or Netflix. When Netflix is opened, it switches to HDR and stays in HDR for as long as the app is open, regardless of what is played; quitting the app reverts to SDR. HDR content looks OK here, but SDR content doesn't look quite right: it looks a bit washed out. You can tell when playing SDR and HDR content back to back in the Netflix app.

Intel NUC
This one is a bigger can of worms. Windows 10 has a global HDR setting: turn it on or off and everything is output as HDR or SDR accordingly. With HDR on, everything Windows-related (background, menus, Start menu, etc.) generally looks poor; the colours just look wrong and a bit faded. HDR content in the Netflix app looks OK, but the rest looks really bad. The device also struggles to play Netflix HDR content smoothly, but that's a separate matter.

Kodi - I have local content that I can play in Kodi with Windows 10 set to HDR. Honestly, it doesn't look good and I can't figure out why: it looks faded, the black levels are terrible, and the contrast looks off. Yet with HDR turned off in Windows, the same media in Kodi looks fantastic.

These are the things I'm experiencing with HDR at the moment, and I'm after some help understanding why they're happening.
 

next010

Distinguished Member
The HDR behavior of Netflix on Xbox and PlayStation is well known; why it still does that, nobody knows. Other Netflix app platforms don't exhibit it, at least that's my understanding of it.

Windows HDR on PC is a mess, plain and simple; it takes a lot of work to get it right and tailor it to your HDR display.

Simple things you can try:
1) At 4K 60Hz, set 10-bit output in the GPU control panel: look for 10-bit (or 12-bit) YCC 4:2:2 or 4:2:0 and select that (fine text rendering will take a hit doing this).

2) Turn on Windows' global HDR and go into its HD Colour settings; look for the HDR/SDR balance slider and drag it down to around a value of 10-15 so Windows better emulates SDR within the HDR color space.

3) Set your TV to HDR cinema mode and make sure its black level is on low/limited.

Be aware that HDR is broken in some web browsers; use MS Edge to play HDR video, I think that one is okay.

See if that makes any kind of difference
 

B Tank

Active Member
I'm using Intel integrated graphics, and the only option I have is YCC on or off, and it's set to on.

That balance slider is interesting; if I play with that and my TV settings, maybe I can get a decent picture, but I have no idea what that slider is actually there for.
 

next010

Distinguished Member
If it's only outputting 8-bit YCC then it's not much use; you really need 10-bit or 12-bit output for HDR.

However, at 4K 60Hz the HDMI 2.0 interface cannot carry 10-bit RGB/4:4:4, so you have to use a lower-bandwidth alternative like 10-bit YCC 4:2:2 or 4:2:0.
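
For reference, here's a rough back-of-the-envelope check of why 10-bit 4:4:4 doesn't fit, assuming the standard CTA-861 4K60 timing (4400 x 2250 total pixels, ~594 MHz pixel clock) and HDMI 2.0's 600 MHz TMDS clock ceiling. These are the commonly published figures, so treat it as an illustration rather than a spec quote:

```python
# Rough HDMI 2.0 bandwidth check for 4K 60Hz.
# Assumes the standard CTA-861 4K60 timing (4400 x 2250 total pixels
# -> 594 MHz pixel clock) and HDMI 2.0's 600 MHz TMDS clock ceiling.

PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6   # 594 MHz for 4K60
TMDS_LIMIT_MHZ = 600                       # HDMI 2.0 maximum TMDS clock

def tmds_clock_mhz(fmt, bit_depth):
    """Effective TMDS clock for a given chroma format and bit depth."""
    if fmt == "444":       # RGB / YCC 4:4:4 scales directly with bit depth
        bpp = 3 * bit_depth
    elif fmt == "422":     # YCC 4:2:2 rides in a fixed 24-bit container
        bpp = 24           # on HDMI, so 10/12-bit costs no extra bandwidth
    elif fmt == "420":     # YCC 4:2:0 halves the chroma samples again
        bpp = 1.5 * bit_depth
    else:
        raise ValueError(fmt)
    return PIXEL_CLOCK_MHZ * bpp / 24

for fmt, depth in [("444", 8), ("444", 10), ("422", 10), ("420", 10)]:
    clk = tmds_clock_mhz(fmt, depth)
    verdict = "fits" if clk <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
    print(f"4K60 {fmt} {depth}-bit: {clk:.0f} MHz ({verdict})")
```

10-bit 4:4:4 works out to roughly 742 MHz, which is over the limit, while 4:2:2 (fixed 24-bit container) and 4:2:0 stay within it.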

To tell what the GPU is outputting, go into Windows -> Settings -> System -> Display -> Advanced display settings and look at the bit depth.

Basically, the slider is meant to compensate for differences in HDR displays and in Windows' SDR emulation, but it doesn't always work out so well.
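
For what it's worth, the commonly reported behaviour (it isn't officially documented, so take it as an approximation) is that the slider sets the SDR reference white level, roughly linearly from about 80 nits at 0 to about 480 nits at 100:

```python
# Approximate mapping of the Windows 10 HDR/SDR balance slider to SDR
# reference white. This is the commonly reported behaviour, not an
# official API, so treat the numbers as a rough guide only.

def sdr_white_nits(slider_value):
    """Approximate SDR white level in nits for a slider position 0-100."""
    return 80 + 4 * slider_value   # ~80 nits at 0, ~480 nits at 100

for s in (0, 10, 15, 50, 100):
    print(f"slider {s:>3} -> ~{sdr_white_nits(s):.0f} nits SDR white")
```

A setting of 10-15 puts SDR white around 120-140 nits, close to the level most SDR content is mastered for, which is why the desktop looks less washed out down there.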
 
