LG OLED C7 - 8bit with dithering vs 10bit

airmark

Sorry if this has been discussed before; I did extensive searching but couldn't find a conclusive answer.
I have an LG OLED C7 connected to an HTPC with an Nvidia GTX 1050 Ti. I would expect the output to be 10-bit instead of 8-bit with dithering (see below). When I play around in the Nvidia control panel, only 8-bit is available if I select 60 Hz. If I lower to 30 Hz, then 8-bit and 12-bit are available, but not 10-bit, which I believe would match the TV.
Could it be that the HDMI cable can't provide enough bandwidth? I use an Amazon Basics Premium High Speed (18Gbps) cable, which according to its specs should be enough for 10-bit HDR at UHD 60Hz, but maybe it's borderline and can't cope? Should I get a 48Gbps cable?
Or does it not matter, and I won't see any improvement in picture quality from 10-bit vs 8-bit with dithering?
Thanks in advance.
[Attached screenshot: Windows display settings reporting the bit depth as 8-bit with dithering]
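For reference, here is the rough bandwidth arithmetic behind the question - a minimal Python sketch, assuming the standard CTA-861 4K60 timing (4400 x 2250 total pixels including blanking) and HDMI 2.0's 8b/10b TMDS encoding overhead:

```python
# Rough check: does 4K60 RGB fit into HDMI 2.0 at 8-bit and 10-bit?
# Assumes CTA-861 timing for 3840x2160@60: 4400 x 2250 total pixels incl. blanking.
H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 60
PIXEL_CLOCK = H_TOTAL * V_TOTAL * REFRESH        # 594 MHz

# HDMI 2.0 is 18 Gbps on the wire, but TMDS 8b/10b coding leaves ~14.4 Gbps for video.
HDMI20_DATA_RATE = 18e9 * 8 / 10

for bits_per_component in (8, 10, 12):
    bpp = bits_per_component * 3                 # RGB / YCbCr 4:4:4
    required = PIXEL_CLOCK * bpp
    verdict = "fits" if required <= HDMI20_DATA_RATE else "does NOT fit"
    print(f"4K60 RGB {bits_per_component:2d}-bit: {required / 1e9:5.2f} Gbps -> {verdict}")

# 8-bit needs ~14.26 Gbps (just fits); 10-bit needs ~17.82 Gbps, so no cable can help:
# the HDMI 2.0 spec itself is the limit, not the 18 Gbps Premium cable.
```

On those numbers a 48Gbps cable on its own would change nothing, because both ends of the link are still HDMI 2.0.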
 
It's more likely that the 10-series GPUs could only do 8-bit or 12-bit.

Newer Nvidia GPUs do 8, 10 or 12.

Edit - Though that may relate to encoding and decoding, not output...

Try the other cable and see.
 
Yes, older Nvidia GPUs cannot do 10-bit over HDMI (only via DisplayPort).

Also, to do 12-bit at 4K 60Hz you must change from RGB to YCC 422. This can only be done in the Nvidia control panel: under the resolution settings, de-select 'Use default colour settings', then change the output colour format and set the bit depth to 12-bit.

Using chroma subsampling (422) will degrade text rendering quality on the desktop, but if this is only for movies/games in HDR then it's fine.
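The reason 4:2:2 squeezes 12-bit through is that subsampling the chroma brings the bits per pixel back down. A rough sketch of the counting (just an illustration, nothing Nvidia-specific):

```python
# Effective bits per pixel for different chroma formats.
# 4:2:2 shares one Cb and one Cr sample between every two pixels;
# 4:2:0 shares them between every four pixels.
def bits_per_pixel(bits_per_component: int, chroma: str) -> float:
    shared_chroma = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[chroma]
    return bits_per_component * (1.0 + shared_chroma)   # one luma sample + shared chroma

for chroma, depth in [("4:4:4", 8), ("4:4:4", 10), ("4:2:2", 12), ("4:2:0", 12)]:
    print(f"{depth:2d}-bit {chroma}: {bits_per_pixel(depth, chroma):4.1f} bits/pixel")

# 12-bit 4:2:2 works out to 24 bits/pixel, the same as 8-bit RGB, which is why it
# fits into the same 4K60 HDMI 2.0 bandwidth. (HDMI carries 4:2:2 in a 24-bit
# container, so 8/10/12-bit 4:2:2 all cost the same on the link.)
```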
 
Thank you. I don't see the benefit of 12-bit as the TV doesn't support it, but 10-bit being supported via DisplayPort sounds promising, so I will explore that.
 
If you have a C8 or earlier and want 4K/60 444, then 8-bit is the limit. On the C8 I set the PC label rather than game mode to get proper 444 and control over colour gamut with lower latency.

Windows enables dithering with HDR to try to offset the limitations of 8-bit depth, and generally it helps, but with HDR, banding/contouring can still be noticeable on dark or fog-type scenes.
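To see what that dithering buys, here is a toy numpy sketch (my own illustration, not how Windows actually implements it): it quantises a slow 10-bit gradient to 8 bits with and without a little noise added first, then compares how well the local average tracks the original.

```python
import numpy as np

rng = np.random.default_rng(0)

# A slow 10-bit gradient: 3840 pixels spanning only 16 ten-bit codes,
# the sort of dark/foggy ramp where banding shows up.
ramp10 = np.linspace(512.0, 528.0, 3840)

plain8 = np.round(ramp10 / 4).astype(int)                                      # straight 8-bit
dither8 = np.round(ramp10 / 4 + rng.uniform(-0.5, 0.5, ramp10.shape)).astype(int)  # 8-bit + dither

print("distinct 8-bit levels, plain   :", len(np.unique(plain8)))    # a handful of hard bands
print("distinct 8-bit levels, dithered:", len(np.unique(dither8)))   # same levels, interleaved

# Compare local averages, which is roughly what the eye sees on a smooth gradient.
win = 64
kernel = np.ones(win) / win
target = np.convolve(ramp10, kernel, mode="valid")
for name, img in (("plain", plain8), ("dithered", dither8)):
    local_mean = np.convolve(img * 4.0, kernel, mode="valid")
    err = np.abs(local_mean - target).mean()
    print(f"{name:8s} 8-bit local-mean error vs the 10-bit ramp: {err:.2f} codes")
```

The dithered version tracks the 10-bit ramp much more closely on average, which is why 8-bit plus dithering can get surprisingly close to true 10-bit, even though the hard steps never fully disappear pixel by pixel.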

Enabling HDR should only be done for HDR video or gaming, as the OLED light setting is very much higher and there's no benefit for normal PC use.

Upgrading to a TV with HDMI 2.1 48Gb/s would allow 4K/120 12-bit - in general it's worth processing at the highest supported bit depth even if the panel is only 10-bit. How much 10/12-bit actually improves PQ is not easy to determine, though.
 
Thank you. I don't see the benefit of 12-bit as the TV doesn't support it, but 10-bit being supported via DisplayPort sounds promising, so I will explore that.
Of course, the TV doesn't have a DP input; will a DP to HDMI cable suffice?
 
What about 4K/60 RGB, can that go to 10-bit?
RGB is what I have mine set to and what Windows Settings (like the screenshot in your post above) and the Nvidia control panel show, and that's what I mean by 444 - I don't have it set to YCbCr444.

It's a limitation of HDMI 2.0 - if you search, there are plenty of tables showing the various combinations, but many don't acknowledge that HDR is possible at 8-bit.

If your framerate is lower than 60 then 444 at full bit depth is possible. Otherwise, if you need a higher bit depth, then a lower chroma is fine for games, which is what consoles do.

For video playback with a lower chroma I've found the PC label on my LG is an issue, as it actually limits to 8-bit, so you have to turn off the PC label if you're not using 444.

Dolby Vision playback is OK as it tunnels in 8-bit RGB anyway to give 12-bit HDR.
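The bit-counting behind that tunnel works out exactly, as I understand it - a tiny sketch of the packing (an illustration only, not the actual Dolby Vision spec):

```python
# Low-latency Dolby Vision tunnels a 12-bit YCbCr 4:2:2 picture inside an
# ordinary 8-bit RGB 4:4:4 signal; the bit budget per pixel is identical.
dv_payload = 12 * (1 + 0.5 + 0.5)   # 12-bit Y plus half-rate Cb and Cr = 24 bits/pixel
rgb_container = 8 * 3               # 8-bit R, G and B                  = 24 bits/pixel
assert dv_payload == rgb_container
print(f"DV 12-bit 4:2:2 payload: {dv_payload:.0f} bits/pixel; "
      f"8-bit RGB container: {rgb_container} bits/pixel")
```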
 
Of course, the TV doesn't have a DP input; will a DP to HDMI cable suffice?
You can't use a DP to HDMI cable to get 10-bit RGB 444; the limit is the HDMI 2.0 ports on the TV.

What are you using the TV for? If it's just HDR video content, then 12-bit YCC422, which the TV does support, should be fine for your needs.

As jak22 says, you can do HDR in 8-bit as well, though when I had a 7 series, PC HDR at 8-bit did not look so good - but maybe there have been some updates and combinations that make it usable.
 
I use it primarily for video playback. It's not my main PC, but I do occasionally use it for browsing if I'm in the living room (like now, for instance) and/or feel like using a big screen. So I don't like the distortion on text if I choose chroma subsampling (422).

As it stands now, even with 8-bit colour depth, HDR content plays pretty well, and if I hadn't noticed that it's 8-bit I probably wouldn't have tried to change anything (although I do not like the posterisation/colour banding, but that's another issue altogether - or is it?). But since I noticed it says 8-bit, and I know the TV can handle 10-bit, my OCD kicked in!

So if the limit is the HDMI port on the TV, but the TV can handle 10-bit, how is it supposed to receive the signal? How is it possible to claim "hey, I am a 10-bit TV, but there's no way you can feed me a 10-bit signal, sorry"? Forgive my ignorance and perhaps oversimplification here; I'm trying to understand it and feed the TV the best signal it can handle.

When I change the refresh rate to 30 Hz, the bit depth changes to 12-bit.
Playing around in the Nvidia control panel I can enable the combinations below:
60Hz YCbCr422 10-bit, limited dynamic range
30Hz RGB 12-bit, full dynamic range
I can't see an improvement in picture quality from either of them. If anything, I think I'm seeing slightly more banding (but a tiny, almost imperceptible difference, if any). Which of these is theoretically better for video playback?
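Both of those combinations (and the original 8-bit RGB mode) sit inside the HDMI 2.0 budget, which is presumably why those are the ones the driver offers. Reusing the same back-of-the-envelope numbers as earlier (CTA-861 4K timing, roughly 14.4 Gbps usable):

```python
# Data rates for the modes the Nvidia panel offers here, assuming CTA-861 4K timing
# (4400 x 2250 total pixels) and ~14.4 Gbps usable on an HDMI 2.0 link.
LIMIT = 18e9 * 8 / 10
modes = [
    ("60 Hz RGB 8-bit (with dithering)", 60, 8 * 3),
    ("60 Hz YCbCr 4:2:2 10-bit",         60, 24),      # 4:2:2 travels in a 24-bit container
    ("30 Hz RGB 12-bit",                 30, 12 * 3),
]
for name, hz, bpp in modes:
    rate = 4400 * 2250 * hz * bpp
    print(f"{name:34s} {rate / 1e9:5.2f} Gbps ({'fits' if rate <= LIMIT else 'over budget'})")
```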
 
I use it primarily for video playback. It's not my main PC, but I do occasionally use it for browsing if I'm in the living room (like now, for instance) and/or feel like using a big screen. So I don't like the distortion on text if I choose chroma subsampling (422).
Well, you can try using this to easily switch between resolutions and set up shortcuts for different profiles, so you can cycle between SDR and HDR.
So if the limit is the HDMI port on the TV, but the TV can handle 10-bit, how is it supposed to receive the signal? How is it possible to claim "hey, I am a 10-bit TV, but there's no way you can feed me a 10-bit signal, sorry"? Forgive my ignorance and perhaps oversimplification here; I'm trying to understand it and feed the TV the best signal it can handle.
By switching to YCC with chroma subsampling you can achieve 10-bit and 12-bit signals within the bandwidth of HDMI 2.0 at 4K 60Hz.

Banding and posterization are the effects of using HDR at lower bit depths.
When I change the refresh rate to 30 Hz, the bit depth changes to 12-bit.
Playing around in the Nvidia control panel I can enable the combinations below:
60Hz YCbCr422 10-bit, limited dynamic range
30Hz RGB 12-bit, full dynamic range
60Hz YCC422 10-bit is the one to select. I'm not sure if Nvidia changed something - usually they just give you 8-bit or 12-bit on old cards - but for whatever reason 10-bit is showing up, so use that.

Using the above software is the best option if it can send the commands to the 7 series; the site says 8 series or later, but maybe you will get lucky and it works.

For SDR, output 8-bit RGB 444 and enable the PC mode on the TV, otherwise it won't render the 444 correctly.

For HDR video, set it to 10-bit YCC 422 and switch to Cinema HDR mode; turn off PC mode as well.

If that's all too much hassle to be switching around, then try using Better ClearType Tuner and see if it changes some of the font rendering so it's easier to read in the browser, and just leave it in HDR 10-bit YCC 422 mode.
 
Thanks, lots to process here.
By switching to YCC with chroma subsampling you can achieve 10-bit and 12-bit signals within the bandwidth of HDMI 2.0 at 4K 60Hz.

So I changed to 422 10-bit and tried some HDR content and could not see any improvement (or worsening, tbh) in picture quality compared to RGB 8-bit.
For HDR video, set it to 10-bit YCC 422 and switch to Cinema HDR mode; turn off PC mode as well.
I don't see a PC mode anywhere on the TV, unless you mean naming an input as PC, which I did for HDMI1 in my case. I have my PC output plugged into HDMI2 now, which has enabled some settings on the TV that were unavailable when it was plugged into HDMI1. But it's not very practical to plug the HDMI cable into a different input on the TV every time you switch content type on the PC.

If that's all too much hassle to be switching around, then try using Better ClearType Tuner and see if it changes some of the font rendering so it's easier to read in the browser, and just leave it in HDR 10-bit YCC 422 mode.

Did that, thanks, I may leave it at that.
 
Enabling HDR should only be done for HDR video or gaming, as the OLED light setting is very much higher and there's no benefit for normal PC use.
That makes sense, and that's what I did for a while (HDR off in Windows settings, but then MadVR would turn HDR on when HDR material was playing). An issue with that was that if I paused and hovered over a menu in the player, the app would switch to SDR, which made the paused HDR video look washed out, and it was quite annoying switching back and forth between SDR and HDR, so eventually I decided to just enable HDR everywhere.
 
