Sorry if this has been discussed before, I did extensive searching but couldn't find a conclusive answer.
I have an LG OLED C7 connected to an HTPC with an Nvidia GTX 1050 Ti. I would expect the output to be 10-bit instead of 8-bit with dithering (see below). When I play around in the Nvidia Control Panel, only 8-bit is available if I select 60 Hz. If I lower the refresh rate to 30 Hz, then 8-bit and 12-bit become available, but not 10-bit, which I believe would match the TV.
Could it be that the HDMI cable can't provide enough bandwidth? I use an Amazon Basics Premium High Speed (18 Gbps) cable, which according to the specs should be enough for 10-bit HDR at UHD 60 Hz, but maybe it's borderline and can't cope? Should I get a 48 Gbps cable?
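Here's the back-of-envelope math I tried, in case I'm miscalculating. It assumes the standard CTA-861 4K timing (4400 x 2250 total pixels including blanking) and HDMI 2.0's 600 MHz TMDS character-rate ceiling, which as I understand it is where the 18 Gbps figure comes from:

```python
# Rough HDMI 2.0 bandwidth check for 4K RGB/4:4:4 output.
# Assumptions: CTA-861 4K timing of 4400 x 2250 total pixels (3840 x 2160
# active plus blanking) and a 600 MHz TMDS character-rate ceiling for HDMI 2.0.

HDMI20_MAX_TMDS_MHZ = 600  # per-lane ceiling for HDMI 2.0 (~18 Gbps total)

def tmds_clock_mhz(total_h, total_v, refresh_hz, bits_per_component):
    """Pixel clock scaled by the deep-color factor (bpc / 8)."""
    pixel_clock_mhz = total_h * total_v * refresh_hz / 1e6
    return pixel_clock_mhz * bits_per_component / 8

for hz in (60, 30):
    for bpc in (8, 10, 12):
        clk = tmds_clock_mhz(4400, 2250, hz, bpc)
        verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "exceeds HDMI 2.0"
        print(f"4K {hz} Hz, {bpc}-bit RGB/4:4:4: TMDS clock ~{clk:.1f} MHz -> {verdict}")
```

If that's right, 10-bit and 12-bit RGB at 60 Hz come out above the 600 MHz limit (about 742.5 and 891 MHz), while all three bit depths fit at 30 Hz, which would line up with the options I'm seeing and point at the HDMI 2.0 link rather than the cable. But I may well be getting this wrong, hence the question.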
Or does it not matter, and I won't see any improvement in picture quality from 10-bit versus 8-bit with dithering?
Thanks in advance.