8 bit or 10 bit confusion!

EnwezorN

Active Member
This is a bit of weird one.

I have a new Sky Q 1TB box and an Amazon firestick connected through a Denon AVR-1600H to an LG CX 65 OLED. I'm using a 3 metre HDMI 2.1 certified cable, although I haven't tested its speed.

On the Denon AVR menu, there is a 4k Signal Format sub-menu where you can set the HDMI to Standard or Enhanced. It says:
Standard - Select if your TV and playback devices support standard 4K 60p 4:2:0 8 bit video signals.
Enhanced - Select if your TV, playback devices, and cables support high quality 4K 60p 4:4:4, 4:2:2 or 4K 60p 4:2:0 10 bit video signals.

When I select Enhanced, it says that my TV doesn't support it but I can set it anyway which I do.

I've never been able to work out if I'm playing 4k to the highest quality I can.

I accessed the hidden LG CX freesync menu and I get the following data on these sources:

Amazon Firestick 4k - 23.97Hz, Fixed, 3840 x 2160@23.97Hz, YCBCR422 8b TM (depending on what you watch, it could change to 60Hz, Netflix is RGB 8b TM)
Sky Q - 50Hz, Fixed, 3840 x 2160@50Hz, YCBCR420 10b TM
Playing directly through LG Web OS, I don't think it is possible to see what the bitrate is.

When I play anything 4k on the firestick I do get a message on the tv saying Dolby Vision which I assumed meant that I was getting 4k correctly and it does look good. But it is 8 bit only.

But when I play Netflix or Amazon through the Sky Q box, I am getting the message that it is showing HDR content and it is in 10bit.

I need to test it some more but I think both Netflix and Amazon may look better through Sky Q.

What I don't know is whether I should be worried about the Denon message, or whether the fact that my telly says it is receiving 10 bit content through the Sky Q box means the Enhanced HDMI bandwidth is definitely being passed to the TV.

Also, what should it be??? Should I be able to get YCBCR444 at 12bit through any source?

I'd rather not unplug my HDMI cable and test it but I will if I have to.
Does anyone have a similar set-up and is getting 10 or 12 bit from an Amazon 4k firestick or a Sky Q box?

Thanks!
 

dante01

Distinguished Member
You'd basically need to engage the Enhanced option if wanting to convey HDR and/or wide colour gamut encoded video.

Your LG CX TV is fully compliant and shouldn't have any issue with handling or displaying 4K 60p 4:4:4, 4:2:2 or 4K 60p 4:2:0 10 bit video signals.

Bitrate has nothing to do with bit depth. The colour depth of an image is measured in bits. The number of bits indicates how many colours are available for each pixel. In a black and white image, only two colours are needed, which means it has a colour depth of 1 bit.

A TV’s reproduction of a color gradient indicates how well it can display details in color. It’s an important part of HDR pictures, so if you want something that will handle HDR well, you should make sure to get a TV that does well on this test. For this test, we determine a TV’s maximum color depth, photograph a gradient test image displayed at that color depth, and then assign a score based on how well the test image was reproduced.

For best results with color depth, you should get a TV that is capable of displaying 10-bit color, and then play HDR media on that TV. If you meet those requirements and still experience banding, try disabling any processing features that you still have turned on, as those can lead to banding as well.
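If you want to see where banding actually comes from, here's a minimal Python sketch (my own illustration, not from the article above; it assumes you have numpy installed) that snaps a smooth gradient to a given bit depth and counts the visible steps:

```python
# Minimal sketch: quantise a smooth horizontal gradient to a given bit depth.
# The fewer the levels, the wider (and more visible) each band becomes.
import numpy as np

def quantise_gradient(width=3840, bits=8):
    """Return a 0..1 gradient snapped to 2**bits levels."""
    levels = 2 ** bits                    # 256 for 8 bit, 1024 for 10 bit
    ramp = np.linspace(0.0, 1.0, width)   # the ideal, continuous gradient
    return np.round(ramp * (levels - 1)) / (levels - 1)

for bits in (8, 10):
    distinct = len(np.unique(quantise_gradient(bits=bits)))
    print(f"{bits} bit: {distinct} distinct shades across a 3840-pixel-wide ramp")
```

At 8 bits the ramp collapses into 256 visible bands; at 10 bits it becomes 1,024 much finer ones, which is why a 10 bit panel handles HDR gradients so much better.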



What is 10-bit color?

Bit depth refers to the overall number of levels of red, green, or blue that a camera records. 8 bits means there are 256 levels (0 to 255) of each color, or roughly 16.8 million combinations in total. Now, 16.8 million sounds like a lot, but one common situation where you can notice the drawback of 8-bit color is in areas of light falloff from bright to dark, which may show up as disparate bands of shading rather than a smooth gradient. This effect is prevalent on YouTube where it is exacerbated by heavy compression, although many viewers may not notice it.

Bumping up to 10 bits multiplies the levels of color by four. That’s 1,024 available values each for green, red, and blue, or a whopping one billion total combinations. As Hammond explains, however, you won’t always see this extra color information: Most screens only support 8-bit color.
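As a quick sanity check of those figures, the arithmetic is just powers of two:

```python
# Levels per channel and total RGB combinations at each bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits} bit: {levels:,} levels per channel, {levels ** 3:,} colours in total")

# 8 bit:  256 levels,   16,777,216 colours     (~16.8 million)
# 10 bit: 1,024 levels, 1,073,741,824 colours  (~1.07 billion)
# 12 bit: 4,096 levels, 68,719,476,736 colours (~68.7 billion)
```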

But 10-bit color is your ticket to producing High Dynamic Range (HDR) content. Thankfully, 10-bit displays are increasing as HDR TVs become more common. Some phones support HDR now, and even some 8-bit displays can fake it using a technique called frame rate control (FRC).

What is chroma subsampling?

Chroma subsampling is a separate beast altogether. This is often called color resolution, as compared to the spatial resolution, like 4K. As an example, 4K Ultra HD video has a spatial resolution 3,840 x 2,160 pixels — but the color of each pixel is derived from a much smaller sampling than that.

With 4:2:0 subsampling, for every two rows of four pixels, color is sampled from just two pixels in the top row and zero pixels in the bottom row. Surprisingly, this seemingly dramatic approximation has little effect on the color, as our eyes are more forgiving to chrominance (color) than luminance (light). If your camera supports 4:2:2 subsampling, this doubles the color resolution by including color from an additional two pixels on the second row — but that’s still just half the total pixels in the image.

Note that color resolution is tied to spatial resolution. A 4K video with 4:2:0 subsampling will still sample color from more pixels than a Full HD video with 4:2:2 subsampling.
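To put rough numbers on that last point (my own back-of-the-envelope sums, using the usual shorthand that 4:2:0 keeps a quarter of the chroma samples and 4:2:2 keeps half):

```python
# Rough chroma-sample counts per colour-difference channel:
# 4:2:0 keeps roughly 1/4 of the pixel positions, 4:2:2 roughly 1/2.
def chroma_samples(width, height, scheme):
    fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    return int(width * height * fraction)

uhd_420 = chroma_samples(3840, 2160, "4:2:0")   # 2,073,600
hd_422 = chroma_samples(1920, 1080, "4:2:2")    # 1,036,800
print(f"4K 4:2:0 chroma samples: {uhd_420:,}")
print(f"HD 4:2:2 chroma samples: {hd_422:,}")
# The 4K 4:2:0 stream still carries about twice the colour resolution
# of Full HD 4:2:2, which is the point being made above.
```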

What does it mean for image quality?

If moving to 10-bit 4:2:2 has little effect on what we can actually see right out of the camera, why is it important? It all comes down to postproduction.

Bit depth is especially important to colorists — those are the people responsible for a movie’s final look — as it offers more room to push the color and exposure of video. Even if your final output is still an eight-bit monitor, working in a 10-bit space will give you more control and yield a better result that will lower the likelihood of banding when viewed on the 8-bit display.

4:2:2 chroma subsampling is also helpful in the coloring process, but is particularly useful for chroma key, or green screen, compositing. Here, the extra color resolution can be the difference between a smooth mask or a jagged outline.

Many video professionals working with mirrorless or DSLR cameras will use external recorders in order to capture more color information than what the camera can process internally. This is usually in the form of either 8- or 10-bit color depth with 4:2:2 chroma subsampling.

The Lumix GH5 was one of the first cameras that offered internal 4K recording with 10-bit 4:2:2 color, which can save videographers time and money by not requiring an external recorder.

If this explanation still left you scratching your head, rest assured that these concepts are explained much more succinctly in Hammond’s video. So if you haven’t yet, go ahead and give it a watch. Or if you’re ready to step up your game, check out our list of the best video cameras.


Both your AV receiver and your TV can technically handle 12 bit colour, but there's little if any point to this given that most sources will not be playing anything that was filmed or encoded with such a bit depth. Your TV is also only equipped with a 10 bit panel. This is the case with almost all TVs.


Putting It Into Practice


First, should you worry about the more limited color and brightness range of HDR10 and 10-bit color? The answer right now is no, don’t worry too much about it. Getting much beyond 1k nits in an HDR display simply isn’t doable right now. Nor can most any display go beyond the smaller P3 gamut. And because a lot of content is mastered and transmitted in 10-bit color, 12-bit color hardware isn’t going to do much for you today anyway.


The second thing is how to ensure you’re getting 10-bit color on your monitor. Fortunately, this will almost always be listed in a device’s tech specs, but beware of any HDR display that doesn’t list it. You’ll need 10-bit inputs for color, but outputs are a different story. The output can be a 10-bit panel output, or eight-bit with FRC.


The other trick display manufacturers will pull is called look up tables. Not all scenes use all colors and brightnesses that are available to a standard--in fact, most don’t. Look up tables take advantage of this by varying what information the bits you have available represent into a more limited set of colors and brightness. This limits the number of bits needed to produce a scene without banding, and it can significantly reduce banding in 95% or more of scenes. We should note, though, that currently this is found exclusively in high-end reference monitors like those from Eizo. That's also the only place this trick is needed, because after being transferred from a camera (or what have you) to a device on which you'd watch the content, today’s HDR signals already come with a not dissimilar trick of metadata, which tells the display the range of brightness it’s supposed to be displaying at any given time.
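As a toy illustration of the look-up-table idea (my own simplification; real HDR signals use the non-linear PQ curve rather than the linear spacing shown here), compare the step size when 10 bits are spread over the whole 0-1,000 nit range versus only the range a scene actually uses:

```python
# Toy illustration: mapping the available code values onto only the
# brightness range a scene actually uses gives much finer steps,
# which is what reduces visible banding.
def step_size_nits(bits, lo, hi):
    levels = 2 ** bits
    return (hi - lo) / (levels - 1)

full_range = step_size_nits(10, 0.0, 1000.0)   # codes spread over 0-1000 nits
scene_only = step_size_nits(10, 0.0, 120.0)    # codes spread over a dim scene
print(f"Full range:  {full_range:.3f} nits per code value")
print(f"Scene range: {scene_only:.3f} nits per code value")
```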


The third and final piece is when to worry about 12-bit color. When the BT2020 color gamut is usable on devices like monitors, TVs, and phones, and those devices are able to reach a much higher brightness, that's when you can think about 12 bits. Once the industry gets to that point, 10-bit color isn't going to be enough to display that level of HDR without banding. But we aren't there yet.



What is creating the message you get? I'd suspect the Sky box. Sky don't have a very good track record relative to HDMI handshakes, so I'd suspect it is simply incorrectly reading the EDIDs of your TV and/or the AV receiver. This can affect what it outputs, or what it can output, if it doesn't regard the destinations as being able to handle something.


By the way, using an HDMI cable certified for use in an HDMI 2.1 setup is pointless unless you've actually got such a setup. Your AV receiver is only equipped with HDMI version 2.0b.



One thing worth mentioning is that you ordinarily have to turn on a UHD TV's wider colour gamut handling for its HDMI inputs before you'd be able to access such content via those inputs. There should be a DEEP COLOUR setting on your TV that allows you to turn this on and off for each of its HDMI inputs individually.


Ensure that the ULTRA HD DEEP COLOUR setting is turned ON for all of the HDMI inputs on the TV.

[Attached screenshot]
 
Last edited:

Jay53

Active Member
As said above, your HDMI 2.1 cable doesn't matter. Neither your AVR nor your input devices are HDMI 2.1 compliant.

Leave the Denon set to Enhanced. If you want to establish what is causing the message, then try enabling/disabling Enhanced with each source. I'd suspect it's the AVR though.

Standard is there for backwards compatibility but your devices are fine

Note that even on Enhanced the available signal formats are 4K 60p 4:4:4 or 4:2:2 (both 8 bit), or 4K 60p 4:2:0 10 bit.

So the only way you will get HDR through the Sky Q box is with the Enhanced setting.
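To put some rough numbers on why Enhanced tops out at 4:2:0 for 10 bit at 4K 60p, here's a back-of-the-envelope sketch in Python (my own figures, assuming the standard 594 MHz 4K 60p pixel clock and HDMI 2.0's 600 MHz TMDS character-rate ceiling):

```python
# Rough check of what fits through an HDMI 2.0 (18 Gbps) link at 4K 60p.
# Assumes the standard 594 MHz pixel clock for 4K 60p (CTA-861 timing with
# blanking) and HDMI 2.0's 600 MHz TMDS character-rate ceiling; 4:2:0 halves
# the rate, and deeper colour in 4:4:4 scales it up by bits/8.
PIXEL_CLOCK_4K60_MHZ = 594.0
HDMI20_LIMIT_MHZ = 600.0

def tmds_rate_mhz(bits, sampling):
    scale = {"4:4:4": 1.0, "4:2:0": 0.5}[sampling]
    return PIXEL_CLOCK_4K60_MHZ * scale * (bits / 8.0)

for bits, sampling in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:0")]:
    rate = tmds_rate_mhz(bits, sampling)
    verdict = "fits" if rate <= HDMI20_LIMIT_MHZ else "too fast for HDMI 2.0"
    print(f"4K 60p {sampling} {bits} bit -> {rate:.2f} MHz ({verdict})")
# 4:4:4 8 bit = 594 MHz (fits), 4:4:4 10 bit = 742.5 MHz (doesn't fit),
# 4:2:0 10 bit = 371.25 MHz (fits) - hence HDR from the Sky box arriving
# as 4:2:0 10 bit.
```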

Not sure why you are only getting 8 bit through the firestick; I can only assume it's to do with enabling Dolby Vision. Try turning DV off on the TV and repeating the test with the firestick. It may then switch to showing HDR 4K 10 bit 4:2:0, and then you need to decide which looks better to you: 8 bit DV or 10 bit HDR.

What I would say, though, is that to truly see what your input devices are capable of, take the AVR out of the equation, even if it's just for the purpose of testing. Your TV is more than capable, but it's possible that the AVR is only presenting to the source devices the formats it's happy to pass through, i.e. you might find the firestick outputs differently if directly connected.
 
Last edited:

EnwezorN

Active Member
Thanks both for your helpful replies. You've given me quite a bit to think about. I think that I'm probably worrying about not a lot 😂 but I will definitely try a few tests.
 
Last edited:

goingoingong

Distinguished Member
EnwezorN said:
[quoting the opening post in full]
May just be DV reporting and you are actually getting 12 bit.
AVRs can report DV as either 8 bit RGB or YCbCr 4:2:2 12 bit. Both are stated to be correct.
LG UBK90 Owners & Discussion
 

Jay53

Active Member
The above makes sense as to why an intermediate device such as an AVR could potentially show it as 8 bit RGB (the container) when DV is used.

However, the OP is seeing this via the LG TV info:

Amazon Firestick 4k - 23.97Hz, Fixed, 3840 x 2160@23.97Hz, YCBCR422 8b TM (depending on what you watch, it could change to 60Hz, Netflix is RGB 8b TM)

You would think that the LG TV, being the end device, would report it as 12 bit YUV 4:2:2, seeing as it's the one decoding it. Clearly not lol
 
Last edited:

dante01

Distinguished Member
Nothing you'd watch or access via any streaming service will have been mastered with more than 10 bit colour depth. Nothing is filmed with such colour depth. Why are you obsessing about attaining something that is utterly pointless at this point in time? The TV only has a 10 bit panel anyway, so it wouldn't be able to display 12 bit colour depth even if source content was encoded with it.

12 bit colour depth and the full gamut of the BT2020 recommendations are aspirational at this point in time.

 
Last edited:

goingoingong

Distinguished Member
Thanks both for your helpful replies. You've given me quite a bit to think about. I think that I'm probably worrying about not a lot 😂 but I will definitely try a few tests.
Try this to check what you output from the Fire TV and change settings as necessary.
If you hold Up and Rewind on the firestick control, it'll start cycling through different resolutions. Wait until 2160p 24Hz appears, then go back into the Display and Sound section in the settings.

Choose Display then set these options

Video Resolution - Automatic (up to 4K Ultra HD)

Match Original Frame Rate - On

Colour Depth - Up to 12 bits

Colour Format - Automatic

Dynamic Range Settings - Adaptive

Amazon Digital and Device Forums - US
 

EnwezorN

Active Member
Thanks for the advice. I tried this and got the same results.

23.97Hz, Fixed, 3840 x 2160@23.97Hz, YCBCR422 8b TM

I tried switching to 2160p at 60Hz too and got these results from the LG CX.

59.93Hz, Fixed, 3840 x 2160@59.93Hz, YCBCR422 8b TM

I set the Match Original Frame Rate on the 2nd test to Off so it would keep it at 60Hz and it did.

I've switched this back though as I believe it is better to match the original frame rate.

If I had a UHD Blu-ray player, with all that richer content, I wonder if this would be coming through as 10-bit?

The confusing or not-confusing thing is that my TV displays either the 'HDR' symbol, sometimes 'HLG HDR' or 'Dolby Vision' for all 4k content depending on whether I am using native smart app, firestick or Sky Q as a source.
 

Jay53

Active Member
Nothing you'd watch or access via any streaming service will have been mastered with more than 10 bit colour depth. Nothing is filmed with such colour depth. Why are you obsessing about attaining something that is utterly pointless at this point in time? The TV only has a 10 bit panel anyway, so it wouldn't be able to display 12 bit colour depth even if source content was encoded with it.

12 bit colour depth and the full gamut of the BT2020 recommendations are aspirational at this point in time.


I thought that too, but I've just found out the following in terms of mastering.

The Netflix originals delivery specification, version OC-4-0, states the format in which content is delivered to Netflix.

For UHD video it's 3840x2160 and for HD it's 1920x1080.

All the details below are the same requirements for both UHD and HD resolutions:

RGB / 4:4:4 / Full range
Dolby Vision HDR using P3 D65 / SMPTE 2084 (PQ) 12bit
SDR using ITU-R BT.709 / D65 / ITU-R BT 1886 gamma 2.4 10bit

That reads to me like both HDR and SDR are mastered at higher bit depths than I originally thought.

Note this is what it's supplied to Netflix as. What they transmit it as is entirely different, as I don't believe they go to the extremes of transmitting HD SDR as 10 bit RGB or UHD DV at 12 bit RGB :)
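To put rough numbers on why they couldn't transmit those delivery masters as-is (my own back-of-the-envelope figures; the ~16 Mbps UHD streaming bitrate is an assumption, not a Netflix number):

```python
# Back-of-the-envelope: uncompressed data rate of a UHD 12 bit RGB master
# versus a typical UHD streaming bitrate (~16 Mbps assumed here).
width, height, fps = 3840, 2160, 24
bits_per_pixel = 3 * 12                  # RGB 4:4:4 at 12 bits per channel

raw_mbps = width * height * bits_per_pixel * fps / 1e6
stream_mbps = 16
print(f"Uncompressed master: ~{raw_mbps:,.0f} Mbps (about {raw_mbps / 1000:.1f} Gbps)")
print(f"Typical UHD stream:  ~{stream_mbps} Mbps, roughly {raw_mbps / stream_mbps:,.0f}x smaller")
```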


Opinions?
 
Last edited:

EnwezorN

Active Member
I think I read somewhere that Netflix and Amazon original content was very high quality, and this confirms it.
I guess it currently isn't possible to stream it because the bandwidth would be too high?
 

3rdignis

Active Member
Vincent partly explains the LG 8/10/12 bit readout at around the 6 minute mark.
 

EnwezorN

Active Member
Ahh, so you can't trust the FreeSync information being displayed on my LG CX. The Firestick 4K could be streaming at 10 bit or 12 bit and my TV doesn't seem to know. Still strange that it reports 10 bit for Sky Q and 8 bit for the Amazon.
 

Jay53

Active Member
Ahh, so you can't trust the FreeSync information being displayed on my LG CX. The Firestick 4K could be streaming at 10 bit or 12 bit and my TV doesn't seem to know. Still strange that it reports 10 bit for Sky Q and 8 bit for the Amazon.
Strictly speaking the TV is not lying. It's just that the info it displays is associated with the signal it's receiving, and at that point it hasn't yet decoded it into DV.

As one of the links above describes, Dolby Vision is 12 bit YUV 4:2:2, but it's packaged up and transmitted as an 8 bit RGB signal over HDMI. The TV then receives it, and it's at this point that it displays the signal info, i.e. an 8 bit signal. After this point it unpacks the DV and puts it back into 12 bit YUV 4:2:2.

The conclusion is that as long as the TV says it's displaying DV, it's 12 bit YUV 4:2:2 :)
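The bit arithmetic behind that packaging is worth spelling out (a simplified view of the DV "tunnelling" trick, not a full description of the format):

```python
# Why 12 bit YCbCr 4:2:2 can ride inside what the link reports as 8 bit RGB:
# both work out to 24 bits per pixel, so the DV data is re-packed into the
# RGB container for the HDMI hop and the TV unpacks it again afterwards.
def bits_per_pixel_rgb(bits):
    return 3 * bits                       # an R, G and B sample for every pixel

def bits_per_pixel_422(bits):
    # 4:2:2: full-rate luma, with Cb and Cr shared between each pair of pixels.
    return bits + (2 * bits) / 2

print("8 bit RGB container:    ", bits_per_pixel_rgb(8), "bits per pixel")    # 24
print("12 bit YCbCr 4:2:2 (DV):", bits_per_pixel_422(12), "bits per pixel")   # 24.0
```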

The Sky Q box doesn't support DV, so it will be HDR 10 bit. If you want to watch Netflix in DV you shouldn't use the Netflix app through the Sky Q box.
 

Romster1

Active Member
Firstly, apologies for jumping onto the thread, but it seems to cover something which I'm interested to know more about too. I have an Nvidia Shield TV device connected to a Pioneer VSX-934 AVR. Initially the display settings on the Shield were only showing as 8 bit HDR ready; however, after setting the AVR's HDMI ports and the Sony Bravia TV's HDMI ports to Enhanced, the display settings now say HDR10 ready. When I play both streaming and test HDR content and press the info button on my AVR, it's showing 1920 x 1080 59Hz YCBCR 422 24 bit premium content. Am I right in thinking that the AVR is only detecting it as 8 and not 10 bit?
 
