Question About chroma subsampling

ztdz

Novice Member
Hi guys, I have my TV connected to my PC, mostly for gaming and watching videos, not browsing. The TV only displays RGB Full in PC mode, but that mode makes everything look dull, so I have to use the Nvidia vibrance slider to make it more colorful. Unfortunately this introduces color clipping, according to the test below.

On the other hand, I could use Game mode with the same input lag and change color directly from the TV, which does not introduce clipping, but the signal would be converted to rgb 4:2:2 automatically.

Is chroma subsampling or color clipping worse?

Which of the two evils is the lesser in my case?

Thanks

Test for reference
 

EndlessWaves

Distinguished Member
Well, you're trying to shift all the colours away from what they're intended to be, so I wouldn't worry about which preserves more picture quality.

Try both and see what you prefer.

Colour subsampling reduces colour resolution, so expect to see the artefacts caused by lower resolution when dealing with details depicted using two colours of similar brightness (e.g. coloured text on a coloured background can sometimes show this).
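To make that concrete, here's a toy numpy sketch (my own throwaway code, nothing from the thread) showing how 4:2:2 subsampling blends the chroma of neighbouring red and blue pixels that have similar brightness:

```python
import numpy as np

# 4:2:2 keeps luma (Y) at full resolution but halves the horizontal
# resolution of chroma (Cb, Cr). Detail carried mostly by chroma, e.g.
# red/blue edges of similar brightness, gets averaged away.

def rgb_to_ycbcr(rgb):
    # BT.601 full-range conversion, floats in 0..255 (illustrative only)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return np.stack([y, cb, cr], axis=-1)

def subsample_422(ycc):
    out = ycc.copy()
    # average each horizontal pair of chroma samples, then duplicate back
    pairs = out[:, 0::2, 1:] / 2 + out[:, 1::2, 1:] / 2
    out[:, 0::2, 1:] = pairs
    out[:, 1::2, 1:] = pairs
    return out

# alternating red/blue columns: similar luma, very different chroma
img = np.zeros((1, 4, 3))
img[:, 0::2] = [255, 0, 0]   # red
img[:, 1::2] = [0, 0, 255]   # blue

ycc = rgb_to_ycbcr(img)
sub = subsample_422(ycc)
# chroma of neighbouring red/blue pixels collapses to the same average
print(np.allclose(sub[0, 0, 1:], sub[0, 1, 1:]))  # True
```

In a real image this averaging is what smears the edges of coloured text sitting on a similar-brightness coloured background.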
 

youngsyp

Distinguished Member
It would be useful to know what TV you're using, but all modern TVs actually display in RGB, regardless of what format the content starts as. That being the case, any content will typically be converted to RGB from a flavour of YCbCr (YCC) colour, as encoded at the source.

For example, DVDs, Blu-ray and UHD Blu-ray are all encoded as YCC4:2:0 on the disc. This is simply to reduce the storage cost of the content as YCC4:2:0 takes up less storage than any other level of chroma or RGB. To get from that, as encoded on the disc, a process takes the video from YCC4:2:0 to YCC4:2:2, to YCC4:4:4, and it's then converted to RGB to be displayed. Either the source device or display can perform this process or parts of it.
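A quick back-of-envelope on the storage saving, which is why discs use 4:2:0 (a 3840x2160 8-bit frame assumed here, my own numbers purely for illustration):

```python
# Samples stored per pixel under each sampling scheme, and the resulting
# uncompressed size of one 8-bit frame (one byte per sample).

WIDTH, HEIGHT = 3840, 2160
SAMPLES_PER_PIXEL = {
    "YCC 4:2:0": 1.5,  # full-res Y, quarter-res Cb and Cr
    "YCC 4:2:2": 2.0,  # full-res Y, half-res Cb and Cr
    "YCC 4:4:4": 3.0,  # full resolution for all three planes
    "RGB":       3.0,  # three full-res channels
}

for name, spp in SAMPLES_PER_PIXEL.items():
    mib = WIDTH * HEIGHT * spp / 2**20
    print(f"{name}: {mib:.1f} MiB per 8-bit frame")
```

So 4:2:0 needs half the data of RGB or 4:4:4 before any compression is even applied, which is why it's the delivery format of choice.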

So a couple of things to pick out from your OP. Firstly, the TV may only accept RGB 'Full' from a source device whilst in PC mode. The important bit there being accept from the source device. There are several factors that can affect this, but it is what it is, so you have to live with it.
Secondly, where you mention RGB Full, the 'Full' is an expression of the video levels used. Full = data levels, i.e. 0-255 at 8 bit colour depth and 0-1023 at 10 bit, with 0 being black and 255/1023 white. If it was RGB Low, that would indicate it can only use video levels with RGB: in essence 16-235 for 8 bit and 64-940 for 10 bit, with 16/64 being black and 235/940 white.
Thirdly, there's no such thing as 'rgb 4:2:2'. RGB is a colour space and '4:2:2' is a sub-sampling level of YCC, a different colour space.
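On the levels point, a small sketch of mapping 8-bit values between the two ranges (the helper names are my own, nothing standard, and real devices handle this in the signal chain rather than per-value like this):

```python
# Full/"data" range is 0-255; limited/"video" range is 16-235 for 8-bit
# content (64-940 for 10-bit luma scales the same way). Out-of-range
# values are clipped on the way back to full range.

def full_to_limited(v):
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

print(full_to_limited(0))    # 16  (black)
print(full_to_limited(255))  # 235 (white)
print(limited_to_full(16))   # 0
print(limited_to_full(235))  # 255
```

A mismatch here (source sends one range, display expects the other) is exactly what produces washed-out or crushed pictures.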

Those aside, your post is still a little confusing, as you only seem to mention TV settings and not what the graphics card is set to output. What I can pick out is that if the graphics card is set to output RGB Full, your TV needs to be in PC Mode. Is that correct?
If it is, what is the graphics card set to output if you set the TV to Game Mode?

Ultimately, clipping is bad as you're losing colour fidelity because of what looks like a settings mismatch.
If the image looks dull, that also suggests a settings mismatch. So understanding what your graphics card is set to output and which TV you have would be a good starter.
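As a rough illustration of why clipping is destructive (a made-up "vibrance"-style gain around mid-grey; this is not how Nvidia actually implements its slider):

```python
# Once a boosted channel hits the 0/255 rails, originally distinct
# values collapse to the same output and can't be told apart afterwards.

def boost(v, gain=1.8):
    out = round(128 + (v - 128) * gain)
    return min(255, max(0, out))          # clip to the 8-bit range

a, b = 220, 240                            # two distinct bright values
print(boost(a), boost(b))                  # both clip to 255
assert boost(a) == boost(b) == 255         # the difference is gone
```

No later processing step can recover the detail once two different input values have clipped to the same output.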

Paul
 

ztdz

Novice Member
My TV is a Samsung (MU6500). Sorry, I didn't mention the GPU settings, but they are correct: RGB color, Full, 12-bit if set to 1080p, 8-bit at 4K. The output is correct, but the problem is that the TV does not let me control color in PC mode, so that's why I said the image looks dull.
 

next010

Distinguished Member
My TV is a Samsung (MU6500). Sorry, I didn't mention the GPU settings, but they are correct: RGB color, Full, 12-bit if set to 1080p, 8-bit at 4K. The output is correct, but the problem is that the TV does not let me control color in PC mode, so that's why I said the image looks dull.

For games and video, chroma 422 is fine for the most part on a PC.

You can also have the Nvidia GPU do a conversion to YCC422 instead of letting the TV resolve RGB 444 to 422 in game mode; whether it looks any better might be down to the video processor on the TV.

You could also try outputting YCC444 in PC mode to see how the TV treats that, colours-wise.
 

youngsyp

Distinguished Member
For games and video, chroma 422 is fine for the most part on a PC.

You can also have the Nvidia GPU do a conversion to YCC422 instead of letting the TV resolve RGB 444 to 422 in game mode; whether it looks any better might be down to the video processor on the TV.

You could also try outputting YCC444 in PC mode to see how the TV treats that, colours-wise.
Whilst I agree that YCC4:2:2 is a reasonable output from the video card to the TV, and that trying YCC4:4:4 as a video card output is also a reasonable suggestion, the middle paragraph isn't helpful as it's misinformed. As stated above, RGB is a colour space and YCC, expressed as 444 or 422 as you have it, is a different colour space. There's no such thing as RGB 444 or RGB 422; the output is either RGB or YCC 4:x:x. On top of that, I think YCC is also expressed as YVV in PC speak, which further confuses matters.

The part about "resolving RGB 444 to 422" makes no sense. To reiterate my last post, the content likely starts out as YCC4:2:0, as it's less storage/bandwidth intensive. In order for the TV to display it, that video needs to be processed from that source format of YCC4:2:0 through YCC4:2:2, then YCC4:4:4. These steps essentially return the colour definition the content had before it was sub-sampled to YCC4:2:0. As a final step, the video is converted to RGB, as that's the format the TV uses to display it.
Either the video card or the TV can do any part of this, but one may do a better job of any given part than the other. And for the OP to be able to adjust the TV's video settings, he needs the video card to output in YCC. Thus setting his nVidia to output YCC4:2:2 or YCC4:4:4 seems like the best option. I'd suspect either setting would look identical on the TV too, but experimentation won't hurt.
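For the curious, that final YCC-to-RGB step can be sketched like this (BT.709 coefficients, limited-range 8-bit input; a simplification, since real TVs and GPUs also handle rounding, dithering and gamut):

```python
# Convert one fully upsampled (4:4:4) YCbCr pixel to full-range RGB.
# Limited-range 8-bit input: Y spans 16-235, Cb/Cr span 16-240
# centred on 128. BT.709 inverse-matrix coefficients.

def ycbcr709_to_rgb(y, cb, cr):
    yy = (y - 16) / 219.0          # normalise luma to 0..1
    pb = (cb - 128) / 224.0        # normalise chroma to -0.5..0.5
    pr = (cr - 128) / 224.0
    r = yy + 1.5748 * pr
    g = yy - 0.1873 * pb - 0.4681 * pr
    b = yy + 1.8556 * pb
    clip = lambda x: min(255, max(0, round(x * 255)))
    return clip(r), clip(g), clip(b)

print(ycbcr709_to_rgb(235, 128, 128))  # (255, 255, 255), reference white
print(ycbcr709_to_rgb(16, 128, 128))   # (0, 0, 0), black
```

Whichever box runs this maths last (GPU or TV) is the one whose picture controls still apply, which is the nub of the OP's problem.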
My TV is a Samsung (MU6500). Sorry, I didn't mention the GPU settings, but they are correct: RGB color, Full, 12-bit if set to 1080p, 8-bit at 4K. The output is correct, but the problem is that the TV does not let me control color in PC mode, so that's why I said the image looks dull.
OK, thanks. Using RGB Full at 12 bit for 1080p is a choice; you can set the video card to output any way you wish and it should still give you a correct output. That being the case, if outputting from the video card at YCC4:2:2 allows you to do what you want with the TV's colour controls, there's no reason not to use that output format.

Paul
 

next010

Distinguished Member
Yes, I know RGB is 444 output all the time, but TVs do not ordinarily display chroma 444 and, according to rtings, TVs effectively resolve RGB 444 signals to 422 internally without a PC mode.

That was their terminology, not mine. I'm not a video expert, but they generally know their stuff and it was likely a simplified explanation.

PC GPUs do not use YVV terminology; Nvidia use YCC and AMD use YUV in the colour format output.
 

youngsyp

Distinguished Member
Yes, I know RGB is 444 output all the time, but TVs do not ordinarily display chroma 444 and, according to rtings, TVs effectively resolve RGB 444 signals to 422 internally without a PC mode.

That was their terminology, not mine. I'm not a video expert, but they generally know their stuff and it was likely a simplified explanation.
It's still not clear what you're trying to convey, as you're using the same conflicting terms, regardless of whose terms they are. I agree, though, rtings do a good job; however, there is room for error in the interpretation of anything they write. Could you point me in the direction of the rtings article?
PC GPUs do not use YVV terminology; Nvidia use YCC and AMD use YUV in the colour format output.
That's it, YUV, not YVV, thanks for the correction.

YUV is YCC, essentially.

Paul
 
