8-bit 4:4:4 or 10-bit 4:2:2?


My understanding is that 10-bit 4:2:2 YCbCr is a mandatory part of the HDMI v1.1 spec. So if you have a VP (video processor) with 10-bit processing, you would seem to have two options:

1) Output 10-bit 4:2:2 from a VP (having upsampled 4:2:2 inputs, processed at 10-bit, and downsampled to 4:2:2) and allow the display to re-upsample to 4:4:4.

2) Output 4:4:4 at 8-bit resolution, with 10-bit --> 8-bit rounding errors.

Which would be preferable?
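For a sense of scale on option 2, here is a rough sketch (my own illustration, not from the thread) of the worst-case error when 10-bit code values are rounded to 8 bits and scaled back:

```python
def to_8bit(v10):
    """Quantise a 10-bit code value (0-1023) to 8 bits (0-255)."""
    return round(v10 * 255 / 1023)

def to_10bit(v8):
    """Scale an 8-bit code value back up to the 10-bit range."""
    return round(v8 * 1023 / 255)

# Worst-case round-trip error over every 10-bit code value
max_err = max(abs(v - to_10bit(to_8bit(v))) for v in range(1024))
print(max_err)  # 2 codes out of 1024 -- small, but a real loss
```

So the rounding error is bounded at about 2 codes out of 1024; whether that is visible next to the chroma resolution lost in option 1 is exactly the question.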


I don't want to do that, because my projector will convert RGB to YCbCr for processing and then back to RGB again for driving the panels.


Previously Liam @ Prog AV
You've answered your own question in the sense that there isn't a rule of thumb since different displays will process differently, different sources output differently, and different VPs are more or less capable of the conversion. Short answer - whatever looks best.


Thanks Liam.

Out of interest, is anyone on this forum outputting 10-bit 4:2:2 YCbCr from a VP with an HDMI output? Clearly the current Lumagens only output RGB over DVI.


Active Member
Multiple YUV<->RGB conversions will not affect image quality if they're done correctly, although that's a fairly big "if".

It's worth pointing out that if you're messing with gamma, I believe this is normally done in RGB space, not YUV, so by setting your proc output to YUV you may be incurring another conversion anyway.
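A toy illustration of why gamma is usually handled per RGB channel (my own numbers, assuming a simple 2.2 power curve and BT.709 luma weights): applying the curve to R, G and B individually is not the same as applying it to luma alone.

```python
gamma = lambda v: v ** (1 / 2.2)  # simple 2.2 power curve (assumption)

r, g, b = 0.05, 0.5, 0.95
y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # BT.709 luma weights

# Luma of the gamma-corrected RGB vs gamma applied to luma directly
y_from_gamma_rgb = 0.2126 * gamma(r) + 0.7152 * gamma(g) + 0.0722 * gamma(b)
print(abs(y_from_gamma_rgb - gamma(y)) > 0.01)  # True: the two disagree
```

So a proc that takes YUV in but adjusts gamma in RGB has to convert, adjust, and convert back, which is the extra conversion being referred to.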

Personally I normally set the video sources to YUV422, as it's the closest to their native format, and output to RGB, although there is no discernible difference between that and YUV444 (my display and/or proc doesn't appear to handle 422 correctly, so I can't try that).
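To illustrate the "done correctly" point above (a sketch of my own, assuming BT.709 matrix coefficients): in floating point, a YUV<->RGB round trip is essentially lossless; it's quantising back to 8 or 10 bits at each hop that loses information.

```python
KR, KB = 0.2126, 0.0722          # BT.709 luma coefficients (assumption)
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    y  = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG  # invert the luma equation for G
    return r, g, b

# One float round trip: the error stays down at machine precision
r, g, b = 0.25, 0.5, 0.75
err = max(abs(a - e) for a, e in
          zip(ycbcr_to_rgb(*rgb_to_ycbcr(r, g, b)), (r, g, b)))
print(err < 1e-12)  # True
```

Each intermediate rounding to integer code values is where real chains diverge from this ideal.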

