The Pixel Format and Video Dynamic Range settings thread

wolvers

Distinguished Member
This has come up again recently, and we never seem to have arrived at a definitive answer on how to set this correctly, so I thought I would start a thread :rolleyes: where we can collate all our settings and how we set them to see if there is any common ground. Who knows, we may even get to a stage where we can create a definitive method. :thumbsup:

I'll start us off as I just spent an hour or so with some test patterns that djcla pointed to the other day from here;

AVS HD 709 - Blu-ray & MP4 Calibration - AVS Forum

The zipped MP4 version is what I chose to use, and I've used just the three patterns that are dedicated to black and white levels. They are called 'Black Clipping', 'White Clipping' and 'APL Clipping'. These are self-explanatory and you shouldn't have any trouble working out what they should look like. I can try and add screenshots if I'm not making much sense.

APL clipping (you can use this image to check that the main output of the GPU and the display are set correctly)



With my system I have three settings to focus on. First is the Pixel Format in the AMD Control Center (which I believe sets the main output of the GPU), then the Dynamic Range in the video settings (which only affects video playback via the enhanced video renderer as far as I can tell, i.e. in Media Center) and finally the HDMI Video Range in my display's settings (Normal or Expanded). I started out by setting them all to what I believe to be matching options (limited RGB, limited 16-235 and Normal). It's worth noting here that I made a point of leaving all brightness and contrast settings at default, for the desktop colour, the video and the display itself, as well as leaving all dynamic contrast options off.

Video Dynamic Range (AMD)

dynamicrange.png


Video Dynamic Range (Nvidia)

dynamicrangenvidia.png


Video Dynamic Range setting for Intel?

It was obvious straight away that something was not right, as the bars were nowhere to be seen on the blacks test and the whites were fully in view. After some trial and error, basically going from fully crushed to well overblown, I found that only one of the original settings was incorrect. Switching the Pixel Format from limited RGB to full RGB sorted it, then I just tweaked the brightness and contrast in the video colour settings in the AMD control center to get the test patterns looking perfect.

Pixel Format settings (AMD; AFAIK you cannot change this on Nvidia so you might need to work out what the GPU is outputting. I'm unsure about Intel)

colourpixelformat.png


Admittedly it's a slightly odd combination of settings but it appears to be spot on according to the test patterns so I will now do some extensive movie watching to see if it looks good in real life. :D

Please chime in and let us know how you have set yours up and why.
 

wolvers

Distinguished Member
Understanding what this is all about (thanks to Stephen for taking the time)

Quick (abridged) primer :



Most broadcast sources start life RGB.



Broadcast cameras either have three CCD or CMOS sensors, with Red, Green and Blue filters feeding them, or have a single high-resolution sensor with Red, Green and Blue filters over individual pixels (like digital stills cameras). Similarly, film-to-video transfer telecines detect Red, Green and Blue signals. These are likely to generate a 4:4:4 RGB signal internally in the device.



(A 4:4:4 RGB signal kind of goes : R1, G1, B1 : R2, G2, B2 : R3, G3, B3 : R4, G4, B4)



Most broadcast and post-production gear (apart from the very high-end stuff) is designed around the SDI/HD-SDI interconnect standards. These usually carry YCrCb (Y, R-Y and B-Y signals) with the Y (luminance or brightness) at full bandwidth (the '4') and the Cr/Cb colour difference signals carried at reduced resolution, as the eye is more sensitive to brightness changes than colour changes.



(A 4:2:2 YCrCb signal kind of goes : Y1, Cr1 : Y2, Cb2 : Y3, Cr3 : Y4, Cb4)



So the output of cameras, vision mixers, routers, VT machines, servers in broadcast areas is usually 4:2:2 YCrCb.



To reduce the bandwidth even further, transmission compression (the MPEG2/H264 that we receive via Freeview, satellite and cable) and pre-recorded media compression (MPEG2/H264/VC-1 on DVD, Blu-ray, HD-DVD etc.) uses 4:2:0, where only the Cr OR Cb samples are carried on each line.

(So you end up with : Y1, Cr1 : Y2 : Y3, Cr3 : Y4 on one line

and : Y1, Cb1 : Y2 : Y3, Cb3 : Y4 on the next line. ) (*)
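The sequences above can be written out as a small sketch; this is purely illustrative (the helper name is made up, and as the footnote explains real encoders filter the chroma rather than simply discarding samples). It lists what each scheme carries for a 4-pixel patch over two lines:

```python
# Toy illustration of the sample sequences above. Each list is the data
# carried for a 4-pixel-wide patch on one line.

def subsample(scheme):
    if scheme == "4:4:4":
        line1 = ["R1 G1 B1", "R2 G2 B2", "R3 G3 B3", "R4 G4 B4"]
        line2 = list(line1)  # full colour on every pixel, every line
    elif scheme == "4:2:2":
        line1 = ["Y1 Cr1", "Y2 Cb2", "Y3 Cr3", "Y4 Cb4"]
        line2 = list(line1)  # chroma halved horizontally only
    elif scheme == "4:2:0":
        line1 = ["Y1 Cr1", "Y2", "Y3 Cr3", "Y4"]   # Cr carried on this line...
        line2 = ["Y1 Cb1", "Y2", "Y3 Cb3", "Y4"]   # ...Cb on the next
    else:
        raise ValueError(scheme)
    return line1, line2
```

Counting the entries makes the bandwidth saving obvious: 4:2:0 carries half the chroma of 4:2:2 and a quarter of the chroma of 4:4:4.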



Most TV displays - plasmas, LCDs, DLPs etc. have RGB output systems, so before the picture can be displayed, it needs to be converted from whatever format it is in to RGB 4:4:4 (assuming a native display resolution)



If you keep YCrCb at 4:4:4 and RGB at 4:4:4 though, assuming the levelspace is correct, you can convert between the two cleanly (apart from mathematical rounding errors)



So most of us will start with a 4:2:0 YCrCb signal from our DVD, Blu-ray, DVB broadcast etc. We will need to end up with a 4:4:4 RGB signal before we can see this signal on our TV. The conversion has to take place somewhere - but if you aren't careful it can take place in more than one place.



Not all HDMI destinations will take all formats though - you may find a display that only accepts 4:2:2 YCrCb, or a DVI display that only accepts 4:4:4 RGB.



To complicate matters, the RGB to YCrCb conversion formulae differ between SD and HD... You can't use the same conversion to turn DVD/SD DVB (aka ITU/Rec 601 content) YCrCb into RGB as you would for Blu-ray/HD DVB (aka ITU/Rec 709 content).
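To make the difference concrete, here is a minimal sketch of decoding studio-range 8-bit YCbCr to nominal 0.0-1.0 R'G'B' with the two sets of coefficients. The Kr/Kb constants are the standard 601/709 values; the function name and structure are just for illustration:

```python
# Decode studio-range (16-235 luma, 16-240 chroma) YCbCr to R'G'B' in 0..1.
# 601 and 709 use different Kr/Kb luma weighting constants.

def ycbcr_to_rgb(y, cb, cr, standard="709"):
    kr, kb = (0.299, 0.114) if standard == "601" else (0.2126, 0.0722)
    kg = 1.0 - kr - kb
    ey = (y - 16) / 219.0     # normalise 16-235 luma to 0..1
    ecb = (cb - 128) / 224.0  # normalise 16-240 chroma to -0.5..+0.5
    ecr = (cr - 128) / 224.0
    r = ey + 2 * (1 - kr) * ecr
    b = ey + 2 * (1 - kb) * ecb
    g = (ey - kr * r - kb * b) / kg
    return r, g, b
```

The same YCbCr triple decodes to visibly different colours under each matrix, which is why the player or GPU has to know which standard the source uses.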



So far, so not that simple.



Where it gets even more complicated is when you start thinking in terms of level space.



At first glance, the PC world makes total sense. Black is at 0 and White (or maximum R,G,B) is at 255. Simple.



Except that broadcast signals don't have a nice clean easy life. Analogue content still exists. Analogue techniques are still in use (at least one major camera manufacturer still sends HD signals down its camera cables on analogue modulated carriers, and this is the norm for SD studio cameras). Analogue signals can have overshoots, where they go over peak white level, and undershoots, where they briefly drop below black level. If you don't preserve these through the digital chain, and clip them instead, you can introduce all sorts of nasty ringing effects (clipping an overshoot on a sharp edge in an image can cause it to have little repeated edges, like nasty sharpening).



To ensure that these over/undershoots were preserved, the early-80s digital TV interconnect standards were based on 16-235 levels for Y and RGB, with 16-240 levels for Cr and Cb.



We use them to this day.



The signal we receive at home from Sky, Freesat, Freeview, the content on our DVDs and Blu-rays is all in 16-235 level space. Set top boxes and displays expect to output and receive 16-235 content by default.



Only PCs (and some games consoles) will generate native 0-255 content.



When displaying 16-235 content on a desktop PC feeding a VGA or DVI monitor, the PC should re-map the 16-235 material to 0-255 to ensure blacks are black and whites are white (though old PC CRTs may show some ringing). This mapping can introduce some additional "banding".



When displaying 16-235 content on a standard HDMI TV, the PC should ensure 16-235 sources are output as 16-235, and 0-255 PC content is also remapped to 16-235.
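Those two remappings are simple linear scalings. A sketch with hypothetical helper names (8-bit values; this is not any real driver API):

```python
# Map between PC "full range" (0-255) and video "limited range" (16-235).

def video_to_full(v):
    """Expand 16-235 video levels to 0-255 PC levels (clips BTB/WTW)."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def full_to_video(v):
    """Compress 0-255 PC levels into the 16-235 video range."""
    return round(v * 219 / 255) + 16
```

If the driver and the player each apply one of these without knowing about the other, you get exactly the double-scaled washed-out or crushed pictures described in this thread.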



However MS, driver manufacturers and app writers continue to get this wrong... There is no "right" or "wrong" answer.



Very annoying - when DVD players, Sky boxes, Freeview boxes, Blu-ray players do it perfectly without any user interaction at all...



(*) To avoid aliasing, the subsampled Cr and Cb samples are not created just by discarding the others; they are created by filtering the 4:4:4 Cr and Cb source and generating a clean lower-bandwidth signal. They are not co-sited quite as my sequences suggest, but drawing quincuncial diagrams in a forum page is tricky...



Useful links

http://www.avforums.com/forums/room...ion/760012-greyscale-calibration-dummies.html
 

djcla

Distinguished Member
So you left your TV brightness and contrast at the middle defaults and just adjusted the driver brightness etc.? Did you adjust them under desktop management or under the video section of CCC?
 

djcla

Distinguished Member
Right, set everything to limited and madVR etc. and adjusted contrast slightly in CCC under desktop, as the video one had no effect. Found I could see all the whites but blacks were OK, that's why I adjusted contrast slightly in CCC; the rest I did through the TV controls.
 

wolvers

Distinguished Member
Yep, I had ALL brightness and contrast settings at default, then got the range settings sorted. Tweaked the desktop brightness and contrast first (only about 3 or 4 points) with the still image, because that's a global thing; the video settings didn't need touching after that. I never touched the display adjustments.
 

PsyVision

Distinguished Member
My brightness and contrast are set on my TV from calibration specs that I found on a website. I checked these with the AVS HD dvd via my blu-ray and was happy with them.

For my HTPC:

I have the HDMI range on my TV set to Normal - I can't change this for some reason.

I then have the dynamic range set to Limited in CCC.

The pixel format has the biggest effect. I was using YCbCr 4:4:4; I changed this to RGB 4:4:4 Limited but this looked a bit washed out on the desktop background and not the best on the APL clipping. I changed this to RGB 4:4:4 Full and have since left it at that. I think in hindsight the YCbCr looked 'crushed'.

Will watch some Films/TV and see what I think.
 

wolvers

Distinguished Member
My brightness and contrast are set on my TV from calibration specs that I found on a website. I checked these with the AVS HD dvd via my blu-ray and was happy with them.

So you calibrated the display first with a standalone BD player as the source? That makes sense if your display doesn't have settings for each input. My PJ will remember the settings for each input, so I can calibrate it for the SkyHD box (easier said than done :suicide:) and leave it on stock settings for the HTPC input, as it seems to give better results calibrating it in the AMD control center settings.

The pixel format has the biggest effect. I was using YCbCr 4:4:4; I changed this to RGB 4:4:4 Limited but this looked a bit washed out on the desktop background and not the best on the APL clipping. I changed this to RGB 4:4:4 Full and have since left it at that. I think in hindsight the YCbCr looked 'crushed'.

I think that I had pretty much the same experience. It sounds like you've ended up with the same settings as me. :thumbsup:

What decoders/player software are you using? I played the test patterns in 7MC with the DXVA ffdshow decoder.
 

Stephen Neal

Distinguished Member
Does anyone use the Nominal Range registry hack to get 16-235 sources displaying correctly in 7MC when running 16-235 displays via HDMI?

There seem to be real issues with getting consistent 16-235 output for all 16-235 content (I don't really care about BTB (<16) and WTW (>235), as properly mastered content shouldn't really have anything below black, though above white is probably a little more to be expected). Unprotected 16-235 content is often not properly handled: it gets treated as 0-255 and thus appears washed out and de-saturated when output at 16-235.

It is infuriating - we've only had 16-235 as a video standard since the early-to-mid 80s...

It just needs MS and the driver manufacturers to properly implement what they should be (the backbone is apparently there to properly support it all...)
 

wolvers

Distinguished Member
I haven't needed to do the hack so far. I've played the test patterns in 7MC and it looks good with very minor brightness and contrast tweaking.

Is it related to the fact that I'm using ffdshow and that is outputting 16-235 correctly? Also, when you set 16-235 in the AMD control panel it only alters the video range (works in 7MC but not all players), so maybe that is controlling the registry setting you're talking about. I tried it on my laptop and that needed 16-235 in the Nvidia settings for the test pattern to look right, so maybe it's your GPU?
 

Stephen Neal

Distinguished Member
I've had various experiences with nVidia and ATI drivers (there used to be the old friend of the BT601CSC registry hack for ATI drivers)

The issue seems to be that MS have a way of flagging whether content is 16-235 or 0-255 range, and drivers have a way of setting their output to be 0-255 or 16-235. However somewhere in between the flagging is ignored for some content. There is also the argument as to whether 16-235 content should be re-scaled to 0-255 for internal consumption and then scaled back to 16-235 for output (which will obviously clip <16 and >235 content)

The most common example was that PDVD/TMT would play Blu-rays fine (16-235 content) with nVidia or ATI drivers configured for 16-235 output range (which is what a standard HDMI TV will expect - and is what many of us routing through AV amps need as our other HDMI kit sharing the same input on the TV is also 16-235 standard).

However for SD (SDTV, DVDs etc.) using the standard MS DTV decoder and VistaMC/7MC we ended up with incorrect black and white levels, as in this case the 16-235 content was being treated as 0-255 - and scaled again (giving you something like 32-215 output with sat-up blacks, dull whites and de-saturated colour)

To get this video to appear in the correct level space you had to incorrectly tell your drivers that the output range was 0-255 - so the incorrectly handled 16-235 was output as 16-235 (usually with <16 and >235 content present still because it wasn't being treated as 16-235 but as 0-255) OR with ATI cards you could force SD 601 content to be handled correctly via the registry hack. (For some reason HD 709 was often handled correctly)

More recently there has been an unofficial hack within Media Center to compensate for incorrect driver handling of 16-235 content it appears.

I don't use FFDShow, MPC-HC, VLC etc. - just 7MC and PowerDVD or TMT with no extra codec packs on my main HTPCs.

There appears to be a minefield of drivers and apps both thinking it is their job to re-scale 16-235 content to 0-255, or both thinking it isn't...

I try never to touch the brightness and contrast controls in player apps or drivers.

This stuff is digital. With Brightness and Contrast inhibited or at their default positions the system should just operate in "straight through" mode.

If I play a DVD or Blu-ray with a full-screen at 16, I expect a full screen of 16s to come out of my HDMI output, and I expect my TV to display this at black level (I should not be able to see anything below this level, if I can my TV black level is wrongly set).

If I play a DVD or Blu-ray with a full-screen at 235, I expect my HDMI output to sit at 235, and I expect my TV to display this as white. (There are arguments about what should happen >235, just as there are with analogue video signals above 0.7V)

I keep meaning to take my HTPC into work and run the HDMI output into an HDMI->HD-SDI converter and look at the levels on a proper scope. (I've connected my PC to HD-SDI VTRs and Servers via this route in the past to record its output)
 

PsyVision

Distinguished Member
So you calibrated the display first with a standalone BD player as the source? That makes sense if your display doesn't have settings for each input. My PJ will remember the settings for each input, so I can calibrate it for the SkyHD box (easier said than done :suicide:) and leave it on stock settings for the HTPC input, as it seems to give better results calibrating it in the AMD control center settings.

Yes, I initially did this before I had my HTPC and I duped the settings across all inputs. I would prefer this to doing it on the HTPC because if driver updates, reinstalls etc reset settings then it's a bit of a pain changing them back.

What decoders/player software are you using? I played the test patterns in 7MC with the DXVA ffdshow decoder.

I tested the MP4 download with MPC-HC using the standard settings. I don't have ffdshow installed at the moment. I am currently switching from 7MC to mediaportal so I haven't really done much with decoders, renderers etc because I will probably do a fresh install when MP 1.2 goes from beta to final. I am trying to judge what filters etc I want to use given what I want to do and so have been following developments with ffdshow, lav etc on these forums. I may set mediaportal to use MPC-HC externally and will then configure that.

I want to get in touch with jameson at some point as he uses MP so will probably know what's best.
 

djcla

Distinguished Member
When you set up 7MC TV do you set it to television or digital flat panel? I notice it tries to auto-adjust the display but I'm not sure what it's changing.

Everything is now set to limited and I adjust the display controls. I can see all the white bars in the white clipping pattern with display contrast set to 0 and at the top setting; is that right? Reading the manual it seems correct. I have left CCC brightness etc. at default now.
 

wolvers

Distinguished Member
So are you both of the opinion that it is better to make the fine adjustments to brightness and contrast on the display?
 

PsyVision

Distinguished Member
In terms of PQ I'm not sure it makes much difference. In terms of it being separate to whatever I do to my HTPC - I prefer the TV.
 

jameson_uk

Well-known Member
So are you both of the opinion that it is better to make the fine adjustments to brightness and contrast on the display?

now this all depends and is why professional calibrators charge a lot of money :)

If the HTPC is your only source then the best place to make the adjustment will be the TV. If you have multiple sources then it gets more tricky...

Also the video processing chips that seem to exist in most amps now can start changing things...

The black and white levels are only the start. To get the most out of your picture you need to understand the whole lot (http://www.avforums.com/forums/room...ion/760012-greyscale-calibration-dummies.html cannot be recommended enough). Although you have a brightness and a contrast setting, most TVs actually store these settings against Red, Green and Blue, but they are only accessible via the service menu (my Samsung LED in the bedroom actually does show these in the menu though).

If you calibrate your set by eye to just make 16 and 235 look about right I think you could end up with a setting that is still really bad. You can end up with all sorts of odd things happening and only calibrating at two points might not be enough (although is probably getting there).

I understand the theory of all this, but what I really struggle with is how the PC can affect it. I guess with a standalone Blu-ray player you can assume the output is correct and calibrate around this. On a PC you cannot make this assumption, as it might be doing all sorts of messing around with the video. This makes me slightly wary of calibrating from the PC and the AVS disc: how can I be sure that it actually is outputting the levels it says it is? If I calibrate from another source though, this will then not be right with the HTPC...

I am often tempted to fork out for someone to come round and calibrate the TV but I don't think any of the calibrators will understand the PC impact on all of this...
 

PsyVision

Distinguished Member
Yes, to clarify, the settings I have set are full calibration and include tweaks to the red, green, blue etc as described by jameson as well as the brightness and contrast.
 

djcla

Distinguished Member
wolvers said:
So are you both of the opinion that it is better to make the fine adjustments to brightness and contrast on the display?

I think so, yes. I end up with contrast higher than default and brightness fractionally lower on the TV, with all PC settings set to limited. Default on the TV is in the middle for both.
 

djcla

Distinguished Member
jameson_uk said:
now this all depends and is why professional calibrators charge a lot of money :) ...

I have also wondered if the pros would understand the HTPC side of things if I ever forked out the cash. I'd be interested to know, as the HTPC is my only source. Do any post on these forums?
 

wolvers

Distinguished Member
If the HTPC is your only source then the best place to make the adjustment will be the TV. If you have multiple sources then it gets more tricky...

As I mentioned already, that's not an issue for me as my display will remember the settings for each input, and by that I mean the type of input as opposed to the socket: I only use the one HDMI socket as the source is switched by the amp, but the projector knows whether it is getting the signal from SkyHD or the HTPC. Probably goes on EDID.

Also the video processing chips that seem to exist in most amps now can start changing things...

Yep, I need to check if this is having any effect by bypassing it. That's on the job list! :rolleyes:

The black and white levels are only the start. To get the most of your picture you need to understand the whole lot (http://www.avforums.com/forums/room...ion/760012-greyscale-calibration-dummies.html can not be recommended enough). Although you have a brightness and a contrast setting most TVs actually have these settings stored against Red, Green and Blue but they are only accessible via the service menu (my Samsung LED in the bedroom actually does show these in the menu though)

If you calibrate your set by eye to just make 16 and 235 look about right I think you could end up with a setting that is still really bad. You can end up with all sorts of odd things happening and only calibrating at two points might not be enough (although is probably getting there).

My intention is to move onto colour calibration next but I felt that making sure that the greyscale was correct at the most basic level was something that was often overlooked, certainly by me. :suicide:

I understand the theory of all this but what I really struggle with is how the PC can affect it. I guess with a standalone bluray player you can assume the output is correct and calibrate around this. On a PC you can not make this assumption as it might be doing all sorts of messing around with the video. This makes me slightly wary of calibrating from the PC and the AVS disc as how can I be sure that it actually is outputting the levels it says it is??? If I calibrate from another source though this will then not be right with the HTPC.....

I find the idea of calibrating a display to one source first a bit odd. How do you know that source is correct in the first place? Is it more reliable than a decent PC's GPU? They must all be slightly different, otherwise all those reviews are a waste! :p

I'd like to see what Stephen comes up with if he is able to check the output of his HTPC properly. :hiya:
 

joeydrunk

Active Member
I'm glad you started this thread wolvers, hopefully I can learn some things from what's posted and contribute too. I've been asked to help write a tutorial/guide for madVR covering the setup of MPC-HC/madVR/LAV, mostly focusing on the settings inside madVR. I have a good friend who has been in the professional audio/video industry for decades. He's going to drop by next week with his professional calibrating gear. We're going to mess around with the display and AVR settings in conjunction with madVR's calibration settings. I'm going to spend the next few weeks reading and studying everything I can get my hands on regarding the subjects at hand, including chroma/luma up/down sampling, 3DLUT, scaling, gamma correction, GPU shaders, calibrating madVR/display, YCbCr>RGB conversion, YV12/NV12, 4:2:0-4:4:4, etc. I'm looking forward to contributing here as well as educating myself.
 

pRot3us

Distinguished Member
I'd certainly say calibrating the display is essential, especially if using a projector.

For example, if you're leaving your display's brightness at zero when the correctly calibrated setting is, say, -15, and calibrating the brightness/contrast on your PC instead, then any material viewed will be correct within the video source but way off from the best possible image your display can produce. Your black levels would be washed out and you would never see deeper blacks than the zero setting on the display allows.

That's before getting onto more advanced settings like gamma and iris settings that your display may also have.
 

Stephen Neal

Distinguished Member
Here's some (slightly abridged) background that might be useful for people new to this area.

Digital video - as used by broadcasters in studios, when recording on standard SD/HD camcorders on-location etc is based around two main standards. ITU/Rec 601 for SD and ITU/Rec 709 for HD.

These are both based around 16-235 level space for Y (luminance/brightness) and 16-240 (centred around 128) for the Cr (weighted R-Y) and Cb (weighted B-Y) colour difference signals. If an RGB representation is being used (unusual as an interconnect, but not internally) then 16-235 levels are used for all three RGB signals.

The relationship between YCrCb and RGB differs between 601 and 709.

In 601 : Y=0.59G+0.30R+0.11B
In 709 : Y=0.72G+0.21R+0.07B

All broadcast video sources will be in either 601 YCrCb (SD stuff like DVDs and SD TV broadcasts) or 709 YCrCb (HD stuff like Blu-ray and HDTV broadcasts). They will all be in 16-235/16-240 level space for Y and CrCb respectively. (709 derives luminance more from the green and less from the red and blue than 601 does)

Within studios it is usual to carry the YCrCb as 4:2:2 in both SD and HD, which means the luminance is carried at twice the horizontal resolution/bandwidth of the chrominance, but at the same resolution vertically. (For every 4 Y samples per line there are 2 Cr and 2 Cb samples; the chroma is sub-sampled from a 4:4:4 source.) This means the chroma resolution is very asymmetric.

However for broadcast the chroma is also subsampled vertically to create 4:2:0. This means that for every 4 Y samples per line there are 2 Cr samples OR 2 Cb samples, with Cr and Cb alternating on a line-by-line basis.

Thus 4:2:0 has a quarter of the chroma resolution of 4:4:4 (half the horizontal and half the vertical).

So DVDs, Blu-rays and SD/HDTV all reach our HTPCs as 4:2:0 16-235, in either SD 601 or HD 709 colourspace. We have to re-sample the chroma to 4:2:2 YCrCb, 4:4:4 YCrCb or 4:4:4 RGB, and have to cope with a mix of 601 and 709. (If you are playing an SD 601 DVD on your HTPC and your HTPC is running YCrCb output at 1920x1080, which your TV will expect to be HD 709, it is likely that you will need a 601 to 709 colourspace conversion; you can't just pass the YCrCb through untouched.)
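That 601-to-709 re-matrix amounts to decoding with one set of constants and re-encoding with the other. A sketch using normalised 0..1 signals (the helper names are illustrative; a real implementation also handles the 16-235/16-240 offsets, subsampling and rounding):

```python
# Re-matrix 601 YCbCr to 709 YCbCr by round-tripping through RGB.
# Kr/Kb are the standard luma weighting constants for each standard.

COEFFS = {"601": (0.299, 0.114), "709": (0.2126, 0.0722)}

def decode(y, cb, cr, std):
    kr, kb = COEFFS[std]
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

def encode(r, g, b, std):
    kr, kb = COEFFS[std]
    y = kr * r + (1 - kr - kb) * g + kb * b
    return y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr))

def rematrix_601_to_709(y, cb, cr):
    return encode(*decode(y, cb, cr, "601"), "709")
```

Greys pass through unchanged, but coloured pixels get different Y/Cb/Cr values, which is why passing 601 YCrCb straight through as if it were 709 skews the colours.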

You also have to ensure that the 16-235 levelspace is observed.

There are many pitfalls in these processes. There is a widespread bug in 4:2:0 to 4:2:2/4:4:4 chroma upsampling caused by a misunderstanding of the siting of chroma sampling points, which many hardware and software devices still get wrong. There are issues with 16-235 vs 0-255 level space. There are issues between SD 601 and HD 709 YCrCb to RGB matrices differing.

Additionally, we all set-up our equipment differently, and there is also personal preference in how we want to 'see' the same picture.

I route my HTPC through an AV amp, so the same HDMI input on my TV is used to display my HTPC, my Popcorn Hour, my PS3 and my Sony Blu-ray/DVD player. I therefore need my PC output to be in the same levelspace as the other equipment I use, and calibrate my TV to work with all of them not just one of them.
 

joeydrunk

Active Member
Stephen Neal said:
Here's some (slightly abridged) background that might be useful for people new to this area. ...

This is some excellent information, even though I have little idea what you mean! Lol!! I recognize all the terms and have a little bit of an idea of how things work.



Do you know of any good reading (links) to start getting a thorough idea of what's involved with this stuff? Novice to intermediate?
 

djcla

Distinguished Member
One thing to note re the CCC: any changes seem to need to be set for each resolution, frequency etc.
 
