Colorimeter confusion

Dick Emery

Active Member
From what I know, when using a colorimeter such as an X-Rite Eye-One you have to use a computer in order to calibrate a display. What confuses me is: how do you know the output of the video card is correct to begin with? I can see how the two may work together, but once you switch from the video card to another source (like a different HDMI input or the internal tuner), how do you know that it is correct? Surely the display is only calibrated against the video card output and nothing else?

Also, from what I know, Nvidia, ATI and Intel are not equal in terms of the way they output colorspace and levels. I know that my HDTV's EDID is used by my Nvidia card to apply a colorspace, for instance.

Basically I need an explanation of how calibration deals with varying sources.
 

andy1249

Distinguished Member
I don't know the particular device you're talking about, but in general this is how it goes.

Video has standards, such as Rec. 709.

In these standards, red, green and blue each have specific target values (chromaticity and luminance) when shown correctly on the screen.

The colorimeter measures these values: a program is run that outputs the colours one by one, the colorimeter measures each against the known standard for that colour, and you adjust your device to get as close to the standard as possible.

No colorimeter relies on the output of a video card directly; it measures straight off the display and verifies against known good values.
Any corrections are then made to the output of the video card or the display, and this adjustment loop continues until the display is accurate.
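
To make that concrete, here's a rough Python sketch of the comparison step. The readings are invented, and real calibration software uses a proper delta-E formula in a perceptually uniform space rather than raw xy distance:

```python
# Rough sketch of the measure-and-compare loop. The measured value is made
# up; a real colorimeter returns CIE xyY (or XYZ) readings, and real software
# uses a proper delta-E formula rather than raw xy distance.

# Rec. 709 chromaticity targets (CIE 1931 xy) for the primaries and white point
REC709_TARGETS = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}

def xy_error(measured, target):
    """Crude Euclidean distance in xy space (stand-in for a real delta-E)."""
    return ((measured[0] - target[0]) ** 2 + (measured[1] - target[1]) ** 2) ** 0.5

# Hypothetical reading taken off the screen while a red patch is displayed
measured_red = (0.655, 0.325)

err = xy_error(measured_red, REC709_TARGETS["red"])
print(f"red error: {err:.4f}")  # adjust the display, re-measure, repeat
```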


Also, from what I know, Nvidia, ATI and Intel are not equal in terms of the way they output colorspace and levels.

You're right in the sense that hardly any piece of gear is set up to reproduce accurate colours out of the box. They are nearly always set up to overcome the bright lights on the shop floor, so almost all displays need to be calibrated once they are where they are going to operate.

For computer screens and TVs, most people will never do this. It's usually only done by the people who care, or who need accurate colour for work purposes, such as photographers and so on.
 

arfster

Active Member
Also, from what I know, Nvidia, ATI and Intel are not equal in terms of the way they output colorspace and levels.


Levels:
Nvidia cards will contract levels whenever an HDTV is connected by HDMI. Recent drivers will expand the levels back again for video alone, so the end result is still 16-235 video, but because the expansion comes before the contraction, BTB/WTW get clipped and you also get banding. The solution is to create a custom resolution in the control panel, so the card identifies the HDTV as a monitor and leaves the levels alone. Make sure you get the frequencies/polarities/etc. right (see your TV manual).
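
To see why the ordering matters, here's a rough Python sketch of the level maths. The exact driver behaviour is assumed, but the scaling arithmetic is standard:

```python
# Rough sketch of what level scaling does to an 8-bit signal.

def expand(v):    # video levels 16-235 -> PC levels 0-255
    return max(0, min(255, round((v - 16) * 255 / 219)))

def contract(v):  # PC levels 0-255 -> video levels 16-235
    return round(v * 219 / 255) + 16

# Expansion clips blacker-than-black and whiter-than-white:
print(expand(8), expand(245))   # -> 0 255, the BTB/WTW headroom is gone

# Contraction squeezes 256 input codes into 220 output codes, so pairs of
# adjacent codes collapse into one -- visible as banding on a smooth ramp:
out = [contract(v) for v in range(256)]
print(256 - len(set(out)), "codes lost")   # -> 36
```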

ATI, when I last used them, expanded HD video levels by default, but with HDMI connected reversed it with a contraction, with the same results as above (clipping + banding). Not sure if there is a solution.

w6rz.net has test patterns; grab the one labelled:
Level 1 to 254 Ramp and -5 to 105 IRE Bars (5 IRE steps) 1920x1080

The dots are 16 and 235, and the smooth ramp at the top makes banding obvious.
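
If you want to knock up a quick ramp of your own, something like this works. It's a bare-bones sketch writing a binary PGM file; the w6rz.net patterns are properly authored, so prefer those:

```python
# Bare-bones greyscale ramp written as a binary PGM file.
WIDTH, HEIGHT = 1920, 120

with open("ramp.pgm", "wb") as f:
    f.write(b"P5 %d %d 255\n" % (WIDTH, HEIGHT))
    row = bytes(x * 255 // (WIDTH - 1) for x in range(WIDTH))  # 0..255 sweep
    f.write(row * HEIGHT)
# A smooth ramp like this makes any banding introduced by level
# conversions in the chain immediately obvious.
```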


Colourspace:
Again from w6rz.net, grab the ones named:
Rec. 601 75% Color Bars with Pluge 1920x1080
Rec. 709 75% Color Bars with Pluge 1920x1080
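
The reason there are two sets of bars: the same Y'CbCr pixel decodes to different RGB depending on which matrix the display or player assumes. A minimal sketch (the pixel value is made up):

```python
# The same Y'CbCr value decoded with the Rec. 601 and Rec. 709 matrices
# gives visibly different RGB -- which is why the colorspace must match.

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Normalised (0-1) Y'CbCr -> R'G'B', with cb/cr centred on 0."""
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

pixel = (0.5, 0.1, 0.3)  # an arbitrary made-up Y'CbCr value

print(ycbcr_to_rgb(*pixel, kr=0.299,  kb=0.114))   # Rec. 601 decode
print(ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722))  # Rec. 709 decode
```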
 

Dick Emery

Active Member
I don't suppose you know the timings for a Panasonic TX-P50G20B, do you? The manual mentions 1125 vsync but nothing else really. When I check the current timings for 24Hz I get: horizontal front porch 638, vertical front porch 4, horizontal sync width 44, vertical sync width 5, horizontal total 2750 pixels, vertical total 1125 lines. This is progressive, 32-bit, and gives a 24.00 refresh rate with a pixel clock reading of 74.2500.
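
For what it's worth, those figures match the standard CEA-861 1080p/24 timing, and the refresh rate follows directly from the totals and the pixel clock:

```python
# Sanity check: refresh rate = pixel clock / (horizontal total * vertical total)

pixel_clock = 74.25e6   # Hz, as read from the driver
h_total = 2750          # total pixels per line (active + blanking)
v_total = 1125          # total lines per frame

print(pixel_clock / (h_total * v_total))   # -> 24.0 Hz exactly
```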

What I tend to do is use MPC-HC to play with an SD/HD shader that maps 16-235 -> 0-255, which gives darker blacks and punchier colours. However, I do not know if this is correct.
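
For reference, this is roughly what that shader does per channel (a Python sketch of the maths). Whether it's correct depends on the rest of the chain: if levels are already being expanded elsewhere, this double-expands and crushes shadow and highlight detail, which may be why it looks punchier:

```python
# What a 16-235 -> 0-255 shader does per channel (normalised 0-1 values).

def expand_levels(c):
    """Map video levels (16/255 .. 235/255) to full range 0-1, clamped."""
    lo, hi = 16 / 255, 235 / 255
    return min(1.0, max(0.0, (c - lo) / (hi - lo)))

print(expand_levels(16 / 255))   # -> 0.0 (video black becomes full black)
print(expand_levels(235 / 255))  # -> 1.0 (video white becomes full white)
```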
 

Gordon @ Convergent AV

Distinguished Member
AVForums Sponsor
Dick: You are correct. Calibration is of the display chain... not just the display. I have found that different media players can affect the levels output by PCs, and in my experience as a calibrator, PCs are the least consistent piece of video electronics I come across. I would not be in a hurry to use a PC as a test pattern generator myself.
 

Dick Emery

Active Member
I thought so. So all those people using colorimeters are under a false impression, as you are only calibrating against the video card output; once you switch to another source, that may be incorrect. Surely the only way to calibrate is with a professional pattern generator that hooks up to the HDMI/component inputs and has known tolerances? I assume professional ISF engineers use proper calibration hardware? Or should you get suspicious if they turn up with a laptop and an Eye-One?
 

arfster

Active Member
I thought so. So all those people using colorimeters are under a false impression, as you are only calibrating against the video card output.

That depends on whether it's properly set up or not. You do have to be careful with updating drivers and so on; Nvidia and ATI can be pretty careless. However, it's relatively easy to compare the readings you get from an HTPC against, say, a DVD/Blu-ray player, to check all is OK.

Most of this has been pretty familiar territory for HTPC users for at least five years now. There are fixes for pretty much everything.
 

andy1249

Distinguished Member
Most of this has been pretty familiar territory for HTPC users for at least five years now. There are fixes for pretty much everything.

Yes, this is true. Using ffdshow, experienced HTPC users will make sure the graphics card is outputting the correct levels (16-235 rather than 0-255) for the content that is actually playing.

In such cases the graphics card output can be properly calibrated.

Likewise, if a workstation is being used for a critical task, such as colour-correcting photos, the graphics card will be set up and the screen calibrated according to the requirements of that task.

Unlike a dedicated piece of equipment, though, a computer is more likely to have one of the pieces in the chain change. This could be the graphics card, the drivers for the graphics card, or the display/display driver.
Of course, any one of these changing means calibration has to be done again.

It is even the case, if you have your computer set up to do so, that a driver can update without you knowing about it.
 

Rickyj at Kalibrate

Distinguished Member
AVForums Sponsor
I thought so. So all those people using colorimeters are under a false impression, as you are only calibrating against the video card output; once you switch to another source, that may be incorrect. Surely the only way to calibrate is with a professional pattern generator that hooks up to the HDMI/component inputs and has known tolerances? I assume professional ISF engineers use proper calibration hardware? Or should you get suspicious if they turn up with a laptop and an Eye-One?

Not necessarily, as most people will not be calibrating using the PC to create the patterns. I would imagine most people use a DVD/Blu-ray player or video processor to create the patterns, thereby eliminating the problem of the video card. The colorimeter reading is not influenced by the video card ;)
 

Gordon @ Convergent AV

Distinguished Member
AVForums Sponsor
Not necessarily, as most people will not be calibrating using the PC to create the patterns. I would imagine most people use a DVD/Blu-ray player or video processor to create the patterns, thereby eliminating the problem of the video card. The colorimeter reading is not influenced by the video card ;)

I understood from Dick's comments (he is awful, but I like him) that he is suggesting that calibrating using an HTPC and then presuming the calibration will be valid for other input sources is incorrect. In which case this is true... just as it is true that calibrating using a BD player and assuming that will be perfect for your HTPC source is also incorrect. Everything needs to be checked, wherever possible.
 

sniffer66

Distinguished Member
Taking this further (and given I have an HTPC), what would you do as a best guess for calibrating a Sky HD input?

I have a PS3 in another room, which is at least "known" if not necessarily good. I can't afford a signal generator, so I have been using the PS3 at 16-235 and RGB to calibrate the Sky input as a best guess.

Is this the best I can do?
 

sniffer66

Distinguished Member
From my experience you can use a PS3 as a test pattern generator for your Sky box input, then just check the Sky box at a few different levels to confirm the colour of the luminance is correct. You can't check gamma, of course.

Thanks Gordon :smashin:

Would be great if Sky would do some decent recordable test patterns. If they took away the Myleene Klass fluff they did last time, how hard could it be? :confused:
 
