In reply to: "How does this all tie in with Adobe RGB Colour Space, i.e. when monitors are listed as having 92% or 99% coverage of the Adobe RGB spectrum and they support 10 or 12 bpc?"

Adobe RGB is similar to DCI-P3 but extends further into the greens; Google the two to see a comparison. Bit depth has nothing to do with colour space. The colour space of a screen is a physical property, determined (on an LCD) by the properties of the backlight and the colour filters.
What are the advantages of 10-bit monitors?
Does the GeForce 900 series support 10-bit colour depth?
In reply to: "The bit depth is just how many gradations you have. A useful analogy is this: think of a staircase. The overall height of the staircase is the colour space; the number of steps is the bit depth. From this you can see that the number of steps (bit depth) has no effect on the overall height of the staircase (colour space). The bit depth just defines how many individual colours within a colour space a screen can show."

That's a great analogy for bit depth, thanks.
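The staircase analogy can be made concrete with a quick calculation: each channel has 2^bits steps, and that count grows with bit depth regardless of which colour space (staircase height) the display covers. A minimal Python sketch (the function name is my own):

```python
# Bit depth sets how many gradations (steps) each channel has,
# independent of the colour space (the height of the staircase).
def levels_per_channel(bits: int) -> int:
    return 2 ** bits

for bits in (8, 10, 12):
    steps = levels_per_channel(bits)
    total = steps ** 3  # three channels: R, G and B
    print(f"{bits}-bit: {steps} steps per channel, {total:,} colours")
```

Whether those colours span sRGB or Adobe RGB is decided by the panel's gamut, not by the bit depth; a wider gamut at the same bit depth simply spreads the same number of steps over a taller staircase.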
In reply to: "Just one specific question. On the graphs, what are the quantities x and y on the axes, and what are their physical units (voltage, cd/m2, etc.) please?"

In answer to your question Ken, the x and y axes on the graphs are the CIE chromaticity coordinates, which are dimensionless ratios rather than physical quantities; together they describe hue (sometimes called tint) and saturation (sometimes called colour). The third axis in a three-dimensional colour volume graph is luminance (sometimes called brightness), measured in cd/m2. You will undoubtedly have seen saturation/colour and hue/tint as controls on a TV, and a good colour management system will have separate controls for saturation/colour, hue/tint and luminance/brightness.
In reply to: "I have simplified the article in places to make it easier for a layperson to understand."

Given the reactions in the thread, that was clearly the right thing to do. Apologies for misunderstanding the AV Forums audience and trying to pull you in the other direction; lesson learnt.
In reply to: "I have never, in twenty years, been in a design studio where anyone had any idea about display calibration or colour management."

Jesus, that's scary. What do they teach people in art/design colleges these days? Seems the lecturers are as clueless as the students, most likely hippy painters with no tech knowledge at all. Those design studios must be sending out some awfully badly coloured stuff then; how do clients and printers never complain about colours being all over the place? I knew all about colour management and calibration all the way back in the Amiga 1200 days, with print management software I bought that included an excellent manual of over 100 pages!
In reply to: "I think I did my ISF training with you, and I only just followed this."

Also annoying that I understood all that (including Manni's comments) perfectly and you only just followed it, yet you are supposedly ISF certified and I am not (because it costs shed loads to get the certification).
In reply to: "You mention chroma sub-sampling. I've seen these x:x:x ratios before but have no understanding of what it all means. Is this something exclusive to Harley-Davidson riders with all that chrome, or is it significant to the rest of us? Or is it best left to one side and embrace matte black!?!"

Like the video bit depth, chroma sub-sampling is related to the colour gamut but outside the scope of the article. Essentially, chroma sub-sampling is a way of compressing colour data by taking advantage of how the human eye works: we are very good at noticing differences in luminance (brightness) but not so good when it comes to chroma (colour). A 4:4:4 signal is basically uncompressed, with a full-resolution luminance channel and two full-resolution chroma channels, whereas 4:2:0 (which is what Blu-ray and Ultra HD Blu-ray use) keeps the full luminance channel but stores both chroma channels at a quarter of the resolution (halved horizontally and vertically), which reduces the bandwidth by half compared to a 4:4:4 signal. The player can then interpolate much of the missing chroma information and output the signal as 4:2:2 or, in some cases like the Panasonic UB900, as 4:4:4. It won't have the fidelity of a signal that was actually encoded at 4:4:4, but the differences shouldn't be perceptible.
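The bandwidth saving is easy to verify by counting samples in a 2x2 pixel block, which is how the J:a:b notation works out in practice. A small Python sketch (the table and names are my own illustration):

```python
# Samples per 2x2 pixel block (Y, Cb, Cr) for common sub-sampling schemes.
# 4:2:0 keeps one Cb and one Cr sample per 2x2 block, so the total
# bandwidth is half that of 4:4:4 while luma stays at full resolution.
SCHEMES = {
    "4:4:4": (4, 4, 4),
    "4:2:2": (4, 2, 2),
    "4:2:0": (4, 1, 1),
}

FULL = sum(SCHEMES["4:4:4"])  # 12 samples per block, the uncompressed baseline

for name, (y, cb, cr) in SCHEMES.items():
    total = y + cb + cr
    print(f"{name}: {total} samples per 2x2 block, {total / FULL:.0%} of 4:4:4")
```

This is only a sample count, not a real encoder; actual players reconstruct the discarded chroma by interpolation when upsampling to 4:2:2 or 4:4:4 for output.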
Call it irrational fear or not, as TVs are getting capable of brighter and brighter images, I wonder if we are going to have our retinas burnt out at some point! Whilst indoors with light levels much lower than the great outdoors in bright sunshine, our irises open to allow more light to travel through our pupils. Does this make our eyes more sensitive and subject to potential damage from HDR levels of strobing or other extremely dynamic effects? I wonder if going for OLED would be a good thing just because they're not capable of going beyond 600 nits yet, so a limitation becomes a good safety feature. What are your thoughts here?