Sky HD vs SD Colour?


John0902

Guest
I'm awaiting Sky HD install next week.

Does anyone know what R-G-B encoding is used on Sky HD? I'm wondering how many colours can be displayed and how it compares to Sky SD. My new Samsung 50 inch HD plasma claims to handle each of R-G-B at 13-bit resolution, giving something like 549 billion colours! So I was just curious whether that's how Sky HD encodes colour. It seems that, as well as the extra picture detail, there is greater colour depth - does anybody notice it?

Are there specs anywhere on the full signal encoding for Sky HD and SD?
 
Yeah, you definitely see the greater colour depth - you only have to watch a footie match in HD to see that. The colours seem to jump out of the screen; some of the players' shirts really stand out, and during the World Cup the refs who wore the fluorescent yellow shirts looked amazingly bright and colourful.
 
Thanks Matt, I'm looking forward to all the extra colour detail.

From what I can tell from research, HDTV (and presumably Sky HD) transmits in 24-bit colour - i.e. 8 bits per pixel for each of Red, Green and Blue. This means about 16.7 million colours, I believe. I think I saw somewhere that SDTV is like VGA - 4 bits per pixel for each of R-G-B. If this is true then it would be just 4,096 colours on SDTV, which seems too low.
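
Just to sanity-check my own arithmetic, here's a rough Python sketch - the bit depths are only the ones mentioned above (plus my plasma's claimed 13 bits), not anything official from Sky:

    # Total colours for a given number of bits per R/G/B channel.
    # Bit depths are just the ones discussed in this thread, not official Sky specs.
    def total_colours(bits_per_channel):
        return (2 ** bits_per_channel) ** 3  # all R x G x B combinations

    for bits in (4, 8, 13):
        print(f"{bits} bits/channel -> {total_colours(bits):,} colours")

    # Output:
    # 4 bits/channel -> 4,096 colours
    # 8 bits/channel -> 16,777,216 colours
    # 13 bits/channel -> 549,755,813,888 colours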

Could someone comment or correct me on this?
 
Assuming you're talking digital transmission, 24-bit colour is standard for transmission in both SD and HD, giving a potential 16.7 million colours.

TVs have different renderings of that... the theoretical capabilities are all at least 24-bit (excepting some early LCDs that dropped to 6 bits per channel to increase refresh speed), but manufacturing quality, design and simply poor source material can reduce it.

CRTs vary. The average 'Argos' CRT TV is likely to be showing tens of thousands of colours even if it's well adjusted. A high-end CRT can beat the crap out of either an LCD or plasma even now (SED display technology is on the horizon that can match or beat CRT on a flat panel... but then it's been on the horizon since 1999).

The human eye can only distinguish about 10 million colours, so 24-bit would seem to be more than enough, but it's not the whole picture - in greyscale, for example (all three channels equal), that equates to only 256 greys, so you'd see significant banding even at 24-bit. The effect is similar in single colours. That's why high-end design packages go up to 16 bits per channel, and the really expensive ones are going higher. Consumer equipment is unlikely to be able to render this for some years, though.
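
If you want to see why the banding shows up, here's a quick (purely illustrative) Python sketch of grey levels versus bit depth:

    # Illustrative only: distinct grey levels at various bit depths, and the size
    # of one brightness step as a fraction of the full black-to-white range.
    for bits in (6, 8, 10, 16):
        levels = 2 ** bits
        step = 1.0 / (levels - 1)  # one quantisation step, full range = 1.0
        print(f"{bits}-bit greyscale: {levels:,} levels, step = {step:.6f}")

    # At 8 bits that's 256 levels with steps of roughly 0.4% of full range - big
    # enough to show as banding on a smooth gradient; at 16 bits the steps are
    # about 256 times finer.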

Colours generally aren't the problem - a common mistake is to put the TV into 'dynamic' mode and make the colours garish and bright (stores do that a lot... it makes the picture stand out, but it looks truly awful), so you end up with a completely unnatural picture that doesn't reflect what was originally filmed. Ideally you want to start at 'normal' and make small adjustments to suit your display and environment. Using something like DVE helps.
 
Technically, digital SD and HD have the same colour depth. However, I think with HD they may be using a more suitable colour profile rather than just assuming everyone will be using an average CRT.

Still, make no mistake, a decent CRT is still the best display available. It's just a shame there are no good ones being produced these days, let alone good HD ones; CRT is going the way of LP records.
 
The colour is better in HD simply because the colour resolution is so much higher.

In both consumer SDTV and HDTV, colour resolution is sampled at one quarter of the luminance (or greyscale) resolution - half horizontally and half vertically (4:2:0 chroma subsampling) - because the eye is more sensitive to brightness than colour. In SD the colour resolution is 360 x 288 - pretty low. In 1080i the colour resolution is 960 x 540 - much higher, hence the much richer colours.
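
If it helps to see the arithmetic, here's a small Python sketch of that subsampling (frame sizes as above - an illustration, not a broadcast spec):

    # Chroma (colour) plane size when colour is sampled at half the luma resolution
    # in each direction, i.e. one quarter of the pixels, as described above.
    def chroma_plane(width, height):
        return width // 2, height // 2

    for name, (w, h) in {"SD (576i)": (720, 576), "HD (1080i)": (1920, 1080)}.items():
        cw, ch = chroma_plane(w, h)
        print(f"{name}: luma {w}x{h}, chroma {cw}x{ch}")

    # SD (576i): luma 720x576, chroma 360x288
    # HD (1080i): luma 1920x1080, chroma 960x540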
 
Okay, thanks guys. So SD and HD use 24-bit resolution per pixel for colour. I'm not sure I follow Quickbeam's point - it seems to be referring to the number of pixels horizontally and vertically. I assumed the range of colours is only a function of the number of bits per pixel devoted to colour - 24 bits. How is luminance encoded? Is that 24-bit also in SD and HD? If the luminance range is greater in HD then I can see how colours might appear richer.

I'm still a bit confused as to why colour is richer in HDTV.
 
