Deep color?

There's a table about halfway down the page here that shows what features were added between the different HDMI version numbers.

HDMI - Wikipedia, the free encyclopedia

The big ones between 1.2 and 1.3 were the lossless audio codecs.

Deep color is one of those little-used features that was added, much like the Ethernet channel in 1.4.

Makers of pre-made content would have to change the whole production line to introduce that feature, and it would cost millions, something they are not prepared to do. So very, very little deep color content is out there.

Some devices can generate deep color content, e.g. some graphics cards, some camcorders and so on, but DVDs and Blu-rays don't support it, and as far as I know there are no plans to do so.
 
Thanks. So not really an issue.
Just wish an expert could tell me which one to keep, the W1200 or W5000. The W5000 seems to have noise (maybe I could sort that); the W1200 looks great but doesn't look as cinematic as the W5000.
 
Think you answered your own question there then. If the W5000 looks 'more cinematic' then surely that's the one to go for. If it has a noise, then isn't it covered by warranty?

FWIW there is no current domestic film source with deep colour so there is little point in worrying about having it: if you have a video processor then you can 'upscale' the signal to deep colour, which may reduce banding in certain scenes, but only if the display can accept deep colour and doesn't drop it back down during the internal processing stages.

I tried various tests a few years back using a DVDO Edge VP, which allowed the output to be set to 24, 30 or 36 bit. My old JVC HD350 accepted up to 36 bit and I paused various scenes with banding in them, but it made little or no difference, perhaps because the banding was encoded into the disc. It also made dropouts more likely as it was increasing the data rate sent down my 12 metre HDMI cable, hence I reverted to standard after the test.
 
Kelvin, what do you mean by 'There is no current domestic film source with deep colour'? Even my old Panasonic DMP-BD60 has deep colour!?

Also I'm surprised you can see much banding even with only millions of colours?!
 
Sorry, I've just typed a long reply elsewhere, so I'll keep this short: the native colour depth on BluRay (and DVD for that matter) is 8 bits per RGB channel, which is 24 bits per pixel. Your player can do what my Edge VP did and 'upscale' the colours to 30 or even 36 bit depth, but the source remains 8 bit RGB / 24 bit depth. Some displays internally perform a similar process anyway, so it's more a question of which device does it better.
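
Just to put a rough number on what that 'upscaling' is doing, here's a quick sketch in Python. It's only an illustration of the general idea of scaling 8 bit values into a wider container, not what any particular player actually does, but it shows that the same 256 input levels are simply re-expressed on a finer scale, so no new colour information appears.

# Rough sketch: re-expressing an 8-bit code value in a 10- or 12-bit container.
# No new colour information is created - the same 256 levels just get spread
# over a finer scale, which is why the source stays effectively 8 bit / 24 bit.

def expand_8_to_n_bits(value_8bit, target_bits=10):
    """Scale an 8-bit code value (0-255) into an n-bit range (0 to 2^n - 1)."""
    max_out = (1 << target_bits) - 1
    return round(value_8bit * max_out / 255)

print(expand_8_to_n_bits(128, 10))   # 514 out of 1023
print(expand_8_to_n_bits(128, 12))   # 2056 out of 4095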

I could occasionally see banding on sunset-type scenes on various projectors, from the AE1000/2000/3000 to the HD350, plus my various TVs. It's not a massive issue and I'm sure many people wouldn't notice it either, but I could see it occasionally so I spent some time trying to improve it via the VP (even my Lumagen Mini3D couldn't fully resolve it as IMHO it is inherent in the source on occasions).

Hope that helps. :)
 
Yeah, on my W1000+ the sky shows banding and I have to fiddle for ages to get it right. It narks me no end.
You think it's the pj?
PS the reviews of the W5000 say it had noise but was super crisp.
 
Some of it is inherent in the source, sometimes it's due to issues such as going from RGB to YCbCr and back to RGB, or similar factors. Discs such as Spears & Munsil have various test patterns that can help expose errors in the video chain, which in turn can help work out the best setting for each particular equipment combination.
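
To give a feel for why the RGB to YCbCr and back conversions can nibble at the picture, here's a toy round trip in Python using BT.709 coefficients, full-range values and rounding to 8-bit integers at each stage. Purely illustrative: real players and displays do this differently and usually with more internal precision.

# Toy BT.709 RGB <-> YCbCr round trip with 8-bit rounding at each step,
# to show how repeated colour space conversions can introduce small errors.

def rgb_to_ycbcr(r, g, b):
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556 + 128
    cr = (r - y) / 1.5748 + 128
    return round(y), round(cb), round(cr)   # quantise to 8-bit integers

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.5748 * (cr - 128)
    b = y + 1.8556 * (cb - 128)
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return round(r), round(g), round(b)

original = (200, 61, 45)
print(ycbcr_to_rgb(*rgb_to_ycbcr(*original)))  # often comes back a code value or so off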

Some displays work better fed with RGB and others with YCbCr. Equally, changing the output of the BluRay player when using the YCbCr option from 4:2:2 to 4:4:4 can cause issues with certain displays (usually best to stick to 4:2:2 as it's closer to what's encoded on the disc, which is 4:2:0). Then some people's insistence on using HDMI Enhanced rather than Normal can cause other issues, so it all requires a bit of testing and decision making. Equally, if you can't see the difference between settings, you have to consider if it's worth worrying about. :) Might be worth trying on your PJ, whichever one you settle on.
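
On the Enhanced vs Normal point, the underlying issue is just level mapping: video black is code 16 and white is 235, while 'Enhanced'/PC levels use the full 0-255, and if one device expands the range while the next expects the other convention you get crushed blacks or washed-out greys. A toy Python sketch of the arithmetic, just to illustrate, not any device's actual implementation:

# Toy sketch of video (limited) vs PC (full) level mapping.
# Black = 16, white = 235 in limited range; 0 and 255 in full range.

def limited_to_full(v):
    """Expand 16-235 'Normal' levels to 0-255 'Enhanced' levels."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def full_to_limited(v):
    """Compress 0-255 levels back into the 16-235 range."""
    return round(v * 219 / 255 + 16)

print(limited_to_full(16), limited_to_full(235))   # 0 255 - correct expansion
# If a device expands a signal that was *already* full range, shadow detail
# below code 16 gets clipped straight to black:
print([limited_to_full(v) for v in (5, 10, 16)])   # [0, 0, 0] - crushed blacks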

One surprising (to me anyway) cause is an incorrect contrast setting: try putting up a greyscale ramp pattern and adjusting your contrast down from clipping at 254 (or 235, whichever is your preference) and you'll see that there are bands that come and go according to the contrast setting. The trick is to pick the setting that minimises the bands, and hopefully this will also minimise banding in 'real video'. You have to come down from the highest setting to avoid clipping, so there is an element of compromise as effectively this is wasting the display's contrast ratio. I posted about this a year or more ago in the calibration thread as I was surprised to discover it; it seems I wasn't alone either. It is much more noticeable on my LCD TV and therefore worth taking the time to find this optimum setting, but my old HD350 was less sensitive in this regard (though I could still optimise the setting).
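
One way to get a feel for the quantisation side of this: scaling a greyscale ramp by a contrast-style gain and re-rounding to 8 bits makes some adjacent levels merge (or clip), and which ones merge changes with the gain. A rough Python sketch, purely illustrative of the mechanism rather than what any display's contrast control literally does:

# Rough sketch: count how many distinct output levels survive a greyscale ramp
# after applying a contrast-style gain and re-quantising to 8 bits.
# Some gains merge neighbouring levels (visible as bands); others clip at the top.

def distinct_levels_after_gain(gain):
    ramp = range(16, 236)                              # video-level greyscale ramp
    out = {min(255, round(v * gain)) for v in ramp}
    return len(out)

for gain in (0.90, 0.95, 1.00, 1.05, 1.10):
    print(f"gain {gain:.2f}: {distinct_levels_after_gain(gain)} distinct levels out of 220")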
 
Kelvin - but surely if you are outputting as X.y.v in 10 bits then you gain there because you avoid rounding errors.
 
Kelvin - but surely if you are outputting as X.y.v in 10 bits then you gain there because you avoid rounding errors.

You might gain depending on the display (and the rest of the video chain), but my comment remains that there is no deep colour source available on BluRay. I used to have a VideoEQ CMS device and found that it worked best when fed with a 10 bit signal (even though I believe that 2 of the bits were just 'padded' by the Edge) as this helped with rounding errors. However, this was because further processing of the signal was occurring to correct the colour gamut of the display, which could result in rounding errors if 8 bit in and out was used. If you're just feeding a display directly from a BluRay/DVD then there may be little or no benefit in the padding applied to make 8 bit into 10 bit.
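
A toy illustration of that rounding-error point in Python (the gamut correction the VideoEQ was doing is obviously far more involved than this, so treat it as a sketch of the principle only): apply an adjustment and then its inverse, once with the intermediate result forced back to 8 bits and once carried at 10 bits, and compare how far the output drifts from the input.

# Toy sketch: two-stage processing (a gamma-style tweak then its inverse)
# with the intermediate value stored at 8 bits vs 10 bits.
# Expect the 8-bit intermediate to show a larger worst-case error than the 10-bit one.

def process(value_8bit, intermediate_bits):
    scale = (1 << intermediate_bits) - 1
    x = value_8bit / 255.0
    stage1 = round(x ** 1.1 * scale) / scale       # first adjustment, quantised
    stage2 = round(stage1 ** (1 / 1.1) * 255)      # inverse adjustment, back to 8 bit
    return stage2

errors_8  = [abs(process(v, 8)  - v) for v in range(256)]
errors_10 = [abs(process(v, 10) - v) for v in range(256)]
print("worst error, 8-bit intermediate :", max(errors_8))
print("worst error, 10-bit intermediate:", max(errors_10))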
 
Thanks Kelvin for clearing that up for me :) Will Blu-ray ever support native 10 bits? Or is the whole deep colour thing just another marketing ploy that will never be realised?
 
I'm not sure if it will ever include deep colour, but I suspect that if we get 2160p or higher in the home they may well include deep colour as well. They have to find a way to tempt us to buy newer gear, and even if we can't justify higher resolution in some setups (due to sitting too far away, for example) then the use of deep colour might still give visible improvements in the picture in terms of banding. Who knows what they are planning for the future...
 
So as you boys seem to know what you are talking about... can you give me an order of which will give me the best 2D quality if calibrated correctly, please?
Benq w5000
Benq w1200
Sony bravia hw-15

Need to sell some but wanna know which potentially has the best 2D picture.

Left my w1000+ at the gf's as sky scenes really annoy me.

I have an HD33 but I would imagine the 2D pic is not as good as the 3 above.
 
I can't help with this myself as these aren't models I've ever seen in person. I don't know how many people will have had full calibrations (as in 'ISF' type calibrations) of these, or had the chance to compare fully calibrated examples of them. Perhaps in the more general sense of calibrated (as in properly set up brightness and contrast with suitable test patterns) you might have more joy.

However, as you own all three, can you not just set them up as best you can (I highly recommend using the 'basic patterns' on the free to download AVS HD709 disc)? Then at least you're comparing them on a level playing field. Of course full calibration with meters, etc should improve them further, but a waste on the two that you're going to sell (or give away ;):devil:).

What is the issue with the Sky scenes on the W1000? Is it a lower resolution PJ and you can see pixels or is it more of a banding type issue? I don't think I've asked, but how are you setting up these projectors? Are you using settings from reviews/other users, a basic patterns test disc like AVS HD709 or a full calibration with a sensor and software?
 
To be honest I haven't used calibration. I did physics and thought I could do it on my own, and I've just been busy with all sorts. I want to get rid of some to clear space and get some cash back before they are worthless, and thought if someone knew it would make it an easy choice for me, so when I do spend hours calibrating I'm spending my time on the right one. The W5000 was top of the range, but it's a few years old so I'm not sure how things have improved. So is the HW15. The W1200 is cheaper but has great reviews, What Hi-Fi product of the year. They all have great reviews, 9/10, even the W1000+, but that has banding on the sky scenes. And to be honest I have 17 PJs at home and none are as bad. Don't get me wrong, when fiddled with it looks great with Blu-ray.
How long does this free software take to set up on a PJ? And will it depend on the screen or wall used?
Will it work on LCoS and DLP?
And does it work with 3D, as I have an Optoma HD65, Optoma HD33, some Dell M410HDs and Dell 4320s?
 
What and how much are these sensors? And how much for some pro to do it?
 
What and how much are these sensors? And how much for some pro to do it?

ISF or THX cal will be around £220-£350 ish.
 
What and how much are these sensors? And how much for some pro to do it?

I use a previous version of the i1 Display Pro and Chromapure Pro myself, but there are also cheaper options on this page (forum sponsor):

Products | Kalibrate Limited | Home Cinema products, Calibration, DVD Players, 3D Glasses | Audio Systems | Cinema Systems

You will also need some test pattern discs, of which the free AVS HD709 is IMHO the best (I also have the DVE (DVD & BluRay versions) and Spears & Munsil discs):

AVS HD 709 - Blu-ray & MP4 Calibration

As said before, you can use the 'basic patterns' section to properly set your contrast and brightness, which IMHO is a very important part of setting up your display, and you don't even need a sensor to do this part. In fact it's the first thing I do with any display, before going on to use a sensor. You make sure you can just see the 17 IRE 'black' bar flashing as you set the brightness. You then adjust the contrast so that you can just see the 254 white bar flashing (or 235 if you prefer, as I do, to clip data that shouldn't be in the signal and gain a small amount of contrast instead). Even just doing this means you've set the basics on the display, which is much better than randomly adjusting the controls while watching a film 'to taste' as some seem to do. The beauty is that once you've set the above (even if you go on to use a proper sensor, etc.) you shouldn't need to adjust the brightness and contrast again regardless of what film you watch.
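
If anyone wants to see the idea without firing up the disc, here's a rough stand-in pattern knocked up in Python. This is NOT the actual AVS HD709 pattern, just an illustration of the near-black and near-white bars, and bear in mind that displaying a PNG from a PC adds its own level handling, so use the proper disc for real setup.

# Rough stand-in for a 'basic patterns' style check (NOT the real AVS HD709
# patterns): a video-black field with near-black bars at codes 17 and 20, plus
# reference-white and whiter-than-white bars at 235 and 254. With brightness
# and contrast set correctly you should just be able to make out the 17 and 235 bars.

import numpy as np
from PIL import Image

frame = np.full((1080, 1920), 16, dtype=np.uint8)       # video black background
frame[200:880, 200:500]   = 17                           # just-above-black bar
frame[200:880, 600:900]   = 20                           # clearly-above-black bar
frame[200:880, 1020:1320] = 235                          # reference white bar
frame[200:880, 1420:1720] = 254                          # whiter-than-white bar

Image.fromarray(frame, mode="L").save("levels_check.png")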

BTW, how come you have 17 projectors? Wouldn't it make more sense to have one (or two if you have another room) really good models instead of lots of cheaper ones?
 
What and how much are these sensors? And how much for some pro to do it?

Probably the "best value" measuring device currently for the enthusiast looking to calibrate a panel/projector display device is the X-Rite i1 Display Pro Colorimeter. It's pretty accurate and has very good low light sensitivity i.e. 0.003 cd/m2. Low light sensitivity is a relative weakness of the i1 Pro Spectro (0.2 cd/m2) so this makes a good addition for low level calibration even if you have an i1 Pro.

Beware that there are two versions, sold as "retail" or "OEM", and the version can preclude use with certain calibration software, i.e. some only support the OEM version, such as Chromapure. AFAIK either version is supported by the latest version of the free HCFR software, and CalMan offers licences for both. The retail version can be found for around £155, the hardware is the same for both retail/OEM versions, and the HCFR calibration software is free.

The biggest investment is usually not money but time, learning and gaining practical experience.

Avi
 
The biggest investment is usually not money but time, learning and gaining practical experience.

Avi

Definitely :D For me it was an on/off learning curve spread over a number of months. I set up various calibrated presets just so I could pause scenes to try to understand what effect they had on the image, the main one being different gamma settings (including non-linear ones with lower gamma at the bottom IRE ranges). I also found that calibrating the colour gamut with HCFR was much more difficult than when I bought the Chromapure software, as you only see the 2D chart in HCFR, which makes you think you've got it perfect when in fact the luminance could be out.
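
For anyone wondering how the luminance can be out when the 2D chart looks perfect: two colours can share exactly the same xy chromaticity (the same point on the chart) yet differ in luminance Y. The standard xyY to XYZ relation makes this easy to see; a quick Python sketch using Rec.709's red chromaticity as the example (the 14.0 figure is just a made-up 'measured' value for illustration):

# Two 'reds' with identical CIE xy chromaticity but different luminance (Y).
# On a 2D chromaticity chart they plot at exactly the same point, yet the
# second one is noticeably dimmer - the kind of error the 2D view hides.

def xyY_to_XYZ(x, y, Y):
    X = x * Y / y
    Z = (1 - x - y) * Y / y
    return X, Y, Z

red_target   = xyY_to_XYZ(0.640, 0.330, 21.26)   # Rec.709 red at correct relative luminance
red_measured = xyY_to_XYZ(0.640, 0.330, 14.0)    # same chromaticity, low luminance (hypothetical)
print(red_target)
print(red_measured)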

I suppose it helps that I was rewarded for being an early adopter of Chromapure by them upgrading me to Pro, which has to be a first for anything I jumped in at the early stages with. :D These days you have to pay quite a lot more for the Pro version...
 
The native colour depth on BluRay (and DVD for that matter) is 8 bits per RGB channel, which is 24 bits per pixel.

Also, Blu-ray and DVD use a form of chroma compression that means not all pixels have their own colour (chroma) data. This basically means a native 1920x1080 resolution YCbCr 4:2:0 Blu-ray frame has only 960x540 resolution chroma data from the disc.

This means the depth is 8 bits "per channel" rather than 8 bits at every pixel of every channel, i.e. Y is 8 bit at full resolution, with the remaining 16 bits split between the Cb and Cr chroma channels, which are stored at the reduced resolution. It's then a requirement of the player and/or display to upsample the chroma to full resolution, i.e. YCbCr 4:4:4 or RGB at a minimum of 8 bits per channel for every pixel.
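
A rough sketch of what that chroma upsampling amounts to, using the crudest possible method (nearest-neighbour replication of each chroma sample across a 2x2 block of pixels); real players and displays use proper filtering, this is just to show the shape of the data (Python/NumPy):

# Toy 4:2:0 -> 4:4:4 chroma upsampling by nearest-neighbour replication.
# Luma (Y) is full resolution; Cb/Cr are quarter resolution (half in each axis).

import numpy as np

height, width = 1080, 1920
Y  = np.zeros((height, width), dtype=np.uint8)             # full-res luma
Cb = np.zeros((height // 2, width // 2), dtype=np.uint8)    # 960x540 chroma
Cr = np.zeros((height // 2, width // 2), dtype=np.uint8)

# Replicate each chroma sample across a 2x2 pixel block to get 4:4:4:
Cb_full = np.repeat(np.repeat(Cb, 2, axis=0), 2, axis=1)
Cr_full = np.repeat(np.repeat(Cr, 2, axis=0), 2, axis=1)

print(Y.shape, Cb.shape, Cb_full.shape)   # (1080, 1920) (540, 960) (1080, 1920)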
 
