I've had an intuitive realization: contrast ratio, while a useful number to consider, can be a deceiving measurement in any display device's specs. Here's why.

I got a new Panasonic AE700, rated at roughly 2000:1. In high-dynamic scenes that put dark darks and bright highlights on screen at the same time (Soulcalibur II is a good game for this), the contrast is excellent: the whites are very bright and the blacks look very dark. In low-dynamic scenes that are mostly dark, though, you really notice the gray "black," and a lot of non-black pixels fall into that gray (see the opening scenes of The Bourne Supremacy). You could probably adjust for this, but then it might throw off the shadow detail in brighter, more dynamic scenes. I don't mind it too much in a dark room, where it is much less pronounced, but the problem grows with increasing ambient light (as it would with any display technology other than electronic paper, which instead gets worse in a darker room).

On some direct-view CRTs I have seen a similar problem with blacks: they reach a very dark black, but then the other dark shades fall into black as well. They come out darker than they should be, so the nominal black isn't much darker than its neighbors; it might as well be gray, like on an LCD.

I think the reason for this is that a contrast measurement is a simple ratio that assumes the light output is linear, when in most cases you lose linearity near the bright and dark ends. It's kind of like rating speakers by frequency response: "20 Hz to 20 kHz!" It's more useful to look at the frequency response graph to see whether there are any nasty dips or spikes in it. I would much rather see a light output graph than a single contrast ratio number when comparing display products.
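To make the point concrete, here's a toy numerical sketch (the luminance figures and the `crush` threshold are made-up illustrations, not measurements of any real display): two hypothetical displays share the same 2000:1 on/off contrast ratio, but one crushes near-black shades together, losing shadow detail that the spec sheet can't reveal.

```python
# Toy model: two hypothetical displays with the SAME 2000:1 on/off
# contrast ratio but different tone curves near black.
WHITE = 200.0           # assumed peak luminance, cd/m^2
BLACK = WHITE / 2000.0  # 0.1 cd/m^2 -> a 2000:1 ratio

def display_a(signal, gamma=2.2):
    """Well-behaved power-law response from black to white (signal in 0..1)."""
    return BLACK + (WHITE - BLACK) * signal ** gamma

def display_b(signal, gamma=2.2, crush=0.15):
    """Same endpoints, but every input below `crush` collapses to black."""
    if signal < crush:
        return BLACK  # many different dark inputs map to one output level
    # rescale the remaining range so full signal still hits WHITE
    s = (signal - crush) / (1.0 - crush)
    return BLACK + (WHITE - BLACK) * s ** gamma

# Both displays post the identical spec...
ratio_a = display_a(1.0) / display_a(0.0)   # 2000.0
ratio_b = display_b(1.0) / display_b(0.0)   # 2000.0

# ...yet a 10%-gray detail sits visibly above black on A and vanishes on B.
print(ratio_a, ratio_b)
print(display_a(0.10), display_b(0.10))
```

This is exactly why a light output graph (luminance versus input level) tells you more than the ratio of the two endpoints alone.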