leonard.powers said:
Just been to Comet and Currys to have a look at their HD stuff. It looks impressive, but when I asked the staff some simple questions they seemed pretty stumped and just referred me to Sky's info.
Basically, most LCDs seem to offer a native resolution of 1366x768.
Yes - this is a 16:9 wide version of the 4:3 PC 1024x768 standard. It isn't actually an HD broadcast standard. It is most common on LCD displays - which have a slight PC heritage.
DLP rear-pros often employ 1280x720 DLP chips, and Plasmas operate at all sorts of resolutions.
I understand that the HD ready standard offers 2 resolutions.
1280x768 (progressive scan) known as 720p
Not quite - 720p is 1280x720 (16:9 square pixel/sample) not 768.
1920x1080 (interlaced) known as 1080i
Yep.
1280x720 is the 720p square pixel resolution, 1920x1080 is the 1080i square pixel resolution.
Am I right in thinking that all the HD Ready TVs are able to do 720p OK but won't be able to do the 1080i standard, since their horizontal resolution is only 1280 and not 1920? In which case, am I right in thinking that the TV will scale the 1080i down to make it fit?
A 1366x768 display will usually scale a 1280x720 720p image both horizontally AND vertically to fit - with the resulting image quality dependent on the quality of the scaling and filtering algorithms used. A few will 1:1 pixel match and leave black bars around the image - but deliver a high quality result as it is 1:1 matched. (Not many do this)
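As a very rough sketch of those two options (scale-to-fill vs 1:1 mapping) - this is just the arithmetic, not the filtering:

# Rough arithmetic for a 1280x720 source on a 1366x768 panel (scaling quality aside).
src_w, src_h = 1280, 720
panel_w, panel_h = 1366, 768

# Option 1: scale to fill the panel (what most sets do)
scale_x = panel_w / src_w   # ~1.067 horizontal stretch
scale_y = panel_h / src_h   # ~1.067 vertical stretch
print(f"scale factors: {scale_x:.3f} x {scale_y:.3f}")

# Option 2: 1:1 pixel mapping with black bars (rarer, but no resampling loss)
bar_x = (panel_w - src_w) // 2   # 43-pixel bars left and right
bar_y = (panel_h - src_h) // 2   # 24-pixel bars top and bottom
print(f"1:1 mapping leaves {bar_x}px side bars and {bar_y}px top/bottom bars")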
All HD Ready displays will accept and display a 1080i signal - but not necessarily at its full resolution.
A 1920x1080 interlaced image will not actually contain 1080 lines of vertical resolution - as interlacing reduces the resolution vertically. It delivers a picture similar to that of a 1920x800ish progressive system. A 1366x768 display is thus capable of approximately resolving the vertical resolution, but will not fully resolve the horizontal resolution.
HOWEVER - some 720/768-line displays cheat anyway. Rather than properly de-interlacing the 1920x1080i signal to 1920x1080p (containing around 1920x800 levels of detail) and then scaling this to 1366x768 or 1280x720, they instead treat the 1920x1080i signal as a 1920x540p signal and scale the 540p up to 768 or 720. This effectively bins 260ish lines of vertical resolution, and can introduce vertical jaggies (aliasing).
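A back-of-the-envelope comparison of the two approaches, using the approximate figures quoted above (the ~800-line figure is an estimate, not a measurement):

# Comparing the proper de-interlace path with the 540p "cheat" for a 1080i source.
field_lines         = 1080 // 2   # each interlaced field carries 540 lines
deinterlaced_detail = 800         # rough vertical detail in a properly de-interlaced 1080i picture

proper_path = deinterlaced_detail   # then scaled down to the 768/720-line panel
cheat_path  = field_lines           # one field treated as a 540p frame and scaled up

print(f"proper de-interlace keeps ~{proper_path} lines of detail before scaling")
print(f"the 540p cheat keeps only {cheat_path}, binning ~{proper_path - cheat_path} lines")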
Also, what about the plasma and LCD sets that say they are 1024x1024 and are HD Ready - how are they doing that? If it's a case of scaling the picture down, surely any LCD could display an HD signal (scaled down)?
They scale the 1280x720 to 1024x1024, or scale horizontally - but crop not scale vertically the 1920x1080 to 1024x1024. Often 1024x1024 displays are ALiS designs - which are actually interlaced. They therefore display 1080i sources very well - albeit cropped to 1024i and scaled horizontally.
Whilst this may sound like a major resolution drop, many HD sources don't exploit the full 1280 or 1920 horizontal resolution of 720p and 1080i anyway. Some broadcast equipment runs at 960x720 rather than 1280x720, or at 1280x1080 or 1440x1080 rather than 1920x1080, and many transfers don't contain detail right up to the 1920 level.
The 1080-to-1024 crop used by ALiS is not a major issue in most situations, as this level of crop is less than the overscan that most CRT displays employ. It's a legacy of the Japanese system the 1080i standard was originally based on, which only ran at around 1030 active lines - so close to 1024 that it made sense to select the binary figure!
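To put that crop in context, here's the arithmetic as a quick sketch (the overscan comparison is the rough claim above, not a measurement of any particular set):

# Sketch of the 1080i -> 1024x1024 ALiS mapping described above.
src_w, src_h     = 1920, 1080
panel_w, panel_h = 1024, 1024

h_scale       = panel_w / src_w                    # ~0.53: scaled horizontally
cropped_lines = src_h - panel_h                    # 56 lines cropped, not scaled
crop_per_edge = cropped_lines / 2 / src_h * 100    # ~2.6% off top and bottom

print(f"horizontal scale factor: {h_scale:.2f}")
print(f"vertical crop: {cropped_lines} lines, i.e. about {crop_per_edge:.1f}% "
      f"at each edge - comparable to typical CRT overscan")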
I know the connectors on the back are also needed, but I'd be surprised if scaling down the picture (especially in the case of the plasma with a 1024x1024 res) while having the correct HDMI or DVI sockets makes it HD Ready.
The HD Ready specification doesn't fix a minimum horizontal resolution, just a minimum vertical resolution (of 720 lines).
Therefore the 1024x720 resolution of the highly regarded and popular 37" Panasonic PV500 means it IS legitimately described as HD Ready, as are the 1024x1024 and 1366x768 devices.
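As a loose sketch of that checklist as it has been described in this thread (the function and field names are mine, and this is not the official EICTA wording):

# Loose sketch of the "HD Ready" checklist as described in this thread.
def looks_hd_ready(panel_w, panel_h, widescreen, accepts_720p, accepts_1080i,
                   has_hdmi_or_dvi, has_hdcp):
    return (panel_h >= 720            # minimum vertical resolution only
            and widescreen            # no minimum horizontal resolution
            and accepts_720p and accepts_1080i
            and has_hdmi_or_dvi and has_hdcp)

# The 1024x720 PV500 and the 1024x1024 ALiS panels both pass,
# because only the 720-line vertical minimum is checked.
print(looks_hd_ready(1024, 720, True, True, True, True, True))   # True
print(looks_hd_ready(1024, 1024, True, True, True, True, True))  # True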
In an ideal world all displays would be 1920x1080 or 1280x720 and no scaling or cropping would be required when displaying 1080i or 720p signals. However, there is a lot more to display quality than resolution - and by trading a little resolution for better black levels, brightness, response time, saturation, gamma etc. you often end up with a better all-round picture. There is little point in having a very sharp picture that is dim, with greyish blacks, poor quality colour and a laggy display!
Are there any native 1080i LCD or Rear projections out yet?
LCDs are incapable of displaying 1080i natively - they have to de-interlace to 1080p. There are 1080p displays around in DLP and LCD variants.
CRTs are the most widespread 1080i native displays - but there are no HD Ready CRTs on sale in the UK. (There are a few 1080i capable displays, but they don't have HDMI/DVI+HDCP or 720p compatibility so can't be called HD Ready). The ALiS plasmas also, apparently, display 1080i as interlaced, but on 1024i screens with a bit of crop.
Also, although the demos look great, isn't this what PC users have been looking at for years? After all, I have a 21" CRT running at 1600x1200. My digital camera is 5 megapixels, so its resolution is far in excess of 1600x1200. I've been viewing my digital pictures in hi-def for years. And what about PC games?
Hopefully someone can clear up the confusion.
Many thanks
1280x1024 PC displays have the same horizontal resolution as 1280x720 16:9 HD video - so you can watch 720p video letterboxed on a 1280x1024 display at full resolution.
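A quick sanity check of that claim:

# 720p letterboxed on a 1280x1024 monitor - the widths match 1:1.
mon_w, mon_h = 1280, 1024
vid_w, vid_h = 1280, 720
assert mon_w == vid_w                 # no horizontal scaling needed
bars = (mon_h - vid_h) // 2           # 152-line black bars top and bottom
print(f"720p maps 1:1 with {bars}-line letterbox bars")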
Similarly there are a number of 1080 HD quality displays and higher - such as the Apple Cinema displays.
The major limitation of PC CRTs as HD displays has usually been brightness - the sharper the display, the dimmer the electron beam (as the brighter the beam the bigger the spot becomes) unless very expensive techniques are used - as well as display size. 21" displays are large in PC terms, but tiny in TV terms these days...
After all - HD video has been with us in one form or another since the 1980s when the Japanese started marketing HD gear. (The BBC made a drama in HD in 1989 in association with NHK - the Japanese State Broadcaster)
Also - beware digital stills camera pixel ratings - they are not all they seem.
Digital stills cameras have a single CCD or CMOS sensor with an arrangement of coloured filters over the top, so each photosite on the sensor is sensitive to only one colour, not all three. Usually the photosites are arranged in groups of four - two green, one red and one blue - and the results are interpolated to create a full-colour, high resolution image. This isn't deemed a problem because the eye is more sensitive to luminance than chrominance, and green contributes most to luminance and thus carries most of the detail the eye perceives. However, every one of those single-colour photosites is counted in the camera's pixel rating.
Broadcast systems base their video camera pixel counts on the resulting video signal. Broadcast video cameras have three separate sensors, each fed with a red, green or blue version of the scene. Each sensor therefore provides a full-resolution red, green or blue image, which can then be mixed to create the luminance and colour-difference signals (the latter can be subsampled if required, but the camera sensors don't force this on you). So if a broadcast camera operates at 1920x1080 or 1280x720 and you want to compare its pixels with a stills camera's, you actually have to multiply by THREE. Some broadcast HD cameras actually operate at 1920x4320 (!) so that 1080 and 720, interlaced and progressive, can all be derived by line averaging.
Therefore a 1920x1080 camera has 6 megapixels in stills camera terms. A 1920x4320 World Cam has nearly 25 megapixels.
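Putting that arithmetic in one place (my own rough calculation using the figures above):

# Converting three-sensor broadcast camera resolutions into stills-camera megapixel terms.
three_chip_equiv_mp = 1920 * 1080 * 3 / 1e6    # ~6.2 MP for a 1920x1080 camera
world_cam_equiv_mp  = 1920 * 4320 * 3 / 1e6    # ~24.9 MP for the 1920x4320 case
print(f"1920x1080 three-chip camera ~= {three_chip_equiv_mp:.1f} MP in stills terms")
print(f"1920x4320 'World Cam' ~= {world_cam_equiv_mp:.1f} MP in stills terms")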