Discussion in 'Televisions' started by NicolasB, Aug 1, 2003.
Just wondering if a 1365x768 plasma screen can display a 720p signal without downscaling....
The horizontal resolution of a 720p signal is 1280, so it would be scaled up on a 1366x768 plasma, but not on a Pioneer 503, which is 1280x768. Hope this helps.
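The arithmetic is simple enough to sketch. This is just an illustration (the helper name is made up); the panel sizes are the 1366x768 plasma and the 1280x768 Pioneer 503 mentioned above:

```python
# Hypothetical helper: does a source raster map 1:1 onto a panel,
# or does it need scaling?  Sizes are (width, height) pixel counts.
def scaling(source, panel):
    """Return 'none', 'up', or 'down' for each axis."""
    return tuple(
        "none" if s == p else ("up" if s < p else "down")
        for s, p in zip(source, panel)
    )

signal_720p = (1280, 720)

print(scaling(signal_720p, (1366, 768)))  # generic 1366x768 plasma -> ('up', 'up')
print(scaling(signal_720p, (1280, 768)))  # Pioneer 503, 1280x768   -> ('none', 'up')
```

Note that even on the 1280x768 panel only the horizontal axis maps 1:1; the 720 lines still get scaled up to 768 vertically.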
Rectangular pixels. Hmph.
Just to avoid confusion rather than be irritatingly nit-picking, could I point out that while the active horizontal pixel count of a 720p signal may be 1280 pixels, the resolution of the image carried or the image delivered to your eyes may be nothing like that! I mention it, not from a desire to be especially boring, but because many people seem to mistakenly equate the pixel count of a signal (or display) with image resolution, or at least be confusingly ambiguous in what they say.
Frankly, even when people do talk about resolution, it's not always terribly clear (to me) whether they are thinking about, or properly discriminating between, static/dynamic, luma/chroma, encoded/output, etc.
Well, as far as a device like a plasma TV is concerned, each image pixel is a separate physical structure, so its display resolution is unambiguous.
The resolution of the source video stream is also fairly unambiguous. Certain elements of the ideal data may be thrown away in the compression process (chroma lost in favour of luma), but nonetheless one can say with some certainty "this is a 720x576 recording" if it's a PAL DVD.
(Of course that may actually be upsampled from a previous lower resolution, as is the case with, say, output from a Sky box on some channels).
The next question is whether the original signal has to be upscaled or downscaled in order to map it to the display resolution. My question was prompted by my wanting to know if a 1365x768 plasma screen could display a 720p signal without downscaling in either direction.
I'm drifting far from your original question, and I apologise for that, but my point was this: while I agree that the pixel structure of a plasma screen or the active pixel count of a 720p signal is unambiguous, that characteristic does not equate with resolution, although it does tell you the maximum possible resolution of a static image without pixel-straddling issues.
It's just unfortunate that computing terminology has misappropriated "resolution" to describe image or screen structure/pixel count, when it means something different in broadcasting/film/video. I think the difference is worth preserving, particularly in home cinema; ambiguity in terms causes a lot of unnecessary confusion in a subject that is already confusing enough.
For example, 720x576 images on a DVD do not have a corresponding resolution of 720x576. If a 480p image is scaled to 720p, the image resolution does not improve, although the pixel count has been manipulated. Talking about resolution in these situations, instead of discriminating between pixel structure and image resolution, can simply muddy the waters IMO.
I'd be interested to know how you figure that a nominally 720x576 image from a PAL source isn't really at that resolution. The candidates I can see:
blanking of 8 pixels either side?
non-square pixels giving the correct display resolution of 768x576?
MPEG-2 compression (no way, and little point, quantifying that beyond a perceptual basis)?
If someone tells me to make some PAL-resolution images, I give them 720x576 non-square.
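For what it's worth, the non-square-pixel arithmetic above can be written out. The 16/15 pixel aspect ratio and the 8-samples-per-side blanking figure are my reading of Rec. 601, not something established in this thread:

```python
from fractions import Fraction

stored_w, stored_h = 720, 576          # PAL DVD storage raster
display_w = 768                        # square-pixel width for a 4:3 frame
assert Fraction(display_w, stored_h) == Fraction(4, 3)

# Pixel aspect ratio needed for 720 stored samples to fill 768 display pixels
par = Fraction(display_w, stored_w)
print(par)                             # 16/15

# Rec. 601 nominally leaves ~8 samples of blanking either side,
# giving a 704-sample active picture area
active_w = stored_w - 2 * 8
print(active_w)                        # 704
```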
Hi, Mr D.
you'll know a lot more about this than me, so please correct me, but if you took a piece of digital master film that included a single-pixel on-off 720x576 checkerboard (not very interesting, I grant) and that made its way to a mass-produced DVD, my understanding is that with vertical & horizontal filtering, edge enhancement, etc., there is no way the final 720x576 image will accurately display that pattern, even progressively reconstructed. So although the active pixel count of each field/frame will still be 720x576, the resolution of the image "content" will be substantially lower, having had its bandwidth chopped during the production process.
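A minimal sketch of that point (pure illustration, nothing like the real DVD production chain): a single-pixel checkerboard sits exactly at the sampling limit, so even one mild three-tap low-pass filter flattens it completely away from the edges:

```python
# Apply a gentle 3-tap low-pass filter to a one-pixel on/off pattern
# and measure what happens to its contrast.
def lowpass(row, taps=(0.25, 0.5, 0.25)):
    """3-tap filter, clamping indices at the row edges."""
    n = len(row)
    return [
        sum(t * row[min(max(i + k - 1, 0), n - 1)] for k, t in enumerate(taps))
        for i in range(n)
    ]

checker = [1.0, 0.0] * 8               # single-pixel checkerboard row
filtered = lowpass(checker)

print(max(checker) - min(checker))     # full contrast before filtering: 1.0
interior = filtered[1:-1]              # ignore edge-clamp artefacts
print(max(interior) - min(interior))   # contrast after filtering: 0.0
```

Every interior sample comes out at a flat 0.5: the pixel count is unchanged, but the pattern, i.e. the resolution of the content, is gone.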
The point I was trying to make is that just because there are x pixels making up the image doesn't mean the image content has an equivalent resolution, so using the same word to describe both qualities is potentially confusing. The difference for me was highlighted recently by Stacey Spears reporting his and Joe Kane's image spectrum analysis of HD D5 telecine master tapes of several typical movies - showing that basically there is little or no high-frequency content above the equivalent of 800 pixels horizontal resolution on those masters. Even on their own DVE HD/D5 24p master, a piece of telecine from film using (I think) a 1920x1 Spirit, the high-frequency content maxed out at the equivalent of 1300 pixels horizontal resolution. Presumably that reflects either the limitations of the optical path of the present telecine process in capturing detail off the source, or simply that such movies don't contain much HF detail on the negative or whatever is scanned (which seems very unlikely to me given the resolution of film negative)? Maybe you can comment?
There are too many differing variables to say definitively how much resolution a given image contains when viewed through a given system. It's almost not worth commenting on. All you can really say is that an idealised image has 720x576 worth of resolution in it. Various stages in the real-world signal-processing chain prior to display, and the nature of the display itself, mean it's highly unlikely that the image finally presented depicts the maximum available information that's present in the original image data.
However, that's a given; like I said, it's pretty redundant to even comment on it. All you are left with is that a certain image may have more or less resolution than another image, and a certain display chain may disclose more or less of the information present in the original material.