Hi Def

p = progressive
i = interlaced

720 and 1080 = the number of lines

p = generally better than i

p requires more information to be transmitted by the source (DVD, Sky, BBC etc.), which is why 1080p is difficult to find in any form.
Thanks. I understand now. I suspect P at 1080 is only likely to come from local sources for a long time - local sources being DVD players etc.
How can you "suspect" that when you didn't even know what the "i" and "p" stood for? What do you base your opinion on? Not that I disagree, it's just a bit of a sweeping remark, taking your previous post into account.

I'm just probing, not being rude! :)
Depending on the material, 1080i is preferred to 720p. The usual guideline is: 720p for sports, 1080i for movies.
Also, 1080i has more pixels than 720p (more than double, in fact), although they're displayed in an interlaced manner.
1080i/p is 1920x1080 (around 2 million pixels), whereas 720p is 1280x720 (around 1 million pixels). The big difference, you see, is the horizontal resolution, which goes from 1920 down to 1280. So in theory 1080i should be somewhat more detailed than 720p, although you might experience the usual interlacing issues (smearing and all that).
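To put rough numbers on the above, here's a quick sketch of the arithmetic, using the standard resolutions mentioned in the post:

```python
# Quick pixel-count arithmetic for the formats discussed above.
full_hd = 1920 * 1080   # 1080i/p frame
hd_720 = 1280 * 720     # 720p frame

print(full_hd)            # 2073600, i.e. ~2 million pixels
print(hd_720)             # 921600, i.e. ~1 million pixels
print(full_hd / hd_720)   # 2.25, so a bit more than double
```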
In the end, almost every HDTV you'll find can only do 720p, although they say they support 1080i. These TVs have a native resolution of 1280x720 (or 1360x768), so whatever signal they get, they up/downscale it to 720p.
Only when you find a TV with 1920 pixels of horizontal resolution will you be able to see the difference between a 1080i and a 720p image, but by then you'll probably be spoiled by the set's native 1080p picture anyway, since I expect all future panels with 1920x1080 resolution to be fully 1080p capable...

I'm not sure I made much sense.
I 'suspect' that because, though I didn't know what the 'p' stood for, I do understand the kind of data volumes that have to be pumped through an on-air system to generate a fully progressive picture. But I can see why you're asking. Although I didn't know what 'i' or 'p' meant in relation to Hi Def, I'm an IT consultant, so once I have an idea of what the term refers to, I have an idea of what the practice involves. Satisfy your curiosity? :)
I did sort of answer in your other thread somewhere asking the same thing.....

Because you have a plasma TV, you will probably find that 720p looks better to you than 1080i.

With a 720p signal, your TV will take the 1280x720 progressive signal and scale the image to 1024x768. It does not have to deinterlace the signal, which is good, as your TV is not very good at doing this.

With a 1080i signal the following happens.......
Your TV will take the 540 odd-numbered lines from the first field and deinterlace them, giving you a 1920x540 progressive image. It then has to downscale horizontally and upscale vertically to fit your panel. It will then take the 540 even-numbered lines from the next field and deinterlace them, again leaving a 1920x540 progressive image, and once again it has to downscale horizontally and upscale vertically to fit your 1024x768 panel.
Once all this has been done, you have seen the same amount on screen as you got with one frame of 720p, but a lot more processing has been done that could potentially muck the image up.
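The two paths above can be sketched as simple arithmetic (illustrative only; it assumes the 1024x768 plasma panel mentioned in this thread and the frame/field sizes as described):

```python
# Illustrative arithmetic only (not a real video pipeline): the scale
# factors each path needs on a 1024x768 plasma panel.
panel_w, panel_h = 1024, 768

# 720p path: one progressive frame, a single scaling step.
w, h = 1280, 720
print(panel_w / w, panel_h / h)    # 0.8 horizontal, ~1.07 vertical

# 1080i path: each 1920x540 field is deinterlaced and scaled separately.
fw, fh = 1920, 540
print(panel_w / fw, panel_h / fh)  # ~0.53 horizontal, ~1.42 vertical
```

The 1080i path has to do this twice per frame and with much bigger scale factors, which is where a poor built-in scaler can hurt the picture.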

If you were using a CRT, where there is no native resolution, 1080i is often preferred (I certainly prefer it), but because you have a progressive display, things are a little different.
You can now see why a decent offboard deinterlacer/scaler, rather than the cheap one built into your TV, can really, really improve things.

The other thing to remember is that there are good and bad transfers too, just like on DVD: a good 720p will beat a bad 1080i, and vice versa.
