What Will Give A Better Picture? Native 1080p or 1080i via Separate Deinterlacer?

Timbo21

Well-known Member
Since at some point there will be native 1080p source material, I was wondering which is likely to look better: 1080i via a good deinterlacer, or native 1080p direct?
 

madshi

Active Member
Are we talking about video (e.g. TV sports and shows) or film (e.g. Hollywood movies) content?

Video content: If you take a 1080p signal, forcibly interlace it (so that half of the scanlines in each field are discarded) and then pass it through the world's best deinterlacer, the original 1080p signal will almost never look worse, and will usually look better.

However, there is more to the chain than deinterlacing. For example, if your display is not native 1080p, something has to scale the image to fit it, and an external VP will probably do that much better than a typical HD-DVD player. An external VP may also reproduce colors more accurately, reduce compression artifacts, and so on.

Furthermore, ask yourself this: if a broadcaster sends 1080p at the same bitrate as 1080i, which will look better? That is a very hard question to answer. So you see, your question can't be answered that easily.
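As a toy illustration of the "half the scanlines" point: treat a frame as a list of scanlines, split it into fields, then weave the fields back together. This is a made-up Python sketch (the names and data are not from any real player or VP), but it shows what a deinterlacer is up against:

# Toy model: a "frame" is just a list of scanlines (strings here).
# Interlacing keeps only every other line per field, so each field
# carries half the vertical detail of the source frame.

def interlace(frame, top_field_first=True):
    """Split one progressive frame into two fields."""
    top = frame[0::2]      # even-numbered scanlines
    bottom = frame[1::2]   # odd-numbered scanlines
    return (top, bottom) if top_field_first else (bottom, top)

def weave(top, bottom):
    """Naive deinterlacer: interleave two fields back into one frame.
    Only correct when both fields were sampled at the same instant."""
    frame = []
    for t, b in zip(top, bottom):
        frame += [t, b]
    return frame

frame = ["line %d" % i for i in range(8)]  # stand-in for 1080 lines
top, bottom = interlace(frame)
print(len(top), len(bottom))               # 4 4: each field is half a frame
print(weave(top, bottom) == frame)         # True for a static image
# With motion, the two fields show different instants, so the missing
# lines must be interpolated; that is where deinterlacers differ.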

Film content: If you have a good external VP, there will be no (null, zero, nada) difference between 1080i and 1080p, because a 1080i60 film signal is not missing any information compared to the original 1080p signal. How is that possible? Simple: the original signal was 1080p24 (only 24 frames per second). The conversion from 1080p24 to 1080i60 for broadcasting (3:2 pulldown) only splits each film frame into fields and repeats some of them; nothing is thrown away, so a good VP can match the fields back up (inverse telecine) and recover the original 1080p24 frames exactly.
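To see why that reversal is lossless, here is a rough Python sketch of the 3:2 cadence and its inverse. The cadence itself is the real broadcast pattern; the list-of-frames representation and function names are made up purely for illustration:

# 3:2 pulldown: every 4 film frames (A B C D) become 10 fields, so
# 24 frames/s becomes 60 fields/s (1080i60). Fields are only split
# off and repeated; no picture content is altered or discarded.

def telecine_32(frames):
    """24p -> 60i: emit (frame, field) pairs in the 3:2 cadence."""
    fields = []
    for i in range(0, len(frames), 4):
        a, b, c, d = frames[i:i + 4]
        fields += [(a, 't'), (a, 'b'), (a, 't'),   # A -> 3 fields
                   (b, 'b'), (b, 't'),             # B -> 2 fields
                   (c, 'b'), (c, 't'), (c, 'b'),   # C -> 3 fields
                   (d, 't'), (d, 'b')]             # D -> 2 fields
    return fields

def inverse_telecine(fields):
    """60i -> 24p: collapse each run of repeated fields to one frame."""
    frames = []
    for frame, _ in fields:
        if not frames or frames[-1] != frame:
            frames.append(frame)
    return frames

film = ["frame %d" % i for i in range(24)]  # one second of 24p film
broadcast = telecine_32(film)               # one second of 60i fields
print(len(broadcast))                       # 60
print(inverse_telecine(broadcast) == film)  # True: fully reversible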
 
