Question regarding 1080i resolution

Oakleyspatz

Hi HD experts!!...I have a question.

I own a 576p resolution DLP widescreen projector (Sharp XV-Z200e) and have recently purchased a D-Theater player and 45 movies.
The movies all seem to be in 1080i resolution and, through my Sharp, look absolutely stunning and ten times better than my DVD collection. In fact, it would not be too much of an exaggeration to say they make my DVDs look like VHS!

My question is: although my projector is not high def, doesn't a 1080i signal split the image into two interlaced fields and send one half, then the other, a split second later? Doesn't that mean half of 1080i is 540 lines, so can my pj not display the image pretty much exactly, unscaled - in other words, in high def?
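Just to show the arithmetic I mean (my own rough illustration in Python, nothing to do with how the player or projector actually works):

```python
import numpy as np

# My own rough illustration: a 1080i frame is sent as two interlaced fields,
# each carrying half of the picture lines.
frame = np.zeros((1080, 1920))               # one full-resolution picture

top_field    = frame[0::2, :]                # lines 0, 2, 4, ... -> 540 lines
bottom_field = frame[1::2, :]                # lines 1, 3, 5, ... -> 540 lines

print(top_field.shape, bottom_field.shape)   # (540, 1920) (540, 1920)

# So each field is 540 lines - close to my projector's 576 lines, but it
# would still have to be scaled 540 -> 576 (and horizontally to the panel
# width) rather than shown pixel-for-pixel.
```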
 
Here are two screen shots of the Sharp in action. The first is taken off the DVD and the second off the HD D-Theater tape. The first looks like I had camera shake, but believe me, that really is how sharp it looks compared to the HD one, and both were taken on a 576p projector, NOT a high def one.

After this, I begin to wonder why we 'have' to have a high def display to appreciate high def material!
 

Attachments

  • transdvd.JPG
  • transHD.JPG
Exactly how a 1080i source is de-interlaced is beyond me but not to some who use these forums:)
However, there has been talk about how some cheaper displays take short cuts when fed a 1080i source by de-interlacing only one field and magicking up the second, which gives you your 540/576p result. A good de-interlacer will process both fields and then merge the two to give the best results.
I welcome any clarification on that:)

It's been said that even standard-def displays will benefit from the superior source that is HD, even after good or bad de-interlacing and scaling, and you seem to have seen that in the flesh.
The question has always been that a good HD source will probably be too expensive if you don't want to, or cannot afford to, replace your existing SD display.

I think I am on safe ground in saying that if you had a projector with a resolution of 1920x1080 and a damn good de-interlacer/scaler, it would put your current model to shame:)

Didn't even know they had released 45 D-Theater movies, damn good buy and an impressive example of the quality in the pics:)
 
Starburst said:
Didn't even know they had released 45 D-Theater movies, damn good buy and an impressive example of the quality in the pics:)
Thanks for the reply. Yes, they have made about 91 movies in D-Theater plus a few not on the D-Theater label. I actually don't want to watch DVD anymore, the quality is so good. It is like going back to VHS from DVD!!
 
Starburst said:
Exactly how a 1080i source is de-interlaced is beyond me but not to some who use these forums:)
However, there has been talk about how some cheaper displays take short cuts when fed a 1080i source by de-interlacing only one field and magicking up the second, which gives you your 540/576p result.

A good de-interlacer will process both fields and then merge the two to give the best results.
I welcome any clarification on that:)
You've got it a bit confused!

A good de-interlacer takes every 50i source field and creates a full 50p frame to replace it. It does this either by taking information from the previous or next field (in the case of film or 25p progressive video sources, where there is no movement between the two fields that make up a 50i frame), OR it detects that there is motion between the fields and creates a new frame by interpolating from the information within that single field.

Sophisticated de-interlacers attempt to motion track elements around the screen, and isolate static and moving areas within the scene to use field merging or field interpolation to recreate the frames.
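If a rough sketch helps, here is the weave-or-interpolate choice in Python - the function, the threshold and the simple line averaging are purely illustrative on my part, not how any particular chip does it:

```python
import numpy as np

# Purely illustrative motion-adaptive de-interlace of one 1080i field pair.
def deinterlace_top_field(top, bottom, threshold=12):
    """top:    540 x W array holding picture lines 0, 2, 4, ...
       bottom: 540 x W array holding picture lines 1, 3, 5, ...
       Returns a 1080 x W progressive frame."""
    h, w = top.shape
    frame = np.empty((h * 2, w), dtype=float)
    frame[0::2, :] = top                          # the lines we actually have

    # Two candidates for each missing (odd) line:
    weave  = bottom.astype(float)                 # take it from the other field
    interp = (top.astype(float)
              + np.roll(top, -1, axis=0)) / 2.0   # or average the lines above
                                                  # and below within this field

    # Crude motion test: if weaving disagrees strongly with the spatial
    # estimate, assume movement between the fields and interpolate instead.
    # (Real de-interlacers compare fields across time and track motion.)
    moving = np.abs(weave - interp) > threshold
    frame[1::2, :] = np.where(moving, interp, weave)
    return frame
```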

Cheap de-interlacers convert by field replication. Effectively they treat each 1080i field of 540 lines as a 540p frame. If they are converting to 720p they just scale the 540-line fields to 720p, rather than de-interlacing properly to 1080p and scaling down.

However, treating the source as 540p to feed a 576p display is probably a reasonable compromise - as the cost to de-interlace to 1080p and then scale down would outweigh the benefits, though doing it properly might reduce vertical aliasing slightly.
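And the cheap route, again just a throwaway sketch of "treat each field as a 540p frame and scale it" - the names and the nearest-line scaling are mine, for illustration only:

```python
import numpy as np

# Throwaway sketch of "bob" / field replication: each 540-line field is
# treated as a complete frame and simply scaled to the display height,
# with no proper de-interlacing at all.
def bob_scale(field, target_height):
    """field: 540 x W array of one 1080i field -> target_height x W frame."""
    src_h, w = field.shape
    out = np.empty((target_height, w), dtype=float)
    for y in range(target_height):
        src_y = min(int(y * src_h / target_height), src_h - 1)
        out[y] = field[src_y]        # nearest-line pick; real scalers filter
    return out

# e.g. one field straight to a 576p panel (the compromise discussed above):
# picture_576 = bob_scale(one_field, 576)
```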

Starburst said:
It's been said that even standard-def displays will benefit from the superior source that is HD, even after good or bad de-interlacing and scaling, and you seem to have seen that in the flesh.

HD sources downconverted to SD still look better than SD sources in most cases, as long as the downconversion is performed properly. Things like video noise, edge enhancement/aperture correction and compression artefacts applied in the HD domain are less visible when scaled to SD than they would be if they had been applied in the SD domain. As such, HD-sourced pictures appear cleaner, even in SD.
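A toy way to see that last point (my own numbers and a simple 2x2 averaging downconversion, nothing to do with any real downconverter):

```python
import numpy as np

# Toy illustration: noise added in the HD domain is partly averaged away
# by a straightforward downconversion, so it ends up less visible in SD
# than the same noise added directly in the SD domain.
rng = np.random.default_rng(0)

hd_noise = rng.normal(0, 5, size=(1080, 1920))        # noise at HD resolution
# 2x2 block-average down to 540 x 960 (a stand-in for a proper downconversion)
sd_from_hd = hd_noise.reshape(540, 2, 960, 2).mean(axis=(1, 3))

sd_noise = rng.normal(0, 5, size=(540, 960))          # same noise added at SD

print(round(sd_from_hd.std(), 2), round(sd_noise.std(), 2))
# roughly 2.5 vs 5.0 - the HD-domain noise is about halved on the way down
```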
 
