Comment on what my mate said


Distinguished Member · Jan 11, 2002 · Horley, Surrey
I was chatting to my mate, saying how I have an LCD projector now, and that the best way to get a top-quality image is to feed it from a PC (a Radeon, for example) with 1:1 pixel mapping.

His comments:

Problem with PC dvd players (unless you've got a special one) is that they output de-interlaced which sucks.

and also

Hmmm, it's a big subject but at best de-interlacing will reduce the resolution (especially with real-time deinterlacing) and kill motion, but if it looks ok to you, who cares.

Can anyone who knows more than I do comment on his remarks and correct any confusions I/he may have about these points?
Feeding the projector a signal that matches its native resolution will always give you a better image, as there is no scaling up or down involved.
De-interlacing is a big subject as there are many types. Some do a better job on PAL, some on NTSC.
Then there's the argument as to where it's best carried out: in the DVD player, in the projector, or in a separate unit.
Then there are digital signals like DVI and HDMI, which have the advantage of cutting out two conversion stages (digital-to-analogue and analogue-to-digital), which must be a good thing.
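To make the "many types" point above concrete, here is a minimal sketch (my own illustration, not from any poster in this thread) of the two classic de-interlacing methods, "weave" and "bob". A frame is modelled as a simple list of scan lines; an interlaced source delivers alternate lines as two separate fields.

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one frame.
    Perfect for static scenes, but any motion between the two fields
    shows up as 'combing' artefacts on moving edges."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    """Double a single field to full height by repeating each line.
    No combing on motion, but vertical detail of static content is halved."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)
    return frame
```

Real de-interlacers (motion-adaptive ones, for instance) choose between weave-like and bob-like behaviour per region or per pixel, which is why quality varies so much between implementations.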

I know I shouldn't post on this thread (feeling grumpy...) but "they output de-interlaced which sucks"...

..."at best de-interlacing will reduce the resolution"...

The first statement is feeble... quite apart from the projector question, the picture quality on a normal CRT monitor beats the pants off any TV I've ever seen... all computers put out a de-interlaced signal.

The second statement is simply wrong... de-interlacing has no effect (that I know of) on resolution (you could argue it momentarily doubles it, but let's not go there). Scaling (converting the resolution of the DVD signal to the output resolution of the graphics card) does have an effect, and it is possible to reduce the resolution below that of the original DVD by scaling. But the real benefit is gained in eliminating the de-interlacing and scaling that would otherwise take place in the projector. With a few notable exceptions (such as the X1 or NEC 1000HT), projectors perform this function poorly... the effect of a moderately powered PC and a low- to mid-range Radeon is dramatic, to say the least.
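The scaling point can be shown with a toy example (my own, with hypothetical numbers): nearest-neighbour resizing a 576-line PAL frame down to 480 lines simply throws rows away, and scaling back up afterwards cannot recover them.

```python
def nearest_rows(lines, new_height):
    """Nearest-neighbour vertical resize of a frame (a list of lines)."""
    old_height = len(lines)
    return [lines[i * old_height // new_height] for i in range(new_height)]

pal_frame = list(range(576))            # one distinct 'line' per entry
shrunk = nearest_rows(pal_frame, 480)   # 96 lines of detail discarded
round_trip = nearest_rows(shrunk, 576)  # upscaling back: detail stays lost

# Count how many lines no longer match the original
lost = sum(1 for a, b in zip(pal_frame, round_trip) if a != b)
```

This is why a chain that scales down below the DVD's native 720x576 (PAL) and back up can end with less resolution than the disc started with, while a 1:1 pixel-mapped chain skips the resize entirely.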

The simple way to settle any dispute about picture quality is to compare a non-progressive (or progressive, for that matter) DVD player with a Radeon-based DVD-playing PC pixel-matched to the res of the projector... for my money there is no contest, full stop.

On the other hand, you could just ignore whoever you're quoting... sounds to me like they're talking through the wrong orifice.

Of course computer displays tend to display progressive frames... only a humpty dumpty could honestly believe it is better to send a display a signal it has to de-interlace and scale than one it does nothing to!

The biggest problem is that people watch DVDs on PCs that are totally not up to spec, not even remotely configured properly, and in totally ridiculous environments. All the same, people like that are impossible to argue with; I've tried. I know a guy who told me about his special Shuttle PC which can output interlaced at any resolution to his TV, and that this made it fantastic... and this is a technically competent guy!

We know progressive is better; in general it's easier just to accept that others don't understand sufficiently.


The executive summary: he's clueless. :D
The S-Video and/or composite outputs on a graphics card may output interlaced, but the VGA or DVI output would be de-interlaced/progressive. Maybe that's where your mate is getting mixed up about whether PCs output interlaced or progressive.
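A quick sketch of that distinction (my own illustration, using the same list-of-lines frame model as above): the TV-out path effectively splits each progressive frame into two fields before transmitting it, whereas VGA/DVI send whole frames.

```python
def split_fields(frame):
    """Split a progressive frame (a list of scan lines) into top and
    bottom fields, as an interlaced output like S-Video would
    transmit them: even-numbered lines first, then odd-numbered lines."""
    top_field = frame[0::2]     # lines 0, 2, 4, ...
    bottom_field = frame[1::2]  # lines 1, 3, 5, ...
    return top_field, bottom_field
```

The TV (or projector) at the far end then has to de-interlace those fields back into frames, which is exactly the step a progressive DVI/VGA connection avoids.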
