Monitor de-interlacing

bobsplace

Does a normal PC LCD or CRT provide any de-interlacing?

I'm assuming it scales to the native resolution?
 
No. A VGA signal is always progressive (not interlaced).

Digital displays scale the pixels to the native resolution. Analogue displays generally accomplish this in a different way - by syncing to the incoming signal and effectively stretching the image to fill the screen (but it is an electronic process - not really scaling in a digital sense). Only multi-sync monitors do this (which is most of them).
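
To illustrate the digital side, here's a minimal sketch of nearest-neighbour scaling to a native resolution, assuming frames arrive as NumPy arrays (the function name and parameters are just for illustration):

```python
import numpy as np

def nearest_neighbour_upscale(frame, out_h, out_w):
    # Map each output pixel back to the closest source pixel -
    # roughly what a panel's scaler does when the input isn't native.
    in_h, in_w = frame.shape[:2]
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return frame[rows][:, cols]
```

Real scalers use fancier interpolation, but the idea is the same: the panel always lights up its native grid of pixels.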
 
Thanks for that Mike :thumbsup:

I was wondering as I have bought a CCTV capture card, and it has the option of interlace and de-interlace within the software.

I can't tell any difference at all between the two.
 
Well, you'd probably only notice on high-motion scenes anyway (comb effect/jaggies). I'm not really very familiar with CCTV systems, so I can't really offer an explanation of what you are seeing. But it could be capturing only half the frames (or half-size frames), in which case you wouldn't see any interlacing artefacts anyway.
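
For what it's worth, the two common software approaches look roughly like this - a minimal sketch assuming top-field-first frames as NumPy arrays (the names are illustrative, not from any particular capture card's software):

```python
import numpy as np

def split_fields(frame):
    # Even lines form the top field, odd lines the bottom field.
    return frame[0::2], frame[1::2]

def bob(field):
    # "Bob": double each field line to rebuild a full-height frame.
    # Halves vertical resolution but never combs on motion.
    return np.repeat(field, 2, axis=0)

def weave(top, bottom):
    # "Weave": re-interleave the two fields. Perfect for static
    # scenes, but combs wherever there is motion between fields.
    h = top.shape[0] + bottom.shape[0]
    frame = np.empty((h,) + top.shape[1:], dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame
```

That's also why you'd mainly see a difference between the two settings on movement: on a static scene, weaved output looks identical to de-interlaced output.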
 
No. A VGA signal is always progressive (not interlaced).
Actually, believe it or not, there is such a thing as interlaced VGA. Thankfully it's more or less impossible to find now, and for that reason most new screens will not display the signal.

So no, new monitors don't have de-interlacers. That has to be done in software.
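
As a concrete example of doing it in software, here's a sketch that reuses split_fields and bob from the earlier post, with OpenCV handling the capture (the device index 0 is hypothetical - yours may differ):

```python
import cv2

cap = cv2.VideoCapture(0)  # hypothetical device index for the capture card
ok, frame = cap.read()
if ok:
    top, bottom = split_fields(frame)   # from the earlier sketch
    progressive = bob(top)              # de-interlace in software
    cv2.imwrite("deinterlaced.png", progressive)
cap.release()
```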
 
Thanks for the input.

In your opinion, what would yield the best PQ:

640x480, or

640x480 de-interlaced?
 
Nice correction :thumbsup:

(But of course, the monitor isn't de-interlacing in that case - it's just showing half frames quickly! ;) )
 
Custom interlaced resolutions over VGA are used by a lot of HTPC fans.
 
