monitor de-interlacing

MikeTV

Well-known Member
No. A VGA signal is always progressive (not interlaced).

Digital displays scale the pixels to the native resolution. Analogue displays generally accomplish this in a different way - by syncing to the incoming signal and effectively stretching the image to fill the screen (but it is an electronic process - not really scaling in a digital sense). Only multi-sync monitors can do this (which is most of them).
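
To make the digital case concrete, here's a rough Python sketch (not any real monitor's firmware, just an illustration) of the simplest kind of scaling a fixed-pixel panel can do - nearest-neighbour mapping of an incoming resolution onto the native grid. The frame layout and the resolutions in the example are assumptions.

```python
# Minimal sketch, not any particular monitor's firmware: nearest-neighbour
# scaling, i.e. the simplest way a fixed-pixel panel can map an incoming
# resolution onto its native pixel grid.

def scale_nearest(frame, src_w, src_h, dst_w, dst_h):
    """frame is a list of src_h rows, each a list of src_w pixel values."""
    out = []
    for y in range(dst_h):
        src_y = y * src_h // dst_h      # nearest source row
        row = []
        for x in range(dst_w):
            src_x = x * src_w // dst_w  # nearest source column
            row.append(frame[src_y][src_x])
        out.append(row)
    return out

# e.g. map a 640x480 VGA frame onto a hypothetical 1280x1024 native panel:
# panel_image = scale_nearest(vga_frame, 640, 480, 1280, 1024)
```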
 

bobsplace

Active Member
Thanks for that Mike :thumbsup:

I was wondering as I have bought a CCTV capture card and it has the option of interlace and de-interlace within the software.

I can't tell any difference at all between the two.
 

MikeTV

Well-known Member
Well, you'd probably only notice on high-motion scenes anyway (comb effect/jaggies). I'm not very familiar with CCTV systems, so can't really offer an explanation of what you are seeing. But it could be capturing only half the frames (or half-size frames), in which case you wouldn't see any interlacing artefacts anyway.
 
No. A VGA signal is always progressive (not interlaced).
Actually, believe it or not, there is such a thing as interlaced VGA. Thankfully it's more or less impossible to find now, and for that reason most new screens will not display the signal.

[Attached image: monitorq3ee.gif]


So no, new monitors don't have deinterlacers. That has to be done by software.
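
Since the thread is about where the de-interlacing actually happens, here's a hedged Python sketch of the two most basic software methods - "weave" and "bob". It assumes the capture software hands you the two fields of each frame as separate lists of scanlines; the function names and that field layout are just assumptions for the example.

```python
# Rough sketch of the two simplest software de-interlacing methods.
# Assumes each interlaced frame arrives as two separate fields
# (odd and even scanlines), each a list of rows of pixel values.

def weave(top_field, bottom_field):
    """Interleave the two fields back into one full-height frame.
    Perfect on static scenes, but shows the 'comb' effect on motion,
    because the two fields were captured roughly 1/50s-1/60s apart."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)       # scanlines 0, 2, 4, ...
        frame.append(bottom_line)    # scanlines 1, 3, 5, ...
    return frame

def bob(field):
    """Line-double a single field into a full-height frame.
    No combing, but only half the vertical resolution per picture."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(list(line))     # repeat the line (a real deinterlacer
                                     # would interpolate between lines)
    return frame
```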
 

bobsplace

Active Member
Thanks for the input.

In your opinion, what would yield the best PQ?

640x480

648x480 de-interlaced?
 

MikeTV

Well-known Member
Nice correction :thumbsup:

(but of course, the monitor isn't deinterlacing in that case, but showing half frames quickly! ;) )
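
For illustration, a toy Python sketch of what "showing half frames quickly" means in practice: each field is line-doubled and pushed to the screen on its own, so you get 50/60 half-resolution pictures a second rather than 25/30 woven full frames. The display_frame callback here is made up for the example.

```python
# Toy illustration of "showing half frames quickly": every field is shown
# on its own, line-doubled, at the full field rate (50 or 60 Hz).
# display_frame is a made-up stand-in for whatever actually draws a picture.

def play_fields(fields, display_frame):
    for field in fields:                                    # arrives at field rate
        doubled = [line for line in field for _ in (0, 1)]  # line-double each field
        display_frame(doubled)                              # one picture per field
```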
 
