No. A VGA signal is always progressive (not interlaced).
Digital displays scale the incoming pixels to their native resolution. Analogue displays generally accomplish this differently - by syncing to the incoming signal and effectively stretching the image to fill the screen (it's an electronic process, not really scaling in the digital sense). Only multi-sync monitors can do this, but that covers most of them.
Well, you'd probably only notice on high-motion scenes anyway (the comb effect/jaggies). I'm not very familiar with CCTV systems, so I can't really explain what you are seeing. But the system could be capturing only half the fields (or half-size frames), in which case you wouldn't see any interlacing artefacts at all.
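To illustrate that last point, here's a minimal sketch (hypothetical names and sizes) of why weaving two interlaced fields of a moving object produces the comb effect, and why keeping only one field avoids it:

```python
# A bright vertical bar moves between the two field captures; odd and even
# rows of the woven frame therefore show the bar at different positions.

WIDTH, HEIGHT = 16, 8

def field(bar_x, parity):
    """Capture only the rows of one field (parity 0 = even rows)."""
    return {y: [1 if x == bar_x else 0 for x in range(WIDTH)]
            for y in range(parity, HEIGHT, 2)}

def weave(even, odd):
    """Weave two fields into a full frame, with no deinterlacing filter."""
    return [even[y] if y % 2 == 0 else odd[y] for y in range(HEIGHT)]

# The fields are captured a field-period apart, so the bar has moved
# from x=5 to x=9 in between.
frame = weave(field(5, 0), field(9, 1))

# Adjacent rows disagree about where the bar is - the comb artefact.
assert frame[0].index(1) == 5
assert frame[1].index(1) == 9
```

If the capture card instead stores only one field (half the rows), there is no second instant in time to comb against, so a half-resolution recording looks artefact-free.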