If videos and movies contained 100 complete pictures every second (at a suitably high resolution), then we would have a wonderful system for displaying moving images -- no flicker, good resolution and no strobing or smearing artifacts -- but it would require an awful lot of storage space or film, so it would not be feasible in practice. There are lots of possible compromises which reduce the amount that has to be stored without losing the quality that you are interested in.

Film is one compromise. It has 24 complete pictures a second. This would flicker horribly on its own, but the projector's shutter interrupts the display of each picture once or twice, turning the 24 pictures into 48 or 72 flashes per second. Surprisingly, this does stop the perception of simple large-area flicker, but fast-moving objects can move a long way in 1/24 s, so artifacts such as strobing (positions of an object are absent) or smearing (different positions of an object are all displayed at the one time) are sometimes problematic. This is your cinema big-screen experience.

Television (PAL, NTSC or SECAM) display of video is another compromise. The idea is to display 50 images per second (60 for NTSC) instead of 24 to reduce strobing and smearing. There is no re-display (interruption) of frames, so large-area flicker is much worse than the cinema, but acceptable for the small early TV screens. When television was invented, screen phosphors which fade quickly enough in 1/50 s weren't available, and the amount of data required for 50 complete images was also too great for the bandwidth available, so a clever compromise was created -- interlacing. The 50 Hz images had only half the number of lines of the full image, and consecutive images (fields) contained the odd and then the even lines. So for slow-moving images you had full image resolution, while for fast-moving objects you had half the vertical resolution but half the amount of strobing or smearing. A wonderful compromise!
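As a rough illustration of the odd/even split (the representation and function name here are invented for the sketch, with a frame as a simple list of lines):

```python
# A minimal sketch of interlacing, with a frame represented as a
# list of "lines" (just strings here). The odd field carries lines
# 1, 3, 5, ... of the frame and the even field lines 2, 4, 6, ...

def split_into_fields(frame):
    """Split a full frame into its odd and even fields."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ... (counting from 1)
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

frame = ["line1", "line2", "line3", "line4", "line5", "line6"]
odd, even = split_into_fields(frame)
print(odd)   # ['line1', 'line3', 'line5']
print(even)  # ['line2', 'line4', 'line6']
```

Each field is transmitted on its own, so a 50 Hz field rate only ever carries half the lines of the full picture at a time.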
There are, however, additional artifacts where small details (especially thin horizontal lines) are present in only one field, causing extremely bad flicker. These are called interlace artifacts.

Of course people like to watch films on television, and this is where the story really starts. How are the 24 complete frames per second broadcast (and watched) as 50 (or 60) fields? For PAL this is almost invariably done by speeding the film up by about 4% (1 frame per second). The sound is thus raised in pitch by roughly 0.7 of a semitone. The pitch can be processed back down again or left as it is. Two fields therefore correspond to the same film frame and the same point in time. So we have lost the improvement in strobing and smearing that television provides, and some television (interlacing) artifacts are still present. For NTSC the situation is slightly more complicated, since 24 frames have to be divided amongst 60 fields. This requires that each film frame correspond to 60/24 = 2.5 fields, which is achieved by making alternate film frames yield 2 and then 3 fields. This leads to a slight unevenness of motion (the motion judder artifact). The sound can be left at the original speed. You can see that this is significantly different from PAL.

Now of course we want to put both films and television video on DVDs. We want to be able to use DVDs with 50 or 60 Hz televisions, so DVDs basically store the fields that would be displayed on such a television. As television screens have got larger, the large-area flicker and the interlace-flicker artefacts have become much more noticeable. To get around this we need to display the images at, say, 100 or 120 Hz. Why don't we just repeat the display of each field (several times) to achieve the reduction of large-area flicker in the same way that the cinema does? This requires that the image be stored temporarily (which needs to be done well or else we lose some quality -- but this should be no problem for today's technology).
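The two film-to-video schemes above can be sketched numerically (the function name is invented for illustration): PAL simply runs the film at 25/24 speed, while NTSC 2:3 pulldown takes alternately 2 and then 3 fields from each film frame:

```python
import math

# PAL speed-up: 24 fps film played at 25 fps raises the pitch by
# 12 * log2(25/24), which is roughly 0.7 of a semitone.
pitch_shift_semitones = 12 * math.log2(25 / 24)
print(round(pitch_shift_semitones, 2))  # 0.71

# NTSC 2:3 pulldown: alternate film frames yield 2 and then 3
# fields, so each pair of frames gives 5 fields and 24 frames
# become exactly 60 fields per second.
def pulldown_fields(num_frames):
    """Return, for each output field, the film frame it comes from."""
    fields = []
    for frame in range(num_frames):
        fields.extend([frame] * (2 if frame % 2 == 0 else 3))
    return fields

fields = pulldown_fields(24)
print(len(fields))  # 60 -- i.e. 2.5 fields per frame on average
print(fields[:5])   # [0, 0, 1, 1, 1]
```

The `[0, 0, 1, 1, 1]` pattern is the source of the judder: frame 0 is on screen for 2/60 s but frame 1 for 3/60 s.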
With films stored on a PAL DVD this is no problem, since the two fields correspond to the same point in time and we can produce each frame by just interleaving the lines. This can be done in the display, but it is sometimes slightly more satisfactory to do it in the DVD player (since it can be done digitally and the player has extra information) and then feed the television with a special sort of non-interlaced (progressive scan) signal. The standard for this signal was not part of the PAL DVD spec -- partly because the way Macrovision security can be added to such a signal to stop copying had not been agreed.

With films stored on an NTSC DVD the situation is more complicated, since each film frame is repeated for varying numbers of fields at different times (see above), but it can still be done either in the television or in the DVD player. With NTSC it is clearly more efficient to do the interweaving in the DVD player, since the player can do this digitally and has extra information about the video signal stored on the DVD. Interweaving in the player does, however, require the use of the special non-interlaced (progressive scan) signal, which is not playable on all sorts of televisions. Progressive scan was nevertheless part of the spec for NTSC DVD, since it is such a worthwhile addition. A television trying to interweave the fields has to store quite a lot of fields and work out which ones have to be interleaved.

The difficult case is playing back interlaced video on a 100 or 120 Hz screen. The problem is interlacing itself: each line is essentially black for half the time (while the other field is being displayed), and repetition will not stop this causing large-area flicker. The only solution is to put something in this blank time-slot. One approach is to fill the blank lines with a copy of the field that is being displayed at that time (i.e. double its lines). This is roughly the basic "double lines" or "bob" solution provided as the least complicated 100 Hz option on many televisions.
This reduces vertical resolution and/or motion smoothness a little, and interlacing artefacts may still be present. The other extreme is to display each field for twice the time intended. This is sometimes called "weave", as each frame is basically the result of weaving together two fields into a single image. With slow-moving images this works extremely well, and you get full-resolution images without any large-area-flicker or interlace-flicker artefacts. However, when objects are moving quickly, the two fields being woven together actually correspond to very different points in time on alternating lines, and the positions of the object are very different. The combined image is very confused, showing a sort of tearing (the tearing motion artefact) which is unpleasant to watch. This is the basic 100 Hz display method, but televisions usually have some way of selectively using an average of both fields on both lines where objects on the screen are moving rapidly. More sophisticated 100 Hz methods try to predict the exact motion between fields and calculate what the field would look like at this intermediate time. This can work well for simple motions with simple objects, but as you can imagine it can occasionally get it wildly wrong and spoil the whole thing.

With both PAL and NTSC it is probably still better for the DVD player to do the interweaving, since it can be done digitally and with more information, but the situation is perhaps not so clear cut. Playing on a computer is basically the same as a 100 Hz television, and software can perform the de-interlacing digitally using extra information from the player and the display screen. For interlaced television (i.e. everything except films), 100 Hz playback sometimes fails to impress, even though it has greatly reduced flicker and artefacts. Is this because de-interlacing methods either involve a slight blurring or involve guesswork which sometimes fails?
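The basic "bob" and "weave" strategies just described can be sketched in a few lines (the names and the list-of-lines representation are illustrative only):

```python
# Fields are lists of lines (strings here). "Bob" fills the missing
# lines of a single field by doubling its own lines; "weave"
# interleaves two fields into one full frame.

def bob(field):
    """Full-height frame from one field by naive line doubling."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # real sets interpolate rather than copy
    return frame

def weave(odd_field, even_field):
    """Full frame by interleaving the lines of two fields."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

# A static scene: weave recovers the full frame exactly.
odd, even = ["A1", "A3"], ["A2", "A4"]
print(weave(odd, even))  # ['A1', 'A2', 'A3', 'A4']

# Bob keeps the full 50/60 Hz motion but halves vertical detail.
print(bob(odd))          # ['A1', 'A1', 'A3', 'A3']
```

The tearing artefact falls out of the weave: when the scene moves between fields, the odd and even lines of the woven frame show the object in two different positions.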
While this is partly true, I think it is also that once all flicker and artefacts are removed, the remaining artefacts (which are present in 50 Hz playback but are partly hidden by all the other flickerings and noise etc.) suddenly stand out. These are sometimes suppressed by smoothing things out still more, giving an effect of blandness compared to interlaced video.