> The cause of Interlaced motion blur? <

Discussion in 'General TV Discussions Forum' started by Nielo TM, May 6, 2007.

  1. Nielo TM

    Hey guys

    I have a few complex questions I would like to ask (hope you don't mind).


    A few days ago, I thought I fully understood the cause of motion blur in interlaced sources, but I'm not quite sure anymore.

    This is an image of what I thought to be the cause of motion blur in interlaced sources.



    [image: diagram showing two out-of-time fields combined into one frame, with the motion between them appearing as blur]





    That image makes sense to me because I have seen a great deal of commercials and programs in the past containing animated videos without any sign of motion blur.

    When I played games that are rendered at 50/60fps (Call of Duty, PGR1, TimeSplitters etc.), I failed to notice any motion blur, and I'm pretty sure you haven't either.

    Am I right in thinking that, when a full motion video (50/60fps) is converted from progressive to interlaced, the motion blur is eliminated? If the answer is yes, does my image represent the correct answer?
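
    To make the conversion I'm describing concrete, here is a minimal sketch in Python (numpy assumed; the function name is just for illustration) of one common way 50/60p frames are split into fields - each field keeps only the odd or even lines of a single progressive frame, so it still represents one instant in time:

    import numpy as np

    def progressive_to_interlaced(frames):
        """frames: (n, height, width) array of 50/60p frames.
        Each output field keeps the odd or even lines of ONE frame,
        so every field still comes from a single instant in time."""
        fields = []
        for i, frame in enumerate(frames):
            parity = i % 2                   # alternate top/bottom fields
            fields.append(frame[parity::2])  # keep every other line
        return fields

    # Toy example: 4 tiny 8x8 "frames"
    frames = np.arange(4 * 8 * 8).reshape(4, 8, 8)
    print([f.shape for f in progressive_to_interlaced(frames)])
    # -> [(4, 8), (4, 8), (4, 8), (4, 8)]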
     
  2. Welwynnick

    I think I would tend to argue for the opposite - interlacing may tend to reduce motion blur, but it's nothing like as simple as that. It's essential to understand the mechanisms used to capture, process and display the video.

    Interlaced video is generally used for TV, and the signal is interlaced throughout the chain. If you are watching on a CRT TV, then it will generally be replayed as interlaced. If you are watching on a digital / fixed-pixel / flat panel display, it will generally be displayed as progressive, and an i to p conversion process will be required. This process may in itself generate a loss of resolution during motion, which may appear to be motion blur.

    Video from film starts off progressive, because interlaced cameras were never used in the first place - the images were captured on film, which is inherently progressive. It will be interlaced for broadcast or distribution on VHS or DVD. However, odd and even interlaced fields will still come from the same, original progressive frame that was taken from the film frame. On replay, there is no motion between these fields, so some say there will be no motion blur. A digital display may recover the original frame effectively, but many will still lose resolution when de-interlacing (though they shouldn't).

    However, film-source video was still only captured at 24 Hz (unlike real video, which is 50 or 60 Hz). Therefore, rapid motion will still not be replayed smoothly, and will have some sort of blurring or stepping, depending on how the director and cinematographer shot it. So you don't get something for nothing.

    I think the link is misleading, as it apparently shows CRT replay, but doesn't consider the benefit of retaining interlaced video throughout the chain from camera to TV. And that is that more images are captured than could be done with progressive, other things being equal. It's just that they are interlaced fields, rather than progressive frames. The link seems to assume that two fields are viewed at the same time, and that any motion between them will appear as blur. Well, they're not, and it won't. The two fields will be shown at different times, and motion will appear as motion.

    Nick
     
  3. Nielo TM

    Thanks for the reply, Nick :thumbsup:

    I understand nearly everything about interlaced and progressive except motion blur.

    When watching a program that's recorded in full motion (50/60i) on a CRT, there is noticeable motion blur. However, when playing games at 50/60i, there is absolutely none.

    Here are a few examples: programs such as soaps, wildlife and news have very distinguishable motion blur. However, when they show animated 2D/3D clips, there's none whatsoever.

    The same happens when playing full motion video games (COD, Burnout etc.). At 50/60i, there is no motion blur.


    Please keep in mind that we are talking about interlaced signals on CRT.
     
  4. Chris Muriel

    Good explanation by Nick.
    Animations always tend to look better - that's why they are what is commonly displayed on panels in the shops/showrooms.
    The actual amount of blur will depend on how good the scaler is in your TV and results can sometimes be improved by adjusting various settings (but which ones to adjust depends on the model).

    Chris Muriel, Manchester
     
  5. Nielo TM

    Scalers, video processors, de-interlacers and other forms of image enhancers do cause motion blur, but we're strictly talking about a simple CRT lol (even though this thread is under HDTV hardware).

    This is very confusing lol.

    Here's another example.

    Let's just say two cameras are placed on a rotating table, and it takes 5 seconds for the table to complete one revolution. One of the cameras is interlaced (A) and the other is progressive (B). After recording for 1 minute, the footage taken from the progressive camera is converted to interlaced.

    The videos from the cameras are transferred onto a DVD.

    When the video is played back on an SD CRT, the footage recorded from camera B does not have any form of motion blur. However, the footage recorded from camera A has a considerable amount of motion blur.
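
    For reference, here is a rough timing sketch of the two pipelines (Python, with hypothetical numbers; it assumes camera B runs at 50p before conversion). On these assumptions, both pipelines deliver one half-height field per instant with identical time stamps, so any visible difference would have to come from something else, such as the exposure time of each capture:

    FIELD_RATE = 50.0  # fields per second

    def camera_a_field_times(n):
        # Interlaced camera: each half-height field exposed at its own instant.
        return [k / FIELD_RATE for k in range(n)]

    def camera_b_field_times(n):
        # 50p camera converted to 50i: each field is cut from one full frame,
        # so it keeps that frame's time stamp - again one instant per field.
        return [k / FIELD_RATE for k in range(n)]

    print(camera_a_field_times(4))  # [0.0, 0.02, 0.04, 0.06]
    print(camera_b_field_times(4))  # [0.0, 0.02, 0.04, 0.06]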

    Why is that?
     
  6. NonPayingMember

    Previously Liam @ Prog AV
    What kind of display are we talking about specifically and what symptoms are you seeing that you are describing as motion blur? I think what is confusing is taking into account the moving image over time, rather than just a 2D interlacing issue. However this is something so much easier to describe with a blackboard in front of me (a picture paints a thousand words after all!).

    Taking the image from the link, the first scenario I assume is material that was captured in an interlaced manner, i.e. the camera itself was recording the image as a series of half frames over time (never capturing a whole frame at a single point in time). So there is a distinct timeline of fields (50 of them a second a la 50i), which are made up of half the resolution of a frame.

    Field 1, Field 2, Field 3 etc., and as in your image, things will have moved between each capture so they cannot be weaved back together. If they were, it would cause combing. This is fine for interlaced displays (traditional CRT TV) since the TV also displays things one field at a time, 50 times a second. The eye resolves the flickering half-resolution fields into a moving full-resolution image.

    On a fixed pixel or progressive display (e.g. plasma, LCD, DLP) the display must show a full frame of information, 50 times a second (a la 50p). However the mistake the image above makes is assuming each pair of fields are combined even though they are out of time with each other. In a very simplistic sense the processor in the display only takes one of the fields, and for each missing line interpolates an average of the information in the line above and the information in the line below. It basically makes a well informed guess! More advanced video processors will go on to use the frame in front and the frame behind to analyse if any movement has taken place. If not, then the two fields can be weaved together to make a full frame without combing or blur happening. Even more advanced processors take into account many fields, and/or analyse each pixel rather than just each line in order to decide whether to weave, or whether to interpolate the data.
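
    A minimal sketch of that simple interpolation step (Python/numpy assumed; real processors are of course far more sophisticated):

    import numpy as np

    def interpolate_field(field, top=True):
        """Rebuild a full frame from one half-height field by averaging
        the real lines above and below each missing line."""
        h2, w = field.shape
        h = h2 * 2
        frame = np.zeros((h, w), dtype=float)
        start = 0 if top else 1
        frame[start::2] = field                 # put the real lines back
        for i in range(1 - start, h, 2):        # fill the missing lines
            above = frame[i - 1] if i - 1 >= 0 else frame[i + 1]
            below = frame[i + 1] if i + 1 < h else frame[i - 1]
            frame[i] = (above + below) / 2.0    # the well-informed guess
        return frame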

    Motion blur, or whatever artefact you are seeing, is not a product of weaving two out of time fields together. It might be that the video processing is so bad, that the interpolated information used to build up the frame is lacking in detail so just looks soft and smudgy, and so has the effect of being blurred when displayed as a moving image. The blur could also be something completely different, LCD lag for example.

    Something shot progressively in the first instance (e.g. 25p film) shouldn't show deinterlacing artefacts. In the case of interlaced CRT TV, the 25 full resolution frames are broken into 50 half resolution fields (a la 50i again). Only this time, observed in pairs, there is no movement in time between two fields since they were originally a single capture. The CRT TV will show the half frames as before, and the human eye will assemble the image into a moving picture. The progressive display will attempt to detect that the image was originally progressive (cadence detection) and will try to buffer the information and weave the fields back together again (inverse telecine). It will then have in its memory bank a 25p signal, and will simply show every full frame twice to make up the 50p refresh rate. This is only in an ideal world, unfortunately. In the case of PAL it is very difficult to detect film, and so often the processor will still use the interpolative technique and give you some deinterlacing artefacts (resolution pumping, line twitter etc).
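
    As a toy illustration of that cadence idea (Python/numpy; the comb metric and threshold are invented purely for illustration - as noted, real film detection is much harder than this):

    import numpy as np

    def weave(top, bottom):
        frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=float)
        frame[0::2], frame[1::2] = top, bottom
        return frame

    def comb_energy(frame):
        # big line-to-line jumps suggest the two fields were out of time
        return float(np.mean(np.abs(np.diff(frame, axis=0))))

    def deinterlace_2_2(tops, bottoms, threshold=10.0):
        out = []
        for top, bottom in zip(tops, bottoms):
            frame = weave(top, bottom)
            if comb_energy(frame) < threshold:
                out += [frame, frame]   # film pair: show twice for 50p
            else:                       # fall back to a crude bob
                out += [np.repeat(top, 2, axis=0),
                        np.repeat(bottom, 2, axis=0)]
        return out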

    Finally, while I'm on a full ramble, scaling. In most cases the progressive display is gonna need to rescale the incoming signal to match the native resolution of the panel (e.g. a 1366 x 768 panel from a 720 x 576 PAL source). This could cause an appreciable blur if the scaling is not done well. In an upscaling step especially, the scaler can easily soften or blur detail out of an image as it tries to represent fine detail (say something only a couple of pixels in size) across a larger array of pixels. Jaggies can kick in trying to represent a curved edge on what is essentially just an image made up of squares!
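
    A tiny example of that softening effect (Python/numpy, 1-D for simplicity, using plain linear resampling; real scalers use better filters): a one-pixel detail upscaled by a non-integer factor gets smeared across its neighbours.

    import numpy as np

    def resample_linear(row, new_len):
        old_x = np.arange(len(row))
        new_x = np.linspace(0, len(row) - 1, new_len)
        return np.interp(new_x, old_x, row)

    line = np.zeros(9)
    line[4] = 1.0                              # one-pixel-wide detail
    print(resample_linear(line, 12).round(2))  # the 1.0 smears into two ~0.64s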

    I'd love to continue on, but have a beer garden to get to!
     
  7. Nielo TM

    Thanks for the reply, Liam :thumbsup:

    There are many factors affecting an FPD (Fixed Pixel Display), so it's best to leave that out for the moment.


    The part I'm very confused about is full motion video.
     
  8. Mr.D

    Some fairly simple image properties are getting overcomplicated here.

    Real "motionblur" is caused by the capture duration of an individual image being overly long relative to the motion of the subject. The subject or scene moves during the capture period. End result is that the motion during that capture interval is blurred throughout the captured image.

    Whether a sequence of images is captured as frames or fields has no bearing on this. What does matter is the capture interval. To simplify things for this particular issue you should consider fields as merely half-height frames.
    If you are shooting 24fps you are obviously going to have a longer capture period than shooting 50 or 60fps, so the likelihood and severity of motion blur is increased. Motion blur is essentially temporal aliasing.
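
    A small simulation of that point (Python/numpy, with a made-up 1-D "sensor" and arbitrary numbers): the width of the smear depends only on how far the subject moves during the exposure, not on whether the result is stored as frames or fields.

    import numpy as np

    def expose(width, speed_px_per_s, shutter_s, samples=64):
        """Integrate a moving one-pixel bright bar over one exposure."""
        row = np.zeros(width)
        for s in range(samples):
            t = shutter_s * s / samples          # time within the exposure
            x = int(speed_px_per_s * t) % width  # bar position at time t
            row[x] += 1.0 / samples
        return row

    fast = expose(32, 400, shutter_s=1 / 50)    # long shutter: ~8 px smear
    slow = expose(32, 400, shutter_s=1 / 1000)  # short shutter: sharp
    print(int((fast > 0).sum()), int((slow > 0).sum()))  # 8 vs 1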

    That's me keeping it simple. I won't go into how human beings perceive motion, or whether film motion blur is aesthetically pleasing (well, it is compared with strobing), and neither will I branch into how the capture period is modulated with fixed-capture-rate technologies.


    Deinterlacing is also part of the problem.

    The simplest way of deinterlacing an interlaced video sequence is to "weave" the two fields together. If the material was originally shot as frames and you get the field order correct... bingo, you've got coherent frames.
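
    A minimal weave, as a sketch (Python/numpy assumed; swap the two arguments and you have the wrong field order):

    import numpy as np

    def weave(top_field, bottom_field):
        """Interleave two half-height fields back into one full frame."""
        h2, w = top_field.shape
        frame = np.empty((h2 * 2, w), dtype=top_field.dtype)
        frame[0::2] = top_field      # lines 0, 2, 4, ...
        frame[1::2] = bottom_field   # lines 1, 3, 5, ...
        return frame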

    If the material was shot interlaced, with each field being temporally separate from its neighbours, then weaving the fields together will give you mismatched field artifacts on any movement within the scene.

    So you need to employ a different deinterlacing technique.
    The "simplest" method is called a "bob" deinterlace. Each half height field is scaled up to full height and shown once as a full frame albeit one missing half its vertical resolution: fine horizontal detail seems to bob up and down as the nearest real line of information varies from filed to field. The advantages of this method are the complete lack of mismatched field artifacts even with broken cadence material (its the common default and fallback deinterlace method for lots of displays)
    The downside of this method is bandwidth as you've effectively doubled the information you are actually displaying although half that information is essentially just interpolated.
    This method won't exacerbate motionblur.
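
    A sketch of a bob for a top field (Python/numpy assumed; each field becomes its own full-height frame, doubling the output rate):

    import numpy as np

    def bob(field):
        """Scale one half-height field to full height; the missing lines
        are interpolated, so half the vertical detail is a guess."""
        h2, w = field.shape
        frame = np.empty((h2 * 2, w), dtype=float)
        frame[0::2] = field                             # the real lines
        frame[1:-1:2] = (field[:-1] + field[1:]) / 2.0  # interpolated lines
        frame[-1] = field[-1]                           # clamp the bottom line
        return frame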

    Another method is a field-average deinterlace. Each field again is scaled to full height as in the bob deinterlace, but rather than double the frame rate, each bobbed frame is averaged with its neighbour (50/50 mix) to keep the data rate down. This method produces ghosting on movement, which some people may incorrectly identify as "motionblur". It's not; it's a deinterlacing artifact. This deinterlace is frankly rubbish, as it also produces very soft results.
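
    And a sketch of that field-average method (Python/numpy; it assumes the fields have already been bobbed to full height, e.g. with a bob step like the one above). The 50/50 mix of two out-of-time frames is exactly what produces the ghosting:

    import numpy as np

    def field_average(bobbed):
        """bobbed: list of full-height frames, one per field. Each output
        frame is a 50/50 mix with its neighbour, so anything that moved
        between fields shows up twice: ghosting."""
        return [(a.astype(float) + b.astype(float)) / 2.0
                for a, b in zip(bobbed[:-1], bobbed[1:])]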

    Most other types of deinterlace use variations on the above methods, or switch between them based on detected movement thresholds, with the exception of motion-compensated types, which are normally not found on anything but the most advanced deinterlacers.

    I've avoided explanations of different cadence types (3:2 and 2:2 pulldown) as they don't have any bearing on whether a given deinterlacer produces ghosting or not.

    As far as game imagery is concerned, it makes no difference. Unless it's being created by a bad deinterlace, "motionblur" on games is a function of the shutter angle of the camera in a rendered 3D scene, unless it's an additional and very fake-looking filter effect. As far as a deinterlacer is concerned it's just a bunch of fields, cadence issues aside.
     
  9. Nielo TM

    Like I said before, I know nearly everything about 3:2, 2:2, de-interlacing, etc...


    But what I don't know about is full motion video. I'm not talking about 24, 25 or 29.97; I'm purely talking about 50/60i on a CRT.


    FPDs, on the other hand, are a different matter. This is the only area I'm really confused about.



     
  10. Nielo TM

    I now know that at 50/60i the lines do become misaligned, but I can't seem to understand why there's no motion blur on video games and videos that were converted from progressive. :confused:
     
  11. Welwynnick

    I don't know much about video games, but I guess they are created as a series of "perfect" stills. Real life is different, and just as a still camera will give you a blurry picture of a fast-moving object if the shutter speed is too long, the same may happen with video. It does depend how it is shot, though. Think about some of the slow-motion replays of sports events, or the opening battle scene in Gladiator, where there is strangely very little blur. A lot of films, even computer-animated ones, often include some intentional blurring to suggest motion.

    Nielo, can you clarify what sort of display you are using?

    Nick
     
  12. Nielo TM

    I am using a CRT.

    The thing is, whenever I watch a program recorded in full motion video (50i), there's always noticeable blur. However, I have seen very few videos that have no motion blur at all.

    Those are likely to have been captured in progressive and converted to interlaced.


    Maybe this image will show you what I mean:


    http://img503.imageshack.us/img503/5654/interlacedwt0.png
     
  13. StevenBagley

    Are you seeing this with material shot on film (or for a film-like look), e.g. most drama on TV, or with material shot on video (e.g. sport)?

    If it is the former, then you are likely seeing a totally separate issue which has nothing to do with motion blur, but is down to how the brain tracks movement. Because each frame of film is repeated twice when played back, the brain can detect two separate motion paths an object could have taken; not being able to fuse them into one, it decides that there must have been two overlapping images, and so you get a double-imaging effect on movement.

    The noticeability of this generally depends on how sharp the image is at certain frequencies (and so electronic images need to be softened to match film's MTF), how bright it is (it's more noticeable at home than in a darkened cinema) and how fast objects or the camera are moving (you'll notice that films tend to track movement, which reduces the double imaging, since the tracked object is now effectively stationary and the background is out of focus).

    Steven
     
