> The cause of Interlaced motion blur? <

Nielo TM

Hey guys,

I have a few fairly complex questions I'd like to ask (hope you don't mind).


A few days ago I thought I fully understood the cause of motion blur in interlaced sources, but now I'm not quite sure anymore.

This is an image of what I thought to be the cause of motion blur in interlaced sources.


[Attached image: interlacedwt0.png]

That image makes sense to me because, in the past, I saw a great deal of commercials and programmes containing animated video without any sign of motion blur.

When I played games rendered at 50/60fps (Call of Duty, PGR1, Timesplitters etc.), I failed to notice any motion blur, and I'm pretty sure you haven't either.

Am I right in thinking that when a full-motion video (50/60fps) is converted from progressive to interlaced, the motion blur is eliminated? If the answer is yes, does my image represent it correctly?
 
I think in many ways this issue is quite pertinent, as the increased resolution with HD will only be a benefit where there is little or no motion. As soon as something starts to move quickly, the resolution is lost for one reason or another, and you may as well have SD.

I don't think it's fair to pin the blame on interlacing, though. The so-called misalignment of the fields is actually wanted information - it shows how the subject has moved between fields. With progressive frames there is still movement between frames, so couldn't that be considered misalignment as well?

It's just the same thing. As long as you are replaying the interlaced video on a CRT display, it will be reproduced in just the same way it was captured. The important thing is that the view in each successive field is captured at a different instant in time, and is replayed at a different instant in time.

The interlaced fields are not replayed together with one sitting on top of the other; each one shows the scene at a different time. Of course, with any sort of digital display things would be quite different, but that's another discussion.
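To put rough numbers on it (purely illustrative, PAL-style 50i vs 25p timing), the capture/display instants work out like this:

[CODE]
# Illustrative only: when 50i fields and 25p frames are captured/shown (PAL-style timing)
FIELD_PERIOD = 1 / 50   # seconds between interlaced fields
FRAME_PERIOD = 1 / 25   # seconds between progressive frames

for n in range(4):
    parity = "odd" if n % 2 == 0 else "even"
    print(f"field {n} ({parity} lines) shown at t = {n * FIELD_PERIOD * 1000:.0f} ms")

for n in range(2):
    print(f"frame {n} (all lines)   shown at t = {n * FRAME_PERIOD * 1000:.0f} ms")
[/CODE]

Each field really is a snapshot of a different instant, and that is exactly what the CRT reproduces.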

Nick
 
Let's just say two cameras are placed on a rotating table, and it takes 5 seconds for the table to complete one revolution. One of the cameras is interlaced (A) and the other is progressive (B). Both cameras are recording at full motion (60i and 60p). After recording for 1 minute, the footage from the progressive camera is converted to interlaced.

The videos from both cameras are transferred onto a DVD.

When the video is played back on an SD CRT, the footage recorded by camera B did not have any form of motion blur. However, the footage recorded by camera A had a considerable amount of motion blur.

Why is that?
 
How high a res monitor do you need to see those images without scrolling???!

Anyway, it depends what you mean by motion blur and what the display device is. Real-life video or film will have motion blur in it because the CCD/film is exposed for a short period of time (interlaced or not) while the subject moves. Games don't, because they draw a static frame. Blur actually helps to reduce judder and make things look like they're animated more smoothly than they really are (24fps or whatever). Back in the day, 3Dfx tried to reproduce that effect in hardware, during their "we're getting panned by nVidia and have to come up with something new" period.
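If it helps, here's a very rough Python sketch of that difference (all the numbers are made up): a camera averages the scene over the time the sensor is exposed, while a game evaluates the scene at a single instant.

[CODE]
# Rough sketch, made-up numbers: an object moving at 600 px/s, 1/120 s exposure.
# A camera integrates over the exposure; a game samples one instant.
SPEED = 600.0        # px per second
SHUTTER = 1 / 120    # exposure time (camera only)
SAMPLES = 8          # sub-samples approximating the integration

def camera_frame(t_start):
    # Average the object's position while the shutter is open -> a smear SPEED * SHUTTER px wide.
    positions = [SPEED * (t_start + i * SHUTTER / (SAMPLES - 1)) for i in range(SAMPLES)]
    return min(positions), max(positions)

def game_frame(t):
    # Everything is computed for one instant -> no smear at all.
    x = SPEED * t
    return x, x

left, right = camera_frame(0.0)
print(f"camera: object smeared from {left:.1f} to {right:.1f} px ({right - left:.1f} px of blur)")
left, right = game_frame(0.0)
print(f"game:   object drawn at exactly {left:.1f} px (no blur)")
[/CODE]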
 
Let's just say two cameras are placed on a rotating table, and it takes 5 seconds for the table to complete one revolution. One of the cameras is interlaced (A) and the other is progressive (B). Both cameras are recording at full motion (60i and 60p). After recording for 1 minute, the footage from the progressive camera is converted to interlaced.

The videos from both cameras are transferred onto a DVD.

When the video is played back on an SD CRT, the footage recorded by camera B did not have any form of motion blur. However, the footage recorded by camera A had a considerable amount of motion blur.

Why is that?

I don't believe that would happen. I think they would both look the same, on the assumption the interlacing is done by grabbing the odd lines from frame 1, the even lines from frame 2 and so on, with no other processing.
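Something like this, in other words (a minimal Python sketch of that "no processing" assumption, with frames just represented as lists of scanlines):

[CODE]
# Minimal sketch of straight 60p -> 60i: field N keeps every other line of
# progressive frame N and throws the rest away. No filtering, no blending.
def progressive_to_interlaced(frames):
    fields = []
    for n, frame in enumerate(frames):
        keep = 0 if n % 2 == 0 else 1               # alternate between the two line sets
        fields.append([line for i, line in enumerate(frame) if i % 2 == keep])
    return fields

frames = [[f"frame{n}-line{i}" for i in range(4)] for n in range(2)]
for n, field in enumerate(progressive_to_interlaced(frames)):
    print(f"field {n}: {field}")
[/CODE]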
 
How high a res monitor do you need to see those images without scrolling???!

Games don't because they draw a static frame.


Finally, the answer I've been looking for :D. Could you please explain what you meant by "Games don't because they draw a static frame"?

Thanks very much :thumbsup:


PS: I'm using a standard 17" TN (LG.Philips) based panel
 
A game draws every pixel in the frame according to the objects, lighting and other shaders that make up the scene. Nothing is "moving"; it's all calculated at a single point in time.

Actually, I just remembered that quite a few current-gen games use fake motion blur a la 3Dfx's idea to convey speed, e.g. BurnOut, RidgeRacer.
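As a rough illustration (not how any particular game actually does it), the usual trick is to blend each new frame with a bit of the previous output so fast-moving objects leave a trail:

[CODE]
# Illustrative accumulation-style "fake" motion blur on 1-D frames of pixel values:
# each output frame is a blend of the new frame and the previous output.
def accumulate(frames, persistence=0.5):
    blurred, previous = [], None
    for frame in frames:
        if previous is None:
            out = list(frame)
        else:
            out = [persistence * p + (1 - persistence) * c for p, c in zip(previous, frame)]
        blurred.append(out)
        previous = out
    return blurred

# A bright pixel moving one position per frame leaves a fading trail behind it:
frames = [[1.0 if i == n else 0.0 for i in range(5)] for n in range(5)]
for n, frame in enumerate(accumulate(frames)):
    print(f"frame {n}: " + " ".join(f"{v:.2f}" for v in frame))
[/CODE]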
 
So basically there won't be any motion blur? Never knew that part lol. I assume it's the same with other animated 2D/3D videos?

I thought the consoles just rendered the image at 60fps and converted it to interlaced (60i) on the fly. That's what made me go all crazy, cos there was no motion blur.

So basically, it doesn't matter whether the game is output in progressive or interlaced, cos the motion will remain the same?

PS: Some games do have an annoying motion-trail effect. It might work wonders with certain types of games, but they should really think twice before applying it. Same goes for the bloom effect too.
 
The PS2 (for example) generates an interlaced image by default (although there are some progscan games on it), as it takes less effort. I think the Xbox & newer consoles always generate a full frame, then output whatever the console is set to (interlaced over SCART, of course).
 
Finally, the answer I've been looking for:D. Could you please explain what you meant by "Games don't because they draw a static frame"?
Thx you very much:thumbsup:
PS: I'm using a standard 17" TN (LG.Philips) based panel
1. I don't know much about video games, but I guess they are created as a series of "perfect" stills. Real life is different, and just as a still camera will give you a blurry picture of a fast-moving object if the shutter speed is too long, the same can happen with video. It does depend how it is shot, though. Think of some of the slow-motion replays at sports events, or the opening battle scene in Gladiator, where there is strangely very little blur. A lot of films, even computer-animated ones, often include some intentional blurring to suggest motion.

Nielo, can you clarify what sort of display you are using?

Nick
2. You said you were using a CRT - a flat panel changes everything.

3. Why would interlaced and progressive scan make any difference?
 
The PS2 (for example) generates an interlaced image by default (although there are some progscan games on it) as it takes less effort. I think the xbox & new consoles always generate a full frame then output what the console is set to (interlaced over scart of course)


Ya, most game consoles use a video encoder to convert progressive to interlaced, but some consoles do render in interlaced. I didn't know the PS2 was one of them :eek:.

If consoles are using static frames, how are they outputting in frames or fields per second?

Also, some games (such as HALO-CE PAL) are rendered at 30fps and output at 50 fields per second. This causes constant motion judder.

Sorry to be a pain in the ass, but I'm one of those people who want to know everything about a subject lol.

So far, I know everything about contrast ratio, response time, colour depth/range, LCD, PDP and CRT. I just need this final bit of info to complete interlaced vs progressive.
 
Ya, most game consoles use a video encoder to convert progressive to interlaced, but some consoles do render in interlaced. I didn't know the PS2 was one of them :eek:.
They have to use that crappy amount of video RAM sparingly.
If consoles are using static frames, how are they outputting in frames or fields per second?
Static as in each single frame/field is a fixed snapshot, rather than light from a moving scene hitting a sensor.
Also, some games (such as HALO-CE PAL) are rendered at 30fps and output at 50 fields per second. This causes constant motion judder.
I thought (from memory) that for some bizarre reason it judders like hell at 60Hz & is fine at 50? Anyway, that's nothing to do with progscan vs interlaced.
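For what it's worth, here's a rough way to see why a 30fps render pushed out at 50 fields per second (the HALO-CE PAL case above) can never have an even cadence, assuming each field simply shows the most recently finished frame:

[CODE]
# Why 30 fps rendered -> 50 fields/s judders: some frames are shown for 2 fields,
# others for only 1, so motion advances in an uneven 2-2-1 rhythm.
from collections import Counter

RENDER_FPS = 30
FIELD_RATE = 50

shown = [int(field * RENDER_FPS / FIELD_RATE) for field in range(15)]
print("frame shown on each field:", shown)

counts = Counter(shown)
print("fields per frame:", [counts[f] for f in sorted(counts)])   # 2, 2, 1, 2, 2, 1, ...
[/CODE]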
 
Let's just say two cameras are placed on a rotating table, and it takes 5 seconds for the table to complete one revolution. One of the cameras is interlaced (A) and the other is progressive (B). Both cameras are recording at full motion (60i and 60p). After recording for 1 minute, the footage from the progressive camera is converted to interlaced.

The videos from both cameras are transferred onto a DVD.

When the video is played back on an SD CRT, the footage recorded by camera B did not have any form of motion blur. However, the footage recorded by camera A had a considerable amount of motion blur.

Why is that?

Isn't the basic problem with this comparison the fact that there is no such thing as 60p (at least in a filming sense)? I thought any video cameras that capture progressively would be doing 30p (NTSC) or 25p (PAL)?

Or am I wrong?
 
Isn't the basic problem with this comparison the fact that there is no such thing as 60p (at least in a filming sense)? I thought any video cameras that capture progressively would be doing 30p (NTSC) or 25p (PAL)?

Or am I wrong?

Well, I think it was hypothetical, but as far as I know you're right.
 
Thx for your help guys


I got the answer I was looking for lol
 
