576i vs 576p

choddo2006

I think this is an interesting area, so let's have another thread for it to avoid derailing tryingtimes' thread.

Nick, I'm really unsure what point you were making there. :)
If you have a source of PAL 576/50, it will convert that into 576/25, especially on an LCD/DLP, as interlacing really doesn't make any sense there; on a CRT things are different.

PAL is actually a lot simpler than what our US friends have to deal with, worrying about 3:2 pulldown etc., so PAL is not like NTSC. But even then it's not a total loss.

Have a look here->

http://www.hometheaterhifi.com/forum/showthread.php?t=3851

e.g. SKY uses 1080i/50, and SKY do a lot of sports programmes; if 1080i/50 were causing interlacing artifacts there would be a lot of unhappy subscribers out there. A 1080i/50 picture will look just as clean and smooth as 1080p/25.

I'm going to ignore the distinction between 576i and 1080i and instead talk about handling of 50Hz signals as I understand it.

So the source for UK TV broadcasts is either film (25p once sped up) or video (50Hz interlaced) - the latter might be captured at 50p but it never reaches us like that and therefore my guess is that in the broadcast chain they might choose to save the bandwidth & use 50i all the way along. Maybe. It's irrelevant anyway. What we get is effectively from a 50i video source.

Now the actual signal delivered is 50i of course, irrespective of what the source was.

A film should be converted back to a full progressive frame at 25fps, but almost certainly the signal carried to the screen (or processed inside the TV) will be at 50Hz, just with 2:2 repeat, where each of the 25 frames has been woven together from consecutive odd & even fields.

Sport at 25fps would be horrible. Movies limit the speed of pans to avoid too large a jump between frames because of the relatively low framerate. You can't do that with sport; it has to follow the speed of the action, so in that case what you want and get is 50 progressive frames per second, where the deinterlacing has filled in the blank lines in each field using whatever algorithm it has available.

So there's a big difference between 1080i/50 and 1080p/25 and each has its place. Of course, if you could have sport at 1080p/50 or 1080p/60... then I think it's fair to say that would be a good deal preferable, but it's not an option yet.
 
Sport at 25fps would be horrible. Movies limit the speed of pans to avoid too large a jump between frames because of the relatively low framerate.

Your eyes can only see so many frames per second; for smooth motion it's something like 18fps. So why would sport look bad at 25fps? If that were the case you might need to contact SKY, as that's currently what you're getting with 1080i/50.
 
No it isn't, for the reasons I stated above. I'm getting 1080p/50*... with half of each frame generated mathematically.

And do you have a source to back up why the human eye can only see 18 frames a second? Why can I clearly see the difference between a game running at 30fps and 60fps for example? I know that motion blur as captured by film cameras helps to improve the impression of smooth transition from one frame to the next, and that doesn't apply with games (usually) but why as low as 18?


*Actually I'm personally getting 768p/50 on my screen but that's beside the point.
 
I think everything you've written is spot on there choddo.

The way in which 50i video content can be converted into progressive signals differs too.
We've got a simple bob where each line is doubled. This produces jaggies.
We've got motion adaptive logic which decides what to do depending on whether there is motion or not. In its simplest form this will weave where there is no motion and bob where there is.
Then there's per-pixel implementation of the above where motion can be detected at the pixel level.
Then there's filtering to help with the inevitable jaggies seen on diagonals. DCDi is the most famous example of this. AIUI they will look for a diagonal line and smooth along that line.

After that it's frame interpolation, which tries to give the video an even smoother look by adding in-between frames to make 100fps (for example).

There might be other techniques, which I'd like to know about too.
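
For anyone who'd rather see it than read it, here's a rough Python/numpy sketch of bob, weave and a crude per-pixel motion-adaptive blend of the two. The function names and the threshold are mine, and it's only a toy: a real deinterlacer handles field parity properly and interpolates missing lines rather than repeating them.

```python
import numpy as np

def bob(field):
    """Line-double one field to a full-height frame (the cheap 'bob')."""
    h, w = field.shape
    frame = np.empty((h * 2, w), dtype=field.dtype)
    frame[0::2] = field
    frame[1::2] = field  # real bobs interpolate and respect field parity
    return frame

def weave(top, bottom):
    """Interleave two fields into one frame (only safe when both fields
    come from the same instant, i.e. film material)."""
    h, w = top.shape
    frame = np.empty((h * 2, w), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def motion_adaptive(prev_top, top, bottom, threshold=10):
    """Per-pixel: weave where the picture is static, bob where it moves."""
    woven = weave(top, bottom)
    bobbed = bob(top)
    # crude motion mask: compare same-parity fields one frame apart
    moving = np.abs(top.astype(np.int16) - prev_top.astype(np.int16)) > threshold
    mask = np.repeat(moving, 2, axis=0)  # stretch field mask to frame height
    return np.where(mask, bobbed, woven)
```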
 
Yes, I see where the confusion lies.
When the source is a video camera (studio programming, sports, cheaper dramas, etc.) then de-interlacing doesn't take you to 25p, as you would be throwing away valid picture information. It takes you to 50p.
This is because the time at which each field is recorded is different, unlike with 50i film material, where two successive fields come from the same original frame.

We can clearly see over 60fps in terms of animation - I could do motion tests, but they would only work if you can set your computer monitor to 100 and 120Hz.
 
Ok, I see why this thread was started. :clap:

It's the "long-standing posters gang up on the newbie" thread...

Heh, I've no problems with that. :) I can banter until the cows come home.
To be honest, I'm not sure if this is a wind-up yet, because both your post counts appear high, so one would assume you understand where I'm coming from.

No it isn't, for the reasons I stated above. I'm getting 1080p/50*... with half of each frame generated mathematically.

Yes, & why wouldn't a half frame be calculated on 1080i/50 again?

And do you have a source to back up why the human eye can only see 18 frames a second?

Well, it was one of the reasons the pre-sound format was 18fps. In the old days they wanted the least number of fps to save on reels & reels of film, and 18fps was decided to be the least number of fps that would convey fluid motion. You can give yourself a very simple example of how slowly your eyes actually perceive motion: take your hand and shake it as fast as you can in front of your eyes. Do you see blurs? Yes, of course, and this is how the brain interprets that fast motion. Another example: put a fast-moving movie on, now pause it. Can you see how it looks blurred? This is deliberate, so that it can duplicate what your eyes perceive. Now, seeing as motion pictures build this motion blur into movies, I'm not actually sure what extra information you could get from interpolating the frames.

Why can I clearly see the difference between a game running at 30fps and 60fps for example? I know that motion blur as captured by film cameras helps to improve the impression of smooth transition from one frame to the next, and that doesn't apply with games (usually) but why as low as 18?

I'm actually a gamer too; PC gaming is a bit different to motion pictures. For a start, games don't implement motion blur unless it's for special effects. E.g. in car games, a common technique to make it look like you're going faster is to add a little motion blur. So if you run at 60 instead of 30, 60 will look more fluid because your brain is now giving you real motion blur.

Here is a link that explains it better->
http://blogs.msdn.com/shawnhar/archive/2007/08/21/motion-blur.aspx

*Actually I'm personally getting 768p/50 on my screen but that's beside the point.

Good for you, but I'm not really sure what you're gaining if you're just watching movies.

One final note:
If you really believe 50fps would make films and the like look better, I would suggest you contact the film industry and tell them they have made a terrible mistake using 24fps.
 
Ok, I see why this thread was started. :clap:

It's the "long-standing posters gang up on the newbie" thread...

Heh, I've no problems with that. :) I can banter until the cows come home.
To be honest, I'm not sure if this is a wind-up yet, because both your post counts appear high, so one would assume you understand where I'm coming from.
Not at all. Just that this is not a simple area so getting all the facts out on the table helps to establish a good, shared understanding. I wouldn't waste my own time on a ganging up thread.
Yes, & why wouldn't a half frame be calculated on 1080i/50 again?
Sorry, I don't understand that question. Could you elaborate?
Well, it was one of the reasons the pre-sound format was 18fps. In the old days they wanted the least number of fps to save on reels & reels of film, and 18fps was decided to be the least number of fps that would convey fluid motion. You can give yourself a very simple example of how slowly your eyes actually perceive motion: take your hand and shake it as fast as you can in front of your eyes. Do you see blurs? Yes, of course, and this is how the brain interprets that fast motion. Another example: put a fast-moving movie on, now pause it. Can you see how it looks blurred? This is deliberate, so that it can duplicate what your eyes perceive. Now, seeing as motion pictures build this motion blur into movies, I'm not actually sure what extra information you could get from interpolating the frames.
Interesting, I didn't know they used to use 18. I did mention the blurring on film exposure too. But with sport captured on a digital camera, you're not going to get the benefit of that (and actually the blurring would be quite undesirable, no?)

I'm actually a gamer too; PC gaming is a bit different to motion pictures. For a start, games don't implement motion blur unless it's for special effects. E.g. in car games, a common technique to make it look like you're going faster is to add a little motion blur. So if you run at 60 instead of 30, 60 will look more fluid because your brain is now giving you real motion blur.

Here is a link that explains it better->
http://blogs.msdn.com/shawnhar/archive/2007/08/21/motion-blur.aspx
For sure. With still images, frame rate needs to be higher to compensate. But this clearly shows the human eye can detect more than 18fps.
Good for you, but I'm not really sure what you're gaining if you're just watching movies.
?? I was just pointing out I was ultimately getting a lower res than 1080p because I don't have a FullHD panel but it's not relevant to the discussion.
One final note:
If you really believe 50fps would make films and the like look better, I would suggest you contact the film industry and tell them they have made a terrible mistake using 24fps.
I didn't say films would look better. There's a certain art to the movie framerate that I wouldn't ever want to see changed. Actually the only reason it's 24 is because of the kind of technical & financial limitations you mentioned yourself. If they could have done 50fps in 1920, I bet they would have. I said video (esp sport) is 50Hz interlaced (in Europe) and that once you deinterlace it... it's still 50p. Just that half the lines in each frame were not recorded or transmitted and have been "guessed".


So my one question to you is, ignoring all the complexity of the movie discussion, games, human eye etc; do you believe that a TV takes a 50i video-sourced (not film) input and changes it to 25p?
 
Ok, I see why this thread was started. :clap:

It's the long standing posters gang up on the newbie thread..
Not at all, I think everyone is being quite patient and benevolent. You can learn a lot in these forums (I certainly have), but only by being open-minded. You quote a HomeTheaterHiFi thread, but it's the technical articles that have the real authority. Here is the classic article:
http://www.hometheaterhifi.com/volume_7_4/dvd-benchmark-part-5-progressive-10-2000.html

And this 10-piece article by the architect of HQV is another great reference:
http://www.hqv.com/technology/index1/video_processor.cfm?CFID=&CFTOKEN=44030951

One final note:
If you really beleive 50 FPS would make films and the like look better, I would suggest you contact the Film industry and tell them they have made a terrible mistake using 24 FPS.
[ON SOAPBOX] Now I absolutely agree with you there! I don't think 24fps is enough; I'm not sure that 50 or 60fps are enough, but we can make do with that. I would really love to see film shot on high-resolution 50 or 60fps video cameras, but I don't think we'll see that for a very long time. There is a great tradition of using actual film for movies, and producers and punters seem to expect to see poor blacks, motion blur, grain, limited dynamic range and gamut. Through conditioning, they expect that to be the real thing, while video can provide a more immediate and convincing suspension of disbelief. [/OFF SOAPBOX]

But enough of that: have you noticed that the more recent Blu-ray players are actually able to output 24fps video for films as recorded on disc? It's better that way, because it's at the right frame rate, there's no telecine process or judder, it's how it's mastered on the disc, and it's just plain the right way to do it. The latest and greatest HD displays accept this and display it appropriately. Sometimes they say they frame-rate convert to 48, 72 or 96fps, but that is only repeating the existing frames, and it's not really relevant to think in those terms with digital displays, where there is no CRT-style scanning used any more.

Video from TV sources, where the original capture was done on an interlaced scanning camera (typical for TV), is quite unlike film, and always has a different scene in each field. In this case, PAL most certainly does not become 576p/25 after de-interlacing. Ever. There are always 50 unique captures in time, and the refresh rate is always 50fps.

BR, Nick
 
Not at all. Just that this is not a simple area so getting all the facts out on the table helps to establish a good, shared understanding. I wouldn't waste my own time on a ganging up thread.
Ok, not a problem!! Debate is good and that's what makes forums interesting.

Sorry, I don't understand that question. Could you elaborate?
Well, when a footy match etc. is recorded at source in the UK we use 25fps. Movies, as you know, are shot at 24fps. So to get back to this 25fps we could use PAL 576/25p or PAL 576/50i; at the end of the day they're both only going to give you the original 25fps. IOW, try to think of 25p & 50i as just a protocol: they both give the same image at the same fps, it's just how that information is transmitted. Now in both cases you could convert this same raw 25fps into 576/50p, and if you wanted you could interpolate the images to give you extra frames in between; it doesn't really matter whether that source came from 576/25p or 576/50i. As I pointed out in another thread, interlacing really only applies to CRT, so on a modern-day LCD etc. it's not really relevant. I assume the reason 1080i/50 even came about was to make it easier to convert into 576i/50, as a lot of people then were still using CRTs.

So my one question to you is, ignoring all the complexity of the movie discussion, games, human eye etc; do you believe that a TV takes a 50i video-sourced (not film) input and changes it to 25p?
Yes... unless, as I pointed out above, you're pumping this into a CRT, and then the interlacing actually has advantages in avoiding flicker.
 
Well, when a footy match etc. is recorded at source in the UK we use 25fps,
Ah. Let me stop you there. We don't use 25fps. We use 50i when recording. So the cameras capture an interlaced field 50 times a second, and there is movement between field 1 and field 2 (unlike with movies, which have had a single frame split into field 1 and field 2) which you want to retain.
 
Ah. Let me stop you there. We don't use 25fps. We use 50i when recording. So the cameras capture an interlaced field 50 times a second, and there is movement between field 1 and field 2 (unlike with movies, which have had a single frame split into field 1 and field 2) which you want to retain.

choddo, I think you might need to send a link for that information.
Because what you're saying is that broadcasters record at 50i and also have a shutter speed on their video cameras of 50fps. That is straight away going to give you interlacing artifacts, and everybody in the whole UK would have noticed how moving objects look jagged; there would be no easy way to de-interlace such an image either, because there is no extra information available to build up the image correctly. From what I understand, broadcasters have a shutter speed of 25fps; they could be recording at 50i, but if the shutter speed is 25fps, then it's the same thing.
 
Indeed you often see what's called "combing" artefacts when a TV mistakenly treats a 50i source as film and tries to combine two fields into a single frame.

When it IS correctly identified as interlaced video, you're right, there is the issue that you don't know what was in the gaps so algorithms of various effectiveness, speed and cost (mentioned by tryingtimes above) are used to make a best guess.

And deliberately setting a video processor to film mode on a video source confirms how truly nasty the results are. In fact, the mere existence of a minimum of "Film" and "Video" modes on video processors is because of this difference in how they're recorded.

It's referred to on page 1 here
http://www.bbctraining.com/pdfs/articleHighDefinition.pdf
"temporal offset of 1/50th of a second" and refers to interlacing artefacts on horizontal objects on page 2

This talks about 50i as the standard PAL/SECAM format
http://www.hdtv.biz/hdtv_information.shtml

This page talks about it ... clumsily
http://www.burnyourbonus.info/hdtv-faq/faq2.html
"When you watch a film on TV, each frame is scanned into a pair of interlaced frames which are shown successively. TV cameras capture the half frames at staggered intervals"

wikipedia has it too of course
http://en.wikipedia.org/wiki/Video
"For example, PAL video format is often specified as 576i50, where 576 indicates the vertical line resolution, i indicates interlacing, and 50 indicates 50 fields (half-frames) per second."

I am kind of conscious that none of these are exactly "official" so if I find something better, I'll add it.
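
In the meantime, the temporal offset is easy to demonstrate for yourself. Here's a toy Python/numpy sketch (made-up image size and block motion, no pretence of a real camera model) that "films" a moving block at 50 fields a second and then naively weaves adjacent fields back together; the comb teeth on the block's edges are exactly the artefact we've been discussing:

```python
import numpy as np

H, W, STEP = 8, 16, 3  # tiny toy image; block moves 3 px per field

def scene(t):
    """A white block whose position depends on the instant it's sampled."""
    img = np.zeros((H, W), dtype=np.uint8)
    img[:, t * STEP : t * STEP + 4] = 255
    return img

# 50i capture: the two fields are sampled 1/50th of a second apart
top    = scene(0)[0::2]  # top field at t = 0
bottom = scene(1)[1::2]  # bottom field 1/50 s later; the block has moved

woven = np.empty((H, W), dtype=np.uint8)
woven[0::2], woven[1::2] = top, bottom

for row in woven:
    print(''.join('#' if v else '.' for v in row))
# alternate lines show the block in two positions: classic combing
```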
 
Indeed you often see what's called "combing" artefacts when a TV mistakenly treats a 50i source as film and tries to combine two fields into a single frame.
Well, a 50i film should be combined; that's what the interlaced 'i' stands for. You wouldn't want it displaying at 50p; all it could do then would be line doubling, IOW reduced resolution.

http://www.bbctraining.com/pdfs/articleHighDefinition.pdf
"temporal offset of 1/50th of a second" and refers to interlacing artefacts on horizontal objects on page 2

Did you also notice what page 2 said ->

When you transfer a feature film to tape in Telecine for TV showing, you have to double print each frame because a film camera records at 24 or 25 frames per second for cinema or TV

Do you have any links that say TV is recorded using 576/50i and also uses a shutter speed of 50fps?
 
What do you think a "temporal offset" of 1/50th of a second is? :)


I completely agree a 50i film should be combined. I never said anything different.

And the cheapest, most rubbish way of deinterlacing 50i video to 50p is indeed line doubling.
 
and this page
http://www.100fps.com/index.htm#whatis

describes how video cameras capture fields (and goes into a lot of detail about how those are stored as single frames on digital consumer cameras and also more on various deinterlacing methods)
 
What do you think a "temporal offset" of 1/50th of a second is? :)
It even says what it is; it's again to do with the way CRT works, and the whole reason CRT TVs use interlacing. The 'persistence of vision' was the clue.

I completely agree a 50i film should be combined. I never said anything different.

Then I totally misread -> TV mistakenly treats a 50i source as film and tries to combine two fields into a single frame.
 
Actually found this from the BBC ->

In general UK produced video is shot with a field rate of 50 Hz (50i) where the two fields in a frame are 20ms apart, whilst programmes shot on film or shot to look like film will have a field rate of 25 Hz (or 25p) where both fields in a frame are time coincident. In the case of 25p material the whole frame can be coded as a complete picture, whereas the 50i frame will contain combing artefacts on motion.

So I suppose we're both right!! :) If it's film, it's most likely going to be 25p; if it's sport it could be 50i. I'm not a sports fan, so I've never really noticed any interlacing artifacts, so to me it's more important that the films are 25p.

Also, did you see that link from the link you sent me, talking about how many fps the eye can see?

http://www.100fps.com/how_many_frames_can_humans_see.htm
 
I did, yeah, thanks; it goes into the fact you can get away with 18 because of the blurring, as we discussed. I guess below that point you see the flashing between frames. Interesting. I've got a miniDV camcorder which allows me to record either 50i or 25p, and I have to say the juddering at 25 is very obvious when playing it back. Perhaps it's to do with the way digital cameras capture images.

Then I totally misread -> TV mistakenly treats a 50i source as film and tries to combine two fields into a single frame.
I probably wasn't clear enough. I meant something that had been recorded at 50i.
 
if it's sport it could be 50i. I'm not a sports fan, so I've never really noticed any interlacing artifacts, so to me it's more important that the films are 25p.
If it's sport, or the news, or all TV programmes up until about 5 years ago it's always 50i ;) (most big budget TV shows now use 24p or 25p depending on country)

You won't often notice artefacts because as I say, it only happens when a TV tries to detect the source type and gets it wrong. SkyHD demo in PCWorld on the Samsung LCD last year, showing a news ticker, was always a good one. The ticker regularly got combing on the letters.
 
If it's sport, or the news, or all TV programmes up until about 5 years ago it's always 50i ;) (most big budget TV shows now use 24p or 25p depending on country)
I believe they most likely got rid of 50i because LCDs/plasmas would show the interlacing artifacts a lot more than CRT; I suppose because CRT has persistence the interlacing kind of dissolved.


You won't often notice artefacts because as I say, it only happens when a TV tries to detect the source type and gets it wrong. SkyHD demo in PCWorld on the Samsung LCD last year, showing a news ticker, was always a good one. The ticker regularly got combing on the letters.
I suppose, thinking about it, yes. Years ago tickers did sometimes look jagged when I watched on the PC, but it's something I've not seen for a long while. So I suppose, because of LCDs/plasmas, their primary format is now 25p.
 
Only for big series like Lost, 24, Prison Break et al.

For your bog standard British TV soaps, news, Casualty etc it's still 50i and as I say, until we get 50p, 50i is preferable to 25p for sport.

The BBC news weather is an interesting one because it seems the big computer generated map is progressive with an interlaced overlay of the forecaster. So my scaler gets a bit confused and deinterlaces his hands (and maybe the rest of him but can't tell as that's not moving) as if they were film. It's usually very hard to trip it up but that does throw it.
 
So the final answer to the initial question is ->

576i/50 is the same as 576p/25

Well if the production people wish so, then yes.


But 576i/50 with a 50fps source can also be used for doing either ->

1. 25fps with a horrid weave.
2. 50fps with a horrid bob.

The above two can of course be made to look better with some clever de-interlacing, but it will never be perfect due to missing information.

The same of course applies to 1080i/50 and 1080p/25.

Out of interest I'll be pausing a few programmes on my Topfield to see what's using 50/25 fps.
 
So the final answer to the initial question is ->

576i/50 is the same as 576p/25

Well if the production people wish so, then yes.

This is interesting. Now that HD cams are growing in popularity for regular tv programming, I don't have a clue whether they are normally shot at 50i or 25p.
If you'd have asked me 2 years ago, I would have said that over 80% of what is on tv was 50i native.

Even HD cameras for tv work were still filming in 50i, not 25p back then.

Has anything changed in the last couple of years? I guess we'd need the help of someone in the industry to answer that. My guess is that only dramas and documentaries would have moved towards progressive, with most tv shows, reality tv, sports, gameshows, chatshows, news, lifestyle, etc, etc shows still being 50i.
 
Certainly all the football on SkyHD is still 50i/video mode, as are most of the live recordings shown on BBC HD (e.g. Jools Holland etc); anyone with an external video processor can confirm this.

I can also say that when we're developing new deinterlacers we still have to worry about video mode; in fact it's the biggest thing we have to worry about, as film mode is relatively easy to deal with. We wouldn't be doing this if it didn't still represent a significant proportion of broadcast and pre-recorded material.

With respect to frame rate, the key issue is that when capturing film/video material you're actually discretely sampling something that is naturally a continuous function; this leads to aliasing that is perceived in the form of jerkiness. Film compensates for this by capturing the image continuously over time, so introducing motion blur. However, digital display devices re-introduce motion artifacts due to the way the image is presented to the eye (CRTs don't suffer from this problem); this is typically seen as double images along the edges of moving objects. To get around this, manufacturers are now starting to introduce frame rate conversion that interpolates between frames (not just duplication). This is actually a non-trivial problem to solve, but the new Sonys (Motion Flow) seem to be about the best at this at the moment; it's not perfect, but it definitely reduces motion-related artifacts when it's working.
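
To illustrate the duplication/interpolation distinction in the crudest possible terms, here's a toy sketch. It just blends neighbouring frames, which is nowhere near what a real motion-compensated converter like Motion Flow does (those estimate per-block motion vectors); on fast motion a blend produces a double-exposure ghost rather than a true in-between image.

```python
import numpy as np

def duplicate_to_100(frames_50p):
    """Frame doubling: 50fps -> 100fps by repeating every frame.
    Temporally it adds nothing; motion stays exactly as jerky."""
    return [f for frame in frames_50p for f in (frame, frame)]

def blend_to_100(frames_50p):
    """Naive interpolation: insert the average of each neighbouring pair.
    On fast motion this gives a double-exposure ghost, which is why real
    frame-rate converters estimate motion vectors instead."""
    out = []
    for a, b in zip(frames_50p, frames_50p[1:]):
        out.append(a)
        out.append(((a.astype(np.uint16) + b) // 2).astype(np.uint8))
    out.append(frames_50p[-1])
    return out
```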

John.
 
Heavens, what a thread! For reference, here is a very authoritative resource:

http://dvddemystified.com/dvdfaq.html#3.4

Kpj, I think you've read some explanation of film-mode de-interlacing of 50i video (where the source material is 24/25fps progressive video before it is telecine'd to 50i for broadcast) and assumed that process is applied to "video" video as well. It's not, even though the end result in both cases (for progressive digital displays) is generally 50p video. 25p video pretty much doesn't exist in any shape or form, except as an interim mastering stage. Forgetting the US & Japan for a moment, here's the deal:

There are two sorts of video that video processors have to deal with, and the processes are quite different. (Cheap ones may treat them the same, but the picture suffers).

  1. Firstly, video sourced material that is captured with digital interlaced video cameras, be they SD (576i50) or HD (1080i50). This is captured, distributed, broadcast, received (and in the case of most CRT TVs, displayed) as interlaced video. There are 50 fields/sec and 25 frames/sec, but each field is from a different moment in time, so the fields cannot be woven together. When displayed on a progressive display like a PJ, LCD or PDP, these interlaced fields are individually de-interlaced to produce 50 different frames/sec. The DI process may be line-doubling (bobbing) or scaling or other more sophisticated processes. This is what is almost invariably used for live TV.

  2. Secondly, film sourced material is taken from 24fps film and captured to progressive video at 24fps. This is speeded up to 25fps as we all know. Those 25 frames/sec are then doubled to 50 frames/sec, so adjacent pairs of frames are identical. Each of these frames is then interlaced to 50 fields/sec for broadcast. Again, adjacent fields come from the same original progressive frame, captured at the same point in time. This is quite different to the video case above, where fields come from different points in time. What I have described is the telecine process, and the corresponding de-interlacing process, inverse telecine, is quite different to video de-interlacing. Or at least it ought to be, to get the best picture.
Inverse telecine is a bit more difficult than many people imagine, because the processor first has to spot that the original material was captured with a film camera, rather than a video camera. The physical process itself is easy - adjacent fields are simply woven back together to produce the original progressive sequence of frames. The film is then reconstructed bit-for-bit identical to the original. Within the processor, 576i50 becomes 576p25, and this is frame-doubled to 576p50 for output to the display.

The difficult trick is firstly spotting that the video comes from a film, but that's for another post.
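
As a taster, though: the usual approach is a combing measure. Weave candidate field pairs together and see how badly each woven line disagrees with its neighbours; on film material one pairing phase weaves cleanly frame after frame, while on video material every pairing combs as soon as there is motion. Here's a toy sketch of the idea (the threshold is invented, and a real detector keeps history and hysteresis rather than judging a single window):

```python
import numpy as np

def combing_score(top, bottom):
    """Weave two fields and measure how far each interior line sits from
    the average of its neighbours. Near zero for a genuine film pair;
    large wherever temporally offset video fields disagree."""
    h, w = top.shape
    frame = np.empty((h * 2, w), dtype=np.float32)
    frame[0::2], frame[1::2] = top, bottom
    mids = frame[1:-1:2]
    neighbours = (frame[0:-2:2] + frame[2::2]) / 2
    return float(np.mean(np.abs(mids - neighbours)))

def looks_like_film(fields, threshold=4.0):
    """2:2 cadence guess: does the pairing (0,1), (2,3), ... weave cleanly
    for every frame in the window? (Static video also scores low, which is
    one reason real film-mode detection is hard.)"""
    scores = [combing_score(fields[i], fields[i + 1])
              for i in range(0, len(fields) - 1, 2)]
    return max(scores) < threshold
```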

BR, Nick
 
