720p vs. 1080i


HarryW

Guest
Hello All,

So which format is better, 720p or 1080i? Is anyone an expert on the pros and cons?
 
720p is supposedly better with fine, detailed images, whereas 1080i is better with fast-moving broadcasts like sports and action movies. At the end of the day it'll all come down to personal preference, but you'll probably find that the HD channels will choose which format to broadcast in, so you won't get the choice.
 
It's actually the other way around!!!

1080i is 1080 lines, with 1920 pixels per line, which is a massive resolution. It is interlaced because sending it progressively would be twice as massive!!! As it is, the information is sent as 540 odd-numbered lines, then 540 even-numbered ones, which the display must deinterlace back into the full progressive frame. This is where things can go wrong.

If sport is recorded as 1080i to start with, then anything that has moved in the split second between capturing the two fields will not line up properly when the lines are stuck back together (as with regular interlaced TV). So some interpolation has to take place, which is often done very poorly with 1080i signals simply because they are so large!!

720p is recorded progressively to start with, so it doesn't introduce deinterlacing errors. Therefore with sport and fast-moving scenes it is relatively error-free in comparison. But the resolution here is 720 lines with 1280 pixels per line, which is a whole lot less detailed than 1080i.

So, slower-moving material is better with 1080i, where the detail advantage comes into play. Faster stuff is better with 720p, where the signal is recorded and displayed progressively. N.B. to most people HD looks like HD whether it's 720p or 1080i!!!!! Either way it's a bucketload better than SD.
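
If anyone fancies seeing the weave vs. bob trade-off described above in code, here's a minimal Python/NumPy sketch. It uses toy arrays and a made-up moving bar rather than real video, and weave() and bob() are just illustrative names, not anything from an actual display chipset:

[CODE]
import numpy as np

HEIGHT, WIDTH = 1080, 1920  # a 1080i frame arrives as two 540-line fields

def weave(top_field, bottom_field):
    """Interleave the two 540-line fields into one 1080-line frame.
    Lossless if both fields were captured at the same instant; anything
    that moved between the two captures shows up as 'combing'."""
    frame = np.empty((HEIGHT, WIDTH), dtype=top_field.dtype)
    frame[0::2] = top_field      # one field fills every other line...
    frame[1::2] = bottom_field   # ...the other fills the lines in between
    return frame

def bob(field):
    """Line-double a single field to full height: no combing,
    but only 540 lines of real vertical detail."""
    return np.repeat(field, 2, axis=0)

# Toy example: a white bar that moves 8 pixels between the two field captures.
top = np.zeros((HEIGHT // 2, WIDTH), dtype=np.uint8)
top[:, 200:210] = 255
bottom = np.roll(top, 8, axis=1)   # "motion" between the fields

combed = weave(top, bottom)        # jagged edges on the moving bar
soft = bob(top)                    # smooth but only half the vertical detail
print(combed.shape, soft.shape)    # (1080, 1920) (1080, 1920)
[/CODE]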
 
Don't feel bad about it, Neil. Luckily, Liam is correcting all senior members today. Wonder what he had for breakfast?
 
I've had my Weetabix this morning and am in the office most of today!!! Watch out forum, I'm a lurkin'.....
 
Well, even if 1080i is deinterlaced by stupid BOB deinterlacing, that still gives us 1920x540 = 1,036,800 "different" pixels. 720p "only" has 1280x720 = 921,600 pixels. So I'd say 1080i should be as good for sports as 720p, even if the deinterlacer is using simple BOB deinterlacing. But the better the deinterlacer, the more quality 1080i gains. So with a good motion-adaptive deinterlacer I'd guess that sports look better in 1080i than in 720p.

This is all theoretical, though, since I've not yet had the pleasure of comparing sports between 720p and 1080i. Hey, video processors which are able to do good motion-adaptive deinterlacing of 1080i content are just about to be released, so...
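
Just to put numbers on that (pure back-of-the-envelope arithmetic, nothing more):

[CODE]
# Back-of-the-envelope pixel counts from the post above.
full_1080p  = 1920 * 1080   # 2,073,600 - what a perfect deinterlace would recover
bob_1080i   = 1920 * 540    # 1,036,800 - what a simple BOB actually shows per field
native_720p = 1280 * 720    #   921,600 - every pixel real, no deinterlacing needed
print(full_1080p, bob_1080i, native_720p)
[/CODE]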
 
On a pure resolution level the theory is sound. But the errors introduced by interpolating from only 540 lines of an image that should be 1080 lines will be significant, whereas 720p is faithfully reproduced exactly as it started, without error.

Back to a 1080i moving image: when it moves, the processor just throws away the second batch of 540 lines and makes up where it thinks the detail should be. It's not just the dip in resolution, but the way it is used, that'll kill it.
 
I'm halfway through writing new text to replace the tired explanation of video processing that is on our site at the moment. But in the meantime, the information Silicon Optix have on their site is brilliant!

http://www.hqv.com/technology.cfm

Well worth a read. Not least of all it explains the different types of motion adaptive deinterlacing (whether a whole frame is ignored, a region, or a specific pixel).
 
What nobody seems to have pointed out is that none of us (apart from a handful of people with Sharp's 1080i 45" LCD) actually owns a set capable of showing 1080i natively. My set has a resolution of 1366 by 768 (typical of most sets these days), so it's obvious that 720p will be optimum for most people. Better to have 1280x720p50 scaled up to 1366x768p50 than 1920x1080i50 scaled down to 1366x768i50: 720p is scaled to 768p50, whereas 1080i ends up as 768i50, and 768p has twice the number of apparent pixels as 768i50.
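
To put rough numbers on that (a back-of-the-envelope sketch only; real scalers do rather more than count pixels):

[CODE]
# Apparent pixels refreshed on a typical 1366x768 panel, per the argument above.
panel_pixels = 1366 * 768        # 1,049,088 physical pixels
from_720p    = panel_pixels      # 720p50 scaled to 768p50: whole panel refreshed every frame
from_1080i   = panel_pixels // 2 # 1080i50 shown as 768i50: only half the lines are new per field
print(panel_pixels, from_720p, from_1080i)
[/CODE]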
 
The point is that no-one really wants to have to do complex deinterlacing that will only ever mean a compromise with field-originated material.

If you take all those events that would normally be shot interlaced and you shoot them progressively, you won't have to deinterlace them and the original quality is maintained without artifacts. The reason they are shot in 720p and not 1080p is about cost and mechanical capabilities.

If you have frame-based material then you can interlace it at a higher resolution (1080i), knowing that it can easily be deinterlaced back to the full resolution (1080p).

So you use 1080i for films, as it's actually 1080p with less bandwidth (to all intents and purposes).

And you shoot everything else as frames (720p).

And now no-one needs to worry about convoluted deinterlacing any more.
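
The point above about 1080i carrying frame-based material losslessly is easy to demonstrate: split a progressive frame into two fields and weave them straight back, and you get the original frame bit-for-bit. A toy NumPy sketch, illustrative only:

[CODE]
import numpy as np

# Film-originated material: both fields come from the SAME progressive frame,
# so interlacing followed by a simple weave is a lossless round trip.
frame_1080p = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

top_field    = frame_1080p[0::2]   # sent as field 1 of the 1080i signal
bottom_field = frame_1080p[1::2]   # sent as field 2

rebuilt = np.empty_like(frame_1080p)
rebuilt[0::2] = top_field          # the display just weaves the fields back
rebuilt[1::2] = bottom_field

assert np.array_equal(rebuilt, frame_1080p)   # nothing was lost
[/CODE]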
 
Rob20 said:
What nobody seems to have pointed out is that none of us actually own...

I've got a Dell LCD monitor that goes to 1920x1200. There is more to a good picture than resolution, though, as the Dell amply shows in comparison to my 856x480 Panasonic plasma, which trounces it in the picture quality stakes.
 
Well, if 1080i is properly flagged you essentially have 1080p.

And even with badly flagged material, good playback solutions like Theatertek can still do excellent deinterlacing.

720P, on the other hand, is a no-brainer because you don't have to argue about deinterlacing at all.

Still, I haven't seen a single 1080i movie that was not properly deinterlaced by Theatertek, so for me it's 1080i please. :smashin:


Another point: most people tend to say that 720P is better for sports because it's progressive. I've seen the US Open 2005 in 1080i and there were absolutely NO deinterlacing artefacts...
 
k0rn said:
Another point: most people tend to say that 720P is better for sports because it's progressive. I've seen the US Open 2005 in 1080i and there were absolutely NO deinterlacing artefacts...

Because you were watching 1920x540 after a bob deinterlace, so you only ever see half the resolution, but with no mismatched-field artifacts.
 
Rob20 said:
What nobody seems to have pointed out, is that none of us actually own a set capable of showing 1080i native

ALIS panels can vertically; they simply chop off a few lines so you get 1024i. Only 1024 pixels horizontally, though.
 
Mr.D said:
Because you were watching 1920x540 after a bob deinterlace , so you only ever see 1/2 the resolution but no mismatched field artifacts.

Uhm, I don't ever use BOB but adaptive Deinterlacing of course... ;)
 
k0rn said:
Uhm, I don't ever use BOB but adaptive Deinterlacing of course... ;)

So it jumps between a weave when you have little movement, which can give rise to interlacing artifacts but at least keeps as much resolution as is available, and a bob deinterlace when things move.

It's still a compromise compared with no deinterlacing in the first place, and it's a comparatively expensive thing to do in real time in hardware, especially for hi-def. (You might well find it can only manage a bob anyway with 1080i and only applies adaptive deinterlacing to SD, depending on the chipset.)
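
For the curious, a crude per-pixel version of what's described above might look something like this. It's a toy sketch only: the threshold and the motion measure are invented, and real chipsets use far cleverer motion detection and interpolation.

[CODE]
import numpy as np

def motion_adaptive_deinterlace(top_field, bottom_field, threshold=16):
    """Toy per-pixel motion-adaptive deinterlace of two 540-line fields.
    Where the fields roughly agree (little motion) keep the weave for full
    detail; where they differ (motion) fall back to bob-style line doubling."""
    h, w = top_field.shape

    woven = np.empty((h * 2, w), dtype=top_field.dtype)
    woven[0::2] = top_field
    woven[1::2] = bottom_field

    bobbed = np.repeat(top_field, 2, axis=0)            # the "safe" fallback

    # Crude motion measure: how much the two fields disagree, pixel by pixel.
    motion = np.abs(top_field.astype(np.int16) - bottom_field.astype(np.int16))
    moving = np.repeat(motion > threshold, 2, axis=0)   # stretch mask to full height

    return np.where(moving, bobbed, woven)
[/CODE]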
 
I saw a demo with two Sharp LC-45GD1E LCDs side by side back in September. Both had the same content playing back, one at 720p and the other at 1080i.

They were showing material designed to expose "problems" with both modes - hinting (without actually saying it ;) ) that it would be better just to go straight to 1080p rather than use 720p or 1080i, which are really a kind of "half-way house" to achieving HD's full potential.

The content included:

- large crowds of people jogging (look at the details of the individual faces)
- slow panning of large/close up objects (trees, faces, grass)
- close up water ripples in dark water
- colourful city skylines from medium/long distance

What was interesting was the quality of MPEG2 vs VC1 (this is the format proposed for Sky HD, which will use MPEG-4, is that right?) - VC1 showed some obvious pixelisation and artifacts (especially in the water ripples scene) compared to its MPEG2 counterpart. The bitrates they were showing varied from 4 to 18 Mbps in both 720p and 1080i modes.

We have a 1080p-compatible Dell LCD (2405FPW) and it can look great, but it's not the greatest as far as ghosting is concerned, I'm told.

Can anyone tell me how existing consumer "HD Ready" LCD panels would handle a 1080p signal? Would it simply be rejected, or would the display try to deinterlace and scale the picture so that it can be shown to the unit's best ability?

If not, then surely the whole idea of "HD Ready" displays is a bit of a sham?

But I would guess I'm not the first to point this out!! :D
 
There are no consumer products that output 1080p, nor do HD Ready LCDs accept 1080p as an input signal.
 
I have one of these: http://www.ati.com/products/radeonx300/specs.html

That outputs DVI at the following:

ati_x300.jpg


So with these clips: http://www.microsoft.com/windows/windowsmedia/content_provider/film/ContentShowcase.aspx

can I not generate 1080p using consumer products?
 
I meant hi-fi equipment... but the real problem, besides the output, is just that HD Ready screens won't accept 1080p because they don't have the chipsets to process 1080p.
 
Thanks k0rn, I see where you're coming from - but there are a few consumer displays already out there that support 1080p (and many more to follow):

http://www.consumer.philips.com/con...&proxybuster=Z35SGGNLJCVA3J0RMRESHQVHKFSEKI5P

http://www.sharp.co.uk/Product.aspx?ID=928

http://accessories.euro.dell.com/sn...age=productlisting.aspx&instock=&refurbished=

Not to mention many LCD monitors on PCs, including laptops!

But what I'm asking is whether anyone has actually tried feeding 1080p into an "HD Ready" setup that is only spec'd to support 720p?
 
tameruk said:
But what I'm asking is whether anyone has actually tried feeding 1080p into an "HD Ready" setup that is only spec'd to support 720p?

I've tried that with the Sharp you mentioned; it always jumped back to 1280x720 using DVI and a PC... but I hear many users have problems with that one and getting the resolution.

I still think that these panels can't process 1080p, even though they are progressive-scan panels and thus show 1920x1080 progressively... they just won't accept 1080p input. So yes, HD Ready is flawed on that point.
 
The Sharp LC45GD1E can accept a 1080p signal at 60Hz if you bypass the media box and feed it straight to the screen.

Other devices (for example Samsung's latest range of DLP rear-projection) can accept 1080p via VGA, but not via HDMI.

Sources of 1080p:

- Nearly every personal computer in the world.
- Any decent video scaler.
- The PlayStation 3. (Quite possibly also other next-gen games consoles).
- (Possibly) future BluRay or HD-DVD players.
 
tameruk said:
They were showing material designed to expose "problems" with both modes - hinting (without actually saying it ;) ) that it would be better just to go straight to 1080p rather than use 720p or 1080i which are really a kind of "half-way" house to achieving full HD potential.

There is nothing half-way house about having 720p and 1080i. Whoever suggested this to you has a fundamental misunderstanding of the dual usage of the two HD formats.

1080i is primarily for broadcasting frame-based material (i.e. films), as it's essentially the same as 1080p with half the bandwidth requirements, as long as it's correctly and easily deinterlaced.

720p is for those events that would normally have been shot in a field-based format (sports, news, studio, live).

If you use both formats it's either a case of a simple, no-compromise deinterlace or no deinterlacing at all.
 
