1080i Is it worth it or is 720p good enough?

Solidstate

I along with many people on here will be looking at potential new TVs to buy ready for Sky HD.

I know it's too early to say if there will be enough programming to make it worth getting a 1080i-capable display... however, I have a couple of general questions.

I am looking at the DLP TVs and notice that the sets at the moment max out at 720p. I know that 1080i xHD3 chips are on their way - but seem aimed at screens of 50"+ (too big for me).

1) Does 1080i look a lot better than 720p (on suitable material)?
2) If I downscale 1080i to fit a 720p-capable display, will it look considerably better than old low-def TV - and what standard is it downscaled to (540i)?
3) If I got a 720p screen, would I feel like a schmuck every time a 1080i programme was on?
 
There are no "1080i chips" (and never will be), for now there is one 1080p xHD3 chip, which effectively delivers 1920x1080 progressive picture. It will easily deinterlace 1080i into its native 1080p.

My take on your questions (and pretty much that of the majority of DLP owners):

1. No, 1080i doesn't look better or worse than 720p just because of the standard itself. The most important factor is the optics of the HD camera and the way the material is processed before it is delivered to the TV input. There is absolutely no need to be biased towards 1080i just because the number looks higher. 1080p is a completely different beast, but there won't be such broadcasts anytime soon.

2. Yes, any real HD signal on any HD-capable and big enough TV looks considerably better than any standard definition. 1080i is deinterlaced (converted from i to p), then scaled (I wouldn't even use the term downscale) to 720p. How the internal processor does it varies from set to set, but I haven't heard of any 720p TV downscaling to 540i first.

3. Absolutely not. Go for it.
 
beeblebrox12 said:
There are no "1080i chips" (and never will be), for now there is one 1080p xHD3 chip, which effectively delivers 1920x1080 progressive picture. It will easily deinterlace 1080i into its native 1080p.

My take on your questions (and pretty much that of the majority of DLP owners):

1. No, 1080i doesn't look better or worse than 720p just because of the standard itself. The most important factor is the optics of the HD camera and the way the material is processed before it is delivered to the TV input. There is absolutely no need to be biased towards 1080i just because the number looks higher. 1080p is a completely different beast, but there won't be such broadcasts anytime soon.

2. Yes, any real HD signal on any HD-capable and big enough TV looks considerably better than any standard definition. 1080i is deinterlaced (converted from i to p), then scaled (I wouldn't even use the term downscale) to 720p. How the internal processor does it varies from set to set, but I haven't heard of any 720p TV downscaling to 540i first.

3. Absolutely not. Go for it.

1. 1080i does look quite a bit better than 720p. P, in this case, means nothing with regard to what the eyes can normally perceive.

2. Agreed.

3. See 1.
 
As in most discussions on political or religious issues, the sooner everyone says "We agree to disagree", the better.
An atheist would probably say to a believer "If I see one sign that God exists, I will agree with you." :)
Similar to that, I would say - "Show me one 1080i set that can show a more, or at least equally, detailed picture than a good same-size 720p set and I will accept your point 1". :)
All advantages that 1080i sets still have over 720p are because of the CRT technology - contrast, black levels, smoothness - not because of the theoretically higher resolution. But since this is coming from a DLP owner, I have to add that this is my personal opinion, formed at the time when I was choosing technology by comparing sets, and lots of personal viewing afterwards. I fully respect your beliefs, because chances are they are strong too :)
 
Thanks for the replies chaps. Sorry about the 1080i chip mix-up - I did know it's 1080p, but it's so easy to get your p's and i's mixed up when talking about this stuff (and one lower-case letter does make a lot of difference).

I'm glad you've said what you have, because this means that the first 720p TV that I see (that fulfills all my needs) should be perfectly sufficient for SkyHD in 2006 and beyond.

I suppose the only possible problem is if Blu-ray (or HD DVD) starts supporting 1080p in a big way. I can't see this happening for a while though.
 
My views, based on using a CRT projector which is capable of doing 1080p:
1080i does look nicer than 720p if both are done well.
De-interlacing 1080i (even with bob and weave) looks even better.
720p looks so, so, so much better than even the best DVD out there scaled up to the projector's optimum resolution.

When you start to look at digital displays you would think that 720p on a 1280x720 DLP would give better results than 1080i. Well, I don't think it does; on the things I watched when I had an HD2+ projector, a lot of the 1080i stuff looked as good if not better.

1080i and 720p will still look so much better than DVD even on a 480-line plasma; in fact, from a regular sitting distance, when being fed a decent HD feed, it will not hit you immediately which is the HD panel. I was amazed, whilst looking at two Panny screens right next to each other in a store in New York, one SD and one HD, just how little there is in it.

There is just so much more info in a decent HD source: colours to die for, inky blacks you could fall into, and a lack of noise, and this makes as much of a difference as the resolution.

Yeah, a 1080p screen should and probably will look better if it is a good screen, but a good 480-line screen will look better than a poor 1080-line screen.
 
beeblebrox12 said:
As in most discussions on political or religious issues, the sooner everyone says "We agree to disagree", the better.
An atheist would probably say to a believer "If I see one sign that God exists, I will agree with you." :)
Similar to that, I would say - "Show me one 1080i set that can show a more, or at least equally, detailed picture than a good same-size 720p set and I will accept your point 1". :)

All advantages that 1080i sets still have over 720p are because of the CRT technology - contrast, black levels, smoothness - not because of the theoretically higher resolution. But since this is coming from a DLP owner, I have to add that this is my personal opinion, formed at the time when I was choosing technology by comparing sets, and lots of personal viewing afterwards. I fully respect your beliefs, because chances are they are strong too :)

My beliefs are not "strong," just simply what I have observed over time. I don't get emotional over a piece of electronics.

The one good example of a noticeably better picture on the CRT side is the Sony XBR Super Fine Pitch series.

Also, I have watched plenty of 1080i programming on higher end progressive displays that typically looks better than 720P programming.

As I've said, it's what the eyes normally perceive that matters.
 
Abit said:
My beliefs are not "strong," just simply what I have observed over time. I don't get emotional over a piece of electronics.

The one good example of a noticeably better picture on the CRT side is the Sony XBR Super Fine Pitch series.

Also, I have watched plenty of 1080i programming on higher end progressive displays that typically looks better than 720P programming.

As I've said, it's what the eyes normally perceive that matters.


I agree that a 34'' DirectView CRT is probably great in quality, but near useless for HDTV. A 60'' DirectView CRT Super Fine Pitch will probably be the perfect HDTV, but it's never gonna happen. Rear projection CRT Sonys can't compete with quality 720p displays.

What do you mean by "Also, I have watched plenty of 1080i programming on higher end progressive displays that typically looks better than 720P programming."? What are these "higher end progressive displays"? 720p or 1080p?
 
beeblebrox12 said:
I agree that a 34'' DirectView CRT is probably great in quality, but near useless for HDTV. A 60'' DirectView CRT Super Fine Pitch will probably be the perfect HDTV, but it's never gonna happen. Rear projection CRT Sonys can't compete with quality 720p displays.

Why do you feel a 34" CRT HDTV is "near useless?" Because of the size? Not everyone wants a wall full of TV screen. In fact the average viewer does not have anything larger.

And no, you certainly do not need a huge TV to see the benefits of HDTV. The benefits, depending on the quality of TV or display, can be clearly seen in the tiniest of screens.

beeblebrox12 said:
What do you mean by "Also, I have watched plenty of 1080i programming on higher end progressive displays that typically looks better than 720P programming."? What are these "higher end progressive displays"? 720p or 1080p?

Current plasma and LCD displays with the highest available resolutions. 1080i content invariably looks noticeably better.
 
Q2) If I downscale 1080i to fit a 720p capable display will it look considerably better than old low def TV - and what standard is it downscaled to (576i?)

The difference is that colour is always stored at half resolution by these MPEG or WMV compression systems.

Even if scaled down to a cheaper non-high-def plasma panel's 864x480 resolution, a 720p HD source will retain its 640x360 colour resolution, whereas the PAL DVD source is limited to 360x288.

I think it's mostly the difference in the edges of colour that makes the big difference for most material.

Here's a useful comparison, using the WM9-HD release, but it's still pretty valid to show the subtle difference of 1080 vs 720 on a recent Hollywood film transfer.

Conclusion: 720p can have pretty much twice the colour resolution of a PAL DVD.

http://www.dvd-compare.com/comparisons/t/tombraider2_highdefinition_demo/index.htm
(to my eyes the 720p is actually retaining more grain and colour than it's losing out on detail to the 1080)
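(If anyone wants to check the chroma sums above, here's a rough Python scribble - just the 4:2:0 arithmetic on the figures already quoted, nothing clever.)

```python
# 4:2:0 compression stores chroma at half the luma resolution in each direction
def chroma_res(luma_w, luma_h):
    return luma_w // 2, luma_h // 2

print("720p HD source:", chroma_res(1280, 720))  # (640, 360)
print("PAL DVD source:", chroma_res(720, 576))   # (360, 288)
```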

Q1) Does 1080i look a lot better than 720p (on suitable material).
Q3) If I got a 720p screen - would I feel like a schmuck every time a 1080i programme was on.


I'd say the actual difference between 1080 and 720 sources is a different matter altogether.
I've read that optics and HD transfers currently top out at about 1400 samples horizontally anyway, so the 1920x1080 isn't currently the big advantage it seems.

On my relatively high-end 8-inch Barco 808s CRT projector, 1080p looks only very, very subtly different from the same material downscaled to 720p (or perhaps 864p in my projector's case). I'd say the detail in the downscaled 720p mode is actually a tiny bit sharper, because the downscaling may keep more detail than that which is randomly lost in the CRT phosphor/optics when displaying 1080p.

This is viewing on a 7ft wide screen at about 1.75 widths (12ft) away; i.e. bloody close!

I'd say from what I've seen and read, this would be the case for most high-end crt projectors, except perhaps high-end 8-inch ones with very new CRT tubes or a 9-inch projector with good tubes, i.e about £7000 worth of s/h CRT projector.

The difference may increase when we get scalers capable of true adaptive HD 1080i deinterlacing, but at the moment I think everything that can even handle 1080i is limited to bob-mode which loses vertical detail.

Oh, or the 2-megapixel displays from JVC or Sony, but I think they are at least 5x that price, as reviewed in Widescreen Review recently.

Rob.
 
My rambling thoughts on the subject:

1. Downconversion of HD for display on SD displays looks better than SD sourced material.

I have noticed this on a number of occasions. The chroma resolution arguments are perfectly valid for local downconversion (assuming that your set top box / DVD player doesn't cross-convert to 4:2:0 SD internally - thus subsampling to 360x288 / 360x240 chroma res anyway).

However, in my experience HD-sourced SD pictures "look" different in more ways than just the chroma. This is for a number of reasons.

One, the video noise present in an HD signal is there at HD resolution. When you downconvert this to SD, much of the really HF noise is averaged out, giving the HD downconversion a much "cleaner", more "noise-free" look than an SD-sourced image. (This is true for both camera sources and telecined material)

Two, the aperture correction (sometimes called "edge enhancement", "detail" or "contours") introduced by the video camera or telecine to make the image look a little "sharper", and compensate for slightly soft optics in the camera or TK, will be applied at the HD level on HD sources. This means it is much "finer" than that applied at SD level on SD sources. When the HD "edge" is downconverted to SD it is far less harsh, giving the pictures a less artificial, more natural appearance.

(I have noticed similar effects with other downconversions. VHS recordings sourced directly from digital VTR master tapes - even at SD - look much better than recordings of the same shows made off-air - even from a good signal. Sure the source quality in both cases is much better than the quality of the VHS format - but the source quality still influences the quality of the VHS recording.)
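(A toy illustration of the noise-averaging point above - assuming a crude 2:1 averaging downconversion rather than a proper filter, and made-up noise figures, but the effect is the same.)

```python
import numpy as np

rng = np.random.default_rng(0)
hd_noise = rng.normal(0.0, 5.0, (1080, 1920))   # HF noise riding on an HD picture

# crude 2:1 area downconversion: average each 2x2 block into one pixel
sd_noise = hd_noise.reshape(540, 2, 960, 2).mean(axis=(1, 3))

print("noise level at HD:", round(hd_noise.std(), 2))                 # ~5.0
print("noise level after downconversion:", round(sd_noise.std(), 2))  # ~2.5
```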

2. 1080i vs 720p

Whilst at first glance 1080 lines sounds like it should be sharper vertically than 720 lines - the interlace vs progressive argument plays a huge part in this. One of the reasons that 720p and 1080i were chosen was that in the real world, they deliver approximately the same vertical resolution. This is because interlaced displays can never deliver the full vertical resolution that a progressive system with the same number of lines in a frame can.

One - on fast-moving picture information interlaced signals effectively become 540-line progressive displays.

Two - on very fine vertical detail, if the full 1080-line resolution was exploited with no filtering, you'd get flicker at the frame rate (not the field rate) - so 1080/50i fine detail would flicker annoyingly at 25Hz. To a degree you do see this, but if the full 1080-line detail were present in the source material it would be far more objectionable.

Three - interlaced displays themselves have a reduced "perceived" resolution than progressive, so they "look" softer.

1. and 2. are really functions of the broadcast signal, and although de-interlacing 1080i to 1080p can re-introduce some "guesstimated" picture information - it is just making up what has been thrown away.

3. can be partially recovered by de-interlacing and progressive display, but is really tied in with 2.
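(For anyone who wants to see what "making up what has been thrown away" means in practice, here's a rough Python sketch of the simplest possible bob de-interlacer - the dummy field data and function name are just for illustration; real de-interlacers are far cleverer.)

```python
import numpy as np

field = np.random.randint(0, 256, (540, 1920)).astype(np.float32)  # one 1080i field

def bob_deinterlace(field):
    """Line-double a 540-line field to 1080 lines. The in-between lines are
    'guesstimated' - interpolated from their neighbours, not real picture detail."""
    frame = np.empty((field.shape[0] * 2, field.shape[1]), dtype=field.dtype)
    frame[0::2] = field                               # the lines we actually have
    frame[1:-1:2] = (field[:-1] + field[1:]) / 2.0    # the lines we invent
    frame[-1] = field[-1]                             # bottom edge: just repeat
    return frame

print(bob_deinterlace(field).shape)  # (1080, 1920)
```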

HOWEVER - when it comes to horizontal resolution 1920 vs 1280 is a different kettle of fish. The 1080i horizontal resolution of 1920 means that horizontally the format is capable of significantly sharper pictures. Unlike 720p the horizontal and vertical delivered / perceived "angular" resolutions are not equal in 1080i - and 1080i can be much sharper.

However 1080i material doesn't always exploit the full 1920 horizontal samples capable in the origination format.

1. Some transmission systems don't broadcast the full 1920 samples. In the US, some satellite providers are resampling to 1280x1080i, so are effectively delivering a roughly equal perceived angular resolution, matching but not exceeding 1280x720p horizontally AND vertically.

2. In Aus they are using 1440x1080i - to save bandwidth again.

3. HDCam - one of the more widespread HD production VTRs - uses 1440 subsampling internally when it records and replays. (This is because it is partially based on SD 720x480/576 style technology, running at roughly double H and V res rather than nearly 3 times horizontally)

4. Film sources for some reason are often horizontally filtered, or actually the prints scanned for transfer to video don't contain huge amounts of HF information, so there isn't much going on that requires the full 1920 horizontal samples. Sure 4k scanning is used in digital intermediate production - and if the DI sources are used to create the digital HD masters then things might be different - but when you are transferring a 35mm print, even a good one, things aren't always as sharp as people think.
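(Putting some quick numbers on the variants above - these are just the per-line sample counts quoted, not measured resolution.)

```python
# horizontal samples per line actually carried by common 1080-line variants vs 720p
variants = {
    "1080i full raster":           1920,
    "1080i HDCam / Aus broadcast": 1440,
    "1080i US satellite resample": 1280,
    "720p full raster":            1280,
}
for name, width in variants.items():
    print(f"{name:30s} {width} samples/line ({width / 1280:.2f}x a 720p line)")
```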

Bottom line:

1920x1080p fixed pixel displays will be capable of displaying higher resolution material than 1280x720p fixed pixel displays.

Only CRT based displays (and the oddball non-mainstream high-end projection stuff like Eidophor) will be capable of displaying 1080i without de-interlacing.

Any arguments about 720p vs 1080i will depend a lot on how you are watching, the quality of scaling etc.

It is perfectly possible that a sharp 1080i signal de-interlaced to 1080p and scaled to 720p, will look better than a 720p source displayed natively on the same screen. This is because there is more horizontal resolution potentially present in 1080i than with the 720p source - and even if you are displaying this at 720p, just as HD downconverted to SD looks better, 1080 downconverted to 720 may well also look better.
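(A rough Python sketch of that de-interlace-then-scale chain - dummy field data, a simple weave rather than the motion-adaptive processing a real set would use, and whatever scaler PIL happens to offer.)

```python
import numpy as np
from PIL import Image

# two dummy 1920x540 fields standing in for one frame's worth of 1080i
top_field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
bottom_field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)

def weave(top, bottom):
    """Interleave two 540-line fields into a 1080-line progressive frame."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

frame_1080p = weave(top_field, bottom_field)                   # 1920x1080 progressive
panel_720p = Image.fromarray(frame_1080p).resize((1280, 720),  # scale for a 720p panel
                                                 Image.LANCZOS)
print(panel_720p.size)  # (1280, 720)
```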

Of course there is no reason to believe that Sky won't be doing this at the transmission end - using 1080i trucks (or cameras) to shoot sport, but then downconverting to 720p for transmission - at least initially, because if the trucks use Sony cameras then they are only capable of shooting 1080i (they downconvert to 720p in the camera chain). Only Philips/Thomson make a dual-standard native camera - the WorldCam II LDK 6000 - which uses a 4000-line sensor to create either 1080p, 1080i or 720p material by averaging different groups of lines.

If you are buying a display then you also have to include other variables - such as display quality, robustness, build quality, support, input handling etc. into the equation. Just as there are good and bad PC monitors at different resolutions, there are going to be poor 1080p displays and excellent 720p ones... Dot count won't be everything.

(BTW the best quality "consumer" HD TV I have seen so far is the Sony Qualia 006. This is a 70" RPTV using reflective LCDs - not transmissive. Displaying progressive source material, delivered over an interlaced transmission path, it looked really good. Cost US$10k... I won't be buying one just yet...)
 
:offtopic:
Hi,

Rob.Screene,

You have helped me on another thread and I need a bit more advice please, but I cannot send you a PM as your inbox is full!

Message for you below - can you reply to me via PM please? Hope Solidstate won't mind.

Hi Rob,

Thanks for your help in the thread I posted!

I have done some more research into the capture cards you mentioned and I would like a bit more advice, please.

I am now down to deciding between two cards.

The HOLO3D II and the Sweetspot!

I have found a HOLO3D II card which is going for £250, and the Sweetspot is £175 + £48 for an RGBS cable for Sky+ RGB out use!

I may be looking to upgrade to SDI later and both cards can support this (the Sweetspot will cost another £175 for an add-on card, but I'm not bothered about this at the mo!).

Both seem to offer excellent quality, but the Sweetspot is only 9-bit, and the HOLO3D II card has a lot less support and is not supported by DScaler at all - bummer.

For my current situation (Sky+ RGB into the capture card), which card will be best for PQ, or is there another card that will provide better PQ than these two?

Do you know anything about the new ATI Theater 550 Pro? This is supposed to blow away anything around today according to ATI, and it is a 12-bit processor.

Any help will be very much appreciated!

Cheers! :smashin:

Cheers!
 
Really good post Stephen. Confirms a lot of my own opinions, i.e. 1080i has potential which is generally not exploited, yet.
Re the downconversion theory, I remember making Video CDs a few years back from captured MPEG2 files. The files would have to be converted to MPEG1 for VCD, and the rule was: the higher the bit rate of the source file, the better the resulting MPEG1 file, even though it seemed you were wasting disc space and could see no visual improvement in the bigger MPEG2 files.
I have a question about the Philips/Thomson HD camera that you may have an answer to. About this time last year Euro1080 did a documentary on HD and there was a really good graphical explanation of how all the HD and SD (NTSC & PAL) standards were derived from the 4320-line 9-megapixel CCD camera. They never said anything about how the two different frame rates were derived. Is the CCD sampled at different intervals or is there some kind of field-dropping conversion?

Jim.
 
Stephen Neal said:
Only CRT based displays (and the oddball non-mainstream high-end projection stuff like Eidophor) will be capable of displaying 1080i without de-interlacing.

ALiS plasma panels such as Hitachi's 42PD5300 can display 1080i without de-interlacing. (Displaying 1080i on a 1024 x 1024 resolution panel actually involves cropping 28 lines off the top and bottom of the picture in order to avoid vertical scaling.)

As far as I know ALiS panels are the only modern display type that can display interlaced content natively - all the others are progressive.
 
Rimmer said:
ALiS plasma panels such as Hitachi's 42PD5300 can display 1080i without de-interlacing. (Displaying 1080i on a 1024 x 1024 resolution panel actually involves cropping 28 lines off the top and bottom of the picture in order to avoid vertical scaling.)

As far as I know ALiS panels are the only modern display type that can display interlaced content natively - all the others are progressive.

That's odd - I thought ALiS was a plasma technology - how do they get the lines to overlap a bit to reduce the visibility of the line-structure?
 
Muf said:
About this time last year Euro1080 did a documentary on HD and there was a really good graphical explanation of how all the HD and SD (NTSC & PAL) standards were derived from the 4320-line 9-megapixel CCD camera. They never said anything about how the two different frame rates were derived. Is the CCD sampled at different intervals or is there some kind of field-dropping conversion?

Jim.

AIUI they just run the CCDs at different refresh rates. This allows them to run at a 50 or 60Hz refresh for 1080/50i or 60i and 720/50p or 60p (and I guess 576/50i or p and 480/60i or p). They can also run at a 24, 25 or 30Hz refresh rate for 1080/24p, 25p or 30p.

(When running interlaced there is yet more averaging and discarding going on AIUI. Once you've created a progressive frame, to create the interlaced version, in one field lines 1&2, 3&4, 5&6 are averaged to create the field lines; in the following frame lines 2&3, 4&5, 6&7 are averaged to create the field lines. The half-line offset between fields adds to the vertical resolution of static material.)
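(A minimal Python sketch of that line-pair averaging, using one progressive frame of made-up data - in a real camera the two fields would of course come from two successive captured frames.)

```python
import numpy as np

progressive = np.random.randint(0, 256, (1080, 1920)).astype(np.float32)  # one progressive frame

# field A: average line pairs 1&2, 3&4, 5&6 ... (0-based: rows 0&1, 2&3, ...)
field_a = (progressive[0::2] + progressive[1::2]) / 2.0    # 540 field lines

# field B: average line pairs 2&3, 4&5, 6&7 ... - offset by one line
field_b = (progressive[1:-1:2] + progressive[2::2]) / 2.0  # 539 lines (plus an edge line in practice)

print(field_a.shape, field_b.shape)  # (540, 1920) (539, 1920)
```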

It is interesting how Philips (now part of Thomson in broadcast terms) thought differently to the rest of the camera world. The DPMS system they use for their WorldCams is descended from the DPMS system they used when they created their 4:3/16:9 switchable SDTV cameras (as used by BBC News and ITV News studios, as well as many 3rd party OB facilities houses). Uniquely they use a 4:3 sensor - but with far more than 576 lines. The full sensor area is used to create a 4:3 image, but a reduced-height area is used to create a 16:9 one - of equal width, and equal horizontal resolution. (You just use fewer lines of the sensor for the 16:9 image)

Most other manufacturers use 16:9 sensors, reducing the width of the image when running 4:3 - which has more of an impact in lens terms - as lens widths are defined horizontally...
 
Stephen Neal said:
That's odd - I thought ALiS was a plasma technology - how do they get the lines to overlap a bit to reduce the visibility of the line-structure?

Not sure, but I imagine the increased line count (512 lines/field) compared with 240/288 lines/field on an interlaced SD CRT is enough to reduce line visibility. I've seen Fujitsu and Hitachi ALiS panels in action on several occasions, and I didn't notice any visible line structure. ALiS panels display SD and 720p HD as 512p, which produces comparable PQ to the 480 line progressive plasmas.

From the AVS Forum FAQ:

ALiS stands for Alternate Lighting of Surfaces. It is a way to increase vertical screen resolution on a plasma display while maintaining brightness (by allowing bigger pixels) and keeping number of components and therefore costs down.

Existing ALiS screens have 1024x1024 discrete pixels but they are addressed in an interlaced manner, so every 60th of a second all pixels on the odd rows get addressed, followed in the next 60th of a second by the pixels in all the even rows. As was mentioned, this is an interlaced method of display but is still better than CRT direct view, as in a CRT the electron beam has to scan lines one by one. With ALiS all odd lines are lit simultaneously, then all even lines are lit simultaneously.

How this is done is that the odd and even rows share electrodes. In non ALiS displays each row of pixels has a dedicated pair of electrodes so that every row can be addressed simultaneously to produce a progressive display.

With ALiS, the bottom electrode of a row of pixels would also be the top electrode of the next row and so on. What this means in practice is that a row of pixels cannot be lit at the same time as the row next to it because the shared electrode can only be used for one of the two adjacent rows. Hence the interlaced nature of this display.

The following list describes how different sources are displayed with ALiS (thank you to TrainerDave):

720p: each field downconverted to 512 rows

480p: each field is slightly upconverted to 512 rows

480i: each field is upconverted to 512 rows

1080i: each field is cut (not downconverted) from 540 to 512 rows

720p is displayed in 1024 by 512 *resolution* spread over 1024 rows of pixels.

So the 1024 by 1024 is "used" to display a 1024x512-resolution image. For progressive sources the resolution that you actually see is 1024x512. 1080i is displayed in its native vertical resolution over 1024x1024 pixels. Each field that comes in is shown on its own 1024x512 interlaced pixels.

ALiS is supposed to be perfect for a 1080i source as the vertical resolution nearly matches up. You simply lose a few rows top and bottom instead of having the image vertically scaled so there are also no scaling issues. Plus the source is interlaced anyway. In practice though this does not seem to bear out.

There are a number of posts about the Fujitsu 4233 (852x480 progressive) vs. the 4242 (ALiS), and most people observe that HD or other images were noticeably better on the non-ALiS panel, which lists at $2000 less. This could be the result of better black levels on the 4233 or different scaling requirements, but it is highly recommended that you compare yourself before buying an ALiS 1024x1024 panel, as most people here prefer the 852x480 panels. The only people that really prefer the ALiS panels are those that can see the pixel structure on the 852x480 panels. But that is another topic!
 
Thanks Rimmer - interesting post. It sounds as if they have managed to create a "half-line" scanning system. I'd be interested to know how that ties up with the high sub-field frequency scanning that most plasmas have to employ to get a decent grey-scale (along with dithering) - AIUI they run at much higher vertical refresh rates than their source video, to allow grey-scales to be partially created by only illuminating pixels during certain sub-fields.

(I've certainly noticed something that could be attributed to sub-field processing, when I've seen that certain picture information on plasmas "beats" with 50Hz TV cameras on certain models, but other picture content is fine on the same panels. Only some manufacturers' plasmas seem to do this at 50Hz - NECs are fine, Fujitsus not so at 61" sizes.)
 
I should add that the article linked to uses the 25i and 30i terms - these actually mean 25 interlaced frames made up of 50 interlaced fields (or 30 vs 60).

On this board these are more commonly referred to as 50i and 60i (where the field rate is quoted - as it is more helpful in defining the temporal resolution of a display. 25i implies only 25Hz motion rendition - whereas the 50 field nature can carry 50Hz motion rendition)

Also the screen shots of interlaced frames are slightly confusing - the image isn't presented in that form on an interlaced screen.
 
