Don't ignore resolution!

Discussion in 'General TV Discussions Forum' started by bwallx, Feb 4, 2005.

  1. bwallx

    bwallx
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    On a quick look at the HDTV topics, I notice a lot of confusion over what qualifies as an HDTV screen. Sadly, very few do at the moment.

    On another forum (http://217.43.163.185:8889/index.php/xarbb/topic/289) I posted this, which may be of interest here:

    Anyone who has tried viewing TV on a computer TFT will know that the image quality is often disappointing. This is because the transmitted pixels aren't always matched to the actual pixels available in the screen.

    HDTV transmits one million pixels for every frame, with no interlacing. Thus any display device needs to have one million pixels so the transmitted pixels can be mapped one for one with the display.

    The Sony 32" mentioned in my first post has 994,448 pixels, so it is ideal. A 42" screen would also need one million pixels, but as I show above, they have only half that at best. So not only is the screen having to stretch its own 408,960 pixels over a vastly bigger area, it can actually display only half of the transmitted one million pixels anyway! So it throws away every other pixel of information or detail, and what is left is a coarse, blocky image, worse than normal TV on a normal set.
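
    To put rough numbers on it, here's a quick Python sketch (I'm assuming the 42" panels are the common 852x480 type, which gives exactly 408,960, and taking "one million" as 720p's 1280x720 = 921,600):

        hd = 1280 * 720          # ~1 million transmitted pixels per frame (720p)
        sony_32 = 994_448        # panel pixels quoted for the Sony 32"
        plasma_42 = 852 * 480    # = 408,960 -- assumed panel type for the 42"

        print(f'32" panel: {sony_32 / hd:.2f} panel pixels per transmitted pixel')
        print(f'42" panel: {plasma_42 / hd:.2f} -> over half the detail is thrown away')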

    As I say, many people wouldn't know a good quality picture if it socked them in the face - I've seen some atrocious setups in friends' homes and they can't understand what is so wrong with them! On my 32" Sony tube TV, I counted a maximum of only 60 characters of text displayed on one line (in a setup menu). It was obvious that squeezing any more text onto that line by making the font smaller would make it unreadable. That is how poor TV set resolutions are!


    On HDTV, New Scientist last week said: "With average viewing distances of just 2.7 metres in homes, the coarse line structure, especially with NTSC, becomes obvious". Also: "...once buyers get them home they quickly realise that the picture quality leaves a lot to be desired. This is because many flat screen sets simply stretch pictures designed for smaller screens. Doing this makes the pictures' coarse line structure and low pixel count all too obvious. HDTV services, which transmit more finely detailed pictures... ...but many of today's TVs will not be able to take advantage of them. A survey by Screen Digest found that only half of all plasma TVs on sale in Europe - and none of the cheaper ones - can display HDTV pictures. And only two of the 500 flat screens available in the US can display the best available HDTV."
     
  2. Jonny1973

    Jonny1973
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    Just because a display can accept HD pictures does not mean it is hi-def.

    My AE100 projector can handle 1080i (1920x1080) but first it must resize it to 856x480. As a result, the picture would not look any better than standard definition.
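
    A quick sketch of the numbers (illustrative only - it ignores de-interlacing and aspect handling):

        # Rough sketch of how little of a 1080i frame survives on the panel.
        src_w, src_h = 1920, 1080      # incoming 1080i signal
        panel_w, panel_h = 856, 480    # AE100 native resolution

        kept = (panel_w * panel_h) / (src_w * src_h)
        print(f"panel keeps {kept:.0%} of the source samples")  # ~20%
        # And 480 lines is no more than SD carries anyway, hence no HD benefit.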

    To be truly HD a device must have a native resolution of 1280x720 (720p) or 1920x1080 (1080i/1080p). I don't think that even 720p should be called Hi-Def.

    I won't buy another projector until they are natively 1080p so that I get the full benefit of HD-DVD or BluRay when they come out.
     
  3. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    The most common "IT" mistakes when comparing TV and PC screen resolutions are to ignore interlace (the reason most TV on a PC screen looks so bad is poor de-interlacing) and to think that a 1920x1080 or 720x576 image consists of 1920x1080 or 720x576 pixels. It doesn't - it consists of samples (often luminance-only at that resolution) - which aren't exactly the same thing.

    Resolution is important - but other issues are as well.
     
  4. Daneel

    Daneel
    Active Member

    Joined:
    Dec 5, 2002
    Messages:
    2,835
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    68
    Location:
    London
    Ratings:
    +8
    That's why 720p is good :)

    I disagree that an HDTV picture won't look better than standard def because it is being scaled down. In my experience it still looks better.
     
  5. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    Yep - HD sources downconverted to SD still look better than SD native sources.

    This is for a number of reasons I've detailed elsewhere in other threads.

    In brief - HD sources have noise and capture artefacts (like aperture correction etc.) at HD level. When converted to SD these artefacts are reduced - HF noise is averaged out, HD aperture correction appears minimal, etc. Compared to SD sources - where the noise remains visible because it is "SD noise", and the aperture correction can appear harsh because it is "SD aperture correction" (though not aperture correcting normally looks a lot worse...) - the HD downconverts look cleaner, sharper and less processed.
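
    A toy sketch of the averaging effect, assuming a crude 2x2 box-filter downconversion (real downconverters use better filters, so this only shows the principle):

        import numpy as np

        # Box-filter an HD-sized field of unit-std noise down 2x2 and
        # measure how much the noise drops.
        rng = np.random.default_rng(0)
        hd_noise = rng.normal(0.0, 1.0, size=(1080, 1920))

        # Average each 2x2 block into one SD-ish sample.
        sd_noise = hd_noise.reshape(540, 2, 960, 2).mean(axis=(1, 3))

        print(round(hd_noise.std(), 2), round(sd_noise.std(), 2))  # ~1.0 -> ~0.5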

    It isn't that difficult to spot the BBC music shows that have been shot on HD video - or shot using HD cameras mixed in SD. (Many of the same rules apply there, as in the BBC's case the cameras themselves downconvert.)
     
  6. Rob20

    Rob20
    Well-known Member

    Joined:
    Jul 3, 2004
    Messages:
    3,087
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +659
    In my mind, only a TV with a screen resolution of 1920 by 1080p (50/60Hz) is a hi-definition TV. 720p should be called medium-definition TV or something.
     
  7. Gordon @ Convergent AV

    Gordon @ Convergent AV
    Distinguished Member AVForums Sponsor

    Joined:
    Jul 13, 2000
    Messages:
    14,150
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    166
    Location:
    Living in Surrey, covering UK!
    Ratings:
    +2,970
    Rob: Where are the 1080p sources? (Playing devil's advocate here....)

    The original poster needs to spend a little longer on this forum to understand a bit more about the image quality issues.....he has understood compatibility to some degree already.

    Gordon
     
  8. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    Well there are plenty of 1080p broadcast sources...

    Most US HD TV drama is shot at 1080/24p, UK HD drama is usually shot at 1080/25p - and film is much higher resolution than that and can thus be scanned to 1080/24p or 1080/25p.

    If you mean where are the 1080/60p or 1080/50p sources - well that is a much better question to ask ;-)
     
  9. St_ve

    St_ve
    Standard Member

    Joined:
    Jan 6, 2005
    Messages:
    1,059
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    39
    Location:
    Birmingham
    Ratings:
    +0
    Watching hi-def on my monitor at 1280 x 1024, 720 and 1080 material look the same, and likewise on my 50" DLP set. I think we'll need displays much larger than 50" to show the improvement 1080 has to offer over 720.
     
  10. Rob20

    Rob20
    Well-known Member

    Joined:
    Jul 3, 2004
    Messages:
    3,087
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +659
    I was assuming that the HD-DVD and Blu-Ray formats would support 1080p24 for movies (50/60 for TV!?). Also, as most hi-def TV material is broadcast in 1080i, surely you would need a 1920/1080 screen to display it fully. A screen with 720 lines would have to scale a 1080i image, losing 360 lines of info. Also, if you have a screen capable of 1080p, then surely you would be able to convert a 1080i (60/50) broadcast into a progressive signal for display by just repeating each frame twice? Lastly, I remember reading on the net somewhere that ESPN were to start broadcasting a 1080p signal sometime this year. Did anyone else hear/read that?

    Imagine watching a Premiership football game in 1920/1080p 50Hz on a 10-foot screen!
     
  11. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    Rob20, don't make the mistake of thinking that there are 1080 lines of resolution carried vertically in a 1080i signal - there aren't.

    Because the system is interlaced there are resolution reductions vertically that reduce the resolution delivered by an interlaced signal compared to a progressive one.

    First, there is vertical filtering to reduce interline twitter. Broadcast cameras and film transfer equipment, as well as decent broadcast electronic graphics kit, include vertical filtering (optically and via scanning patterns "in camera", electronically in other areas) to reduce the vertical high-frequency picture detail present in a 1080p source frame. This is required to reduce the "twitter" of high-frequency detail, which would differ markedly between fields and cause flicker at the 25Hz frame rate on fine detail.

    There is also the Kell factor - though this may be less of an issue on progressive displays fed an interlaced signal - it is very real for interlaced displays displaying interlaced material. Basically the interlaced presentation doesn't work as well as progressive - the persistence of vision / brain activity of most people doesn't perceive an interlaced display to be as sharp as a progressive one - even if the source frames have the same number of lines.

    Sure the interlaced system is delivering a sharper picture than the equivalent data rate progressive one for a given motion capture rate - a 1080/50i system has the same number of lines or pixels per second as a 540/50p system - but it is reasonably widely accepted that 1080i stuff isn't actually that much sharper vertically than 720p stuff.

    This may sound counter-intuitive - but it is one of the main reasons that 720p exists alongside 1080i.

    Horizontally it is a different matter of course. 1280x720p offers equal resolution horizontally and vertically in angular terms, 1920x1080i doesn't - it is much sharper horizontally than it is vertically in effective angular terms (with 1080i delivering a vertical resolution of around 700-800p equivalence I believe?)
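
    Putting a rough number on that last guess (a sketch using the commonly quoted ~0.7 interlace factor - the exact value is debated):

        # Rule-of-thumb arithmetic only: apply a commonly quoted interlace/Kell
        # factor of ~0.7 to the nominal line count.
        INTERLACE_FACTOR = 0.7

        print("1080i effective vertical res:", round(1080 * INTERLACE_FACTOR))  # ~756
        print("720p effective vertical res:", 720)  # progressive keeps its lines
        # ~756 vs 720 -- squarely in the 700-800p range suggested above.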
     
  12. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    If your monitor is 4:3 and you are watching in 16:9 letterbox, you only have a 720p-capable display, so you wouldn't be able to evaluate the quality differences directly. (1280x1024 square pixels is the 4:3 equivalent to 1280x720 16:9 if you want to retain 1:1 pixel->sample mapping, and thus letterbox with no resolution loss.)

    For a 1920x1080 16:9 image to be displayed in letterbox on a 4:3 monitor with a 1:1 pixel->sample mapping, you'd need a 1920x1440 resolution 4:3 monitor. If your monitor isn't that sharp then you can't evaluate the full resolution differences.
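
    The arithmetic behind those figures, as a quick sketch:

        # A 16:9 image w samples wide needs w*9/16 lines, and a 4:3 monitor
        # that can letterbox it unscaled needs w*3/4 lines in total.
        for w in (1280, 1920):
            print(f"{w}-wide 16:9: {w * 9 // 16} image lines, "
                  f"needs a {w}x{w * 3 // 4} 4:3 monitor")
        # 1280 -> 720 lines, monitor 1280x960 (so 1280x1024 has lines to spare)
        # 1920 -> 1080 lines, monitor 1920x1440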

    I'm not saying that a 1080i signal will contain enough resolution to require that display resolution for full quality display - just that to discern the limitations you need to see the full signal potential.

    Similarly, your DLP screen is 1280x720, isn't it? So again not a full-resolution 1080p-capable screen - so you can't see whether 1080i is better than 720p, because your set is converting to 720p for display.

    Both will be converting the 1080i signal to progressive for display as well - so de-interlacing performance is also an issue.

    The only real way of comparing the formats "in real world terms" is to look at 720p and 1080i sources displayed on 720p and 1080i native displays, and also both formats up/down/cross converted to 720p and 1080p for display on screens using this format.
     
  13. Rob20

    Rob20
    Well-known Member

    Joined:
    Jul 3, 2004
    Messages:
    3,087
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +659
    AFAIK the 720/1080 figure describes the number of horizontal lines a picture is made up of. The 1280 or 1920 part refers to the number of individual pixels per line. I assumed that a 1080i picture would transmit the odd lines only of a 1920 by 1080p frame first, then the even (without losing any info in the frame). So the two fields combined would give you the whole 1920 by 1080p frame (every 2 cycles, 1/30 or 1/25 of a second). That's how I assumed it worked. :confused:
     
  14. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    Yep - so a 1920x1080 progressive image will contain a 16:9 square-pixel/sample image (i.e. samples of equal width and height - standard-def broadcast TV doesn't use square pixels in either 4:3 or 16:9). This is 1080 vertical lines consisting of 1920 samples per line.

    A 1280x720 progressive image, again square pixel 16:9, will be 720 vertical lines consisting of 1280 samples per line.

    However interlacing confuses issues a lot. It isn't anywhere near as simple as taking 25 1920x1080 frames each second, and sending 540 out of the 1080 lines in one 1/50th of a second, and the other 540 in the next.

    No.

    Video sourced interlaced material contains movement between the 1/50th second fields (to allow for more fluid motion), so the 2 x 540 line fields will not contain information captured at the same time, so if you add the two fields back together you don't get a single 1920x1080 image - you have movement between the two fields. (Effectively the resolution on movement can drop to 1920x540)

    It is better to think of it as a system that starts with 50 1920x1080 frames each second, but only sends half the lines from each. (If the source is 25fps film then two consecutive 50fps frames will be identical, so the two fields won't have motion between them)

    However, interlaced displays only show each field alternately - so any really fine static vertical detail that is present in the source 1920x1080 frame but sent in only one field and not the other will flicker at 25Hz (not 50Hz, as it would if it were present in both fields).

    To reduce this annoying "twitter" there is filtering of varying sorts in the broadcast chain. In cameras, this is done by line-averaging the 1920x1080 image captured 50 times a second: the 540 lines of field 1 are made up of lines 1+2 averaged, lines 3+4 averaged, lines 5+6 averaged, and so on; in field 2, lines 2+3 averaged, lines 4+5 averaged, etc. This provides a degree of vertical detail filtering to avoid flicker (at first glance it reduces the resolution to 540 lines, but the line offset between fields partially makes up for this). Similar vertical filtering is carried out in graphics and film transfer.
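
    In code terms the averaging looks something like this (a minimal numpy sketch of the arithmetic only - real cameras do it optically/electronically, and edge handling is glossed over):

        import numpy as np

        def filtered_fields(frame):
            """Split a progressive frame into two vertically filtered fields."""
            field1 = (frame[0::2] + frame[1::2]) / 2    # lines 1+2, 3+4, 5+6, ...
            field2 = (frame[1:-1:2] + frame[2::2]) / 2  # lines 2+3, 4+5, ...
            return field1, field2

        frame = np.random.default_rng(0).random((1080, 1920))
        f1, f2 = filtered_fields(frame)
        print(f1.shape, f2.shape)  # (540, 1920) (539, 1920) -- edge line needs padding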

    Therefore the 1080i signal may have been sourced from a 1920x1080 sensor, but it doesn't necessarily have the full vertical resolution that a 1920x1080 sensor could provide progressively.

    It is VERY important, when thinking about TV, not to think in terms of PC graphics based around pixels. A TV camera or film telecine is scanning a real-world image, where there is detail at far higher spatial frequencies than can be resolved by the TV system in use - video is all about sampling real-world images, not pixel-by-pixel artwork.
     
  15. loz

    loz
    Well-known Member

    Joined:
    Jun 4, 2001
    Messages:
    13,058
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    136
    Ratings:
    +1,786
    Please put it in simple terms.
    Given everything else remains equal, which of all these (720p, 1080i, 1080p, etc, etc) actually resolves/contains the most detail?

    If the source player and display correspond (so no scaling and such), it must be straightforward to simply put them in ascending order.
     
  16. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    Aaarghhh....

    Life isn't simple. Everything else is never equal. Not all programme material is the same. If there was a simple "1080 good, 720 bad" argument it would have been clearly made here ages ago. It hasn't been because it isn't simple. How people view material is also important when you decide how to shoot and transmit it these days.

    Film runs at 24 or 25fps (as does a lot of drama shot in HD at 24/25p) - it doesn't exploit frame rates higher than 24-25fps - whereas sport and other fast-action programme material benefits from the higher frame rates, making it more fluid...

    If you want to be simplistic then:

    1920x1080/60p would be the best format - except it doesn't really exist.
    1920x1080/50p would be almost as good - except, again, it doesn't practically exist.

    1920x1080/24p or 25p carrying film or 24/25p video will be sharper than the same material carried as 1280x720/50p or 60p (with the latter the extra frames are wasted carrying repeats, and the resolution is lower).

    However material which requires the motion portrayal of a 50/60Hz based system would look horrid (juddery and flickery) if carried by a 1920x1080/24/25/30p system, whereas it would look pretty good in 1280x720/50p or 60p as there will be more motion detail. (Sport on film in real-time never looks as "real" as sport shot on video)

    1920x1080/60i and 1920x1080/50i vs 1280x720/60p and 1280x720/50p is a much tougher argument. There is no definitive answer. It is swings and roundabouts based on lots of variables, like programme content, encoding system, available bit rate etc.

    1080i isn't as sharp vertically as 1080p. 1080p will require far more space to broadcast if it retains the same frame rate as the field rate of an interlaced system.
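
    The raw sample rates behind the space argument (luminance samples only, ignoring chroma and compression, so indicative only):

        # Why 1080/50p roughly doubles the payload of 1080/50i.
        formats = {
            "1080/50p": 1920 * 1080 * 50,
            "1080/50i": 1920 * 540 * 50,  # 540 lines per field, 50 fields/s
            "720/50p": 1280 * 720 * 50,
        }
        for name, rate in formats.items():
            print(f"{name}: {rate / 1e6:.0f} Msamples/s")
        # 1080/50p ~104, 1080/50i ~52 (the 540/50p equivalence above), 720/50p ~46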

    Interlaced signals and progressive displays are not a good mix.

    In theory - if you could run multiple line and frame rates then:

    1080/24 or 25p would be the best transmission system for film and drama
    720/50p or 60p would probably be better for sport and fast motion stuff - especially if displayed on plasmas, LCDs or DLPs. Whether 1080i looks better on an interlaced display is a more difficult question - but how many people are going to be buying interlaced HD CRT displays now? (I would if I could!)

    1080/50p or 60p would be even better for sport and fast-motion stuff - like entertainment shows - but it isn't practical to produce in this format yet (unlike transmission, production doesn't use much compression, so doubling the amount of data required to record and route broadcast HD is quite a challenge).
     
  17. Gordon @ Convergent AV

    Gordon @ Convergent AV
    Distinguished Member AVForums Sponsor

    Joined:
    Jul 13, 2000
    Messages:
    14,150
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    166
    Location:
    Living in Surrey, covering UK!
    Ratings:
    +2,970
  18. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    Wasn't sure what you meant to be honest.

    If you mean consumer source devices - then there aren't many 1080p bits of kit knocking around.

    If you mean broadcast source devices - well there are plenty of 1080/24-25-30p sources around.

    There aren't many 1080/50 or 60p devices though - well the sources may exist, but the mixers, VTRs, Servers etc. to handle them aren't as widespread, and they require 2xHDSDI interconnects per source if you are handling them uncompressed AIUI.
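
    A back-of-envelope for the 2xHDSDI point (assuming 10-bit 4:2:2, i.e. ~20 bits per pixel, the nominal 1.485 Gbit/s HD-SDI rate, and ignoring blanking):

        bits_per_pixel = 20
        hdsdi_gbps = 1.485

        payload = 1920 * 1080 * 50 * bits_per_pixel / 1e9  # ~2.07 Gbit/s
        links = -(-payload // hdsdi_gbps)                  # ceiling division
        print(f"1080/50p: ~{payload:.2f} Gbit/s -> {links:.0f} HD-SDI links")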
     
  19. Rob20

    Rob20
    Well-known Member

    Joined:
    Jul 3, 2004
    Messages:
    3,087
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +659
    What about 1080p output from scalers? Or Denon's upcoming DVD-A1XV, which has the HVQ chip that will scale DVDs to 1080p? Not true 1080p sources, but a 1080p signal.

    Anyway. It makes sense to me to be future-proof and buy a set with 1920 by 1080 resolution. I have an LCD (less than a year old) that can show 720p and scale 1080i, so I'm in no rush to buy another and can wait - especially if HD-DVD or Blu-Ray does carry movies in 1080p24. Why accept anything less, unless you can afford to buy a new TV every two years or so?
     
  20. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    You can't really count 1080p sources from scalers as native 1080p sources - you might as well include de-interlaced 1080i as 1080p if that were the case. Sure the video signal output will be 1080p - but the content of the signal may not fully exploit the resolution available.

    Yep - Blu-Ray/HD-DVD may well support 1080/24 or 25p (and add vertical filtering in the replay device on 1080/50i or 1080/60i replay rather than during mastering, I guess) - we'll have to wait and see.

    Obviously it makes sense to get the best spec TV you can afford - however a poor 1080p display may look worse than a good 720p one - resolution isn't everything.
     
  21. Rob20

    Rob20
    Well-known Member

    Joined:
    Jul 3, 2004
    Messages:
    3,087
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +659
    If DVD players are able to output either a 480i or 480p signal, surely an HD-DVD or Blu-Ray player will be able to output either 1080p or 1080i? I would assume movies would be flagged 24p, as they are on DVD. Don't all DVD players have to de-interlace the programme before sending out a signal (unless they have a progressive option/output)?

    Also, I assume that hi-def broadcasts are to be 1080i50 because 1080p50 would take up too much bandwidth. This shouldn't be a problem for 50GB Blu-Ray discs. It might give you sufficient reason to buy the DVD of TV shows instead of just recording them yourself - assuming there's a significant/noticeable difference between 1080p and 1080i.
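
    For a sense of scale (illustrative bitrates only - these are assumptions, not disc specs):

        # Hours of video on a 50 GB disc at assumed average bitrates
        # (the real figure depends entirely on codec and content).
        disc_bits = 50e9 * 8
        for mbps in (20, 30, 40):
            print(f"{mbps} Mbit/s -> ~{disc_bits / (mbps * 1e6) / 3600:.1f} h")
        # 20 -> ~5.6 h, 30 -> ~3.7 h, 40 -> ~2.8 h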
     
  22. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,259
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +763
    480p (and 576p) progressive outputs from DVD players still have the vertical filtering introduced to improve the interlaced-video look.

    A 480p output from a DVD (with 24p sourced images but encoded for 480i replay) will not look as good as a 480p natively sourced image - it will be softer vertically.
     
