No need for 1080P sets as they look worse than 720P sets

Discussion in 'General TV Discussions Forum' started by meansizzler, Dec 27, 2006.

  1. meansizzler

    meansizzler
    Banned

    Joined:
    May 11, 2005
    Messages:
    7,892
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    104
    Location:
    London
    Ratings:
    +343
    Is it just me, or does anyone else think there is no point releasing 1080p sets, as the image on them can look worse than on a 720p set? Take a 1080p-native movie: watching it on a 1080p set shows all the detail as well as all the grain. I've seen the Superman 1080p Blu-ray on a PS3 on a 1080p Sony screen and there is just too much grain; watch it on a 720p set, though, where the image is downscaled from 1080p to 720p, and you get less grain.

    An example of this: if you have a 4MP digital camera, pictures taken at their native resolution look grainy when you view them at full size, but view them at half that resolution and they look much sharper, with less grain/pixelation.
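
    As a rough sketch of the averaging effect being described here (assuming only numpy; the flat grey "photo" and the noise level are invented purely for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    clean = np.full((1080, 1920), 128.0)             # flat grey "photo"
    grainy = clean + rng.normal(0, 10, clean.shape)  # simulated grain, sigma = 10

    # naive half-resolution downscale: average each 2x2 block of pixels
    half = grainy.reshape(540, 2, 960, 2).mean(axis=(1, 3))

    print(round(grainy.std(), 1))  # ~10.0 : grain as captured
    print(round(half.std(), 1))    # ~5.0  : averaging 4 pixels roughly halves the noise

    (Real detail gets averaged away along with the noise, of course; this only shows why the grain is less obvious at the smaller size.)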

    Also, regarding the Xbox 360: games are 720p, and outputting them at 1080p means the image is upscaled, so it won't look as good as viewing it at native 720p. Imagine zooming in on a picture, you get pixelation, right?..

    So why all this 1080p/Full HD hype? There is really no need for it unless you have content encoded at a resolution higher than 1920x1080, e.g. 2560x1600. Just look at those 1080p QuickTime HD trailers on the Apple website: they look terrible at 1080p, so much grain/pixelation, but watch them at 720p and they look amazing..

    1080p is only useful for games that are 1080p native, e.g. Virtua Tennis 3 on PS3. For films there is no need for it; stick with a 720p set, or even one of those 1366x768 displays....
     
  2. arfster

    arfster
    Active Member

    Joined:
    Jan 17, 2006
    Messages:
    1,459
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    63
    Ratings:
    +132
    Those QuickTime 1080p trailers look poor for the simple reason that the bitrate isn't high enough, whereas the bitrate for the 720p ones is.
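
    A quick bits-per-pixel comparison shows why the bitrate matters so much more at 1080p. The 8 Mb/s figure below is an assumption for illustration, not Apple's actual trailer bitrate:

    bitrate = 8_000_000   # bits per second (assumed)
    fps = 24

    for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
        print(name, round(bitrate / (w * h * fps), 3), "bits per pixel")

    # 720p  -> ~0.362 bits per pixel
    # 1080p -> ~0.161 bits per pixel: the same bit budget spread over 2.25x as many pixels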

    A proper 1080p set of a decent size (e.g. 50" minimum) is well worth having. Of course it's new to the market, so you might find decently priced 1080p sets aren't that good, whereas you can get outstanding 720p sets for very little (e.g. the Panasonic TH42PX60 is £800 now). Also, for a lot of people 1080p isn't needed, simply because they sit too far away to see the benefit or don't buy a big enough set (or frequently both). Sit at 8 feet and buy a 50"+, otherwise go for a 720p set and save your cash :)
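
    A back-of-envelope check of that "8 feet / 50 inch" rule of thumb, assuming the usual approximation of about 1 arc-minute of visual acuity per pixel (an assumption, not a hard limit):

    import math

    distance_m = 8 * 0.3048                          # 8 feet
    width_m = 50 * 0.0254 * 16 / math.hypot(16, 9)   # width of a 50" 16:9 screen
    pixel_pitch_m = width_m / 1920                   # one 1080p pixel

    arcmin = math.degrees(pixel_pitch_m / distance_m) * 60
    print(round(arcmin, 2))   # ~0.81 arc-minutes per pixel: right at the edge of what the eye resolves

    Sit much further back, or buy much smaller, and the extra pixels simply stop being visible.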

    Of course, the SXRD blows all this away, 55" 1080p for only £1500. Big enough to show what 1080p can do, yet half the price of a 50" LCD of similar quality.
     
  3. meansizzler

    meansizzler
    Banned

    Joined:
    May 11, 2005
    Messages:
    7,892
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    104
    Location:
    London
    Ratings:
    +343
    I think you're missing the point: watching 1080p content on a 720p set looks better than on a 1080p set, no matter how close you are. Reducing the resolution of the video reduces the grain and increases the sharpness. My main point was that there is no point in 1080p sets unless you run higher-than-1080p content through them, i.e. 2560x1600, and let a scaler scale it down to 1080p and feed it to the set. Currently the HD DVD/Blu-ray 1080p resolution is just not high enough to give a decent picture on a 1080p display..

    If the guys who encoded the Blu-ray movie had done a decent job in the first place, there would not be a problem. I mean, isn't 50 GB enough space to work with?.. Drop the MPEG-2 standard and move on to VC-1: at 12 Mb/s you have a great picture at only 10 GB a movie; double the bitrate and you have an insanely good quality 1080p encode.
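
    The file-size arithmetic behind that claim, assuming a 2-hour feature and ignoring audio tracks and disc overhead:

    runtime_s = 2 * 60 * 60
    for mbps in (12, 24):
        gigabytes = mbps * 1_000_000 * runtime_s / 8 / 1e9
        print(mbps, "Mb/s ->", round(gigabytes, 1), "GB")

    # 12 Mb/s -> 10.8 GB
    # 24 Mb/s -> 21.6 GB, still comfortably inside a 50 GB disc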
     
  4. flight78

    flight78
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    This is very interesting, thank you for bringing it up.
    I've been thinking of buying a new TV. First I thought 1080p, but since I'm going to use it mostly for gaming (Xbox 360 and such), maybe it's better to go 720p, since most games use that? Otherwise the set will have to upscale the image, which will make the picture worse, right?? :confused:

    Maybe this is the wrong section to post this. If it is please feel free to delete it.
     
  5. meansizzler

    meansizzler
    Banned

    Joined:
    May 11, 2005
    Messages:
    7,892
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    104
    Location:
    London
    Ratings:
    +343
    The 360 will upscale the image, so it won't look as good as on a 720p display... The only thing that will look good is a native 1080p game, of which there are a few, and only on the PS3, e.g. Virtua Tennis 3... Also, if you hook a PC up to a 1080p set it should look great for web surfing, but other than that my opinion is to steer clear of these sets, unless they start to do better 1080p HD DVD/Blu-ray encodes...

    Most LCDs are 1366x768 and the 360 can do 1360x768 via the VGA cable, so you lose 3 pixels on each side but get 1:1 pixel mapping. It is only a small upscale from 1280x720 to 1360x768, so games will still look great, and HD DVDs will be downscaled to 1360x768 so they will look great too. So if you go for an LCD, look for one with that resolution. A lot of the newer plasmas use that resolution as well, but some use 1024x768 widescreen, which the 360 VGA cable can do; if you hook a PC up to one of those, though, I'm not sure it can output that resolution. So if you go for a new TV, get one with either a 1366x768 or 1280x720 resolution for a decent picture...
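
    For anyone checking the numbers, the pixel arithmetic behind the 1366x768 / 1360x768 point:

    panel_w, vga_w = 1366, 1360
    print((panel_w - vga_w) // 2)    # 3 unused columns on each side of the panel

    # scale factors when a 1280x720 game is stretched to 1360x768
    print(1360 / 1280, 768 / 720)    # 1.0625 and ~1.067 : only a small upscale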
     
  6. Mr.D

    Mr.D
    Well-known Member

    Joined:
    Jul 14, 2000
    Messages:
    11,198
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +1,241
    All films use a variety of film stocks. Grain will vary on a scene-by-scene and even shot-by-shot basis. Grain can also be exacerbated deliberately for creative effect, or as a consequence of other creative processing (Se7en, AI, WOTW, Syriana... too many to mention).

    Grain is the image; you cannot remove grain that is already present without compromising the image detail. It's naive to consider grain a flaw: you might as well say the texture of a canvas is a flaw in a painting. It's an inherent part of the piece.

    Superman Returns was shot on a Genesis "digital film" camera. I'm not entirely sure of the resolution they used (I'll have some Genesis footage to work with in the near future), although I know a lot of the effects shots were completed at 4K. Whilst it doesn't have grain as such, it does still exhibit noise to a greater or lesser extent.

    HD masters of films, whether digitally acquired or shot on film, should be faithful to the director's and cinematographer's intended vision. If that means visible noise or grain, so be it. In fact most of the compression schemes will lose fine transient detail as a consequence of the compression techniques they employ, so maintaining a faithful grain structure is something a good master should exhibit.

    There are also sound perceptual reasons for maintaining a bit of noise/grain in an image, as it actually helps human beings pick out fine detail. It makes imagery look sharper and smoother (it's also very good at hiding quantisation artifacts on 8-bit video).
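
    A minimal sketch of that last point, which is essentially dithering: adding a little noise before quantising a gentle gradient to a few levels breaks the hard steps up so the eye averages them out. The numbers are arbitrary and only for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    ramp = np.linspace(0, 4, 1920)            # a gentle gradient spanning five code values
    plain = np.round(ramp).astype(np.uint8)   # hard steps -> visible banding
    dithered = np.clip(np.round(ramp + rng.normal(0, 0.5, ramp.shape)), 0, 4).astype(np.uint8)

    print((np.diff(plain) != 0).sum())     # 4 transitions: five big flat bands
    print((np.diff(dithered) != 0).sum())  # hundreds of transitions: the levels interleave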

    Grain can actually take on an unpleasant lumpy look at lower resolutions, so deliberately dropping down to 720p from 1080p won't necessarily diminish noise; it may even make it look more unnatural.
     
  7. arfster

    arfster
    Active Member

    Joined:
    Jan 17, 2006
    Messages:
    1,459
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    63
    Ratings:
    +132
    Certainly doesn't, at least on my Sagem 720p 45" and Sony 1080p 55". Of course, if the 1080p set isn't good enough or not properly set up (aka every showroom), the benefit won't be seen.

    Sorry, that doesn't make sense. Reducing resolution can never increase sharpness.
     
  8. dolph

    dolph
    Active Member

    Joined:
    Dec 16, 2002
    Messages:
    1,192
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    48
    Ratings:
    +26
    And of course reducing resolution loses detail....
     
  9. neilmcl

    neilmcl
    Well-known Member

    Joined:
    Oct 29, 2004
    Messages:
    6,238
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    103
    Location:
    Nottingham
    Ratings:
    +292
    Well, why don't you take this further and say why bother with HD sets at all? Just stick to your faithful CRT. :rolleyes:
     
  10. Welwynnick

    Welwynnick
    Well-known Member

    Joined:
    Mar 16, 2005
    Messages:
    7,274
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Location:
    Welwyn, Herts
    Ratings:
    +942
    Though I still think that a 1080p display looks much better than 720p with the right source, I think meansizzler has got a point. Now that I've seen a number of HD DVDs (rather than just downloads) it seems that grain does enter the equation at 1080p, and I have to say I wish it hadn't. I guess we're all familiar with grain on large photo prints and at the cinema, and I suppose in some ways the director is trying to recreate the cinema experience in the way the disc is produced.

    Firstly, in one sense, perhaps watching an HD DVD on a big screen at home is more like watching the film at the cinema if you can see the grain, as it's part and parcel of the filming and projection process.

    Secondly, showing grain may also be a perverse way to promote the resolution capability of HD playback: 576 and 720 lines aren't enough to show it, so being able to see it at 1080 must be an asset, then?

    I disagree with all of this, because of the questionable objective of "recreating the cinema experience at home". To my mind, video replay should be a reproduction of an original event, and if it is successful, it will transport you there. On a beach; in a whale's stomach; wherever the director decides it is. But that is all a cinema is trying to do - to reproduce something else, and hopefully make you forget that you are even in a cinema at all, if the film is good enough.

    Going to the cinema is not the event itself, and it has a few limitations of its own that contrive to spoil the illusion; I think grain is one of them. There's not much you can do about it at the cinema, because of the size of the film and the size of the screen, but just because you can see grain at the cinema doesn't mean you should see it at home.

    Except where the director has chosen to shoot on a grainy film stock for theatrical use (their choice, not mine), I don't think there's any reason for HD discs to suffer grain. I believe 35mm film is supposed to be able to resolve 2,000-4,000 lines depending on its speed (Mr.D can correct me there), so the film itself shouldn't be a limitation. But it does seem that with some HD DVDs grain is deliberately added, and I think we're becoming familiar with them. I've seen MI:3 on several different systems now and been frustrated by all of them. Equally, there are largely grain-free films like Batman Returns or King Kong that are much more successful in video reproduction, to my mind. With the grain out of the way, you are free to watch the film in all its clarity, and there is that much less to remind you that you are merely watching some intermediate reproduction process that cannot help but get between you and the original event.

    I think grain on discs is like distorted treble or boomy bass on the soundtracks. That's what a cinema sounds like, so shouldn't you try to reproduce the whole cinema experience at home, warts and all?

    Sorry, rant over.

    Nick
     
  11. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,595
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +920
    Bizarrely, resolution and sharpness are not entirely interlinked. A lower-resolution source can appear sharper than a higher-definition source; it is not just about resolution, but also about contrast and the distribution of high-frequency content.

    Of course, compression at different resolutions, and how it is implemented, can also impact sharpness, as can oversampling effects. (Which is why a 576i-originated image can appear less sharp than a 1080i downconversion to 576i, even though they have identical resolutions.)

    The CineAlta research that Sony undertook demonstrated that some 1080p video implementations were "sharper" than film equivalents, even though the film had a higher resolution.
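
    One way to see that sharpness and resolution are different things: an unsharp mask steepens the contrast across an edge without adding a single extra sample. A toy one-dimensional example, purely for illustration:

    import numpy as np

    edge = np.repeat([0.0, 1.0], 20)                        # a hard step edge
    soft = np.convolve(edge, np.ones(5) / 5, mode="same")   # softened, low-contrast version

    blur = np.convolve(soft, np.ones(5) / 5, mode="same")
    sharpened = soft + 1.5 * (soft - blur)                  # unsharp mask, no clipping

    print(round(soft.min(), 2), round(soft.max(), 2))            # 0.0 1.0
    print(round(sharpened.min(), 2), round(sharpened.max(), 2))  # undershoot below 0, overshoot above 1

    The sample count is unchanged; only the contrast across the edge has increased, which is what the eye reads as "sharper".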
     
  12. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,595
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +920
    I guess the fact that research has gone into artificial grain introduced in domestic players on replay won't be good news for you then...

    Because compression hates grain, to deliver the minimum of compression artefacts it is a good idea to ensure the master you compress from is as grain-free as possible. However, if you do this, some people think the result ceases to look like "film", so research has gone into re-introducing a grainy look AFTER decompression in the player...
     
  13. Mr.D

    Mr.D
    Well-known Member

    Joined:
    Jul 14, 2000
    Messages:
    11,198
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +1,241
    This is an oversimplification of the whole relationship between audience and exhibition. Films are created artifacts and, like all artworks, work on a variety of different levels with regard to the response the creator is trying to provoke from the viewer. Films are not about trying to convince the audience they are watching some analogue of reality.

    The actual mechanisms of film exhibition are extremely abstract. Time unfolds differently from reality, the visual experience is far removed from reality, and the editing together of more than a single viewpoint of a given scene is not reality. It's a totally unnatural experience, and those differences are the very tools a director uses to provoke the desired response from the audience. If you showed a traditional mainstream film to someone with no cultural experience of the medium, it would make little sense to them; it would be little more than a series of vaguely connected images with some disjointed music over the top.

    Directors are delivering you specific codifiers and mechanisms to engender a desired response. Viewing films as mere escapist alternative realities is like describing an orchestra playing a piece of music as some people making noises with various equipment.

    If you are watching a film with deliberate grain, the grain is there to communicate something. On a simple level it may be trying to echo a cinéma vérité, documentary-type look that is either sympathetic to or at odds with the action depicted. It can be there as a distancing effect, precisely to remind the audience they are watching a created artwork. It may just be trying to lend a grittier, more unpleasant visual reinforcement to the proceedings.

    If the grain wasn't necessarily deliberate but is faithful to the look of the director's approved version of the film, then it's also valid to depict it in the resulting video master, as it's representative of various aspects of the film's production that can be of interest to the wider meaning of the work, from both a social and historical perspective as well as a purely aesthetic one.

    If the grain is an artifact caused by deterioration of the original elements through time or physical damage, then it should be cleaned up as much as possible, because it is not indicative of the original intent or approved depiction of the film.

    I see very little difference between removing faithful grain and colourising black and white films.
     
  14. Mr.D

    Mr.D
    Well-known Member

    Joined:
    Jul 14, 2000
    Messages:
    11,198
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +1,241

    Well, Kodak would disagree with you there over how the MTFs were calculated, and I for one have not seen a digital capture format that looks as sharp and smooth as film for a similar scene, and that includes Genesis footage. In fact I'd go as far as to say I've seen Super 16 look as sharp a lot of the time.
     
  15. Mr.D

    Mr.D
    Well-known Member

    Joined:
    Jul 14, 2000
    Messages:
    11,198
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +1,241
    Which is one of the many limitations of compressed video, and is also a major factor in what makes VC-1 a superior codec to MPEG-2, at least to my eyes. I would be interested in what Amir, or anyone with more involved experience of the two codecs, has to say.
     
  16. Mr.D

    Mr.D
    Well-known Member

    Joined:
    Jul 14, 2000
    Messages:
    11,198
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +1,241

    Just to clarify that comment.

    Within the limits and capabilities of a given format to display it in a subjectively faithful manner.
     
  17. Stephen Neal

    Stephen Neal
    Well-known Member

    Joined:
    Mar 29, 2003
    Messages:
    6,595
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +920
    They'd disagree with Sony - not me!

    I think that sharpness and resolution and their relationship is an interesting discussion point.

    I've had interesting discussions with people I work with about how HD allows higher quality blurring!
     
  18. Welwynnick

    Welwynnick
    Well-known Member

    Joined:
    Mar 16, 2005
    Messages:
    7,274
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Location:
    Welwyn, Herts
    Ratings:
    +942
    Ho ho ho, knew this would be a can of worms. Well it is for me.

    I suppose I'm an anal, politically correct videophile, but I do know what I want to watch, and why. I can see why directors chose the grainy look for films like Se7en, Traffic or Three Kings, and if that can be carried through to HD DVD then fair enough, but just because it's obvious at the cinema doesn't mean it has to be on disc.

    I quizzed Amir about this at the MS/PJ seminar last week, and his comments were quite interesting. It sounded like the master was "grained up" for some discs. I guess what looks grainy on a 30-foot screen will not at 30 inches, so it has to be exaggerated. I presumed there might be a couple of ways to reproduce it.

    The first would simply be to capture it in the master as part of the intended signal, and carry it through the system as some sort of overhead. Since grain is fine-textured and random in nature, this would have to be quite a heavy burden on the coding and compression, and Amir confirmed that it was.
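
    A crude way to see why: a compressor can squeeze predictable picture content right down but can do almost nothing with pure noise. (zlib is lossless and real video codecs are lossy, but the entropy argument is the same.)

    import zlib
    import numpy as np

    rng = np.random.default_rng(2)
    smooth = np.linspace(0, 255, 1920 * 1080).astype(np.uint8)   # a clean gradient "frame"
    noisy = rng.integers(0, 256, 1920 * 1080, dtype=np.uint8)    # a pure-noise "frame"

    print(len(zlib.compress(smooth.tobytes())))  # a few kilobytes out of roughly 2 MB
    print(len(zlib.compress(noisy.tobytes())))   # essentially no smaller than the raw 2 MB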

    Secondly, I asked if there was some sort of "grain engine", where grain could be detected and characterised before coding, then simply flagged on the disc, and reproduced by a grain generator in the decoder. I recall reading about something like this - developed by Philips or Siemens, I think. I understood that this wasn't used, though.

    Different people have always had different views about what the ideal limit of detail should be: grain or blur. My feeling is that blur is more innocuous and doesn't take your attention away from the subject to such an extent. Where grain doesn't represent the limit of resolution, I don't believe it should be exaggerated. If it is, where do you draw the line? Should it be visible on a 720-line display? Should it even be visible on DVDs?

    YEUCH - let's add noise, distortion, clicks and pops to CDs!

    Nick :(
     
  19. arfster

    arfster
    Active Member

    Joined:
    Jan 17, 2006
    Messages:
    1,459
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    63
    Ratings:
    +132
    Thomson did "Film Grain Technology" as an add-on for H.264; not sure if any current discs use it, though, or if they ever tweaked it for VC-1 usage. It works exactly as you say: degrain/compress/grain-flag, then recreate in the player.
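
    A hand-wavy sketch of that degrain / flag / regrain flow, just to make the idea concrete. The real Film Grain Technology parameterises the grain far more carefully than this; here a single variance value stands in for the "flag" carried on the disc:

    import numpy as np

    rng = np.random.default_rng(3)

    def degrain_and_flag(frame):
        # crude degrain: 2x2 box filter; the noise level that was removed becomes metadata
        h, w = frame.shape
        smooth = frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)).repeat(2, 0).repeat(2, 1)
        return smooth, (frame - smooth).std()   # the smooth frame goes off to the encoder

    def regrain(decoded, grain_sigma):
        # player side: synthesise statistically similar grain after decoding
        return decoded + rng.normal(0, grain_sigma, decoded.shape)

    frame = 128 + rng.normal(0, 6, (1080, 1920))   # stand-in for a grainy film scan
    clean, sigma = degrain_and_flag(frame)
    print(round(sigma, 1), round(regrain(clean, sigma).std(), 1))   # similar noise level, new grain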

    Perhaps the ideal might be for players to have a user option not to recreate such grain :)
     
  20. Mr.D

    Mr.D
    Well-known Member

    Joined:
    Jul 14, 2000
    Messages:
    11,198
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +1,241
    Noise on HD video is more obvious and "objectionable" looking than the grain on most modern film stocks anyway. At least it is in the majority of lighting scenarios you'll find on a mainstream film.

    The latest 400 ASA film stocks from Kodak are not much grainier than the 100 ASA stocks from less than 5 years ago.

    And I suspect that the current crop of digital displays most people are using further clouds the issue. I've not seen an LCD or plasma that didn't introduce its own noise into the image (direct-view LCDs are particularly terrible for this in my experience: my Dell 2405 is frankly useless for appraising compression artifacts as a result).

    HD DVD and BD are both at best 1080p 4:2:0. This essentially means that the majority of the information used to create the blue and red channels has already been downsampled significantly from 1080p. The bulk of the visible grain in a film-originated image comes from the blue and red records. So I don't see much merit in further downsizing as a means to "control" grain.
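
    The chroma arithmetic behind that point: in a 1080p 4:2:0 stream, each of the two colour-difference planes is stored at a quarter of the luma resolution.

    luma = 1920 * 1080
    chroma = (1920 // 2) * (1080 // 2)         # 960 x 540 per colour-difference plane

    print(luma, chroma)                        # 2,073,600 vs 518,400 samples
    print((luma + 2 * chroma) / (3 * luma))    # 0.5 : half the samples of full 4:4:4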

    This leads me to suspect that any grain addition happening at the mastering stage must only get applied as simple noise to the luminance, rather than being varied across the three colour records as it would be on film. There isn't much point trying to apply it across the three records in a more "film-like" manner, as it will likely get lost in the encode to a greater degree. It's possible that noise applied to the luminance channel of a video master is a bit too crude to nicely replicate film grain. Film grain gets less visible in brighter areas of the image (as that's pretty much how film fundamentally works), so if the "grain" is highly visible in brighter areas it has likely been cooked up deliberately at the mastering stage.

    Also, as I've said, adding noise is a good way to hide quantisation artifacts (posterisation), and I wonder if this practice, although innocuous at SD, is maybe being applied a little too heavy-handedly for 1080p mastering. Although a bit of noise might be less objectionable than posterisation.

    I can also think of extreme scenarios where removing grain prior to encoding to video and compressing is maybe going to give better results. Very blue, dark imagery could end up with lumpy, noisy artifacts otherwise, and predominantly blue imagery is not really going to look less sharp if you soften it further, as it's already soft.

    However, I would like to get across to people that film does not necessarily mean noisy and HD does not necessarily mean clean (I'd actually go for the reverse, but I digress). Take a look at Miami Vice, for example: very obvious noise, and shot digitally, but I'd also say it's a fantastic transfer to HD DVD. (I actually thought it was breathtaking, and I'm very difficult to impress.)
     
  21. dlpfan

    dlpfan
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    Big thank you to all those posting in this thread; it has been one of the most enjoyable to read in this forum.

    I thought it would be fun to return to one point, though (as Stephen Neal has tried to re-emphasise), which is the discussion about resolution, sharpness and detail. Unfortunately, I am not so familiar with the use of these terms in this thread, because it seems like they are being mixed together by different posters, which makes it a little confusing.
     
  22. Quickbeam

    Quickbeam
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    I would choose a film-shot feature over a video shot one any day; not because I dislike video, but because of the damage that gets done to video when people try to make it look like film. I have been very disappointed with the quality of all of the Genesis-shot stuff I have seen so far.

    The OP doesn't specify the display type, but I would question the ability of any current gen 1080p LCD to do justice to a 1080p source. It's not simply a question of the number of pixels, but the quality of those pixels. LCDs are particularly prone to emphasising grain especially in dark areas.
     
  23. Mr.D

    Mr.D
    Well-known Member

    Joined:
    Jul 14, 2000
    Messages:
    11,198
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    133
    Ratings:
    +1,241
    To be fair the Genesis isn't really HD video (it should be better... and is to be honest) but I agree with your point.
     
  24. Phil_Yeoman

    Phil_Yeoman
    Standard Member

    Joined:
    Nov 17, 2002
    Messages:
    185
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    18
    Location:
    Durham
    Ratings:
    +1
    I can definitely say that 1080p is better for watching HD, or upscaled DVD from a quality source. Both plasma and LCD introduce their own artifacts into the picture, and the higher the resolution, the less noticeable these become.

    I own a TH-65PF9 as well as a 50-inch Pioneer and a 42-inch Fujitsu, and with identical HD material the Panasonic without exception looks the best. For TV or poor-quality SD the Pioneer does the best job, as its poorer contrast actually becomes an advantage.

    Low bandwidth SKY on the Panasonic, OH DEAR.
     
  25. arfster

    arfster
    Active Member

    Joined:
    Jan 17, 2006
    Messages:
    1,459
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    63
    Ratings:
    +132
    Interesting, I hadn't seen that on my sets. Could it be different deinterlacing/scaling quality on yours? I use an HTPC as my source for everything (HD DVD/satellite/XviD/DVD), so it does all that before it hits the set.
     
  26. Phil_Yeoman

    Phil_Yeoman
    Standard Member

    Joined:
    Nov 17, 2002
    Messages:
    185
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    18
    Location:
    Durham
    Ratings:
    +1
    I feed them all at native resolution via the VP50, so all the panel has to do is output.
     
  27. Badass01

    Badass01
    Standard Member

    Joined:
    Dec 7, 2006
    Messages:
    46
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    8
    Ratings:
    +2
    Now this is very confusing.

    I'm looking to get the best out of my Xbox 360, and may also be getting a PlayStation 3.

    So what is the best thing for me to invest in? I was looking at the X range of Sony LCDs - the 40-inch one.

    Now you say that Xbox games are going to look worse on this display - but surely in future the newer games are going to be built for 1080p (Microsoft included)...

    Also, 1080p is definitely going to mean better-quality DVDs in comparison to my CRT, isn't it? I'm not too fussed about an HD DVD or Blu-ray source just yet. Let's say I will get a Blu-ray or HD DVD source in future: you are telling me that the newest thing on the market, 1080p, which is being pushed by the big manufacturers, is actually going to make things worse? Hard to fathom. Surely a lot of research has gone into this, and with time the amount of 1080p material will only grow, or so one would hope.
     
  28. arfster

    arfster
    Active Member

    Joined:
    Jan 17, 2006
    Messages:
    1,459
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    63
    Ratings:
    +132
    Depends on the set's scaler/deinterlacer. The CRT doesn't need to do either, and has much better contrast than most 1080p sets (especially LCDs).

    Generally, you buy an HDTV to display HD. The only other reason is to get a bigger size than a CRT can provide.
     
  29. bliss007

    bliss007
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    At least a real 1080p panel is 1920x1080; your so-called 720p/1080i (input) panels are 1366x768, so they don't show a 1280x720 image at 1:1 pixel mapping.

    I watched an Apple QuickTime trailer (you have to mess about to get one at full res; all are available as "1080" but not all are actually 1920x1080 or 1080p). It was tagged as 1920x1080p in the player's properties. Now, I class QuickTime 7 as a buggy beta, so I played it back in Nero ShowTime on my Sharp 1080p panel at 1920x1080p/60Hz (from a PC using DVI to HDMI) and the trailer looked sweet. It also looks sweet on my 22" CRT at 1920x1080 @ 85Hz, but obviously interlaced.
     
  30. arfster

    arfster
    Active Member

    Joined:
    Jan 17, 2006
    Messages:
    1,459
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    63
    Ratings:
    +132
    That's true, but what 720p source material is there outside the US? Everything is 1080i, so unless you have that resolution scaling is inevitable.
     
