Okay, this is a somewhat thorny topic, but it's one that gets asked quite a lot: for any given viewing distance, what is the optimum screen size? Or, conversely, for any given screen size, what is the optimum viewing distance?

My initial thoughts are that you should aim to have a pixel size (or rather, pixel separation) which is of the order of the resolving power of the human eye, but not greater than that. The resolving power of the eye is defined as follows: if you have two thin, bright, parallel lines on a dark background, how close together can the lines be before the eye sees them as a single bright line rather than two separate lines? This is defined in terms of an angle rather than a distance, because the distance between the lines in inches (or millimetres!) will depend on how far away they are: if they are (say) 1mm apart, you can easily separate them at a distance of 6 inches, but you probably can't at a distance of 100 yards.

My reasoning is that, if the separation between pixels exceeds this angle, a typical person will be able to make out the individual pixels as pixels rather than simply seeing a smooth image (which looks horrible), but that, if the pixel separation is significantly smaller than this angle, a typical eye cannot actually resolve any more detail, so the extra resolution is wasted. The resolving power of a typical eye is normally taken to be roughly 1 minute of arc, that's one sixtieth of a degree.
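For anyone who wants to check my sums, the geometry is simple enough to sketch in a few lines of Python. This is just my reasoning above turned into code, not any official standard; the 1-arcminute threshold and the 16:9 aspect ratio default are the assumptions:

```python
import math

def viewing_distance(diag_in, h_pixels, aspect=16/9, arcmin=1.0):
    """Distance (in feet) at which one pixel subtends `arcmin` minutes
    of arc - i.e. the closest you can sit before pixels become visible."""
    # Screen width from the diagonal and aspect ratio
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    # Horizontal pixel pitch
    pitch_in = width_in / h_pixels
    # Distance at which that pitch subtends the given angle
    dist_in = pitch_in / math.tan(math.radians(arcmin / 60))
    return dist_in / 12

print(viewing_distance(42, 1280))   # 42" 720p screen, roughly 8 feet
```

Plug in any of the screen sizes and resolutions below and you should get numbers close to the ones in my table (before rounding to the nearest half foot).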
If you plug this number in, here are some of the suggested viewing distances you get for different types of screen, rounded to the nearest half foot:

4:3 CRT TV, 4:3 PAL signal (these figures are a bit suspect - I think real CRT TV dot pitches may be too large for this analysis to be valid)
14" - 4ft
17" - 5ft
20" - 6ft
22" - 6.5ft
26" - 8ft

Widescreen CRT TV, 4:3 PAL signal
28" - 7ft
32" - 8ft
36" - 9ft

Widescreen CRT, anamorphic PAL signal
28" - 9ft
32" - 10.5ft
36" - 12ft

NTSC SD Plasma (852x480)
37" - 11ft
42" - 12.5ft

1024x768, 42" plasma
10ft

ALIS Plasma (42", 1024x1024)
10ft

1280x720 (720p)
37" - 7ft
42" - 8ft
45/46" - 9ft
50" - 10ft
55" - 10.5ft
60" - 11.5ft
62" - 12ft
65" - 12.5ft
67" - 13ft
70" - 13.5ft
72" - 14ft

1280x768 LCD/plasma
32" - 6ft
37" - 7ft
42" - 8ft
50" - 10ft

1366x768 LCD/plasma
32" - 6ft
37" - 7ft
42" - 8ft
50" - 9ft
55" - 10ft
60" - 11ft
65" - 12ft
71" - 13ft

1920x1080 display
45/46" - 6ft
50" - 6.5ft
55" - 7ft
65" - 8.5ft
70" - 9ft
85" - 11ft
100" - 13ft

One rather obvious question here is: what is the effect of scaling? Consider, for example, an anamorphic PAL signal scaled up to a 1920x1080 screen. If you regard the picture as actually being the resolution of the signal, you get an ideal viewing distance on a 50" screen of 17.3 feet, which is clearly way off! But if you base your calculation on the screen pixels (which gives you 6.5 feet), then you're saying that upscaled PAL looks just as good as a native 1920x1080 broadcast, which is also clearly wrong. So I'm not quite sure how to interpret that. This problem becomes more significant when you are dealing with CRT technology, or any sort of projector, particularly one where the pixel edges are "soft", i.e. the pixels tend to blend together at the edges rather than having hard, sharp boundaries.
The ability of (say) a CRT front-projector to provide close to infinite scaling in the horizontal direction (albeit not vertically) is certainly one reason why CRT makes upscaled DVDs look so good - but clearly it's still not going to look as good as native 1080i/p. Anyway, I hereby declare the discussion open, and hope some of the above numbers prove useful. The figures in the list above, incidentally, are based on the size of the screen pixels rather than the original image pixels, except in the case of CRT TVs, where I've used image pixels. I've also attached an Excel spreadsheet which will let you calculate some of your own values. Feel free to tell me where I've gone wrong with it!
It occurs to me that I should probably spell out the significance of the above numbers a bit more. Take one example: a 42", 720p screen has a suggested viewing distance of 8.5 feet. This means:

1) If you have a 42", 720p screen, you probably won't want to sit closer to it than 8.5 feet, otherwise you will be able to make out individual pixels, which looks bad.
2) If you have a 42", 720p screen fed with a 720p signal, and you sit significantly more than 8.5 feet away, you will be seeing less picture detail than you would at the suggested viewing distance.
3) If you're going to get a 720p screen and you propose to sit 8.5 feet from it, you probably won't want to buy a screen larger than 42", otherwise you'll be able to make out the pixels.
4) If you're going to get a 720p screen fed with a 720p signal and you propose to sit 8.5 feet from it, you probably won't want to buy a screen much smaller than 42", otherwise you will be able to see less of the picture detail than you could on a 42" screen.
5) If you're going to get a 42" screen and sit 8.5 feet from it, you will see more picture detail on a 720p screen, fed with a 720p signal, than you will if either the screen or the signal is lower res.
6) If you have a 42" screen and you sit 8.5 feet from it, and the screen and/or signal have a higher resolution than 1280x720, you will not be able to make out any more detail in the picture than you could with a 720p screen and picture. (So a 42", 1080p screen, fed with a 1080p signal, at a distance of 8.5 feet, wouldn't actually look any better than a 720p screen and signal of the same size at the same distance.)
Nick ... great post but you really should get out more ;-) ... but before you do, how about a version for projected images ... !! Jon
You might want to compare these with the viewing distance calculator at http://www.myhometheater.homestead.com/viewingdistancecalculator.html This uses a couple of standard metrics for calculating the optimum viewing distance based on a typical human eye with 20/20 vision being able to resolve detail as small as 1/60th of a degree of arc. SMPTE and THX also have some standard viewing distances in this calculator.
Being able to resolve detail as small as 1/60th of a degree is precisely the method I'm using. Those figures are valid for projected images - they're equally valid for any display where the pixels have sharp edges. In cases where the pixels don't have sharp edges, that's the same sort of problem as considering a scaled-up image: blurring the pixel edges has the same effect as scaling/interpolating to a higher resolution. Scaling up an image (or blurring the pixel edges) doesn't actually add any detail, of course. It reduces the objectionable effect of the picture being visibly pixellated, but it does this simply by making it more blurry, not by adding more information. I think it's fair to say that (for example) someone with 20/20 vision viewing from 8.5 feet is never going to be able to get more detail out of a 720p picture than they can on a 42" screen, regardless of whether the pixels are blurred or sharp. Blurring allows you to make the picture bigger, but it doesn't add detail.
Please excuse my thickness as I am struggling to get my head around everything. I am looking to get a new LCD, the Samsung LE37R41B, and I will be sat approx 10 feet from the screen. I am unlikely to be getting HD for a while, as I think it'll be too expensive to begin with, so I'll just be watching normal Sky and DVDs. I really want a bigger TV than the 32" I have at the moment - will the 37" be too big and therefore not look very good quality? Should I be looking at the 32" version?
That screen is 1366x768. For a viewing distance of 10 feet, a screen with that resolution has an optimum screen size of 55 inches. So 37 inches is far too small. That, of course, presupposes that you're viewing a hi-definition signal. With upscaled SD it becomes a bit more complicated. I've attached a PNG which gives you an idea of the effect of upscaling on a picture: you need to view it with your face quite close to the monitor screen - almost close enough to make out the pixels round the edges in the fourth line of text. And you need to view it FULL SIZE, not just look at the "attached thumbnail"!!! Imagine that the top line of text is an SD picture. If you make the screen twice as big with no scaling, it looks like the second line of text. If you make the screen twice as big with very very good scaling (better than you will ever actually get in the real world!) it will look like the third line of text. Genuine hi-definition might look a bit like the fourth line. (Probably a little better.) Clearly, line 4 looks better than any of the others, and line 3 looks better than line 2. But does line 3 look better than line 1? Even if you assume scaling is that good, by how much could you scale up line 1 before the effect becomes annoyingly blurry? Hard to say.
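If anyone wants to reproduce that 55-inch figure, it's just the inverse of the viewing-distance calculation - a sketch of my method, with the 1-arcminute threshold and 16:9 aspect ratio as the assumptions:

```python
import math

def optimum_diagonal(dist_ft, h_pixels, aspect=16/9, arcmin=1.0):
    """Largest diagonal (in inches) whose pixels still subtend no more
    than `arcmin` minutes of arc at the given viewing distance."""
    # Pixel pitch that subtends exactly `arcmin` at this distance
    pitch_in = dist_ft * 12 * math.tan(math.radians(arcmin / 60))
    # Screen width, then diagonal via the aspect ratio
    width_in = pitch_in * h_pixels
    return width_in * math.hypot(aspect, 1) / aspect

print(optimum_diagonal(10, 1366))   # roughly 55 inches
```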
All this talk of HD and scaling inspired me to do some slightly more realistic examples. I've done another PNG file which illustrates the differences between various amounts (and qualities) of scaling. But the stupid AVForums script refuses to let me attach it, because it's 700 pixels wide so you'll have to launch it in a separate window. CLICK HERE. We begin with part of a notional 1920x1080 signal, displayed as it would be on a 1080p screen of approximately 59" diagonal. You can then compare how the same image might look downscaled to an anamorphic PAL TV picture and displayed on a widescreen TV of approximately 32" diagonal; how the same DVD image would look on a 42", 1366x768 LCD TV upscaling badly, typically, and unrealistically well; and how the same LCD screen might look upscaling a 720p high definition signal, or downscaling a 1080p hi-def signal. Again, you need to view the image full size, and have your face really quite close to the screen - almost close enough to see the pixels in the largest sample text.
I got a Sammy 50" HD Ready DLP, and at 6 foot away it's fine. I too was dubious when I bought it, but once you have experienced TV of that magnitude you won't go back... Kurt
Mod comment: This provides good answers to a lot of standard questions, so it's going sticky. Keep watching, though; the sticky list is getting long, so I will rationalise it this week. Nick
I'm not sure those numbers are right. The human eye may be able to resolve 60 cycles per degree, but only in bright light conditions (i.e. outdoor light levels), which is not the way we view our TVs. If you view your TV in your living room, that would be deemed a relatively low ambient condition, which would reduce the eye's resolving power to something closer to 30 cycles per degree. "Highest detectable spatial frequency at high ambient light levels, 50-60 cpd; low ambient light levels, 20-30 cpd (cpd = cycles per degree of visual angle)" Source: http://www.psy.gla.ac.uk/~steve/courses/vision/numbers.html This should halve your viewing distance, basically. I worked out that if you sit more than 2 x screen widths from your TV/plasma/screen, then your eyes cannot tell the difference between 720p and 1080p.
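The 2-screen-widths claim is easy to check with a bit of trigonometry. A quick sketch (my own working, so treat it as illustrative - the threshold you compare against depends on which cpd figure you believe):

```python
import math

def arcmin_per_pixel(h_pixels, screen_widths):
    """Angle subtended by one pixel, in arcminutes, when viewed from a
    distance of `screen_widths` times the screen's width."""
    return math.degrees(math.atan(1 / (h_pixels * screen_widths))) * 60

# At 2 screen widths:
print(arcmin_per_pixel(1280, 2))   # 720p:  about 1.3 arcmin per pixel
print(arcmin_per_pixel(1920, 2))   # 1080p: about 0.9 arcmin per pixel
```

At 2 screen widths, both figures are under the roughly 2 arcminutes a 30 cpd eye can resolve, which is the basis of the claim; against the 1-arcminute (60 cpd) threshold used earlier in the thread, 720p pixels would still just be resolvable.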
That's really low light levels he's talking about - looking at scenery in moonlight, that sort of brightness. The TV picture itself is bright enough for the eye to be in bright-light mode even if the rest of the room isn't. The change-over is to do with the eye switching from cones to rods: if it did that when you were watching TV, you'd be watching in black and white. Yay! My first ever sticky thread.
No, you're wrong. 60 cpd is reserved for very bright daylight conditions and is the maximum level that the human eye (in peak health etc...) can resolve. At really low light levels, your eye drops down to 3 cpd, which is just about pitch black. At normal living-room light levels, that will drop to 30 or even 20 cpd.
I was thinking of getting the Samsung LE26R41B for a bedroom TV. From where it would be to where my eye is would be a bit less than 8ft - is that too far away to make out an HD picture? Also, am I right in thinking the further back one sits, the less obvious a poorer non-HD pic will look?
Well... yes, but wouldn't it be better to have less poor non-HD picture and sit closer? If I put that screen into my spreadsheet it suggests a viewing distance of about 5 feet (based on the size of the screen pixels) or 9 feet (based on the size of SD image pixels) - so probably okay for SD viewing, but you would get limited benefit from HD. Do audition it before buying, though - you should never buy any TV blind.
It's a tough one, no doubt. I ended up with a 32" LCD viewed from 8/9 feet, based on the fact that the majority of programmes will be in SD for a few years and this size gives me the best of both worlds. After watching my sister's 37" from the same distance and seeing how poor SD looked on it, I went for the smaller screen. HD still looks great, but obviously not as good as it would from 6 feet. My choice was all about compromise. I will definitely get a larger set in time (probably the time it takes me to persuade the missus )
I have a 32" plasma at 8 foot distance for SD material, and bean bags in front of the sofa for DVDs at 5.5 foot! Don't know what distance for HD, as I have never viewed it at home.
I've been reading this thread with interest, and the one at Sound and Vision Mag. I have a Philips 37" 1080 panel (37pf9830) and watch from around 10 feet. From the information gleaned from this thread and the other sources, it seems that it would be a waste of money getting Sky HD at this viewing distance? I do sit on the floor at about 5 feet to play Xbox, but I can't see my whole family sitting huddled at that distance to watch TV! Am I right in thinking I need a 76" screen at 10 feet to get maximum benefit or am I missing something?? Surely the majority of people who are going to get Sky HD have a 32"-42" screen and sit around 8-10ft away, making HD useless? Cheers Astro
Yes. Well, it's not quite that bad. 8 or 9 feet distance and a 42" screen is ideal for 720p hi-def, which is likely to form the bulk of Sky HD programming. For a 32" screen and a viewing distance of 10 or 11 feet, yes, you're right: there's no visible difference between SD and HD, resolution-wise. But there might still be gains with 720p in terms of fewer compression artefacts, fewer scaling and deinterlacing artefacts, and a faster frame-rate leading to smoother motion.
I've got a question, which is probably stupid and shows I've missed the point! Why is the optimal distance for a 1024*1024 42" panel further away than a 720p 42" panel? Surely the pixels are "smaller" on the 1024*1024, or is the fact that the pixels are not as "fat" (because there's 1280 of them) on a 720p panel the key factor?

And now for an off-topic rant! Is the choice of 1024*1024 purely down to ease of manufacture? Why is there now a 1024*1080 panel from Hitachi - will a 1080 signal really look better on 1024x1080 compared with 1024x1024... surely when their algorithms scale the picture in one axis the other will be impacted? Is producing a 720p 42" plasma with high contrast and brightness really that difficult!! Why are panels allowed to wear an HD badge when they've only got 1024*720 pixels? Is it because: a. 1280 v 1024 pixels doesn't make that much difference to PQ, or b. the people deciding on the criteria to wear this badge were manufacturers who knew it would be a while before they could build a smallish plasma with 1280x720 pixels?
I get the feeling that some of these numbers were plucked out of the air by the originators, and didn't come from actual experience with the panels themselves. The suggested viewing figures for some of these panels are ridiculous. I agree - higher res panels should be capable of being viewed from a closer distance, not further away!
IMHO, the ideal viewing distance is simply the one that is closest to the screen without you seeing any of the artefacts or errors in the display/source. Unfortunately this means different viewing distances depending on what you're playing, e.g. with HDTV content you can sit a lot closer than with SDTV playback.
I am getting a 42" LCD screen, resolution 1366x768, and will be sitting just under 15ft away. Is the screen too small?
From 15 feet, even at only 720p, you'd need a very big screen to see its full potential - 50 inch plus, maybe bigger.
What do you think the maximum sensible screen size is for viewing properly upscaled SD broadcasts at 8.5-9ft? I am tempted to go 46"/50" for full on HD movie viewing pleasure, and probably even prepared to shell out for 1080p to be sure of no visible pixelation (certainly at 50"), but as many have pointed out SD will be by far in the majority for TV for the whole life of this panel. No point in having **** SD for the next several years.
Hi, If I get a 26" widescreen, high def TV, but my viewing distance is about 10ft, is there any advantage at all from having the high definition? I very nearly went to buy the Samsung LE26R74BDX, but decided I should find out a bit more first. Would I be a much better off with the 32" version? If not, I'll stick with the 26", as there is no way I'd get an even bigger TV (I know, that must be close to sacrilege around here). So, will I not appreciate any difference at 10ft with 26" HD, and similarly 32", over SD? Thanks.
10ft viewing distance on a 26-inch set is too far for HD IMO and not worth the investment. Anything over 5ft at 720p on a 26-inch set and you'll struggle to see a difference.