
What is the effective resolution of High Definition?



What is the actual effective resolution of the High Definition formats 1080p, 1080i and 720p?

Not the format resolution but the actual visual detail resolution.

I know that for DVD the format resolution and the actual resolution are different, and partly display dependent. I have been reading some material that says the same is true for High Definition.

Is the following correct?

A good 35mm camera lens will have 80-90% MTF at 20-21 cycles (line pairs) per mm; on 24mm-wide film (Super 35mm excluding the soundtrack area) this gives about 1000 lines of resolution at high contrast. Because the image MTF is the product of the chain lens > film negative > interpositive (IP) from negative > internegative (IN) > print > projector lens, if every stage is at 90% the end-to-end MTF is 53%. The absolute resolution is higher, but the eye's perception of sharpness depends on contrast.
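As a sanity check, that 53% figure is just the six per-stage MTFs multiplied together. A quick Python sketch, using the 90%-per-stage assumption from the post:

```python
# Cascaded MTF: each optical/duplication stage multiplies the previous MTF.
# The stage list follows the chain given above; the 90% per-stage figure
# is the post's assumption at the chosen spatial frequency, not a measurement.
stages = ["camera lens", "film negative", "interpositive (IP)",
          "internegative (IN)", "release print", "projector lens"]

mtf = 1.0
for stage in stages:
    mtf *= 0.90  # assume 90% MTF contributed by this stage

print(f"End-to-end MTF after {len(stages)} stages: {mtf:.1%}")
# 0.9 ** 6 = 0.531441, i.e. the roughly 53% quoted above
```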

Maximum resolvable horizontal detail for normally telecined movies is typically 800 to 1100 lines, with 1300 at best.

4K telecines of movies, or 4K digital intermediate recordings, downconverted to 1920x1080, deliver the full 1920x1080 resolution.

High definition cameras following the SMPTE spec have a digital filter at 30MHz (872 line pairs = 1744). This is to reduce aliasing and banding interference due to the Nyquist limit on sampling. It has the effect of reducing the camera lens MTF to 75% at the filter frequency (872 line pairs = 1744) and 60% at the Nyquist limit of 37.125MHz (960 line pairs = 1920), giving the typical camera spec of 45% MTF in the center of the image at 800 line pairs = 1600.
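For anyone wanting to check the MHz-to-pixels conversion, here is a rough sketch using the SMPTE 274M active-line timing (74.25MHz luma sample clock, 1920 active samples per line). Note that it gives about 1552 pixels for the 30MHz filter rather than the 1744 quoted above, so the quoted figure presumably comes from a slightly different line-time assumption:

```python
SAMPLE_RATE_MHZ = 74.25  # SMPTE 274M luma sample clock for 1080-line HD
ACTIVE_SAMPLES = 1920    # active samples per line
ACTIVE_LINE_US = ACTIVE_SAMPLES / SAMPLE_RATE_MHZ  # ~25.86 us of active picture

def pixels_from_bandwidth(f_mhz):
    """Horizontal resolution (pixels) supported by an analogue bandwidth f_mhz.
    Each cycle (line pair) needs two pixels, so pixels = 2 * f * active line time."""
    return 2 * f_mhz * ACTIVE_LINE_US

print(pixels_from_bandwidth(37.125))  # Nyquist limit: exactly 1920 pixels
print(pixels_from_bandwidth(30.0))    # the 30 MHz filter: ~1552 pixels
```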

MTF 50% is the point at which the image is considered out of focus. But the circle of confusion in depth-of-field terms means that anything at or above 35% MTF looks in focus. MTF 5% is very blurry and is usually taken as the resolution limit, although a trained observer can distinguish objects down to 0.5% MTF.

Hollywood movies available in the D-Theater format (what is that?) are often mastered in 1080p, then converted to 1080i before duplication (what resolution?).

Does this mean old films re-released on Blu-ray have less resolution than new films, or do they have the same resolution if they are remastered?

HD TV live cameras operating in 1080i often use alternating two-row pixel grouping with 25% or more overlap to improve camera sensitivity and reduce diagonal jaggies and inter-frame flicker, but this also reduces vertical resolution.

Then the MPEG encoder may limit the horizontal resolution of 1080i to 1440 pixels to minimize artifacts. The BBC HD TV channel is 1440 resolution.

Progressive video is usually a sampled and filtered image; interlaced images are filtered more heavily in the vertical direction.
720p and 1080i formats are digitally sampled at ~74MHz. To prevent aliasing artifacts, filtering at less than half the sample frequency is applied. As a result the limiting resolution is ~1035 for 720p and ~1549 for 1080p.
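Working backwards from those two limits, both come out at roughly 81% of the format's pixel count, i.e. the anti-alias filter appears to pass about 81% of the Nyquist frequency. The 0.807 fraction in this sketch is inferred from the quoted numbers, not taken from any spec:

```python
# Inferred fraction of Nyquist passed by the anti-alias filter:
# 1549 / 1920 = 0.807, and 1035 / 1280 = 0.809, so ~0.807 fits both
# quoted limits to within a couple of pixels.
FILTER_FRACTION = 0.807

for name, width in [("720p", 1280), ("1080i/p", 1920)]:
    limit = width * FILTER_FRACTION
    print(f"{name}: {width} px -> limiting resolution ~{limit:.0f}")
```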

Whatever horizontal resolution reaches your MPEG decoder, the decoding process reduces it further by up to 20%, typically 15%.
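Chaining that decoder loss onto the encoder limit mentioned earlier gives a rough delivered-resolution budget. A sketch in Python; the 1440-pixel cap and 15% loss are the figures from the posts above, and treating the losses as simple multiplicative factors is my simplification:

```python
# Illustrative resolution budget: encoder cap, then typical decoder loss.
# Both numbers come from the thread; the multiplicative model is an assumption.
def delivered_pixels(source=1920, encoder_cap=1440, decoder_loss=0.15):
    after_encode = min(source, encoder_cap)      # e.g. 1440-limited broadcast
    return after_encode * (1 - decoder_loss)     # typical 15% decoder loss

print(int(delivered_pixels()))                   # 1224 for a 1440-limited channel
print(int(delivered_pixels(encoder_cap=1920)))   # 1632 for a full-width encode
```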

On moving objects, interlaced images drop to half vertical resolution unless a reverse-telecine technique is used.

Are all set top boxes about the same, or are there major differences, with some decoders reducing resolution or failing to reverse-telecine film sources?

Effect of display

The Kell factor reduces effective resolution: the image information being recorded and displayed must line up with the pixels for it to be shown. A series of pixels making up black and white lines one pixel wide can all be displayed only when they line up with the display's pixels; rotate the lines and detail is lost.

For progressive digital displays the Kell factor is about 90%, so a 1920 display resolution becomes 1728. With interlaced signals the Kell factor is 70% of the progressive figure, because the display needs to reduce the visibility of flickering, so a 1920 display resolution becomes about 1209.
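Those Kell-factor numbers are easy to reproduce. A sketch in Python; the 0.9 and 0.7 factors are the figures quoted above, not measured values:

```python
KELL_PROGRESSIVE = 0.90   # quoted Kell factor for progressive digital displays
INTERLACE_FACTOR = 0.70   # quoted extra loss from interlace flicker filtering

width = 1920
progressive = width * KELL_PROGRESSIVE          # 1920 * 0.9  = 1728
interlaced = progressive * INTERLACE_FACTOR     # 1728 * 0.7  = 1209.6

print(f"Progressive: {progressive:.0f}")        # 1728
print(f"Interlaced:  {int(interlaced)}")        # 1209 (truncated, as quoted)
```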

With a display refresh rate that is a multiple of 24fps, perceivable image detail improves because motion judder is removed. It can be further improved using image smoothing to produce less blurring than was present on the original film, the original blurring being due to the deliberate use of camera lens filters, selective focusing and object movement.

What displays use this smoothing technique, and does it have any negative side effects, such as making the image look unnatural?

Joe Fernand

Distinguished Member
AVForums Sponsor
Might be worth moving this to the Video Processor forum - the more esoteric boys can have a field day :)

And maybe nudge Forums member Stephen Neal re how they shoot what these days.



Distinguished Member
Not sure what all this means but... D-Theater (also known as D-VHS) was a 1080i consumer HD distribution format based on the VHS cassette, with Dolby Digital soundtracks. lscolman has some for sale in the classifieds, I believe...

HD masters prepared for BD encoding are most commonly supplied on Sony HDCAM-SR tapes which have a 1920x1080 resolution. These are most likely to be created straight from the Telecine or DI files.

TV production may rely on the cheaper HDCAM format which does have an on-tape resolution of 1440x1080.


Distinguished Member
Most of those techniques they're talking about with sampling/filtering etc sound about right. I don't think I care too much to be honest, there are good reasons why most of that is done (the example of filtering to reduce the risk of line twitter on interlaced 1080i being a good one)

As for image smoothing; I can't imagine why anyone would want to use something which removed blur that was deliberately part of the original film recording, and I can't think how it would even work.
