
Why is 768 lines supposed to be better at SD than 1080?

Majicthighs

Active Member
I keep seeing people say "of course a 768 line set is going to give a better SD picture than an HD set" and I don't understand why...

A PAL picture has 576 active lines, so a 768 line set has to add in an extra line for every 3 original lines (a scaling ratio of only 1.33), which is fairly coarse. Where would you put it? :confused:

On an HD set, with 1080 lines, there are nearly 2 lines for every original line of the broadcast (1.875, in fact), which I would have thought meant a better chance of fine-tuning the smoothing and getting a better picture.
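To make the ratios concrete, here is a rough Python sketch (assuming crude nearest-neighbour line duplication, which real scalers replace with proper filtering - purely an illustration) of how often each original line would get repeated:

def repeat_pattern(src, dst, show=12):
    # count how many output lines each source line is mapped to
    counts = [0] * src
    for out_line in range(dst):
        counts[out_line * src // dst] += 1   # nearest source line
    return counts[:show]

print(repeat_pattern(576, 768))    # [2, 1, 1, 2, 1, 1, ...] every 3rd line doubled
print(repeat_pattern(576, 1080))   # [2, 2, 2, 2, 2, 2, 2, 1, ...] 7 doubled, 1 not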

Why is it always said that the 768 line set will "of course" give the better picture?

Excuse the stupid question - but I just don't get it :(
 

Jonstone

Well-known Member
It is an issue of scaling. You also have to remember the other axis: a proper 1080 screen is 1920x1080, and it takes a lot of processing to translate a 720x576 interlaced signal to that resolution on the fly. Most televisions you can buy were built to a price, which didn't leave a lot of room for an expensive video scaler.
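As a very rough back-of-the-envelope sketch of the workload (the 9-tap filter size is purely an assumed figure for illustration, not any particular chip):

# assumed: separable polyphase scaler, 9 taps per axis, PAL frame rate
dst_w, dst_h, fps = 1920, 1080, 25
taps = 9

out_pixels = dst_w * dst_h * fps          # 51,840,000 output pixels per second
macs = out_pixels * taps * 2              # horizontal pass + vertical pass

print(f"~{macs / 1e9:.1f} billion multiply-adds per second")   # ~0.9

All of that has to happen in real time, which is why cost-cut scaler silicon cuts corners.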
 

Majicthighs

Active Member
Thanks for replying :)

Yes, I didn't mention horizontal, as everyone seems to talk only about vertical in these situations.

However, the same sort of situation applies - 1024 pixels across gives about 1.4 pixels for every 1 of the original 720, whereas 1920 gives about 2.7 - again a finer pattern.
Looking at it very roughly, this would mean that on a "lower resolution" set roughly every 2nd pixel is repeated across the width, but on a true HD set you'd have 3 pixels per original, with 1 of the 3 missed out every third set or so.
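The same crude nearest-neighbour sketch across the width shows both patterns (again just an illustration - real scalers filter rather than repeat pixels):

def repeats(src, dst, show=10):
    counts = [0] * src
    for x in range(dst):
        counts[x * src // dst] += 1   # nearest source pixel for each output pixel
    return counts[:show]

print(repeats(720, 1024))   # [2, 1, 2, 1, 2, 1, 1, 2, 1, 2] - uneven 1.42x
print(repeats(720, 1920))   # [3, 3, 2, 3, 3, 2, 3, 3, 2, 3] - regular 2.67x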

I appreciate there is a lot of math involved in upscaling, but I would have thought the more pixels there are to play with, the easier it is to get a good picture.

Just trying to understand :smashin:
 

toodeep

Well-known Member
More pixels give sharper, more visible scaling artifacts, whereas SD looks smoother spread over fewer lines? 1080 displays might scale SD better by cropping the picture to 540 lines and simply doubling them, but it seems they don't.
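The crop-and-double idea in a quick sketch (a hypothetical function, just to show that the doubling step needs no interpolation at all):

def crop_and_double(frame):                 # frame: a list of 576 scanlines
    crop = (576 - 540) // 2                 # drop 18 lines top and bottom
    cropped = frame[crop:576 - crop]        # 540 lines remain
    return [line for line in cropped for _ in (0, 1)]   # exactly 1080 lines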
 

Osakan

Active Member
Scaling systems are not designed simply to add extra pixels here or there. While this can work fine on uniform features such as gradients, other image features such as edges and patterns are more troublesome. The varying complexity of the image features and signal means that a variety of techniques are used to upscale each frame.

The fundamental problem is that, when upscaling, the image processor in the TV does not know what the picture should actually look like, so has to guess. If the TV knew what the picture "should" look like - where there should be sharp edges, where there should be gentle gradients, and most importantly where there should be additional visible surface textures and features - then the extra pixels in a 1080p display would help, but it doesn't, so all those extra pixels do is add more opportunities for "noise" to be introduced.

In the simplest sense, the closer the output screen resolution is to the input signal resolution, the less information has to be "made up" by the image processor.
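A deliberately simple sketch of that guessing (real scalers use fancier filters, but face the same problem): linear interpolation across a hard edge invents in-between values that were never in the source.

src = [0, 0, 0, 255, 255, 255]              # a sharp black-to-white edge

def linear_upscale(samples, factor):
    out = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        for k in range(factor):
            out.append(round(a + (b - a) * k / factor))
    out.append(samples[-1])
    return out

print(linear_upscale(src, 2))
# [0, 0, 0, 0, 0, 128, 255, 255, 255, 255, 255]
# the 128 is "made up" - the processor cannot know if the edge was sharp or soft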

(I know this is not a great explanation, but I hope it may help a little)
 

Bumtious

Banned
Actually that is a great explanation.:thumbsup:
 

kjt2004

Active Member

Indeed a very good explanation


Interpolation is tricky for the scaler within most HD sets (due to manufacturers having to cut costs), and will almost always be done better by a dedicated external scaler.

Think of it in simple terms: if the scaler is interpolating an SD picture to upscale it, and that picture has a number of dark/shaded pixels, then it simply reproduces these on a greater scale, making the whole effect darker/shadier than it should be.

Also, most people refer to the vertical lines on a display, as the human eye is more susceptible to these than to the horizontal ones; I can't remember exactly why - something to do with the way the human eye interpolates the information.

regards
 
