Have you checked your HDTV's geometry? A guide.


Novice Member
In this happy new world of HDTV, flat-panel displays, and 1:1 pixel mapping, you may assume that geometry errors are a thing of the past.

You'd be wrong. I've just moved to my first non-CRT display, and I've been disappointed to see more geometry errors than I ever have.

If you're not particularly picky about the shape of your pictures, I suggest you wander away now... :cool:

Otherwise, let's talk about picture shape.

What do you mean by "geometry error"?

People and things not being the right shape - too fat, or too thin. The most common error is getting 4:3 and 16:9 content confused - we've all seen this, and I'm going to assume that you've got at least this right, so people are roughly the correct shape, not 33% too fat.

If you're not bothered by that, or if you use a non-linear ("smart") stretch mode to avoid black bars, please leave now. The errors we're talking about are smaller than those you already accept.

Here I'm going to give a more advanced guide, where I'm talking about getting the last few minor errors sorted - making sure everything's exactly the right shape, doesn't vary depending on connection type, and you're not seeing any pointless cruft at the sides. Calibration, rather than basic set-up.

I'll throw in a little about overscan while we're at it.

First, start with HD geometry

The first thing to do with an HDTV is to check, and if necessary adjust, its HD geometry. In most cases, there will be no problem, as this is easy for manufacturers to get right.

You'll want a true HD test signal - not something upscaled from SD. I suggest the overscan test patterns on Blu-ray/HD DVD Digital Video Essentials, or the BBC HD testcard.

If you've got a 1920×1080 resolution display, life should be easy here - you should be able to get 1:1 pixel mapping with overscan off, and assuming your pixels are correctly square and your TV is actually 16:9, everything should be fine.

Regardless of your actual display resolution, the aim is to see as much of the test card as you can, while keeping the shape right. For the BBC testcard, the points of the white diamonds should touch the edges of a 16:9 screen. For the DVE overscan card, the white outer border should be visible all the way around.

Ideally, TVs should not overscan HD sources - everything should be visible all the way to the edge. If possible, turn off any overscan, or at least make adjustments to minimise it.

However, for normal TV, picture shape is more important than seeing every pixel at the edge - it matters more that the information in the middle is the right shape than that you see content right at the margins.

So check this, with a tape-measure or ruler. On both cards, the centre circle should be exactly circular, to as close as you can measure. On the BBC testcard, the background grid should be squares, so any measurement of, say, a 6x6 part of the grid, should be square. And if you can't see the edges of the picture due to overscan, it should at least be symmetrical - the same amount cropped top/bottom and left/right, so that the centre marking is in the centre. (In the BBC testcard, the cross on the blackboard precisely marks the centre).

If your screen is truly 16:9, there shouldn't be much of a problem - if the edges are in the right place, the circle will be the right shape. But check - is your TV really precisely 16:9? If not, see below.

But how do I adjust it?

Ah. This could be tricky. Many TVs have an "overscan" setting in the user menu, but nothing more, certainly not separate width/height scale settings. You may have to find your service menu. On my Panasonic V10 plasma, the width, height and position of every resolution on every input can be adjusted in the service menu. Fortunately, no adjustment was required for HD inputs. In particular 1080p inputs have 1:1 pixel mapping by default - perfect geometry.

If the geometry is just ever so slightly off in a 1:1 pixel mode because your pixels aren't quite square, it's probably not worth adjusting - 1:1 pixel mapping is of sufficient benefit to not lose it for a very small geometry tweak.

If you're using multiple TV inputs, particularly different connection types (both component and HDMI, say), check them all. And check both 720- and 1080-line, if you're going to use 720-line inputs. There may be a slight behaviour difference between native and non-native resolutions.

My TV isn't exactly 16:9!

Ah, my sympathies. This isn't actually that uncommon. Many smaller LCD TVs are actually narrower than 16:9 - 16:10 (1.6:1) is common, which is closer to 14:9 (1.56:1) than 16:9 (1.78:1). (There are also super-wide TVs like the 21:9 (2.33:1) model from Philips - I won't address those here).

On such not-quite 16:9 TVs, you will have to do one of three things:

  1. Slightly letterbox 16:9 pictures
  2. Crop the sides of 16:9 pictures
  3. Squeeze 16:9 pictures

Or some combination of the above. I would personally strongly favour the first two. If cropping, then having the diamonds touch top and bottom means losing 5% left and right. This is not too bad. Squeezing would defeat the aim of getting the stuff we see the right shape.
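
The 5% figure follows from the aspect ratios alone. A minimal sketch, using the 16:10 panel from the text as the example:

```python
# Sketch: fraction of a 16:9 picture cropped per side when it is
# zoomed to fill the height of a narrower panel (option 2 above).
from fractions import Fraction

def crop_per_side(content_aspect, panel_aspect):
    # When the picture height fills the panel, only
    # panel_aspect/content_aspect of the picture width fits on
    # screen; the rest is split equally between the two sides.
    visible = panel_aspect / content_aspect
    return (1 - visible) / 2

loss = crop_per_side(Fraction(16, 9), Fraction(16, 10))
print(f"{float(loss):.0%} lost from each side")  # 5% per side
```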

So we're done, right?

Sorry, grasshopper. We're moving on to the difficult bit: standard definition sources. This is where stuff is most likely to go wrong.

For this, we'll want SD testcards. I recommend the PAL Digital Video Essentials DVD (1.78:1 Aspect ratio pattern, title 19, chapter 12), or the BBC Test Card W (available on Freeview).

What's the problem with SD?

Look closely at the test cards. In particular, do you see the extra little triangles at the sides of the BBC W testcard? They're indicating excess content that shouldn't be seen. The DVE card is less clearly marked, but the labelling confirms the basic facts.

Both test cards are 720×576, but they are wider than 16:9 (1.78:1). They are roughly 16.4:9 or 1.82:1 (as indicated at the top right of the DVE card).

No they're not

Yes they are. Compare them to the HD cards above.

Note that the circles are still circular, and the heights the same, indicating no deformation. (There may be some slight inaccuracies - this is the best I could do with images of the testcards I could find online).

Explain this idiocy

720-pixel-wide sampling was the first form of digital video. Prior to this, video was analogue. It had 576 lines (you could count them), but there were no "pixels" horizontally - it was just an analogue waveform. All you could say was that the picture data on a line was specified as 52 microseconds long (with an error margin allowing it to be up to 52.3µs).

The first digital video standards specified that when digitising video, the waveform should be sampled at 13.5MHz, and 720 pixels should be stored. This meant that 53.33µs of video data was captured (720 / 13.5MHz). More than necessary, but this was done to ensure that nothing was lost at the edges due to slight timing misalignment - the picture data being a little earlier or later than expected.

Now, the shape of the picture is defined in the analogue domain. It's the 52µs-wide picture that is 4:3, or 16:9 if anamorphic. So a 720×576 digital image that captured 53.33µs of analogue signal is a bit wider than that. The actual signal only occupies 702 pixels (52µs × 13.5MHz). (Note also that the pixels aren't square, unlike HD video. 702/576 does not equal 4/3 or 16/9. That's not important here though).
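
A quick sketch of that arithmetic, using the constants given above:

```python
# Sketch of the SD sampling arithmetic described in the text.
SAMPLE_RATE = 13.5e6    # Hz - the 13.5 MHz sampling clock
STORED = 720            # samples stored per line
ACTIVE_LINE = 52e-6     # seconds of actual PAL picture per line

captured = STORED / SAMPLE_RATE          # ~53.33 us digitised
active_px = ACTIVE_LINE * SAMPLE_RATE    # 702 samples of picture

print(f"captured {captured * 1e6:.2f} us, active {active_px:.0f} px")
```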

Now, later digital SD standards have stuck with the 13.5MHz sampling rate, but varied a little in what they do with their width.

For example, DVDs primarily designated their width as 704 pixels (the closest "round" number above 702), or 352 (704/2) for low-resolution. That's all that's needed to store a 4:3 or 16:9 picture. But 720 was permitted for compatibility. In practice most DVDs, including DVE, use 720 resolution. So all those DVDs are wider than 16:9 (or 4:3).

The best DVB SD broadcasts are 720×576, or sometimes 544×576. A 544×576 image is stretched by 4/3 (since 720×3/4 = 540, and 544 is the next highest multiple of 16 - a "round" number as far as MPEG is concerned), meaning that it's still wider than 16:9. The middle 526-7 pixels should be visible. (One source I've found says 530 - I think that's an error, based on stretching 544 to 720, rather than 540 to 720).
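
The 526-7 versus 530 figures can be reproduced directly; a sketch of both calculations, using the numbers above:

```python
# Reproducing both figures for a 544-wide broadcast (numbers from
# the text above).
ACTIVE = 702                   # active samples in a full 720-wide line
author = ACTIVE * 540 / 720    # scale by 540/720: the text's 526-7
other = ACTIVE * 544 / 720     # scale by 544/720: the "530" figure,
                               # which the text argues is an error
print(author, other)           # 526.5 530.4
```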

And then there's HDMI, which uses 720×576 resolution for 576i/p images. I'll get onto that later.

So all this means...

When calibrating with an SD testcard, don't rely on making all the edges visible. If it's a 720×576 frame, it's wider than 16:9, and you're not supposed to see the left and right edges. If you are seeing all the edges, then:

  1. Picture content will be too thin. Everyone will be a bit skinny.
  2. If you have no overscan, you'll tend to see rough edges and/or thin black bars at the left and right, particularly on broadcast TV. There are often fewer than 720 pixels (or 53.3µs) of actual picture.

Calibrating SD

You'll want to check this for each input and/or signal type on the TV.

In the case of the BBC Testcard W, on a 16:9 TV without overscan, the points of the diamonds should touch the edge of the screen, the same as the BBC HD testcard. The little outer triangles should not be visible.

You want this:
(circle circular, diamonds touching edges)

not this:
(circle thin, edge triangles visible)

(But be aware that your Freeview box's rendering of the testcard may not be absolutely perfect - it relies on precision of its digital text rendering - it may be a pixel or two out of line.)

On the DVE geometry pattern, the marking is less clear, but it is sufficient. On a 16:9 TV without overscan, you should only just see the white border at the top/bottom, and just the tip of the longest of the 10 lines at the left/right edges. That means 9 pixels are being cropped at each side, which is correct. Like this:

Then, most importantly, with either card, check the geometry with a ruler as before. The potential error here is up to about a centimetre on the diameter of the centre circle, on a 42"-50" TV.

Also, unlike HDTV, a little overscan on SD inputs may not be a bad thing. Cruft right at the top/bottom and sides of the picture is not unusual. So you may want to overscan by just a few pixels on each side. But ensure that this is done symmetrically, and preserving overall picture shape.

If the shape's wrong, where do I adjust it...?

Ah, that's also tricky, thanks to upscalers. And thanks to a problem with the HDMI spec (more on that later). Let's go through the possibilities.

No external upscaler - feeding 576i or 576p to the TV

In this case, adjustment will be in the TV, probably through the service menu, as with HD. On my Panasonic V10 plasma, analogue 576i/p inputs had correct geometry. HDMI 576i/p inputs, and the internal DVB-T tuner, were too thin, displaying all 720 pixels.

This TV has independent service menu controls for every resolution and every input, so it is possible to stretch HDMI 576i/p without affecting the already-correct HD or analogue resolutions.

If you can't adjust SD separately from HD, you're going to have to fiddle a bit more. Maybe per-input setting is possible, and you can route all SD sources through one input, and all HD through another.

For example, maybe you could use SCART for all your SD sources, if the TV's SCART aspect is correct, but its 576i HDMI aspect is wrong, as long as the use of the SCART doesn't introduce any worse quality problems.

Integrated upscaler in DVD player or Freeview/satellite receiver

Here, you may be in trouble. If the device is outputting HD, then it is its responsibility to trim the edges off a 720x576 signal when upscaling to an HD resolution. 720p, 1080i and 1080p are all true 16:9, so only the central 702x576 16:9 region of the signal should be scaled into them.
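
As an illustration of what a correct upscaler should do internally - crop to the active region first, then scale that into the HD raster. The function and parameter names here are illustrative, not any real device's API:

```python
# Sketch of a correct SD->HD upscale, per the text above:
# trim 9 pixels from each side, then scale the remainder.
def active_region(stored_w=720, active_w=702, height=576):
    """(x, y, w, h) of the part of the frame that should be
    scaled into a true-16:9 HD raster such as 1920x1080."""
    x = (stored_w - active_w) // 2   # 9 pixels trimmed each side
    return (x, 0, active_w, height)

print(active_region())  # (9, 0, 702, 576)
```

With a tool like ffmpeg, the equivalent would be something like a crop filter (crop=702:576:9:0) followed by a scale to the HD frame size.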

If your device gets it wrong, the best hope is that it has a manual scaling option, or maybe a specific "pixel cropping" option. Otherwise, the next option may be to feed this device into a different input of the TV, if the TV has per-input adjustment, and configure that input to (incorrectly) stretch 1080p, say, to counteract the error in this device. Don't adjust an input that will also be used for correct 1080p.

Alternatively, avoid using the device's upscaler at all. Is it really giving you better quality than just feeding 576i/576p to the TV?

My belief is that probably the majority of kit gets this wrong. Why? Because reviewers unaware of the aspect issue are more likely to notice pixels being cropped on test cards and mark them down, than to notice the resulting geometry error from squeezing the picture to get all the pixels in.

Various digital TV standards are very explicit on this - they say that the cropping to 702 must be done when upscaling to HD resolutions.

I've got a separate post in the DVD forum looking for DVD players that get this right.

As a further note, such integrated upscalers tend to be full-frame, hence implicitly have no overscan. Combined with the desire for an HDTV to have no overscan on HD inputs, this does tend to rule out any overscan when using such a source. Probably not a problem for DVDs, which tend to be "cleaner" at the edges, but could be a little annoying for broadcast TV. Not a lot you can do about it, unless you can control overscan on a per-input basis on the TV.

Separate upscaler, eg in an A/V receiver or video processor

If this is going wrong, there are a couple of options. First, if it does analogue->HDMI upconversion, try an analogue input instead. It may get the geometry right for analogue but wrong for HDMI, as the designers will have had less of an urge to "get all 720 pixels" into the upscaler.

Secondly, a good separate upscaler may have per-input scaling adjustments.

Any more horror stories?

Just one - the HDMI spec is apparently wrong. It says (or rather its underlying CEA-861-E spec says) that its 13.5MHz 720x576 mode is 16:9 or 4:3, exactly. So a display device or upscaler is arguably within its rights to upscale a 720x576 HDMI 16:9 input to fill a 16:9 frame.

But the problem with that is that any cross-conversion between analogue or SDI and HDMI would be greatly complicated, and pretty much every SD HDMI source in the world outputting 576i/p would be wrong. If the HDMI spec were taken at its word, they would have to take the wider-than-16:9 720x576 picture off a DVD, or off a TV broadcast, crop 18 pixels off the sides, and scale the remaining 702 to fill the 720 of the HDMI signal. No-one does this. They output the same 13.5MHz data over HDMI as they do over their analogue outputs - a direct pixel mapping of the 13.5MHz data of the source.

This means that in practice, a 720x576 HDMI signal is the same shape as a 720x576 SDI signal or a 720x576 DVD frame, or a DVB 720x576 frame. Slightly wider than 16:9. Despite what the spec says.

The HDMI/CEA-861-E spec appears just to have made an error here. There's no suggestion of any deliberate intent to differ from the underlying and preceding 13.5 MHz standards. And a clarification note in one place mentions adding black borders to a 704x576 frame when outputting over HDMI, suggesting that the intent was that the pixels, and hence the aspect ratio, be the same.

So there is a school of thought, from slavish following of the HDMI spec, that says if you're outputting 576i/p over HDMI, and it's too thin, it's your source's fault for not cropping and stretching it horizontally. I can't go along with that.

I'd rather see all the 720x576 pixels

Fair enough. But just be aware that the price you pay is standard SD content being 2-3% too thin, and frequent left/right cruft.
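
That 2-3% figure is just the ratio of active to stored samples:

```python
# The "2-3% too thin" quantified: if all 720 stored samples are
# squeezed into the 16:9 frame width, the 702 active samples only
# occupy 702/720 of it.
thin = 1 - 702 / 720
print(f"{thin:.1%} too thin")  # 2.5%
```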

But on the other hand, I'm sure a not insignificant minority of content is incorrectly mastered believing that 720x576 is 16:9 or 4:3. You would be compensating for the error in that content - correctly set up systems would show it 2-3% too wide.

What about 480i/p?

The issue's basically the same, but the underlying specs are a bit woollier. General consensus is that a 4:3 or 16:9 image is 704x480, so 8 pixels should be cropped from each side of a 720x480 frame.
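
A sketch comparing the per-side trims for both line standards (numbers from the text; as noted above, 704 for 480-line is general consensus rather than a single authoritative spec):

```python
# Per-side trim for both SD line standards, as described above.
def trim_per_side(stored=720, active=702):
    return (stored - active) // 2

print(trim_per_side(active=702))  # 576-line: 9 pixels each side
print(trim_per_side(active=704))  # 480-line: 8 pixels each side
```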

If you're going to be using 480i/p, go back and check, using the NTSC version of Digital Video Essentials.

What about 4:3 content?

With luck, you shouldn't have to do any separate calibration. Assuming you use pillarboxing, it should be shown at 3/4 of whatever width you set up for 16:9, and hence be correct. When investigating, you can no longer rely on the markers at the edge - the TV may well be blanking part of the left/right edges itself as a virtual "overscan" for pillarboxed content. You'll have to check geometry with a ruler. If you don't have a 4:3 test card, use a 16:9 one, put the TV into 4:3 mode manually, and check the circles' and squares' widths are exactly 3/4 of their height.
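
The 3/4 factor follows from the two aspect ratios; a minimal sketch:

```python
# Sketch of the 4:3-mode check: a circle on a 16:9 test card,
# shown in 4:3 mode, is squeezed horizontally by exactly this
# factor, so its width should measure 3/4 of its height.
from fractions import Fraction

squeeze = Fraction(4, 3) / Fraction(16, 9)
print(squeeze)  # 3/4
```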


Novice Member
No problem with the typing - I like a good technical rant. :smashin: It's the damned images that were a pain...


Novice Member
Great technical thread KMO, I now understand what the hell Humax are doing with their output from my FOXSAT HDR!:rotfl:
Incidentally, I have a Panasonic TX-P42S10 Viera TV.

On my Panasonic DSB-50 I could turn overscan off on my RGB SCART and it would still fill the screen and look sharper.
I can't do this on the Humax; there are blank edges on SD output left and right, or left and bottom.
The FreeSat Guide page via RGB SCART unfortunately crops with overscan engaged!:mad:

With most BBC HD output, overscan can be disengaged, but some content still has blank areas left, right, or bottom - annoying! I don't want uneven phosphor wear.:(

Whether it would be a good idea to reposition/centre non-overscanned SD content, I'm not too sure - rather reluctant to dive into the service menu!:eek:

Does the overscan correct the geometry issues on the DVB-T tuner, inside your Panasonic V10?
How noticeable is the distortion?

When using HDMI for SD on my Humax, I run at 1080i; it looks quite bad IMHO at 576i, 576p and 720p from this receiver. Any thoughts as to why this should be?

Incidentally, and perversely, switching HDCP default to disabled actually sharpens up HDMI and SCART output on this box!:eek: Yes, I know it shouldn't do that! ;)


Novice Member
Haven't actually checked what overscan does to the geometry, as I use THX mode for everything. I'd imagine it doesn't affect it - if it's 2% thin without overscan, it'll still be 2% thin with overscan.


Novice Member
Many thanks KMO for this excellent info. Looks like the missus will be putting up with an evening of test card fun! :thumbsup:


Novice Member
In practice most DVDs, including DVE, use 720 resolution. So all those DVDs are wider than 16:9 (or 4:3).
Are they really? My recollection of my anamorphic 2.xx:1 DVDs (with top/bottom black bars and where you would *not* want any image in the horizontal overscan area) is that the image uses all 720 horizontal pixels. Are DVDs typically encoded incorrectly (i.e. 720 pixels = 16:9) or do I need to take a closer look?

Presumably, when the BBC shows movies in SD, it correctly displays 702x576 pixels?


Well, displaying 576x720 across the screen rather than the correct 576x702, the Digital Video Essentials circle is more misshapen than the THX Optimizer circle. So as far as how a DVD is supposed to be mastered versus how it has actually been mastered - who knows. It would not surprise me if it varied film to film depending on who did the mastering.

In theory, for NTSC, a video image plus blanking intervals is 525 lines running at a frame rate of 30,000/1001 frames per second. A 13.5 MHz digital master clock divided by 525 lines divided by (30,000/1001) = 858 digital samples per line of video. 858 - 138 (blanking interval and timing information between each line) = 720 picture samples. This is good, as MPEG encoding requires a multiple of 16 (progressive) or 32 (interlaced). But this is the production aperture; in analogue, details at the edges can suffer ringing due to overshooting voltage. So a safety margin is built in: the damaged pixels at left and right are cropped off to give a clean-aperture display picture of 702 samples.
Since analogue NTSC is 486 lines, to enable MPEG encoding (due to MPEG blocks) it is truncated to 480 lines. An analogue NTSC line is 52.66µs; at a digital pixel clock of 13.5MHz this would give 13.5 x 52.66 = 710.91 samples, but since the image is truncated by 6 lines it is (710.91/486) x 480 = 702.1333, rounded down to 702 samples of display picture.

For analogue PAL, an entire frame (video image plus blanking intervals) contains 625 lines running at a frame rate of 25fps. When mastered for DVD, a 13.5 MHz digital clock is used: 13,500,000 divided by 625 divided by 25 = 864 samples per line of video. 864 - 144 (blanking interval and timing information between each line) = 720 picture samples. But again, this is the production aperture. The display picture, at a master digital pixel clock of 13.5MHz and a PAL line of 52µs, is 13.5 x 52 = 702 picture samples.
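
The samples-per-line sums above are easy to verify; a quick sketch:

```python
# A quick check of the samples-per-line arithmetic quoted above.
CLOCK = 13_500_000                          # 13.5 MHz master clock

ntsc_total = CLOCK / 525 / (30_000 / 1001)  # samples per NTSC line
pal_total = CLOCK / 625 / 25                # samples per PAL line
pal_active = 13.5 * 52                      # PAL display samples

print(round(ntsc_total), round(pal_total), pal_active)  # 858 864 702.0
# In both cases, total minus blanking gives 720 stored samples:
assert round(ntsc_total) - 138 == 720
assert round(pal_total) - 144 == 720
```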

NTSC digital TV, to save bandwidth/bit-rate, typically uses 704 instead of the 720 used by DVD, although all of the following are standard-definition digital NTSC transmission resolutions: 480x720, 480x704, 480x640, 480x544, 480x528, 480x480, 480x352.

PAL digital TV is anywhere from full-D1 resolution (576ix720), to cropped-D1 (576ix704), to sub-sampled D1 (576ix544 or 576ix528).

For digital TV picture quality, when bit-rate is limited it is generally better to pre-smooth and lower the resolution, rather than maintain a higher encoded resolution and suffer many more MPEG encoding artifacts.

DVD films, as opposed to test discs, typically have less detail resolution than their encoded 720x480 or 720x576 frame. DVDs are sampled at 6.75MHz (13.5MHz master clock), but the A/D converter is normally preceded by a low-pass (anti-aliasing) filter with a specified cut-off frequency of 5.75MHz, which effectively removes all detail above 6MHz - approximately equal to a detail resolution of 640 (NTSC converted to 4:3 square pixels is 480x640).
Since DVD is encoded interlaced and expected to be displayed on a CRT interlaced display, it is also pre-smoothed vertically to prevent interlace line twitter, so maximum-contrast single-line-height detail is also not present.

For production using digital displays with square pixels the following are commonly used.
PAL square pixel production image is 788x576, display picture 768x576. Widescreen it is 1050x576 production, 1024x576 display.
NTSC square pixel production image is 720x534, display picture 712x534. Widescreen it is 872x486 production, 864x486 display.


Novice Member
Great post; still re-reading it :) Can I clarify: The correct display picture size is 702x576 for PAL DVD and 702x480 NTSC DVD?

Do you feel that the correct display picture is generally used for DVD?


Great post; still re-reading it :) Can I clarify: The correct display picture size is 702x576 for PAL DVD and 702x480 NTSC DVD?

Do you feel that the correct display picture is generally used for DVD?
In theory I believe 702 is correct, because of the 13.5MHz digital master clock used by the video ADC during mastering and by the DVD player's video DAC during playback, and the time an analogue CRT display has to draw each line.

But in practice I have no idea.

I think all DVD players output 576x720, not 576x702, and all fixed-pixel displays will display 576x720, not 576x702, both with no option to display 576x702 - so it is a non-issue unless you have a standalone video scaler with more user controls.

As far as how DVDs are mastered, it seems to vary: some have image all the way out to 720, while others have narrow black bars to the left and right if displayed with no overscan. I would expect any slight geometric distortion to be really minor - not noticeable unless you are doing a side-by-side comparison or getting out a tape measure.

Oddly, European digital satellite receivers seem to vary, some outputting 576x702, others 576x720. But if upscaling to HD and working correctly to the standards, they scale only the central 702 up to HD.


Well-known Member
I have only just become aware of this revived thread, but still - thanks KMO. It will take some reading, so I'll print it off and read at leisure.

Glad to see use of the BBC HD testcard - I use this for all applicable aspects of set-up, following the BBC guidelines. It's amazing what it tells you.


Well-known Member
Before I moved home I had a VHS recording of the test card.
Loved my historical geeky tape. Lost it in the move. :(
Great to see I'm not the only test card fan! :smashin:


Novice Member
Excellent post, but I'm not going to read any of it as I have enough OCD issues as it is, with blurring, black levels and input lag to name a few:rotfl:
