HDTV in Doors, Sunday Times

quig

Will HD squeeze out Freeview?
http://www.timesonline.co.uk/printFriendly/0,,2-535-1491011,00.html

Sky, the satellite service, says that it will start to introduce an unspecified number of HD broadcasts next year, and intends to offer “standard” HD, with a resolution of 720 lines per picture (compared with today’s 625), and a “luxury” version with 1,008 lines.
Perhaps someone here would like to correct this. If technology journalists don't understand the formats, what chance do the public have?
 
I've lost count of the number of times that I've seen "technology" journalists confuse 625 and 576, 525 and 480, i.e. the total number of scan lines with the active number.

I've also given up counting the number of times 1125 is described as superior to 1080 because it has "more lines"... Aaagh - 1080 IS 1125 (1080 active lines within an 1125-line system) - though 1030-1050 line active formats have also used 1125 total lines (the Japanese HiVision and MUSE systems).
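
To make the total-versus-active distinction concrete, here's a quick Python sketch (my own illustration, nothing more - the figures are just the standard total/active line counts for each system):

SCAN_LINES = {
    # system: (total scan lines per frame, active picture lines)
    "625-line SD (PAL/SECAM)": (625, 576),
    "525-line SD (NTSC)": (525, 480),
    "1125-line HD (e.g. 1080i)": (1125, 1080),
}

for system, (total, active) in SCAN_LINES.items():
    print(f"{system}: {total} total scan lines = {active} active + {total - active} blanking")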

What gets worse is when they confuse scan lines with "horizontal resolution" which is also (annoyingly) often quoted in "TV lines".

And I've lost track of the number of times I've read that VHS recorders can't record "the full 625 lines" of a broadcast signal, and instead reduce this to 200 to squeeze it onto the tape, whereas S-VHS recorders only squeeze it to 300 so offer a better picture.

Aaaagh!!

God help us when they try to explain interlace vs progressive, and understand the differences between 1080/25p, 1080/50i and 1080/50p, or 720/50p, or 576/50i or 576/50p... Then they have to understand and explain the difference between interlaced acquisition, interlaced broadcasting and interlaced displays, and progressive acquisition, broadcasting and display, and all the combinations thereof.
 
The author's e-mail address is at the bottom of the on-line page.
 
I already wrote to him to correct some of his mistakes. For goodness' sake, he's supposed to be the one educating us, not the other way around. How did he get a job at the Times?
 
Sky, the satellite service, says that it will start to introduce an unspecified number of HD broadcasts next year, and intends to offer “standard” HD, with a resolution of 720 lines per picture (compared with today’s 625), and a “luxury” version with 1,008 lines. For Sky, broadcasting from a satellite means bandwidth is cheap and available.

Ignoring the fact that 1008 is probably 1080i, surely $ky wouldn't bring out two versions of HD, would they? Apart from the fact that many, myself included, believe 720p is superior to 1080i, it would just be a bodge.
 
taffyboyo said:
Ignoring the fact that 1008 is probably 1080i, surely $ky wouldn't bring out two versions of HD, would they? Apart from the fact that many, myself included, believe 720p is superior to 1080i, it would just be a bodge.

How have you compared the two?

Most of the 1080 programming I've seen is obviously superior to 720 if one is simply talking about visible resolution. The technical benefits of progressive over interlaced are not enough, in my opinion, to overcome the differences in resolution between 720p and 1080i that the eyes can see.
 
taffyboyo said:
Ignoring the fact that 1008 is probably 1080i, surely $ky wouldn't bring out two versions of HD, would they? Apart from the fact that many, myself included, believe 720p is superior to 1080i, it would just be a bodge.




I expect they will broadcast in whichever format suits the material they are screening, or the format they are supplied with, although they could convert either format to the other if they chose.
Regardless of the format broadcast, the STB or HD panel can manipulate the resolution to the viewer's wishes (with, I expect, varying results).
 
Got a very prompt reply from the author. He is as annoyed as we are. The 625 vs 720 vs 1008 numbers were inserted by someone downstream of him in the production process (I guess a sub-editor or similar) - they weren't part of his original article.
 
Why are Sky supporting 720/50p and 1080/50i in their receivers?

Well they have to cater for all broadcasters who use their platform, not just Sky themselves. After all there are loads of non-Sky channels on their SD platform - the BBC, ITV, C4, Five, Discovery, UKTV, Viacom/MTV etc. and they may all have reasons for using either or both formats.

I suspect they realise that they can't select a single standard and force other broadcasters to use it - so they've had to include both European HD standards.

Compatibility with both formats also allows them to defer a decision, cope with some future technology developments, or change their minds!
 
MPK said:
well, it is a general misconception that 1080i is better than 720p. have a look at this:
http://alvyray.com/DigitalTV/Naming_Proposal.htm

Hmmm - that site glosses over the major advantages of 1080i though.

1080i is usually broadcast with 1920 (or in some areas 1440) horizontal samples compared to 720p's 1280.

1080i is going to be sharper horizontally...

1080i is NOT 540p. Whilst it is generally viewed that 1080i delivers approximately 70-75% of the resolution of a 1080p system - that is still way more than a 540p system.
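
To put rough numbers on that, here's a back-of-envelope Python sketch (my own, not a measurement - the ~70-75% interlace factor is the figure quoted above):

formats = {
    # name: (horizontal samples, active lines, effective vertical factor)
    "720p":  (1280, 720,  1.0),    # progressive: full vertical resolution
    "1080i": (1920, 1080, 0.725),  # the ~70-75% interlace factor mentioned above
}

for name, (h, v, factor) in formats.items():
    print(f"{name}: {h} horizontal samples, ~{v * factor:.0f} effective lines, "
          f"{h * v / 1e6:.2f} million samples per frame")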

The graphics on that site, whilst seeming to explain the difference between interlace and progressive, grossly misrepresent the two standards by using groups of lines, rather than single lines, to represent interlaced scanning. It makes the artefacts look much bigger than they are.

The author also glibly ignores the brain's persistence of vision, and also the widespread use of de-interlacing.

We've had interlaced scanning since 405 launched in 1936 - we don't see the field based line structure as you'd imagine from that article.

Sure, progressive has advantages over interlaced - especially with progressive displays. Sure, 1080i vs 720p isn't simply a numbers comparison game. Neither is it clear that 720 or 1080 is better - it depends on the source material, distribution scheme and display system, not just the source format.

1080i may well look better for film sourced material than 720p. 720p may well look better for sports.

1080i may look better on a native interlaced display than de-interlaced to progressive.

You can't make a single "720p is better than 1080i" or "1080i is better than 720p" argument.
 
You do all realise of course that HD testing is already taking place and available on your local friendly old Sky analogue satellite? 19 deg east I think... Of course you do need an HD Ready screen and an HD receiver... but I can assure you testing is up and running... Mostly rubbish German TV but it is there... :lesson:
 
I fully agree. Didn't say that 720p was better, just that it's a misconception that 1080i is automatically better than 720p because of the higher number of lines. Considering that 1080i will have to be rescaled a lot more than 720p to fit the 1366x768 (or 1280x768) resolution of most displays, this can deteriorate the PQ as well.
 
Marvinius said:
You do all realise of course that HD testing is already taking place and available on your local friendly old Sky analogue satellite? 19 deg east I think... Of course you do need an HD Ready screen and an HD receiver... but I can assure you testing is up and running... Mostly rubbish German TV but it is there... :lesson:

Yep, and it has been for over a year, though this is MPEG2 HD using DVB-S - and even those running the services currently on Astra have agreed that MPEG4 will be the future. (And probably DVB-S2)

Euro1080 have announced they will be simulcasting HD-1 in MPEG2 and MPEG4.
 
MPK said:
I fully agree. Didn't say that 720p was better, just that it's a misconception that 1080i is automatically better than 720p because of the higher number of lines. Considering that 1080i will have to be rescaled a lot more than 720p to fit the 1366x768 (or 1280x768) resolution of most displays, this can deteriorate the PQ as well.

Yep - though scaling 720 to 768 is likely to be quite a nasty process, as scaling between similar resolutions is always pretty difficult to do well. Much easier to scale by bigger factors.
 
Stephen Neal said:
Why are Sky supporting 720/50p and 1080/50i in their receivers?

Well they have to cater for all broadcasters who use their platform, not just Sky themselves. After all there are loads of non-Sky channels on their SD platform - the BBC, ITV, C4, Five, Discovery, UKTV, Viacom/MTV etc. and they may all have reasons for using either or both formats.

I suspect they realise that they can't select a single standard and force other broadcasters to use it - so they've had to include both European HD standards.

Compatibility with both formats also allows them to defer a decision, cope with some future technology developments, or change their minds!
Sorry I'm new to all this HD technology so excuse my ignorance :rolleyes: but does it matter what TV/display you get if Sky are going to support both formats?
 
As long as you have a display that supports the proper resolution (1920x1080 or 1366x768) and has the proper HDCP (copy protection) HDMI or DVI-D digital connections, then you should be fine. Component will also be included in any new TV set, but it might never get used, so it doesn't really matter.

Just two things are important: the resolution and the proper connections.
 
Stephen Neal said:
Hmmm - that site glosses over the major advantages of 1080i though.

1080i is usually broadcast with 1920 (or in some areas 1440) horizontal samples compared to 720p's 1280.

1080i is going to be sharper horizontally...

1080i is NOT 540p. Whilst it is generally viewed that 1080i delivers approximately 70-75% of the resolution of a 1080p system - that is still way more than a 540p system.

The graphics on that site, whilst seeming to explain the difference between interlace and progressive, grossly misrepresent the two standards by using groups of lines, rather than single lines, to represent interlaced scanning. It makes the artefacts look much bigger than they are.

The author also glibly ignores the brain's persistence of vision, and also the widespread use of de-interlacing.

We've had interlaced scanning since 405 launched in 1936 - we don't see the field based line structure as you'd imagine from that article.

Sure, progressive has advantages over interlaced - especially with progressive displays. Sure, 1080i vs 720p isn't simply a numbers comparison game. Neither is it clear that 720 or 1080 is better - it depends on the source material, distribution scheme and display system, not just the source format.

1080i may well look better for film sourced material than 720p. 720p may well look better for sports.

1080i may look better on a native interlaced display than de-interlaced to progressive.

You can't make a single "720p is better than 1080i" or "1080i is better than 720p" argument.

Stephen is correct. Ultimately, it's all about what the eyes see.

My experience watching HDTV is that 1080 looks quite a bit better than 720p, even with sports (CBS compared to FOX, for example). While it is not simple to make comparisons of the two on similar or the same TVs, I can also say that even on progressive displays, plasma and LCD, 1080 programming invariably looks noticeably better. I'm no expert on video but the eyes don't lie. Friends who have HDTVs agree.

And yes, I have 20/20 vision :)
 
MPK said:
I fully agree. Didn't say that 720p was better, just that it's a misconception that 1080i is automatically better than 720p because of the higher number of lines. Considering that 1080i will have to be rescaled a lot more than 720p to fit the 1366x768 (or 1280x768) resolution of most displays, this can deteriorate the PQ as well.

I don't know how scaling works for video, but for still photography downsizing isn't a concern, as the information is already there; the same is not true for smaller images resized up.
 
Interlacing a TV signal has a single purpose: Bandwidth compression

While this method might have been a viable one 60-70 years ago, nowadays we have powerful computers doing much more intelligent work in compressing signals (e.g. MPEG-4 AVC).

Hence no need for interlacing whatsoever ! :suicide:
 
joys_R_us said:
Interlacing a TV signal has a single purpose: Bandwidth compression....

Interlacing was originally used to enable pictures sent at 25 frames per second to be displayed with less flicker, by sending two half-frames (fields) every 50th of a second.

Interlacing is used on ALiS panels in some plasma TVs, because it reduces the number of connections that have to go to the individual pixels, making the space between the pixels smaller (i.e. making the pixels larger and brighter).
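
For what it's worth, here's a minimal Python/NumPy sketch (my own illustration, using a dummy frame) of what "two half-frames" means in practice - a 576-line frame split into a field of odd lines and a field of even lines, sent alternately 50 times a second:

import numpy as np

frame = np.arange(576 * 720).reshape(576, 720)  # dummy 576x720 luma frame

top_field = frame[0::2, :]     # lines 0, 2, 4, ... (288 lines), sent in one 1/50th of a second
bottom_field = frame[1::2, :]  # lines 1, 3, 5, ... (288 lines), sent in the next 1/50th

# A naive "weave" deinterlace just slots the two fields back together -
# which is only correct when nothing has moved between the two fields.
rebuilt = np.empty_like(frame)
rebuilt[0::2, :] = top_field
rebuilt[1::2, :] = bottom_field
assert (rebuilt == frame).all()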
 
joys_R_us said:
Interlacing a TV signal has a single purpose: Bandwidth compression

While this method might have been a viable one 60-70 years ago, nowadays we have powerful computers doing much more intelligent work in compressing signals (e.g. MPEG-4 AVC).

Hence no need for interlacing whatsoever ! :suicide:

Whilst what you say is true for transmission - it isn't quite the same for production - i.e. how you actually make the programmes.

Broadcast video doesn't use compression "in studio" to link items of equipment (as compression introduces both quality loss and time delays, neither of which is desirable). Small amounts of compression are used in VTRs and disc servers (but these are only in the region of 2:1 to 20:1 or so - not the 100+:1 that is used in transmission).

Whilst it may well be quite easy to broadcast 1080/50p at an acceptable quality, using the latest transmission codecs, it is still a non-trivial exercise to actually produce real-TV in this format. HD-SDI is the standard connection system for broadcast cameras, mixers, VTRs, graphics equipment etc. It carries around a max of 1.2Gbps - which is enough for 1080/50i or 60i, or 720/50p or 60p - but nowhere near enough for 1080/50p or 60p - which is the progressive equivalent in motion terms.

It is possible to double up and use dual HD-SDI connections, and use more VTR compression (or seriously expensive VTRs) - but this simply isn't practical in a TV environment (say an OB covering a football match).

The bottom line is that for production the options are pretty much:

1080/24p/25p/30p (low frame rate - film equivalent - no good for sports)
720/50p/60p (high frame rate - progressive - lower horizontal resolution)
1080/50i/60i (high field rate - interlaced - higher horizontal resolution)

Whilst 1080/24-30p seems like a good solution - the low frame rate really isn't good enough. It is good as a film-replacement for drama - but you'd be mad to limit your production systems to it.

720p or 1080i at 50 or 60 are still the only real games in town for running a real transmission system currently.

If you were really keen on future-proofing your system you could conceivably gear up for 1080/50p or 60p transmission (so your encoders and receivers were 1080/50p or 60p compatible) - but currently just feed this with 720p or 1080i source material. However the cost to implement 1080p in domestic receivers is also non-trivial. (You have to remember it is running at twice the uncompressed data rate of 1080i, which itself runs at a higher data rate than 720p)
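
To put some back-of-envelope numbers on those data rates, here's a rough Python sketch (my own arithmetic, assuming 10-bit 4:2:2 sampling and counting active picture only - which is why 1080/50i comes out near the ~1.2 Gbps payload figure above rather than HD-SDI's 1.485 Gbps nominal line rate):

BITS_PER_PIXEL = 20  # 10-bit 4:2:2: 10 bits luma + 10 bits shared chroma per pixel

def active_rate_gbps(width, height, frames_per_second):
    return width * height * frames_per_second * BITS_PER_PIXEL / 1e9

print(f"720/50p : {active_rate_gbps(1280, 720, 50):.2f} Gbps")   # ~0.92
print(f"1080/50i: {active_rate_gbps(1920, 1080, 25):.2f} Gbps")  # ~1.04 (25 full frames/s)
print(f"1080/50p: {active_rate_gbps(1920, 1080, 50):.2f} Gbps")  # ~2.07 - beyond a single HD-SDI link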
 
Abit said:
I don't know how scaling works for video, but for still photography downsizing isn't a concern, as the information is already there; the same is not true for smaller images resized up.

Stills photography is easier because you don't have to do it 25 or 50 times a second in "real-time".

Scaling a 720-line image to 768 lines involves quite a lot of maths, as there isn't a simple integer relationship between the line counts - so you have to use quite complicated multi-tap filter theory AIUI. This requires quite a lot of processing, or you don't do it very well and get a drop in quality.
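
To illustrate why, here's a tiny Python sketch (my own, using simple two-tap linear weights rather than the longer multi-tap filters real scalers use) showing how a 720-to-768 vertical rescale - a 16/15 ratio - forces almost every output line to be a weighted blend of two source lines:

SRC_LINES, DST_LINES = 720, 768  # 16/15 ratio, so simple line duplication won't do

for out_line in range(6):  # first few output lines as an illustration
    src_pos = out_line * SRC_LINES / DST_LINES  # position in source-line coordinates
    lower = int(src_pos)
    frac = src_pos - lower
    print(f"output line {out_line}: {1 - frac:.3f} * src[{lower}] + {frac:.3f} * src[{lower + 1}]")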
 
Totally agree about the problems with scaling 720p to 768p. It involves 16/15 scaling and is impossible to do well without applying a lot of vertical filtering which will inevitably soften the picture. I personally would be very reluctant to buy a panel with a resolution of 1366 x 768 if I were expecting to watch mainly 720p broadcasts. What are the alternatives? CRTs are 1080i; the only common 720p native TVs I know of are DLPs. There are a few 720p LCDs around, but they're rare. I'm not aware of any 720p plasmas. A 1080p display would seem like the perfect solution, if you can find one.

I don't disbelieve Abit when he says that 1080i looks better than 720p, but it's very hard to make a fair comparison when it involves comparing two different display technologies. What I would say about interlacing - and this applies equally to SD and HD - is that on my new Toshiba CRT, which has the sharpest picture of any CRT I have owned, the interlacing artefacts are very noticeable - and very objectionable. When I look at the best LCDs and plasmas, which deinterlace to pseudo-progressive scan, they look fine when there is no movement; when there is fast movement I see severe motion blur, something I don't really notice on a 50Hz interlaced CRT. Choosing a new TV standard is not just a question of which format has the sharpest picture.
 
The 1080i vs 720p debate will rage on (though a 1080p display should look better with a 1080i signal than an identical 720p device does with 720p; how they cope with each other's signals is a different matter, and 1080p displays are rare and pricey!).

I think the quality of the picture will have a lot to do with the hardware involved, and while making academic arguments for the merits of either system is fun, it does not dictate the quality of the END result!
 
