
Reality check on HDTV - interesting read

tryingtimes

Well-known Member
It's a bit full of "if"s, "but"s and "maybe"s for my liking, without really giving solid advice. Only just short of scaremongering.

It's right about HD broadcast not being in 1080p, but that's no reason not to buy a 1080p display.

However consumers do have to be very careful at the moment.


1) If you buy a 1080 resolution display, make sure it accepts a 1080p signal and doesn't do anything mad with it.

2) Make sure you can address the native rate of the display with external devices (preferably through the HDMI/DVI inputs) - this at least means that if you do end up finding that it does something mad, you can bypass it with an external device (scaling DVD player/video processor/scaling HDTV box).

3) If you want your new BRD/HD-DVD player for playing video content (i.e. TV shows), then wait for a 1080p capable player - the current Tosh HD-DVD player isn't.

4) If you want a new 1080i-ONLY disc format (because you can't wait) for FILM content, make sure something in the chain has inverse telecine and 2:2 cadence detection - this could be your display, a scaling player or an external scaler (video processor). This will be able to find the original 24 progressive frames without loss of info.

5) Probably best not to go to a high-street chain - get specialist advice and the ability to take it back if it doesn't do what you want.
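Item 4's cadence detection can be illustrated with a toy sketch. This isn't any product's actual algorithm - just the general idea, assuming for simplicity that both fields pulled from one film frame carry identical pixel values: a 2:2 cadence shows up as alternating near-zero and large field-to-field differences.

```python
# Illustrative sketch of 2:2 cadence detection (not a real deinterlacer's
# algorithm). Fields are represented as small lists of pixel rows.

def field_diff(a, b):
    """Mean absolute difference between two fields (lists of pixel rows)."""
    total = sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    return total / (len(a) * len(a[0]))

def detect_22_cadence(fields, threshold=10.0):
    """Return True if differences alternate low/high - the signature of a
    2:2 (film) cadence, where field pairs come from the same film frame."""
    diffs = [field_diff(fields[i], fields[i + 1])
             for i in range(len(fields) - 1)]
    low = diffs[0::2]   # within-frame pairs: should be near zero
    high = diffs[1::2]  # across-frame pairs: should be large
    return max(low) < threshold <= min(high)
```

A detector like this is what lets a scaler weave the right field pairs back into the original progressive frames instead of falling back to a resolution-halving bob.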
 

richard plumb

Distinguished Member
it also talks about US-specific stuff like 3:2 pulldown and no displays doing 24p. Well, in the UK that's not really an issue, as we can do 50Hz updates, not 60.

Although what US imports will look like I don't know - will we get filmed drama at 24p and converted by Sky, or 1080i60 rate converted?
 

Welwynnick

Distinguished Member
MR2Harvey said:
Follow this link (ok to do that I hope)
http://www.hdtvexpert.com/pages_b/reality.html
Harvey
That's a dangerous article, full of selective, misleading and irrelevant misinformation that is only designed to bring the author some attention. Anyone who believed the message he is putting across would do themselves an injustice. I usually beat the drum about 1080p TVs having 1080p inputs, but that's absolutely nothing to do with broadcast and storage standards only using 1080i. (Which isn't the case anyway, as it now appears that both BD & HD-DVD will be storing HD films as 1920x1080p24/25.)

We, or the US at least, have had 480p plasmas and 480i broadcasts and DVDs for years. DVD players have been producing good progressive-scan outputs that the plasmas were able to take advantage of very effectively. That was only possible because the plasmas would accept native progressive digital video inputs. It would be incredibly short-sighted if 1080p displays didn't do the same and spend a few more pence on the equivalent - a 1080p HDMI input in this case.

Next year, videophiles will all be buying BD or HDDVD players with 1080p50/60 HDMI outputs, and will want somewhere to plug them in. If you are watching a film, you can get back every pixel of the 1080p original from your 1080i video source. The display may not be able to do it, but there is absolutely no point stopping something else from doing it for you.

And as for wobulation only giving you 960x1080 resolution - well that is simply wrong!

Bad article.

Nick
 

Tarbat

Well-known Member
tryingtimes said:
However consumers do have to be very careful at the moment.
A great set of questions, but how on earth can anyone get the answers to them? Not even the manufacturers are willing to disclose information about whether a particular input resolution will be scaled, what deinterlacing is used, whether they've implemented cadence detection on the HD inputs, etc. Is there any website that pulls each new TV apart and gets to the bottom of these types of questions?
 

tryingtimes

Well-known Member
Not that I know of - it's probably just a case of asking on here and going with manufacturers that have a history of doing it right. An experienced dealer like Joe Fernand might be the best bet.

I guess the first two questions are the main ones - you should just about be able to find that out - then it's a case of fixing any problems you find with connected devices (e.g. buying an HD-DVD player that has 3:2 detection), or just going with a scaler next year when sub-£1000 ones are out with proper HD handling.
 

Pecker

Distinguished Member
Scaremongering? Maybe.

But I'll wager my house that the following will come true:

1 - Most early adopters (us) will buy displays which are 720p (or 1080i at best), and not wait for 1080p displays.

2 - 1080p will eventually become the industry standard.

3 - We will be told at that time that anyone sticking with a 720p display is a 'Luddite' and a 'dinosaur', and that 1080p makes lesser HD-lite formats look like VHS.

Prepare to replace your £11k Sim projector with a 1080p model.

The introduction of HD in steps, rather than when 1080p displays can be produced at an affordable level, is nothing short of greed by the manufacturers.

Steve W
 

tryingtimes

Well-known Member
Yes, you're absolutely right - I guess a lot of people on this forum already own 720p projectors and panels, and I guess a lot will jump to 1080p as soon as they come in at less than £3000.
I've had a 720p projector for 5 years now (Barco CRT) and it seems like I've done well - I'm starting to feel the need to investigate all the 1080p projectors around now. To make it seem worth it, I'm also investigating a constant-height 2.35:1 setup so that my improvements aren't just limited to resolution. I also get the bonus of a smaller, quieter PJ too.

If I was just swapping a 768 panel for a 1080 one of the same size, I'd feel more cheated. I've already compared 720 versus 1080 with all else being equal, and I find the difference to be worth it - I think the perception is that we've had 576 for so long that the upgrade to 720 has more impact though.
 

Ulink

Novice Member
Dunno about 'non-live' sources, i.e. games/DVD, but IMO you'll not see 1080p/50 live broadcasts for a long time yet. To quote from the original link -

"What about live HDTV? That is captured, edited, and broadcast as 1080i/30. No exceptions. At present, there are no off-the-shelf broadcast cameras that can handle 1080p/60, a true progressive format with fast picture refresh rates. It’s just too much digital data to handle and requires way too much bandwidth or severe MPEG compression. (Consider that uncompressed 1920x1080i requires about 1.3 gigabits per second to move around. 1080p/60 would double that data rate.)"

Not all totally true - as has been mentioned in other threads, there are cameras which can capture the required data, e.g. the Thomson LDK6200 will do 1080i/100 for 2x supermotion, but this comes out as two wires to the disc recorder, which in turn squirts out half-speed 1080i/50.

It's the 1.3Gb/s that causes a problem. On a live OB you need to send HD feeds to the VT/replay truck etc., and it doesn't go very far down copper coax and is exceptionally intolerant of any connection that isn't a perfect 75 ohms. The longest I've managed is about 100m using very fat cable and no joins! 1080p/50 would produce 2.6Gb/s, which would go almost nowhere!
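The article's ~1.3 figure is gigabits per second (not gigabytes), and it roughly checks out for the active picture. A quick back-of-envelope check, assuming the usual 4:2:2 chroma subsampling at 10 bits per sample:

```python
# Back-of-envelope check of the ~1.3 Gb/s figure for uncompressed 1080i.
# Assumes 4:2:2 chroma subsampling at 10 bits per sample: 10 bits of luma
# per pixel plus two half-rate 10-bit chroma channels = 20 bits/pixel.

width, height = 1920, 1080
frames_per_sec = 30      # 1080i/30 = 60 fields/s = 30 full frames/s
bits_per_pixel = 20      # 4:2:2 @ 10-bit

active_rate = width * height * frames_per_sec * bits_per_pixel
print(f"{active_rate / 1e9:.2f} Gb/s")  # ~1.24 Gb/s of active video
print(f"doubled for 1080p/60: {2 * active_rate / 1e9:.2f} Gb/s")
```

Add blanking and ancillary data and you land on the ~1.485 Gb/s of a standard HD-SDI link, which is why 1080p/50/60 needs roughly double that and goes nowhere down a single coax run.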

The only way to produce 1080p/50 in a live environment is to compress at source, which would open a whole new can of worms...
 

loz

Well-known Member
Pecker said:

The introduction of HD in steps, rather than when 1080p displays can be produced at an affordable level, is nothing short of greed by the manufacturers.
Not sure that's greed - more like normal technology progress.

They will only get to 1080p displays at an affordable level, by making 720p displays at an affordable level first, and then refining their processes and technology to produce 1080p.

It's a bit like saying the computer industry was greedy making 5MB HDDs - they should have waited until they could produce 500GB HDDs at an affordable level.
 

hornydragon

Well-known Member
Putman said that Blu-ray and HD-DVD are also unlikely to be shown at 1080p either. Most HDTV sets run their horizontal scan at a constant 33.8 kHz, which is what's needed for 1080i (or 540p). 1080p scans pictures twice as fast at 67.6 kHz. But most of today's HDTVs don't even support external 720p signal sources, which requires a higher 44.9 kHz scan rate.
Muppet

Original article http://www.hdtvexpert.com/pages_b/reality.html
First off, there is no 1080p HDTV transmission format. There is a 1080p/24 production format in wide use for prime time TV shows and some feature films. But these programs must be converted to 1080i/30 (that's interlaced, not progressive scan) before airing on any terrestrial, satellite, or cable TV network.
Oh, and broadcast is obviously the only way of getting a picture - I mean, HD-DVD and Blu-ray aren't possibilities for 1080p, and neither is the PS3, is it...

Oops! Almost forgot, that same 1080p TV may not have full horizontal pixel resolution if it uses 1080p DLP technology. The digital micromirror devices used in these TVs have 960x1080 native resolution, using a technique known as "wobbulation" to refresh two sets of 960 horizontal pixels at high speed, providing the 1920x1080 image. It's a "cost thing" again. (Let's hope these sets don't employ the 540p conversion trick as well!)
This is interesting.........
 

pjclark1

Novice Member
If you read the full item
http://www.hdtvexpert.com/pages_b/reality.html
you will see he also claims that most HDTVs can't display 720p either, but convert the signal to 540p before it is upscaled and displayed at 720p (can that be correct?).

"To cut manufacturing costs, most HDTV sets run their horizontal scan at a constant 33.8 kHz, which is what's needed for 1080i (or 540p). 1080p scans pictures twice as fast at 67.6 kHz. But most of today's HDTVs don't even support external 720p signal sources, which requires a 44.9 kHz higher scan rate."

"But that's not all. To show a 1080i signal, many consumer HDTVs do the conversion from interlaced to progressive scan using an economical, "quickie" approach that throws away half the vertical resolution in the 1080i image. The resulting 540p image is fine for CRT HDTV sets, which can't show all that much detail to begin with. And 540p is not too difficult to scale up to 720p.

But a 540p signal played back on a 1080p display doesn't cut the mustard. You will quickly see the loss in resolution, not to mention motion and interline picture artifacts. Add to that other garbage such as mosquito noise and macroblocking, and you've got a pretty sorry-looking signal on your new big screen 1080p TV."
 

hornydragon

Well-known Member
I think he is mainly referring to cheap RPTVs offered in the US, not LCD and plasma, but cheaper processors will do things like scale 1080i to 540p.
 

m1ket

Well-known Member
Pecker said:
Scaremongering? Maybe.

2 - 1080p will eventually become the industry standard.

Steve W

1080p broadcasts won't be coming any time soon, if at all, for the next 5 or 6 years, so I don't believe it will become the industry standard.
 
E

eric23

Guest
The truth is that (1) there is no 1080p spec in the production world, other than 1080p/24, as he correctly points out, and the majority of broadcast equipment, cameras, etc. can only handle 1080i at the moment; (2) our off-air delivery platforms don't have anywhere near the required bandwidth for 1080p broadcasts - so off-air 1080p is certainly a pipedream at the moment!

Sure, if you're going to watch all your content off HD-DVD, then maybe spending the extra £6000 on a 1080p-capable display is right for you. :suicide: But for the moment, I'm going to be sitting tight with my WXGA display, thanks very much.
 

blakey1

Novice Member
hornydragon said:
I think he is mainly reffering to cheap RPTV's offered in the US not LCD and plasma but cheaper processsors will do things like scale 1080i to 540p
Threads on the Sky HD forum suggest that most LCDs simply take a 1080i signal and turn it into 540p; not sure about plasmas.

To be honest, I think most retailers and manufacturers have been aware for some time that the standard for TV/HD-DVD etc. would be 1080i/p, and that 720p would not be much use other than for gaming.

The expense of 1080 panels, though, has meant that the market has been flooded with 720p panels to kick-start HD and to get the general public to buy into it.

There's no doubt that within the next few years we will all be told that 720 panels don't cut it and everyone needs a 1080 panel to make the best use of HD - especially all those with panels which turn a 1080i signal into 540p, which seems to be 99% of LCDs on the market at the moment.

I don't think this will bother too many AV fans, but the average Joe who thinks they have future-proofed themselves for the next 5-10 years could be disappointed.
 

tryingtimes

Well-known Member
Turning 1080i into 540p isn't quite as bad as it sounds - you're not losing any of the original 1080i information (this is a bit of a bugbear with me at the moment, so forgive me for saying it one more time).
BUT a properly deinterlaced and processed 1080i video signal will look much better. Virtually no devices are capable of this at the moment though.

I think people are also more critical of DVD than they are of TV, in which case I still think that if you're investing in a display with 1080 lines, you should make sure it can accept 1080p and also be 1:1 pixel mapped.

For 720 and 768 displays, I personally would be looking at ones which can be 1:1 pixel mapped, then I know I can add an external Video Processor in the future to better handle the new formats.
 

Mr.D

Distinguished Member
tryingtimes said:
Turning 1080i into 540p isn't quite as bad as it sounds - you're not losing any of the original 1080i information.
Yep, this is a bob deinterlace, and to be honest it's what most progressive displays (and players) will do if they have no film-mode-detecting deinterlace. It's also what the majority of deinterlacers use as the fallback mode if they can't detect a valid film cadence or detect field-based video.

And it's not that bad a deinterlace as far as it goes. As tryingtimes says, it doesn't chuck away every other field - it scales each field into a frame and doubles the frame rate.

You can't really get a better video deinterlace unless you combine this approach with motion estimation and switch it with a weave deinterlace on static shots (DScaler is a good example of this), or you introduce vector-based interpolation and segment the frames (artifact city unless you have 20k going spare).
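The bob and weave approaches above can be sketched in a few lines. This is purely illustrative (fields as small lists of rows, line-doubling rather than interpolation), not how any real deinterlacer is implemented:

```python
# Toy sketch of the two basic deinterlacing strategies. A "field" here is
# a list of pixel rows; frames are rebuilt at full height.

def weave(top_field, bottom_field):
    """Film mode: interleave two fields from the same film frame.
    Recovers the original progressive frame with no loss."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)  # odd line from the top field
        frame.append(b)  # even line from the bottom field
    return frame

def bob(field):
    """Video mode: scale one field to full height by line doubling.
    Halves vertical resolution but avoids combing on motion.
    (Real bobs interpolate between lines rather than repeating them.)"""
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row[:])  # duplicate each line
    return frame
```

Weaving two fields of a moving video shot produces combing artifacts, which is why deinterlacers fall back to bob when no film cadence is detected.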

As far as I'm aware, the Panasonic plasmas have film mode detection if you turn Cinema Reality on - I don't know if that includes 2:2 pulldown.

I really wish manufacturers would give us optional manual deinterlacing controls. It would be cheap, as they wouldn't need to include a sophisticated detection system - they could leave the default as bob for people who don't care, and the vast majority of people who want to watch films could select the deinterlace they want. This is one of the advantages of PCs - even my Freeview app lets me toggle the deinterlace between a bob and a weave.

As for bad edits... I've essentially been using manual deinterlace selection on all my viewing for about 5 years and have yet to see any field-ordering problems, except with the most awful masters (which I either bob or run through DScaler).
 

Welwynnick

Distinguished Member
tryingtimes said:
Turning 1080i into 540p isn't quite as bad as it sounds - you're not losing any of the original 1080i information
Yes, but if it's a film source (which is quite likely with BD or HD-DVDs) then a proper film mode will weave two fields and double the frame rate to reproduce the original progressive sequence without losing ANY information. I believe there's a lot to be gained by doing that, at least with big 1080p displays, anyway. Only a few players/processors/displays do it as yet, though. I believe that the Panasonic PHW screens DON'T do it, but the PHD screens DO, for example. This sort of information is rather difficult to find, unfortunately. Manufacturers always have different marketing-driven names for it.

Nick
 

tryingtimes

Well-known Member
Hi Nick
You're absolutely right for film-source.
I was referring to native 1080i sources like broadcast which some are using as a reason NOT to buy/wait for 1080p-capable displays.
If it was me - I'd wait/save.
 

Tarbat

Well-known Member
I'd be happy to have bob for true interlaced material, and NO deinterlacing for film material. But how can anyone determine if their TV has film mode detection in 1080 mode? In general, do TVs have film mode detection for 1080i, or is this only found on high-end TVs?

I know film mode works really well on my TV for 576i material, and just hope it can do the same for 1080i.
 

Welwynnick

Distinguished Member
Only a handful of the most expensive top of the range VPs, panels and projectors do 1080i film mode. The only ones I can think of are:

Marantz S4 PJ
Yamaha 1300 PJ
Sanyo 12000 PJ maybe
Sony Ruby PJ
Sony SXRD RPTV
Panny PHD PDP
Some Fujitsu PDPs
Pio PDPs maybe
Lumagen HDP/HDQ VP
VantageHD VP
Crystalio II VP
HD Leeza VP
Faroudja $$$$$$ VP

And maybe one or two others; Gordon @ Convergent has a good handle on this, but I think there will be a lot more this year. Of course, many of the above also do proper 1080i video de-interlacing.

Anybody ever wonder why some 720 and 768 line displays were so expensive?

Nick
 
