Why we should embrace HDMI part II

yan

Guest
Regarding the recent issues raised in the thread "Why we should embrace HDMI", I got the following response from the HDMI organisation. It is detailed and long, so I felt I had to put it in a separate thread, as the other one seems to have wound to a close and I wanted people to see this; it may address many of the issues raised. Feel free to move it to the other thread.


Response from the HDMI Organisation:
"There is no doubt that consumers are very sensitive about content protection issues. And we can understand why that sensitivity would cause some concern about HDMI. Even though HDMI itself has no content protection requirement, most manufacturers typically add HDCP technology to their products at the same time they add HDMI, so consumers often equate the two.

That said, HDMI was definitely meant to be all about audio and video quality.

From a video perspective, HDMI is capable of carrying a higher quality video signal than any prior consumer or professional interface. It can carry any current or planned video resolution/timing used in any industry. The common HDMI Type A connector can carry up to 165Mpixels/sec which is higher than (1920x)1080p at 60Hz, that is, more than twice the bandwidth needed today for 720p or 1080i. The Type B connector has no specified upper limit, though the technology is currently limited to about 440MHz, which would be roughly equivalent to a screen with twice the resolution of a WUXGA screen. Not even the far future dreamers discussing Quad-HDTV resolutions will be restricted by HDMI.

Each digital video pixel on HDMI can be carried as RGB or YCbCr 4:4:4 with 8 bits per color or YCbCr 4:2:2 with 12 bits per component. This allows 16 times more shades of grey or color variations than you can find today on any commercial MPEG source. But, as the display technologies improve, additional color depth will become more important, so HDMI was designed from the start with this capability.

HDMI is capable of carrying more audio data than 8 S/PDIF (or AES/EBU) cables put together. This huge amount of bandwidth can be used to increase the number of channels, the number of samples per second, the number of bits per sample, or all three. As initially defined, HDMI can carry an uncompressed digital audio stream containing up to 8 channels of audio, with each channel sampled at 192kHz and up to 24 bits/sample. There is no other digital audio interface that comes close. Of course, HDMI can carry compressed streams as well (AC-3, DTS, EX, etc.).

Regarding the assertion that HDMI doesn't address 1280x768, there are a couple of issues. First, there seems to be quite a bit of confusion about HDMI's capabilities. HDMI does address 1280x768, today, and does so better than any previous interface. If the plasma display's HDMI EDID (display identification data) indicates that it can support 1280x768 (or 1366x768, which is the more common format for 16:9 plasma TVs) and if the source device can support that resolution, HDMI can carry it. No problem.

The reason that 480i, 480p, 576i, 576p, 1080i and 720p are established as "default" formats is because that's what all of the source material is today. It is true that all of these formats need to be scaled to fit a 1366x768 display, but the reality is that 100% of the content is in these formats. The reason that HDMI uses these formats as the default formats is to guarantee that, no matter what source material is being watched, and no matter what video processing technologies are in the source and display, the user will always be able to see a high-quality image and likely, will be able to see the image in 100% of the original encoded resolution (which today is never 1280x768). If the plasma controller box (or the video processor or scaler box) that is connected to the HDMI plasma accepts a 1080i or 720p stream from an HD-DVD or satellite feed, and has the ability to deinterlace (1080i->1080p) and scale that stream to the native plasma resolution of 1366x768, then HDMI can carry that stream at up to 12 bits per color from the scaler to the plasma ... a higher quality transfer than is possible with any previous interface standard.

The key point is that you only want that scaling operation to occur once. Since we know that all of the source material will, by definition, need to be scaled to fit the panel resolutions that are being used today, and since there are almost 100 different panel resolutions being sold today (and more in the future), then the only way to guarantee that the scaling will occur only once is to have the source device know what the resolution of the panel is. It can then decide to do the scaling to fit the native format if it also happens to be doing scaling to adjust aspect ratio or whatever. This capability is provided by HDMI's "EDID" ROM, which is a data structure stored in every display that directly indicates the display's video capabilities, including aspect ratio, native resolution, and other supported formats. Without an EDID-enabled link, the source might scale a 480p up to 1080i and then the display scales the 1080i to its native format (1280x768, 1024x1024, 1366x768, 858x480, 858x483, etc.). With the EDID, the source can do the single scaling necessary. That said, you will likely only see this "native scaling" capability in the higher-end products initially.

The same EDID is used for indicating audio capabilities so the DVD player can automatically detect if the audio amplifier is capable of supporting AC-3 or DTS decode.

As you can tell, we feel pretty strongly that HDMI is an extremely powerful interface from the start and, as manufacturers begin to take advantage of its more advanced capabilities, this will become even more obvious to the end users. We have yet to hear any accusations about poor resolution support or poor audio support or others that have turned out to be true. We would appreciate your skepticism about these same criticisms."
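
A few quick sanity checks on the numbers in that response (my own rough arithmetic, not theirs; the blanking-inclusive totals below are the standard CEA-861 timings, so treat it as a sketch):

```python
# Rough arithmetic behind the bandwidth and colour-depth claims quoted above.
# Blanking-inclusive totals are the standard CEA-861 timings for these formats.

FORMATS = {
    #  name       (h_total, v_total, fields_per_sec, interlaced)
    "480p60":  (858,  525,  60, False),
    "720p60":  (1650, 750,  60, False),
    "1080i60": (2200, 1125, 60, True),   # 60 fields/s = 30 full frames/s
    "1080p60": (2200, 1125, 60, False),
}
TYPE_A_LIMIT_MHZ = 165.0  # the 165 Mpixel/s figure quoted for the Type A connector

for name, (h, v, rate, interlaced) in FORMATS.items():
    frames = rate / 2 if interlaced else rate
    clock_mhz = h * v * frames / 1e6
    print(f"{name:8s}: {clock_mhz:6.2f} MHz pixel clock "
          f"({TYPE_A_LIMIT_MHZ / clock_mhz:.1f}x headroom on Type A)")

# "16 times more shades": 12-bit vs 8-bit components.
print(2 ** 12 // 2 ** 8)  # -> 16 times as many levels per colour component

# Audio payload: 8 channels x 192kHz x 24 bits, versus a stereo PCM stream of
# the kind a single S/PDIF cable carries (framing overhead ignored on both).
hdmi_audio = 8 * 192_000 * 24
spdif_stereo = 2 * 192_000 * 24
print(f"{hdmi_audio / 1e6:.1f} Mbit/s vs {spdif_stereo / 1e6:.1f} Mbit/s per S/PDIF-style link")
```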
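
And to make their "scale once" EDID argument concrete, here is a minimal sketch of the decision a source device could make once it has read the display's EDID. The Edid structure and the format names are mine, purely for illustration; a real EDID is a binary block that would need proper parsing:

```python
# Minimal sketch of EDID-driven "scale once" logic in a source device.
# The Edid class and format names are illustrative only.

from dataclasses import dataclass

@dataclass
class Edid:
    native_resolution: tuple[int, int]   # e.g. (1366, 768) for a 16:9 plasma
    supported_formats: set[str]          # e.g. {"480p", "720p", "1080i", "native"}

def choose_output_format(source_format: str, edid: Edid) -> str:
    """Pick the single format the source should scale to before transmission."""
    if "native" in edid.supported_formats:
        # The display accepts its own panel resolution over HDMI: scale once,
        # here, and the panel never has to rescale the picture again.
        w, h = edid.native_resolution
        return f"{w}x{h}"
    # Otherwise fall back to a default format the display advertises; the
    # display's own scaler then does the one remaining conversion.
    for candidate in ("1080i", "720p", "576p", "480p"):
        if candidate in edid.supported_formats:
            return candidate
    return source_format  # last resort: send as-is

plasma = Edid(native_resolution=(1366, 768),
              supported_formats={"480p", "576p", "720p", "1080i", "native"})
print(choose_output_format("480p", plasma))  # -> "1366x768": one scaling pass
```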

Well, what do you make of this, guys?
 
All I can make of it is that the consumer video guys only started talking to the PC industry last week, which is a bit late. Current displays, with the exception of HD projectors, are not really compatible with the aims of HDMI. Marks for trying, Pioneer, but they need to change the panel to make it work.
 
MAW,

could you please explain what you mean by "but they need to change the panel to make it work"? Thanks
 
Just like it says up there, there should be only 1 point of scaling, therefore 768 vertical lines is the wrong starting point. We need 1280 x 720 panels. I've got one, he says smugly, but it's a projector.
 
It does address some of the concerns, though, i.e. the claim that it was always about censorship and not about quality. It states clearly that "HDMI itself has no content protection requirement", contrary to what I was led to believe in part I of the forum. How well informed are we about HDMI? That's why I posted their response.
 
Yes, most informative. I had thought that HDCP was part of the spec. The audio side is the next area where progress is necessary. I presume eventually we will connect our sources to an AV hub/scaler/amplifier/switcher, and then to the display from there (a bit like the Denon AVC-A11 or something) via HDCP DVI, so I don't see the need for an actual HDMI connector on a display, because that component will never contain digital audio decoding and have the speakers wired from it. DVI is what you need on a 'display only' panel.
 
maw: From the limited tests I've done with Joe, it seems there is more to DVI and HDMI switching than meets the eye. While DVI switchers pass HDMI signals, they don't so far seem to pass HDMI communication protocols. I suspect this will change.

We always knew that HDMI could carry any resolution up to its bandwidth, and that the lack of support for native resolution is a Pioneer idiocy issue. If you want to do the scaling only once, then set your 868 to interlaced or 480/576p output.

Early days but as I have said it's going to be with us and we just need to get the manufacturers on the case.

Gordon
 
MAW, why don't you send them your issues regarding the digital audio? The guy was really helpful and, as it says at the bottom, "We have yet to hear any accusations about poor resolution support or poor audio support or others that have turned out to be true. We would appreciate your skepticism about these same criticisms." Post his response; the more info we can get the better.

 
I too have come across HDMI switching issues; DVI is hard enough, and expensive! I will ask the HDMI lot about the audio side of things, though I think it's just not complete yet. And really, whilst it carries video signals, the carrying of audio signals to a display is pointless for the majority of flat display owners. Certainly multi-channel is pointless. We need equipment to support the format before it actually becomes useful to us, as well as more development, and as Gordon says, one main area for both is to design a working switch/hub for it. To me this is also where the audio extraction will take place. I should think it'll be 5 years before this is mainstream, though I daresay 2 years for the very top end to get it something like right.
 
So does this mean that HDMI is a decent format and we should be pressuring manufacturers such as Pioneer to apply it properly to their equipment? I mean, is it the finished article in its current state, and is it the application by the manufacturers that is letting it down?
 
Originally posted by MAW
Just like it says up there, there should be only 1 point of scaling, therefore 768 vertical lines is the wrong starting point. We need 1280 x 720 panels. I've got one, he says smugly, but it's a projector.

So why is 1 point of scaling acceptable if 2 or more are not? I would have thought that what is needed is a panel that matches the source material, which for the majority currently would be DVD.
 
eager: Source material comes in lots of different native resolutions, and it's likely there will be more all the time. Also, scaling done well can actually create information that, although not in the original image being manipulated, may well have been in the actual image that was captured (if that makes sense).

Yan: The implementation of HDMI just now is not complete. Current silicon doesn't support the full multi-channel, high-resolution digital audio formats. By the end of this year it should be better supported. You are correct, though, that for video it's done. Just as with DVI, the issues are the EDID structure on the displays and also the sources. The fact that Pioneer make a DVD player that is intended to plug into their plasmas, yet the plasma doesn't support native resolution over HDMI and the DVD player will not output native resolution, is a bit silly.

G
 
Originally posted by MAW
I too have come across HDMI switching issues; DVI is hard enough, and expensive! I will ask the HDMI lot about the audio side of things, though I think it's just not complete yet. And really, whilst it carries video signals, the carrying of audio signals to a display is pointless for the majority of flat display owners. Certainly multi-channel is pointless. We need equipment to support the format before it actually becomes useful to us, as well as more development, and as Gordon says, one main area for both is to design a working switch/hub for it. To me this is also where the audio extraction will take place. I should think it'll be 5 years before this is mainstream, though I daresay 2 years for the very top end to get it something like right.

I can see what you're saying. AV receivers do already have analogue video switching capability; that's what makes them AV receivers. What you really want is an AV receiver with plenty of HDMI+HDCP inputs (as well as legacy inputs), an HDMI output to the screen so it can detect the native res and scale appropriately, optional DVI and BNC analogue component outputs with manual scaling adjustment, and the usual speaker outputs for the speakers. It would then process both the video and the audio in one place, which would make a lot of sense - as you say, who wants audio going directly to the screen? Pointless.

There's only one small problem: the screen will have to support all manner of refresh rates to avoid motion judder when scaling. Upscaling PAL 50Hz to 720p/1080i will have this problem because 720p/1080i are 60Hz standards. I don't know how they would fix this easily, but I do know that 576p is 50Hz, so if you output 576p to the screen it avoids the problem, but then you're stuck with the screen's scaler.
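
To illustrate why the judder shows up (rough numbers only): 50 source frames per second cannot map one-to-one onto 60 refreshes, so one in every five frames gets held for an extra refresh, and that uneven cadence is what you see as judder.

```python
# Why 50Hz material judders on a 60Hz output: source frames cannot map 1:1
# onto display refreshes, so some frames are repeated and some are not.
from collections import Counter

source_rate, display_rate = 50, 60

# Which source frame is on screen at each of the 60 refreshes in one second?
frames_shown = [refresh * source_rate // display_rate for refresh in range(display_rate)]

hold_counts = Counter(frames_shown)       # refreshes each source frame occupies
print(sorted(set(hold_counts.values())))  # -> [1, 2]: most frames held once, every fifth held twice
```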
 
Originally posted by cybersoga
There's only one small problem: the screen will have to support all manner of refresh rates to avoid motion judder when scaling. Upscaling PAL 50Hz to 720p/1080i will have this problem because 720p/1080i are 60Hz standards.

I thought (I may well be completely wrong) that the ATSC HD 720p and 1080i standards were defined to be 60Hz or 50Hz, as they were trying to set global standards? Since refresh rate is still tied to the frequency of the electricity systems, surely support for 50Hz is present?
 
My 868i sends (1280x)720p at both 50Hz and 60Hz refresh rates, depending on the source disc (PAL/NTSC etc.).
 
If you could plug a DVD player into a scaler via HDMI (576i), and plug the scaler into the screen with HDMI (scaling to the screen's native resolution), would it be able to decrypt the HDCP encryption by passing the decryption key from the screen to the scaler, or would the scaler have to have its own decryption key?
 
Originally posted by fulabeer
My 868i sends (1280x)720p at both 50Hz and 60Hz refresh rates, depending on the source disc (PAL/NTSC etc.).

Do many *screens* support resolutions above 576p (720p, 1080i etc.) at 50Hz?
 
Perhaps the best thing would be to have two separate HDMI connections, one for video and one for audio. The plasma screen would have an HDMI video connection, the DVD player would have HDMI video and audio connections, while the amplifier would have an HDMI audio connection (or both). Obviously you would run video from the DVD player to the screen and audio from the DVD player to the amp (when, as Gordon stated, we get better-supported full multi-channel, high-resolution digital audio formats for HDMI).

If the manufacturers also pull their fingers out everything will be perfect.

There you go, nice and easy, simple as that :D

Seriously, is that a workable idea?
 
There are a huge number of resolutions that were proposed for ATSC; 50 and 60Hz are, from memory, included. The issue is similar to PAL Progressive, though. As the number of sources that output those resolutions at those refresh rates is negligible, the displays have not been set up to accept them. As demand increases, then so will support.

Cybersoga: The new Lumagen scalers will, hopefully, accept 480i and 576i HDMI copy-protected signals and will do exactly what you suggest, adding HDCP to their DVI outputs. This has to be tested, of course, but it is the plan.
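
On your key question: my understanding (and this is my reading of how HDCP repeater devices behave in general, not anything from the Lumagen spec sheet) is that a scaler in the middle carries its own device keys, authenticates the display on its output side, and reports the downstream device list back to the source; the display's key is never handed over. Very roughly:

```python
# Rough sketch of the "repeater" topology a scaler-in-the-middle implies.
# This only models the shape of the arrangement; real HDCP authentication
# involves key exchange, link-integrity checks and revocation lists.

class HdcpDevice:
    def __init__(self, name: str, ksv: str):
        self.name = name
        self.ksv = ksv  # each licensed device carries its own key selection vector

class HdcpRepeater(HdcpDevice):
    """A scaler with an HDCP-protected HDMI input and an HDCP-protected DVI output."""
    def __init__(self, name: str, ksv: str, downstream: HdcpDevice):
        super().__init__(name, ksv)
        self.downstream = downstream

    def downstream_ksvs(self) -> list[str]:
        # The repeater authenticates the display with its own keys on the output
        # side, then reports the downstream KSV list up to the source; it never
        # borrows the display's key to do its own decryption.
        return [self.downstream.ksv]

display = HdcpDevice("plasma", ksv="KSV-DISPLAY")
scaler = HdcpRepeater("scaler", ksv="KSV-SCALER", downstream=display)
print(scaler.ksv, scaler.downstream_ksvs())  # the source sees both devices, each with its own key
```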

Gordon
 
The HDMI boys have set out their stall. It takes a lot to change a standard. Yan, you are proposing a new one; I'm proposing an alternative way for manufacturers to use the existing one. Lumagen would appear to agree, and Gordon, we'll discuss it next time you are down south. I suppose it doesn't really matter if the connection to the screen is HDMI, except that it restricts the resolutions that can be sent. If you have a scaler which can accept HDMI inputs, output sound by i-link or whatever digital means as long as it's good (or of course it could be an amp as well), and output the video, scaled to your chosen resolution, via DVI, with HDCP if that is what is demanded by Hollywood (personally I don't like that bit), then, because you choose the resolution, it gets around needing a whole new generation of display resolutions, which are not likely to arrive this year at least.
 
The problems with HDMI are not the standard, but the fact that it isn't fully implemented yet, and that additional standards are required (such as HDCP, and consistency of supported resolutions) to fully specify communication. So we'll just have to wait. (Analogy: just because you're using a standard phone doesn't mean you will understand the caller).

The problem with switching HDMI/HDCP is that the communication can be "stateful" (i.e. the ends have to know "where they are" in the conversation). This means they can't, in general, simply be switched. (An analogy: if you are part way through placing a telephone credit card order, the operator can't simply be switched to another caller part way through without things going wrong.)

So we get into computer networking concepts like spoofing and man-in-the-middle attacks.
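
A toy model of that statefulness point, assuming nothing about the real HDCP handshake beyond the fact that each end keeps per-session state:

```python
# Toy model of why a "dumb" switch breaks a stateful, authenticated link.
# Nothing here is the real HDCP protocol; it only shows that per-link session
# state is lost if you silently swap one endpoint for another mid-conversation.

class ProtectedLink:
    def __init__(self):
        self.authenticated_with = None   # per-session state held at each end

    def authenticate(self, sink_id: str):
        self.authenticated_with = sink_id

    def send_frame(self, sink_id: str):
        if self.authenticated_with != sink_id:
            raise RuntimeError("link state invalid: re-authentication required")
        return "encrypted frame"

link = ProtectedLink()
link.authenticate("display-A")
link.send_frame("display-A")         # fine: state matches

# A dumb switch now routes the same source to display-B without telling it...
try:
    link.send_frame("display-B")
except RuntimeError as err:
    print(err)                        # the conversation has to start over
```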
 
Shouldn't the designers have thought about that? Does this mean you have to power down and power up every time you switch sources? Windows got over that one some time ago!
 
Errr... instead of discussing what may or may not be in standards, why don't you go and read them!

Source: http://www.hdmi.org/

6.2 Video Format Support
In order to provide maximum compatibility between video Sources and Sinks, specific minimum requirements have been specified for Sources and Sinks.

6.2.1 Format Support Requirements
Some of the following support requirements are in addition to those specified in EIA/CEA-861B.
• An HDMI Source shall support at least one of the following video format timings:
. 640x480p @ 59.94/60Hz
. 720x480p @ 59.94/60Hz
. 720x576p @ 50Hz
• An HDMI Source that is capable of transmitting any of the following video format timings using any other component analog or uncompressed digital video output, shall be capable of transmitting that video format timing across the HDMI interface.
. 1280x720p @ 59.94/60Hz
. 1920x1080i @ 59.94/60Hz
. 720x480p @ 59.94/60Hz
. 1280x720p @ 50Hz
. 1920x1080i @ 50Hz
. 720x576p @ 50Hz
• An HDMI Sink which accepts 60Hz video formats shall support the 640x480p @ 59.94/60Hz and 720x480p @ 59.94/60Hz video format timings.
• An HDMI Sink which accepts 50Hz video formats shall support the 640x480p @ 59.94/60Hz and 720x576p @ 50Hz video format timings.
• An HDMI Sink which accepts 60Hz video formats, and which supports HDTV capability, shall support 1280x720p @ 59.94/60Hz or 1920x1080i @ 59.94/60Hz video format timings.
• An HDMI Sink which accepts 50Hz video formats, and which supports HDTV capability, shall support 1280x720p @ 50Hz or 1920x1080i @ 50Hz video format timings.
• An HDMI Sink that is capable of receiving any of the following video format timings using any other component analog or uncompressed digital video input, shall be capable of receiving that format across the HDMI interface.
. 1280x720p @ 59.94/60Hz
. 1920x1080i @ 59.94/60Hz
. 1280x720p @ 50Hz
. 1920x1080i @ 50Hz
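
Reading the Sink-side rules as code makes the structure clearer; this is just my own transcription of the excerpt above, nothing more:

```python
# Sketch of the Sink-side mandatory-format rules from the excerpt above, as I
# read them; the Source-side rules follow the same pattern.

def required_sink_formats(accepts_60hz: bool, accepts_50hz: bool, hdtv_capable: bool) -> set[str]:
    required = set()
    if accepts_60hz:
        required |= {"640x480p@59.94/60", "720x480p@59.94/60"}
        if hdtv_capable:
            # The spec wording is "or": at least one of the two must be supported.
            required.add("1280x720p@59.94/60 OR 1920x1080i@59.94/60")
    if accepts_50hz:
        required |= {"640x480p@59.94/60", "720x576p@50"}
        if hdtv_capable:
            required.add("1280x720p@50 OR 1920x1080i@50")
    return required

# Example: a display that accepts both 50Hz and 60Hz formats and is HD-capable.
for fmt in sorted(required_sink_formats(True, True, True)):
    print(fmt)
```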

Notes:

No 48Hz or 72Hz support
So if you like the Region 1 format you will have to live with 3:2 pulldown judder. :( I'll keep my 1:1 pixel-mapped analogue with no judder, thank you.

No native resolutions or PC resolutions support
Unsurprising for a "multimedia" interface: it supports HD 16:9 resolutions (e.g. 1920x1080) but no PC 16:10 resolutions (e.g. 1920x1200) or PC refresh rates (I like my PC monitors running at at least 100Hz).

Therefore HDMI is not going to replace PC video connections for a while.

Furthermore, no native resolutions for plasmas/LCDs etc. means that external scalers are knackered too -- unless you leave some element of scaling down to the screen, in which case, what's the point of an external scaler? Of course this is not a problem if your display's native resolution matches one of these resolutions, but no plasmas do. Moreover, the next step would be native resolution at 50Hz -- for HTPC owners -- and that's not listed either.

50Hz not a requirement
It doesn't say anywhere that 50Hz must be supported as well as 60Hz; it still reads like an either/or.

StooMonster
 
