Regarding the recent issues raised in the thread "why we should embrace HDMI", I got the following response from the HDMI organisation. It is detailed and long, so I felt I had to put it in a separate thread, as the other one seems to have wound to a close and I wanted people to see this, since it may address many of the issues raised. Feel free to move it to the other thread.

Response from the HDMI Organisation:

"There is no doubt that consumers are very sensitive about content protection issues, and we can understand why that sensitivity would cause some concern about HDMI. Even though HDMI itself has no content protection requirement, most manufacturers typically add HDCP technology to their products at the same time they add HDMI, so consumers often equate the two.

That said, HDMI was definitely meant to be all about audio and video quality. From a video perspective, HDMI is capable of carrying a higher quality video signal than any prior consumer or professional interface. It can carry any current or planned video resolution/timing used in any industry. The common HDMI Type A connector can carry up to 165 Mpixels/sec, which is higher than 1080p (1920x1080) at 60Hz, that is, more than twice the bandwidth needed today for 720p or 1080i. The Type B connector has no specified upper limit, though the technology is currently limited to about 440MHz, which would be roughly equivalent to a screen with twice the resolution of a WUXGA screen. Not even the far-future dreamers discussing Quad-HDTV resolutions will be restricted by HDMI.

Each digital video pixel on HDMI can be carried as RGB or YCbCr 4:4:4 with 8 bits per color, or as YCbCr 4:2:2 with 12 bits per component. This allows 16 times more shades of grey or color variations than you can find today on any commercial MPEG source. As display technologies improve, additional color depth will become more important, so HDMI was designed from the start with this capability.

HDMI is capable of carrying more audio data than 8 S/PDIF (or AES/EBU) cables put together. This huge amount of bandwidth can be used to increase the number of channels, the number of samples per second, the number of bits per sample, or all three. As initially defined, HDMI can carry an uncompressed digital audio stream containing up to 8 channels of audio, with each channel sampled at 192kHz and up to 24 bits/sample. There is no other digital audio interface that comes close. Of course, HDMI can carry compressed streams as well (AC-3, DTS, EX, etc.).

Regarding the assertion that HDMI doesn't address 1280x768, there are a couple of issues. First, there seems to be quite a bit of confusion about HDMI's capabilities. HDMI does address 1280x768, today, and does so better than any previous interface. If the plasma display's HDMI EDID (display identification data) indicates that it can support 1280x768 (or 1366x768, which is the more common format for 16:9 plasma TVs), and if the source device can support that resolution, HDMI can carry it. No problem. The reason that 480i, 480p, 576i, 576p, 1080i and 720p are established as "default" formats is that that's what all of the source material is today. It is true that all of these formats need to be scaled to fit a 1366x768 display, but the reality is that 100% of the content is in these formats.
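[A quick aside from me on the bandwidth and audio numbers quoted above: here is a rough back-of-the-envelope script that checks them. The blanking totals are the standard CEA-861 timings, which is my own assumption; the response doesn't spell them out.]

```python
# Rough sanity check of the HDMI bandwidth and audio claims above.
# Pixel clocks count total pixels including blanking; the totals used
# here are the standard CEA-861 timings (my assumption, see note above).

timings = {
    # name: (total pixels per line, total lines, frames per second)
    "720p60":  (1650,  750, 60),
    "1080i60": (2200, 1125, 30),  # interlaced: 30 full frames/sec
    "1080p60": (2200, 1125, 60),
}

TYPE_A_LIMIT_MHZ = 165.0  # single-link Type A ceiling quoted above

for name, (width, height, fps) in timings.items():
    clock_mhz = width * height * fps / 1e6
    print(f"{name}: {clock_mhz:.2f} MHz "
          f"({TYPE_A_LIMIT_MHZ / clock_mhz:.1f}x headroom on Type A)")

# "16 times more shades": 12-bit vs 8-bit per component
print(f"{2**12} vs {2**8} levels per component = {2**12 // 2**8}x")

# Uncompressed audio: 8 channels x 192 kHz x 24 bits/sample
print(f"audio payload: {8 * 192_000 * 24 / 1e6:.1f} Mbit/s")
```

[That gives 148.5 MHz for 1080p60 against the 165 MHz Type A ceiling, roughly 2.2x headroom over 720p/1080i, and 36.9 Mbit/s of uncompressed audio, so the quoted figures hold up. Back to the response:]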
The reason that HDMI uses these formats as the default formats is to guarantee that, no matter what source material is being watched, and no matter what video processing technologies are in the source and display, the user will always be able to see a high-quality image, and will likely be able to see the image in 100% of the original encoded resolution (which today is never 1280x768). If the plasma controller box (or the video processor or scaler box) connected to the HDMI plasma accepts a 1080i or 720p stream from an HD-DVD or satellite feed, and has the ability to deinterlace (1080i -> 1080p) and scale that stream to the native plasma resolution of 1366x768, then HDMI can carry that stream at up to 12 bits per color from the scaler to the plasma ... a higher quality transfer than is possible with any previous interface standard.

The key point is that you only want that scaling operation to occur once. Since we know that all of the source material will, by definition, need to be scaled to fit the panel resolutions in use today, and since there are almost 100 different panel resolutions being sold today (and more in the future), the only way to guarantee that the scaling occurs only once is to have the source device know what the resolution of the panel is. It can then decide to do the scaling to fit the native format if it also happens to be doing scaling to adjust aspect ratio or whatever. This capability is provided by HDMI's "EDID" ROM, a data structure stored in every display that directly indicates the display's video capabilities, including aspect ratio, native resolution, and other supported formats. Without an EDID-enabled link, the source might scale a 480p signal up to 1080i, and then the display would scale the 1080i to its native format (1280x768, 1024x1024, 1366x768, 858x480, 858x483, etc., etc.). With the EDID, the source can do the single scaling necessary. That said, you will likely only see this "native scaling" capability in the higher-end products initially. The same EDID is used for indicating audio capabilities, so a DVD player can automatically detect whether the audio amplifier is capable of AC-3 or DTS decoding.

As you can tell, we feel pretty strongly that HDMI is an extremely powerful interface from the start and, as manufacturers begin to take advantage of its more advanced capabilities, this will become even more obvious to end users. We have yet to hear any accusations about poor resolution support, poor audio support, or anything else that have turned out to be true. We would encourage the same skepticism toward these latest criticisms."

Well, what do you make of this, guys?
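PS: for anyone wondering what the "EDID" mechanism in that response actually looks like from the source device's side, here is a minimal sketch. It assumes the VESA EDID 1.3 base-block layout (where the first detailed timing descriptor holds the preferred/native timing); a real device would also read the CEA extension block, validate checksums, and handle many more cases. The function names are mine, purely for illustration.

```python
# Minimal sketch of the source-side half of the "single scaling" story:
# read the panel's native resolution out of its EDID, then scale the
# decoded video once, straight to that resolution. Offsets follow the
# VESA EDID 1.3 base block; extension blocks, checksum validation and
# error handling are all omitted.

def native_resolution(edid: bytes) -> tuple[int, int]:
    """Return (width, height) from the first detailed timing descriptor,
    which by convention holds the display's preferred/native timing."""
    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"
    dtd = edid[54:72]  # first 18-byte detailed timing descriptor
    width = dtd[2] | ((dtd[4] & 0xF0) << 4)   # low 8 bits + high 4 bits
    height = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return width, height

def plan_scaling(decoded: tuple[int, int], edid: bytes) -> str:
    """Decide the single scaling pass the response describes."""
    panel = native_resolution(edid)
    if decoded == panel:
        return "pass through untouched"
    return f"scale once: {decoded[0]}x{decoded[1]} -> {panel[0]}x{panel[1]}"

# e.g. a deinterlaced 1080i feed headed for a 1366x768 plasma:
# plan_scaling((1920, 1080), edid_read_over_ddc)
#   -> "scale once: 1920x1080 -> 1366x768"
```

Without that EDID read, the source has no idea the panel is 1366x768, so you get exactly the double-scaling path the response warns about: the source scales to 1080i, and the display rescales to native a second time.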