Active Member
Is there any difference in quality between DVI and HDMI (excluding HDMI's potential inclusion of digital sound)? Are both connections completely lossless?


Well-known Member
Not at present, but there will be when DVD goes hi-def, I guess. DVD is limited to 8-bit at present, and DVI maxes out at 8-bit, but HDMI can go, I believe, up to 12-bit.

But at present it's not really a concern. This flu is giving me a head blank, but doesn't the 'bit' refer to colour? Correct or clarify please, I'm too ill to think today.


Novice Member
I didn't know DVI was limited to 8-bit; I thought it was merely a transport mechanism for the data and didn't have any interest in just what data it was carrying. :confused:

AFAIK the 'bitness' is the same as on PCs, in which case it defines the number of shades of each primary colour, i.e. the number of bits used to define the colour of a pixel... so an 8-bit system has 256 shades per channel while a 12-bit one has 4096. However, to make use of the increased colour depth you need display devices with that precision in their panels/screens/etc.
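For what it's worth, the arithmetic is just powers of two. A quick illustrative Python snippet (nothing here is from any spec, just the maths):

[CODE]
# An n-bit channel can represent 2**n distinct levels (shades).
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} shades per channel")
# Prints:
# 8-bit: 256 shades per channel
# 10-bit: 1024 shades per channel
# 12-bit: 4096 shades per channel
[/CODE]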


Yeah, must re-read the spec, but yes: 8 bits per RGB channel, or 256^3 = 16.78 million colours (equivalent to 24-bit modes on a PC).
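Spelled out (just the arithmetic, not anything a particular player does):

[CODE]
shades = 2 ** 8            # 256 levels per channel
total = shades ** 3        # three channels: R, G, B
print(total)               # 16777216 -- about 16.78 million
print(total == 2 ** 24)    # True: the same as a 24-bit PC mode
[/CODE]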

What gets really confusing is that the new top players have 10/12/14-bit video DACs! What this actually buys you I've lost track of, but it doesn't mean much for a DVI output, as that will still carry the original DVD 8-bit data.

Even my PJ is, I think, 12-bit, so maybe component from the player's 12-bit DACs to the PJ's 12-bit panels would be better, presuming the player does some fancy colour processing to convert 8-bit to 12-bit. But then what do the scalers in the A11 & HS20 do with these extra bits? They must be lost on the A11, as it only scales to DVI...
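As a guess at what that 8-to-12-bit conversion might look like: one common, simple technique is bit replication, which spreads the 8-bit range evenly across the 12-bit range. A minimal sketch (the function name is mine, and the real player processing is presumably fancier):

[CODE]
def expand_8_to_12(value8):
    # Shift the 8-bit value (0-255) up by 4 bits and fill the new
    # low bits with the top 4 bits of the original, so 0 maps to 0
    # and 255 maps to 4095 (full scale in both ranges).
    return (value8 << 4) | (value8 >> 4)

print(expand_8_to_12(0))     # 0
print(expand_8_to_12(128))   # 2056
print(expand_8_to_12(255))   # 4095
[/CODE]

Note this creates no new picture detail; the extra bits just give the scaler/panel headroom so later processing is less likely to introduce banding.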

...arggg brain hurts.... :)
