I didn't know DVI was limited to 8-bit; I thought it was merely the transport mechanism for the data and was agnostic about just what data it was carrying.
AFAIK the 'bitness' is the same as on PCs, in which case it defines the number of shades of each primary colour, i.e. the number of bits used per colour channel of a pixel ... so an 8-bit system has 256 shades per channel while 12-bit has 4096. However, to make use of the increased colour depth you need display devices with that precision in their panels/screens/etc.
Yeah, must re-read the spec, but yes: 8 bits per RGB channel, or 256^3 = 16.78 million colours (equivalent to 24-bit modes on a PC).
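Just to sanity-check the arithmetic above (a quick sketch, not from any spec):

```python
# Shades per primary colour channel and total colours for a given bit depth.
def shades_per_channel(bits_per_channel: int) -> int:
    return 2 ** bits_per_channel

def total_colours(bits_per_channel: int) -> int:
    # Three primaries (R, G, B), each with 2^bits shades.
    return shades_per_channel(bits_per_channel) ** 3

print(shades_per_channel(8))   # 256
print(shades_per_channel(12))  # 4096
print(total_colours(8))        # 16777216, i.e. ~16.78 million
```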
What gets really confusing is that the new top players have 10/12/14-bit video DACs! What this actually achieves I've lost track of, but they don't mean much for a DVI output, as that'll still be the original DVD 8-bit output.
Even my PJ is, I think, 12-bit, so maybe component from the player's 12-bit DACs to the 12-bit PJ panels would be better, presuming the player does some fancy colour processing to convert 8-bit to 12-bit. But then what do the scalers in the A11 & HS20 do with these extra bits? Must be lost on the A11, as it only scales to DVI...
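For what it's worth, one common way hardware expands 8-bit values to 12-bit is simple bit replication. This is just a sketch of one plausible approach, not a claim about what any particular player or PJ actually does internally:

```python
def expand_8_to_12(value_8bit: int) -> int:
    # Bit replication: shift the 8-bit value up by 4 bits, then fill the
    # 4 new low bits with the top 4 bits of the original value. This maps
    # 0 -> 0 and 255 -> 4095, so black and full-scale white are preserved.
    return (value_8bit << 4) | (value_8bit >> 4)

print(expand_8_to_12(0))    # 0
print(expand_8_to_12(255))  # 4095
print(expand_8_to_12(128))  # 2056
```

A plain shift-left-by-4 would also work but maps 255 to only 4080, slightly darkening full-scale values; replication avoids that.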