

HDMI vs DVI

Discussion in 'Blu-ray & DVD Players & Recorders' started by higenbs1, Mar 16, 2004.

  1. higenbs1
     Active Member

    Is there any difference in quality between DVI and HDMI (excluding the potential inclusion of digital audio with HDMI)? Are both connections completely lossless?
     
  2. KraGorn
     Active Member

    Read all about it here. :)

    Executive summary: video quality is the same, as they carry the same video signals; the difference is that HDMI is encrypted and DVI (without HDCP) isn't.
     
  3. gandley
     Well-known Member

    Not at present, but there will be when DVD goes hi-def, I guess. DVD is limited to 8-bit at the moment and DVI maxes out at 8-bit, but HDMI can go, I believe, to 12-bit.

    At present it's not really a concern, though. This flu is giving me a head blank, but doesn't the 'bit' refer to colour? Correct or clarify please, I'm too ill to think today.
     
  4. KraGorn
     Active Member

    I didn't know DVI was limited to 8-bit; I thought it was merely the transport mechanism for the data and didn't care what data it was carrying. :confused:

    AFAIK the 'bitness' is the same as on PCs, in which case it defines the number of shades of each primary colour, i.e. the number of bits used per colour channel of a pixel ... so an 8-bit system has 256 shades per channel while a 12-bit one has 4,096. However, to make use of the increased colour depth you need display devices with that precision in their panels/screens/etc.
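
    To make the arithmetic concrete, here's a minimal Python sketch (the helper name is just for illustration, not from any spec or player):

    # Per-channel colour depth arithmetic (illustrative only).
    def colour_stats(bits_per_channel: int, channels: int = 3) -> tuple[int, int]:
        """Return (shades per channel, total representable colours)."""
        shades = 2 ** bits_per_channel      # e.g. 2**8 = 256
        total_colours = shades ** channels  # e.g. 256**3 = 16,777,216
        return shades, total_colours

    for bits in (8, 10, 12):
        shades, total = colour_stats(bits)
        print(f"{bits}-bit: {shades} shades per channel, {total:,} colours")
    # 8-bit:  256 shades per channel, 16,777,216 colours
    # 10-bit: 1024 shades per channel, 1,073,741,824 colours
    # 12-bit: 4096 shades per channel, 68,719,476,736 colours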
     
  5. gandley
     Well-known Member

    Yeah, that sounds right, but DVI is limited to 8-bit.
     
  6. ailean
     Guest

    Yeah, I must re-read the spec, but yes: 8 bits per RGB channel, i.e. 256^3 = 16.78 million colours (equivalent to 24-bit modes on a PC).

    What gets really confusing is that the new top players have 10/12/14-bit video DACs! I've lost track of what that actually does, but it doesn't mean much for a DVI output, as that'll still be the original DVD 8-bit output.

    Even my PJ is, I think, 12-bit, so maybe component from the player's 12-bit DACs to the 12-bit PJ panels would be better, presuming the player does some fancy colour aliasing to convert 8-bit to 12-bit. But then what do the scalers in the A11 & HS20 do with these extra bits? They must be lost on the A11 as it only scales to DVI...

    ...arggg brain hurts.... :)
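
    As a rough illustration of that 8-bit-to-12-bit stretch, the most naive approach is a plain rescale; a Python sketch follows, purely an assumption for illustration rather than how any particular player actually does it (real hardware may dither or interpolate instead):

    # Naive 8-bit -> 12-bit channel expansion (illustrative only).
    def expand_8_to_12(value_8bit: int) -> int:
        """Map 0..255 onto 0..4095 so black and full white stay at the extremes."""
        if not 0 <= value_8bit <= 255:
            raise ValueError("expected an 8-bit value")
        return round(value_8bit * 4095 / 255)

    print(expand_8_to_12(0))    # 0
    print(expand_8_to_12(128))  # 2056
    print(expand_8_to_12(255))  # 4095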
     
