
Any Quality Difference Between DVI & D-Sub

Discussion in 'LCD & LED LCD TVs' started by robomonkey, Oct 4, 2005.

  1. robomonkey (Active Member)

    Are there any quality differences between connecting a PC to an LCD via D-Sub or DVI?

    If you connect the PC via D-Sub will you still be able to show videos at HD quality 720 or 1080?

    Ta
     
  2. RockySpieler (Active Member)

    Depends on the screen...

    According to the Philips 37PF9830 thread, it cannot accept 1080i via DVI (HDMI + adapter) but does display 1080i via VGA.

    "On my TEVION" (I know I start a few sentences that way, hopefully not for much longer :D ) however..... DVI is very good, not 1:1 pixel mapping, but very clear. On the other hand, VGA has a shimmering effect on the edges of the visible window whilst web browsing, yet looks more "natural" when watching HD AVI videos.

    So, unfortunately, no conclusive sweeping generalisation from me (that's unusual!).
     
  3. robomonkey (Active Member)

    Hmmm, I don't get it!! (not hard to confuse me though).

    I want to connect a media centre PC to an LCD. Should I be looking for a DVI connection for best results, or will D-Sub produce a similar picture?

    Am I just limiting my choice by discounting any sets that only have D-Sub? And what about the HDCP thingy?
     
  4. matt_p (Active Member)

    Say you have a PC with DVI and D-Sub outputs, and an LCD TV with a 1366x768 resolution.

    Scenario 1: You play a 720p file (1280x720) in, say, Windows Media Player or TheaterTek. Your PC is set to 1366x768, so the PC upscales the image from 1280x720 to 1366x768.

    Scenario 2: You play a 1080i file (1920x1080). Your PC is set to 1366x768, so your PC downscales the image from 1920x1080 to 1366x768.

    So, whether you connect the PC using D-Sub or DVI, a 720p file will have all its detail intact and will look fine. A 1080i file will have SOME detail removed (as it is downscaled), but will still play fine and look good. The upscaling and downscaling is all handled by your PC/graphics card.
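    The two scenarios come down to simple scale-factor arithmetic. A minimal sketch (the function and variable names are mine for illustration, not from any real player or driver):

```python
# Sketch of the scaling described above: the PC/graphics card rescales
# each decoded frame to the desktop resolution before output.
def scale_factors(src, dst):
    """Horizontal and vertical scale factors from source frame to desktop."""
    return (dst[0] / src[0], dst[1] / src[1])

panel = (1366, 768)                        # desktop set to the TV's native res
print(scale_factors((1280, 720), panel))   # 720p: both factors above 1, so upscaled
print(scale_factors((1920, 1080), panel))  # 1080i: both factors below 1, so downscaled
```

    A factor below 1 means pixels are being thrown away, which is why the 1080i file loses some detail while the 720p file keeps all of it.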

    Couple of things to consider:

    DVI should give a pixel-perfect reproduction on screen, as long as you are at native res, i.e. the PC is set to the same resolution as the screen's native res, usually 1366x768. (The res will actually be 1360x768, as graphics cards need a horizontal resolution divisible by 8. You'll be missing three columns of pixels on the left and right, which is unnoticeable.) BUT!!!! Not all LCD screens accept native res over DVI. Most max out at 1024x768. You'll have to investigate further once you've decided on a suitable screen. Some can do it, some can with a workaround, some can't.
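    The 1366-vs-1360 point is just rounding the width down to a multiple of 8. A quick sketch (the helper name is mine; the "divisible by 8" rule is the constraint mentioned above):

```python
# Graphics cards typically need a horizontal resolution that is a
# multiple of 8, so a 1366-wide panel gets driven at the nearest
# lower multiple: 1360.
def usable_width(native_width, step=8):
    """Largest width <= native_width that is a multiple of step."""
    return native_width - (native_width % step)

print(usable_width(1366))         # 1360
print(1366 - usable_width(1366))  # 6 unused pixels, i.e. 3 columns per side
```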

    D-Sub will give good results, but not quite as good as DVI. Text will be noticeably fuzzier, but video should be almost indistinguishable from DVI.

    HDCP is only a consideration when you are trying to watch encrypted content, such as HD DVD or Blu-ray HD movies. They will require an HDCP-compatible output on your graphics card... Haven't seen any yet, and I don't know if it's the kind of thing that can be activated with a driver update... We'll have to see. For the moment, though, you don't need to worry about it (with regard to PC use). Sky HD will use it (over its HDMI output), and upscaling DVD players use it, but your original question was about PC use.
     
  5. Shakey_Jake33 (Guest)

    The quality difference between DVI vs. D-Sub (VGA) is there, but it's not night and day.
    I have my PC connected to my TV at the screen's native resolution (well, set to 1360x768 on the PC; the TV is 1366x768 native), and I really can't see any difference between DVI-D and analogue (VGA) unless I look extremely closely at the edges of windows on the screen, and even then it's minor.

    If your TV has a free DVI port, you may as well use it, but you won't be losing out as such by using VGA.
     
  6. robomonkey (Active Member)

    Cheers guys, you've put my mind at rest. I don't think I'll insist on my TV having DVI; if it does, it's a bonus!!!
     
