DVI to HDMI

Discussion in 'Cables & Switches' started by judderman85, Sep 5, 2007.

  1. judderman85

    judderman85
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    Decided to post here as I might get a better response than in the LCD forum.

    -----------------------

    Having recently bought a Samsung LE32R87BDX, I was having to switch between two different VGA sources: my computer and my Xbox 360.

    So, I bought a DVI to HDMI cable and hooked it up to my PC so I could swap inputs at the touch of a button. However, I turned my computer on and the picture is awful; it looks as though the resolution is off. I have tried fiddling with the res and checked the R87 settings thread, but can't get anything to work. I'm running a Radeon X800, if that helps.

    ------------------------

    A bit better. I updated the drivers and they came with specific controls for the DVI output. The res is now the one recommended in the R87 thread and the picture is better, but still suboptimal (i.e. worse than the same TV and computer managed over VGA).

    Can someone explain why the same TV and source use different resolutions depending on which lead I use, and is there much chance of improving the picture?

    To be exact, the picture on videos is fine; it's on things such as text where it looks slightly 'off', as though the resolution is out by a couple of pixels.


    Thanks for any help.
     
  2. Gregory

    Gregory
    Active Member

    Joined:
    Mar 27, 2005
    Messages:
    312
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    18
    Location:
    Midlands
    Ratings:
    +9
    I'm not familiar with the exact TV model, but a common issue is that non-computer signals go through a scaler that is aimed at making sure the picture fills the screen and looks good, no matter what resolution it is. To my surprise that is done for DVI/HDMI signals as well, even though you'd imagine 1:1 pixel matching would be smarter - at least as an option! The VGA input can work differently though, since computers often expect 1:1 matching, and also display text, where edge definition matters a lot. So, for a film-type signal it can look great - you wouldn't notice a couple of pixels of stretch across the screen. But for a PC it can look dire, with bands of 'smearing' across the screen.
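
    If it helps to see the effect, here is a rough sketch in plain Python (the 1360 and 1366 pixel widths are made-up but plausible figures for this sort of panel, not taken from the Samsung spec) of what that small stretch does to a one-pixel-wide text stroke:

        # Minimal sketch: linearly resample a crisp 1-pixel stroke from a
        # 1360-pixel-wide source row onto a 1366-pixel-wide panel (widths assumed).
        def resample(row, out_len):
            """Linearly interpolate a 1-D row of pixel values onto out_len samples."""
            in_len = len(row)
            out = []
            for i in range(out_len):
                pos = i * (in_len - 1) / (out_len - 1)   # fractional source position
                lo = int(pos)
                hi = min(lo + 1, in_len - 1)
                frac = pos - lo
                out.append(row[lo] * (1 - frac) + row[hi] * frac)
            return out

        source = [255] * 1360     # white background
        source[500] = 0           # one crisp black pixel, like the stem of a letter

        panel = resample(source, 1366)
        print([round(v) for v in panel[498:506]])
        # -> [255, 255, 255, 255, 53, 201, 255, 255]
        #    the single pure-black pixel has been spread over two greys - 'soft' text.

    Film material has soft edges to begin with, so the same interpolation goes unnoticed there.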

    In terms of improvement, I don't have much to offer, other than using the VGA input, which may bypass the scaler (it sometimes does).

    Cheers

    Greg
     
  3. Joe Fernand

    Joe Fernand
    Distinguished Member AVForums Sponsor

    Joined:
    Jan 20, 2002
    Messages:
    28,720
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    166
    Location:
    The Borders
    Ratings:
    +3,513
    Hello judderman85

    Gregory is correct re the non-acceptance of Native signals on many HDMI Inputs - it's all down to which HDMI Receiver 'chip' the Display manufacturer chooses to design into the Display.

    Some HDMI Receivers will only accept 'Video' resolution signals, and the Display then has to scale the Input signal to fill the pixel array - in most cases it's better to revert to Analogue HD15 (VGA) for a PC signal.

    See http://www.siliconimage.com/products/productfamily.aspx?id=1&#29
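
    As a rough illustration (assuming this set has the usual 1366 x 768 'HD Ready' pixel array - that figure is an assumption, so check the manual), none of the standard 'Video' timings such a Receiver accepts maps 1:1 onto the panel, so the Display has to rescale everything the PC sends:

        # Rough Python sketch - the 1366 x 768 panel figure is an assumption.
        panel_w, panel_h = 1366, 768

        video_modes = {
            "576p":  (720, 576),
            "720p":  (1280, 720),
            "1080i": (1920, 1080),
        }

        for name, (w, h) in video_modes.items():
            print(f"{name}: x{panel_w / w:.3f} horizontal, x{panel_h / h:.3f} vertical scaling")
        # None of the factors comes out at exactly 1.000, so a pixel-perfect PC
        # desktop is always resampled before it reaches the screen.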

    A very decent quality 2x1 Automatic VGA Switch will cost around 150.00 GBP.

    See http://www.kramerelectronics.com/indexes/item.asp?name=VP-211DS

    Joe
     
