
1360x768 via DVI

Discussion in 'LCD & LED LCD TVs' started by Rob_Quads, Jun 28, 2005.

  1. Rob_Quads

    Rob_Quads
    Active Member

    Joined:
    Feb 14, 2004
    Messages:
    373
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    18
    Location:
    Chandlers Ford
    Ratings:
    +12
    I have my Samsung LE32R41B running from my HTPC. Over VGA it runs fine at 1360x768, mapped 1:1 as far as I can see, so the next step is DVI. The machine has an Nvidia 6600GT dual-DVI graphics card.

    Got a strange problem with DVI - it's doing something very weird with the picture, such that no matter what signal you pass it (480p, 720p, 1080p) it blows the picture up and only shows the middle bit, i.e. you lose a border around the whole picture (a percentage rather than a fixed number of pixels). It can be seen in the pics attached.

    Any ideas?

    [Strangely, when the card is set to output 720p my projector shows wide720 via VGA (correct) but 750p/50 via DVI?]

    A few pics of what I mean can be seen here:

    http://www.avforums.com/forums/attachment.php?attachmentid=15511
    http://www.avforums.com/forums/attachment.php?attachmentid=15512
    http://www.avforums.com/forums/attachment.php?attachmentid=15513
     
  2. allanp

    allanp
    Active Member

    Joined:
    Jun 28, 2004
    Messages:
    1,027
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    48
    Ratings:
    +94
    It looks like it's doing some kind of virtual desktop mode, or there's a setting somewhere that is overscanning the picture, like a TV would do.
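
    For a rough idea of what percentage-based overscan costs, here is a small sketch. The 5% figure is an assumption for illustration (the actual crop varies by set); the point is that a percentage crop scales with resolution, which matches the symptom described above:

    ```python
    # Illustrative only: estimates how many pixels a symmetric
    # percentage-based overscan crops from a signal. The 5% default
    # is an assumed typical value, not the LE32R41B's actual figure.
    def overscan_crop(width, height, percent=5.0):
        """Return (visible_width, visible_height, lost_x, lost_y)
        after a symmetric percentage overscan crop."""
        lost_x = round(width * percent / 100)   # total pixels hidden horizontally
        lost_y = round(height * percent / 100)  # total pixels hidden vertically
        return width - lost_x, height - lost_y, lost_x, lost_y

    # On a 1360x768 signal, a 5% crop hides a noticeable border:
    print(overscan_crop(1360, 768))  # (1292, 730, 68, 38)
    ```

    So if the set is overscanning the DVI input, a 1:1 pixel map is impossible regardless of what resolution the card sends, which would explain why every signal (480p/720p/1080p) loses a proportional border.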
     
