Outputting 1080i resolution from a PC for display on a 720p TV???

Hi,
I have a Sony KDL26T3000 TV. It is only 720p but can handle 1080i fine. I know this because my PS3 regularly upscales to 1080i and the TV recognises it and displays the 1080i picture with no problems.

When using the TV as a PC monitor, it sets the resolution to 1280x768 with no problems. However, if I try to raise the resolution to 1920x1080, the screen switches to 1080i, but then the desktop image extends past the edges of the screen and the quality appears reduced. I'm guessing this is because the graphics card is outputting 1080p, while the TV can only accept 1080i as its maximum resolution. Is there any way to change the settings so that I can get 1080i output from my PC?

Also, if this helps, I connect the PC to the TV using a DVI-to-HDMI cable. The graphics card does have an HDMI output, but I didn't know that at the time I bought the DVI-to-HDMI cable.

Could the cable be causing any problems? Would it be better to use the HDMI output on the graphics card? The TV also has a PC input, a VGA socket I think; would that be better?

Any suggestions welcome, thanks in advance.
 
You should have the option to output at 1080i25 or 30. I have the option in my driver, but it is ATI, not Nvidia. What is the TV's native resolution? Mine is 1366x768, and that's best for desktop work, but I output at 1080p for movies and it works fine, even though my TV is not full 1080p.
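
(Just to be clear on what the 25 and 30 mean, since they are easy to mix up with the refresh figures in the driver: they are interlaced frame rates, and each frame is two fields, so they correspond to 50 Hz and 60 Hz field rates. A trivial Python illustration, nothing driver-specific:)

```python
# "1080i25" and "1080i30" name the interlaced frame rate; each frame is
# built from two fields, so they map to 50 Hz and 60 Hz field rates.
for frames_per_second in (25, 30):
    fields_per_second = 2 * frames_per_second
    print(f"1080i{frames_per_second}: {frames_per_second} frames/s = {fields_per_second} fields/s")
```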
 
Hi, thanks for the reply. I've done a bit of searching and found the options in the Nvidia control panel with preset settings for various TVs. I've picked the one for a 1080i HDTV and I still get the same problem as before: the picture extends off the boundaries of the screen and it becomes difficult to read text, among other things that seem to reduce quality. The option also allows you to adjust the frequency; I've tried 25 Hz, 30 Hz and 29 Hz, but none seem to work.
 
I have successfully upped the resolution a little; it was 1280x768 but is now set to 1360x768 at 60 Hz. However, this seems to be where it maxes out. Anything higher than that... well, I've explained it before, it just doesn't seem to like it.
 
Well, that is the native resolution (or as near as makes no difference). Your graphics card will have a better scaler in it than your TV, so I'd leave it like that, since anything else will just be scaled by the TV anyway. I only send my TV 1080p because my other screen is 1680x1050 and it lines up better.

Some Sony TVs are a bit funny about what they will accept, whereas my cheapy one will take anything! It's probably down to the scaler used.
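
To put rough numbers on the scaling point (a back-of-the-envelope Python sketch only; the 1366x768 figure is the panel resolution quoted above, not something read from this particular TV):

```python
# Rough illustration of why the native panel resolution matters: any other
# desktop resolution has to be rescaled by a non-integer factor, which blurs
# fine detail such as text. Assumes a 1366x768 native panel.
native = (1366, 768)

for source in [(1280, 768), (1360, 768), (1920, 1080)]:
    sx = native[0] / source[0]
    sy = native[1] / source[1]
    print(f"{source[0]}x{source[1]} -> {native[0]}x{native[1]}: "
          f"x{sx:.3f} horizontal, x{sy:.3f} vertical")

# 1360x768 is almost a 1:1 pixel mapping, so it stays sharp; 1920x1080 has
# to be squeezed to roughly 0.71x in both directions by whichever scaler runs.
```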
 
Cheers. I wasn't too optimistic about getting it to work, but I just thought I'd ask in case anybody knew how. Oh well, thanks anyway :)
 
Is this the Nvidia control panel page you were talking about (see attachment)?

I can only tick the interlaced box if I select the CVT timing standard; however, the computer then says it failed to output at those settings.

So obviously I should be changing some of the other values at the same time. Can anyone tell me which ones?
 

Attachment: custom.GIF
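
For reference, CVT is a computer-monitor timing formula; 1080i is normally defined by the fixed CEA-861/SMPTE broadcast timings instead. The sketch below is plain Python arithmetic on those published numbers (whether this particular Nvidia custom-resolution dialog will accept them entered as a custom mode is another matter) and shows how the timing totals relate to the 25/30 Hz frame rates tried earlier:

```python
# Published CEA-861 timings for 1080i. Horizontal values are in pixels;
# the vertical total is lines per interlaced frame (two fields).
MODES = {
    # name: (pixel clock Hz, h_active, h_front, h_sync, h_back, v_active, v_total)
    "1080i60": (74_250_000, 1920, 88,  44, 148, 1080, 1125),
    "1080i50": (74_250_000, 1920, 528, 44, 148, 1080, 1125),
}

for name, (clock, h_act, h_fp, h_sync, h_bp, v_act, v_total) in MODES.items():
    h_total = h_act + h_fp + h_sync + h_bp      # 2200 (60 Hz) or 2640 (50 Hz)
    frame_rate = clock / (h_total * v_total)
    print(f"{name}: {h_total}x{v_total} total raster -> "
          f"{frame_rate:.0f} frames/s ({2 * frame_rate:.0f} fields/s)")
```

Even if such a mode is accepted, the TV will still rescale the result to its 1366x768 panel, as noted below.
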
As stated before, your TV has a native resolution of 1366x768, and whatever resolution you send it, you will still be seeing 1366x768.

Plus, as your TV is an LCD it cannot display interlaced images natively, so not only does your TV have to do the work of downscaling the 1080 lines of resolution you are sending it, it also has to de-interlace the signal into progressive scan.

All this extra processing can degrade the picture quality, and it can also cause input lag.

It's best to stick to the closest resolution that your set can natively display.
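
To illustrate the two extra processing steps being described (a toy numpy sketch only, assuming the 1366x768 panel from above; a real TV scaler uses proper motion-adaptive de-interlacing and filtered resampling rather than these shortcuts):

```python
import numpy as np

# A 1080i transmission delivers two 540-line fields per frame.
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
top_field = frame[0::2]       # 540 lines (odd field)
bottom_field = frame[1::2]    # 540 lines (even field)

# Step 1: de-interlace. The simplest method ("bob") just line-doubles one
# field back up to 1080 lines; better de-interlacers blend or motion-adapt.
progressive = np.repeat(top_field, 2, axis=0)       # 1080 x 1920

# Step 2: downscale 1920x1080 to the 1366x768 panel. Nearest-neighbour here,
# purely to show that detail above 1366x768 is discarded either way.
rows = np.linspace(0, 1079, 768).astype(int)
cols = np.linspace(0, 1919, 1366).astype(int)
panel = progressive[rows][:, cols]

print(panel.shape)   # (768, 1366)
```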
 
