robomonkey said:
Hmmm, I don't get it!! (not hard to confuse me though).
Say you have a PC with DVI and D-Sub outputs, and a LCD TV with 1366x768 resolution.
Scenario 1: You play a 720p file, which is 1280x720 resolution, in, say, Windows Media Player or TheaterTek. Your PC is set to 1366x768, so the PC upscales the image from 1280x720 to 1366x768.
Scenario 2: You play a 1080i file, which is 1920x1080 resolution. Your PC is set to 1366x768, so your PC downscales the image from 1920x1080 to 1366x768.
So, if you connect a PC using D-sub or DVI, a 720p file will have all the detail intact and will look fine. A 1080i file will have SOME detail removed (as it is downscaled), but will still play fine and look good. The upscaling and downscaling is all handled by your PC/graphics card.
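To make the two scenarios concrete, here's a tiny sketch (the function name and values are just for illustration) of the scale factors the PC/graphics card applies in each case:

```python
# Sketch: scaling a video frame to the desktop resolution.
# scale_factor() is a hypothetical helper, not a real API.
def scale_factor(src, dst):
    """Return (horizontal, vertical) scaling from source to destination res."""
    return (dst[0] / src[0], dst[1] / src[1])

desktop = (1366, 768)                       # PC set to the panel's native res

print(scale_factor((1280, 720), desktop))   # 720p: both factors > 1, upscaled
print(scale_factor((1920, 1080), desktop))  # 1080i: both factors < 1, downscaled
```

Either way the PC does the work and hands the screen a 1366x768 picture, which is why both files "play fine" on the same output.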
A couple of things to consider:
DVI should give a pixel-perfect reproduction on screen, as long as you are at native res, i.e. the PC is set to the same resolution as the screen's native res, usually 1366x768. (The res will actually be 1360x768, as graphics cards need a width divisible by 8 - you'll be missing three columns of pixels on the left and right, which is unnoticeable.) BUT!!!! Not all LCD screens accept native res over DVI; most max out at 1024x768. You'll have to investigate further once you've decided on a suitable screen. Some can do it, some can with a workaround, some can't.
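The divisible-by-8 point is just integer rounding. A quick sketch (the helper name is made up for illustration) showing where 1360 and the three lost columns per side come from:

```python
# Sketch: a 1366-wide panel driven by a card that needs widths in steps of 8.
def usable_width(panel_width, step=8):
    """Round the panel width down to the nearest multiple of `step`."""
    return (panel_width // step) * step

w = usable_width(1366)
print(w)                # 1360 - the res the card actually outputs
print((1366 - w) // 2)  # 3 - unused pixel columns on each side of the panel
```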
D-sub will give good results, but not quite as good as DVI. Text will be noticeably fuzzier, but video should be almost indistinguishable from DVI.
HDCP is only a consideration when you are trying to watch encrypted content, such as HD-DVD or Blu-ray HD movies. They will require an HDCP-compatible output on your graphics card... I haven't seen any yet, and don't know if it's the kind of thing that can be activated with a driver update... We'll have to see. At the moment, though, you don't need to worry about it (with regard to using a PC). Sky HD will use it (over its HDMI output), and upscaling DVD players use it, but your original question was about PC use.