According to the Philips 37PF9830 thread, it cannot accept 1080i via DVI (HDMI + adapter) but does display 1080i via VGA.
"On my TEVION" (I know I start a few sentences that way; hopefully not for much longer) however..... DVI is very good: not 1:1 pixel mapping, but very clear. On the other hand, VGA has a shimmering effect on the edges of the visible window whilst web browsing, yet looks more "natural" when watching HD AVI video.
So unfortunately, no conclusive sweeping generalisation from me (that's unusual!).
Say you have a PC with DVI and D-Sub outputs, and an LCD TV with a native resolution of 1366x768.
Scenario 1: You play a 720p file (1280x720) in, say, Windows Media Player or TheaterTek. Your PC is set to 1366x768, so the PC upscales the image from 1280x720 to 1366x768.
Scenario 2: You play a 1080i file (1920x1080). Your PC is set to 1366x768, so the PC downscales the image from 1920x1080 to 1366x768.
So, if you connect a PC using D-Sub or DVI, a 720p file will have all its detail intact and will look fine. A 1080i file will have SOME detail removed (as it is downscaled), but will still play fine and look good. The upscaling and downscaling are all handled by your PC/graphics card.
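To make the two scenarios concrete, here's a rough sketch of the per-axis scale factors involved (purely illustrative; the actual scaling is done by the graphics card/video renderer, and the helper name is made up for this example):

```python
# Sketch of the scaling in the two scenarios above (illustrative only;
# the real work is done by the GPU/video renderer).
def scale_factor(src, dst):
    """Return (horizontal, vertical) scale factors from source to destination."""
    (sw, sh), (dw, dh) = src, dst
    return dw / sw, dh / sh

panel = (1366, 768)

# Scenario 1: 720p upscaled to the panel -> both factors > 1 (upscaling)
up = scale_factor((1280, 720), panel)

# Scenario 2: 1080i downscaled to the panel -> both factors < 1,
# so some source detail is necessarily discarded
down = scale_factor((1920, 1080), panel)
```

Any factor above 1 means the PC is inventing pixels (upscaling); below 1 means it is throwing detail away (downscaling), which is why 1080i loses some detail on a 1366x768 panel while 720p keeps all of its own.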
Couple of things to consider:
DVI should give pixel-perfect reproduction on screen, as long as you are at native res, i.e. the PC is set to the same res as the screen's native res, usually 1366x768. (The res will actually be 1360x768, as graphics cards need a horizontal res divisible by 8; you'll be missing three columns of pixels on the left and right, which is unnoticeable.) BUT!!!! Not all LCD screens accept native res over DVI. Most max out at 1024x768. You'll have to investigate further once you've decided on a suitable screen. Some can do it, some can with a work-around, some can't.
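The 1366-vs-1360 quirk is just rounding the width down to a multiple of 8. A minimal sketch (the function name is my own, not anything from a driver API):

```python
# Why the PC reports 1360x768 instead of 1366x768: graphics drivers
# typically require the horizontal resolution to be a multiple of 8.
def snap_width(width, multiple=8):
    """Round a horizontal resolution down to the nearest multiple."""
    return width - (width % multiple)

snapped = snap_width(1366)       # -> 1360
missing = 1366 - snapped         # 6 pixels total, i.e. 3 per side
```

Six pixels out of 1366 is under half a percent of the width, which is why the missing columns go unnoticed in practice.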
D-Sub will give good results, but not quite as good as DVI. Text will be noticeably fuzzier, but video should be almost indistinguishable from DVI.
HDCP is only a consideration when you are trying to watch encrypted content, such as HD DVD or Blu-ray HD movies. Those will require an HDCP-compatible output on your graphics card... I haven't seen any yet, and I don't know if it's the kind of thing that can be activated with a driver update... We'll have to see. At the moment, though, you don't need to worry about it (with regard to PC use). Sky HD will use it (over its HDMI output), and upscaling DVD players use it, but your original question was about PC use.
The quality difference between DVI and D-Sub (VGA) is there, but it's not night and day.
I have my PC connected to my TV running at the screen's native resolution (well, set to 1360x768 on the PC; the TV is 1366x768 native), and I really can't see any difference between DVI-D and analogue (VGA) unless I look extremely closely at the edges of the windows on the screen, and even then it's minor.
If your TV has a free DVI port, you may as well use it, but you won't be losing out as such by using VGA.