A strange problem has just occurred with my setup. I have the LG 37" LCD and an HTPC with an ATI 8500. Going HDMI>DVI to the 8500 gave an instant, pixel-perfect 1360x768 resolution. VGA was also fine. Both 60Hz only.

While waiting for my X1600 to arrive I put my ATI 9700 Pro in. This also detected 1360x768, and with this card I could choose the refresh rate on DVI via PowerStrip - I could run 50Hz via DVI... or 75Hz... or 85Hz!

My X1600 finally turned up, but it doesn't work like the other two cards. VGA is fine. When hooked up via DVI I get 1360x768, but it is underscanned by about an inch all around!! The TV reports 1360x768 @ 60Hz and is set to 'PC' mode. The only way I can remove the underscan is via VGA, or by running 720p via DVI in 'DTV' mode (the TV overscans in DTV mode).

The really strange thing is that prior to driver installation, running on the Windows generic VGA driver, there is no underscan! I've tried both the latest ATI drivers and the Omegas - both the same. I've fiddled all over but I cannot rectify this. The ATI driver does at some point pop up a box mentioning I am connected via DVI to HDMI - this did not happen with the two older cards. I'm wondering if it is deliberately underscanning because it thinks an HDMI connection means an RPTV or something (that needs underscan)?

Frustrating, because the X1600 is a noticeable step up from the 9700, but I've lost DVI and 50Hz - that's a trade-off I cannot make.

Help, anyone?