Help!
I have been trying to output video from the VGA port of my ATI Radeon 9600XT to my TV via RGB SCART.
I can successfully connect the HTPC to the TV via s-video, but I thought I'd try this other route for improved quality. After wrestling with this for the last three days, I'm seriously considering giving up and just going down the s-video route. The quality difference IS worth it, isn't it?
I live in the UK (PAL broadcasts), and my TV is a Loewe Aconda (CRT, Widescreen, 100Hz). It has 3 SCART sockets on the back, and I have manually set the 'AV2' SCART to expect RGB input via the OSD.
I am using a VGA > SCART cable I wired up as described here: http://www.idiots.org.uk/vga_rgb_scart/
...I have tried with and without the 'Blanking' connection.
I have PowerStrip installed, and I use WinVNC to connect to the HTPC and change the display settings remotely. I have tried switching between Interlaced & Progressive, and Composite Sync. I've tried many resolutions, including 640x480p@60Hz, 640x480i@60Hz, 640x540p@50Hz, 640x540i@50Hz, 640x540i@100Hz, 640x576i@50Hz, 640x576i@100Hz.
I've tried stepping up and down through the various resolutions and frequencies, hoping that I would get close to displaying a stable computer image.
I've attached a photograph of the best result (a flickering image) I have achieved so far on the TV. This photograph was taken using PowerStrip settings of Interlaced, 720x540, 100Hz...
[Custom Resolutions]
720x540=720,50,72,126,540,25,5,54,60403,312
This image 'travels' up and to the left on the TV.
Toggling the 'composite sync' checkbox ON / OFF makes no difference to the TV image.
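For what it's worth, here's the quick sanity check I did on the scan rates that timing string implies. I'm assuming the PowerStrip fields are H active / front porch / sync / back porch, then the same four for vertical, then the pixel clock in kHz (the last value I take to be field lines or flags) — if that reading is wrong, the numbers below are too:

```python
# Scan-rate check for the PowerStrip custom-resolution string
# "720x540=720,50,72,126,540,25,5,54,60403,312".
# Field layout below is my assumption about PowerStrip's format.

h_active, h_front, h_sync, h_back = 720, 50, 72, 126
v_active, v_front, v_sync, v_back = 540, 25, 5, 54
pixel_clock_hz = 60403 * 1000  # 60.403 MHz, if that field really is in kHz

h_total = h_active + h_front + h_sync + h_back  # total pixels per line
v_total = v_active + v_front + v_sync + v_back  # total lines per frame

h_freq = pixel_clock_hz / h_total  # line (horizontal) rate in Hz
v_freq = h_freq / v_total          # refresh (vertical) rate in Hz

print(f"Horizontal: {h_freq / 1000:.1f} kHz, Vertical: {v_freq:.1f} Hz")
```

By that reckoning the mode runs at roughly 62.4 kHz / 100 Hz, whereas (as I understand it) a PAL RGB SCART input wants something close to 15.625 kHz / 50 Hz — which might explain the rolling picture, if I've read the string correctly.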
1) Do I need to flash my Radeon's BIOS with RadEdit to enable correct RGB SCART output, even though I have PowerStrip installed?
2) Is this flickering effect normal during PowerStrip setup, or is it my hardware?
Thanks in advance,
Marc.