Just wondering if anyone knows this one: I have a Sapphire Atlantis Radeon 9600 (non-Pro). It has a VGA output port and a DVI output port, and I have a DVI-to-VGA adaptor plugged into the DVI port, meaning I now have two VGA ports.

Does anyone know whether the real VGA port or the VGA port driven from the DVI output gives the better quality image? There must be some component differences on the circuit board, with one of the signals going through more circuitry than the other. Anyone know, or have a guess?

Also, does anyone know how to tell Windows/the ATI software which screen is attached to which port?