We had a discussion on this forum earlier about whether different DVD players and/or DD/DTS decoders can sound different (discounting analog-stage differences). Some of us came to the conclusion that this was not possible from a signal-theory point of view, that DD/DTS streams are immune to jitter, and that all audible differences are imaginary.

Well, guess what? The DD/DTS stream is not immune to jitter - or to be more precise, the D/A stage after the DD/DTS decoding and decompression stage is NOT. Why? Because the clock for the bitstream feeding the DAC is derived from the incoming DD/DTS bitstream in order to keep sync correct. Supposedly some really high-end receivers/decoders use their own high-precision crystal to re-clock the data after the DD/DTS decode/decompress stage, so that jitter becomes almost a non-issue for the DAC. But most devices do NOT do this. So the quality of the DVD transport spewing out the DD/DTS bitstream can, in theory, affect sound quality.

There is an ongoing thread about this in the AVSForum Home Theater PC section.

Cheers,
Halcion

References: Julian Dunn, "Jitter Theory", TechNote TN-23, Audio Precision: "There are many circumstances in which a sample clock must be derived from an external source. In a digital audio recorder or a digital surround processor, for example, the sample clock controlling the DAC is extracted from the input data stream."
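P.S. To put rough numbers on the effect Dunn describes, here's a quick Python sketch. It models sampling-clock jitter as small random timing errors on the DAC's sample instants and measures the resulting noise against an ideal sine. The 1 ns RMS jitter figure and the 10 kHz test tone are assumptions for illustration, not measurements of any real player:

```python
import numpy as np

# Illustrative sketch: how sampling-clock jitter limits DAC SNR.
# All parameter values below are assumed for the example.
fs = 48_000          # sample rate, Hz
f = 10_000.0         # test tone, Hz (jitter error grows with signal frequency)
sigma_j = 1e-9       # assumed RMS clock jitter: 1 nanosecond
rng = np.random.default_rng(0)

n = np.arange(fs)                      # one second of samples
t_ideal = n / fs                       # perfectly spaced sample instants
t_jittered = t_ideal + rng.normal(0.0, sigma_j, size=n.size)

ideal = np.sin(2 * np.pi * f * t_ideal)
jittered = np.sin(2 * np.pi * f * t_jittered)  # what a jittery clock produces

noise = jittered - ideal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(noise**2))
print(f"SNR limit from {sigma_j*1e9:.0f} ns jitter at {f/1000:.0f} kHz tone: {snr_db:.1f} dB")
```

The simulated figure should land close to the textbook approximation SNR = -20*log10(2*pi*f*sigma_j), about 84 dB here - i.e., even nanosecond-level jitter eats into the theoretical dynamic range of a 16-bit signal at high audio frequencies, which is why re-clocking with a clean local crystal matters.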