This is a point which has always troubled me. It's fairly well established by now that even a rather poor digital interconnect won't be poor enough to introduce any actual bit errors. (There was the famous case of someone getting no bit errors at all while using a wire coat hanger as a digital interconnect.)

I think we've also managed to establish that, for movie soundtracks (DD, DTS, etc.), jitter is completely irrelevant anywhere upstream of the processor, because the data is compressed and therefore must, by definition, be buffered and reclocked before it reaches the processor's DACs. (So jitter in the player or the interconnect can't have any influence on the eventual sound.)

But jitter certainly can have a big impact on the performance of a processor playing back a PCM stream, e.g. when a DVD player is acting as a CD transport and the processor's DACs are doing the conversion. This fact is often quoted as the reason why a more expensive digital interconnect can influence sound quality. But I have yet to come across any explanation of how a cable can introduce jitter. How is it possible for one pulse to pass through it at a different speed from the next one?
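To put a rough number on the PCM case, here is a minimal Python sketch (my own illustration, not from any measurement) that samples a full-scale sine wave with and without random timing jitter on the conversion clock and reports the resulting SNR. The 1 ns RMS jitter figure and the 10 kHz test tone are arbitrary assumptions chosen just to make the effect visible.

```python
import numpy as np

# Assumptions (illustrative only): 44.1 kHz PCM, a 10 kHz full-scale
# test tone, and 1 ns RMS of random sampling-clock jitter.
fs = 44_100          # sample rate (Hz)
f = 10_000           # test-tone frequency (Hz)
t_j = 1e-9           # RMS clock jitter (seconds)
n = 1 << 16          # number of samples

ideal_times = np.arange(n) / fs
jittered_times = ideal_times + np.random.normal(0.0, t_j, n)

ideal = np.sin(2 * np.pi * f * ideal_times)
jittered = np.sin(2 * np.pi * f * jittered_times)

# Treat the difference between the jittered and ideal samples as noise.
noise = jittered - ideal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(noise**2))
print(f"simulated SNR:   {snr_db:.1f} dB")

# Theory predicts SNR = -20*log10(2*pi*f*t_j) for a sinusoidal input.
print(f"theoretical SNR: {-20 * np.log10(2 * np.pi * f * t_j):.1f} dB")
```

With these numbers both figures come out around 84 dB, which is below the ~96 dB quantization floor of 16-bit PCM, i.e. jitter noise at that level would dominate. That is why clock jitter at the converting DAC can plausibly matter, while jitter upstream of a buffered, reclocked compressed stream cannot.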