After reading lots of threads with lots of contributions from people who don't realise the theory behind what high-end cables are trying to do, I thought I'd post this.
I'm only talking about cables carrying high frequency digital signals (HDMI, SPDIF, TOSlink) here; low frequency signal cables (speaker, mains) have their own completely different set of problems, and they're a whole different ballpark.
High Frequency Cables (HDMI, SPDIF, TOSlink, etc)
These all pass a digital signal, so the voltage on the wire is switching between two states, "on" and "off": 5 volts (or 3.3V) and 0 volts. (TOSlink does the same thing with light rather than voltage, but the principle is identical.) In an ideal world the switch from 5V down to 0V (or 0V up to 5V) would be absolutely instantaneous; in reality it takes a finite time.
No cable is completely transparent to the electricity flowing down it; every cable has resistance, capacitance and inductance.
In simple terms this means that the 5V you put in one end is reduced a bit by the time it comes out the other end, and (with a high frequency signal) the 0V level actually rises a little.
It also means that the transition (from a 1 to a 0, or the other way around), which took 'X' nanoseconds at the input to the cable, now takes slightly longer: it's been "stretched". If you looked at the waveform on a scope you would see the sharp right-angle corners have been rounded off:-
The blue trace is our input signal: nice, sharp, almost instant changes from one level to the other. The purple trace is the decayed output signal. The sharp changes are gone, and you can see that it wouldn't have to get much worse before the signal starts to become unusable.
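If you haven't got a scope handy, here's a minimal Python sketch of the same effect. It treats the cable as a simple first-order RC low-pass filter, which is a big simplification (a real cable is a transmission line, and the RC value here is made up purely for illustration), but it reproduces the rounded corners described above:

```python
import numpy as np

def rc_filter(signal, dt, rc):
    """First-order RC low-pass: the output charges/discharges towards the input."""
    out = np.zeros_like(signal)
    alpha = dt / (rc + dt)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

dt = 1e-10                                          # 0.1 ns time step
t = np.arange(0, 100e-9, dt)                        # 100 ns window
square = np.where((t // 10e-9) % 2 == 0, 5.0, 0.0)  # ideal 5V square wave, 10 ns per state

clean = square                                # the "blue trace"
decayed = rc_filter(square, dt, rc=2e-9)      # the "purple trace"

# The 10-90% rise time of a first-order RC is roughly 2.2 * RC (~4.4 ns here),
# so each "instant" edge now eats a visible chunk of the 10 ns bit period.
```

Plot `clean` and `decayed` against `t` and you get exactly the blue/purple picture: increase `rc` (a longer or worse cable) and the edges slump further until neighbouring bits start to smear into each other.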
The longer the cable, the more these effects happen.
What effects will these problems have?
Well, at the input to the cable the difference between the 5V "on" and the 0V "off" is obviously 5V. If the cable is short we'll get maybe 4.5V and 0.5V out: a difference of 4V, no problem. Now send that down a very long cable and we get 3V and 2V out of it: a difference of only 1V, too small for the receiving end to reliably decode the signal.
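To put rough numbers on that, here's a toy sketch of the margin arithmetic. The loss-per-metre figure and the 1V decision threshold are invented for illustration; real receivers specify their own thresholds:

```python
def levels_after_cable(high=5.0, low=0.0, loss_per_m=0.15, length_m=1.0):
    """The high level sags and the low level rises as the cable gets longer."""
    sag = loss_per_m * length_m
    return high - sag, low + sag

threshold = 1.0  # assume the receiver needs at least 1V between the two states

for length in (1, 5, 10, 15):
    hi, lo = levels_after_cable(length_m=length)
    margin = hi - lo
    verdict = "OK" if margin > threshold else "FAIL"
    print(f"{length:>2} m: high={hi:.2f}V low={lo:.2f}V margin={margin:.2f}V {verdict}")
```

With these made-up numbers the margin is fine at 1m and 5m, comfortable at 10m, and collapses by 15m, which is the same story as the short-cable/long-cable example above.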
The biggest problem, however, is illustrated by the image above. HDMI signals carry a huge amount of data, so they need an extremely high frequency signal to carry it all, which means the time taken to switch states (0 to 1 or 1 to 0) is extremely small, and the time between state switches is just as small.
Feed the clean signal into an HDMI receiver and it'll have no problem decoding it. Feed the above "decayed" signal in and the HDMI receiver would probably have no problem decoding 99.99% of it correctly, an error rate of 0.01%. Feed an even worse signal into it and the error rate goes up.
Say 99% of the signal is received correctly and the other 1% is flagged as errors. Error correction might repair most of those (say 0.9% of the total signal), leaving the remaining 0.1% uncorrected.
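Working those percentages through (using the post's illustrative rates, plus my own simplifying assumption that one uncorrected error spoils one pixel of a 1080p frame, which is cruder than what really happens on the wire):

```python
raw_error_rate = 0.01                       # 1% of the signal flagged as errors
corrected = 0.009                           # 0.9% of the total can be repaired
uncorrected = raw_error_rate - corrected    # the 0.1% that gets "guessed"

pixels_per_frame = 1920 * 1080              # one 1080p frame
bad_pixels = pixels_per_frame * uncorrected
print(f"~{bad_pixels:,.0f} guessed pixels per frame")   # ~2,074
```

Even a seemingly tiny 0.1% uncorrected rate would mean a couple of thousand guessed pixels in every single frame.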
The receiver has to "guess" what the signal should have been in those error spots and fill in the blanks with its best guess, which, unless you're NASA with an unlimited budget, won't be absolutely correct.
Anyone with a square wave generator and a decent scope can see this effect for themselves: connect the scope directly to the generator and the square wave is nice and crisp. Put the signal through 100m of 2+1 mains cable and it'll likely look like a nice, curvy sine wave. The sharp transitions have been destroyed.
Cables can be designed to minimise these effects, but they cost more to manufacture (better quality conductors, insulators and spacers; better manufacturing processes; better terminators/plugs/sockets).
I'm not an expert as to where these errors would be most noticeable, but I'd hazard a guess. In large continuous areas of colour (light or dark) you'd notice "noise", because the "guesses" made to fill in the uncorrectable errors wouldn't exactly match the surrounding colour. Errors are also more likely to occur in the high frequency parts of the signal: in video that's the sharply defined edges (they'd become slightly blurry), and in audio it's the high frequency component (the ambience, transparency and fidelity would suffer).
Please note I am NOT defending any particular person, magazine, review, cable or manufacturer. I'm simply trying to explain *why* there can be a difference between a poor quality cable and a good quality cable and having a stab at how the resulting problems might be observed.