Actually, I think what Cam means is: is there a difference between a digital interconnect and one half of a stereo interconnect?
If that's the case, the answer, as always, is "it depends".
It depends on what the stereo lead is made from and what the digital lead being compared is made from.
Stereo analogue cables tend to be 50 ohm microphone cable terminated with a flash RCA connector.
Digital interconnects should be 75 ohm cable terminated with a 75 ohm connector (usually BNC). It is possible to get RCA connectors that are close to 75 ohm, Canare being one.
Although you could use an analogue 50 ohm cable as a digital interconnect, ideally you want to use the correct spec cable. Correct is not expensive, and there are many options in that regard. I use Belden 1505A with Canare crimp-on BNCs or RCAs as appropriate.
The 75 ohm bit is the issue. RCA sockets and plugs are not 75 ohm interfaces. Crushing the dielectric, melting it when soldering, and the design of the plug itself all affect the impedance across the frequency range.
Canare's crimp-on RCA plugs are, I believe, the closest to 75 ohm over the required bandwidth.
If you want a good article on digital interconnects, Beekeeper's article about cables is good, and TAG's white paper on their digital interface and cable designs is quite interesting.
RCA was never designed as a 75ohm connector, BNC was. I understand that the new Eichmann Bullet plugs get close to 75ohms as well.
It should be remembered that the 75 ohms refers to the impedance; this is not just resistance, as it incorporates the complex inductive and capacitive properties of the cable.
If the entire cable, the terminations and the sockets they're attached to all have a 75 ohm impedance, then the whole thing is impedance matched. If the impedances aren't matched at some point, then part of the electrical signal will be reflected back where it came from, creating standing waves. If you're feeling brainy, look here: http://en.wikipedia.org/wiki/Impedance_matching
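To put a rough number on those reflections: at any impedance step, the fraction of the voltage wave bounced back is (Z_load - Z_source) / (Z_load + Z_source). A quick sketch, using the 50 vs 75 ohm figures mentioned above purely as illustrative values:

```python
def reflection_coefficient(z_load, z_source=75.0):
    """Fraction of the incident voltage wave reflected at an impedance step."""
    return (z_load - z_source) / (z_load + z_source)

def reflected_power_fraction(z_load, z_source=75.0):
    """Fraction of power reflected: the square of the voltage reflection."""
    return reflection_coefficient(z_load, z_source) ** 2

# 75 ohm into 75 ohm: no reflection at all.
# 50 ohm load on a 75 ohm line: |gamma| = 0.2, i.e. 4% of the power
# bounces back up the cable instead of reaching the receiver.
```

So a single 50 ohm RCA interface on an otherwise 75 ohm run reflects a small but real slice of the signal, and each mismatched joint adds another.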
Impedance matching is crucial for maximum power transfer, which is of course what we're trying to achieve.
I've yet to come across a designer who even considers impedance matching the solder joints, though. Just using a 75 ohm cable and a 75 ohm termination isn't really good enough if they're not connected well.
As far as I understand it, coaxial is better for shorter (standard) lengths because:
when the signal leaves the DVD player it stays electrical, so no conversions are necessary, whereas optical involves a conversion at both the DVD player and the amp, from electrical to optical and back again.
Optical is better for long distances.
Please correct me if I'm wrong.
Hi, there is no conversion of the signal for electrical versus optical. All that happens is that there is an LED at the transmitter end instead of an RCA plug. If you looked at the two signals they would be identical.
I would say optical is better because it provides electrical isolation at the signal level.
Don't be fooled by some optical connectors with "gold surfaces for improved surface contact". This is rubbish. Look inside the connector on your DVD player and amp... there isn't any metal to make an electrical connection.
The cables are more fragile than electrical cable, though.
I understand the difference in impedance, but what is the audible result? My ears are definitely not trained, but I could not hear a difference when listening through a dedicated digital cable and a simple composite RCA cable. I guess it's no different from a single stereo RCA, right?
By the way, it is clear that quality is important in analogue connections, because a cable may attenuate some frequencies more than others. This, however, should not hold for digital signals, as long as the 1s and 0s get through intact. Is this correct?
It is correct. I've got a 10 m electrical digital link that's just twisted pair, which will have a characteristic impedance of about 110 ohm. It doesn't reject noise and interference in the cable, though, and that causes the decoder to lose signal lock.
I then tried using audio cable, but this had such a wild impedance (i.e. it changed along the cable) that similar errors occurred.
Using 10p/m aerial downlead solved all the problems.
The digital 0s and 1s go down the cable at a rate of about 500 kb/sec, although this can be more for DTS and less for stereo. If any get lost or overwritten due to noise in the cable, then the decoder has to "work out" what was missing using error correction. This is normally quite reliable.
The decoder has to lock onto the incoming signal using various techniques. In with the audio data there are also timing signals that ensure everything runs together; should these go, you lose the lock and the sound drops out.
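For the curious, that embedded timing works because S/PDIF uses biphase-mark coding: every bit cell starts with a level transition (that's the clock the receiver locks onto), and a "1" adds an extra transition mid-cell (that's the data). A toy encoder, just to illustrate the idea, not the actual spec framing:

```python
def bmc_encode(bits, start_level=0):
    """Biphase-mark encode bits into a list of half-cell line levels (0/1)."""
    level = start_level
    half_cells = []
    for bit in bits:
        level ^= 1              # transition at the start of every cell: the clock
        half_cells.append(level)
        if bit:
            level ^= 1          # extra mid-cell transition encodes a '1'
        half_cells.append(level)
    return half_cells

# Every cell boundary has a guaranteed transition regardless of the data,
# which is why a degraded cable tends to lose lock (dropouts) rather than
# silently flipping the odd bit.
```

Note the line rate is double the data rate, which is part of why cable bandwidth matters more than the raw audio bitrate suggests.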
Losing the clock will be noticeable; losing audio data won't be, unless it's quite substantial.
What you are more likely to lose are the high-frequency signals. A loss of digital data can sometimes be heard as a high-frequency whine (sometimes audible in cinemas) or as a strange running-water sound in high-pitched material. This is the decoder doing its error correction.
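That running-water artefact makes sense if you sketch what the simplest concealment does: a sample flagged as bad is replaced by interpolating its neighbours, which works fine on smooth low-frequency material but smears fast-moving treble. A toy version (real decoders are considerably more sophisticated):

```python
def conceal(samples, bad_index):
    """Replace one corrupted sample with the average of its neighbours."""
    samples[bad_index] = (samples[bad_index - 1] + samples[bad_index + 1]) // 2
    return samples

# On a slowly changing waveform the interpolated value is close to the
# original; on high-frequency content the guess is badly wrong, which is
# what you hear as whine or "running water".
```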
If your cable is less than 1 m you shouldn't have to worry about it, but beyond that proper 75 ohm cable is needed to prevent standing waves in the cable, which (if you could view the signal on a TV) would appear as "ghosting" of the image. This would also cause errors.
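A rough back-of-the-envelope for that 1 m rule of thumb: a reflection only matters once its round-trip time becomes a significant fraction of a bit cell. Assuming a typical coax velocity factor of about 0.66 and the 44.1 kHz S/PDIF line rate (both illustrative figures):

```python
C = 3.0e8  # speed of light, m/s

def round_trip_ns(length_m, velocity_factor=0.66):
    """Time for a reflection to travel to the far end of the cable and back."""
    return 2 * length_m / (velocity_factor * C) * 1e9

def spdif_half_cell_ns(sample_rate=44100):
    """Duration of one biphase half-cell: 64 time slots per frame,
    two half-cells per slot."""
    line_rate = sample_rate * 64 * 2
    return 1e9 / line_rate

# 1 m cable: reflections die away in ~10 ns, well inside a ~177 ns half-cell.
# 10 m cable: ~100 ns round trip, comparable to the half-cell, so mismatch
# starts to smear the signal edges and threaten the lock.
```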
Optical can actually be a VERY good connection method, but again the devil is in the detail. Electrical isolation is excellent, but you need a decent optical cable like VDH, Chord or QED, not the thin stuff. Despite popular opinion there is not an extra stage of conversion, as both coax and Toslink are converted to I2S for the internal digital connections in the kit. For noisy kit I prefer optical; for other kit I use the same as Gordon. Both are capable of quality solutions IF implemented well.
Good to see Toslink now being classed as an acceptable connection these days (what's the world coming to, eh?). FWIW, RCA is more than acceptable to many audiophiles. A lot is made of a true 75 ohm connection being critical, but for many, many people the RCA SPDIF connection (i.e. non-75 ohm) does what is required of it, i.e. getting a signal from A to B without break-up of the 1s and 0s. Always has been, FWIW.
Whether a lot of people could tell the difference between cable types or connection types in blind tests, well, that's a different story. All I can say is that I have tried and I cannot.
BTW, seeing as we are on an AV forum, has anyone ever questioned why jitter in cables and connector types is never that detectable when passing a 5.1 signal over a length of cable regarded by audiophiles as sacrosanct? i.e. a cable-believing audiophile will swear blind that they can hear differences in cables/connectors, but could they do so with a 5.1 system that an AV user enjoys?