
Digital Interconnection (Technical)

Discussion in 'AV Receivers & Amplifiers' started by Demon, Jan 17, 2003.

  1. Demon

    Demon
    Active Member

    Joined:
    Dec 13, 2000
    Messages:
    954
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    31
    Location:
    Cheltenham
    Ratings:
    +57
    I must be missing something...

    There seem to be a massive number of people asking about digital connections, Optical or Coaxial...

    And there seems to be a substantial number of people offering evidence to suggest that there is an audible performance difference between the two.

    I think I may have totally misunderstood what the 'Digital' link was being used for...

    I admit that I don't know the electrical specification of the physical layer, and so am not sure if the term 'bitstream' is the same term I would apply to any digital communications system.

    I just thought it was another single-'wire' asynchronous serial communications bus, like LAN/CAN/RS232 etc....

    If it is a digital comms channel, then how can there be any difference between the two?

    I'm assuming that both the coaxial / optical cables are of sufficient construction (not expensive) to allow the physical layer specification to be met...

    Is there no error checking at the physical/protocol levels?

    People are adamant that coaxial sounds much better...

    So I can only think that
    1. They are comparing normal analogue line-level connections against a digital connection... which would sound different, but has nothing to do with the coaxial/optical comparison... and how can people be so confused?
    2. There is a massive amount of lost/corrupt data, and so either the transmitter or the receiver is not capable of meeting the electrical specification.
    3. The cables are of such poor quality that they just don't meet the specifications (probably borrowed from some sparkly Christmas decoration, with a couple of Toslinks stuck on the ends with superglue).
    4. The physical layer has been so badly designed that it is susceptible to almost the same level of noise as an analogue channel.


    I'm just wondering about the new Serial ATA hard drive technology: will this make my data look different to Parallel ATA??? Will my MP3s sound different?

    Got to stop, sarcasm overload....

    Someone please explain where I am going wrong!!!!!!!

    Dolby say...
    "Under most conditions, optical and coaxial digital connections work equally well. Under some rare circumstances, however, coaxial cables, particularly very long ones, can pick up radio frequency (RF) interference generated by household appliances, or nearby high-tension power lines or broadcast towers.

    If cost is a consideration, start with coaxial, which is less expensive. If you then hear RF interference, you can try relocating the cables, moving your components closer together so you can use shorter cables, or, if all else fails, changing to costlier optical cable. If cost is no object, using high-quality optical cables from the outset is probably your best long-term choice.

    Note: some DVD players and Dolby Digital decoders have either a coaxial or an optical connector. Be sure that the units you purchase both use the same type.
    "
    which sort of makes me think it is a truly digital serial comms bus.... and the 'high-quality optical' long-term solution is about mechanical durability....
     
  2. Nic Rhodes

    Nic Rhodes
    Well-known Member

    Joined:
    Mar 23, 2001
    Messages:
    17,133
    Products Owned:
    0
    Products Wanted:
    1
    Trophy Points:
    133
    Location:
    Cumbria
    Ratings:
    +1,277
  3. Silent Fly

    Silent Fly
    Guest

    Products Owned:
    0
    Products Wanted:
    0
    Ratings:
    +0
    I am kind of out of breath with interconnect discussions and, to be honest, I don't think it's worth the time. It looks to me more like a matter of faith than scientific observation.

    Do you remember The Matrix?

    "You take the blue pill and the story ends. You wake in your bed and you believe whatever you want to believe.

    You take the red pill and you stay in Wonderland and I show you how deep the rabbit-hole goes."


    In my opinion the answer is: "whatever makes you happy".
     
  4. NicolasB

    NicolasB
    Well-known Member

    Joined:
    Oct 3, 2002
    Messages:
    5,870
    Products Owned:
    1
    Products Wanted:
    0
    Trophy Points:
    136
    Location:
    Emily's Shop
    Ratings:
    +564
    To give you a brief answer, Demon, there are a couple of differences between transmitting pulses along a computer network cable and transmitting audio information.

    1) PCM data (e.g. a CD audio track) does NOT contain any error correction. (There is plenty of error correction available at the point where the data is being read off the disc, but typically none after that).

    2) The cables move around more, so a poorly made cable is more likely to suffer mechanical failure.

    3) By far the most significant difference is the importance of jitter. The sound signal is eventually being fed to a DAC. For the purposes of this discussion a DAC can be regarded as a device which accepts a sequence of pulses at the input (a digital signal) at regular intervals and outputs a smoothly varying voltage (analogue signal) whose amplitude depends on the digital signal.

    Let's imagine that in between two consecutive sample values the output is supposed to change from 0.50 volts to 0.51 volts. In order for the "shape" of the output signal to be correct, three things have to happen: i) the correct sequence of pulses has to arrive (if it doesn't then you have an actual bit error, but this is rare); ii) the DAC has to produce the correct voltage for a given input value; iii) the DAC has to produce this voltage at exactly the right time.

    If the output signal hits 0.51 volts slightly too early then the signal will be rising too fast. If it's too late it will be rising too slowly. We're talking a minimum of 44,100 samples per second, so even quite tiny errors in the timing of the pulses can result in noticeable levels of distortion in the output signal.

    This is what is called "jitter" and it's responsible for most of the distortion that one gets in digital audio circuitry (once you get past more basic problems such as noise and interference). This sort of effect doesn't matter in a computer network because there is no digital to analogue conversion at the end - so long as the correct pulses arrive the correct (digital) data can be extracted.
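
    To put a rough number on that timing sensitivity, here's a toy simulation (Python; figures and names are purely illustrative, not measurements from any real DAC) that jitters the conversion instants of a 10 kHz tone and measures the resulting error floor:

        import numpy as np

        # Toy model: a DAC that outputs each sample at a slightly wrong
        # instant; we measure the equivalent amplitude error this causes.
        fs = 44100.0                      # CD sample rate (samples/s)
        f = 10000.0                       # 10 kHz test tone
        t = np.arange(44100) / fs         # one second of samples

        ideal = np.sin(2 * np.pi * f * t)

        # Assume each conversion instant is off by up to +/-1 ns (random jitter)
        jitter = np.random.uniform(-1e-9, 1e-9, size=t.shape)
        jittered = np.sin(2 * np.pi * f * (t + jitter))

        err = jittered - ideal
        snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(err**2))
        print(f"Error floor with 1 ns jitter: {snr_db:.1f} dB below the signal")

    Run it and the error floor with just 1 ns of jitter comes out around 89 dB down on a 10 kHz tone, i.e. already encroaching on the ~96 dB theoretical floor of 16-bit audio, which is why nanosecond-level timing matters here in a way it never does on a LAN.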

    The reason why optical connections are sometimes less effective than coaxial electrical connections has nothing at all to do with the transmission through the cable (usually, although cheap optical cables can develop microfractures); it's because the signal has to be converted from an electrical to an optical signal at one end of the cable, and then back to an electrical signal again at the other end. The circuitry used to do this can often introduce jitter.
     
  5. Jeff

    Jeff
    Well-known Member

    Joined:
    Jul 13, 2000
    Messages:
    5,489
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    106
    Location:
    Basingstoke
    Ratings:
    +256
    What about reflection problems in short cable lengths?
     
  6. alexs2

    alexs2
    Well-known Member

    Joined:
    Nov 13, 2002
    Messages:
    13,895
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    136
    Ratings:
    +1,674
    The other point about optical cables and jitter is the interface between the cable and the output port at one end and the input port at the other..... dirt, stray light, etc. at this point can induce data loss and jitter too.

    Jitter levels vary vastly between transports, and may be responsible for transports sounding different when in theory they should not (Paul Miller of Miller Audio Research has been responsible for a lot of the analysis and interpretation of this).
    Certainly to my ears, digital optical transmission sounds significantly worse than using a high-quality wire, but a lot also comes down to the quality of the DACs, clocks, and any DSP units in the signal path.
     
  7. Demon

    Demon
    Active Member

    Joined:
    Dec 13, 2000
    Messages:
    954
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    31
    Location:
    Cheltenham
    Ratings:
    +57
    Thanks guys, I've read all the threads now, and although I still have questions it's clear that I may have misunderstood the AC-3 protocol....

    I understand jitter, thanks for the explanations though, at least I know we are all singing from the same hymn sheet... and so I can understand how PCM is very susceptible to jitter...

    What isn't so clear is the AC-3 protocol itself and how the jitter gets through to the DAC. I was looking for some info on the web, but haven't found what I'm after... I've found a few snippets, so can someone expand on these, or point out where I am going wrong...

    MPEG 2..
    This seems to have the right idea: PTS (Presentation Time Stamp) information is sent in the data, which tells the decoder, and therefore the DAC, when to output the audio.... but is this at too high a level? Or is this exactly what would eliminate jitter, since the decoder now knows exactly when to 'present' the audio to the DAC... thus any jitter is down to the DSP/DAC on the amp, and the interconnect is transparent....
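
    As far as I can tell, PTS values are ticks of MPEG's 90 kHz system clock, so conceptually the decoder's scheduling job is as simple as this little sketch (the function names are my own invention, purely illustrative):

        # Sketch: how a decoder might schedule audio against PTS values.
        # PTS is in ticks of the MPEG 90 kHz system clock; the local STC
        # (System Time Clock) is assumed to be recovered from the stream.
        PTS_CLOCK_HZ = 90000

        def pts_to_seconds(pts_ticks: int) -> float:
            """Convert a PTS value to seconds on the 90 kHz timeline."""
            return pts_ticks / PTS_CLOCK_HZ

        def due_for_presentation(pts_ticks: int, stc_ticks: int) -> bool:
            """Present the access unit once the local clock reaches its PTS."""
            return stc_ticks >= pts_ticks

        print(pts_to_seconds(900000))   # a unit stamped 900000 is due at 10.0 s

    The catch, of course, is that the local STC itself has to be recovered from the incoming stream, so the timing reference is still ultimately slaved to the source.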

    AC-3
    This is a different kettle of fish: it carries sync information, which tells the decoder what sample rate the incoming data was sampled at, and so at what rate to clock it out. This should eliminate jitter for packets shortly after each sync, but I can't find any info on how the sync blocks are used, so can anyone help me out?
    I could see that, possibly, if the syncs are only sent periodically, then you will get jitter around the start of each sync block, since it is impossible to completely sync each clock pulse...
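
    From the snippets of the ATSC A/52 spec I have managed to dig up, the syncinfo at the head of each frame looks something like the sketch below (my own untested reading of the spec, so treat the field layout with suspicion):

        # Rough sketch of AC-3 syncinfo parsing, per my reading of ATSC A/52.
        AC3_SYNCWORD = 0x0B77
        FSCOD_TO_HZ = {0b00: 48000, 0b01: 44100, 0b10: 32000}  # 0b11 reserved
        SAMPLES_PER_FRAME = 1536          # 6 audio blocks x 256 samples each

        def parse_syncinfo(frame: bytes):
            syncword = (frame[0] << 8) | frame[1]
            if syncword != AC3_SYNCWORD:
                raise ValueError("not aligned on an AC-3 syncframe")
            # bytes 2-3 are crc1; byte 4 holds fscod (top 2 bits) + frmsizecod
            fs = FSCOD_TO_HZ[frame[4] >> 6]
            # each syncframe pins down 1536/fs seconds of presentation time
            return fs, SAMPLES_PER_FRAME / fs

        print(parse_syncinfo(bytes([0x0B, 0x77, 0, 0, 0x00])))  # (48000, 0.032)

    If that's right, a 48 kHz stream gets a sync point every 32 ms, which would fit the worry above about what happens between syncs.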

    But I can't find anyone who knows AC-3 at this level..

    Anyone?



    I suspect that if someone can answer my questions, then this will fully explain why the interconnect is important, since jitter is a variable that exists between all cables.. but I don't want to get down to the usual optical/coaxial debate... we'll just leave it as read that jitter is generally slightly worse on most optical connections... (as already stated, more down to the electrical-to-light and back-to-electrical conversion than the lead itself)...

    The real reason I need to know is that I was gonna knock up a buffer using a DSP etc. to give me something to eliminate lip-sync lag on a projector I was thinking of buying, but obviously I need it to be transparent audio-wise..... so if it's gonna be too hard, or intrusive to the quality... I'll save up for the new Denon 3803..
     
  8. Nic Rhodes

    Nic Rhodes
    Well-known Member

    Joined:
    Mar 23, 2001
    Messages:
    17,133
    Products Owned:
    0
    Products Wanted:
    1
    Trophy Points:
    133
    Location:
    Cumbria
    Ratings:
    +1,277
    Most of the stuff you need is in the thread mentioned earlier.

    Separate digital buffers are certainly possible but require much more thought for good operation. Input to the system is via SPDIF, then there is the translation to internal comms protocols such as I2S, then you have to get your DSP to delay the signal passing through it.

    I believe some people have tried to make digital delay boxes with varying degrees of success (mine was a failure!). For a single-task box they looked very expensive, which they are. When you look at the electronics needed to do this well, it is just the same as what is already in any DSP-based receiver / processor anyway! So any receiver / processor that is already using DSPs like SHARCs (Lexicon, Denon, Tag etc.) can easily give you this feature with a simple tweak of software. Tag have already done this, as have others I think, and more are planning to. Your DSP-based digital buffer is the heart of any receiver / processor anyway, so what is the point of making one other than for the fun?
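
    The delay element itself really is trivial, which is the point: at heart it's just a ring buffer of samples, something like this sketch (sizes and names are made up for illustration):

        # Toy sketch of the delay element at the heart of a lip-sync buffer:
        # a fixed-length ring buffer of samples.
        class SampleDelay:
            def __init__(self, delay_samples: int):
                self.buf = [0.0] * delay_samples
                self.pos = 0

            def process(self, x: float) -> float:
                """Return the sample written delay_samples ago, store the new one."""
                y = self.buf[self.pos]
                self.buf[self.pos] = x
                self.pos = (self.pos + 1) % len(self.buf)
                return y

        # 100 ms of delay at 48 kHz needs a 4800-sample buffer per channel
        delay = SampleDelay(4800)

    The hard part, as above, is everything around it: getting cleanly in and out of SPDIF / I2S without adding jitter of your own.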
     
  9. Demon

    Demon
    Active Member

    Joined:
    Dec 13, 2000
    Messages:
    954
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    31
    Location:
    Cheltenham
    Ratings:
    +57
    I read that thread, and unfortunately not one reply dealt with how the sync/jitter is introduced into the AC-3 application layer..

    There's loads on DACs/Bitstream jitter, which is fine,

    But AC-3 requires buffering, and the sync info tells the DSP at what rate to present the data to the DAC...

    There could be sync errors at the start of each block, which may introduce noise at this point, and then there is the eternal issue that the DVD may be outputting the bitstream at slightly the wrong speed, leading to over/underflow of the buffer...
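
    The over/underflow worry is easy to put numbers on (the figures below are made up for illustration):

        # Back-of-envelope: buffer drift when the source clock is slightly off.
        fs_nominal = 48000                            # rate the sink clocks out
        ppm_error = 100                               # source running 100 ppm fast
        fs_source = fs_nominal * (1 + ppm_error / 1e6)

        drift_per_sec = fs_source - fs_nominal        # surplus samples arriving
        print(drift_per_sec)                          # ~4.8 samples/s
        # -> a 4800-sample cushion overflows in ~1000 s unless the sink
        #    slaves its output clock to the incoming stream (which SPDIF
        #    receivers do with a PLL -- the very place jitter creeps back in)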

    Maybe I missed it in the thread.....
    The actual question I need answering is how these sync errors occur in the AC-3 layer, and what the mechanism is by which the sync process introduces noise...

    I am realistic and feel I am missing something fundamental. I'm no genius, so please can someone who fully understands the AC-3/DTS bitstream help me understand.......

    I have various ideas on the hardware front, all inexpensive, you don't need anything too fancy; it's just that I need to fully understand the importance of what I am trying to achieve...
     
  10. Reiner

    Reiner
    Active Member

    Joined:
    Jul 16, 2000
    Messages:
    3,315
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    61
    Location:
    Germany
    Ratings:
    +13
    I think the common consensus is that there might be an audible difference with PCM (typically CD audio), but DD / DTS is not - or at least less - sensitive to the actual transmission medium used, since it is packetised.
    Comparing PCM to (PC) data transmission is not "fair", as PCM happens in real time without any error correction, while a TCP/IP network uses packets which can be repeated if they get lost or corrupted. Thus a corrupted PCM signal will always be measurable, but whether it's audible depends on the sensitivity of our hearing and the level of the distortion.

    Cheap optical cables are made of plastic (and not glass like expensive F/O cables), so I would prefer a coax for its better mechanical characteristics. But for long runs a good optical cable might be better, since coax can indeed pick up some noise (see the quote from Dolby) as it essentially functions like an antenna. Good shielding can somewhat prevent that, though the longer the cable gets the higher the loss, so the signal level arriving at the other end becomes weaker.

    Digital is not just 0/1, on/off or works/does-not-work - I am sure everyone using a GSM phone has experienced some interruptions or distortion before without being cut off totally.
    However, in an AV / hi-fi system with short cable runs I would consider all those potential problems "academic", i.e. in theory there are a lot of things which could happen, but in practice it's negligible and most people wouldn't know the difference if tested in a double-blind test.
     
