
Newbie - Dumb Digital Question

Discussion in 'AV Receivers & Amplifiers' started by matt_symes, Jan 10, 2005.

  1. matt_symes
     Guest
    Hello.

    I'm going to be buying an AV Amp that has coaxial + optical digital inputs.

    What I don't understand is why I need to worry about the quality of the source components that will feed this Amp. Surely if the input is digital then *any* DVD player I get which has optical audio out will sound exactly the same when pushed through this Amp - after all, it's all just 1's and 0's read straight off a DVD.

    Since everything I read in magazines assures me this is not the case - what I'd like to know is - why not?

    Much thanks in advance for helping out a befuddled newbie.

    - Matt
     
  2. Zacabeb
     Active Member (Joined: Jan 17, 2002; Messages: 474)
    I also wonder about this, since I'm quite a skeptic.

    As far as I understand, the theory is that the decoder must synchronize itself with the input data stream, and so its actual processing speed is sort of under the spell of the actual rate of the incoming data. If that data is jittery, it makes the decoder stumble as it tries to lock onto the incoming signal.

    Considering how many buffers and DSPs the signal passes through, though, the idea of jitter affecting the sound seems a bit unreasonable. I'm inclined to think that a signal with so much jitter that it produced audible results would also cause severe problems for the system even locking onto the signal, and the audible result would be noise, distortion, clicks, pops, or even intermittent loss of signal.

    I can't see how jitter would affect compressed audio signals like AC-3 either. They're buffered in memory and then decoded. Could the entire system, working at clock speeds way above the data rate, really be completely synchronized to the clock recovered from the input data?

    I also wonder how computers, with tons of separate clocks and components working together in the same cabinet and even interfering, can still allow for a stable system where tons of processors happily synchronize at transfer rates of several Gbps. Must be the pixies. :rolleyes:
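    For what it's worth, the timing-error idea can be sketched in a few lines of Python. This is purely illustrative - the 5 ns jitter figure and the 1 kHz tone are invented, and no real DAC works this literally: sample a sine wave once with an ideal clock and once with a jittery one, and compare the values.

```python
import math
import random

FS = 48_000        # sample rate, Hz
FREQ = 1_000       # test tone, Hz (hypothetical)
JITTER_RMS = 5e-9  # 5 ns RMS clock jitter (an invented figure)

random.seed(0)

def sample(jitter_rms):
    """Sample a sine with a clock whose ticks are randomly displaced."""
    out = []
    for n in range(FS // 100):                 # 10 ms of audio
        t = n / FS + random.gauss(0, jitter_rms)
        out.append(math.sin(2 * math.pi * FREQ * t))
    return out

ideal = sample(0.0)
jittery = sample(JITTER_RMS)

# RMS difference: timing error on a changing signal becomes amplitude error
err = math.sqrt(sum((a - b) ** 2 for a, b in zip(ideal, jittery)) / len(ideal))
print(f"RMS error from {JITTER_RMS * 1e9:.0f} ns of jitter: {err:.2e}")
```

    The error scales with the signal's slew rate, which is why jitter should in theory matter more at high frequencies - but whether real-world amounts are audible is exactly the open question.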
     
  3. CosmicOne
     Guest
    To add to this:
    I also wonder whether DVD players ranging from £100 to £2000 add anything to the sound when the digital out is used, because the digital out just passes the bitstream through to the AV amp/receiver to decode.
    So I'd like to know from the guys with experience here: does a £100 DVD player differ from a £2000 DVD player in audio quality when the digital out is used?

    Analogue is another story, because analogue output uses the DACs inside the DVD player instead of the DACs inside the amp/receiver.
     
  4. matt_symes
     Guest
    After further thought I'm going to be gutted if I've spent good money on a DVD player just to find that the picture is better through DVI out of my PC and the sound is the same as my PC via digital out.

    And my DVD *recorder* in my PC only cost £80!

    I'm sure I *must* be missing something. Are there any technical wizards out there that can explain this one?
     
  5. Lostinapc
     Guest
    In my opinion, you buy an expensive DVD player for improved picture; you buy an extremely expensive DVD player for improved picture and sound.
     
  6. Zacabeb
     Active Member
    There are a few articles on the issue; this one and the Q&A that follows are written by mastering engineer Bob Katz:

    Everything you always wanted to know about jitter but were afraid to ask

    Questions and answers about jitter

    Those pages are informative, but equally confusing.

    There are also two articles more than a decade old from Stereophile:

    The Jitter Game

    Jitter and the Digital interface

    The age of those again raises questions regarding how technology has advanced. I have no idea how recent the footnotes are.

    And my own questions pile up the more I read.

    What about parallel vs. serial data? In ye auld days, weren't multi-bit audio DACs connected with parallel interfaces, i.e. one pin per bit? The DAC in my old Yamaha CDX-1060, which had an early 1-bit implementation, is huge. The audio DACs of today are tiny and I think almost exclusively fed serial data.

    So, is it the jitter in the bit that triggers the D/A conversion of any given sample which is the culprit, with the other bits innocent? And what about oversampling? Today's audio DACs add tons of their own samples, interpolated from the existing ones after they are stored in a buffer. Some of them even change the original sample values in the process. Most of today's DACs are 1-bit converters and must oversample like mad, since they are using a single bit to produce the waveform. How are they affected by jitter? Given the oversampling and delay, are the output words affected by jitter coming in from later samples?

    And what about the purest-of-pure DSD used in SACD? Wouldn't that be jitterama, given that it's a raw one-bit stream and does not have multi-bit words? Doesn't jitter in every single bit count there? :confused:
     
  7. dvdsubtitles
     Guest
    You are partly right. Any source component pushes a digital signal out of its outputs in exactly the same way. The difference is in how well the source reads the content. A very good DVD player will read a DVD disc with far fewer bit errors than other players. There are ALWAYS errors - as engineers always say, there is no such thing as 100% efficiency! This affects any digital stream - i.e. both picture and sound.

    I'm not a technical expert but this is how I see it.

    Cheers,
    Mat
     
  8. Zacabeb
     Active Member
    But the idea of jitter is that if those 1's and 0's don't get sent out at regular intervals, or if their edges are smeared so much by noise or other phenomena that the receiver syncs to the incoming 1's and 0's at irregular intervals, the D/A conversion also ends up not being performed at regular intervals. The claim is valid insofar as jitter does exist. The 1's and 0's don't toggle instantaneously; there's always a slope between them because of bandwidth limitations.

    The real question is whether jitter is so severe in practice that it will indeed affect the behavior of D/A conversion, and if so, what the effect really is and if it truly is audible. That's what we should be asking.

    Here's yet another link to a site explaining jitter, actually a site entirely dedicated to the issue of jitter, and with a domain name ensuring it will not be mistaken for anything but a site dedicated to the issue of jitter. It might be biased since it's maintained by Altmann Micro Machines, a manufacturer of audio equipment, but it can still be useful.

    www.jitter.de
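    The "slope" point can be illustrated numerically too. A toy sketch (all the numbers are invented): model an edge as a linear ramp with a finite rise time, add amplitude noise, and watch the threshold-crossing time move around - which is one way amplitude noise turns into timing jitter.

```python
import random

RISE_TIME = 5e-9   # 5 ns edge (hypothetical)
THRESHOLD = 0.5    # receiver decides 0/1 at half amplitude
NOISE_RMS = 0.02   # 2% amplitude noise (hypothetical)

random.seed(1)

def crossing_time(noise_rms):
    # The voltage ramps linearly from 0 to 1 over RISE_TIME; additive
    # noise shifts the moment the ramp appears to cross the threshold.
    noise = random.gauss(0, noise_rms)
    return (THRESHOLD - noise) * RISE_TIME

times = [crossing_time(NOISE_RMS) for _ in range(10_000)]
mean = sum(times) / len(times)
rms_jitter = (sum((t - mean) ** 2 for t in times) / len(times)) ** 0.5
print(f"timing jitter induced by amplitude noise: {rms_jitter * 1e12:.0f} ps RMS")
```

    Notice that a slower edge (longer rise time) makes the same amount of noise produce more jitter, which is why bandwidth-limited cables and connections get blamed.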
     
  9. matt_symes
     Guest
    Thanks to everyone that's commented on my question so far :thumbsup: I was not previously aware of 'jitter'.

    To me though, this does beg the question of how the cheapo DVD player in my PC can read a computer program off a CD perfectly, completely unaffected by jitter. After all, even a single bit out of place could screw up an .exe file. :confused:

    At least this has shaken me out of my misconceptions around 'digital'. It seems that the digital world has still got plenty of 'analogue' left in it that generates room for confusion!
     
  10. dvdsubtitles
      Guest
    Actually this is different - your PC's operating system and CD-ROM drivers have extensive error checking built in, which uses checksums to repeatedly ensure every bit of the file/program is transferred correctly, without any corruption. This works because the file/program is not used until the transfer to memory/hard disk is complete.
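    A toy illustration of the principle (using SHA-256 as a stand-in - optical drives actually use CRC and Reed-Solomon codes, not this): compare a checksum of the copy against the source, and even a single flipped bit is caught, so the block can simply be re-read. A real-time audio stream has no such luxury.

```python
import hashlib
import random

random.seed(2)
source = bytes(random.randrange(256) for _ in range(4096))  # stand-in for a disc sector

good_copy = bytes(source)        # clean transfer
bad_copy = bytearray(source)
bad_copy[100] ^= 0x01            # one flipped bit somewhere in the middle

def verifies(data, reference_digest):
    """True if the data hashes to the same digest as the original."""
    return hashlib.sha256(data).hexdigest() == reference_digest

ref = hashlib.sha256(source).hexdigest()
print("clean copy verifies:", verifies(good_copy, ref))            # True
print("corrupted copy verifies:", verifies(bytes(bad_copy), ref))  # False
```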

    Mat
     
  11. matt_symes
      Guest
    OK - that makes sense. But then why not use the same sort of error checking / buffering technology in my DVD player?

    Are you saying that the DVD player in my PC will give me better audio and picture (if pushed out of my digital out on the soundcard and the DVI out on my video card) than *any* DVD player?

    Is this a secret that the AV industry are trying to keep quiet?! It just gets more and more confusing!
     
  12. CosmicOne
      Guest
    Hey guys,

    no one has answered my questions regarding the digital out of different players. Please recheck my post, guys.
     
  13. dvdsubtitles
      Guest
    My logic goes as follows:

    It is easy to error check a movie on a DVD against a copy of it on a hard disk and ensure they are an exact copy.

    However you can't do the same thing between the movie on the DVD and what you see on the screen, or between the movie on the hard disk and what you see on your computer screen. The DVD player/computer cannot "see" the whole movie on the screen to check if it is an exact match with the source.

    No, because even if you make a perfect copy from the DVD to the HDD, you still have the same problem of getting the DVD picture accurately to the PC monitor.

    This is merely a logical argument. I'm not up to speed on the intricacies of DVD player technology so this is as far as I can go.

    Mat
     
