The Gadget Show - HDMI Cable Comparison

ADSL is a high-frequency analog signal carrying a digital signal in exactly the same way as HDMI does (nothing like as high a frequency, but the principle is the same).
ADSL and HDMI do not carry digital signals in the same way. ADSL uses a form of QAM, which modulates the amplitude and phase of a carrier to convey the digital data. HDMI, on the other hand, is pure digital (ones and zeros, with no carrier).
The higher the digital bandwidth, the higher the frequency needed to carry it, and the shorter the distance it will travel.
I suspect you mean: the higher the digital bandwidth, the greater the carrier bandwidth.
If you want a true digital signal then optical is the only way to go at anything like high bandwidth; digital over copper always involves an analog element.
In most cases, carriers are used for both optical and copper cable over long distances but not for short distances. Carriers are primarily used to allow simultaneous transmissions on a single line (multiple simultaneous digital transmissions could not easily share a single cable without carriers) and can also increase the bitrate. For example, TV over optical fibre uses a single-mode fibre with WDM (wavelength-division multiplexing) to carry multiple carrier frequencies (8 MHz bands in the UK and 6 MHz bands in the US).
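To make the carrier idea above concrete, here is a minimal 4-QAM sketch in Python. It is an illustration only, with made-up numbers: the constellation mapping, carrier frequency and sample rate are arbitrary choices, and real ADSL uses DMT with many carriers and much larger constellations per carrier.

```python
import math

# Hypothetical 4-QAM constellation: two bits -> one complex symbol whose
# magnitude sets the carrier amplitude and whose angle sets its phase.
CONSTELLATION = {
    (0, 0): complex(+1, +1),
    (0, 1): complex(-1, +1),
    (1, 1): complex(-1, -1),
    (1, 0): complex(+1, -1),
}

def modulate(bits, carrier_hz=1000.0, sample_rate=8000.0, samples_per_symbol=8):
    """Map bit pairs onto an analog carrier: the 'digital' data leaves
    the modem as a perfectly ordinary analog waveform."""
    samples = []
    for i in range(0, len(bits) - 1, 2):
        symbol = CONSTELLATION[(bits[i], bits[i + 1])]
        amplitude = abs(symbol)
        phase = math.atan2(symbol.imag, symbol.real)
        for _ in range(samples_per_symbol):
            t = len(samples) / sample_rate  # running time in seconds
            samples.append(amplitude * math.cos(2 * math.pi * carrier_hz * t + phase))
    return samples

wave = modulate([0, 0, 1, 1])  # two symbols' worth of carrier samples
```

The point of the sketch is simply that what travels down the line is a list of analog voltage samples, not ones and zeros.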
 
Now, now, you're ignoring SACD & DVD-A - both of which offer far higher resolution than CD!


Well, if they ever became the mainstream then yes. But I have never listened to either, and I doubt many have.

The original point was Vinyl, CD, MP3.

But if you want to be pedantic then yes.
 
What does the D stand for then?

It really doesn't matter what the technology is called (although I imagine the intent was to indicate that the technology allows digital networks to interconnect rather than what the transmission scheme in between is). However, it doesn't change the fact that the POTS network uses analogue signals (whereas the ISDN network uses digital ones).

There's plenty of places on the great Wiki to read about the difference between analogue signals and digital signals.

You can also read about the DMT modulation scheme, one of the modulation schemes that ADSL uses, which has a nice little diagram of a typical transmitter, clearly showing a digital signal being applied to the input before passing through a digital-to-analogue converter and being modulated onto a carrier for transmission.

As Mike said:

ADSL and HDMI do not carry digital signals in the same way. ADSL uses a form of QAM, which modulates the amplitude and phase of a carrier to convey the digital data. HDMI, on the other hand, is pure digital (ones and zeros, with no carrier).

There is nothing similar at all about the transmission methods.
 
It was intended to be a bit tongue in cheek. I feel suitably ashamed now.
 
Firstly I'd like to say thank you to those who tried to answer my original question about DTS HD sound being affected by the quality of the HDMI lead, but what I received was "it SHOULD" or "it SHOULDN'T", not "it DOES" or "it DOESN'T", and I needed a definitive answer, not a guess.

So with this in mind I went for a demo at my local hi-fi shop. The equipment used was a Panasonic BD30 Blu-ray player into an Onkyo receiver feeding a 720p projector. The leads used were a generic OEM HDMI lead out of one of the demo kit boxes and a Chord Company one.

Picture-wise I could not really tell the difference, probably as it was through a 720p projector rather than a 1080p screen, but the sound - oh, the difference - and that is what I was really interested in. I was surprised at just how much difference it made. Analog interconnects have always made a big difference, but digital ones, while noticeable, I've never been that convinced about. This, though, was almost in the analog interconnect zone; the sound field really opened up.

So my question is now well and truly answered: when I buy my BD player, a quality lead will be coming with it. For SKY I'll continue using the lead that came with it for the time being; it doesn't carry the sound anyway (the DD 5.1 goes by optical) and it isn't 1080p. But when I've got the alternative lead from my BD player I'll investigate that further.

So as they say on TV, thank you and good night! :smashin:
 

And I sir, claim that you are telling "a million little fibers".

I wonder how many people will get that reference:confused:
 
I was surprised at just how much difference it made. Analog interconnects have always made a big difference, but digital ones, while noticeable, I've never been that convinced about. This, though, was almost in the analog interconnect zone; the sound field really opened up.

:rotfl: That cheered up a wet Sunday morning.....

I assume this was a joke, right?

The quality of the digital interconnect cannot change the "sound field" or any other such characteristic.

Yet another case of "Duffy Moon" syndrome ;) (I wonder how many people will get that reference?)
 
fozzieAV

so glad to have brightened your day...

Cannot change, hey? Perhaps I was mistaken then. Oh well, that will save me money then.

mattclarkie

"A million little fibers" says a lot about the integrity of your posts: you call someone who disagrees with you a liar and cover it up with a play on words.
 

Anyone who doesn't get a South Park reference has no right to comment.

Rise to vote siR

As a quick question: was this one of those amps with multiple inputs that can be customised? In which case the difference in quality could have been the salesman pulling a crafty one to sell you an expensive cable. He wouldn't be the first salesperson to try such a trick.
 
As a general rule I don't trust salesmen either, but I don't think he could interfere in this case, as he was actually swapping the lead, not the inputs.
 
Picture-wise I could not really tell the difference, probably as it was through a 720p projector rather than a 1080p screen
No, probably because it was identical in both cases.

but the sound - oh, the difference - and that is what I was really interested in. I was surprised at just how much difference it made. Analog interconnects have always made a big difference, but digital ones, while noticeable, I've never been that convinced about. This, though, was almost in the analog interconnect zone; the sound field really opened up.

So my question is now well and truly answered: when I buy my BD player, a quality lead will be coming with it. For SKY I'll continue using the lead that came with it for the time being; it doesn't carry the sound anyway (the DD 5.1 goes by optical) and it isn't 1080p. But when I've got the alternative lead from my BD player I'll investigate that further.

So as they say on TV, thank you and good night! :smashin:
Right. Well, it's your money. I don't know how else anyone can explain that there can't be a difference. Can't, won't, isn't - whatever you prefer. And while it's true that Sky is less vulnerable to signal loss because 1080i only needs half the data rate of 1080p, it's still the case that a working cable is a working cable, and the picture cannot, and will not, look any better with a more expensive one.
 
Saw these in "Home Bargains" today if anyone is looking for one (1.5m)...

DSC00739.jpg
 
:rotfl: That cheered up a wet Sunday morning.....

I assume this was a joke, right?

The quality of the digital interconnect cannot change the "sound field" or any other such characteristic.

Yet another case of "Duffy Moon" syndrome ;) (I wonder how many people will get that reference?)

You can do it Duffy Moon :)
 
You can do it Duffy Moon :)
Yep - the kid who believed that if he said something enough times in his mind, it would happen..........."this cable does improve the sound quality, this cable does improve the sound quality..........wow, it actually does" :)
 
With the Sky-supplied cable I got regular "this display does not support HDCP" messages, which have never recurred since replacing it with something else (I forget what, but it wasn't expensive). The main difference between HDMI cables is probably down to the way they are constructed and their ability to shield signal leakage, both into and out of the cable, and perhaps the tolerances in the manufacturing process. However, the difference in cost to manufacture a reliable HDMI cable should be pretty small, and certainly nothing like the cost of the premium brands.
 
Could be a faulty cable; you will always get faulty cables no matter the technology. As long as you didn't waste a large sum on a replacement it doesn't matter. As I have said before, I don't use the in-box cables, but that doesn't mean they wouldn't be as good.
 
That is possible, but I also have a gigabit ethernet switch and cabling fairly close to the TV and audio stuff, and a music server which uses 2.4GHz wireless next to the Sky HD box, so interference was a possible cause.
 
A few thoughts of mine...

Essentially a "one" is a positive voltage (potential difference) on the wire and a "zero" is nothing (no voltage).

That isn't how many digital interconnects work. Instead, a differential system is often used, with two wires per signal rather than a single wire relative to a common earth. Differential signalling effectively uses a relative rather than an absolute system: one state is signalled by one wire being at a higher voltage than the other, and the other state by that same wire being at a lower voltage than the other. This differential approach is more robust at high speeds and is commonly used for high-speed signal transfer.

Both HDMI and the much older broadcast parallel Rec 656 interconnect standards use differential signalling ISTR.

So rather than a 1 being the presence of a voltage relative to a common ground, and a 0 being the absence of a voltage (i.e. the data wire is at the same voltage as ground), a 1 could be that one data wire in a pair is at a higher voltage than the other, whilst a 0 would be that the same data wire in the pair is at a lower voltage than the other.

In fact in many systems a more sophisticated system is used than this - as you may want to maximise or minimise transitions - so rather than a 1 being a +ve difference and a 0 a -ve difference, a 1 may be a transition from one state to the other and a 0 may be signalled by not changing states, or vice versa. Even more complicated systems are also possible.

From memory, single-link HDMI has three concurrent serial data streams and a clock stream, all using differential signalling pairs, with each pair also shielded?
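The differential-pair and transition-coding ideas described above can be sketched together as a toy model in Python. To be clear, this is not HDMI's actual TMDS scheme: the rail voltages, the starting state and the coding rule (a 1 flips the line state, a 0 holds it) are all illustrative assumptions.

```python
# Toy model of a differential, transition-coded link. Assumed values
# (not real TMDS levels): 3.3 V / 0 V rails, starting state 0, and the
# rule "a 1 flips the line state, a 0 holds it".

HIGH, LOW = 3.3, 0.0  # hypothetical rail voltages

def encode(bits, start_state=0):
    """Return one (d_plus, d_minus) voltage pair per bit."""
    state, pairs = start_state, []
    for bit in bits:
        if bit:
            state ^= 1  # a 1 is a transition; a 0 leaves the state alone
        # state 1: D+ sits above D-; state 0: D+ sits below D-
        pairs.append((HIGH, LOW) if state else (LOW, HIGH))
    return pairs

def decode(pairs, start_state=0):
    """Recover the bits. Only the *difference* between the two wires and
    the *changes* between symbols matter, so noise that hits both wires
    equally (common-mode interference) cancels out."""
    prev, bits = start_state, []
    for d_plus, d_minus in pairs:
        state = 1 if d_plus > d_minus else 0
        bits.append(1 if state != prev else 0)
        prev = state
    return bits

data = [1, 0, 1, 1, 0]
line = encode(data)
noisy = [(p + 0.8, m + 0.8) for p, m in line]  # add common-mode interference
assert decode(noisy) == data                    # the data still decodes cleanly
```

The last two lines show why the scheme is robust: lifting both wires by the same 0.8 V doesn't change which wire is higher, so the decoded bits are untouched.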
 
I don't buy into the argument that more expensive cables work better over longer runs, for the same reason I don't believe there is a quality difference between any HDMI cables.

Digital is digital; it doesn't degrade. It's either there, it's not there, or you have massive chunks of data missing. If you have the latter, it's not because you bought a cheap cable; it's because your cable (or TV, or source) is faulty.

Digital doesn't degrade :)

Actually, that isn't really correct.

The digital signal is carried as an analog waveform, and that analog waveform degrades with distance, so eventually the integrity of the digital signal falls apart.

(Unless you are using fibre optic, of course - but even then the optical cable isn't 100% pure, and the signal still degrades.)

However, it is true that the analog waveform can degrade quite a lot without impacting the integrity of the digital signal. So whereas an analog signal might show visible/audible signs of degradation over a long cable, a digital signal over the same distance might not. As long as enough of the analog waveform gets through, it can still be converted back to a perfect digital signal.

Similarly, you can lose (corrupt) very small parts of a digital signal - just one bit, not just "massive chunks".
It is these corrupt bits that give rise to "sparklies" that appear on the display. The bulk of the picture is still intact - and perfect, but individual pixels are corrupt - hence the sparkly effect.
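That behaviour is easy to demonstrate in a few lines of Python: threshold a sagging, noisy analog level and you recover every bit exactly, until one glitch finally exceeds the margin and flips a single bit - one "sparkly", not a massive chunk. The attenuation, threshold and noise values below are invented for illustration.

```python
def transmit(bits, noise, attenuation=0.5):
    """Model a lossy cable: each bit leaves as attenuation * bit volts,
    plus whatever noise that run of cable picks up."""
    return [b * attenuation + n for b, n in zip(bits, noise)]

def receive(voltages, threshold=0.25):
    """Recover the bits by thresholding the degraded analog waveform."""
    return [1 if v > threshold else 0 for v in voltages]

original = [1, 0, 1, 1, 0, 0, 1, 0]

# Heavily attenuated and noisy, yet recovered perfectly:
mild = [0.1, -0.1, 0.05, -0.12, 0.08, 0.1, -0.1, 0.02]
assert receive(transmit(original, mild)) == original

# One glitch big enough to cross the threshold flips exactly one bit:
harsh = [0.1, -0.1, -0.4, -0.12, 0.08, 0.1, -0.1, 0.02]
recovered = receive(transmit(original, harsh))
assert sum(a != b for a, b in zip(original, recovered)) == 1
```

The first case is the "degraded analog, perfect digital" regime; the second is a single corrupt bit, i.e. one bad pixel rather than a missing picture.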
 
Just to complicate things further.


Digital Coaxial Audio Cables
Digital Optical Audio Cables

Both produce, in terms of signal output, identical signals at the amplifier. But whenever I hear the two I am sure that coaxial sounds richer and optical sounds more mechanical. Is this just how you mentally perceive it, or is it the amp treating the inputs slightly differently? Who knows.

But I would never claim that buying a Monster Cable coaxial is better, as I have used the cheapest of the cheap coaxial that came in the box with my amp - it looks like a composite video lead (can someone tell me if a composite video lead could actually work as a digital coaxial, cos it sure looks like one?) - and a £15 Belkin Pro Series, and I can't tell a difference between them.
 
(Can someone tell me if a composite video could actually work as a digital coaxial, cos it sure looks like one)
I've used one half of a standard audio phono cable to connect a DVD player to an amp via the digital coax input/output when I've been up against it... The only issue was that the shielding was a bit pants, meaning I got glitches when the central heating switched on and off... (I guess as a result of impulse noise.)
 
Both produce, in terms of signal output, identical signals at the amplifier. But whenever I hear the two I am sure that coaxial sounds richer and optical sounds more mechanical. Is this just how you mentally perceive it, or is it the amp treating the inputs slightly differently? Who knows.

I don't think I've seen a review yet where an optical output has been reported as sounding better than a coaxial one. It's probably due to the extra transitions from electrical to optical and back.

I've used one half of a standard audio phono cable to connect a DVD player to an amp via the digital coax input/output when I've been up against it... The only issue was that the shielding was a bit pants, meaning I got glitches when the central heating switched on and off... (I guess as a result of impulse noise.)

:rotfl: From one extreme to the other! This made me laugh out loud.
 
