Five.tv Gadget Show : HDMI Cable @£20 = @£120

mk-donald

FYI: On tonight's episode of 'The Gadget Show' (Monday 12 May 2008, five.tv, 8pm), they did a comparison of HDMI cables, first giving their unanimous opinion that buying an expensive cable was pointless, and then sitting down in front of two identical setups, differing only in HDMI cable, and stating there was no difference at all between the images produced.

The two cables they used were:
IXOS XHT658 HDMI Lead, £119.95 from Richer Sounds
versus
Logitech HDMI Cable, £19.99 from Currys, though the voiceover said words to the effect of "you can find cheaper ones in the shops if you shop around".

Outline details at:
http://gadgetshow.five.tv/jsp/5gsma...er&show=s8e7&featureid=761&description=Studio
 
I have 4 HDMI cables: a 2m basic black one (a freebie); a 2.4m one with a nylon woven sheath, £12.99 from play.com; a 1m yellow one with ferrites, £6.99 from eBay; and lastly a QED Performance 1m, £45, which came free with my Yammy 2700 DVD player. Performance-wise I cannot see any difference between them, but there is one thing: the QED fits nice and snug into my Yamaha RX-V1700, whereas the others seem slightly loose and sit at a slight angle, though they work OK.
 
In our next podcast we discuss HDMI cables and set the record straight with regard to when and why you may need more than a budget cable.
 
This would be the same Gadget Show that sat down in front of three flat panels and decided conclusively that the LCD was the best....

At 1m most HDMI cables will look the same on most systems, but before I say *all* you should see some of the higher-end offerings (Chord, Wireworld Ultraviolet, Display Magic etc). Of course the rest of the system needs to be of a similar calibre to realise any benefit; budget setups needn't bother trying to get that extra ounce with a £100-£200 HDMI cable, and might as well put the extra £100-£200 toward a better display or source. At 3-5m onwards, especially with 1080p signals, decent HDMI cables are an absolute must. Looking forward to an AVF review hopefully putting this straight!!
 
This would be the same Gadget Show that sat down in front of three flat panels and decided conclusively that the LCD was the best....
If this was the recent episode, I thought they took it to a panel of independent experts? Prior to that they put the two TVs in a large hangar and let people go to the one they preferred. As I remember, people were split 50:50. Maybe you don't like the results, or the difference is not really noticeable.

At 3-5m onwards, especially with 1080p signals, then decent HDMI cables are an absolute must.
Why exactly?
 
I got a £64.99 QED SR HDMI cable and IMO there is a slight improvement in colour definition, and the build quality is a lot better than my £20 Mad Catz one. But I could be wrong. The Gadget Show ROCKS!
 
I am a telecoms engineer and used to commission digital systems. When we install new cables to carry digital traffic, we use a pattern generator to send a pattern of digits out over one cable and loop it back over another, back into the pattern generator.

The pattern generator compares every single bit going out with every single bit coming in and counts every error. I once ran this tester overnight over a 400m length of cable and found not one single error. That's 172,800,000,000 bits of data sent out.

Auditioning should be used for the analogue parts of the system. Any digital to digital part should be error tested and if no errors are found then it will not sound/look any different from any other component.

If AV reviewers are correct and they can see/hear the difference between cables, then the cables are out of spec. Put a test signal down the cable and prove it.
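The loopback test described above can be sketched in a few lines. This is an illustrative simulation, not real test gear: it generates a PRBS-7 pseudo-random bit sequence (a common pattern-generator output) and compares every bit sent with every bit received, counting errors exactly as a BER tester does. (As an aside, the figure of 172,800,000,000 bits would correspond to, for example, a 2 Mbit/s pattern running for 24 hours: 2,000,000 × 86,400 = 172,800,000,000.)

```python
def prbs7(n_bits, seed=0x7F):
    """Generate n_bits of a PRBS-7 sequence (polynomial x^7 + x^6 + 1)."""
    state = seed
    out = []
    for _ in range(n_bits):
        new_bit = ((state >> 6) ^ (state >> 5)) & 1  # XOR taps 7 and 6
        state = ((state << 1) | new_bit) & 0x7F      # shift, keep 7 bits
        out.append(new_bit)
    return out

def count_bit_errors(sent, received):
    """Compare every bit sent with every bit received, as a BER tester does."""
    return sum(s != r for s, r in zip(sent, received))

sent = prbs7(1_000_000)
received = list(sent)                      # perfect loopback: no errors
print(count_bit_errors(sent, received))    # 0

received[123_456] ^= 1                     # inject one flipped bit
print(count_bit_errors(sent, received))    # 1
```

The point of the post holds in the sketch: either the bits match exactly (zero errors, identical output) or they don't, and a single flipped bit is caught.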
 
I cheered when The Gadget Show finally did something on this.

If an HDMI cable works, it works. HDMI digital video (and audio) signals are reclocked in the display - so jitter should be a non-issue. Therefore the only real issue is whether the signal is carried error-free or not. If the signal arrives error-free - then a cheap cable delivers exactly the same picture quality as an expensive cable.

Where cable quality DOES play a part is when you are running long cables, where poorly performing cables (which doesn't always mean cheap cables) may not deliver an error-free signal. However, errors will appear as very visible artefacts - zitting etc. - and not as the softer or ringier pictures you would get with poor-quality cables using analogue techniques.

HDMI and other digital interconnects like HD-SDI actually carry a digital signal using analogue techniques. As long as the analogue properties of the cable don't degrade the signal to the point where it can't be decoded error-free then there should be no difference between a £10 and £100 cable.

Using eye-height analysers would be a scientific method of assessing performance of cables of various lengths - but even a poorly performing cable, as measured by eye-height, that is still error-free will deliver identical picture quality to a decently performing cable, as measured by eye-height, that is also error-free. That is the joy of using digital connections. When they work you get the same bits out as you put in.

Jitter is an issue with some digital systems - but only those that don't use clocking techniques. HDMI video and audio use these techniques AIUI so jitter shouldn't be an issue.
 
Steve's post is spot on.
HDMI ICs themselves are tested and specified for jitter and eye diagrams are shown on the datasheet.
The same sorts of tests are also carried out on any HDMI certified products.

Chris Muriel, Manchester
 
I think devices that reclock digital audio signals are still very much in the minority.

Presumably all compressed audio carried via HDMI is reclocked when it is decompressed - so jitter shouldn't be a major issue with audio carried as Dolby Digital, DTS or AAC via HDMI or SPDIF/Toslink - unless the output clocks are tightly linked to the input source.

When it comes to HDMI, the audio stream is carried as a form of Sound In Syncs (which in HDMI is called the Data Island Period) - similar to embedded audio in SDI/HD-SDI (and a bit like the way D/D2-MAC carried digital audio in between the analogue video contents). Thus the audio - whether compressed or PCM - isn't a constant stream; it too has to be reclocked. The audio in HDMI isn't carried on separate pins as a constant stream, it is carried as bursts between the active video. As such it surely has to be clocked into a buffer and then clocked out separately? The input and output clocks may be tightly linked - though I'd have thought any PLL flywheel would have tidied up a lot of jitter - or am I missing something?

AIUI SPDIF PCM stuff is the most common digital audio stream not to be reclocked - but that isn't relevant when discussing HDMI.
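The buffering argument above can be made concrete with a toy model. This is purely illustrative (the names and burst sizes are made up, and a real HDMI sink uses hardware FIFOs and PLLs): audio samples arrive in bursts during blanking, go into a FIFO, and are read out one sample per output-clock tick.

```python
from collections import deque

fifo = deque()  # the sink's audio buffer

def write_burst(samples):
    """Audio arrives in bursts during data island periods (blanking)."""
    fifo.extend(samples)

def read_constant_rate():
    """The output clock drains the FIFO one sample per tick for the DAC."""
    return fifo.popleft() if fifo else None

# four samples arrive in one burst, then are clocked out steadily
write_burst([0.1, 0.2, 0.3, 0.4])
out = [read_constant_rate() for _ in range(4)]
print(out)  # [0.1, 0.2, 0.3, 0.4]
```

The samples come out in order and at a steady rate regardless of how bursty the input was - which is the sense in which the bursty HDMI audio "has to be reclocked" on the way out.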
 
If this was the recent episode, I thought they took it to a panel of independent experts? Prior to that they put the two TVs in a large hangar and let people go to the one they preferred. As I remember, people were split 50:50. Maybe you don't like the results, or the difference is not really noticeable.

This was about two years ago, and from memory it was a Sharp LCD (about 50-60" and truly horrible, before even decent dynamic contrast techniques had come to market), versus IIRC a Pioneer plasma and maybe an LG. But again, like the projector comparison a couple of weeks ago (where they had "specialists" set it up), it was blatantly obvious, even viewing via their video cameras and through my TV, that the colour temperatures were WELL off and the two displays were running different gamma settings. To their credit, though, the split was 100% to the DLP in the projector test (thank christ).
Why exactly?
Sending over longer distances is more difficult, and sending higher-bandwidth signals (i.e. higher resolution and/or higher refresh rate) is also more difficult. Note I did not say more "expensive" cables, just better quality. In my installations speaker cabling, for example, rarely gets above £4 a metre for multi-room audio use, and only goes up to silly prices for high-end installs (£50k+ for a single 5.1 setup). With HDMI I have a responsibility to my clients to install the best, and what will still be usable in many years' time. Only a few CI companies (i.e. the good ones) can say that they had the good sense to use 1080p-capable HDMI cables three or four years ago. We're now going back and upgrading systems from 576p or 720p displays to 1080p models without having to cut the walls and ceilings back open. I follow the same theory now: for the last 6-12 months we have been using cables able to carry HDMI 1.3 Category 2, so that if I return in 3 years' time to fit a system which uses Deep Colour and the xvYCC colourspace, again I won't need to be chopping out walls.
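To put some rough numbers on "higher resolution means higher bandwidth": HDMI's TMDS link sends 10 bits per pixel per channel over 3 data channels, so the total data rate scales directly with the pixel clock. The pixel clocks below are the standard CEA-861 values as I understand them; the calculation itself is just arithmetic for illustration.

```python
def tmds_bitrate_gbps(pixel_clock_mhz, channels=3, bits_per_symbol=10):
    """Total TMDS data rate in Gbit/s for a given pixel clock."""
    return pixel_clock_mhz * 1e6 * channels * bits_per_symbol / 1e9

# standard pixel clocks for common video modes
print(tmds_bitrate_gbps(27.0))    # 576p50  -> 0.81 Gbit/s
print(tmds_bitrate_gbps(74.25))   # 720p60  -> 2.2275 Gbit/s
print(tmds_bitrate_gbps(148.5))   # 1080p60 -> 4.455 Gbit/s
```

So a 1080p60 signal pushes roughly five and a half times the data rate of 576p through the same cable, which is why a marginal cable that was fine at 576p can fall over when the system is upgraded.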
 
HDMI is not reclocked. It suffers jitter horribly. Better-shielded cables will help prevent EMI/RFI, which can cause jitter. We're all hoping that high-end manufacturers will bring reclocking into their HDMI AVRs/AVPs to solve this issue. At the moment HDMI audio sounds like first-generation CD audio!!

Again, I am not saying this needs to be an ultra-expensive cable, but it does need to be a decent gauge, with decent shielding, which is able to handle the bandwidth you intend on throwing at it.
 
HDMI is not reclocked. It suffers jitter horribly. Better-shielded cables will help prevent EMI/RFI, which can cause jitter. We're all hoping that high-end manufacturers will bring reclocking into their HDMI AVRs/AVPs to solve this issue. At the moment HDMI audio sounds like first-generation CD audio!!

Again, I am not saying this needs to be an ultra-expensive cable, but it does need to be a decent gauge, with decent shielding, which is able to handle the bandwidth you intend on throwing at it.

When you say it isn't reclocked - I don't think we can mean the same thing.

HDMI only carries audio signals in the gaps between active video - as the same pins are used for audio and video at different times. As a result whether the audio is PCM, DTS or Dolby Digital, the audio is only sent in short bursts in the gaps between video (i.e. in blanking)

As this means the audio is not sent continuously, it has to be clocked out at a different rate to the one it was clocked in at, to deliver a continuous audio stream for DACs etc. If the clock-out rate is tightly linked to the input clock rate then I can potentially see how jitter might survive - however surely there is some damping in the link between the input clock and the output clock which removes (or at least massively reduces) the jitter. This is discussing PCM via HDMI.

I can see how a constant SPDIF stream could be susceptible - but HDMI audio is just like SDI/HD-SDI embedded audio, or for that matter DSIS, isn't it?

Surely Dolby Digital and DTS audio via HDMI goes through so much processing (i.e. decoding) that the input signal is massively divorced from the output audio, isn't it?
 
