£5 vs £150 for HDMI cable

jhkanguk

Standard Member
It seems that many people still believe they can get better picture/sound from more expensive HDMI cables.

If you can copy MS Word or PowerPoint files from your computer to your external USB hard drive using a £0.50 USB cable, then there is absolutely no need to buy a £1 or £10 cable.

You don't get better sentences or a better presentation by using more expensive cables in the digital world.
When you copy file A from your computer to the USB drive, you don't expect to get an 'improved file A' on the USB drive.
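
And you can prove the copy is bit-for-bit identical rather than 'improved', e.g. by hashing both files. A quick Python sketch (the file paths are just placeholders):

import hashlib

def sha256_of(path):
    # Hash the file in chunks so big files don't have to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder paths: the original and the copy on the USB drive.
print(sha256_of("presentation.ppt") == sha256_of("/media/usb/presentation.ppt"))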

HDMI also works in exactly the same way, in the same digital world.

If your £5 HDMI cable is good enough, it will play audio and video without crackling noise or a jerky, breaking-up picture.

In that case, it could be said that the cable is successfully copying video file A from your player to your TV.
It is complete ******** if someone tells you that you will get better audio/picture by upgrading to a £150 HDMI cable in that situation.

If any given cable is not good enough for your system, then you will notice it without anyone telling you about the problem. It will be as easily noticeable as when you can't copy files to your USB drive. That is when you have to upgrade to a better quality cable.
 
Last edited:

BlueWizard

Distinguished Member
While I agree with you completely, I think you are forgetting about the middle ground. To me paying £150 doesn't make any more sense than paying £5. I think you can get a perfectly serviceable cable in the range of £10 to £20.

Assuming the data can get through the cable at all, then it should be accurate on the receiving end.

To illustrate the point, let's use broadcast TV as an example. With analog TV, as the signal fades, the picture fades, and keeps fading until it is gone.

But that is not what happens with broadcast digital TV. Here, when the signal is strong, the picture is clear. As the signal diminishes, the picture stays clear until the signal gets so weak that the receiving end can't resolve the data, and you get picture break-up, freeze frames, and then total and complete drop-out of the signal and picture. But at the level of break-up or freeze frames, the signal level is incredibly small. If an analog signal were that low, it would be virtually all 'snow'.
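
That cliff-edge behaviour is easy to reproduce with a toy simulation: random bits, additive noise, and a threshold decision at the receiver. All the numbers below are invented; this is a Python sketch of the principle, not of real DVB or HDMI signalling:

import random

random.seed(1)
bits = [random.choice([0, 1]) for _ in range(20000)]

for noise in [0.2, 0.4, 0.6, 0.8, 1.0, 1.2]:
    errors = 0
    for b in bits:
        level = 1.0 if b else -1.0             # transmitted as +/-1 volt
        received = level + random.gauss(0, noise)
        decoded = 1 if received > 0 else 0     # simple threshold decision
        if decoded != b:
            errors += 1
    print(f"noise={noise:.1f}  error rate={errors / len(bits):.4f}")

With low noise the error rate is essentially zero; past a threshold it climbs rapidly, which is exactly the "perfect, perfect, perfect, gone" behaviour described above.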

The same applies to digital cables: weak as the signal may be, as long as the receiving end can resolve the numbers, the picture holds together. But also consider that there really is no reason for the signal to be weak or difficult to resolve. It only travels at best a meter or two, so there really shouldn't be any deterioration in the signal.

Now if you have a projector, the HDMI cable run could be more than a meter or two, in which case the cable length could cause some deterioration of the signal. But once again, simply getting a reasonable quality cable should be sufficient to keep the signal clean at the receiving end.

At £5 or less, I start to question the quality of the cable. But, on the other hand, one does not need to go as far as a £150 cable as the alternative. I think cables in the £10 to £20 range are sufficient for most systems, though if you felt you needed more, then perhaps £20 to £40 for standard length cables wouldn't be a disaster. Still, for normal consumer grade equipment, I would be reluctant to go much beyond that.

To some extent, and much like speaker wire, it is about reasonable proportions. It doesn't make sense to buy a £300 AV amp and put a £150 HDMI cable on it. However, if you happen to have a £15,000 system, a £150 cable might not be necessary, but neither would it be out of place.

Steve/bluewizard
 

formbypc

Novice Member
It seems that many people still believe they can get better picture/sound from more expensive HDMI cables.....

I note that none of this contains a statement that you've actually tried both levels of cable in your system.

Therefore, your assertions are all based on theoretical hypotheses. You haven't tested those hypotheses, have you?

I would dispute the statement that HDMI transmission between two video devices in real time is 'exactly the same' as shuffling data between two PCs or file locations on a hard disk.
 

andy1249

Distinguished Member
I note that none of this contains a statement that you've actually tried both levels of cable in your system.

Therefore, your assertions are all based on theoretical hypotheses. You haven't tested those hypotheses, have you?

I would dispute the statement that HDMI transmission between two video devices in real time is 'exactly the same' as shuffling data between two PCs or file locations on a hard disk.

I have tested hundreds of cables, from the cheapest to the most expensive.

In terms of data transmission, once the cable is mechanically sound and the signal is clear of the eye mask, they all do exactly the same job.

Based on rock solid data, there is no justification whatsoever for a £150 HDMI cable.

In terms of data transmission protocols, HDMI transmits a TMDS signal over balanced lines. PCs usually use Ethernet over Cat 5/6 cable or equivalent.

Not exactly the same, but any attempt to use this difference to argue that one HDMI cable is better than any other is a futile task; any mechanically sound HDMI cable performs as well as any other.
 
Last edited:

PhilCTTE

Well-known Member
I've just bought two 1.8m cables for £6.50 in a sale! They work fine between DAC and computer.

I use the same logic for HDMI cables and optical: as long as the connections/terminations are of good quality and will last. I look at the specifications of the cable; for USB my requirement is USB 2.0, which to me guarantees that the cable is designed well enough to deliver the data at the defined speed of 12 Mbit/s. How this is achieved is down to the design quality of the cable, whether it's the twisting or the AWG rating. I'm out of my depth on the technicalities and rely on these standards as the level of spec when I'm purchasing.

It still doesn't justify a £150 price tag. No way. In fact it angers me to see audio websites trying to rip customers off.

Optical may be a slightly different argument, but for me POF types suffice.

For both HDMI and optical I've never spent more than £30-£40, which I now think is probably £20 too much.
 

formbypc

Novice Member
See, that's more like it - someone (andy1249) who has actually done some testing, and recognises that HDMI and PC data transmission aren't exactly the same.

Can I ask though - at any point, did your testing involve simply connecting basic and premium HDMI cables up to a video setup and forming an opinion for yourself on whether there was any noticeable difference in picture quality?
 

Avi

Distinguished Member
Can I ask though - at any point, did your testing involve simply connecting basic and premium HDMI cables up to a video setup and forming an opinion for yourself on whether there was any noticeable difference in picture quality?

I've compared numerous HDMI cables, from free-of-charge ones included in the box to loan cables costing several hundred pounds. I don't have the equipment to measure the signal itself, i.e. eye mask performance, but I have measured the results at the display in terms of differences in key image attributes, i.e. black level, white level, grayscale, gamma, primary/secondary saturation/luminance, chroma resolution etc. I've also watched the same real-world video sequences to compare. Within the tolerance of the measuring equipment I wasn't able to detect, or identify by eye, any difference across cables that functioned properly. Blind comparison likewise showed no correlation between price and the performance of properly working HDMI cables.

That said if I look long enough at the same material I can convince myself I can see differences even without changing the cable. I guess this is a symptom of the human condition.

Avi
 
Last edited:

Mark.Yudkin

Distinguished Member
The big difference between PC data copying over USB and HDMI transmission is that the first supports retransmission for error correction, whereas the second doesn't, as it needs to meet hard real-time demands (there's a toy sketch of this difference at the end of this post). However...

All of the arguments about price are spurious in the context of a cable with a central certification authority. Any certified cable is certain to meet the performance specification for which it has been certified. Leaving automotive applications aside, the only problems that arise are 1) using a category 1 cable (standard speed) in a category 2 context (high speed), or 2) using a non-Ethernet cable in an Ethernet context. Both of these are trivial to avoid. See here for a full explanation.

The likely "problems" with a £5 cable are that it possibly isn't certified, in which case there is no guarantee of anything, and that it's unlikely to be high speed, which can cause problems if you want to send 1080p. Certified high speed cables typically cost a bit more than £5, but nothing like £150.
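
Here is that toy sketch of the retransmission difference, in Python. The 10% loss rate is wildly exaggerated for the sake of illustration; a healthy HDMI link loses essentially nothing:

import random

random.seed(7)

def lossy_send(payload, loss=0.1):
    # A toy channel that randomly drops packets; returns None on loss.
    return None if random.random() < loss else payload

data = list(range(20))

# USB/file-copy style: detect the loss and retransmit until it gets through.
received_file = []
for pkt in data:
    while True:
        got = lossy_send(pkt)
        if got is not None:
            received_file.append(got)
            break

# HDMI style: hard real time, one shot per packet, no second chances.
received_stream = [lossy_send(pkt) for pkt in data]

print("file copy:", received_file)     # always complete
print("stream   :", received_stream)   # None marks data lost for good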
 
Last edited:

andy1249

Distinguished Member
Can I ask though - at any point, did your testing involve simply connecting basic and premium HDMI cables up to a video setup and forming an opinion for yourself on whether there was any noticeable difference in picture quality?

In pretty much all cases, yes. For one particular test, all cables under test had a trailer played to an identical screen setup (Return of the King, to be precise); however, it's more usual to display test patterns and measure for accuracy (Rec. 709).

This would be done only under more specific and strenuous tests, and only at the insistence of the customer, because if the data is the same and the screen is the same, then by definition the result is exactly the same.

This, of course, was completely verified by the Rec. 709 measurements, and once we had the data to present, you usually go back to the customer, present the data, and eliminate that test to save time and money.
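
For anyone wondering what a Rec. 709 accuracy measurement is checking against: among other things, the standard fixes how luma is derived from the R, G and B components. A minimal Python sketch of just that one relationship (the test values are invented):

def rec709_luma(r, g, b):
    # BT.709 luma weighting: green dominates, blue contributes least.
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Components normalised to 0.0-1.0, e.g. a white patch and a pure green patch.
print(rec709_luma(1.0, 1.0, 1.0))   # 1.0 (full white)
print(rec709_luma(0.0, 1.0, 0.0))   # 0.7152 (pure green)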
 

Nallen0401

Active Member
I note that none of this contains a statement that you've actually tried both levels of cable in your system.

Therefore, your assertions are all based on theoretical hypotheses. You haven't tested those hypotheses, have you?

I would dispute the statement that HDMI transmission between two video devices in real time is 'exactly the same' as shuffling data between two PCs or file locations on a hard disk.

I find this point of view pretty interesting. Perhaps it's born of a long time in the analogue AV world, where cable quality is actually an issue, and perhaps some of that time was spent justifying the sometimes absurd costs.

As I said in a previous thread on this topic, holding the opinion that different HDMI cables can influence a digital signal, while assuming that 100% of the digital signal is passed through, is in my mind analogous to thinking that a digital photo will fade. It's just a misunderstanding of the technology.
 

srynznfyra

Standard Member
Thinking cables can influence a digital signal so that it's still there, but changed, is akin to thinking supernatural forces wash through your TV, producing a bad picture.

The key issue here is that a digital signal is a digital signal, i.e. a series of 1s and 0s. Either the cable will work, and the electronics will interpret the signal at the other end, or it will not work, and you will get no picture. Cables have no sort of capacitive element to them, smoothing out and warping the signal. That simply isn't the nature of them. They are just wires, and the reliability of the signal is solely related to the equipment producing and interpreting it.

Our eyes and brain can very easily trick us into thinking something we've spent an arm and a leg on will be advantageous relative to a dirt cheap alternative. However, in this case at least, by definition we *know* that cables cannot affect digital signals, and that the worst they can possibly do is not let them through (in which case the screen will go black, of course).
 

Mark.Yudkin

Distinguished Member
The key issue here is that a digital signal is a digital signal, i.e. a series of 1s and 0s. Either the cable will work, and the electronics will interpret the signal at the other end, or it will not work, and you will get no picture. They are just wires, and the reliability of the signal is solely related to the equipment producing and interpreting it.
Unfortunately, this gross oversimplification is just that: a complete failure to consider reality.

I suggest reading a standard textbook on digital communications, so as to achieve a basic understanding of the engineering issues involved. For example: Ian Glover, Digital Communications; Robert Gallager, Digital Communications; or Bernard Sklar, Digital Communications: Fundamentals and Applications.

As I already explained, the complexity of getting HDMI right is of little consequence to the consumer, as certification by the HDMI organization provides a performance guarantee, and the consumer need only check for the certification logo.
 
Last edited:

srynznfyra

Standard Member
Unfortunately, this gross oversimplification is just that: a complete failure to consider reality.

May I ask how this is true? :rolleyes:

A digital signal is a square wave, containing just two values (you cannot argue with that :smashin:). The only way the wave could get corrupted is if parts of it that were "low" were so high as to be interpreted as "high" by the digital signal processor (or vice versa), OR if the jump between "high" and "low" took too long, OR if the difference between "high" and "low" was made smaller by the cable (i.e. level loss - something that could not *really* occur in any cable at the lengths we're talking about). Obviously there are other things involved with digital video, but as far as I know they all have to do with the actual processors and electronics that process the waves, rather than the waves in and of themselves.

Please tell me if there are any other such means for the actual wave to be interpreted differently from the original, OR how, in any way, a cable could produce these "corruptions" in the square wave, OR how, in any way, a cable could cause a signal processor at the other end to interpret the wave wrongly without actually significantly changing the wave (without linking to any books please :nono:).
 

phil t

Well-known Member
Cables have no sort of capacitive element to them, smoothing out and warping the signal. That simply isn't the nature of them. They are just wires, and the reliability of the signal is solely related to the equipment producing and interpreting it.

Not so.

Any two conductors separated by a distance can store a charge. So any two wires in a cable or harness can store a charge. This stored charge can affect how the cable behaves during testing.
The term "Capacitance" describes the ability of two conductors (separated by insulation) to store a charge. Capacitance is affected by the distance between the conductors and the insulation around the conductors. As the conductors get closer together or have more surface area (longer wires, shields etc.) the capacitance will increase.

Capacitance in cable is usually measured as picofarads per foot (pF/ft). It indicates how much charge the cable can store within itself. If a voltage signal is being transmitted by a twisted pair or coaxial cable, the insulation on the individual wires becomes charged by the voltage within the circuit. Since it takes a certain amount of time for the cable to reach its charged level, this slows down and interferes with the signal being transmitted. Digital data pulses are a string of voltage variations that can be represented by square waves with near-vertical rise and fall transitions. A cable with a high capacitance slows down these voltage transitions so that they come out of the cable looking more like "saw-teeth", rather than square waves, and the circuitry may not recognize the pulse. The lower the capacitance of the cable, the better it performs at higher frequencies.
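
You can see that "saw-tooth" effect numerically with a crude first-order RC model in Python (the time constants below are invented for illustration, not taken from any real cable):

def rc_filter(samples, dt, tau):
    # First-order RC response: the output creeps toward the input each step.
    out, v = [], 0.0
    for vin in samples:
        v += (vin - v) * (dt / tau)
        out.append(v)
    return out

# A 1-0-1-0 bit pattern, 10 samples per bit period.
square = ([1.0] * 10 + [0.0] * 10) * 2
low_cap  = rc_filter(square, dt=1.0, tau=1.0)   # fast edges survive
high_cap = rc_filter(square, dt=1.0, tau=8.0)   # edges rounded into saw-teeth
print([round(v, 2) for v in high_cap])

With the large time constant the output never reaches a clean "high" or "low" within a bit period, which is exactly the rounding described above.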

:)
 

Mark.Yudkin

Distinguished Member
without linking to any books please
I gather you just like to argue from ignorance, as you're obviously unwilling to make the effort to learn, preferring to be rude rather than review the underlying engineering issues. That is your problem; you have been given three hyperlinks to introductory textbooks. Now it's over to you to make the necessary effort.

[Why is it we get either 1) mega-bucks cables make a difference (except in properly conducted tests) or 2) any old junk is perfect (except when it fails), when the reality is that both positions are untenable?]
 

Alan Mac

Active Member
Quote:
“The lower the capacitance of the cable, the better it performs at higher frequencies.”

I'm afraid that is not true either!

When the cable is acting as a transmission line, the capacitance (and the inductance) of the cable effectively disappears, and the line looks purely resistive if terminated in its characteristic impedance.


At high frequencies the characteristic impedance simplifies to:

√(L / C)

(which has the dimensions of pure resistance)

where:
L is the inductance per unit length
C is the capacitance per unit length



However, the (resistive) losses in the dielectric and the conductors of a practical transmission line do increase as the frequency increases. So a digital signal will have its square transitions progressively "rounded off" as the clock rate is increased or the length of the transmission line is increased, leading to the familiar (?) "eye"-shaped pattern when the output signal is viewed in the time domain.
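
To put a number on that formula, here's a two-line Python check. The per-metre values are illustrative (made up, not from any datasheet), chosen to land on the 100 ohm differential impedance that HDMI's TMDS pairs are specified for; note that because both L and C scale with length, the length cancels out of the result:

import math

L = 500e-9    # inductance in henries per metre (illustrative)
C = 50e-12    # capacitance in farads per metre (illustrative)

Z0 = math.sqrt(L / C)
print(f"Z0 = {Z0:.0f} ohms")   # -> Z0 = 100 ohms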


Alan
 
Last edited:

srynznfyra

Standard Member
@phil t, regarding a wire acting as a capacitor:

Okay, you've got me there. Wires can act as capacitors. My point still stands though - as you said, the capacitance is usually measured in picofarads per foot, which if you ask me is pretty small. Also, regarding the OP, I doubt a gold wire or whatever would have less capacitance than a cheap(er) copper wire.

I gather you just like to argue from ignorance, as you're obviously unwilling to make the effort to learn, preferring to be rude rather than review the underlying engineering issues. That is your problem; you have been given three hyperlinks to introductory textbooks. Now it's over to you to make the necessary effort.

You gather wrong, I'm afraid. I'm perfectly willing to learn, but I presume that if you possess enough knowledge to tell me, as a matter of fact, that everything I say is wrong, then you should be able to tell me in a few short sentences why, rather than linking to a textbook without even a reference to a specific point within it - which is arguably a very vague way of proving a point, and at worst patronising.

So please tell me why I'm wrong, if you care so much :) :lesson:
 

andy1249

Distinguished Member
In terms of the design of any digital interface, or any interface at all for that matter, there is a lot to consider.

For the purposes of this forum, and to answer questions for those with a non-technical background, I usually just get to the point and say that, at the end of the day, HDMI cables either work or they quite obviously don't.

The reasons why some don't are very technical indeed. When it comes to HDMI, there is a lot to learn if you want to understand it completely; it's not something you can write in three or four sentences.

You have to understand the nature of balanced signals, you have to get to grips with the TMDS protocols, you need to understand the concept of channel codes, and you need to understand the bandwidth limitations of the interface and why the geometry of the twists in the wires is so important.

That and many, many other concepts.
This is true of the HDMI interface and any other interface.

For example, getting to grips with Ethernet and TCP/IP takes a couple of years of study if you want to work in networking. Explaining it to the non-technical is not easy; that can't be done in a couple of lines either. For that, the best I could do would be to point you to one of the many CCNA books.

For HDMI, I could point you to the HDMI 1.3 spec, which is available from many sites for free. If you want to see the HDMI 1.4 spec you must register and pay a fee; the latest spec is never free.

http://www.dybkowski.comule.com/download/hdmi/hdmi_spec_1.3_gm1.pdf

So, while it is definitely the case that HDMI either works or does not, the reasons why this is so are not as simple as you said. It's actually far more complex than that, and it's not possible to explain why in a couple of lines.

The spec I linked to is 237 very technical pages, and you need an education in the subject to understand the majority of it. You should at least be able to glean from the content that it cannot be summarised in a paragraph or two; if it could, we would all be engineers, wouldn't we?
 
Last edited:

BlueWizard

Distinguished Member
Let's view this from a slightly different perspective. Let's say I buy a cable and claim that it produces bluer blues. Well one would then have to ask, how does the cable know which of the numbers passing through it are blue numbers, and how does the cable consistently select the blue numbers?

Any changes to the data passing through a given cable are going to be purely random, and purely random changes to the numbers passing through are not going to show up on the screen in any consistent, and certainly not in any helpful, way.

A cable cannot produce bluer blues, redder reds, or greener greens, because those color values are fixed in the numbers being transmitted. And it certainly can't make blacker blacks, because black is a number value of ZERO. So, can the cable make the number ZERO even more ZERO? I don't think so.

Any change that a cable makes is going to be in the form of random errors, totally and completely random errors, and as Mark points out, if the cable is certified by a standards body, then there really should be no errors at all.

Yes, parameters of the cable can alter the wave shape, and that could cause read errors, but it is not likely that a cable like that would be certified to a given standard. And as Mark further points out, a cheap under-£5 cable may look right, but it is not likely to perform right. Which is why I say, any reasonable cable in the £10 to roughly £20 price range is going to do as good a job as any.

Now, it is possible that even with certified cable, you might get a one in a million error, but can you detect a one in a million error, especially when it is occurring randomly? I don't think so.

Even an erroneous number is likely to have only a few bits misplaced, so it will keep some of its original value. So, say, 255 becomes 247 - but how would you know that 247 was the wrong value?
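
You can see how little a single flipped bit moves an 8-bit pixel value with a throwaway Python loop (the value 255 is just an example):

value = 255                          # an 8-bit pixel component at full scale
for bit in range(8):
    corrupted = value ^ (1 << bit)   # flip exactly one bit
    print(f"bit {bit} flipped: {value} -> {corrupted}")

Flipping bit 3 gives 247; only a flip of the top bit moves the value by 128, while low-bit flips change it by one or two levels that no eye will catch.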

Further, while there is no transmission error correction or retransmission, the TV itself has an error concealment mechanism. If a given byte of data (pixel) is unreadable, the TV can substitute the data from the previous frame in its place. This means it would only be even remotely noticeable in a scene transition, and the time frame of a single full screen is about 1/60th of a second. You've got to have a pretty fast eye and a pretty fast brain to detect a single-pixel error lasting 1/60th of a second.

Now, when the errors become gross, the TV might be forced to substitute an entire frame, but still, that only represents 1/60th of a second of error concealment. For a picture to really break up or freeze in a noticeable way represents a sustained run of transmission errors, and while that might happen over the air, it is EXTREMELY unlikely to happen through a short run of certified cable. And if it does happen, it is more likely related to an extreme cable length, or a problem with the DVD/Blu-ray disc, the DVD/Blu-ray player, or the TV itself.

But the point of all this rambling is that a cable cannot add positive benefit to the signal; it can only transmit it accurately or with increasing amounts of error. It can never add to the signal; it can only equal or lessen it. The data either came through or it didn't.

And for a cable to decrease the quality of the signal in any noticeable way, it would have to be some pretty crappy cable. So the solution is: don't buy crappy cable. But then, if you only have to move one grade above crappy, it doesn't really pay to buy overpriced cable either.

Again, it is about a combination of minimum quality and a price in reasonable proportion to your system. You don't want crappy cable, but once that is off the table, any decent cable will do. Now, if you have countless thousands in your system, then spring for some high quality cable - you can afford it. But for the rest of us, just get some decent consumer grade certified cable and you'll be fine.

In conclusion, it is impossible for a cable to ADD to the signal; the best it can do is equal or subtract. So bluer blues, blacker blacks, etc... is a steaming pile of meadow muffins ... in my opinion, of course.

Steve/bluewizard
 
Last edited:

dlg78

Novice Member
I am using a 1m cable between a PS3 and a Panasonic LED LCD display. Blu-ray looks fantastic and game graphics look great too.

Cable cost £0.76 from Amazon.
 

Don Dadda

Distinguished Member
I am using a 1m cable between a PS3 and a Panasonic LED LCD display. Blu-ray looks fantastic and game graphics look great too.

Cable cost £0.76 from Amazon.


dlg78,

Nothing wrong with your post, and it is relevant, so please don't be offended, but reading it straight after the previous three or four posts, three of which were biggies and somewhat technical, had me in stitches. :rotfl::rotfl:

Short but sweet. :thumbsup:

Should be a comedy sketch :laugh: :laugh:.
 

Mark.Yudkin

Distinguished Member
So please tell me why I'm wrong, if you care so much
It was you who posted the oversimplified fantasy, arguing with all the other posters who had provided simple explanations on a number of points. My suggestion was that you make the effort to understand the fundamentals by reading a standard undergraduate textbook - a book, because the complexity of the subject cannot be summarised in a couple of short sentences, but needs space to explain. My youngest son is now in his second of four years of an electrical engineering degree at a college of technology; I spent 7.5 years studying digital communications at university level (and fully admit I can't teach, and therefore don't).

The electrical properties of HDMI 1.3 are explained in section 4.2 of the specification, downloadable free of charge, but with registration, from HDMI :: Manufacturer :: Specification. Cable requirements are defined and quantified in subsection 4.2.6. The HDMI 1.4 specification is not free. Of course, a specification is like a book: it takes time to read and comprehend, and the HDMI specification assumes a certain level of background.
 

Mark.Yudkin

Distinguished Member
For HDMI, I could point you to the HDMI 1.3 spec, which is available from many sites for free. If you want to see the HDMI 1.4 spec you must register and pay a fee; the latest spec is never free.
http://www.dybkowski.comule.com/download/hdmi/hdmi_spec_1.3_gm1.pdf
I see that's the V1.3 22/6/2006 version (237 pages). I've been using the V1.3a 10/11/2006 version (276 pages). The biggest difference is in the CEC appendix, which is expanded from 62 to 97 pages. You can find it at http://www.evernew.com.tw/HDMISpecification13a.pdf
 

Trollslayer

Distinguished Member
A digital signal is a square wave, containing just two values (you cannot argue with that :smashin:).

Sorry, but the problem is that the signal is NOT square.
To take a simple analogy: what happens to sound in a tunnel? Echoes and distortion occur, as well as attenuation, noise from surrounding sources, etc.
The consistency of a cable is as important as its attenuation, which is why you can kink a mains cable and it will still work fine, but not a high-speed signal cable.
 
