Why is it impossible for an HDMI cable to be responsible for incremental changes in picture quality?
Video information is transmitted as a series of 24-bit pixels, 8 bits for each of the three primary colours; these are encoded using the TMDS protocol into three 10-bit words per pixel clock period (i.e. each pixel is carried as 30 bits on the wire).
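To give a feel for what that 8-bit-to-10-bit encoding does, here is a sketch of the first stage of TMDS (the transition-minimising XOR/XNOR chain that produces 9 of the 10 bits); the spec's second, DC-balancing stage is omitted, and the function names are my own:

```python
def tmds_stage1_encode(d):
    """Transition-minimising stage of TMDS (8 bits -> 9 bits).

    d: 8-bit data value. Returns bits q[0]..q[8], where q[8] flags
    whether XOR (1) or XNOR (0) chaining was used. The DC-balancing
    stage that adds the 10th bit is omitted from this sketch.
    """
    bits = [(d >> i) & 1 for i in range(8)]
    ones = sum(bits)
    # XNOR is chosen when the byte has many ones, to limit transitions
    use_xnor = ones > 4 or (ones == 4 and bits[0] == 0)
    q = [bits[0]]
    for i in range(1, 8):
        if use_xnor:
            q.append(1 - (q[i - 1] ^ bits[i]))
        else:
            q.append(q[i - 1] ^ bits[i])
    q.append(0 if use_xnor else 1)
    return q

def tmds_stage1_decode(q):
    """Invert the stage-1 encoding and recover the original byte."""
    xor_encoded = q[8] == 1
    bits = [q[0]]
    for i in range(1, 8):
        b = q[i] ^ q[i - 1]
        bits.append(b if xor_encoded else 1 - b)
    return sum(b << i for i, b in enumerate(bits))

# Round-trip check over every possible byte value
assert all(tmds_stage1_decode(tmds_stage1_encode(d)) == d for d in range(256))
```

The point to take from this is that the encoding is fully deterministic and reversible: as long as the bits arrive intact, the sink recovers exactly the byte that was sent.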
They are also supplied to the screen at a rate equal to:
Pixel rate = Resolution x Refresh Rate x [1 + Blanking Period], in pixels per second
Where the Blanking Period is the sum of the horizontal and vertical blanking intervals.
For 1080p @ 60 Hz this would be
1920 x 1080 x 60 x [1 + 0.16] ≈ 144.3 MHz, or about 144.3 million pixels per second.
You can work out the pixel rate for any particular content by substituting its resolution and refresh rate into the equation above. I'm picking 1080p @ 60 Hz as this is the worst case in terms of content actually available at the moment.
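The calculation above can be sketched in a few lines (the 0.16 blanking overhead is the same approximation used in the worked example):

```python
def pixel_rate(h, v, refresh_hz, blanking=0.16):
    """Approximate pixel clock: active pixels x refresh rate x (1 + blanking overhead)."""
    return h * v * refresh_hz * (1 + blanking)

# 1080p @ 60 Hz, as in the text
rate = pixel_rate(1920, 1080, 60)
print(f"{rate / 1e6:.1f} million pixels per second")  # ~144.3
```

Substituting other resolutions (720p, 1080i field rates, and so on) gives the corresponding rates directly.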
Now let's start by assuming a 0% bit error rate on a particular cable.
That means no errors whatsoever in the stream: all of the pixel information, the three 10-bit words for each pixel, gets through exactly as it was transmitted, meaning a perfect result.
A cable is a passive collection of wires; there is no way for it to manipulate the data in those three 10-bit words, so a cable can in no way improve on the data. For a cable with a zero bit error rate, this is as good as it gets.
That means the only possible way for a cable to change the data is if it somehow corrupts the data.
Now consider what the nature of this corruption would have to be if the cable were somehow responsible for "deeper blacks", more vibrant colours, and so on.
That is the sort of thing claimed by some of the more disreputable magazine reviewers and cable sellers!
The corruption would have to take the form of a fault that changed the three 10-bit words of every pixel such that every pixel encoding a black was shifted to a deeper black, and every pixel encoding another colour was shifted to a value encoding a more vibrant colour!
So you would have to believe that random errors introduced into the bitstream could somehow cause every pixel to encode a value that produces a better picture, and do so 144 million times per second.
Clearly such a thing is beyond any probability. In fact, any errors introduced would most likely produce a 30-bit pixel value that made no sense or was totally different from the original, showing nonsense on the screen, if it showed anything at all. (In the vast majority of cases this is exactly what happens on data corruption: the interface just stops working.)
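To put a number on how violent even a single bit error is: flipping bit i of an 8-bit colour component changes its value by exactly 2^i, so a one-bit error jumps the component by anywhere from 1 to 128 levels, roughly 32 on average. That is random damage, not a subtle "deeper black". A minimal illustration, using a dark grey sample value of my own choosing:

```python
# Effect of flipping each possible bit of an 8-bit colour component
# (0x20, a dark grey, chosen arbitrarily for illustration):
sample = 0x20
errors = [abs((sample ^ (1 << i)) - sample) for i in range(8)]
print(errors)           # [1, 2, 4, 8, 16, 32, 64, 128]
print(sum(errors) / 8)  # 31.875 levels of error on average
```

Half of all single-bit errors move the value by 16 levels or more, which is why corruption shows up as sparkles and garbage rather than as any consistent tonal shift.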
In the case of long cables there is an attenuation problem, but this again does not degrade sound or picture quality incrementally, because the electrical signal on the cable is not an analogue signal: only the logic levels representing the data matter, and when these become indistinct you lose data outright. For an HDMI cable, which carries so much data, that in all probability means the cable just stops working.
As you can see from this, it is clearly impossible for an HDMI cable to be responsible for incremental picture changes: either the data gets through with a very low or zero bit error rate, or most of the data gets corrupted. For most cables it really is a case of they work or they don't.
You cannot have cables that are slightly better than others in terms of picture quality; the technology simply doesn't allow it. It's completely impossible.
Regarding audio quality with HDMI: audio data is inserted into the video blanking periods, so it is not a continuous bitstream. It is recovered and reclocked at the sink (the receiving device) using a formula based on the video clock.
The audio clock is not transmitted over HDMI
Rather, it is derived at the sink end from the video clock
The source computes integers N and CTS such that
128 x fs = fTMDS_CLK x N / CTS
N is fixed for a given video and audio rate (table lookup)
The source counts TMDS clocks per audio clock period to determine CTS
N and CTS are transmitted in the Audio Clock Regeneration packet
The sink regenerates the audio sample clock from the received fTMDS_CLK, N, and CTS values
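As a worked example of the regeneration formula above, assuming the commonly tabulated N value of 6144 for 48 kHz audio and the nominal 148.5 MHz 1080p60 TMDS clock:

```python
def regenerated_fs(f_tmds, n, cts):
    """Sink-side audio clock recovery: 128 * fs = f_tmds * N / CTS."""
    return f_tmds * n / (128 * cts)

f_tmds = 148_500_000  # nominal 1080p60 TMDS (pixel) clock, in Hz
n = 6144              # typical table value for 48 kHz audio
# What the source would measure and send as CTS:
cts = f_tmds * n // (128 * 48_000)
print(cts, regenerated_fs(f_tmds, n, cts))  # 148500 48000.0
```

Note that N and CTS are just integers carried in packets; as long as they arrive intact, the sink reconstructs exactly the intended sample rate, which is why the cable has no say in the matter.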
Asynchronous video and audio clocks, or audio clock jitter, can cause CTS to change over time, but it's important to realise that such jitter cannot be caused by the cable; it can only arise from problems in the source or sink silicon/clock combination. So regardless, the cable cannot affect audio quality.
In addition, note that typical HDMI chipsets, such as the Silicon Image 9134/9135 transmitters and receivers, have jitter performance better than 1 ps.
Summing it all up: in the case of an HDMI cable, no amount of spending will improve picture or sound quality; that's impossible. If it works, it works. That's it!