DVI to HDMI

DublinMeUp

Hey guys,

First post, so apologies if I'm in the wrong section or infringe any rules etc.

Basically, I just bought a DVI-D to HDMI cable and discovered that it doesn't carry audio. But I know that it can, as I used to use one of these (in conjunction with a normal HDMI cable):
[Image: DVI-to-HDMI adapter]

Why would the adapter above carry sound but not the cable I just bought?
I did notice that the DVI part of the cable is missing about 9 pins in the centre of the block whereas the adapter has them. Could this be it?
 
Yes, DVI has three channels and HDMI uses the fourth for audio.
HD audio can have very high bitrates.
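To put rough numbers on that: uncompressed LPCM bitrate is just channels × sample rate × bit depth. A quick sketch (plain Python, standard LPCM figures, nothing vendor-specific):

# Uncompressed LPCM bitrate = channels x sample rate x bit depth.
def lpcm_bitrate(channels, sample_rate_hz, bits_per_sample):
    return channels * sample_rate_hz * bits_per_sample

# 2-channel CD-quality audio: ~1.4 Mbit/s
print(lpcm_bitrate(2, 44_100, 16) / 1e6)   # 1.4112

# 7.1-channel HD audio at 192 kHz / 24-bit: ~36.9 Mbit/s
print(lpcm_bitrate(8, 192_000, 24) / 1e6)  # 36.864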
 
Sorry, is that a yes to the missing block of pins being the culprit?
 
Just an old Philips LCD to an even older Dell XPS tower.

Have an ATI Radeon 4800 series card in it.

I swear using that simple little adapter to "create" an HDMI port sent both sound and video. Can't work out how it did, seeing as the general consensus is that this cannot happen through DVI.
 
Is that the same combination of Source and Sink (Display) you used with the DVI cable + DVI-to-HDMI adapter?

If you read the wiki you will note that some graphics cards were capable of outputting audio via DVI to an HDMI-equipped sink, though that required careful configuration of the PC and limiting the audio to 2-channel.

Joe
 
Audio over DVI is a non-standard implementation, so it is down more to the source and sink than the cable.

The missing pins are part of the dual-link specification, but I don't think this will affect operation. The analogue signals are carried on the four large contacts around the flat blade at the end of the connector.
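For reference, the second TMDS link occupies the 2x3 block in the centre of the shell, which is exactly what a single-link DVI-D cable leaves out; the analogue contacts sit around the blade. A quick pin map in Python (standard DVI pinout; the helper function is just for illustration):

# Dual-link-only TMDS pins on a DVI connector; a single-link
# DVI-D cable omits this 2x3 block in the centre of the shell.
DUAL_LINK_ONLY = {
    4:  "TMDS data 4-",
    5:  "TMDS data 4+",
    12: "TMDS data 3-",
    13: "TMDS data 3+",
    20: "TMDS data 5-",
    21: "TMDS data 5+",
}

# Analogue contacts (DVI-I / DVI-A only), around the flat blade.
ANALOG_CONTACTS = {"C1": "red", "C2": "green", "C3": "blue",
                   "C4": "H-sync", "C5": "analogue ground"}

def is_single_link(present_pins):
    # True if none of the second-link TMDS pins are populated.
    return not (DUAL_LINK_ONLY.keys() & present_pins)

print(is_single_link({1, 2, 3, 6, 7, 17, 18}))  # True: no second link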

Audio is carried as packets within the video "channels", not as a 4th channel. In fact, all the audio, video and control data is interleaved within these channels. Depending upon the signal type, the audio packets are carried in different parts of the stream. This is why non-3D-compliant AV receivers cannot decode the audio and pass through the 3D content: the audio is not where it would be for normal 2D content.
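If it helps to visualise that, here's a toy model of the interleaving (illustrative only, not the real TMDS encoding): audio sample packets ride in "data island" periods during the blanking gaps, and the active video follows in the same serial stream.

# Toy model of one HDMI scanline: audio travels as packets in
# "data island" periods during blanking, not on a separate wire.
from dataclasses import dataclass

@dataclass
class Period:
    kind: str      # "control", "data_island" or "video"
    payload: str

def build_scanline(active_pixels, audio_samples):
    line = [Period("control", "preamble")]
    # Audio sample packets go out in the horizontal blanking gap.
    for s in audio_samples:
        line.append(Period("data_island", f"audio sample packet {s}"))
    # Then the active video for this line follows.
    line.append(Period("video", f"{active_pixels} px of RGB"))
    return line

for p in build_scanline(1920, ["L/R #0", "L/R #1"]):
    print(p.kind, "->", p.payload)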

DVI, being an older standard, does not support audio or component video, only 8-bit RGB. Some displays cannot accept RGB at more than 1024 x 768 and disregard any audio if the source is DVI. I think this may be part of the problem.
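For what it's worth, the single-link numbers work out like this (165 MHz is the single-link TMDS pixel-clock ceiling; the mode timings in the sketch are the standard CEA / reduced-blanking figures):

# Single-link DVI tops out at a 165 MHz pixel clock with 8 bits
# per colour channel (24 bits per pixel), so the raw payload is:
PIXEL_CLOCK_HZ = 165_000_000
BITS_PER_PIXEL = 24
print(PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9, "Gbit/s")  # 3.96

# Rough check that a mode fits: total pixels per frame (blanking
# included) times refresh rate must stay under the pixel clock.
def fits_single_link(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz <= PIXEL_CLOCK_HZ

print(fits_single_link(2200, 1125, 60))  # 1080p60 (CEA timing): True
print(fits_single_link(2080, 1235, 60))  # 1920x1200 reduced blanking: True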

DISCLAIMER... I think I got most of this right!
 
Yep, everything is the same; the only thing different is the cable. If that's what was meant by "Source and Sink"?

Only just remembered that I got the adapter with the computer originally, so yeah, my card does carry audio. Went into sound settings:
[Screenshot: sound settings]


Can't see that the cable and adapter would be that different; there must be something in the card itself that recognises when the adapter is fitted but not a normal cable?

So anyway, even if I could get the sound to go through, I would be better off using whatever audio-out ports are available, as it's only gonna be stereo?

Thanks for the help guys
 
S/PDIF looks to be your best route there.

And yes, based upon your comments, I would imagine the card uses some sort of sense pin to detect the HDMI adapter.
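It could also just be EDID: an HDMI-capable sink advertises itself with a vendor-specific data block (IEEE OUI 00-0C-03) in the CEA-861 extension of its EDID, and the card may simply key audio output off that rather than off a physical pin. A sketch of the check, assuming you can get at the raw EDID bytes (on Linux they're exposed under /sys/class/drm, for example):

# Sketch: spot an HDMI-capable sink from raw EDID bytes by finding
# the HDMI Vendor-Specific Data Block (IEEE OUI 00-0C-03) inside a
# CEA-861 extension block.
def sink_is_hdmi(edid: bytes) -> bool:
    for base in range(128, len(edid), 128):      # extension blocks
        block = edid[base:base + 128]
        if len(block) < 4 or block[0] != 0x02:   # 0x02 = CEA-861 tag
            continue
        dtd_start = block[2]    # offset where detailed timings begin
        i = 4                   # data block collection starts here
        while i < dtd_start and i < len(block):
            tag = block[i] >> 5          # top 3 bits: block type
            length = block[i] & 0x1F     # low 5 bits: payload length
            # Tag 3 = vendor-specific; the HDMI OUI is stored
            # little-endian in the first three payload bytes.
            if tag == 3 and length >= 3 and block[i+1:i+4] == b"\x03\x0c\x00":
                return True
            i += 1 + length
    return False

# e.g. (path is illustrative; connector names vary per system):
# print(sink_is_hdmi(open("/sys/class/drm/card0-DVI-D-1/edid", "rb").read()))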
 
