Optical cable signal loss


Well-known Member
I have recently installed an LG TV on my wall. As many TVs do, it has the design fault that a lot of the ports face straight out of the back, making it impossible to wall-mount the TV and use those ports at the same time.

I used this right-angled adaptor so I could fit an optical cable:

The problem is that it doesn’t work. I can see a red light coming out of the end of the optical cable, but neither of my AVRs detects a signal. I have got a Bose sound bar that DOES detect a signal and plays it back fine.

So, my questions are:

1. Is the right-angled adaptor likely to be introducing significant signal loss? The red light coming out of the cable is dimmer than the light my Samsung TV produces (which works fine with the same amps).

2. How can the Bose recover an undistorted signal when the AVRs cannot?

I’m here on my own at the moment which is why I can’t just take the TV off the wall to test it.


Well-known Member
It’s a Marantz SR7010. I’ve got it up and running now using a shorter cable into the right-angled adaptor.

It appears that the adaptor does introduce some loss, but using a different cable has made it usable.

I can only assume that the Bose soundbar is more sensitive than the AVR, as it was producing a full, undistorted signal from the original cable when the AVR produced nothing.
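This is consistent with a simple optical link budget. As a sketch (the symbols and figures below are illustrative assumptions, not measured values for these particular devices): if the TV's TOSLINK transmitter launches power \(P_{tx}\), the adaptor adds insertion loss \(L_{adaptor}\) and the cable adds loss \(L_{cable}\), then a receiver with sensitivity threshold \(P_{min}\) only locks on when

```latex
P_{tx} - L_{adaptor} - L_{cable} \ge P_{min}
```

For example, if a dim transmitter gives a budget of, say, 3 dB of headroom, an adaptor costing ~2 dB plus a long or poor cable costing ~2 dB would push the level below an AVR's threshold, while a soundbar whose \(P_{min}\) sits a few dB lower would still decode cleanly. Swapping to a shorter cable reduces \(L_{cable}\) enough to bring the AVR back over its threshold, which matches what happened here.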

