Using TV as gaming monitor

Stein67

Established Member
Joined
Nov 5, 2008
Messages
214
Reaction score
92
Points
86
Hi folks, I have recently put a 1660 Super into my old PC. Today I've connected it up to my Sony 65" XG95 and I'm getting 1080p 120Hz or 1440p 60Hz... so far so good.

But I've no sound!! I have it connected via HDMI. The TV is eARC'd to a JBL 9.1.

Any ideas how to get around this? In my sound options I'm only getting NVIDIA options, and they're greyed out because nothing is detected as connected!! A bit stumped...

Thanks for any help.
 
The HDMI cable is definitely plugged into the graphics card's HDMI socket, not via an adaptor and not into the motherboard sockets?
 
Yeah, def mate.
 
I've tried everything and no joy o_O
 
On your PC make sure the sound output is set to the graphics card.

What OS are you on?

If you're on Windows 10:

Click on the Windows icon (bottom left)
Type SOUND
Select SOUND SETTINGS
When that opens, select SOUND CONTROL PANEL on the right-hand side
You will see something like this:

[Screenshot: Sound.JPG]


Make sure you have the NVIDIA Output selected.

You can see on mine, I have HEADSET EARPHONE selected, so in my case nothing would go to a TV connected by HDMI.
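If you'd ever rather script that switch than click through the Sound panel, a third-party tool like NirSoft's NirCmd can change the default playback device from the command line. A minimal sketch, assuming nircmd.exe is on your PATH; the device name here is a placeholder, so use whatever name your NVIDIA output shows in the Sound control panel:

```python
import subprocess

def set_default_playback(device_name, dry_run=False):
    """Set the Windows default playback device via NirCmd.

    Assumes nircmd.exe (NirSoft) is on PATH. The device name must
    match what the Sound control panel displays for that output.
    """
    cmd = ["nircmd", "setdefaultsounddevice", device_name]
    if dry_run:
        return cmd  # just show the command that would run
    return subprocess.run(cmd, check=True)

# e.g. set_default_playback("NVIDIA Output")  # name is an assumption
```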

I remember it took quite a bit of puzzling to get sound to work on my HT PC in my man cave.

Cheers,

Nigel
 
Nigel, thanks so much mate for the in-depth reply. Really appreciate it.

I actually got a bit of joy a while ago.....but it's a mixed bag.

If I boot the PC with the HDMI cable either removed from the PC or TV - it then boots up and I can choose HDMI audio and it works great.

If I leave the cable connected then it's not recognised!!

So it's only recognising the HDMI cable AFTER the PC has booted up. While I'm glad I've made a bit of progress, there is hopefully a way around this, because disconnecting and reconnecting a cable after every boot is a bit of a pain!!

Anyone with further suggestions, given the above info?
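One thing worth trying before resorting to replugging: Windows 10 ships a `pnputil` tool that can re-scan for hardware after boot, which sometimes picks up an HDMI audio endpoint that was missed at startup. A rough sketch (not tested against this exact setup; needs an elevated prompt):

```python
import subprocess

def rescan_devices(dry_run=False):
    # Asks Windows to re-enumerate Plug-and-Play devices, much like
    # "Scan for hardware changes" in Device Manager. pnputil ships
    # with Windows 10 (1903+) and needs an elevated prompt.
    cmd = ["pnputil", "/scan-devices"]
    if dry_run:
        return cmd  # show the command without running it
    return subprocess.run(cmd, check=True)
```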
 
What order are you turning on the TV and the PC? The two devices need to complete an HDMI handshake (the EDID/HDCP exchange) for the destination device (the TV) to tell the source (the PC) what audio codecs it can use.

You could use an HDMI switch to avoid disconnecting the cable. Just select an unused input so the PC can't see the TV until you select the PC input on the switch.
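For the curious: during that handshake the TV's EDID carries Short Audio Descriptors (three bytes each, per the CEA-861 spec) that advertise which audio formats the source may send. A quick sketch of how one decodes, with an illustrative LPCM example:

```python
# Decode a CEA-861 Short Audio Descriptor (SAD) - the 3-byte entries
# in a display's EDID audio block that advertise supported formats.
FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "E-AC-3"}
RATES = [32, 44.1, 48, 88.2, 96, 176.4, 192]  # kHz, bits 0-6 of byte 1

def decode_sad(sad):
    fmt = (sad[0] >> 3) & 0x0F       # byte 0, bits 6:3 = format code
    channels = (sad[0] & 0x07) + 1   # byte 0, bits 2:0 = channels - 1
    rates = [r for i, r in enumerate(RATES) if sad[1] & (1 << i)]
    return FORMATS.get(fmt, f"format {fmt}"), channels, rates

# Example bytes: LPCM, 2 channels, 32/44.1/48 kHz
print(decode_sad(bytes([0x09, 0x07, 0x07])))  # -> ('LPCM', 2, [32, 44.1, 48])
```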
 
Sorry for the delay in getting back. Yeah, I've been looking at the splitter boxes but haven't seen any cheap ones compatible with 1080p/120Hz and 1440p/60Hz.

Absolutely insane that I have to plug the HDMI cable in after booting up the PC!!
 

I think this supports 1440p 60Hz; I have a memory of trying it at that, but I gave it away to another person so cannot confirm.

Another option: if your sound system has HDMI inputs, you can bypass eARC.

Get a DisplayPort-to-HDMI cable and run it directly to the sound system, then set that connection as your default audio device in Windows. I've found DisplayPort to be more reliable in scenarios like this.

As for correcting the HDMI audio detection, that is somewhat notorious in Windows for being buggy.

There is something you can try via CRU (Custom Resolution Utility) to override the detected audio formats, but there is no guarantee it will
a) work
b) stick, in that you might have to redo it each time you boot

Basic instructions: go to the extension blocks, click on audio formats, add LPCM 2.0 (if missing), then OK your way out of CRU and run restart64.exe.

That would add PCM 2.0 (stereo) to Windows.
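Under the hood, that "LPCM 2.0" entry becomes one more Short Audio Descriptor in the EDID override that CRU writes. A sketch of the bit layout per CEA-861 (the default rate/depth bits here are assumptions; CRU lets you pick others):

```python
def lpcm_stereo_sad(rate_bits=0b0000111, depth_bits=0b001):
    # Build the 3-byte SAD for stereo LPCM:
    # byte 0: format code 1 (LPCM) in bits 6:3, (channels - 1) in bits 2:0
    # byte 1: sample-rate mask (bits 0-2 = 32/44.1/48 kHz by default here)
    # byte 2: for LPCM, bit-depth mask (bit 0 = 16-bit)
    return bytes([(1 << 3) | (2 - 1), rate_bits, depth_bits])

print(lpcm_stereo_sad().hex())  # -> 090701
```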
 
Cheers mate for the help.

Got it sorted....and it was an easy fix!!

Deleted the HD audio driver and got Windows to find a new one... voila!!

I never trust windows to find anything but sorted this time.
 
