Q95T (Q90T) signal lost when monitor is connected to PC

nu1mlock (Novice Member, Sweden)
Hi,

I have an issue with connecting my PC to my TV as a secondary display to play games on my couch.

The TV works fine and confirms 4K 120Hz HDR VRR. However, as soon as I connect my monitor with a DP cable (any DP port), my PC loses the signal to the TV. The TV goes black and Windows plays the "cable disconnected" sound (or rather, keeps playing "connected"/"disconnected" over and over). The TV is no longer visible in either Windows or the Nvidia Control Panel.

As soon as I disconnect the monitor, the TV re-establishes the connection and works again with 4K 120Hz HDR and VRR.

I have tried a couple of things, like restarting my PC with/without the TV connected and with/without the monitor connected, in every combination.

I have also tried lowering the bandwidth requirements, for example 1080p 120Hz without HDR. I have also lowered the resolution of my monitor to 800x600 and tried everything again. No difference, so it doesn't seem to be a bandwidth issue?

I have now properly (?) tried all HDMI inputs on my TV and also tried enabling "Input Signal Plus" (which is required for HDR etc. on Samsung TVs) on them all. These are the results:

Highest possible resolution per input:
HDMI 1: 4K 60Hz, RGB, 8-bit, HDR. Works. (10-bit only with 4:2:0)
HDMI 2: 4K 60Hz, RGB, 8-bit, HDR. Works. (10-bit only with 4:2:0)
HDMI 3: 4K 60Hz, RGB, 8-bit, HDR. Works. (10-bit only with 4:2:0)
HDMI 4 (monitor connected to PC): 4K 60Hz, RGB, 8-bit, no HDR.
HDMI 4 (monitor disconnected from PC): 4K 120Hz, RGB, 10-bit, HDR.

Any suggestions are greatly appreciated. Thanks!

GPU: RTX 3080
Monitor: Acer Predator Z35P
TV: Samsung Q95T (same as Q90T with OneConnect box)
Cable: 65ft (20m) amazon.com/B0923WRVY7/

Edit: It seems that it doesn't matter what resolution and frequency I use for HDMI 4 - as soon as I enable "Input Signal Plus" the TV goes black if my monitor is connected. If my monitor is not connected, it works as intended.

HDMI 1, 2 and 3 work with "Input Signal Plus" enabled and the monitor connected. However, those ports only support 4K 60Hz, which is not enough, and only 4:2:0 if I want HDR. Those ports simply aren't meant for connecting a PC.
 
You need to configure the two monitors if they are used at the same time.

Have you tried exactly the same resolution on both displays at the same time, e.g. 1440p or 1080p?
Their native resolutions are different, and that could be one of the reasons you get the problem: the Acer is 3440x1440 and the Samsung is 3840x2160.

They would both do 2560x1440 and 1080p, so try it that way with the same resolution and see.

In extended mode it should not be a problem to use both at their highest resolutions, but it still needs to be configured.

You can do this from the Windows display settings or the Nvidia panel ("configure Surround").
Just experiment until you get it right.

You must also decide/configure which one you want as the primary monitor, as that could be one of the reasons why you lose the signal.

Excuse my bad English; I hope I understood your problem, but please let us know how you get on with it.
 
Thank you for your reply.

It does not matter which resolution I have on the TV (or the monitor). If "Input Signal Plus" is enabled on the TV (it has to be in order to use any HDMI 2.1 features), the signal is lost when I connect my monitor.

But, just in case, I did what you asked anyway. I set both the TV and the monitor to 1920x1080 and 60Hz. As soon as I enable "Input Signal Plus", the TV signal is lost. Or, if "Input Signal Plus" is already enabled, the TV signal is lost as soon as I connect the monitor.

Edit: To be clear - I cannot configure anything regarding the TV since the signal is fully lost. It is as if it is not connected at all.

When I disconnect my monitor though, the TV works as intended. They simply cannot be connected at the same time.

I have tried another cable from another manufacturer, also 20m and also fiber-optic HDMI. With that one I could use both my monitor and the TV at the same time. However, that cable had other issues and had to be returned.
 
One thing I must ask, even though you mentioned it: look at my picture. What I did is show only my main monitor (the Samsung is now switched off but still connected to the GPU), and as you can see the Nvidia panel still shows both, so it still detects both.

Can you confirm whether this is also the case for you, or do you actually see only one display after you lose the connection?
 

Attachments

  • Capture2.PNG
    Capture2.PNG
    600.8 KB · Views: 142
Hi,
The signal is lost. The TV is instantly removed from everywhere, as if you had physically unplugged the cable.
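If it helps to double-check that, here is a minimal sketch (my assumption: Windows plus Python 3 with ctypes, not something from this thread) that lists every display adapter and monitor Windows itself still enumerates via the Win32 EnumDisplayDevices call; when the TV drops out the way described above, it should simply not appear in the output.

import ctypes
from ctypes import wintypes

# DISPLAY_DEVICEW structure as defined by the Win32 API.
class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ACTIVE = 0x1  # the device is part of the current desktop
user32 = ctypes.windll.user32

def list_displays():
    adapter = DISPLAY_DEVICEW()
    adapter.cb = ctypes.sizeof(adapter)
    i = 0
    while user32.EnumDisplayDevicesW(None, i, ctypes.byref(adapter), 0):
        monitor = DISPLAY_DEVICEW()
        monitor.cb = ctypes.sizeof(monitor)
        j = 0
        while user32.EnumDisplayDevicesW(adapter.DeviceName, j, ctypes.byref(monitor), 0):
            active = bool(monitor.StateFlags & DISPLAY_DEVICE_ACTIVE)
            print(f"{adapter.DeviceName}: {monitor.DeviceString} (active={active})")
            j += 1
        i += 1

if __name__ == "__main__":
    list_displays()  # a fully "removed" TV will not show up here at all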
 
I am using a GTX 1070 Ti and I made my test with a Samsung Q9FN. I also have access to a Q90T somewhere else that I could test with.
I hope someone else can give us more clues about your problem.
That said, is there a way you can try a different GPU? Just anything for testing.

Also, I just noticed you are using a 20m HDMI 2.1 fibre cable. Could that be the problem when used at the same time as the monitor? It can be OK by itself, but the GPU might simply refuse to deal with two displays at the same time because of a cable that is unsuitable for the purpose, and cut it off completely as a safety measure.

Have you tried a normal cable? Just any HDMI 2.0 cable, provided you reduce the resolutions.
Just to test that everything is working as it should.
 
Thanks for your reply!

The cable works as long as I don't enable "Input Signal Plus" on HDMI 4; then I can have the monitor connected. However, without "Input Signal Plus" I cannot enable HDR, high refresh rates etc., as it is required for those features.

Basically, if I use the cable as a "2.0-cable" it works, but not otherwise.

HDMI 1, 2 and 3 can use the same cable with "Input Signal Plus" enabled and the monitor connected, but those ports do not support any HDMI 2.1 features.

When "Input Signal Plus" in enabled on HDMI 4 it will use FRL (Fixed Rate Link), so it will differ from the other HDMI ports that use TMDS. It is another kind of signal.

Unfortunately I do not have another RTX 3080 to test with. An older GPU will not reproduce these issues, as the HDMI 2.1 features would not be available, and using another device like an Nvidia Shield will not reproduce them either. I would need another HDMI 2.1 device.

The issue is likely the cable, even though it seems like it could be anything.
 
I think you should try a cheap 2m certified HDMI 2.1 cable and see what the results are. A possibility worth trying first of all.
I tested my Q90T, but with a DP 1.4 to HDMI 2.1 active adaptor. The best I can get from my GTX 1070 Ti is 4K 120Hz 4:2:2 10 bpc, and it works OK with both (monitor and TV); it does not switch off or lose the signal on the HDMI 2.1 (HDMI 4) connection.
Not sure if it makes a difference, but I think my GPU works uncompressed.
Yours is a DSC 1.2 compressed DP connection, and I'm not sure whether that could actually be the reason for the problem.
I can only get HDR if one of the two displays is used in extended mode only.
 
I have ordered another cable, so we'll see if that works better. An RMA has already been accepted for my GPU in case the new cable doesn't work either.

Edit: Unfortunately, the next cable didn't work either. I will purchase a short cable and a new DP cable in case that is what's interfering. At this point it is more likely the GPU.
 
I finally found a fix for my issues. All cables I tried did the same thing, even a 1m Ultra High Speed certified cable.

What I had to do was go into the Nvidia Control Panel > "Set up multiple monitors". In that menu, my TV kept appearing and disappearing, and Windows kept making that "disconnected/connected" sound over and over.

This time, however, I was super quick and was able to check the box for my TV and then suddenly everything was fine. No more disconnects and 4K 120Hz 10bit HDR VRR was working on all cables. It's been a couple of days now and I've rebooted both TV and PC several times and all cables work fine now.

Very strange. Perhaps that setting somehow adds another voltage or something from the GPU. Previously I couldn't have TV and monitor connected at the same time, but now that also works.
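For anyone who hits the same race against the connect/disconnect loop, a hedged alternative (my assumption here: Windows plus Python 3 with ctypes, not the exact steps I took) is to force the extended desktop topology from a script via the Win32 SetDisplayConfig call instead of trying to click the checkbox while the display is flapping:

import ctypes

SDC_TOPOLOGY_EXTEND = 0x00000004  # extend the desktop across all connected displays
SDC_APPLY           = 0x00000080  # apply the requested topology immediately

# Equivalent to picking "Extend" in Win+P; 0 (ERROR_SUCCESS) means it was applied.
result = ctypes.windll.user32.SetDisplayConfig(0, None, 0, None,
                                               SDC_APPLY | SDC_TOPOLOGY_EXTEND)
print("SetDisplayConfig returned", result)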
 
That is great, but I'm not sure why in your situation you had to do the setup both in Windows and in the Nvidia CP. Possibly down to the different behaviour of the GPU in your case.
Anyway, the important thing is that you have got it working now.
 
If I recall correctly, Nvidia lists separate resolutions for UHD (i.e. TV modes) and PC, as the timings are different.

I would imagine it's related to the fact that 4K/120 HDR on a TV is not the same as 4K/120 HDR on a PC, so the card tries to enable the PC mode when you hook up the DP cable, as by default it is probably trying to duplicate rather than extend the desktop. The TV doesn't support the PC mode and the disconnection occurs.

Once the extend option is ticked it outputs to each display individually. Strange that you couldn't do this from Windows settings though, as it should be possible to have the PC going through the DP monitor, have the TV connected (but on standby) so the PC acknowledges its presence, and then tick the extend box.

At least you found a solution though :)
 
It is all very strange to me. At first it only worked if "Input Signal Plus" or Game Mode was disabled on the TV (though without HDR and 10bit). I had the TV set to PC resolution 4K (3840x2160) and it was also extending the display. But as soon as I enabled "Input Signal Plus" or Game Mode (which enables "ISP") I would lose signal to the TV and Windows started making that "disconnected/connected" sound.

I was going crazy and had spent about €300 on different cables. I'm glad it's fixed now, but it shouldn't be this hard to get it working.
 
I am now having new issues, but I am not sure if it is the cable or my GPU.

As long as the HDMI cable is connected to my PC, my PC monitor can "flicker" and "reset" maybe once every 30 minutes or so. Sometimes it's more often, sometimes a bit less often.

If I am playing a game, the monitor will go black and then the game freezes on-screen. Sound keeps playing, and I have to Alt-Tab to unfreeze the game. It happens in every game as long as the HDMI cable is connected, even if my TV is turned off. It also happens when the TV is on, no matter which TV input is selected.

If I remove the HDMI cable everything goes back to normal.

What's more strange is that it never happens on my TV - only on my monitor. I don't know how this can be! So if I'm playing a game on the TV it never happens. I don't understand anything right now.
 
