No Signal After GFX Card Upgrade

andy06shake

Hi, I hope someone can shed some light on my problem.

I recently upgraded my GPU from a Palit GTX 1070 GameRock to the RTX 2070 GameRock variant.

I did a clean driver install for the new card, after which my LG TV, which I use as a second monitor, started displaying "No Signal Detected".

This is strange because the system boots and displays the BIOS screen on the TV (down to HDMI being the default, I assume), plus my Windows login screen; after that the picture moves to my primary monitor and the TV says "No Signal Detected".

Windows 10 can see the device under Display Settings, and the drivers for the GPU and both the monitor and TV are all up to date. I can even move windows from one device to the other, but I just can't see the display on the TV.

I have tried using other HDMI ports on the TV, resetting the TV, and swapping the cable out for another, still with the same result.

Obviously the connection works, given the TV displays the BIOS and logon screens, but for some reason it decides to do something weird afterward.

Anyone have any idea as to the solution?
 
This may sound like a silly suggestion, but is the HDMI cable firmly pushed into the GPU's socket? Sometimes with a new card, the plate that supports the output sockets doesn't quite fit against the PC case in the same way that the previous card did, and this can affect how well the cables plug into the card. For example, with mine I'm stuck with only being able to use one DisplayPort socket, because the case gets in the way of the others no matter how I adjust where the card sits.
 

At this point, mate, any suggestions are welcome.

The BIOS and Windows logon screens display on the TV, so I'm thinking it's not a socket fault on either end.
 
Apologies - you'd mentioned that in your first post. What mode is your multiple display system in? For example, when you press Windows Key + P, what is it set to?
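If it helps, you can also force the projection mode from outside the flyout, just to rule out it being stuck. Here's a minimal Python sketch, assuming the DisplaySwitch.exe that ships with Windows 10 (it lives in System32 and takes the same four modes as the Win + P menu):

```python
import subprocess

# DisplaySwitch.exe is a Windows built-in that accepts the same four
# modes as the Win+P flyout: /internal, /clone, /extend, /external.
# Forcing /extend rules out the projection mode being stuck on
# something like "PC screen only".
subprocess.run(["DisplaySwitch.exe", "/extend"], check=True)
```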
 
That's the thing: the display is extended, and I can use the Windows key shortcuts to send the screen over to the TV, with Windows plainly detecting and seeing the device. Yet the TV still displays "No Signal Detected" after I log on, even though the system posted the BIOS, the Windows loading logo, and the logon screen before switching over to the other monitor. Stumped I am. LoL

Edit: I seem to have resolved the issue somehow by swapping to another HDMI port on the TV whilst the PC was still on. I'm thinking it was an LG issue as opposed to a Windows-based problem. Thanks for taking the time to try and help all the same.
 
I wonder if the 2070 is using a refresh rate unsupported by the TV. Could you post a screenshot of the Nvidia Control Panel - specifically the 'Change Resolution' section?
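In case grabbing a screenshot is a hassle, here's a rough Python sketch that asks Windows directly which mode each display output is running. It's just ctypes over the Win32 EnumDisplayDevicesW / EnumDisplaySettingsW calls, so treat it as a quick diagnostic rather than anything definitive:

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

class DISPLAY_DEVICEW(ctypes.Structure):
    # Matches the Win32 DISPLAY_DEVICEW struct.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Matches the Win32 DEVMODEW struct, with the printer/display union
    # flattened into the display-device fields.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
i = 0
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    mode = DEVMODEW()
    mode.dmSize = ctypes.sizeof(mode)
    # Inactive outputs return FALSE here, so only live displays print.
    if user32.EnumDisplaySettingsW(dev.DeviceName, ENUM_CURRENT_SETTINGS,
                                   ctypes.byref(mode)):
        print(f"{dev.DeviceName} ({dev.DeviceString}): "
              f"{mode.dmPelsWidth}x{mode.dmPelsHeight} "
              f"@ {mode.dmDisplayFrequency} Hz")
    i += 1
```

If the TV's output reports a mode the panel doesn't support (e.g. above 60 Hz at 4K), that would point at the driver rather than the cable.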
 
Here is the screenshot requested.

As you can see, it's set to 60 Hz, which is the limit the TV supports. It was set to 60 Hz before I changed the HDMI port on the TV, and it's not the port that has the problem, as it displays the PS4 signal just fine. Like I said, the problem seems to have resolved itself, but I'd like to know what was wrong all the same.

[Attachment: TV Nvidia specs.jpg - screenshot of the Nvidia Control Panel 'Change Resolution' section]
 

Good to know it's fixed, and it's another bit of useful knowledge to store away for the next time somebody has this problem! What specific model of LG TV is it?
 
It's an LG 2017 model (55UJ651V-ZA) 4K TV.

It worked flawlessly as a second 4K/1440p 60 Hz display until I paired it with the RTX 2070.

Anyhoo, problem solved. Thanks for taking the time to assist me, and a happy new year to you also.
 