Hi,
I've been having a problem with the display over remote connection.
In the configuration I'm using, I have two PCs connected over LAN but only one screen, so to see the video output of the 2nd PC, I just use a remote desktop connection from the 1st PC.
I've noticed that when starting the RDP session, I get an error saying "The video card is not updated", although this error does not appear when a monitor is connected directly to that PC.
In addition, the video output over RDP performs worse, and graphics seem to render at a slower refresh rate. I didn't have this kind of problem with my previous graphics card, which was older.
I'm thinking it has to do with how the card initializes at boot. If no screen is connected, perhaps it falls back to a basic VGA configuration?
Any ideas?