I have a GeForce FX 5600 256MB beast (well, at least it was back in the day) running Windows XP Home on a 1.4GHz Dell Dimension 8100 (several years old) with VGA and DVI output. I just got the excellent 19-inch Microtek 997M monster. I mean, I had a 15-inch box up until now. This is just the sweetest thing. Except that the DVI-D output does NOT work.
With only the DVI hooked up, the screen is blank, both while booting and after XP has finished loading. I can hook up both ports to my monitor (I really hope that doesn't cause interference or anything) and activate the dual-monitor thingy in "clone" mode, so both outputs should be producing the same image. But the second I unhook the analog cable, my monitor goes blank. It doesn't matter what resolution it's running before it flakes out; there are no colors, no lines, just darkness. I have updated to the newest NVIDIA drivers and even updated the motherboard BIOS, which, let me tell you, was scary as hell since it warned that if there was a power outage during the update my PC would be junk. Can you believe that? And my girlfriend said computer repair wasn't exciting.
So my card works just fine, but only with the analog output. What do you think NVIDIA said to me? Look at my nickname; that'll give you a hint. Basically they blame it on the monitor, but the thing is brand new, as is the cable it came with (DVI-D). The NVIDIA card detects that there is a digital display hooked up, otherwise it would not give me the option of "Dualview" as it calls it, so it seems the video card is sensing it OK. I even formatted my hard drive to make sure it wasn't something I had installed. I borrowed my friend's CRT and hooked up my DVI LCD and his monitor to my video card: the computer detected the Microtek monitor, but it was always blank and didn't even flicker when I added it alongside the CRT. My new monitor's user guide says that it automatically detects between DVI and analog, and there is no way to manually select between the two. I am going nuts. Any help would be GREATLY appreciated. I hear DVI has got to be seen to be understood...the raw pixel power....ahhhhh... thanks :wave:
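In case it helps anyone narrow this down, here's a minimal diagnostic sketch (an assumption on my part: Python with ctypes on a Windows box, calling the standard Win32 EnumDisplayDevices API) that dumps every display adapter the driver enumerates and whether it's attached to the desktop, so you can check if the DVI head even shows up:

```python
# Minimal sketch, assuming Python + ctypes on Windows.
# Lists the display devices the driver reports via Win32 EnumDisplayDevices,
# so you can see whether the card's DVI head is enumerated at all.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    # Layout matches the Win32 DISPLAY_DEVICEW structure.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

def list_display_devices():
    user32 = ctypes.windll.user32
    i = 0
    while True:
        dev = DISPLAY_DEVICE()
        dev.cb = ctypes.sizeof(dev)
        # Enumerate adapter i; the call returns 0 when no more devices exist.
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        print("%s | %s | attached=%s primary=%s"
              % (dev.DeviceName, dev.DeviceString, attached, primary))
        i += 1

if __name__ == "__main__":
    list_display_devices()
```

If the second head is listed but never reports attached once the analog cable is pulled, that would point at the card/driver side rather than the monitor.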