DVI problems with NVIDIA card

I have a GeForce FX 5600 256 MB beast (well, at least it was back in the day) running Windows XP Home on a Dell Dimension 8100 1.4 GHz (several years old), with VGA and DVI output. I just got the excellent 19-inch Microtek 997M monster. I mean, I had a 15-inch box up until now. This is just the sweetest thing. Except that the DVI-D output does NOT work.

With only the DVI hooked up, the screen is blank, both while booting and after XP has finished loading. I can hook up both ports to my monitor (I really hope it doesn't cause interference or anything) and activate the dual-monitor thingy in "clone" mode, so both outputs should be producing the same image. But the second I unhook the analog cable, my monitor goes blank. It doesn't matter what resolution I set before it flakes out: there are no colors, no lines, just darkness. I have updated to the newest NVIDIA drivers and even updated the motherboard BIOS, which, let me tell you, was scary as hell since it said that if there was a power outage during the update my PC would be junk. Can you believe that? And my girlfriend said computer repair wasn't exciting.

So my card works just fine, but only with the analog output. What do you think NVIDIA said to me? Look at my nickname, that'll give you a hint. Basically they blame it on the monitor, but the thing is brand new, as is the cable it came with (DVI-D). The NVIDIA card detects that there is a digital display hooked up, otherwise it would not give me the option of "Dualview", as it calls it, so it seems the video card is sensing it OK. I even formatted my hard drive to make sure it wasn't something I installed. I borrowed my friend's CRT and hooked up both my DVI LCD and his monitor to my video card: the computer detected the Microtek monitor, but it was always blank and didn't even flicker when I added it alongside the CRT. My new monitor's user guide says it will automatically switch between DVI and analog and that there is no way to select between the two manually. I am going nuts. Any help would be GREATLY appreciated. I hear DVI has got to be seen to be understood... the raw pixel power... ahhhhh... thanks :wave:
 
If it makes you feel any better, I have a Chaintech GeForce FX5200 and an Iiyama ProLite E435S monitor, and it's exactly the same here. I think a new graphics card (not NVIDIA) could be our answer. Frustrating, innit?
 
It can't be that all NVIDIA cards don't support DVI... I mean, they went to all the trouble of adding a DVI port on the back, so it doesn't make sense.

Did you try replacing the DVI cable? They cost a bundle, but I was thinking about doing it even though the one I have is brand new... As for getting a new video card? Well, I wish I had the money... maybe you can transfer a little to my PayPal account? :giddy:
 
nvidiasucks, check this out: I borrowed a Radeon 9550 AGP and guess what? DVI instantly, wow. I guess I'll snap this FX5200 into little pieces. NVIDIA really does suck. Hope you get yours sorted.
 
Jesus, I will have to seriously look at getting a new card from ATI, then. But hey: does DVI make that much of a difference? Like, is it really that good, or is it all hype (for games, DVDs, DivX, etc.)?
 
I've only tried it on Soldier of Fortune and it's better, but I won't go over the top and say it's worth spending £50+ on a new card.
 
Yep, I can't count the number of forums around the net on this issue. I have a ViewSonic VX715 with an NVIDIA FX5700 256 MB DDR: the screen is blank, then eventually goes steady after a few minutes once it's warm. I tried it at another computer with his DVI cable and graphics card, also NVIDIA but an FX5200. I don't fully blame NVIDIA; ATI users have experienced just as many problems. My friend's EIZO works fine with DVI on his setup, but my ViewSonic does not.
 
AAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHHHH! I want to kill DVI, NVIDIA and ViewSonic... ERRRRRRRRRRRRRRRRRRRRRRRR!

Just bought a ViewSonic VX924 FOR THE SPECIFIC REASON OF DVI, AND IT WILL NOT WORK WITH MY 6800 GT. The screen goes blank after startup, and the analog display on this monitor totally SUCKS. God damn it! What a waste of money...
 
I don't know if this is the answer, but I noticed that the first poster in this old thread mentioned the DVI-D output on his graphics card. He might have misstated it, but I think it should have been DVI-I.

It doesn't make much difference unless you got the wrong cable, one for DVI-D instead of DVI-I. Could that be the problem?
 
figured it out

Actually, I figured it out. Well, to be honest it kind of fixed itself. After messing with it for 5 hours, I settled on using the analog. I had just gotten BF2, so I updated my NVIDIA drivers and the DVI started working... so I am happy.
 
Wish I knew how you fixed things!

Wish I knew better what you did with the driver install that fixed things. I LOST the DVI on my monitor when BF2 locked up on me. I bought a new 7800 GS card to see if that was the issue, but it does the same thing.

Below is my situation if anyone has suggestions!

***

Basically, I was just running the game using an AGP NVIDIA 6600GT card from ASUS, and the game and system locked up completely. I rebooted, and I believe that is the point at which I lost the use of DVI on my monitor.

My first thought was to see if my video card got fried, so I bought an NVIDIA 7800GS. Although this did not get DVI working, it did verify that the video card was not the issue.

Bought a new DVI cord; still not the problem...

Unplugged the monitor and let it sit to "reset" itself. It still works fine on VGA, but not DVI :(

I next updated the motherboard BIOS and downloaded the newest drivers for my new card. Nothing is helping the situation at this point...

I can say that the system will not identify my monitor, whereas before it would actually list my particular monitor brand and model number. I'm not sure if this is because the VGA connection does not carry that identification information while DVI does. The biggest issue is that, because of this, XP does not offer me the widescreen resolutions (I cheated by making a custom resolution for the desktop, but that does not give me widescreen resolutions in games that would otherwise offer those options).
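(An aside on the monitor-identification point above: Windows caches the EDID block a monitor reports over the cable's DDC lines under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY, so you can check whether the DVI link is delivering the monitor's ID data at all. Below is a minimal sketch of that check; it assumes a Windows box with Python and its standard winreg module available, which is anachronistic for an XP-era machine and purely illustrative, not something from the original thread.)

# Sketch: list which monitor entries in the registry have a cached EDID block.
# Assumes Windows and Python 3 (winreg is in the standard library).
import winreg

DISPLAY_ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    # Yield the names of all subkeys of an already-open registry key.
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_ROOT) as root:
    for monitor_id in subkeys(root):
        with winreg.OpenKey(root, monitor_id) as mon_key:
            for instance in subkeys(mon_key):
                try:
                    with winreg.OpenKey(mon_key, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                    print(f"{monitor_id}\\{instance}: EDID cached ({len(edid)} bytes)")
                except OSError:
                    print(f"{monitor_id}\\{instance}: no EDID cached")

If no entry ever appears for the monitor while it is connected over DVI, the identification data is not making it across that link, which would explain why XP stops offering the native widescreen modes.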

OK, so I've given as much info as I can, and now I'm hoping someone will tell me there's an option I've missed that will fix things :)

I realize the one part of this problem that I have not excluded is that the DVI port on my monitor may be bad. I can't find another computer with DVI to test it on.

I will really appreciate any help you guys can offer!

Brian




IHATEDVI said:
Actually, I figured it out. Well, to be honest it kind of fixed itself. After messing with it for 5 hours, I settled on using the analog. I had just gotten BF2, so I updated my NVIDIA drivers and the DVI started working... so I am happy.
 
Hey.

It depends on when you first installed your card. If you used the standard VGA connector, that output became the card's primary display and the DVI side became the secondary. If you wish to use the DVI instead of the VGA, I would recommend it due to the better response time, but not to get off topic. Connect your standard VGA cable, then open the video card's properties (right-click on the desktop, then Properties, then the display settings), go to the drop-down menu, select the secondary video adapter and set it to primary. If you then plug in your DVI and reboot your computer, it should be able to pick it up. Or uninstall your video card drivers and then reinstall them with the DVI connected; it should auto-select and boot to your monitor. I know I had that issue with my ATI 9800 XT 256; it was very picky about how it wanted to work.
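(To go with faudin's suggestion, it can help to first see which outputs Windows currently treats as attached and which one it considers primary. Below is a rough diagnostic sketch using the documented Win32 EnumDisplayDevicesW call through Python's ctypes; it assumes a Windows machine with Python available and is only an illustration added here, not something posted in the thread.)

# Sketch: list the video outputs Windows knows about and flag the primary one.
# Assumes Windows; uses only the documented Win32 EnumDisplayDevicesW call.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", ctypes.c_wchar * 32),
        ("DeviceString", ctypes.c_wchar * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", ctypes.c_wchar * 128),
        ("DeviceKey", ctypes.c_wchar * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

user32 = ctypes.windll.user32

i = 0
while True:
    dd = DISPLAY_DEVICEW()
    dd.cb = ctypes.sizeof(dd)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        break  # no more display adapters/outputs to enumerate
    attached = bool(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    primary = bool(dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
    print(f"{dd.DeviceName}: {dd.DeviceString} (attached={attached}, primary={primary})")
    i += 1

If the DVI head shows up as attached but not primary, setting it to primary in the display properties (as described above) and then rebooting with only the DVI cable connected is essentially the sequence being suggested.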
 
I'm going to give that a shot. Thanks for taking the time to give your suggestions!


