Help - DVI outputs a blank screen after POST, but VGA doesn't...

By aziatic · 6 replies
Aug 30, 2005
  1. If I use my DVI output, my screen goes blank after POST, and it seems Windows never loads at that point. With my VGA output, the screen doesn't go blank and Windows loads fine. I've installed the latest drivers and it still occurs. I also used the same graphics card in my last build with a DFI LANParty motherboard, and DVI worked fine there. Can someone lend me some assistance? Thanks.

    Current Config
    Nvidia 6800 OC 128MB AGP
    ASUS K8N-E Deluxe NF3 Chipset
    AMD 64 3400
    2X512 OCZ
  2. aziatic

    aziatic TS Rookie Topic Starter

    Also... if I plug a DVI-to-VGA adapter into the DVI output on my graphics card and use the VGA cord, everything works fine?!! Any ideas?? :knock:
  3. mailpup

    mailpup TS Special Forces Posts: 7,182   +469

    Perhaps it is a defective DVI cable. Can you try another? Also, is the cable compatible, that is, no DVI-D versus DVI-A issues?
  4. aziatic

    aziatic TS Rookie Topic Starter

    Thanks for the reply... Yeah, trying another DVI cable was going to be my next test; I just don't have another one atm. No compatibility issues. The same DVI output was working fine on my last build. Going to try a new cable tomorrow.
  5. RayJon

    RayJon TS Rookie Posts: 30

    Yeah, off topic, but AZ is top 4 rappers of all time, nice name. (lol if your name isn't off the title of his CD, just throw rocks at me.)
  6. aziatic

    aziatic TS Rookie Topic Starter

    Yeah AZ is my fav rapper. Most people don't recognize that. You're one of the first.
  7. Varr

    Varr TS Rookie

    I have an ASUS A7N8X Deluxe motherboard, an AMD Athlon XP 2600+ processor, an ATI Radeon 9500 Pro video card, and a Samsung SyncMaster 215TW flat panel display. After dealing with support from ASUS, ATI, and Samsung, all blaming each other for the problem, I solved it myself.

    Problem: After the display had been working fine, I started getting a blank screen at boot-up. I tried everything the support sites suggested: disable USB support on the motherboard, power on the display before the computer, update all drivers. I was only able to get the display detected by using the DVI-I to VGA adapter. Only once in a blue moon would the display be detected over the DVI-D cable.

    After resetting my BIOS 30 times, reseating the video card 10 times, testing all the ports on my display and video card, and spending hours reading the support forums, I figured it out. When my display was detected, the EDID reported a display frequency of 75Hz, though Samsung specifies 60Hz at 1680x1050. My motherboard's default BIOS setting is Auto and does not offer 60Hz. Samsung support told me to lower the refresh rate in Windows once the display was detected; that was wrong, and the port still would not detect the monitor. If I set the BIOS to 50Hz for the AGP bus, the computer would not boot. The next setting, 66Hz, also would not boot. At 75Hz the computer booted, but the display was not detected. Then I found some information on a support forum about DVI problems with DVI-D cables and voltage levels. My BIOS let me step the AGP slot voltage through 1.5V, 1.6V, and 1.7V. The default 1.5V gave a blank screen, as did 1.6V; at 1.7V, bingo, the display worked and was detected from boot-up.
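    The EDID mismatch described above can be checked directly: a monitor's preferred timing lives in the first 18-byte detailed timing descriptor of the EDID block (starting at byte offset 54), and the actual refresh rate follows from the pixel clock divided by the total (active plus blanking) horizontal and vertical dimensions. Here is a minimal sketch in Python; the descriptor bytes are illustrative values for a 1680x1050 reduced-blanking mode, not a dump from a real SyncMaster 215TW.

    ```python
    # Decode a refresh rate from an EDID detailed timing descriptor (DTD).
    # In a 128-byte base EDID block, the first DTD starts at offset 54.

    def refresh_from_dtd(dtd: bytes) -> float:
        """Return the refresh rate in Hz from an 18-byte detailed timing descriptor."""
        # Bytes 0-1: pixel clock, little-endian, in units of 10 kHz.
        pixel_clock_hz = int.from_bytes(dtd[0:2], "little") * 10_000
        # Bytes 2-3 hold the low 8 bits of horizontal active/blanking;
        # byte 4 packs the high 4 bits of each.
        h_active = dtd[2] | ((dtd[4] >> 4) << 8)
        h_blank = dtd[3] | ((dtd[4] & 0x0F) << 8)
        # Bytes 5-6 and byte 7 do the same for the vertical dimensions.
        v_active = dtd[5] | ((dtd[7] >> 4) << 8)
        v_blank = dtd[6] | ((dtd[7] & 0x0F) << 8)
        h_total = h_active + h_blank
        v_total = v_active + v_blank
        return pixel_clock_hz / (h_total * v_total)

    # Illustrative descriptor: 119.00 MHz pixel clock, 1680x1050 active,
    # 1840x1080 total -> roughly 60 Hz.
    dtd = bytes([0x7C, 0x2E, 0x90, 0xA0, 0x60, 0x1A, 0x1E, 0x40] + [0x00] * 10)
    print(round(refresh_from_dtd(dtd), 2))  # ~59.88 Hz
    ```

    A descriptor that decodes to 75Hz here, against a panel specified for 60Hz at its native resolution, would match the kind of EDID mismatch reported in this thread.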

    Solution: Use the analog VGA connection to get into the BIOS, with DVI connected to the display at the same time. Set the AGP frequency to 75Hz (or whatever the detected EDID reports) and the voltage to 1.7V (or start at the default and step up to the next option). Once in Windows, switch the source to digital and make it the default display. Then disable VGA if you are not going to use it. Reboot if needed, and the display should receive a digital signal from boot-up into Windows. I also have the ATI control panel set to reduce the DVI frequency for high-resolution displays.