Here is the computer:
Asus A8R32-MVP (newest BIOS release)
CPU: Opteron 180 (Toledo core), overclocked 10%
2x ATI X1900s in CrossFire (stock BIOS and speeds)
Monitor: BenQ FP222W H 22-inch
Windows XP SP3
Catalyst 9.3
Problem:
The computer and monitor, connected DVI to DVI, had been working great for about two years, since I put the system together, with no major hardware issues at all. About a week ago, though, the computer stopped sending a signal to the monitor, just out of the blue, with no changes to the system at all.

DVI to VGA works, but the same old flicker came back (I think that cable got damaged somehow on my old system). DVI to HDMI also works, but for some strange reason the video drivers then seem to decide it's not a computer monitor but a TV (even though CCC still picks up its manufacturer and model number): the options in CCC change to DTV, and the available resolutions and refresh rates change to modes the monitor cannot support. Even at 1680x1050, the monitor's max resolution at its only supported refresh rate (60 Hz), the screen still blurs badly. And if I enable CrossFire, the only resolution available is, strangely, 1280x720, which distorts everything, even still pictures.
I take it this isn't a problem with the computer itself, since the DVI output still works through VGA and HDMI; it just doesn't work correctly in either case.
I know HDMI is basically DVI with sound added, so why do the drivers act so differently (and not in a good way) with the change?
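From the reading I've done, my guess (unconfirmed on my setup) is that over HDMI the monitor presents a CEA-861 extension block in its EDID, since that block is how HDMI displays advertise their TV-style capabilities, and Catalyst treats any display that reports one as a DTV, offering CEA timings like 1280x720 instead of normal PC timings. Here is a rough sketch of how one could check for that block, assuming the raw EDID has already been dumped to a file with an EDID reader tool (the file name "edid.bin" is just a placeholder):

# Rough sketch: look for the CEA-861 extension block in a raw EDID dump.
# That block is what typically makes a driver treat a display as a DTV
# over HDMI. Assumes the EDID bytes were exported to a file beforehand;
# the file name is hypothetical.

def inspect_edid(path="edid.bin"):
    data = open(path, "rb").read()
    # Every EDID base block starts with the fixed 8-byte header below.
    if len(data) < 128 or data[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        print("Not a valid EDID base block")
        return
    ext_count = data[126]  # byte 126 holds the number of extension blocks
    print("Extension blocks:", ext_count)
    for i in range(ext_count):
        block = data[128 + 128 * i : 256 + 128 * i]
        if len(block) == 128 and block[0] == 0x02:  # tag 0x02 = CEA-861
            print("CEA-861 extension found; driver may treat display as DTV")
            # HDMI displays also carry a vendor-specific data block with
            # the IEEE OUI 00-0C-03 (stored least-significant byte first).
            # This substring test is crude but good enough for a sketch.
            if b"\x03\x0c\x00" in block:
                print("HDMI vendor block present (true HDMI sink)")

inspect_edid()

If the CEA block only shows up on the HDMI input, that would explain why CCC flips to DTV mode there while the plain DVI connection (back when it worked) stayed in PC-monitor mode.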
And has anyone else had such an out-of-the-blue DVI-to-DVI meltdown? No jostling, no moving the system around of any kind beforehand that could have knocked a solder joint loose; just "no signal" on every startup.
thanks