Monitor loss-of-signal problem


alexgoh

Dear all, I've recently been stuck with a big problem with my Samsung T220 monitor. Whenever I run games at a refresh rate of 75Hz, the monitor loses the DVI signal and starts searching for alternate sources. The problem occurs even when I use the analog input. This is extremely irritating, as I play Counter-Strike and want that 75Hz. I have swapped graphics cards from an nVidia 8800GTS 320MB to an ATI 4870 512MB, and the problem persists. I even swapped the RAM, to no avail. The monitor is only problem-free at 60Hz. I have no idea what is wrong, and I hope someone can enlighten me. Thanks for any advice. My computer specs are as follows:
710W PSU
ASUS Striker Extreme
Core 2 Quad Q6600 @ 2.4GHz
ATI Radeon 4870 512MB
2GB Kingston HyperX DDR2-800
 
The monitor most likely only supports 60Hz at its native resolution; that's common for cheaper LCDs. It might sync at 75Hz on occasion, but it isn't specified for it, so it's unlikely to run stably there. While I can see a slight difference between 60Hz and 75Hz, it doesn't make much difference when gaming on an LCD, since LCD pixels hold their state between refreshes and don't flicker the way a CRT does. The main reason to push past 60Hz was always CRT flicker being hard on the eyes. Try playing Counter-Strike at 60Hz; I'm sure it won't be that bad. It may take a little getting used to, but there's nothing for it. I don't think that monitor supports anything beyond 60Hz.
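If you want to confirm what the driver actually advertises for that panel, you can enumerate the display modes Windows reports. Here's a rough Python sketch (untested on my end) using ctypes and the stock EnumDisplaySettingsW API; the DEVMODE structure is trimmed to just the fields it reads, so treat it as a starting point rather than a finished tool:

```python
import ctypes
from ctypes import wintypes

# Trimmed DEVMODEW: only the leading fields up to dmDisplayFrequency,
# laid out to match wingdi.h so the field offsets line up.
class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmUnion", ctypes.c_byte * 16),   # position/printer union, unused here
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32

dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)
modes = set()
i = 0
# iModeNum = 0, 1, 2, ... walks the driver's mode table for the primary display
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print("%dx%d @ %d Hz" % (w, h, hz))
```

If 1680x1050 never shows up paired with 75 Hz in that list, the 75Hz mode you're forcing is outside the monitor's supported range, and the signal drop is exactly what you'd expect.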
 