Graphics card not working

Status
Not open for further replies.

AbZ

Posts: 10   +0
Hi. I recently bought an HD 4550 for my Dell C521, but when I plug in the card, the monitor doesn't receive any signal. I had an HD 3450 before which worked for about 6 months and then stopped working suddenly. I took it to my local tech shop and they thought my graphics card had broken, as they successfully installed an Nvidia 6200 LE (which blows). My original X1300 won't work either - the only card that works is the Nvidia one. It could be that my new card is faulty, but I feel that's unlikely. The tech shop told me they didn't know why my ATI cards wouldn't work, and they said it couldn't be a driver issue.

How should I proceed? Thanks.
 
Hi AbZ,
Well, first of all, they didn't install an Nvidia 6200 LE - the 6200 LE is integrated graphics. Given the history you described, it sounds like you may have a faulty or defunct PCIe/AGP slot, as the case may be. Can you try these cards in another machine to make sure it's not the cards? (Unlikely as it is that all of them are bad, I know.)
 
Hm, thanks. Unfortunately I don't really have another computer here I can test the cards on, but I'll go to my local tech store and ask them to try them. Should it be the case that my PCIe slot has gone bad, how much would it cost to repair it / buy the needed parts? Also, surely if it were my PCIe slot, then any graphics card inserted into that slot would not work, right? Even an integrated graphics card like the 6200 LE.
 
Well, you have to remember that the 6200 LE is not a card - it's actually integrated into the motherboard, so it does not occupy a slot.
 
But physically it does take up a slot. In fact it looks exactly like any other regular card. So how does that work? I'm not very educated in the internals of computers, as you can probably tell, so if you could explain the difference between 'normal' cards and integrated ones a bit further, I would appreciate it. Thanks.
 

Is this a GeForce 6200 LE, as in one of these?
http://reviews.cnet.com/graphics-cards/evga-e-geforce-6200le/4505-8902_7-32955666.html
I thought you were referring to an integrated graphics chipset in the 6100/6150 family.
 
Oh OK, my bad.
So you're saying that the 6200 is the only one that works, right?
When you tried the ATI cards, did you completely uninstall the old driver set before installing the new one, or did you install over the previous drivers?
 
I've never attempted to uninstall a graphics driver before inserting a new card. Let me explain again. My computer came with an ATI Radeon X1300, which I replaced with an HD 3450. That worked for about 6 months, then suddenly my monitor received no signal while that card was installed. I then tried my old X1300 to see if it would work, but it wouldn't either. Not knowing what to do, I took my computer to the local tech shop; they claimed my HD 3450 had broken and gave me the 6200 LE there, which works (using that now). I just bought an HD 4550, which gives me the same issue as my old ATI cards.

Anyway, to answer your question: I've never uninstalled a driver before using a new card. I just installed the new graphics card with the driver CD included in the package. But is that an issue? Surely my monitor should at least recognise the card and receive a signal regardless of whether I have the latest drivers?
 
It should, yes, but it sounds like when you put the ATI card in, Windows is still loading the Nvidia drivers and not recognizing the ATI card. Your Dell C521 has integrated GeForce 6150 LE video. I would remove all traces of video drivers (use Driver Cleaner Pro or the like), then download the newest ATI drivers for your 4550.
 
I have the same problem, except I'm trying to install an EVGA GeForce 8400GS.

It used to work, but recently (two months ago) I installed Windows 7 and it gave a black screen after the Welcome screen (a common problem for a lot of Windows 7 users until recently), so I removed the 8400GS and went back to using my GeForce 6100 IGP... and it worked flawlessly.

But I bought Assassin's Creed and some other games, and they run painfully slowly on my GeForce 6100. (Don't tell me the 8400GS won't improve things much - I can play Crysis on the 8400GS, overclocked using EVGA Precision, at 1440x900 (my monitor's native resolution) at low-medium settings (physics on high) at 20-30 FPS.)

So I got out my GeForce 8400GS again and inserted it, but it doesn't work at all. I plugged the VGA cable into my 8400GS but no signal is received, yet when I connect it to the GF 6100 it still gets a signal.

I tried disabling it through the BIOS (I'm using a Lenovo 3000 J115).
The Video menu in the BIOS shows the source options: IGP, PCI, PEG, and Frame Buffer.

It used to be that the integrated graphics would disable itself as soon as I installed my dedicated GPU.

I was planning on getting a GeForce 9600GT but don't have that much money yet (I'm only 13 LOL)

Also, my graphics card has a very common problem: the VGA port doesn't work. When I try to plug in a monitor (I have a lot of monitors I've tested on it), it keeps giving no signal, so I have to use the DVI port with a DVI-to-VGA converter (because I don't have a DVI monitor), and then it works.

I think the integrated GPU isn't disabled, and I don't know where to disable it. (I tried the BIOS, and uninstalling it in Windows, though I knew that wouldn't really help - Windows is just an OS, and if the GPU worked, the BIOS would boot up outputting a signal on the video card.)

So, any help please?

Email me

justinxtreme at yahoo.com (TechSpot wants me to post 5 or more times before I can post my email, so just convert the "at") (I will reply as fast as I can - I have an iPhone and it gets e-mail really fast)
 