I have a Dell Dimension 2400 and recently wanted to upgrade my graphics card. Since the motherboard has no PCI-e slots I had to get a PCI card, so I went for the FX 5500. Dell shipped the machine with integrated Intel 82845G graphics, and I tried everything to get rid of it: I uninstalled it, disabled it, and looked through the BIOS, but there is no option to turn it off, only an option to make it the only card used. Every time I turned the computer on with both cards present, it would boot but I would get a black screen.

Finally I gave up and just plugged in the Nvidia card while the onboard graphics were still enabled. Success! But although the computer found the card and recognised it as present and operational, I still only got a picture when running the monitor from the onboard card. Even when I plugged a second monitor into the Nvidia card and activated the dual-screen setup, only the onboard card produced an image. It did let me access the properties for the Nvidia card, though, and while it was connected in the dual-screen setup I tried adjusting the refresh rates to see if an image would appear. Sadly that froze the computer, and now it will only boot if the Nvidia card is not attached.

What can I do to get the Nvidia card to work? While both cards were installed in the dual-screen setup, the system reported no conflicts with either of them. Is it the onboard graphics, the Nvidia card, or my PC that needs fixing?