which is your favorite company for video cards?
Well, 1. This probably isn't the thread for this; try Audio and Video.
2. I bet it's been asked many many times.
3. This is begging for a flamewar. Something like putting an extremist Muslim and an extremist Southern Baptist in the same room. Not a funny kind of war.
And 4, for what it's worth, I like NVidia.
Since I am not a gamer, I prefer Matrox (currently running a dual-head G550).
I was die-hard Nvidia until they made crap of the FX series, so I bought ATi, which I really like, but I am having problems now with my fairly new 9800 Pro. I hear Nvidia is making a good line of 6800s, so my next one will most likely be NVIDIA if they are making good stuff this time around. From what I hear, ATi is still good but very slightly lacks certain things. So I like 'em both, basically no preference.
Whichever company's next-gen cards become available at an affordable price first.
To surprise you all...
But after they #"¤%"#¤%"#¤%"#¤&%/(/&)/()=)%&§ away (with some of my money), I'm partial to ATI.
(I like nvidia's cards, but not some of their tactics with PR (especially since that contributed to 3dfx' demise.......))
Deer Dance14... Everyone will have their opinion, and mostly it will be shaped by previous experiences with cards. I have used both nVidia and ATI and think they are both great card manufacturers. They both have great products. I am partial to nVidia since they sponsor more LANs in my area. I like ATI's price vs. framerate... It's all a toss-up; it just depends on which one has the best deal at the time.
$oulo... Are you running a stock cooler on that card?
Godataloss... nVidia is coming out with the 6600 series for about $200. Not sure if that is the price range you are looking at, but that will be the next best thing to the 6800 series. IMO
Yeah buddy, it is stock cooling. I was contemplating getting the ThermalTake GIANT III, but I don't know if it will cool the mem as well?
The Giant III is the shizzle
In the pics, you'll first note that it comes with RAM sinks.
Second, it has holes in the main fan area to provide ventilation into the RAM area.
Third, it has the turbine-style fan on top (looks kinda funny to me...)
Personally, I would make sure that the RAM sinks are in contact with the larger heatsink, but you also want to verify that they don't leave a gap at the GPU.
I know that most 9800 Pro stock cooling wasn't enough, so I would highly recommend getting an aftermarket cooler.
You should post some pics when you finally decide on something.
JESUS! Yeah that GIANT really is the SHIZZLE!!
Nvidia, definitely. I have an X800 Pro and that thing is a piece of crap. I can barely squeeze out 40fps in Far Cry (I've seen it dip as low as 9 before). That's at 1024x768, 2xAA, 4xAF, high settings. Doom 3 is at least playable on high settings with major overclocking. Meanwhile the 6800GT, which I could have gotten for the same price, can play Far Cry and Doom 3 and never dip below 60fps.
Whichever offers the best performance/technology for the money. I've had both Nvidia and ATI; my last two cards have been ATI because they have provided the best product, in my opinion.
Never had any problems with Far Cry. Doom 3 runs fine, especially with the 4.9 beta Catalyst drivers. Considering Doom 3 has a specific render path for Nvidia cards, it is hardly surprising it runs that game better. HL2 should be a good example of how well the X800 series can perform.
If Half-Life 2 ever comes out...
HAHAHA :grinthumb That's what I am saying. It better be really good for all the drama and wait it has caused, or I am gonna be PISSED off
You won't have to wait much longer at all, 'cause I think it's released on the 3rd of September. The screens are immense and give you a great idea of the graphics.
The Counter-Strike: Source BETA is wicked 'cause it uses the HL2 engine, and oh, it's good.
I read something a while back about JC either taking the special rendering path out (due to the great shader performance of the 6800 series) or only enabling it for the GeForce FX series. I'm pretty sure the 6800 series and the X800 series are both using the same rendering path.