GeForce 6600 vs GeForce 5700 Ultra


VoodooHellfire

Well, I'm looking at upgrading again, and in the interest of finances, I'm upgrading the video card and keeping the rest of the system the same. I'm currently running an ASUS A7N8X-E Deluxe with an Athlon XP 3200+ and 1GB of PC3200 RAM. I'm looking at replacing my BFG GeForce 5700 Ultra with something a bit more modern while keeping the price low. I've found a 6600 (it may be an LE version). The GPU and memory clocks are lower and the pixels/sec isn't as high, but the 6600 supports Shader Model 3.0 and can process MPEG2 video on the card, which will help with watching DVDs and with my TV card too. Plus there's support for processing HD video. I'm just looking for people's opinions on whether this is a good upgrade or not before jumping in. Feel free to say whatever. Thanks!
 
Grab that XFX GeForce 6600GT DDR3 128MB AGP? It's a very good video card for most of today's games, especially FEAR, Doom 3, and Counter-Strike: Source at high settings (not the maximum, unless you overclock and watercool it ^^).
 
VoodooHellfire said:
The 6600 can process MPEG2 video on the card, which will help with watching DVDs and with my TV card too.
Which card doesn't?

Plus there's support for processing HD video.
There also needs to be software to take advantage of that, otherwise they're just marketing buzzwords (and what does processing mean in this context anyway?).
 
Just to clarify, it's not marketed by XFX as a GT model. The model # is PV-T43K-UD. Looking at the specs, it doesn't state it's an LE model either. Maybe it's just a standard 6600. Here are the specs:

Graphics core: 256-bit
Memory interface: 128-bit
Pixels per clock: 8
Fill rate: 2.4 billion texels/sec
Vertices/sec: 225 million
Memory: 256MB DDR

I've compared that to my BFG 5700 Ultra specs which are listed differently:

Memory: 128MB DDR2
Core clock: 475MHz
Memory clock: 900MHz (effective)
Vertices/sec: 356 million
Memory bandwidth: 14.4GB/s
Pixels per clock: 4
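
Doing the math helps line these up: fill rate is just pixels per clock times core clock, so (assuming both spec sheets are accurate) the numbers the lists leave off fall out:

6600: 2.4 billion texels/sec ÷ 8 pixels per clock ≈ 300MHz core clock
5700 Ultra: 4 pixels per clock × 475MHz = 1.9 billion texels/sec

So even at a lower clock, the 6600 comes out ahead on raw fill rate.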

So... as you can see, part of this is comparing apples to oranges. The confusing part is that my old card appears to push more vertices/sec but only half the pixels per clock. There's double the RAM on the new card too. I just want to make sure I'm getting a better performer for my money. Thanks for the comments so far.
 
Mictlantecuhtli said:
There also needs to be software to take advantage of that, otherwise they're just marketing buzzwords (and what does processing mean in this context anyway?).

Well, in two of my most recent software purchases, FEAR and TES IV: Oblivion, HDR support is present, and my X800 is left in the dust. I'm not arguing that it's just a "cool feature"; it seems to be more and more prevalent.
 
AtK SpAdE said:
Well, in two of my most recent software purchases, FEAR and TES IV: Oblivion, HDR support is present, and my X800 is left in the dust. I'm not arguing that it's just a "cool feature"; it seems to be more and more prevalent.
Yeah, but High Dynamic Range rendering isn't the same as High Definition video ;)
 
Thanks for the feedback. Going off the initial responses, I went ahead and purchased the card. I opted for the PNY VCG66256; it's the same card, spec-wise, as the XFX PV-T43K-UD. I used to own a PNY Ti 4400 and it never let me down. I guess you stick with what you know. Anyway, the link to the card comparison on Tom's Hardware was very helpful. I tried finding something like that for a few days before asking for people's opinions here. TechSpot comes through again!

About the HD processing and all: compared to the 5xxx line of cards, from what I'm reading, the 6xxx series takes some of the processing off the CPU to lighten the load, and actually cleans up the picture a little. My system has a DVD drive and an ATI Remote Wonder Pro TV card. As a side note, I have noticed quite a difference in the TV display since moving from my 21in CRT to my 19in widescreen LCD. Before, the refresh on the CRT was 85Hz, and I think that was messing with the TV display, which is only 60Hz. Now that doesn't seem to be an issue any more, but hopefully the new PNY card helps out a little more.

The biggest reason for the purchase is the Shader 3.0 support with all that comes with it...HL2 Lost Coast w/HDR...here I come!

Thanks.
 
I think you're going to be disappointed, as the PNY VCG66256 looks like a standard 6600 with 256MB of memory (they put 256MB on 6600s to sell them to people who don't understand the limitations of lots of memory on these cards), and it gets seriously whipped by any 6600GT with 128MB of memory.
 
Well, I'm going to benchmark the 5700 this weekend with all sorts of 'artificial' benchmarks and a few games. When I get the 6600, I'll run it through the same series of tests. Part of what I'm looking to get is increased image quality from the higher amount of RAM and the ability to store more textures on the card (cutting down on the image 'hitching' from loading textures in), plus the support for Shader Model 3.0. The additional support for HD video processing on the chip is a plus too.
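
For the game runs, HL2's built-in demo playback should give repeatable numbers. These are the stock Source console commands as far as I know ('mytest' is just a name I made up):

// record a demo while you play, then stop it
record mytest
stop
// replay the demo as fast as the card allows and report average FPS at the end
timedemo mytest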

The GT, in my opinion, was a little too expensive for the small bump in performance you'd get. I'm hoping to upgrade to a 64-bit system with PCI-E in a year or a little more, so this card is more of a 'bump up' than a true screamer upgrade.

I appreciate all the input, and I plan on posting the different scores and improvements I get when the card comes in, so stay posted! (Get it, stay posted... in a forum... where people post... eh, never mind.)

Thanks!
 
I've got a GeForce FX 5500. I'm new to this; I just want to know what I should set the AGP setting to. I put it at 4x, but some of my programs, like LimeWire, freeze up.
 
Just wanted to update anyone still monitoring this. I received the 6600 on Wednesday, Apr 5th, and it rocks. Much better than my 5700 Ultra. Smaller, too! I haven't run any benchmarks yet, but I did download the tech demos from Nvidia's site; not very impressive, in my opinion. Then I fired up HL2: Lost Coast, and WOW! Blown away. I'm going to run it through various benchmarks and put up the comparisons for people to see.

As a side note, is there a way within HL2 to show the framerate during gameplay? Also, are there any tech demos that really show off what Shader Model 3.0 cards are capable of? I mean, besides the ones on Nvidia's site; something not made by a video card manufacturer. Again, thanks for the feedback, and I'll be posting soon.
 
I've seen that before, but wouldn't running that app in the background tax an already somewhat taxed system playing HL2? Just wondering; I've never actually used it before.
 
I never thought Fraps would have any considerable impact on game performance, but it probably could. Anyway, I think this console command gives you FPS info:

cl_showfps 1

or for a more detailed view:

net_graph 1
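
If you want that on every time without retyping it, you can also drop the command into an autoexec.cfg; the path below assumes a default Steam install of that era, so adjust for your setup:

// ...\Steam\SteamApps\<account>\half-life 2\hl2\cfg\autoexec.cfg
cl_showfps 1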
 