Nvidia 5900 256mb ram or 128, Which is better and why?

Status
Not open for further replies.

r1_forever

Posts: 27   +0
I am about to buy the Nvidia 5900 card, but I noticed the 5900 with only 128MB of RAM is almost $300 cheaper than the 256MB version. Can anyone help me out as to what the benefits will be if I buy the 5900 with 256MB of RAM over the same card with only 128MB? So far this seems to be a question no one can give a definitive answer to. Any help would be gratefully accepted. Andrew - Australia.
 
It's obviously got more memory. Sometimes "budget" cards have slightly slower clock speeds, and the RAM itself on the 256MB card is probably quicker.

Are both cards retail? One might be an OEM card.
 
Yes, both cards are retail versions from Winfast Leadtek. The 256MB card is called the 5900 Ultra and the 128MB is just called the 5900. Will I notice much difference during gameplay with the smaller RAM on latest-release games? The $300 saving could go on another piece of hardware if the difference is only going to be minimal.
 
Generally speaking, larger memory allows games to be played at higher resolutions. There will not be any other sort of increase in performance, provided the clock speed of the memory and core is maintained.
 
You've answered the question yourself, then.

5900 'Ultra' vs 5900 'Non-Ultra'

ULTRA :-

NVIDIA GeForce FX 5900 Ultra GPU
CineFX II Engine
Intellisample Technology HCT
High-Precision Graphics
nView Multi-display Technology
Digital Vibrance Control (DVC)
Unified Driver Architecture (UDA)
AGP 8X
0.13 Micron Process Technology
Graphics Core: 256-bit
Engine Clock: 450MHz
Memory Interface: 256-bit
Memory Bandwidth: 27.2GB/sec
Fill Rate: 3.6 billion texels/sec
Vertices/sec: 338 million
Memory Data Rate: 850MHz
Pixels per Clock (peak): 8
Textures per Pixel: 16 (maximum in a single rendering pass, with 8 textures applied per clock)
RAMDACs: 400MHz

NON-ULTRA :-

NVIDIA GeForce FX 5900 GPU
CineFX II Engine
Intellisample Technology HCT
High-Precision Graphics
nView Multi-display Technology
Digital Vibrance Control (DVC)
Unified Driver Architecture (UDA)
AGP 8X
0.13 Micron Process Technology
Graphics Core: 256-bit
Engine Clock: 400MHz
Memory Interface: 256-bit
Memory Bandwidth: 27.2GB/sec
Fill Rate: 3.2 billion texels/sec
Vertices/sec: 300 million
Memory Data Rate: 850MHz
Pixels per Clock (peak): 8
Textures per Pixel: 16 (maximum in a single rendering pass, with 8 textures applied per clock)
RAMDACs: 400MHz

The engine clock is 450MHz as opposed to 400MHz, which is a 12.5% increase.

Typically, they'll have better-quality chips on them too, I'd guess.
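If anyone wants to check those headline numbers, here's a quick sketch of the arithmetic behind the percentage and the spec-sheet figures (my own calculation, using the Ultra values quoted above):

```python
# Engine clock difference between the Ultra (450MHz) and non-Ultra (400MHz)
ultra_clock_mhz = 450
non_ultra_clock_mhz = 400
increase_pct = (ultra_clock_mhz - non_ultra_clock_mhz) / non_ultra_clock_mhz * 100
print(increase_pct)  # 12.5

# Theoretical fill rate = engine clock x pixels per clock (8 for the 5900)
fill_rate_gtexels = ultra_clock_mhz * 1e6 * 8 / 1e9
print(fill_rate_gtexels)  # 3.6 billion texels/sec

# Memory bandwidth = effective memory data rate x bus width (256-bit = 32 bytes)
bandwidth_gb_s = 850e6 * (256 / 8) / 1e9
print(bandwidth_gb_s)  # 27.2 GB/sec
```

So the spec-sheet fill rate and bandwidth fall straight out of the clocks and bus width; the extra RAM doesn't enter into any of it.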
 
The specs look attractive on the GeForce FX 5900 Ultra, but the only differences are the size of the memory and a little clock speed, and nothing you can't change yourself. At the current moment, no games are out that are going to utilize the extra memory. As for the clock, everyone can use a few more MHz here and there, and that can be changed easily thanks to the massive coolers nVIDIA cards use.

The only use for that much memory is 1600x1200 gaming with 4x AA and 8x AF enabled, which looks nice but runs at 14-30 frames a second, which doesn't help gameplay at all. The extra memory is more of a marketing ploy than anything else, because no games or applications (except DX9 benchmarks) can take advantage of it. I own a 256MB R350 PRO and there is no difference from the 128MB version, except for a whopping 10MHz increase in memory clock and that extra RAM.

So to answer your question, don't buy it just yet. An ATI RADEON 9700 PRO is the de facto standard card for gaming: it's cheaper, powerful, and has a full 8-pixel rendering architecture, where nVIDIA only uses 4 pixels in color and Z-buffer operations and 8 in everything else. Save your money and get a RADEON 9700 PRO; I believe you will be glad you did, because with the money you save, you can buy something else for your computer.
 
The difference with more RAM, especially right now when you compare a 128MB card to a 256MB card, is that the 256MB card is basically going to be able to handle higher resolutions. This is because, obviously, to push higher resolutions the program is pushing more detailed textures, graphics, effects, etc.

For the same reason that your computer needs more RAM to handle more applications running at once, a video card needs more RAM to be able to handle those higher resolutions and detail.
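To put a rough number on that, here's a back-of-the-envelope sketch of how much memory just the frame buffers eat at a given resolution and AA setting (a simplification of my own; textures aren't counted, and the buffer layout is an assumption):

```python
def framebuffer_mb(width, height, aa_samples=1, bytes_per_pixel=4):
    """Rough double-buffered color + Z/stencil estimate, in MB.
    Multisampled (AA) color and depth buffers scale with the sample count."""
    color = width * height * bytes_per_pixel * aa_samples * 2  # front + back buffer
    depth = width * height * bytes_per_pixel * aa_samples      # 24-bit Z + 8-bit stencil
    return (color + depth) / (1024 * 1024)

print(framebuffer_mb(1024, 768))                  # ~9 MB -- trivial on a 128MB card
print(framebuffer_mb(1600, 1200, aa_samples=4))   # ~88 MB before a single texture loads
```

Which is why the only place 256MB plausibly matters is exactly the 1600x1200-with-AA scenario mentioned above.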

This isn't something you have to worry about if you're running at 1024x768 with a 128MB card. In that case there is no point in spending the money on a 256MB video card (depending on your monitor size, but I would say if you're using a 17" or 19" monitor then go for the 128MB card and a resolution of about 1280x1024).

In my opinion there is no point in running such high resolutions that you need that much memory, but with new games coming out it may be possible that you could benefit from the 256MB card, to be able to turn your graphics, effects and other detail all the way up while still maintaining a decent resolution. I don't think this will be the case until at least Half-Life 2 comes out, or maybe Doom 3, but AGAIN, unless you are running at a high resolution you still shouldn't need the 256MB card, but rather a really good 128MB card instead.

Now, if you are running a 21" monitor or another pretty large monitor, then you obviously will want to turn the resolution up to a decent size; in this case you will most definitely benefit from a 256MB card. So take your monitor size into account.

This is the basic rule I go by:

lower than 19" - 128MB video card (@ 1024x768 or 1280x1024)

19" - this is where you kind of have to decide, depending on what resolutions your 19" will support and what resolution you WANT to run at while gaming. I would go for 1280x1024, but you may want to go higher. If you choose to run at a resolution higher than 1280x1024, then you may well benefit from the 256MB card.

higher than 19" - 256MB card
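That rule of thumb is simple enough to write down directly. A sketch (the function name and the way I've encoded the 19" judgment call are mine, not the poster's):

```python
def recommended_vram_mb(monitor_inches, target_res=(1280, 1024)):
    """The monitor-size rule of thumb above.
    At exactly 19" it depends on the resolution you actually want to game at."""
    if monitor_inches < 19:
        return 128
    if monitor_inches == 19:
        # Higher than 1280x1024 and you may well benefit from 256MB
        return 256 if target_res[0] * target_res[1] > 1280 * 1024 else 128
    return 256

print(recommended_vram_mb(17))                # 128
print(recommended_vram_mb(19))                # 128 at the default 1280x1024
print(recommended_vram_mb(19, (1600, 1200)))  # 256
print(recommended_vram_mb(21))                # 256
```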
 
I'll totally agree with the above post: bigger monitors will usually require more memory due to resolution (who is going to buy a 21" and set it to 800x600 anyway?)
 