1GB vs. 512MB

By IH8PunkRock
Jun 26, 2006
  1. OK, I'm searching the web and bam, ATI has a card with one gig of memory. I'm thinking, wow, that's really cool, but can it possibly be worth the $1,500 asking price? Wouldn't two 512MB cards be about the same, and cheaper? ATI's X1900 XTX is only $500 (of course ATI doesn't have SLI, as far as I know). Nvidia makes cards that compete with it....
    All I'm asking is: do you think it would be worth the money to get a card with one gig of memory, or will 512MB be more than enough?
  2. wolfram

    wolfram TechSpot Paladin Posts: 1,967   +9

    Where did you find that card??

    Also, 1GB of graphics memory would be completely useless right now. Even 512MB is more than you need.

    Anyway, I recommend a 512MB Radeon X1900 XTX or GeForce 7900 GTX (very expensive cards, though).
  3. KingCody

    KingCody TS Evangelist Posts: 992   +8

    SLI is exclusive to Nvidia; ATI cannot and will never have it.

    They do, however, have their own version of SLI. ATI's version is called CrossFire.

    And as mentioned earlier, 1GB of video RAM would be useless. Since you have apparently found the only 1GB card in existence (j/k :)), no games are written to take advantage of such a beast, because nobody owns one.

    By the way, a $1,500 video card :eek: :confused: ... that thing had better cook you breakfast every morning for that price! :D
  4. blue_dragon

    blue_dragon TS Rookie Posts: 190

    I hope he wasn't looking at the professional rendering cards, such as the Nvidia Quadros.
  5. cfitzarl

    cfitzarl TechSpot Chancellor Posts: 1,975   +9

    What graphics card was this? I have seen the Nvidia GeForce 7950 GX2 for a cheaper price (~$700). The Radeon X1900 is around $650 with 512MB. I thought that the only graphics card with 1GB was the GeForce 79 series. I'm not even close to an expert, though.... I'm pretty sure I have never seen a program that requires 1GB, or even 512MB, of video memory. Save some money and go for a 256MB card; most if not all programs/games will work with it. Although if you are a computer enthusiast who needs a lot of power, and you have the budget, go for the 1GB. In the future of video games, we'll probably all need it.
  6. IH8PunkRock

    IH8PunkRock TS Rookie Topic Starter Posts: 26

    OK, here they are, by popular demand, the 1-gig cards: |c:1558|&Sort=3&Recs=10

    (Still don't know how to make links, sorry.)
    The one I was talking about is at the bottom; I urge you to check it out.
    Thanks for the info, guys.
  7. wolfram

    wolfram TechSpot Paladin Posts: 1,967   +9

    Well, the 1GB cards are the 7950 GX2, very expensive. The $1,000 card is a workstation card; it's not designed for games. You're far better off with an X1900 XTX or even a 7950 GX2. I still prefer ATI over nVidia :)

    Stay away from the $1,000 card!
  8. AtK SpAdE

    AtK SpAdE TechSpot Chancellor Posts: 1,495

    I really don't understand the whole hype around the 7950. Nvidia was talking about a dual-core video card, but in reality it's two GPUs on one card. That's not dual-core, that's SLI on a single card; no big improvement.

    But I guess if you have the cash.

    As said, the bottom card would not be a good choice for gaming.
  9. IH8PunkRock

    IH8PunkRock TS Rookie Topic Starter Posts: 26

    OK, can someone explain to me what a "workstation" card is? Is it for, like, video programs or picture programs or something like that?
  10. Sharkfood

    Sharkfood TS Guru Posts: 1,019

    A workstation card is used for professional applications, not games.

    For example, if you did AutoCAD work and designed hotel lobbies, skyscrapers, or homes using $10K computer-aided design software and had dual monitors, those $1,000 cards are designed for such things. Unfortunately, they are very, very poor for games, as they are custom-tailored for professional applications.
  11. IH8PunkRock

    IH8PunkRock TS Rookie Topic Starter Posts: 26

    OK, so it's for the awesome, not-fun computers. I get it, thanks man.
  12. Sharkfood

    Sharkfood TS Guru Posts: 1,019

    One other on-topic comment: 2x512MB cards do not yield an effective 1 gig of video RAM. The reason: dual GPUs have to duplicate all texture usage.

    For example, if a game uses 180MB of textures, this will consume 360MB of VRAM in SLI, CrossFire, or dual-GPU configurations, as the textures have to be loaded locally in each GPU's memory. Textures take up the lion's share of VRAM consumption, and they are replicated for each GPU.

    So think of 2x512MB as actually having maybe 680MB of usable VRAM for a game, since textures consume 2x the VRAM.
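The back-of-the-envelope arithmetic above can be sketched in a few lines of Python. This is only a model of the post's reasoning, not actual driver behavior; `effective_vram_mb` is a made-up helper name, and the 344MB texture figure is an assumed value chosen to reproduce the "maybe 680MB" estimate:

```python
def effective_vram_mb(per_card_mb: int, num_gpus: int, texture_mb: int) -> int:
    """Rough model from the post above: textures are replicated on every
    GPU, so they count once toward the game's needs but consume VRAM on
    each card. Everything else is assumed to be stored only once."""
    total = per_card_mb * num_gpus
    duplicated_overhead = texture_mb * (num_gpus - 1)
    return total - duplicated_overhead

# 180MB of textures on two GPUs consumes 360MB of physical VRAM in total.
consumed_mb = 180 * 2

# Two 512MB cards with ~344MB of textures (assumed) leave roughly 680MB
# usable, matching the "maybe 680MB" estimate in the post.
usable_mb = effective_vram_mb(512, 2, 344)
```

In practice, alternate-frame-rendering SLI/CrossFire duplicates nearly all buffers, not just textures, so usable memory is often closer to a single card's 512MB; the function just formalizes the post's rough estimate.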

    Also, I have to go against the grain on 512MB of VRAM being "more than enough," as more modern games are already pushing well past the 256MB threshold. Oblivion, for example, can easily load up 286-320MB of VRAM and be forced to texture over the bus due to insufficient VRAM. While PCI Express is very fast for this, it's still only a fraction of the performance of texturing locally on the video card. There are also third-party high-detail texture mods that can push to the 512MB point as well. Battlefield 2 at higher resolutions and maximum (ultra, config-modded) settings can also push over 256MB of VRAM. BF2 also swaps a lot of textures in and out of VRAM at this level, hurting performance. Having 512MB of VRAM can help in this situation, big time.

    It's only a matter of time before games allow users with more than 512MB of VRAM to disable texture/asset swapping to improve performance. Oblivion and BF2 could see major improvements if the developers allowed this option, as both can easily surpass 256MB of VRAM and already cramp a 512MB card while constantly moving textures/assets in and out of VRAM to use it efficiently.
  13. blue_dragon

    blue_dragon TS Rookie Posts: 190

    I'm the one who brought up the whole workstation Quadro thing.
    Nvidia never said it would be dual-core; it's dual-GPU.
    Dual-core would be one GPU with two cores.