EVGA teases dual-GPU GeForce

By Jos · 23 replies
Jan 7, 2011
  1. Nvidia-exclusive add-in card maker EVGA has taken the occasion to show some of its new and upcoming products at this year’s CES. Among those is one that is bound to give AMD a run for its money: a new dual-GPU board that would presumably debut under the GeForce 500 family. The upcoming card has one SLI connector for quad-SLI setups, a dual-slot cooler with three fans, two 8-pin PCIe power connectors, and three DVI outputs.

    Read the whole story
  2. Mizzou

    Mizzou TS Enthusiast Posts: 823

    It's going to be really interesting to see this card go head to head with the Radeon 6990.
  3. TeamworkGuy2

    TeamworkGuy2 TS Enthusiast Posts: 191

    Very cool, can't wait to hear confirmed details about what core(s) will be inside.
    Would love to see it released under $400...
  4. SilverCider

    SilverCider TS Rookie Posts: 71

    As long as this card doesn't turn out to be dual GF104 chips, because that would simply put it in GTX 480/580 territory. Either way I think it will be a great card if the price is right! XD
  5. Mizzou

    Mizzou TS Enthusiast Posts: 823

    Agree with your assessment, but would expect something more in the $500 range for either card ... hopefully no higher. I have a pair of 5870's in my Intel rig and am curious to see if either of these can provide a substantial performance gain.
  6. Benny26

    Benny26 TechSpot Paladin Posts: 1,535   +51

    Sounds and looks like a right beast.
  7. princeton

    princeton TS Addict Posts: 1,676

    If the current ATI 6000 cards are any indication, the 6990 will be underwhelming at best.
  8. Sarcasm

    Sarcasm TS Guru Posts: 367   +46

    If they make it at around or under $400, they might have a winner.
  9. fpsgamerJR62

    fpsgamerJR62 TS Rookie Posts: 489

    If it's true that EVGA is putting together a pair of 460s on a single card, they would probably have to price it somewhere between the GTX 570 and the yet-to-be-announced GTX 560. More of a niche product, actually, like the 4850X2 that Sapphire made a while back. I would be more interested in a card with dual GTX 570 GPUs and more than the reference 1.25 GB of GDDR5 memory per GPU. After all, didn't Nvidia create the GTX 295 from a pair of GTX 260 / GTX 285 hybrids, which it later sold under the name GTX 275?
  10. TomSEA

    TomSEA TechSpot Chancellor Posts: 2,714   +855

    Besides what it can potentially do performance-wise, that's just a damn good-looking card.
  11. RaiDeR55

    RaiDeR55 TS Rookie Posts: 45

    That price seems to be about the sweet spot for this card. Otherwise you would be better off just getting two
    460's in SLI, which might be a tad cheaper.
    With three fans it should stay cool, but it could possibly be a little loud on fan noise.
  12. Jesse

    Jesse TS Evangelist Posts: 358   +42

    I agree, it's a beaut.
  13. CamaroMullet

    CamaroMullet TS Rookie Posts: 93

    Yeah, that is one sexy card. Curious to see the final specs and price. I doubt a dual 460 card could beat the 6990.
  14. princeton

    princeton TS Addict Posts: 1,676

    I'll repeat what I said before because I think you missed my comment.

    "If the current ATI 6000 cards are any indication, the 6990 will be underwhelming at best."

    Also this could very well be two 570s on one PCB.
  15. blimp01

    blimp01 TS Enthusiast Posts: 144

    Two 460 GPUs? I was expecting two 480 or 570 GPUs.
  16. Mizzou

    Mizzou TS Enthusiast Posts: 823

    Think most of us were; that's what was being reported by Fudzilla last November. Hard to see how Nvidia could charge less for a pair of GF110's than a GTX 580 ... something here just isn't adding up.
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Not really.
    People seem to have forgotten that ALL the GTX 460's released thus far have one shader module disabled. What if EVGA (and other AIBs) have been stockpiling fully functioning 384-shader / 8-shader-module GF104 GPUs? Then bin them for low power and/or high clocks - remember, a bog-standard unbinned GTX 460 1GB is a 160W TDP part. Assuming these full-function GPUs exist, they certainly won't go to market as a single-GPU solution, since their competition would be the superior GTX 560. Nvidia's naming convention doesn't lend itself to a separate model launch either... unless someone thinks GTX 467½ sounds like an attention grabber.

    Having said that, I think EVGA are foxing. If everyone (including AMD) assumes it's either a detuned GTX 570 or a GTX 460, then it becomes fairly easy to extrapolate probable performance... and to tailor a competing product to match or just exceed that performance.
    I would think that sets the card up nicely to launch using dual GF114s (GTX 560).
  18. They made one just because they can. Otherwise it's pointless. There is no room for it anywhere between all the 4xx and 5xx cards.
  19. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,224   +164

    Not really any other choice, is there Chef? Won't either of these in dual-GPU configuration need to
    have the 'throttle down' power monitoring circuitry? One that possibly cuts in before a FurMark or OCCT type of load? I imagine this is why neither AMD nor Nvidia has put out an x2 card yet.
  20. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,264

    Power throttling, as you say, is a given - whether software or hardware based - at this stage of the game. To run competitively against either a single high-end GPU card or dual single-card setups, this card and the HD 6990 are going to have to squeeze 2 x 200W of performance into the 300W PCIe specification. That means a combination of cherry-picked (binned) GPUs and reduced clock speeds at best (the GTX 295, for example, is essentially two GTX 275's - 219W TDP parts, coincidentally the same TDP as the GTX 570 - running at GTX 260 clocks. The HD 5970 likewise is something of a "hybrid").

    A large mitigating factor in a dual-GPU card introduction is its limited appeal - both for consumers, as a percentage of the graphics-buying customer base, and for AIBs, who need a binning process way above that used for selecting factory-OC GPUs/cards. The latter reason, I would say, explains why the HD 5970 was both produced in very limited numbers and priced very uncompetitively compared to a dual HD 5850 solution.

    The fact that dual GPU cards in general suffer disproportionately from driver woes (both as single card and quad-SLI/CFX), heat dissipation problems and protracted debugging timeframes probably makes them more of an asset to the respective PR departments than sales and tech support staff.
  21. Nice to see 3x DVI outputs ... no need for a $100 active display adapter for those of us with 3x DVI screens (as with the 6970 or 6990). After my 4870X2 died I'm down to just a 4890, so I'm currently looking for a replacement so that I'll have 3 screens again. I'll be watching this closely when it comes out. /jorgen
  22. Edito

    Edito TS Enthusiast Posts: 69   +9

    I'm so tired of this graphics card stuff. In less than a month they will release new stuff and people will forget about this one, and no dev will use it to its full potential because they're busy doing console games, tsc...
  23. Sarcasm

    Sarcasm TS Guru Posts: 367   +46

    herp derp

    I'm sorry, please keep your fanboy rantings on N4G
  24. Edito

    Edito TS Enthusiast Posts: 69   +9

    I'm sorry, but I think you didn't get my point. All I wanted to say is that graphics cards keep coming and coming, and the devs these days just don't do them justice. Tell me, am I wrong? Besides Crytek, who else makes you want a GTX 580 instead of a cheaper GTX 470? Just to let you know, I'm a diehard PC gamer and I love my PC more than any of my consoles. Keep that in mind, and if you think they are using those cards to their full potential, sorry, but I think you're wrong. Don't get me wrong.
