EVGA teases dual-GPU GeForce

Jos


Nvidia-exclusive add-in card maker EVGA took the occasion to show off some of its new and upcoming products at this year's CES. Among them is one that is bound to give AMD a run for its money: a new dual-GPU board that would presumably debut under the GeForce 500 family. The upcoming card has one SLI connector for quad-SLI setups, a dual-slot cooler with three fans, two 8-pin PCIe power connectors, and three DVI outputs.

Not many details were shared in terms of specs. Judging by the apparent power and cooling requirements, Tech Report speculates that it may house a pair of GF110 GPUs clocked somewhat lower than on the GTX 580, but Hexus claims to have confirmed it'll be based on two GF104 chips (found on the GTX 460). Other than that, Hexus says the card will carry 2GB of memory, support three-monitor Surround Vision gaming out of the box, and cost less than a flagship GTX 580.


We'll have to wait for official confirmation, but if Hexus is on the money then this isn't the card expected to go up against AMD's upcoming dual-GPU Radeon HD 6990 (Antilles). Still, we know what a pair of GeForce GTX 460s in SLI is capable of, so if EVGA gets the pricing right this could be a real winner.


 
It's going to be really interesting to see this card go head to head with the Radeon 6990.
 
Very cool, can't wait to hear confirmed details about what core(s) will be inside.
Would love to see it released under $400...
 
Mizzou said:
It's going to be really interesting to see this card go head to head with the Radeon 6990.
Only if this card doesn't turn out to be dual GF104 chips, because that would simply put it in GTX 480/580 territory. Either way I think it will be a great card if the price is right! XD
 
Only if this card doesn't turn out to be dual GF104 chips, because that would simply put it in GTX 480/580 territory. Either way I think it will be a great card if the price is right! XD

Agree with your assessment, but would expect something more in the $500 range for either card ... hopefully no higher. I have a pair of 5870s in my Intel rig and am curious to see if either of these can provide a substantial performance gain.
 
Mizzou said:
It's going to be really interesting to see this card go head to head with the Radeon 6990.

If the current ATI 6000 cards are any indication, the 6990 will be underwhelming at best.
 
If it's true that EVGA is putting together a pair of 460s on a single card, they would probably have to price it somewhere between the GTX 570 and the yet-to-be-announced GTX 560. More of a niche product, actually, like the 4850 X2 that Sapphire made a while back. I would be more interested in a card with dual GTX 570 GPUs and more than the reference 1.25GB of GDDR5 memory per GPU. After all, didn't Nvidia create the GTX 295 from a pair of GTX 260 / GTX 285 hybrids, which it later sold under the name GTX 275?
 
Besides what it can potentially do performance-wise, that's just a damn good-looking card.
 
sarcasm said:
If they make it at around or under $400, they might have a winner.

That price seems to be about the sweet spot for this card. Otherwise you would be better off just getting two 460s in SLI, which might be a tad cheaper.
With three fans it should stay cool, but it could possibly be a little loud on fan noise.
 
CamaroMullet said:
Yeah, that is one sexy card. Curious to see the final specs and price. I doubt a dual 460 card could beat the 6990.

I'll repeat what I said before because I think you missed my comment.

"If the current Ati 6000 cards are any indication the 6990 will be underwhelming at best. "

Also this could very well be two 570s on one PCB.
 
If it's true that EVGA is putting together a pair of 460s on a single card, they would probably have to price it somewhere between the GTX 570 and the yet-to-be-announced GTX 560.

Not really.
People seem to have forgotten that ALL the GTX 460s released thus far have one shader module disabled. What if EVGA (and other AIBs) have been stockpiling fully functioning 384-shader/8-shader-module GF104 GPUs? They could then bin them for low power and/or high clocks; remember, a bog-standard unbinned GTX 460 1GB is a 160W TDP part. Assuming these fully functional GPUs exist, they certainly won't go to market as a single-GPU solution, since their competition would be a superior GTX 560. Nvidia's naming convention doesn't lend itself to a separate model launch either... unless someone thinks GTX 467½ sounds like an attention grabber.

Having said that, I think EVGA is foxing. If everyone (including AMD) assumes it's either a detuned GTX 570 or GTX 460, then it becomes fairly easy to extrapolate probable performance... and to tailor a competing product to match or just exceed it.
I would think that sets the card up nicely to launch using dual GF114s (GTX 560).
 
They made one just because they can. Otherwise it's pointless; there is no room for it anywhere between all the 4xx and 5xx cards.
 
remember, a bog-standard unbinned GTX 460 1GB is a 160W TDP part.


Having said that, I think EVGA is foxing. If everyone (including AMD) assumes it's either a detuned GTX 570 or GTX 460

Not really any other choice, is there, Chef? Won't either of these in a dual-GPU configuration need the 'throttle down' power monitoring circuitry? One that possibly cuts in before a Furmark- or OCCT-type load? I imagine this is why neither AMD nor Nvidia has put out an x2 card yet.
 
Power throttling, as you say, is a given at this stage of the game, whether software or hardware based. To run competitively against either a single high-end GPU card or dual single-card setups, this card and the HD 6990 are going to have to squeeze 2 x 200W of performance into the 300W PCIe specification. That means a combination of cherry-picked (binned) GPUs and reduced clock speeds at best (the GTX 295, for example, is essentially two GTX 275s, which are 219W TDP parts, coincidentally the same TDP as the GTX 570, running at GTX 260 clocks; the HD 5970 likewise is something of a "hybrid").
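
To put rough numbers on that squeeze, here's a minimal back-of-envelope sketch in Python. The TDP figures are the ones quoted in this thread, and the 75W slot / 150W per 8-pin delivery limits are the standard PCIe numbers; everything else is illustrative arithmetic, not a statement about the actual card.

# Back-of-envelope power budget for a dual-GPU board.
PCIE_SLOT_W = 75      # power available from the x16 slot
EIGHT_PIN_W = 150     # power per 8-pin PCIe connector
SPEC_CAP_W = 300      # PCIe spec ceiling for a single board

# The teased card carries two 8-pin plugs.
deliverable = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(f"Deliverable: {deliverable}W vs {SPEC_CAP_W}W spec cap")  # 375W vs 300W

for name, tdp_w in [("GTX 460 1GB (GF104)", 160), ("GTX 570-class (GF110)", 219)]:
    stock = 2 * tdp_w                  # two GPUs at stock TDP
    trim = 1 - SPEC_CAP_W / stock      # share of stock power to bin/clock away
    print(f"2 x {name}: {stock}W stock, trim ~{trim:.0%} to fit the cap")

Two binned GF104s only need roughly 6% shaved off stock power, while two 219W-class GPUs need about a third cut, which is why the GTX 295 ended up running at GTX 260 clocks.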

A large mitigating factor in a dualie card's introduction is its limited appeal, both to consumers (as a percentage of the graphics-buying customer base) and to AIBs, who need a binning process far above that used for selecting factory-OC'd cards. The latter reason, I would say, explains why the HD 5970 was both produced in very limited numbers and priced very uncompetitively relative to a dual HD 5850 solution.

The fact that dual-GPU cards in general suffer disproportionately from driver woes (both as a single card and in quad-SLI/CFX), heat dissipation problems, and protracted debugging timeframes probably makes them more of an asset to the respective PR departments than to sales and tech support staff.
 
Nice to see 3x DVI outputs ... no need for a $100 active display adapter for those of us with 3x DVI screens (as would be the case with the 6970 or 6990). After my 4870 X2 died I'm down to just a 4890, so I'm currently looking for a replacement that will give me three screens again. I'll be watching this closely when it comes out. /jorgen
 
I'm so tired of this card stuff. In less than a month they will release new stuff, people will forget about this one, and no dev will use it to its full potential because they're busy doing console games, tsc...
 
Edito said:
I'm so tired of this card stuff. In less than a month they will release new stuff, people will forget about this one, and no dev will use it to its full potential because they're busy doing console games, tsc...

herp derp

I'm sorry, please keep your fanboy rantings on N4G
 
sarcasm said:
Edito said:
I'm so tired of this card stuff. In less than a month they will release new stuff, people will forget about this one, and no dev will use it to its full potential because they're busy doing console games, tsc...

herp derp

I'm sorry, please keep your fanboy rantings on N4G

I'm sorry, but I think you didn't get my point. All I wanted to say is that graphics cards keep coming and coming, and devs these days just don't do justice to them. Tell me, am I wrong??? Besides Crytek, who else makes you want a GTX 580 instead of a cheaper GTX 470? Just so you know, I'm a diehard PC gamer and I love my PC more than any of my consoles, keep that in mind. And if you think they are using those cards to their full potential, sorry, but I think you're wrong. Don't get me wrong.
 