Dual-GPU GeForce GTX 590 next month, AMD to respond later

Posted on February 25, 2011, 10:39 AM
With both Nvidia and AMD having released their latest-generation GPUs late last year, it's only a matter of time before each readies an ultra-expensive dual-GPU version of its flagship graphics card, hoping to take the undisputed performance crown from the other. Even if only a small percentage of gamers will actually spend this kind of money ($500+) to get hold of one of these boards, we eagerly await each year to see how the battle unfolds.

Rumors started back in November with a few leaked photos hinting at a possible GeForce GTX 590 release in February, but that is unlikely to happen at this point. Sources close to Taiwan-based Digitimes say that AMD and Nvidia are seeing dropping demand for low-end discrete GPUs as IGPs and especially APUs (graphics integrated into the CPU package) take over that segment. As a result, GPU manufacturers are reportedly setting their sights on the performance market with particular attention this year, given the shift in demand.


Word on the street now is that Nvidia will unveil the dual-GPU GeForce GTX 590 by the end of next month. The card is expected to pack two GF110 graphics processors for a total of 1024 CUDA cores, 3072MB of GDDR5 memory, dual 384-bit memory buses, 128 texture units, and 96 ROPs. These specifications would effectively double those of Nvidia's single-GF110 GeForce GTX 580.
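The doubling described above is easy to sanity-check. The sketch below (purely illustrative) starts from the GTX 580's published unit counts (512 CUDA cores, 1536MB, a 384-bit bus, 64 texture units, 48 ROPs, and a 4008 MT/s effective GDDR5 data rate), all of which are assumptions drawn from Nvidia's single-GPU spec sheet rather than from this article, and doubles them to arrive at the rumored GTX 590 figures.

```python
# Illustrative arithmetic only: doubling the GTX 580's published specs
# to arrive at the rumored GTX 590 figures. The GTX 580 numbers below
# are assumptions taken from Nvidia's single-GPU card.

GTX_580 = {
    "cuda_cores": 512,
    "memory_mb": 1536,
    "bus_bits": 384,      # per-GPU bus; the dual card keeps one per GPU
    "texture_units": 64,
    "rops": 48,
}

# Two GF110s on one board simply double every unit count.
gtx_590 = {k: v * 2 for k, v in GTX_580.items()}
print(gtx_590)  # 1024 cores, 3072 MB, 128 TMUs, 96 ROPs

def bandwidth_gbs(bus_bits, effective_mts):
    """Memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return bus_bits / 8 * effective_mts / 1e3

# Per GPU, assuming the GTX 580's 4008 MT/s effective memory clock.
print(round(bandwidth_gbs(384, 4008), 1))  # ~192.4 GB/s per GPU
```

Note that "dual 384-bit buses" doubles aggregate bandwidth but not the bandwidth any single GPU sees, which is why dual-GPU cards behave like SLI on a stick rather than one wider GPU.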

Meanwhile, AMD is said to have its Radeon HD 6990 graphics card ready to go, but is waiting for Nvidia to make the first move so that Nvidia doesn't get the chance to make last-minute adjustments to tweak the GTX 590's performance. AMD's dual-GPU Radeon made a brief public appearance last month and is expected to integrate 3840 stream processors and 4GB of GDDR5 memory clocked at 4.80GHz.




User Comments: 32

Win7Dev said:

I am wondering how long it will be before they realize that manufacturers won't offer these cards because then they will have to put in 650w+ PSU's and more expensive motherboards to be able to handle these power hungry cards. I would honestly rather get two cards and put them in crossfire or SLI to get this kind of performance. With one dual gpu card you are going to have some serious heat, but with two cards you will hopefully have half the heat of a dual gpu card. Really, you are going to get heat either way, but each card will have it's own fan or cooling system. There just isn't enough space to cool a dual gpu card as well unless you make it double the size, which really defeats the point of having a dual gpu card.

Jurassic4096 said:

Win7Dev said:

I am wondering how long it will be before they realize that manufacturers won't offer these cards because then they will have to put in 650w+ PSU's and more expensive motherboards to be able to handle these power hungry cards. I would honestly rather get two cards and put them in crossfire or SLI to get this kind of performance. With one dual gpu card you are going to have some serious heat, but with two cards you will hopefully have half the heat of a dual gpu card. Really, you are going to get heat either way, but each card will have it's own fan or cooling system. There just isn't enough space to cool a dual gpu card as well unless you make it double the size, which really defeats the point of having a dual gpu card.

Except the people buying these cards, know what it takes to run them. 650w for an enthusiast is weak at best. i got 2 HD 6950's and i have an 850w just so i have the headroom for this and future upgrades. Dual GPU cards are not what makes nVIDIA and AMD their GPU money. it's everything else, mainly low to midrange cards.

Jurassic4096 said:

as for your comment about more expensive motherboards being needed... that's just plain malarky. The same PCIe spec is the same on every board. only thing that changes are the number of lanes available. to sum up, ANY x16 slot will gladly accept this card. two of them running at x8 x8 is another story!

lipe123 said:

You need a monster board to get quad SLI with single cards, but even cheap boards have 2 slots.

Also typically if you are willing to spend 3million dollars for one of these you can spend another 500$ for water cooling.

Wagan8r Wagan8r said:

Win7Dev said:

I am wondering how long it will be before they realize that manufacturers won't offer these cards because then they will have to put in 650w+ PSU's and more expensive motherboards to be able to handle these power hungry cards. I would honestly rather get two cards and put them in crossfire or SLI to get this kind of performance. With one dual gpu card you are going to have some serious heat, but with two cards you will hopefully have half the heat of a dual gpu card. Really, you are going to get heat either way, but each card will have it's own fan or cooling system. There just isn't enough space to cool a dual gpu card as well unless you make it double the size, which really defeats the point of having a dual gpu card.

That comment was full of ignorance.

It won't take any time at all because they already know that manufacturers won't offer these cards in OEM products. That's not what they're targeting with these. Custom system builders will put these kinds of PSUs in computers anyway, so you really don't have a point. No matter how expensive your motherboard is, the amount of power coming out of the PCI-E slot is the same. That was a retarded thing of you to say. Some people don't have two PCI-E slots (becoming more rare), and so SLI-like performance is only possible with a dual GPU. Your heat comments are trash. It's more likely that the TDP is lower on a dual GPU than an SLI set-up due to shared resources, so the amount of energy lost to heat and electrical leakage is probably less. I could go on and on, but it's not worth the effort.

mccartercar said:

Depending on driver development and game compatibility, you will always see better gains from a single card solution rather than an sli setup running at same spec.

princeton princeton said:

mccartercar said:

Depending on driver development and game compatibility, you will always see better gains from a single card solution rather than an sli setup running at same spec.

Uh no? It's the same technology just utilizing 1 PCIE slot. A dual gpu card will perform WORSE than a comparable multi card setup because the clocks are lower.

ebolamonkey3 said:

On the 2nd to last sentence, I think you meant AMD is waiting for GTX 590 to release so they can make last minute tweaks to the 6990.

ebolamonkey3 said:

Nvm, I can't read :p

jimsing59 jimsing59 said:

This is just overkill. You can run a bunch of games at once with that.

Chazz said:

I really don't understand how Nvidia will pack two 580s on one card without completely killing the clocks and completely destroying the PCI spec standards. If nvidia pulls this off my respect for them will go up.

TrekExpert TrekExpert said:

What a lot of people don't realize is that even if this card accounts for a small margin of their sales it doesn't matter. What the majority of everyday consumers are going to remember when they go to buy a new system is that "Nvidia makes the best graphics card". They will have an overall bias towards Nvidia. This is the advantage of developing the high end market.

Sarcasm Sarcasm said:

I don't know what's the point anymore. There really aren't any PC games that will even take advantage of even a single GTX 580 let alone a Dual GPU one.

St1ckM4n St1ckM4n said:

Stop speaking filth about "omg overkill, nothing can max it!11!!one!!".

Come back when you are playing at 2560x1440, with AA, AF, and everything else maxed out.

red1776 red1776, Omnipotent Ruler of the Universe, said:

Stop speaking filth about "omg overkill, nothing can max it!11!!one!!".

Come back when you are playing at 2560x1440, with AA, AF, and everything else maxed out.

Right! as a matter of fact "maxed out" gets a rather liberalized parsing here and elsewhere. you don't even have to go up to 2560 res to challenge the top cards out there. Try metro 2033 maxed...I mean actually maxed not the 'revised' edition of "maxed out" at 1920x 1080. Not to mention multi screen setups. DX11 is in its infancy, but will grow up fast. Take your GTX 570, HD 6970 or wherever you draw the line of not being overkill, and give 3DMark11 a run. The software coders are aware that this tech and GPU capacity is out there and available...and they intend to make use of every bit of it. I have 4-HD 5850's overclocked in quad Crossfire that will attest that there is no such thing as too much graphic horsepower if you want to actually play with everything on its highest.

I don't know what's the point anymore. There really aren't any PC games that will even take advantage of even a single GTX 580 let alone a Dual GPU one.

Yes there is:

[link] (PhysX disabled)

[link] (this is only in 'Gamer mode' at 2x AA) not even close to the HP it takes for 'Enthusiast' and 4x AA

[link]

***Note AA is disabled here ( was pre- GTX 580 release, but you get the idea)

dividebyzero dividebyzero, trainee n00b, said:

I don't know what's the point anymore. There really aren't any PC games that will even take advantage of even a single GTX 580 let alone a Dual GPU one.

AvP and Crysis Warhead maxed* (4xAA/16xAF) at 2560x1600

Metro 2033 maxed* (AAA / 4xAA with tesselation and DoF enabled) at 2560x1600.

* "Maxed" meaning maximum in-game menu settings. The Nvidia CP can of course effect much higher levels of game IQ, including higher levels of multisampled/supersampled antialiasing (with the option to effect transparency AA on either).

Bear in mind that a big selling point of the card will be Surround gaming using a single card (or 3D Surround for a select few).

captainawesome captainawesome said:

likes comments by: red1776, st1ckm4n, TrekExpert and Princeton.

He also lol'd at ebolamonkey3 because he thought the same thing until he re-read that sentence.

Ok I will stop referring to myself in third person-FB style.

I'm not so sure about this sentence though:

"Sources close to Taiwan-based Digitimes are saying that AMD and Nvidia are seeing a dropping demand for low-end discrete GPUs as IGPs and especially APUs (graphics bundled within the CPU package) take over this segment."

So LOWER demand from cheaper tech is taking over?

captainawesome captainawesome said:

Oh an, Holy crap red, FOUR 5850s? What exactly can't you play at max?

princeton princeton said:

captainawesome said:

Oh an, Holy crap red, FOUR 5850s? What exactly can't you play at max?

Red found gold in his backyard a little while back :P

I'm pretty sure he can max any of the games on the market on a single monitor.

red1776 red1776, Omnipotent Ruler of the Universe, said:

princeton said:

Red found gold in his backyard a little while back :P

Nope...it's a pure case of enthusiast OCD...cant stand an empty PCIE slot :p

I'm pretty sure he can max any of the games on the market on a single monitor.

True now...but give a month or two.

HP360 said:

There is still no info about the number of stream processors on the 6990. If its dual 6970 then it should have 3072 stream processors, but if the antiles cores are larger it might have 3960 stream processors, which will have 30% higher performance than two 6970s in crossfire if the base clock was overclocked at the level of the 6970 (if asus makes Ares II this will be possible). So if the halph year old leaked charts from AMD are correct, this can be the fastest graphics cart.

The 6990 should have higher clock speeds than the GTX 590 due to less heat created from the significantly smaller die size of its processors (Comparing the 6970 core to GTX580).

dividebyzero dividebyzero, trainee n00b, said:

Using TPU's power usage figures, which are pretty much accepted by most people, the HD 6970 with 1536 shaders uses 185w in normal high-demand gaming -that's 370w for 3072 shaders- yet somehow you believe that two souped-up, previously unannounced 1980-shader parts* are going to be both feasible and run cooler than two downclocked GF110's, when in single GPU state the nvidia card is marginally cooler for the most part.

Adding additional shaders does not cause a GPU to use less power and to output less heat.

* The only "1980 shader Cypress XT/ 3960 shader Antilles" items ever to surface on the net were photoshopped fakes that were discredited as soon as the actual shader count of Cypress was known (i.e. around two months ago).
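[Editor's note: the back-of-the-envelope scaling in the comment above can be sketched as follows. The 185 W gaming-load figure for the 1536-shader HD 6970 is the TPU number the commenter cites; scaling power linearly with shader count is a simplifying assumption, since extra shaders at the same clock and voltage scale power roughly, not exactly, in proportion.]

```python
# Rough per-shader power scaling, following the comment's reasoning.
# Assumption: power grows roughly linearly with shader count at fixed
# clocks and voltage.

HD6970_SHADERS = 1536
HD6970_GAMING_WATTS = 185  # TPU's typical-gaming figure for one HD 6970

watts_per_shader = HD6970_GAMING_WATTS / HD6970_SHADERS

# Two stock HD 6970 GPUs on one board:
print(round(2 * HD6970_GAMING_WATTS))      # 370 W for 3072 shaders

# The rumored (and debunked) 1980-shader part, scaled linearly:
print(round(2 * 1980 * watts_per_shader))  # ~477 W for 3960 shaders
```

For reference, a PCIe slot plus two 8-pin connectors officially delivers 75 W + 2 x 150 W = 375 W, which is why a ~477 W dual-GPU board would not be feasible within spec.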

fpsgamerJR62 said:

Performance wise, I think most of us already know how a single HD 6970 performs against a single GTX 580. So, when both AMD and Nvidia put a pair of their fastest GPUs on a single PCB, I would think that the resulting head-to-head benchmarks would mostly follow the pattern of the single GPU testing. My own theory on why AMD wants Nvidia to release its dual-GPU card first is based on my observations regarding the availability of AMD's 6900 series versus that of Nvidia's GTX 500 series at the computer stores where I buy my PC stuff. I've noticed that there are several brands of GTX 500 cards which are widely available. On the AMD side, only the 6800 series are easily available with the 6950 series being represented by Gigabyte and Sapphire, both more expensive than the GTX 560 Ti with the Sapphire card especially so since it sports only a 1-GB buffer. I'm guessing that AMD does not have that many 6970 GPUs in the first place which is probably why they're turning a blind eye toward the people flashing their 6950s with 6970 BIOSes. AMD wants Nvidia to launch first, not because it doesn't want to give Nvidia time to tweak the 590's performance, but probably because they don't have enough 6990's in stock to make a proper product launch.

HP360 said:

I didnt say the Antiles core will run cooler if it has 1980 stream processors insted of 1536, all i said is that if it did have 3960 it might outperform the GTX590, and if it doesnt than it will run much cooler than the GTX card (check the die sizes nVidias core size is ridiculous compared to amd and yet it barely outperforms the 6970 in newtral benchmarks, even the GTX 570 which is sloewer than the 6970 has a bigger die size. Looking at the history of dual GPUs nVidia is in a great disadvantage here even if their core is faster. They put two 8pin power connectors on that card whch will create so much heat that it will turn your computer into an oven don't even think about overclocking this card. Both the 5970 and the Asus made Ares are the fastest cards right now and they are running cool and stable which gives them a good room for overclocking.) So dont expect the GTX 590 to be more than 50% faster than the GTX 580.

dividebyzero dividebyzero, trainee n00b, said:

My own theory on why AMD wants Nvidia to release its dual-GPU card first is based on my observations regarding the availability of AMD's 6900 series versus that of Nvidia's GTX 500 series at the computer stores where I buy my PC stuff. I've noticed that there are several brands of GTX 500 cards which are widely available. On the AMD side, only the 6800 series are easily available with the 6950 series being represented by Gigabyte and Sapphire, both more expensive than the GTX 560 Ti with the Sapphire card especially so since it sports only a 1-GB buffer.

Yours is probably a case of restrictions/availability/pricing due to geographic location and where your area fits in the distribution hierarchy. A case in point would be my location where factory OC'ed GTX 560 Ti's are on par (pricing) with HD 6950's (both 1 and 2GB), yet HD 6970's carry a 10-20% pricing premium over GTX 570's -most of which sport factory overclocks- when the MSRP/RRP's suggest that pricing difference should be minimal. We too have a relative dearth of vendor options (MSI in very short supply; no Diamond, VisionTek, PoV, Biostar, Sparkle, PNY (workstation only) or ECS, for example). I'm not aware of any widespread (worldwide) supply constraints on either card.

I'm guessing that AMD does not have that many 6970 GPUs in the first place which is probably why they're turning a blind eye toward the people flashing their 6950s with 6970 BIOSes. AMD wants Nvidia to launch first, not because it doesn't want to give Nvidia time to tweak the 590's performance, but probably because they don't have enough 6990's in stock to make a proper product launch.

Personally I would think that both AMD and Nvidia have been binning low voltage leaking GPU's from day one. If anything, I'd say that nvidia has a little more leeway in their binning process due to the GF110 using both a lower input voltage and a wider voltage range -for instance a lot of GF110's work at <0.9v, whereas I don't think there is a CypressXT core in existence that can work when undervolted to 1v.

The BIOS flashing 6950 to 6970 scenario I think is just AMD not being fully in tune with the people they sell cards to. The whole 6970/6950 launch reeked of "rush-job" and I don't think the guys at AMD took note of the fact that enthusiasts will always look at two SKU's derived from the same GPU and try to gain some free performance (see selected GTX465 > GTX470 as a recent example).

The other option is that AMD skipped the laser fusing-off of shader blocks either as a cost cutting/time constraint or gamble that consumers wouldn't attempt the BIOS flash in great numbers and hence cannibalize 6970 sales.

I didnt say the Antiles core will run cooler if it has 1980 stream processors insted of 1536, all i said is that if it did have 3960 it might outperform the GTX590, and if it doesnt.....

Oh, my bad. When you said it would run cooler, I thought you meant it would run cooler:

The 6990 should have higher clock speeds than the GTX 590 due to less heat created

Your whole premise is that IF the HD6990 has 3960 shaders. i.e. a joke. IF the CypressXT has 1980 shaders then they damn well would have released the 6970 with them enabled to ensure AMD grabbed the "fastest-GPU" PR crown. The only reason they would not do this (if the extra shaders existed) would be because the 1980 shader version was so power hungry it turned into a mini-blast furnace (see the original GF100 512-shader core as reference)- yet you believe that AMD decided to forego having the fastest single GPU in preference to using the core in a limited edition part of very limited appeal.

On a related note...

1. CypressXT does not have 1980 shaders. Never had- never will

2. You're asking people to believe that someone who skipped grammar and vocabulary in school is somehow in possession of GPU facts that chip engineers and enthusiasts are totally ignorant of. You don't see a fundamental flaw in your argument do you ?

3. Why whine on about the GTX590 having 2x8pin power yet constantly reference the Asus Ares (a dictionary definition of PR stunt), a card that has 2 x 8pin AND 6pin power delivery.

4. Take your trolling somewhere else.

cyriene said:

sarcasm said:

I don't know what's the point anymore. There really aren't any PC games that will even take advantage of even a single GTX 580 let alone a Dual GPU one.

Uh, wrong. Once you set the resolution to 2560x1600 (or higher) and crank the AA and IQ to max many games benefit from the extra power. Metro 2033 for example and of course Crysis.

red1776 red1776, Omnipotent Ruler of the Universe, said:

Your whole premise is that IF the HD6990 has 3960 shaders. i.e. a joke. IF the CypressXT has 1980 shaders then they damn well would have released the 6970 with them enabled to ensure AMD grabbed the "fastest-GPU" PR crown. The only reason they would not do this (if the extra shaders existed) would be because the 1980 shader version was so power hungry it turned into a mini-blast furnace (see the original GF100 512-shader core as reference)- yet you believe that AMD decided to forego the having the fastest single GPU in preference to using the core in a limited edition part of very limited appeal.

On a related note...

1. CypressXT does not have 1980 shaders. Never had- never will

2. You're asking people to believe that someone who skipped grammar and vocabulary in school is somehow in possession of GPU facts that chip engineers and enthusiasts are totally ignorant of. You don't see a fundamental flaw in your argument do you ?

3. Why whine on about the GTX590 having 2x8pin power yet constantly reference the Asus Ares (a dictionary definition of PR stunt), a card that has 2 x 8pin AND 6pin power delivery.

Geez Chef...you have no imagination, I mean C'mon now, The guy has a point 1980 shaders x2 is definitely doable !

[link]

(I know ,I know, I had six minutes to PS it before taking the kids sledding) :p

dividebyzero dividebyzero, trainee n00b, said:

CrossfireX might pose a challenge for cable management !

red1776 red1776, Omnipotent Ruler of the Universe, said:

CrossfireX might pose a challenge for cable management !

Indeed! Especially for the wire management challenged such as myself.

....still...I'm thinking quad...I am assuming this dude has 8 way figured out as well.

Regenweald said:

UPDATE: you might be interested to know that [H] already has a 6990 in house for testing. Somehow I doubt that reviewers have been provided with cards and are now expected to sit on them for a month. 8th March looks more likely. Techpowerup, 3d Center.

dividebyzero dividebyzero, trainee n00b, said:

@ Regenweald

Please try to keep up (note from post #46 posted a little over a day ago)

Oddly enough we were attempting to confine our HD6990 speculation to the HD6990 thread....nutty huh?! This is the GTX590 thread, until it got derailed by deranged troll (shock horror)

Regenweald said:

Nah, wasn't keeping track in the forums....hence the lateness. i don't mind the 6990/590 threads getting muddied up, they are after all, cousins...
