Dual-GPU GeForce GTX 590 next month, AMD to respond later

Julio Franco

Staff member

With both Nvidia and AMD having released their latest-generation GPUs late last year, it's only a matter of time before each readies an ultra-expensive dual-GPU version of its flagship graphics card, hoping to take the undisputed performance crown from the other. Even if only a small percentage of gamers will actually spend this kind of money ($500+) to get hold of one of these boards, we eagerly await each year to see how the battle unfolds.

Rumors started back in November with a few leaked photos hinting at a possible GeForce GTX 590 release in February, but that is unlikely to happen at this point. Sources close to Taiwan-based Digitimes are saying that AMD and Nvidia are seeing a dropping demand for low-end discrete GPUs as IGPs and especially APUs (graphics bundled within the CPU package) take over this segment. As a result, GPU manufacturers are expected to set their sights on the performance market with particular attention this year given the shift in demand.


Word on the street now is that Nvidia will unveil the dual-GPU GeForce GTX 590 by the end of next month. The GeForce GTX 590 is expected to pack two GF110 graphics processors for a total of 1024 CUDA cores, 3072MB of GDDR5 memory, dual 384-bit memory buses, 128 texture units, and 96 ROPs. These specifications would effectively double those of Nvidia's single-GF110 GeForce GTX 580.
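For illustration, the rumored figures line up with a straight doubling of the GTX 580's published single-GPU specs (512 CUDA cores, 1536MB of memory, 64 texture units, 48 ROPs; those baseline numbers come from public spec sheets, not from the sources above). A quick sketch:

```python
# GTX 580 per-GPU resources (public spec-sheet figures, assumed here)
gtx580 = {"cuda_cores": 512, "memory_mb": 1536, "texture_units": 64, "rops": 48}

# A dual-GF110 board simply carries two of everything
gtx590 = {name: count * 2 for name, count in gtx580.items()}
print(gtx590)
# {'cuda_cores': 1024, 'memory_mb': 3072, 'texture_units': 128, 'rops': 96}
```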

Meanwhile, AMD is said to have its Radeon HD 6990 graphics card ready to go, but the company is waiting for Nvidia to make the first move so that Nvidia doesn't have the chance to make last-minute adjustments to tweak the GTX 590's performance. AMD's dual-GPU Radeon made a brief public appearance last month and is expected to integrate 3840 stream processors and 4GB of GDDR5 memory clocked at 4.80GHz.


 
I am wondering how long it will be before they realize that manufacturers won't offer these cards, because then they will have to put in 650W+ PSUs and more expensive motherboards to handle these power-hungry cards. I would honestly rather get two cards and put them in CrossFire or SLI to get this kind of performance. With one dual-GPU card you are going to have some serious heat, but with two cards you will hopefully have half the heat of a dual-GPU card. Really, you are going to get heat either way, but each card will have its own fan or cooling system. There just isn't enough space to cool a dual-GPU card as well, unless you make it double the size, which really defeats the point of having a dual-GPU card.
 
Win7Dev said:
I am wondering how long it will be before they realize that manufacturers won't offer these cards, because then they will have to put in 650W+ PSUs and more expensive motherboards to handle these power-hungry cards. I would honestly rather get two cards and put them in CrossFire or SLI to get this kind of performance. With one dual-GPU card you are going to have some serious heat, but with two cards you will hopefully have half the heat of a dual-GPU card. Really, you are going to get heat either way, but each card will have its own fan or cooling system. There just isn't enough space to cool a dual-GPU card as well, unless you make it double the size, which really defeats the point of having a dual-GPU card.

Except the people buying these cards know what it takes to run them. 650W for an enthusiast is weak at best. I've got two HD 6950s and I have an 850W just so I have the headroom for this and future upgrades. Dual-GPU cards are not what makes Nvidia and AMD their GPU money; it's everything else, mainly low to midrange cards.
 
As for your comment about more expensive motherboards being needed... that's just plain malarkey. The PCIe spec is the same on every board; the only thing that changes is the number of lanes available. To sum up, ANY x16 slot will gladly accept this card. Two of them running at x8/x8 is another story!
 
You need a monster board to get quad SLI with single cards, but even cheap boards have 2 slots.

Also, typically, if you are willing to spend 3 million dollars on one of these, you can spend another $500 on water cooling.
 
Win7Dev said:
I am wondering how long it will be before they realize that manufacturers won't offer these cards, because then they will have to put in 650W+ PSUs and more expensive motherboards to handle these power-hungry cards. I would honestly rather get two cards and put them in CrossFire or SLI to get this kind of performance. With one dual-GPU card you are going to have some serious heat, but with two cards you will hopefully have half the heat of a dual-GPU card. Really, you are going to get heat either way, but each card will have its own fan or cooling system. There just isn't enough space to cool a dual-GPU card as well, unless you make it double the size, which really defeats the point of having a dual-GPU card.
That comment was full of ignorance.

It won't take any time at all, because they already know that manufacturers won't offer these cards in OEM products. That's not what they're targeting with these. Custom system builders will put these kinds of PSUs in computers anyway, so you really don't have a point. No matter how expensive your motherboard is, the amount of power coming out of the PCIe slot is the same, so that argument doesn't hold up either. Some people don't have two PCIe slots (becoming more rare), and so SLI-like performance is only possible with a dual GPU. Your heat comments don't stand up either: it's more likely that the TDP is lower on a dual GPU than on an SLI setup due to shared resources, so the amount of energy lost to heat and electrical leakage is probably less. I could go on and on, but it's not worth the effort.
 
Depending on driver development and game compatibility, you will often see better gains from a single-card solution than from an SLI setup running at the same spec.
 
mccartercar said:
Depending on driver development and game compatibility, you will often see better gains from a single-card solution than from an SLI setup running at the same spec.

Uh, no? It's the same technology, just utilizing one PCIe slot. A dual-GPU card will perform WORSE than a comparable multi-card setup because the clocks are lower.
 
On the 2nd to last sentence, I think you meant AMD is waiting for GTX 590 to release so they can make last minute tweaks to the 6990.
 
I really don't understand how Nvidia will pack two 580s onto one card without completely killing the clocks and blowing right past the PCIe spec's power limits. If Nvidia pulls this off, my respect for them will go up.
 
What a lot of people don't realize is that even if this card accounts for a small margin of their sales, it doesn't matter. What the majority of everyday consumers will remember when they go to buy a new system is that "Nvidia makes the best graphics card." They will have an overall bias toward Nvidia. This is the advantage of developing the high-end market.
 
I don't know what the point is anymore. There really aren't any PC games that will take advantage of even a single GTX 580, let alone a dual-GPU one.
 
Stop speaking filth about "omg overkill, nothing can max it!11!!one!!".

Come back when you are playing at 2560x1440, with AA, AF, and everything else maxed out.
 
Stop speaking filth about "omg overkill, nothing can max it!11!!one!!".

Come back when you are playing at 2560x1440, with AA, AF, and everything else maxed out.

Right! As a matter of fact, "maxed out" gets a rather liberal parsing here and elsewhere. You don't even have to go up to 2560 res to challenge the top cards out there. Try Metro 2033 maxed... I mean actually maxed, not the "revised" edition of "maxed out", at 1920x1080. Not to mention multi-screen setups. DX11 is in its infancy, but will grow up fast. Take your GTX 570, HD 6970, or wherever you draw the line of not being overkill, and give 3DMark11 a run. The software coders are aware that this tech and GPU capacity is out there and available... and they intend to make use of every bit of it. I have four HD 5850s overclocked in quad CrossFire that will attest there is no such thing as too much graphics horsepower if you want to actually play with everything at its highest.

I don't know what the point is anymore. There really aren't any PC games that will take advantage of even a single GTX 580, let alone a dual-GPU one.

Yes there is:

http://www.guru3d.com/article/geforce-gtx-580-review/14 (PhysX disabled)

http://www.guru3d.com/article/geforce-gtx-580-review/16 (this is only in "Gamer" mode at 2x AA, not even close to the horsepower it takes for "Enthusiast" mode and 4x AA)

http://www.legionhardware.com/articles_pages/lost_planet_2_gpu_performance_preview,5.html

***Note: AA is disabled here (this was pre-GTX 580 release, but you get the idea).
 
I don't know what the point is anymore. There really aren't any PC games that will take advantage of even a single GTX 580, let alone a dual-GPU one.

AvP and Crysis Warhead maxed* (4xAA/16xAF) at 2560x1600
Metro 2033 maxed* (AAA / 4xAA with tessellation and DoF enabled) at 2560x1600.

* "Maxed" meaning the maximum available in the in-game menu. The Nvidia control panel can of course effect much higher levels of game IQ, including higher levels of multisampled/supersampled antialiasing (with the option to apply transparency AA to either).

Bear in mind that a big selling point of the card will be Surround gaming using a single card (or 3D Surround for a select few).
 
likes comments by: red1776, st1ckm4n, TrekExpert and Princeton.
He also lol'd at ebolamonkey3 because he thought the same thing until he re-read that sentence.
Ok I will stop referring to myself in third person-FB style.
I'm not so sure about this sentence though:
"Sources close to Taiwan-based Digitimes are saying that AMD and Nvidia are seeing a dropping demand for low-end discrete GPUs as IGPs and especially APUs (graphics bundled within the CPU package) take over this segment."
So LOWER demand from cheaper tech is taking over?
 
captainawesome said:
Oh and, holy crap Red, FOUR 5850s? What exactly can't you play at max?

Red found gold in his backyard a little while back :p

I'm pretty sure he can max any of the games on the market on a single monitor.
 
princeton said:
Red found gold in his backyard a little while back :p

Nope... it's a pure case of enthusiast OCD... can't stand an empty PCIe slot :p

I'm pretty sure he can max any of the games on the market on a single monitor.

True now... but give it a month or two.
 
There is still no info about the number of stream processors on the 6990. If it's dual 6970s then it should have 3072 stream processors, but if the Antilles cores are larger it might have 3960 stream processors, which would give roughly 30% higher performance than two 6970s in CrossFire if the base clock were pushed to 6970 levels (if Asus makes an Ares II this will be possible). So if the half-year-old leaked charts from AMD are correct, this could be the fastest graphics card.
The 6990 should have higher clock speeds than the GTX 590 due to less heat created by the significantly smaller die size of its processors (comparing the 6970 core to the GTX 580).
 
Using TPU's power usage figures, which are pretty much accepted by most people, the HD 6970 with 1536 shaders uses 185W in normal high-demand gaming (that's 370W for 3072 shaders), yet somehow you believe that two souped-up, previously unannounced 1980-shader parts* are somehow going to be both feasible and run cooler than two downclocked GF110s, when in single-GPU form the Nvidia card is marginally cooler for the most part.

Adding additional shaders does not cause a GPU to use less power and to output less heat.

* The only "1980-shader Cayman XT / 3960-shader Antilles" items ever to surface on the net were photoshopped fakes that were discredited as soon as the actual shader count of Cayman was known (i.e. around two months ago).
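To make the arithmetic in that comparison explicit, here's a rough sketch assuming board power scales linearly with shader count. This is a deliberately naive model (clocks and voltage matter at least as much), and the 185W/1536-shader baseline is the TPU gaming figure cited above:

```python
def scaled_power(base_watts: float, base_shaders: int, target_shaders: int) -> float:
    """Naively scale board power linearly with shader count."""
    return base_watts * target_shaders / base_shaders

# HD 6970 baseline: ~185 W gaming load with 1536 shaders (TPU figure)
print(scaled_power(185, 1536, 3072))  # two full Cayman GPUs -> 370.0 W
print(scaled_power(185, 1536, 3960))  # rumored 2x1980-shader "Antilles" -> about 477 W
```

Even this simplistic estimate shows why a full-fat 3960-shader dual card would be hard to keep inside any sane power envelope.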
 
Performance-wise, I think most of us already know how a single HD 6970 performs against a single GTX 580. So when both AMD and Nvidia put a pair of their fastest GPUs on a single PCB, I would expect the resulting head-to-head benchmarks to mostly follow the pattern of the single-GPU testing.

My own theory on why AMD wants Nvidia to release its dual-GPU card first is based on my observations of the availability of AMD's 6900 series versus Nvidia's GTX 500 series at the computer stores where I buy my PC stuff. I've noticed that several brands of GTX 500 cards are widely available. On the AMD side, only the 6800 series is easily available, with the 6950 represented by just Gigabyte and Sapphire, both more expensive than the GTX 560 Ti, and the Sapphire card especially so since it sports only a 1GB buffer.

I'm guessing that AMD does not have that many 6970 GPUs in the first place, which is probably why they're turning a blind eye to people flashing their 6950s with 6970 BIOSes. AMD wants Nvidia to launch first not because it doesn't want to give Nvidia time to tweak the 590's performance, but probably because they don't have enough 6990s in stock to make a proper product launch.
 
I didn't say the Antilles core will run cooler if it has 1980 stream processors instead of 1536; all I said is that if it does have 3960 it might outperform the GTX 590, and if it doesn't then it will run much cooler than the GTX card. (Check the die sizes: Nvidia's core is ridiculously large compared to AMD's, and yet it barely outperforms the 6970 in neutral benchmarks; even the GTX 570, which is slower than the 6970, has a bigger die.) Looking at the history of dual GPUs, Nvidia is at a great disadvantage here even if their core is faster. They put two 8-pin power connectors on that card, which will create so much heat that it will turn your computer into an oven; don't even think about overclocking this card. Both the 5970 and the Asus-made Ares are the fastest cards right now, and they run cool and stable, which gives them good room for overclocking. So don't expect the GTX 590 to be more than 50% faster than the GTX 580.
 