Nvidia's dual-GPU GeForce GTX 590 coming in February

Matthew DeCarlo


NordicHardware has it on good authority that Nvidia plans to launch its dual-GPU GeForce GTX 590 graphics card this month. A release date isn't mentioned, but Nvidia is probably trying to unveil its flagship in the same timeframe as AMD's competing Radeon HD 6990 (codenamed Antilles).

The GeForce GTX 590 will pack two GF110 graphics processors offering a total of 1024 CUDA cores, 3072MB of GDDR5 memory, dual 384-bit memory buses, 128 texture units, and 96 ROPs. That effectively doubles the specifications of Nvidia's single-GF110 GeForce GTX 580.


NordicHardware notes that Nvidia has chosen to enable the maximum number of CUDA cores while reducing the stock clock frequencies and voltages to rein in power consumption. The GeForce GTX 590's TDP hasn't been revealed yet, but the GTX 580 alone is rated at 244W.

The site couldn't score shots of Nvidia's reference card, but it supposedly carries a full-cover cooler with a 9cm fan in the center -- not unlike the GeForce GTX 295's. The GTX 590's pricing also remains a mystery, but you can expect it to cost upwards of $700 considering the GTX 580's $500 price tag.


 
Wow. Those specs show this will be a beast. We'll have to wait to see it and the HD 6990 go head to head to find out though.
 
Man I feel bad for AMD, they keep getting 1up'd before they release their products.

Oh well, I'm interested to see how this dual gpu is going to perform and how much it costs.
 
sarcasm said:
Man I feel bad for AMD, they keep getting 1up'd before they release their products.

Oh well, I'm interested to see how this dual gpu is going to perform and how much it costs.

Well back with the 5000 series they didn't just 1up nvidia. They beat them by months and months.
 
LOL...AMD and Nvidia can't release their highest-end cards fast enough to beat each other. I think I'll have to pass on this one though with that $700 price tag. :p
 
One fan? They should go the two-fan route that Gigabyte is a fan of using (pun not intended), putting the fans directly above the two main chips.

Either way, while I'll never be in the market for one, I can't wait until the 590 and the 6990 come out. =o
 
Princeton said:
sarcasm said:
Man I feel bad for AMD, they keep getting 1up'd before they release their products.

Oh well, I'm interested to see how this dual gpu is going to perform and how much it costs.

Well back with the 5000 series they didn't just 1up nvidia. They beat them by months and months.

Yup, good healthy competition in my eyes. It's just unfortunate that AMD can't respond to Nvidia's offerings other than with lower prices.
 
Someone here never kept an eye on the 5970... anyway, I'm pretty excited about this card. Seriously considering it if the numbers look good enough - to be honest I really miss my PhysX.
 
sarcasm said:
Princeton said:
sarcasm said:
Man I feel bad for AMD, they keep getting 1up'd before they release their products.

Oh well, I'm interested to see how this dual gpu is going to perform and how much it costs.

Well back with the 5000 series they didn't just 1up nvidia. They beat them by months and months.

Yup, good healthy competition in my eyes. It's just unfortunate that AMD can't respond to Nvidia's offerings other than with lower prices.

Man, you should know that most people can't even begin to imagine buying a $700 card. AMD does aim some cards at hardcore gamers who can afford that kind of thing, but they mainly direct their products at people like me :D
 
This is going to be interesting. If they are going to put two full-fledged GF110s on this thing, they must have binned some gems and been sitting on them just for this. And/or the clocks are going to be around 550MHz. Same for Antilles if it's a 6970 x2. Bringing this in (at least initially) at or under a 300W power envelope just doesn't seem to work on paper.
 
Hence my previous observation regarding the power limiting on both cards.
Looking at TPU's power consumption figures (390-444W "gaming" load) for stock cards, which average 1.000v (they seem to range from 0.988v to 1.050v for a vanilla card), it will be interesting to see whether the card gains a special "stock" power-limiter function in upcoming drivers. If Nvidia has stockpiled <0.95v GPUs, then that ~10% lowering of voltage* should translate into a much less aggressive power profile.
I personally can't see too many of these cards (or the HD 6990) hitting the retail channel -- binning, lower demand, and lower profit margins should ensure that it becomes HD 5970 redux: just enough stock at Newegg to qualify as a current card, priced high enough to ensure that a few units are available most of the time, while still basking in the PR glow of "World's Fastest Gaming Card"™ (whichever of the two happens to win the title -- my guess is they split the benchmarks and the fanboy fatwas get cranked up a notch or ten).

* As a comparison, the GTX 570's stock voltage is around 0.975v (0.025v less than the GTX 580's) -- compare power usage here, although the 570's lower memory chip count and narrower memory controller would account for some of the lower usage.
Rage3D and a few other non-Nvidia sites are also claiming 550-560MHz clocks for the GTX 590 -- personally I can't see Nvidia releasing a card that would only beat the GTX 580 by around 35-40%. Not when putting two GF114 cores onto the same PCB would likely accomplish more for less (298W combined for GTX 560 Ti SLI).
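As a rough sanity check on those numbers, the usual dynamic-power approximation (P scales roughly with V² × f) can be applied. Every figure below is an illustrative assumption for the sake of the estimate, not a confirmed spec:

```python
# Rough estimate only: dynamic power scales approximately with V^2 * f.
# The baseline (244W TDP, ~1.000v, 772MHz core) and the lowered
# voltage/clock (0.950v, 560MHz) are assumptions for illustration.
def scaled_power(base_watts, base_volts, base_mhz, new_volts, new_mhz):
    """Estimate power after a voltage and clock change (P ~ V^2 * f)."""
    return base_watts * (new_volts / base_volts) ** 2 * (new_mhz / base_mhz)

per_gpu = scaled_power(244, 1.000, 772, 0.950, 560)
print(f"per GPU: {per_gpu:.0f}W, dual-GPU board: {2 * per_gpu:.0f}W")
# -> per GPU: 160W, dual-GPU board: 319W
```

Even under those generous assumptions a dual-GF110 board only just brushes the 300W mark, which is why the binning and clock-reduction speculation seems plausible.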
 
Not when putting two GF114 cores onto the same PCB would likely accomplish more for less (298W combined for GTX 560 Ti SLI).

Right, that's what isn't making sense from a cost standpoint. How about a 300W 1024-CUDA-core unit and a 300W 3072-SPU unit with a switch... for the 375W mode? :)
And before you ask... yes, I think the oxy is getting to me :p :wave:
 
Only problem with that scenario is that the reference version of both cards needs to be 8-pin + 6-pin -- if only for the appearance of being within the PCI specification. I could quite easily imagine just enough reference cards being built to satisfy a launch requirement... and then simply disappearing as AIBs quickly brought their own designs (2 x 8-pin) to market -- the same strategy used for the HD 5970 4GB (both the XFX Black Edition and the Sapphire Toxic).
The other alternative would be to release the cards with 2 x 8-pin and have 2 pins capped, in much the same way that some motherboards' EPS12V connectors ship with a plastic cap covering half of the eight-pin pinout.
 
This is way more than I expected from Nvidia. I had hoped for a pair of GTX 570s on a single PCB though a pair of GTX 560s would have been more logical. I guess when Nvidia set out to build the Beast, they pulled out all the stops. I wonder if Nvidia would be willing to build a "baby" version of this SKU for those of us who don't own 30-inch monitors. GTX 565 / 575 ? :).
 
red1776 said:
This is going to be interesting. If they are going to put two full-fledged GF110s on this thing, they must have binned some gems and been sitting on them just for this. And/or the clocks are going to be around 550MHz. Same for Antilles if it's a 6970 x2. Bringing this in (at least initially) at or under a 300W power envelope just doesn't seem to work on paper.

With the PCI-E power limitation, is this just a guideline? Or does a device cease to function if it exceeds 300W?
 
I personally can't see too many of these cards (or the HD 6990) hitting the retail channel -- binning, lower demand, and lower profit margins should ensure that it becomes HD 5970 redux: just enough stock at Newegg to qualify as a current card, priced high enough to ensure that a few units are available most of the time, while still basking in the PR glow of "World's Fastest Gaming Card"™ (whichever of the two happens to win the title -- my guess is they split the benchmarks and the fanboy fatwas get cranked up a notch or ten).

Think that's a good read on how this will unfold, basically a battle for the high ground for the boys in PR. It will be interesting to see the benchmark results, tend to think Nvidia has a bit of an upper hand this time around.
 
I don't know if I'm more impressed by the card or by the price it will have.
 
lol, I love the little chart on the NordicHardware article that has the specs of the card, and the power consumption of the GTX590 is listed as "a lot" xD
 
With the PCI-E power limitation, is this just a guideline? Or does a device cease to function if it exceeds 300W?
It's part of the current set of specifications that determine the standard fit-out for a computer component, much like the ATX standard for PSUs (dimensions, current parameters) and motherboards (size, slot placement, etc.), so that any new product should theoretically fit into the build ecosystem.

The PCI spec is set at 300W, but that isn't a hard and fast limit. A PCI-E slot is nominally rated at 75W, but many boards can provide upwards of 90W with relative ease (hence the number of boards with a second EPS12V 8-pin plug or an auxiliary Molex); likewise, the 6-pin (75W) and 8-pin (150W) connectors aren't hard limits either. A good PSU will allow a higher draw without adverse effect, at least in the short term.
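The connector arithmetic behind those figures, summed per power source (nominal values, with the dual 8-pin layout mentioned elsewhere in the thread shown for comparison):

```python
# Nominal PCI Express power-delivery limits per source, in watts.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

reference = SLOT + SIX_PIN + EIGHT_PIN  # slot + 6-pin + 8-pin layout
dual_8pin = SLOT + 2 * EIGHT_PIN        # slot + two 8-pin connectors
print(reference, dual_8pin)  # -> 300 375
```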

Where the PCI spec comes into play is with OEMs. Overarching full-system warranties mean insurance, and insurance means adhering to a standard -- hence locked/limited BIOSes and proprietary connectors -- and OEMs, whether they be Dell/Alienware or a boutique builder (Puget, CyberPower, iBuyPower), simply don't set foot outside of industry standards for that reason... and OEMs wield a lot of power due to the massive orders they place for reference equipment.
 
What kind of OEM is going to put a GTX 590 in their prebuilt rigs though?

Most of them still use crap like the Radeon HD 4670 or an "Nvidia GT 220 with 1792MB memory" in those "best buy" computers.
 