
Nvidia's dual-GPU GeForce GTX 590 coming in February

By Matthew · 21 replies
Jan 31, 2011
  1. NordicHardware has it on good authority that Nvidia plans to launch its dual-GPU GeForce GTX 590 graphics card this month. A release date isn't mentioned, but Nvidia is probably trying to unveil its flagship in the same timeframe as AMD's competing Radeon HD 6990 (codenamed Antilles).

    Read the whole story
  2. princeton

    princeton TS Addict Posts: 1,674

    Wow. Those specs show this will be a beast. We'll have to wait to see it and the HD 6990 go head to head to find out though.
  3. Sarcasm

    Sarcasm TS Guru Posts: 380   +48

    Man I feel bad for AMD, they keep getting 1up'd before they release their products.

    Oh well, I'm interested to see how this dual gpu is going to perform and how much it costs.
  4. Leeky

    Leeky TS Evangelist Posts: 3,357   +116

    Wow indeed!

    I can't wait to see some real-world results for this GPU!
  5. princeton

    princeton TS Addict Posts: 1,674

    Well back with the 5000 series they didn't just 1up nvidia. They beat them by months and months.
  6. TomSEA

    TomSEA TechSpot Chancellor Posts: 3,121   +1,610

    LOL...AMD and nVidia can't release their highest-end cards fast enough to beat each other. I think I'll have to pass on this one though with that $700 price tag. :p
  7. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,534   +421

    One fan? they should go the two fan route that gigabyte is a fan of using (pun not intended), putting the fans directly above the two main chips.

    Either way, while I'll never be in the market for one, I can't wait until the 590 and the 6990 come out. =o
  8. Sarcasm

    Sarcasm TS Guru Posts: 380   +48

    Yup, good healthy competition in my eyes. It's just unfortunate that AMD can't respond to Nvidia's offerings with anything other than lower prices though.
  9. LNCPapa

    LNCPapa TS Special Forces Posts: 4,281   +526

    Someone here never kept an eye on the 5970... anyway, I'm pretty excited about this card. Seriously considering it if the numbers look good enough - to be honest I really miss my PhysX.
  10. grvalderrama

    grvalderrama TS Booster Posts: 206   +11

    Man, you should know that most people can't even begin to imagine buying a 700-dollar card. AMD does aim this kind of card at hardcore gamers who can afford it, but they mainly direct their products at people like me :D
  11. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,073   +164

    This is going to be interesting. If they are going to put two full-fledged GF110s on this thing, they must have binned some gems and been sitting on them just for this. And/or the clocks are going to be around 550MHz. Same for Antilles if it's a 6970 x2. Bringing this in (at least initially) at or under a 300W package just doesn't seem to work on paper.
  12. dividebyzero

    dividebyzero trainee n00b Posts: 4,840   +1,267

    Hence my previous observation regarding the power limiting on both cards.
    Looking at TPU's power consumption figures (390-444W "gaming" load) for stock cards which average 1.000v (they seem to range from 0.988 to 1.050v for a vanilla card), it will be interesting to see whether the card gets a special "stock" power-limiter functionality in upcoming drivers. If Nvidia has stockpiled <0.95v GPUs then that 10% lowering of voltage* should translate into a much less aggressive profile.
    I personally can't see too many of these cards (or the HD 6990) hitting the retail channel - binning, lower demand and lower profit margins should ensure that it becomes HD 5970 redux: just enough in stock at Newegg to qualify as a current card, priced high enough to ensure that a few units are available most of the time while still basking in the PR glow of "World's Fastest Gaming Card"™ (whichever of the two happens to win the title - my guess is they split the benchmarks and the fanboy fatwas get cranked up a notch or ten).

    * As a comparison, the GTX 570 stock voltage is around 0.975v (0.025v less than the GTX 580) - compare power usage here, although the 570's lower memory chip count and controller would account for some of the lower usage.
    Rage3D and a few other non-Nvidia sites are also claiming 550-560MHz clocks for the GTX 590 - personally I can't see Nvidia releasing a card that would only beat the GTX 580 by around 35-40%. Not when putting two GF114 cores onto the same PCB would likely accomplish more for less (298W combined for GTX 560 Ti SLI).
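The voltage-and-clock reasoning above can be sketched with the usual rule of thumb that dynamic power scales roughly with V² × f. This is a hedged illustration only: the 250W, 1.000v/772MHz baseline is a hypothetical stand-in for a stock GF110, not an official figure.

```python
def scaled_power(p_ref, v_ref, v_new, f_ref, f_new):
    """Rough dynamic-power estimate after a voltage/clock change (P ~ V^2 * f)."""
    return p_ref * (v_new / v_ref) ** 2 * (f_new / f_ref)

# Hypothetical 250 W GPU at 1.000 v / 772 MHz, re-binned to 0.95 v
# and clocked down to the rumoured ~550 MHz:
est = scaled_power(250, 1.000, 0.95, 772, 550)
print(round(est))  # roughly 161 W per GPU under these assumptions
```

Two such GPUs would land around 320W on this back-of-envelope math, which is why the binning and power-limiter speculation above matters for a nominal 300W package.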
  13. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,073   +164

    Right, that's what's not making sense from a cost standpoint. How about a 300W 1024-CUDA-core unit and a 300W 3072-SPU unit with a switch... for the 375W mode? :)
    and before you ask... yes, I think the oxy is getting to me :p :wave:
  14. dividebyzero

    dividebyzero trainee n00b Posts: 4,840   +1,267

    Only problem with that scenario is that the reference version of both cards needs to be 8pin + 6pin - if only for the appearance of being within the PCI specification. I could quite easily imagine just enough reference cards being built to satisfy a launch requirement... and then simply disappearing as AIBs quickly brought their own designs (2 x 8pin) to market - the same strategy used for the 4GB HD 5970 (both the XFX Black Edition and the Sapphire Toxic).
    The other alternative would be to release the cards with 2 x 8pin and have 2 pins capped, in much the same way that some motherboards' EPS12V sockets sometimes ship with a plastic cap covering half of the eight-pin pinout.
  15. fpsgamerJR62

    fpsgamerJR62 TS Rookie Posts: 489

    This is way more than I expected from Nvidia. I had hoped for a pair of GTX 570s on a single PCB though a pair of GTX 560s would have been more logical. I guess when Nvidia set out to build the Beast, they pulled out all the stops. I wonder if Nvidia would be willing to build a "baby" version of this SKU for those of us who don't own 30-inch monitors. GTX 565 / 575 ? :).
  16. princeton

    princeton TS Addict Posts: 1,674

    With the PCI-E power limitation, is this just a guideline? Or does it cease to function if a device exceeds 300W?
  17. Mizzou

    Mizzou TS Enthusiast Posts: 783

    Think that's a good read on how this will unfold - basically a battle for the high ground for the boys in PR. It will be interesting to see the benchmark results; I tend to think Nvidia has a bit of an upper hand this time around.
  18. I don't know if I'm more impressed by the card or by the price it will have.
  19. lol, I love the little chart on the NordicHardware article that has the specs of the card, and the power consumption of the GTX590 is listed as "a lot" xD
  20. dividebyzero

    dividebyzero trainee n00b Posts: 4,840   +1,267

    It's part of the present set of specifications that basically determines what is a standard fit-out for a computer component. Much like the ATX standard for PSUs (dimensions, current parameters) and motherboards (size, slot placement etc.), so that any new product should theoretically fit into the build ecosystem.

    The PCI spec is set at 300W but that isn't a hard and fast limit. A PCI-E slot is nominally rated at 75W but many boards can provide upwards of 90W with relative ease (hence the number of boards with a second EPS12V 8pin plug or an auxiliary molex); likewise the 6pin (75W) and 8pin (150W) connectors aren't hard limits either. A good PSU will allow a higher draw without adverse effect - at least in the short term.

    Where the PCI spec comes into play is with OEMs. Overarching full-system warranties mean insurance, and insurance means adhering to a standard - hence locked/limited BIOSes and proprietary connectors - and OEMs, whether they be Dell/Alienware or a boutique builder (Puget, Cyberpower, iBuypower), simply don't set foot outside of industry standards for that reason... and OEMs wield a lot of power due to the massive orders they place for reference equipment.
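The budget arithmetic behind the 300W figure discussed above is just the slot allowance plus the auxiliary connectors. A minimal sketch, using the nominal ratings quoted in the post (not measured board limits):

```python
# Nominal PCI-E power ratings in watts, per the figures quoted above.
RATINGS = {"slot": 75, "6pin": 75, "8pin": 150}

def board_budget(connectors):
    """Nominal power budget: the slot plus each auxiliary connector."""
    return RATINGS["slot"] + sum(RATINGS[c] for c in connectors)

print(board_budget(["6pin", "8pin"]))  # 300 - the in-spec reference config
print(board_budget(["8pin", "8pin"]))  # 375 - the 2x8pin AIB alternative
```

This is why the 8pin + 6pin reference layout sits exactly at the 300W ceiling, while the rumoured 2 x 8pin AIB designs would step up to a 375W envelope.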
  21. What kind of an OEM is going to put a gtx 590 in their prebuilt rigs though?

    Most of them still use crap stuff like a Radeon 4670 or an "Nvidia GT 220 with 1792MB memory" in those "best buy" computers.
  22. dividebyzero

    dividebyzero trainee n00b Posts: 4,840   +1,267
