
Nvidia GeForce GTX 590 Review: Dual-GPU Wars

By Julio Franco · 55 replies
Mar 24, 2011
  1. Jesse

    Jesse TS Evangelist Posts: 359   +42

    Different solutions are better for different people. A 5770 would never work for me at 2560x1600, but at 1680x1050 I'm sure it's great. Likewise, two 6950's might perform about on par with a 580, but you can put 3 or 4 580's in tandem to kill everything. So... they all serve a purpose.
  2. Cota

    Cota TS Enthusiast Posts: 513   +8

    Agreed, three 580's would keep you on the latest charts for a long time, hell maybe two 590's would too, but for $1,500? o.0 Besides, since everyone is blaming everyone else for stalling game development these days, I'm sure even two 6950's would stay viable for a long time, and going triple CrossFire later would make a nice upgrade.

    Btw I do game at 1680x1050, stop reading my mind lol D:
  3. captaincranky

    captaincranky TechSpot Addict Posts: 12,402   +2,244

    Since it's so easy, and you expect it to be done right, why not do it yourself?
  4. Rick

    Rick TechSpot Staff Posts: 4,572   +65

    With that logic, two 590s is actually the "obvious" choice...
  5. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,262

    Somebody seems to be missing either sarcasm tags or a research budget.
    Nvidia hasn't released a quad-SLI driver for the 590 at present.

    My $0.02
    Like the 6990, the 590 makes little sense other than as a PR stunt and some fuel to keep the fanboy flames raging.
    Its one saving grace IMO is that Nvidia left a lot of performance on the table with this card (a 0.912V core voltage seems to more than bear this out), so buying one 590 and one waterblock (or a card already kitted out) should net approximately the same performance as a couple of 580's that would each require a waterblock. The thermal/noise characteristics of either card don't lend themselves well to air cooling... much less so if overclocking is on the agenda (a given, I would think, when looking at the target market).
    Other than that small ray of sunshine, both the 590 and 6990 are more than likely going to be consigned to the oddities section of graphics history.
  6. Was the 6990 in the test running at 830MHz or 880MHz?
  7. “It's also interesting to note that a vast majority of the GTX 590's losses were in DX11 titles: S.T.A.L.K.E.R.: Call of Pripyat (-15%), Metro 2033 (-20%), Battlefield Bad Company 2 (-9%), and Aliens vs. Predator (-10%).”
  8. St1ckM4n

    St1ckM4n TS Evangelist Posts: 2,920   +629

    I was just interested in the performance difference, and comparing it to 2x 6970 vs 6990. To me, that would be interesting.
  9. Steve

    Steve TechSpot Editor Posts: 2,348   +1,495

    @ st1ckm4n – We would have loved to include a pair of GTX 580’s, the problem being we simply do not have two. Buying a second is just not feasible at $499 for us, and as a few astute members have pointed out, it’s not exactly a relevant comparison anyway. A pair of GTX 570’s would have made more sense; sadly, we only have the one card there as well. It’s hard enough acquiring one of every card, let alone two.

    @ Squuiid – First I did a very poor job with the Crucial RealSSD C300 256GB review “for a techspot writer” and now another poor review. I am sorry about that, and honestly I do try, very hard.

    @ indiangamer – Hey mate, glad you liked the review. On the second page we did give away that very soon we will be testing triple 30” LCDs on the Radeon HD 6990 and GeForce GTX 590 for you guys ;) Also, Crysis is just a DX9 game at the moment. I have the final copy, and while it does look impressive for a DX9 game, the results are no different from those found when testing the leaked beta. Did you miss our controversial article on that? Everyone loved it.

    @ Mizzou – Thanks for the feedback, glad you liked the review. When it comes to graphics card predictions you can take what dbz says to the bank 

    @ Vrmithrax – Thanks for reading the entire review mate, that’s a breath of fresh air :D

    @ madboyv1 – With these high-end cards the only resolutions you should discount are those less than 2560x1600.

    Anyway thanks for all the feedback so far guys it has been great.
  10. Lurker101

    Lurker101 TS Evangelist Posts: 793   +315

    It would be quite interesting to see these tests run again after Nvidia's next driver update, which will no doubt be geared toward tuning the 590's performance.
  11. dualkelly

    dualkelly TS Enthusiast Posts: 27   +7

    While this card seems catered to high-end gaming, most people seem to forget that Nvidia's much larger market segment is in super/scientific computing, as well as high-end graphics and number crunching. This card is a quarter of the price of a Quadro with about four times the performance.
  12. Julio Franco

    Julio Franco TechSpot Editor Topic Starter Posts: 7,382   +800

    They might continue to improve the driver, but every time we review a new graphics card we use the latest drivers as provided by the manufacturer which are optimized for the cards being launched.
  13. TechSpot's test system is clearly showing CPU bottlenecking, making their review of the 590 vs. 6990 invalid. Check this out for Just Cause 2 and compare: http://www.tomshardware.com/reviews/geforce-gtx-590-dual-gf110-radeon-hd-6990,2898-11.html
  14. Steve

    Steve TechSpot Editor Posts: 2,348   +1,495

    There is essentially nothing new about the GTX 590 when compared to other SLI GTX 500 series cards so I think expecting more performance is wishful thinking at this point. But as Julio said we will always test with the latest drivers when we can.

    So you are saying an LGA1366 processor at 3.70GHz is clearly creating a bottleneck? That is interesting. I have never seen this before, and I know a great deal about this subject. Also, you might want to research the impact that 8xAA has on performance ;)
  15. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,262

    A few points to consider.
    Quadro isn't Nvidia's segment for "super/science computing"; that would be Tesla (M/S series and C series).
    Quadro's FP64 (double precision) to single precision ratio is 1/2. GeForce is capped (or constrained) at 1/8.
    Even if you were to somehow modify a GTX 580 to Quadro 6000 specification via BIOS mod, a task that has become increasingly difficult with every passing generation of cards, you still need to remember that the GTX 580 has 1.5GB of GDDR5 RAM sitting on the PCB. Its workstation counterpart, the Quadro 6000, has 6GB. That, the 24/7 support, and the binning/test parameters for the GPUs differentiate the two configurations, and it is evident in the sizeable gulf between the rendering and visualization capabilities of the respective cards.
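    The FP64 caps mentioned above translate directly into theoretical peak throughput. A minimal sketch of the arithmetic, with the FP32 peak figures being approximate illustrative assumptions rather than vendor-confirmed specs:

```python
# Sketch of how FP64:FP32 ratio caps affect double-precision throughput.
# The FP32 peak figures below are rough, illustrative assumptions.

def fp64_peak(fp32_gflops: float, fp64_ratio: float) -> float:
    """Theoretical FP64 peak given an FP32 peak and an FP64:FP32 ratio cap."""
    return fp32_gflops * fp64_ratio

# GeForce (capped at 1/8) vs. a workstation part (1/2):
gtx580_fp64 = fp64_peak(1581.0, 1 / 8)   # ~198 GFLOPS from ~1581 GFLOPS FP32
quadro_fp64 = fp64_peak(1030.0, 1 / 2)   # ~515 GFLOPS from ~1030 GFLOPS FP32

print(f"GeForce-class FP64 peak: ~{gtx580_fp64:.0f} GFLOPS")
print(f"Quadro-class FP64 peak:  ~{quadro_fp64:.0f} GFLOPS")
```

    So even a GeForce with a higher FP32 peak can end up well behind in double precision purely because of the ratio cap.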
  16. Relic

    Relic TechSpot Chancellor Posts: 1,379   +16

    Looks like both teams in a way can say they are winners this time around. Unless of course you're the one that sets the new dual-gpu on fire :p [Link].

    That's a bit surprising and something I did not expect.

    I would say the real winner here is us, most of the time at least, if companies aren't partaking in shenanigans :) . It's nice to have plenty of options, definitely better than being shoehorned one way or the other.
  17. Lurker101

    Lurker101 TS Evangelist Posts: 793   +315

    Not exactly. If you think back to the birth of the 295, which the 590 is basically just the next generation of, it had a few initial teething problems. Not only did it need a driver or two to help iron out performance, it also had the obligatory SLI hiccups that needed stamping out before you could actually use it.

    And while we're on the subject of the GTX 295, does anyone think Asus will be looking at the 590 as a candidate for the Mars III?
  18. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,262

    The problem with that is simply that every driver release then potentially becomes a full review candidate, since newly released games would also qualify whenever either AMD or Nvidia ships a driver under its gaming banner. I'm sure AMD will push for reviews (either directly or through forum fanboys) when/if they get Civ 5 performance up to scratch. Likewise, DAO and Shogun 2 would come in for the same attention from the green camp.
    Doubtful IMO. Asus got burned on the Mars (too late to market), and the Ares wasn't (and isn't) a big seller despite the fact that only 1,000 went to market... and who is going to plunk down $1k on 5870 CrossFire? And that's without considering that the same amount of cash (or less) buys GTX 580 SLI or the three-way CrossFireX 6000-series setup of choice.
    Having said that, I think the biggest obstacle to a full 580 SLI Mars is Asus itself. A DirectCU II version of the GTX 590 would be, for all intents and purposes, the exact same card, especially if they add a third PCIe connector... Personally I'd wait it out for either a GTX 580 w/ 3GB (if I were desperate for a card... an Nvidia card) or keep the folding money in my wallet and hope TSMC's 28nm process is on (or close to) time.
  19. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,443   +347

    I normally agree with this statement from a sheer performance standpoint. However, with people commenting/jeering about differences of 5 fps between cards, or between average and maximum (not just this site, others included), especially in the low 30s and low 40s where it counts the most, it's obvious that from a practical standpoint your resolution probably should not be that high for that game, especially for the ones who live by "60fps min or the highway." So whether you discount a resolution really depends on how you're looking at it.

    Regardless of the resolution however, these dual GPU solutions are crazy powerful, and to think only 5 years ago we had the 8800 series and the HD2900, the beginning of Unified Shaders. =o
  20. fpsgamerJR62

    fpsgamerJR62 TS Rookie Posts: 489

    @dividebyzero - I'm also looking forward to a custom GTX 580 with 3GB of GDDR5, similar to the 2GB GTX 285 cards which appeared late in that product's life cycle. I personally think the 590's 1.5GB memory buffer does get exhausted at very high resolutions and image settings, resulting in performance drop-offs in the benchmarks. The 6990's 2GB buffer should give it an advantage in this situation. Maybe it's just me, but I don't care much for Nvidia's oddball memory configurations; 3GB sounds better to me than 1,536MB.
  21. Archean

    Archean TechSpot Paladin Posts: 5,690   +95

    Tom's recently did a review of the performance improvements brought by each new graphics driver release (using an HD 5870). The review covers ATI Catalyst drivers, but I think it's a good idea anyway, and it can probably give you a reasonable guesstimate of what to expect from nVidia as well.

    Congrats Kiwis just beat Proteas, and they deserved to win after such a fantastic performance ........... but for Proteas, I guess they always somehow find a new way to choke ;)
  22. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,262

    Technically, 3GB of RAM would be 3072MB. Other than seeming a little strange given that Nvidia uses a memory bus wider than the industry-standard 256-bit, the "oddball" framebuffer doesn't have any detrimental effects compared to a 1024MB or 2048MB one.
    My reasoning for the GTX 580 3GB comes down to the fact that, AIB "specials"/non-reference cards aside, the GTX 590 is the last significant card to be released until late this year (or early next), when the GTX 6xx and HD 7xxx series are due to launch. Something needs to fill the "new release" gap... and a GTX 580 3GB would do just that, since the only area the present 580 doesn't dominate (in framerate) is a few titles where framebuffer limitations play a significant factor: F1 2010 in particular, and multi-screen gaming in general where high levels of AA and texture detail are utilized.
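    The "oddball" sizes fall out of the bus width: each GDDR5 chip has a 32-bit interface, so a 384-bit card needs 12 chips where a 256-bit card needs 8. A minimal sketch of that arithmetic (the chip densities are standard GDDR5 capacities of the era):

```python
# Sketch: why a 384-bit card ends up with 1536MB or 3072MB framebuffers.
# Each GDDR5 chip exposes a 32-bit interface, so bus_width/32 chips per rank.

def framebuffer_mb(bus_width_bits: int, chip_mbit: int) -> int:
    chips = bus_width_bits // 32      # one 32-bit channel per chip
    return chips * chip_mbit // 8     # convert Mbit per chip to MB, summed

print(framebuffer_mb(384, 1024))  # 384-bit, 1Gbit chips: 12 x 128MB = 1536MB
print(framebuffer_mb(384, 2048))  # 384-bit, 2Gbit chips: 12 x 256MB = 3072MB
print(framebuffer_mb(256, 2048))  # 256-bit, 2Gbit chips:  8 x 256MB = 2048MB
```

    So the 1536MB figure isn't a deliberate oddity, it's just the smallest even fill of a 384-bit bus with 1Gbit chips.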

    From the moment of that bizarre Amla dismissal, anything was possible. The South Africans seem to either have a mental block, or the gods turn their backs on them at World Cup time.
    Hopefully the team can put up a good show against England/Sri Lanka.
  23. Archean

    Archean TechSpot Paladin Posts: 5,690   +95

    It was a strange one indeed, but the irony is that, without any shadow of a doubt, the best (and most balanced) team of the tournament has been knocked out (probably haunted by the demons of the past?).

    Anyway, one thing I forgot to add in my last comment: I think it would be interesting to see (a comparison of) how nVidia's graphics driver performance progresses with each driver release :)
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,262

    The main problem with comparing driver releases, especially over an extended period of time, is that even if the system configuration does not change, the software/OS suite invariably will, as will the benchmark/game version. Once a patch or bug fix is added, it pretty much throws out any previous comparisons, and re-testing the patched/bug-fixed game or synthetic can present a whole new raft of problems.
    Most comparisons I've seen usually involve 2 or 3 driver revisions at most, unless a whole slew of betas are released in a short space of time.
    AMD's Catalyst driver, for some reason known only to Shane Baxter and a select few, is performance-reviewed on every release... you can see that a 2 fps increase is generally cause for joyous celebration.
  25. Steve

    Steve TechSpot Editor Posts: 2,348   +1,495

    I would be surprised if there was even that most of the time.
