Nvidia launches GeForce GTX 580, new community site

By Matthew
Nov 8, 2010
  1. klepto12

    klepto12 TechSpot Paladin Posts: 1,364   +9

    God yes. Crysis is probably the worst-coded game of the decade, and that's the only reason systems can't play it. On another note, if you have a GTX 480 or 5870 there is nothing to get excited about in these new top-end cards; just wait till the next round of cards, then it will be worth upgrading.
  2. dividebyzero

    dividebyzero trainee n00b Posts: 4,809   +642

    Maybe you should take up reading as a hobby, rather than relying on the pretty pictures.
    Peak power draw is lower for both video cards, but the GTX 580 uses substantially less than the 480. Obviously, much depends on the workload involved. We may have to consider using multiple workloads for power testing going forward.
    As borne out by TPU (226 W GTX 580 vs 257 W GTX 480)
    Bit-tech (363 W GTX 580 system vs 376 W GTX 480 system)
    Hardware Canucks (395 W GTX 580 system vs 431 W GTX 480 system)
    PC Perspective (407 W GTX 580 system vs 412 W GTX 480 system)
    Anandtech (452 W GTX 580 system vs 479 W GTX 480 system) - max load (power virus)
    HT4U (247 W GTX 580 vs 249 W GTX 480)
    Some variance obviously, but I'd imagine even a troll could see the general trend.
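    For what it's worth, here is a quick tally of the figures quoted above (watt numbers taken straight from those reviews; the percentage math is just arithmetic on them), showing the GTX 580 drawing less in every case:

    ```python
    # Reviewer power measurements quoted above, in watts: (GTX 580, GTX 480).
    # "card" entries are card-only draw; "system" entries are whole-system draw.
    readings = {
        "TechPowerUp (card)": (226, 257),
        "HT4U (card)": (247, 249),
        "Bit-tech (system)": (363, 376),
        "Hardware Canucks (system)": (395, 431),
        "PC Perspective (system)": (407, 412),
        "AnandTech (system, power virus)": (452, 479),
    }

    for site, (gtx580, gtx480) in readings.items():
        # Percentage saving of the 580 relative to the 480 in that test.
        saving = 100 * (gtx480 - gtx580) / gtx480
        print(f"{site}: GTX 580 draws {saving:.1f}% less than the GTX 480")
    ```

    The card-only numbers show the real gap (about 12% at TPU); the system-level numbers dilute it because the rest of the rig draws the same either way.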
    And...
    In spite of its relatively high peak power draw, the GTX 580 is among the quietest cards we tested.

    You can probably save the double posting, we heard you the first time.
  3. lol, compare this card with the HD 5970... that's faster than this one.
  4. vangrat

    vangrat Newcomer, in training Posts: 223

    Hey Decimae

    I was thinking the exact same thing. The graphs are a bit skewed by the sizes of the different bars as well, making the difference between 1.2 and 1.4 seem enormous, when in reality it is not all that much.

    I wait with pent-up excitement, however, for the review of this card. I have a feeling that I will end up getting a 580, as it is already out, while the 6970 has only just been hinted at.
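    The bar-chart skew described above is easy to quantify. Taking the 1.2 and 1.4 from the comment, and assuming (hypothetically) an axis that starts at 1.0 instead of 0:

    ```python
    # Hypothetical bar-chart values from the comment above: scores of 1.2 vs 1.4.
    # With a y-axis starting at 0, bar heights honestly reflect the ~17% gap;
    # truncating the axis to start at 1.0 makes the visual gap look like 2x.
    low, high, axis_start = 1.2, 1.4, 1.0

    true_ratio = high / low                               # actual performance ratio
    bar_ratio = (high - axis_start) / (low - axis_start)  # apparent bar-height ratio

    print(f"real difference: {100 * (true_ratio - 1):.0f}%")
    print(f"apparent difference on the truncated axis: {100 * (bar_ratio - 1):.0f}%")
    ```

    A 17% real gap rendered as a 100% visual gap is exactly the kind of skew being complained about.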
  5. vangrat

    vangrat Newcomer, in training Posts: 223

    Thanks for pointing this out, the 5970 isn't even listed...snarky little buggers.
  6. The 6970 will be faster in some games and cheaper.

    But all this is great for a price war.
  7. ET3D

    ET3D TechSpot Paladin Posts: 968   +30

    Looks like what the 480 should have been in the first place. It will probably be a successful product for NVIDIA. It's certainly not for budget buyers, but people who are looking for the best quad-SLI solution are probably thrilled.
  8. Burty117

    Burty117 TechSpot Chancellor Posts: 2,489   +302

    When you consider, though, how much better this tessellates than the ATI equivalent, I believe that even if the 6970 is faster it won't keep up with the times; games will start using tessellation more heavily, meaning the Nvidia cards will have consistently higher FPS in future games.

    Of course, this is assuming that A) games using tessellation will come out in the next year or so, and B) ATI won't change their architecture and bring out a wave of new cards to compete.

    Either way though, I feel if you bought this card it would last years and years, not a year or so.

    Although for me, spending £350 or so on a card like this is a very big risk if it doesn't last that long.
  9. ET3D

    ET3D TechSpot Paladin Posts: 968   +30

    Just read the review at Anandtech, and I think this chip is pretty impressive. This is definitely "Fermi done right".

    @burty117, AMD pointed out that the levels of tessellation used in tests that make NVIDIA look particularly good are too high to be of practical use, i.e., you will get the same image quality with lower tessellation. Assuming that's true, once developers get the hang of tessellation things should be better for AMD, which has improved tessellation performance in the Radeon 6000 family at low to mid tessellation levels.

    I also don't think a high end card is meant to last for years. If you spend $600 on a card, you probably have enough money to invest in a new one a year or two down the road.
  10. Burty117

    Burty117 TechSpot Chancellor Posts: 2,489   +302

    I guess so. I think that's why I don't really go above the 60s in the Nvidia range: too expensive. I currently have a GTX 260 and want a GTX 460, but will probably get a GTX 560.

    If I knew they would last longer I would buy one though.
  11. dividebyzero

    dividebyzero trainee n00b Posts: 4,809   +642

    You probably can't go wrong with any of the current (or near future) cards from either camp imo. An HD6850/6870 or GTX460 gives playability for any game in release (albeit not with every game IQ setting checked) with the option of crossfire/SLI should the gaming landscape change, while a single more capable card such as the GTX580/570/560 or HD6950/6970 is only likely to be tortured by a game running across multiple screens and/or extreme levels of AA, DoF, tessellation, ambient occlusion etc.

    One thing to note with the GTX 580 is that it retains the HDMI 1.3 spec from the earlier GTX 480 (probably a result of its quick gestation period) rather than using the new HDMI 1.4a spec that the GTX 460 and AMD cards are blessed with - so if you're looking at 3D HD home theatre content then the GTX 580 is probably not a futureproof investment.
  12. zogo

    zogo Newcomer, in training Posts: 53

    Maybe Nvidia's 500 series is superior to AMD's 6000 series, but it is much more expensive, so naturally consumers will be more AMD-oriented. As for the benchmarks, I'll wait for the TechSpot version because of its reliability; then I can make my final conclusion.
  13. white2010

    white2010 Newcomer, in training Posts: 16

    Good things, new technology, better products. Waiting for AMD to answer this new challenge from Nvidia.
  14. skitzo_zac

    skitzo_zac TechSpot Chancellor Posts: 459

    So, anyone wanna buy me one?
  15. Burty117

    Burty117 TechSpot Chancellor Posts: 2,489   +302

    Thanks DBZ, I have been eyeballing the GTX 460 for a while; think I might treat myself to one in the January sales after Christmas :)

    Although I do have to correct you on the HDMI 1.3 spec on the GTX 580:
    the Nvidia site says it's 1.4a
    http://www.nvidia.co.uk/object/product-geforce-gtx-580-uk.html

    Under the Specs page.

    Unless Nvidia are just bigging themselves up, I guess it is 1.4a?
  16. dividebyzero

    dividebyzero trainee n00b Posts: 4,809   +642

  17. xcelofjkl

    xcelofjkl Newcomer, in training Posts: 86

    Had Nvidia released the "full Fermi" in the first place, it would've shut AMD up for a long while. They just made it hard on themselves.
  18. All fake! If you want a real comparison, considering the price, the 5970 kicks the 580's butt.

