Nvidia launches GeForce GTX 580, new community site

madboyv1 said:
Am I the only one getting tired of the whole "but can it play Crysis?" spiel? =/

God yes. Crysis is probably the worst-coded game of the decade, and that's the only reason systems can't play it. On another note, if you have a GTX 480 or 5870 there is nothing to get excited about in these new top-end cards. Just wait until the next round of cards; then it will be worth upgrading.
 
According to the Tech Report, the power consumption is actually slightly higher than that of the GTX 480 (depending on the workload), and it is only one decibel quieter under load.
I can't say that I'm impressed.
Maybe you should take up reading as a hobby, rather than relying on the pretty pictures.
Peak power draw is lower for both video cards, but the GTX 580 uses substantially less than the 480. Obviously, much depends on the workload involved. We may have to consider using multiple workloads for power testing going forward.
As borne out by:
TechPowerUp (226W GTX 580 v 257W GTX 480)
Bit-tech (363W GTX 580 system v 376W GTX 480 system)
Hardware Canucks (395W GTX 580 system v 431W GTX 480 system)
PC Perspective (407W GTX 580 system v 412W GTX 480 system)
Anandtech (452W GTX 580 system v 479W GTX 480 system) - max load (power virus)
HT4U (247W GTX 580 v 249W GTX 480)
Some variance obviously, but I'd imagine even a troll could see the general trend.
And...
In spite of its relatively high peak power draw, the GTX 580 is among the quietest cards we tested.

You can probably save the double posting, we heard you the first time.
 
Lol, compare this card with the HD 5970... that's faster than this one.
 
Decimae said:
These images are rather off; the axes start at 0.80 or thereabouts. That's no problem for noise, since that is logarithmic, but otherwise they give the wrong impression.
Also, why compare to 5870 CF? They should know that GTX 480 SLI/GTX 460 SLI/5970 CF/6870 CF perform better, so the graph proves nothing at all.
The funny thing is that if this is going to continue, NVIDIA/AMD will have their roles switched. AMD will have a good medium level GPU(6870/6850) and NVIDIA will have a good high level GPU(GTX580).

Hey Decimae

I was thinking the exact same thing. The graphs are a bit skewed by the sizes of the different bars as well, making the difference between 1.2 and 1.4 seem enormous when in reality it isn't all that much...

I wait with pent-up excitement, however, for a review of this card. I have a feeling I will end up getting a 580, as it is already out and the 6970 has only just been hinted at.
 
The 6970 will be faster in some games and be cheaper.

But all of this is great for a price war.
 
Looks like what the 480 should have been in the first place. It will probably be a successful product for NVIDIA. It's certainly not for budget buyers, but people who are looking for the best quad-SLI solution are probably thrilled.
 
When you consider, though, how much better this card tessellates than the ATI equivalent, I believe that even if the 6970 is faster it won't keep up with the times. Games will start using tessellation more heavily, meaning Nvidia's cards will have a consistently higher FPS in future games.

Of course, this is assuming that A) games will start using tessellation and come out in the next year or so, and B) ATI won't change their architecture and bring out a wave of new cards to compete.

Either way though, I feel that if you bought this card it would last years and years, not just a year or so.

Although for me, spending £350 or so on a card like this is a very big risk if it doesn't last that long.
 
Just read the review at Anandtech, and I think this chip is pretty impressive. This is definitely "Fermi done right".

@burty117, AMD pointed out that the levels of tessellation used in tests that make NVIDIA look particularly good are too high to be of practical use, i.e., you will get the same image quality with lower tessellation. Assuming that's true, once developers get the hang of tessellation things should be better for AMD, which has improved tessellation performance in the Radeon 6000 family for low-to-mid tessellation levels.

I also don't think a high end card is meant to last for years. If you spend $600 on a card, you probably have enough money to invest in a new one a year or two down the road.
 
ET3D said:
Just read the review at Anandtech, and I think this chip is pretty impressive. This is definitely "Fermi done right".

@burty117, AMD pointed out that the levels of tessellation used in tests that make NVIDIA look particularly good are too high to be of practical use, i.e., you will get the same image quality with lower tessellation. Assuming that's true, once developers get the hang of tessellation things should be better for AMD, which has improved tessellation performance in the Radeon 6000 family for low-to-mid tessellation levels.

I also don't think a high end card is meant to last for years. If you spend $600 on a card, you probably have enough money to invest in a new one a year or two down the road.

I guess so. I think that's why I don't really go above the 60s in the Nvidia range; too expensive. I currently have a GTX 260, want a GTX 460, but will probably get a GTX 560.

If I knew they would last longer I would buy one though.
 
If I knew they would last longer I would buy one though.
You probably can't go wrong with any of the current (or near future) cards from either camp imo. An HD 6850/6870 or GTX 460 gives playability for any game on release (albeit not with every game IQ setting checked), with the option of CrossFire/SLI should the gaming landscape change, while a single more capable card such as the GTX 580/570/560 or HD 6950/6970 is only likely to be tortured by a game running across multiple screens and/or extreme levels of AA, DoF, tessellation, ambient occlusion etc.

One thing to note with the GTX 580 is that it retains the HDMI 1.3 spec from the earlier GTX 480 (probably a result of its quick gestation period) rather than using the new HDMI 1.4a spec that the GTX 460 and AMD cards are blessed with - so if you're looking at 3D HD home theatre content, the GTX 580 is probably not a futureproof investment.
 
Maybe Nvidia's 500 series is superior to AMD's 6000 series, but those cards are much more expensive, so naturally consumers will be more AMD oriented. As for the benchmarks, I'll wait for the TechSpot version because of its reliability; then I can draw my final conclusion.
 
dividebyzero said:
If I knew they would last longer I would buy one though.
You probably can't go wrong with any of the current (or near future) cards from either camp imo. An HD 6850/6870 or GTX 460 gives playability for any game on release (albeit not with every game IQ setting checked), with the option of CrossFire/SLI should the gaming landscape change, while a single more capable card such as the GTX 580/570/560 or HD 6950/6970 is only likely to be tortured by a game running across multiple screens and/or extreme levels of AA, DoF, tessellation, ambient occlusion etc.

One thing to note with the GTX 580 is that it retains the HDMI 1.3 spec from the earlier GTX 480 (probably a result of its quick gestation period) rather than using the new HDMI 1.4a spec that the GTX 460 and AMD cards are blessed with - so if you're looking at 3D HD home theatre content, the GTX 580 is probably not a futureproof investment.

Thanks DBZ, I have been eyeballing the GTX 460 for a while. I think I might treat myself to one in the January sales after Christmas :)

Although I do have to correct you on the HDMI 1.3 spec on the GTX 580:
the Nvidia site says it's 1.4a
http://www.nvidia.co.uk/object/product-geforce-gtx-580-uk.html

Under the Specs page.

Unless Nvidia are just bigging themselves up, I guess it is 1.4a?
 
Had Nvidia released "full Fermi" in the first place, it would've shut AMD up for a long while. They just made it hard on themselves.
 
All fake! If you want a real comparison, considering the price, the 5970 kicks butt over the 580.
 