Nvidia demos 'fastest DX11 GPU', touts near-silent operation

Dsparil said:
Oh yes. The endless war between AMD (which acquired ATI) and Nvidia continues for the pockets of consumers.

What they should do is decrease the power consumption of the flagship cards. The 480 is just way too brutal, and when paired with an i7... let's just say that the power company is your best friend.

A 480 and an i7 pull about 450 watts during gameplay. It's not that serious. It's the noise of the stock coolers that can bother someone.
 
Didn't look that revolutionary to me... just more of the same. I know there were some improvements, but it's not like it's groundbreaking.
 
Looks nice... this whole ATI vs. Nvidia thing is quite interesting, though. I'm pretty curious who will finally win this race.
 
Hmm... sounds like another power-hungry, super-hot, large graphics card from Nvidia, featuring 'amazing' theoretical features that will probably never make it into video games for the next 5 or 10 years, if at all. I'm leaning strongly toward ATI cards: less power, lower temps, but still great performance.
 
Kibaruk said:
Either AMD or Nvidia, it creates competition, so in the end we (the customers) win from price/performance wars and new technology.

I'm still an AMD fanboy and will keep buying them no matter what =)
I agree with what you said, but my choice is based only on gaming performance per dollar.
 
What about multiple displays? I really need Nvidia to come out with triple-monitor support on one card and allow SLI while running those 3 monitors. AMD did it; I don't understand why Nvidia can't.
 
I think I'll wait for third-party benchmarks before believing that.
The GTX 480 is presently the "fastest DX11 GPU"... and the GTX 580 is supposed to offer more performance in gaming in general, and in tessellation (not the only DX11 feature) in particular.
So you seem to believe that this replacement GPU, which has a higher shader count (512 vs. 480), faster core and shader frequencies, and faster video memory, is still somehow going to end up slower than the GTX 480. Belief like that takes a very special kind of enthusiast*... unless of course you somehow see Cayman XT both launching before the GTX 580 and offering more performance.

Personally, I'd expect both the GTX 580 and the HD 6970 to offer very similar levels of performance and to launch within the same timeframe. The difference here is that the GTX 580 has been demonstrated publicly, while the 6970 has not, thereby making Nvidia's talking-head statement demonstrably true.

( * very special enthusiast = fanboy)
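
As a quick back-of-the-envelope check on that reasoning, here's a minimal sketch of the theoretical shader-throughput difference. The shader counts and shader clocks below are the commonly cited specs for the two cards and should be treated as assumptions; memory bandwidth, ROPs and driver behaviour are ignored.

```python
# Rough theoretical shader-throughput comparison, GTX 480 vs GTX 580.
# The shader counts and clocks are commonly cited specs used here as
# assumptions for illustration only.
gtx480 = {"shaders": 480, "shader_clock_mhz": 1401}
gtx580 = {"shaders": 512, "shader_clock_mhz": 1544}

def theoretical_throughput(card):
    # Proportional to shader count x shader clock.
    return card["shaders"] * card["shader_clock_mhz"]

gain = theoretical_throughput(gtx580) / theoretical_throughput(gtx480) - 1
print(f"Theoretical shader-throughput gain: {gain:.1%}")  # roughly +18%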

What about multiple displays? I really need Nvidia to come out with triple-monitor support on one card and allow SLI while running those 3 monitors. AMD did it; I don't understand why Nvidia can't.
Maybe because Nvidia is playing catch-up with this technology... much the same as AMD is with 3D gaming.
And if you think AMD has a finished Eyefinity product, then think again. From the release notes of Catalyst 10.10:
-Enable dialog reposition does not show on proper monitor when system is configured with four displays (all Windows OS)
-Switching the preferred display in specific Eyefinity configurations may cause the displays to become disabled (all Windows OS)
-Under multi-adapter configuration, various rotated displays in Eyefinity set up might not be retained after reboot (Windows 7)
-Windows Media Center application may stop responding or system may intermittently fail while playing 1080p video in 2x2 and 4x1 Eyefinity mode.

And a biggie... one which has supposedly been resolved, only to be relisted as a known issue for the next Catalyst release:
-Mouse cursor may intermittently be corrupt/missing in one of the displays under Eyefinity configuration while playing games/samples (same release notes)
And to top it all off... artifacting and screen-corruption issues for numerous users when using DisplayPort in combination with DVI (signalling issues).

Word of advice, hassaan: if you're going to pull some random feature from your favourite vendor's supposed repertoire, it might be apropos to know something about its implementation. Having set up a couple of Eyefinity configurations (that's AMD's three-or-more-monitor, single-display feature), I can assure you that it's not entirely ready for primetime.
 
limpangel said:
fastest DirectX 11 GPU on the planet
I think I'll wait for third-party benchmarks before believing that. This is not the first time Nvidia has promised but not delivered, or delivered with drawbacks on other fronts like heat and power consumption.
I'm not saying ATI is perfect, but in the last year or so they have been doing the innovating while Nvidia has just kept getting lazier.

Nvidia's getting lazier, you say? I'm not into that 'fanboy' rubbish, but Nvidia has been pushing hard with tessellation and the like, and I think that's nice. It's just a shame that not many games use it to the fullest like the Heaven benchmark does; that looks incredible, and I hope that's the future of games (on PC anyway =/).
 
ChrisG683 said:
But can it play Crysis?

I wish this were something that I could laugh at. But seriously! Like Chris alludes to, I also look for a worst-case scenario and then a graphics card that can best that situation.

Has GPU technology hit a wall?

It's been 3 years since Crysis came out, and it's still hard (impossible) to find a single GPU that can play this game at 1920 x 1080, high-quality details, at 60 frames per second. In the old days of id and Quake, each new generation of graphics card seemed to double the frame rate of the old cards at the same resolution. (Example: http://www.tomshardware.com/reviews/nvidia-rocks-boat-tnt2,102-5.html ) Maybe we had brighter ideas back then? But now it seems like we're fighting to get a 20% improvement in frame rates through hardware upgrades.
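
To put that scaling gap into rough numbers, here's a minimal illustrative sketch; the 2x and ~1.2x per-generation factors and the 30 fps baseline are just the round figures implied above, not measurements.

```python
# Illustrative cumulative frame-rate scaling over three GPU generations,
# starting from a hypothetical 30 fps baseline. The 2.0x ("doubling era") and
# 1.2x ("~20% era") per-generation factors are round figures, not measured data.
baseline_fps = 30.0
for label, factor in [("doubling era", 2.0), ("~20% era", 1.2)]:
    fps = baseline_fps
    for gen in range(1, 4):
        fps *= factor
    print(f"{label}: ~{fps:.0f} fps after three generations")
# Output: roughly 240 fps for the doubling era vs roughly 52 fps at ~20% gains.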
 
Well, we all know it's not going to be considerably faster than AMD's offering. Who cares that they just now decided to care about noise and how cool their products are?
 
Well, we all know it's not going to be considerably faster than AMD's offering. Who cares that they just now decided to care about noise and how cool their products are?
 
I wish this were something that I could laugh at. But seriously! Like Chris alludes to, I also look for a worst-case scenario and then a graphics card that can best that situation. Has GPU technology hit a wall? It's been 3 years since Crysis came out...
Probably the worst-coded game in existence. Nvidia and AMD are somehow at fault, or dragging their heels, because one (or sometimes two*) poorly coded games in three years brought GPUs to their knees? The fact that you can count the number of games using CryEngine 2 on the fingers of a Mickey Mouse hand, and have fingers to spare for future releases, should tell you how unoptimised the game's default settings are. And if that doesn't, then seeing the massive jump in framerate once the game's IQ settings are optimised should make it somewhat apparent.
Following your example, are Intel and AMD also stagnating with CPU vectorization because of GTA IV's voracious appetite for core speed and memory bandwidth?

(* The other game being Metro 2033 when checking the tacked-on DX11 IQ settings)
Well, we all know it's not going to be considerably faster than AMD's offering. Who cares that they just now decided to care about noise and how cool their products are?
True. It's not like AMD has been using energy efficiency and its attendant lower heat output as a marketing point for the last year or so... oh, wait.
BTW: Repeating your post might inflate your post count, but it doesn't add a whole lot to the debate.
 
I'm not really concerned with heat and noise. I can hardly even hear my ATI HD 4890 running at full blast, and I don't tend to overclock, so heat has never been a problem for me. I care more about performance. It's one thing to claim to be the fastest, but will it hold up to review when it comes out?
 
That demo was pretty awesome, especially the dynamic tessellation; I definitely can't wait to see benchmarks for the GTX 580. I'm also happy to see improved cooling, as hot hardware is a bit of a pet peeve. With that said, though, I doubt I'll be purchasing one anytime soon, as it's simply not a good buy for me, and I'm sure for many others. I'm more interested in what the refreshes of the 470/460 have to offer down the road, especially with AMD bringing good value with the 6850/6870 and possibly the 6950.
 
Well, the demo was very nice, but I still reserve judgement until the final hardware comparison. That, or until we see whether Nvidia will have their card out first. If Nvidia gets their card out this year, before Christmas, then they will have my vote.
 
This is awesome but quite expensive; as someone who will buy Black Ops, I think I might consider this...
 
This is awesome but quite expensive; as someone who will buy Black Ops, I think I might consider this...

Somehow I don't think you would need the processing power of a GTX 580 to max out the settings on a Call of Duty game.
The only reason the card and the game are linked is that Nvidia chose to unveil the card at an event aimed at gaming/LANs. Black Ops just happens to be the flavour of the day (release-wise).

For non-fanboys and the vaguely interested, TPU has its preliminary GTX 580 review up >>here<<, along with a preliminary SLI review.
 
Looks pretty sweet; I just need to see some benchmarks to know how awesome it really is. Way out of my price range, but progress is good.
 
It seems the TPU site is down.
Here's an alternate quick review (Google-translated) with some added links.

Seems to outperform the GTX 480 by 15-20% for the most part. A 70°C max temp under FurMark is a big improvement. It's fairly close to the HD 5970 in overall performance; individual-game variance is quite high due to the buggy nature of the HD 5970's CrossFire profiles.
 
Probably the worst-coded game in existence. Nvidia and AMD are somehow at fault, or dragging their heels, because one (or sometimes two*) poorly coded games in three years brought GPUs to their knees? The fact that you can count the number of games using CryEngine 2 on the fingers of a Mickey Mouse hand, and have fingers to spare for future releases, should tell you how unoptimised the game's default settings are. And if that doesn't, then seeing the massive jump in framerate once the game's IQ settings are optimised should make it somewhat apparent.
Following your example, are Intel and AMD also stagnating with CPU vectorization because of GTA IV's voracious appetite for core speed and memory bandwidth?

(* The other game being Metro 2033 when checking the tacked-on DX11 IQ settings)

Crysis was a game of stature, not of user performance. Many people here know I've always kept up to date with hardware, and even back then it was disappointing, as it still is now: 480 SLI on my living-room monitor gets 45 fps max at 1080p with high settings. We still hold Crysis up to this standard, but we don't take into account that it was more or less abandoned as software altogether. Crysis should have been the next PCMark Vantage, but it was handled incorrectly.
 
This article does not describe a card that will cost 500 USD... more like 600-700, especially with this new 'silent' technology they are boasting about. Get your wallets ready.
 