GeForce driver v267.52 could overheat GTX 590

March 25, 2011, 6:35 PM
SweClockers, along with other tech review sites, has discovered a flaw that could cause Nvidia's new dual-GPU GeForce GTX 590 to burn up in a plume of smoke. After frying two sample cards, the Swedish site blamed the driver it was supplied with. When overclocking the GTX 590 with GeForce driver version 267.52, the card supposedly draws more power than it can actually handle, causing it to overheat. SweClockers has documented the issue on video:

The solution? Avoid driver v267.52 like the plague -- especially if you intend to overclock your GTX 590. Nvidia has released a new driver (v267.85) with improved overcurrent protection, and SweClockers has confirmed that this solves the problem. It may be common sense, but Nvidia says that anyone who's overclocking or running stress tests should use the latest driver available on the company's website instead of the disc-supplied driver, which is inevitably outdated.

Download: Windows XP 32-bit | Windows XP 64-bit | Windows Vista/7 32-bit | Windows Vista/7 64-bit

Although driver v267.85 has enhanced overcurrent protection, Nvidia notes that it's still risky to raise your card's default clocks and voltages. The company recommends leaving the GTX 590 at its stock voltage when overclocking with air cooling, while folks with liquid cooling should remain within 12.5-25mV of the default voltage. Nvidia believes SweClockers pushed the GTX 590 beyond its limits by running it as high as 1.2V instead of the default 0.91V to 0.96V.
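To put those numbers in perspective, the sanctioned 12.5-25mV of liquid-cooling headroom works out to an increase of only about 1.3-2.7% over the 0.91-0.96V default, whereas 1.2V is a jump of 25-32% -- roughly ten times the recommended margin:

    0.025V / 0.91V ≈ 2.7%   (top of the sanctioned range)
    1.20V / 0.96V = 1.25    -> +25%
    1.20V / 0.91V ≈ 1.32    -> +32%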

User Comments: 31

red1776, Omnipotent Ruler of the Universe, said:

I'm not so sure about it being a driver issue. TechPowerUp blew up their review GTX 590 using driver set "GTX 590: 267.71", stating that:

"Card blew up during testing, power limiting system does not work reliably"

Here is that review; the quote is taken from its conclusions page:

[link]

I had also read that a third site blew up a card.

spydercanopus said:

I was an early adopter of NVIDIA 3D Vision, and it stopped working with 3D Vision CD 1.45. You can't update GeForce drivers without the 3D Vision CD package or it will stop working -- but this time it stopped working even with the CD download.

Not to mention how far behind Tegra 2 is compared to the rest. I would be selling NVIDIA stock today if I hadn't yesterday.

Matthew, TechSpot Staff, said:

Thanks for the note red, we'll dig a bit deeper and update the post if we hear anything.

dividebyzero, trainee n00b, said:

Five sites blew up cards afaik, including Hardware.fr and Tweaktown.

Nvidia's review notes apparently limited OC to 25mV over stock, with the emphasis on no increase in core voltage if possible (Steve Walton may be able to confirm this).

From W1zzard's TPU review:

[emphasis added]

Tweaktown used MSI Afterburner... and it seems as though MSI don't know the parameters of Afterburner's core voltage options -- and that's just for starters. It doesn't seem to be driver related... it seems more related to reviewers pushing heavy voltage through the card.

Seems pretty stupid to throw 0.3v on top of a <1.0v GPU on a dual-GPU, stock-cooling card if you ask me... Doubly so when the same reviewer used only stock voltage in his HD6990 overclocking test. Maybe W1zz is angling for an AMD PR position.

Maybe this can become a trend... I look forward to seeing w1zzard's attempts at forcing 1.4v through an HD6970, or 1.6-1.8v through a Bulldozer CPU in upcoming months.

FYI: 1.2v is enough to kill a bog-standard GTX580, let alone a dual card.

@Red1776

I think w1zz read your Minnesotan OC guide!

war59312 said:

So not only does the card suck but the driver does too?

Carma?

red1776, Omnipotent Ruler of the Universe, said:

> Five sites blew up cards afaik, including Hardware.fr and Tweaktown. [...] It doesn't seem to be driver related... It seems more related to reviewers pushing heavy voltage through the card.

I didn't think it was driver related. I don't know why you would OC either of these cards, period, let alone a voltage-limited (cutoff) card.

> @Red1776
> I think w1zz read your Minnesotan OC guide!

Geez, I guess. Either that or your customer's 'Equator' OC'ing guide:

[link]

dividebyzero, trainee n00b, said:

Yet another troll fest.

BTW: it's spelt Karma. I know you're in the U.S. and English is probably a second language, but you won't help your cause by sounding illiterate. You at least have the assurance of knowing that keeping the comments short puts you light years ahead of spydercanopus... who seems to check the NASDAQ ticker in 3D -- well done that man! You must be the hippest player in the market.

dividebyzero, trainee n00b, said:

> Geez, I guess. Either that or your customer's 'Equator' OC'ing guide

Yup, that's what happens when the idi0ts get hold of the important stuff: they either twiddle all the knobs and buttons like autistic lab research monkeys, or pen poorly formed trolling posts and purport to be stock market analysts. Go figure.

BTW: spydercanopus

If you really were affected by the 3D Vision 1.45 'stopped working' bug, then you obviously didn't read up too much on how to fix it. But I suspect the only 3D glasses you own are cardboard and free with every viewing of Piranha.

spydercanopus said:

Yeah, I like to invest money as opposed to letting it stagnate in the bank or blowing it on every new upgrade. NVDA was on a roll for quite a while after it dipped to $7; I rode it up to $15 and sold.

"Divide by zero", you're really clever, dude. Learn that in 5th grade? You're picking personal jabs on a forum about technology...who's the troll?

spydercanopus said:

I posted as SpyderCanopus on that exact thread, dummy. That fix didn't work. NVIDIA claims it's a known issue to be fixed in the next CD release.

dividebyzero, trainee n00b, said:

Well, if the fix didn't work for you and you don't believe nvidia are going to fix the problem, then your best bet is to migrate to an AMD, VIA or Intel graphics solution. Things in that thread would have been somewhat clearer if you had posted after the fix was posted (post #11) -- regardless of whether it worked or not.

Quite what Tegra 2 has to do with the posted story is also up for debate... or do you see TI or Qualcomm offering a desktop 3D graphics solution in the near future?

@Matthew, red1776

Seems the problem -- as far as Hardware.fr (one of those who turned their GTX590 into a pyrotechnics display) is concerned, and backed by Nvidia if the translator is doing its job -- is one of installing the new drivers over the top of existing ones. That doesn't seem to allow the software-based OCP to become active. Article >>here<< -- use your favourite translator.

So an uninstall, driver sweep and clean install seem to be the order of the day -- insane voltages aside.

Somehow, I don't think it's rocket science to predict that if a GTX580's max voltage is 1.21v, then a lower-Vcore-binned dual card using the same 4+1 power delivery might not take too kindly to the same voltage.

"Up to around 1.125 V the card responds well to voltage" - w1zzard's GTX580 review. So of course the obvious answer is to up the voltage to 1.2v.

mailpup said:

Just a reminder not to flame, name call or make personal comments. Restraint in this area is appreciated. Thanks.

dividebyzero, trainee n00b, said:

@Mailpup

Yeah, no problem.

Things can get a little heated sometimes, especially when people start posting dodgy "facts".

@spydercanopus

Maybe you could join Hardware.fr's forums and find out why their 3D Vision setup works fine (to a degree) with the titles they tested yesterday.

[hardware.fr] [3D benchtest Starcraft 2], [3D benchtest Bulletstorm], [3D benchtest BattleForge], [3D benchtest Civilization V], [3D benchtest F1 2010] and [ 3D benchtest Metro 2033]

spydercanopus said:

It's the very first revision of the 3D emitter that experiences issues with the new driver. Not interested in continuing dialog unrelated to the topic.

gwailo247, TechSpot Chancellor, said:

The stupid intern forgot to put the sticky notes that say "don't overclock plz, kthxbye lol ^-^ <3" onto the cards.

Good thing there are tech sites -- they get to blow up their free cards before we do.

Guest said:

Over-clocking always runs the risk of frying it, hence why the manufacturers don't do it out of the box. Why is this so shocking?

red1776, Omnipotent Ruler of the Universe, said:

....shocking....I get it...

> Over-clocking always runs the risk of frying it, hence why the manufacturers don't do it out of the box. Why is this so shocking?

Bad pun aside, I don't ever remember five cards on four sites fried on launch day. This was the launch of the 'so-called' flagship of the company's discrete graphics card lineup, and they take great care to make sure that the creme de la creme is sent out. The fact that there is a misfiring voltage regulator that sends them up in smoke is a rather large PR gaffe, especially as this card (and the 6990) are nothing more than limited-availability PR stunts to begin with.

Not to mention the vast majority of components will take a fair amount of abuse and still function, despite the novice idea that "the voltage slider goes to 1.4v... so it must be ok".

Regenweald said:

I think it's obvious that partners will launch with much more robust VRMs, so if you're an overclocker, best to wait and pay the premium. I'd hazard a guess that on those burnt cards the GPUs are healthy. Shame and a waste really....

dividebyzero, trainee n00b, said:

The reference run should be fairly small in any case, I would imagine... just enough to qualify the card as a bona fide reference (rather than AIB) product. The DCII/UD/Lightning/TwinFrozrIII's should start rolling fairly soon, as should non-reference HD 6990's with the same solutions... assuming either AMD/Nvidia or their AIB's have any long-term interest in either card -- which could be a matter for debate.

As with the HD6990's cooler/fan, the GTX590's voltage situation should be a minor rework -- either through a VGA BIOS and/or more robust power delivery (more than 4+1 in any case). Since it's already well proven that anything above 1.1v for a GTX580/570 becomes a case of diminishing returns at best (negative performance at worst), it would seem prudent to keep voltage to 1.05-1.1v -- which from what I've seen would allow for the stock GTX590 clocks to be gained (GPU). vRAM I think is still going to be hampered by the single-phase memory voltage regulation and the complexity of 2 x 384-bit buses.

Either way, both the GTX590 and HD6990 are dinosaurs watching the meteor get closer if previous driver support for dual and quad GPU's is any indication.

red1776, Omnipotent Ruler of the Universe, said:

If this series of burned-up cards is due to the voltage limiter being software-based, it probably won't matter if the partners beef up the VRM's. It may not matter anyway, as Nvidia made a lot of noise about this card not being a paper launch and having launch-day availability. A cursory look around shows, for example, that Newegg's listings for the 590 were accompanied by "out of stock" almost as soon as they went up. Over at Ebuyer.com, they have a total of 4 in stock, and a total of 10... for pre-order. They apparently are not expecting to get very many in the future either. Secondly, with a single GF110 having a TDP of 244w, the binning process had to be extremely selective and constraining, whereas AMD has 8 different flavors and incarnations of the HD 6990 listed. I think this is one that Nvidia wants to go away. I also predict that Nvidia's response to this is going to be more about the virtues of the GTX 580 than the GTX 590. A rather big to-do over two cards (590/6990) that don't make sense in any category.

I will be curious about two things-

1) How the 6990 holds up when the owners remove the yellow sticker

and..

2) if a GTX 590 manufactured 90 days from now performs at the same level as the review samples.

dividebyzero, trainee n00b, said:

I think the U.S. market is entirely an Asus and EVGA domain.

EVGA have their cards going in and out of stock all the time. I'm a member of the EVGA forums (as well as XS and oc.uk) and a fair few people already have their orders en route. I think Tiger Direct was the first seller of the cards and the main provider for those who already have theirs. Down in NZ and Aus I think only the Asus card is available -- but anyone looking to buy one is going to get torn a new one ($NZ1799 = $US1350!!!). The European markets are probably somewhat better served thanks to the PoV/TGT resurgence.

BTW: The Asus card has a review (verified owner) for the GTX590 a day after launch... so that makes it somewhat quicker to market than the HD6990 I believe -- if we're splitting hairs :p

AMD probably need to keep the 6990 in stock to keep the GTX580 from being the "world's fastest graphics blah blah" by default. So as long as the 6990 remains in stock, then so should the 590. Hopefully it gets replaced by a dual GTX560 with 900+ clocks and a more friendly price point soon.

red1776, Omnipotent Ruler of the Universe, said:

> I think the U.S. market is entirely an Asus and EVGA domain. [...] The Asus card has a review (verified owner) for the GTX590 a day after launch... so that makes it somewhat quicker to market than the HD6990 I believe -- if we're splitting hairs :p

I get it's the opening reference volley, I just think the partners will get... oh, about a dozen apiece to work with. Or.... see #2 in my previous post. Awfully cynical I know, but this 'dual GPU' war has been rather silly.

Now here is a card that makes sense:

[link]

dividebyzero, trainee n00b, said:

I'd expect that HD6990 owners shouldn't be overly affected by the BIOS switch. By all accounts fan noise precludes getting too carried away with any big overclocking adventures. PowerTune should keep things in check for people OC'ing on the 1.12v setting.

AIB non-reference models usually use a higher grade of componentry, so I doubt there will be too much commonality with what's available today. No doubt Kyle/Brent will be doing an Asus DirectCu II GTX590/HD6990 article when both come to fruition. But since it will be in essence two cards sporting full-fat GTX580 SLI/6970 CFX, those benches should be a foregone conclusion with the exception of quad-GPU scaling and driver support.

Something tells me that w1zzard was aiming at being top-dog reviewer for the day. Not being satisfied with 815MHz (34% oc) leads me to believe he had decided in advance that the 0.963v guideline was going to be adhered to by most (if not all) reviewers, and he wanted to make a big splash. The fact that Anand managed 750 (23% oc) and PureOC/Hilbert managed 775 (std GTX580 clock) with minimal voltage increases would tend to support the assumption. He also seems to be copping quite a bit of flak in his and other oc'ing forums for his methodology -- rather unsurprisingly, especially after statements attributed to him and other reviewers claimed that the cards blew on POST/OS startup (i.e. at 2D clocks). If the cards blew at, or under, 0.963v (or even 1-1.05v) then I think there is real cause for nvidia to recall the whole batch. I haven't heard anything definitive from SweClockers, but t-Break, Tweaktown, TPU and Lab501 all seem to have had pyrotechnics after pushing the voltage once Afterburner allowed the card to be overvolted to 1.21v.
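For context -- assuming the GTX 590's reference core clock of 607MHz, which is my figure rather than anything stated above -- those percentages check out:

    815 / 607 ≈ 1.34  -> the 34% oc
    750 / 607 ≈ 1.24  -> the 23% oc
    775 / 607 ≈ 1.28  -> and 775MHz is within a few MHz of the GTX 580's stock 772MHz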

red1776, Omnipotent Ruler of the Universe, said:

I suppose they were thinking it has a voltage limit, and the vast majority of cards will survive an over-overclock, so why not!? And went ahead and applied the Minnesota or Equator OC to it.

> By all accounts fan noise precludes getting too carried away with any big overclocking adventures.

I'm not so sure -- I did a machine for a guy who "had to have" a GTX 480. He likes to oc the hell out of it and it sounds like an F-14. His answer..... headphones!

DokkRokken said:

> I suppose they were thinking it has a voltage limit... and went ahead and applied the Minnesota or Equator OC to it.

> I'm not so sure -- I did a machine for a guy who "had to have" a GTX 480... His answer..... headphones!

Having two 470's OC'd, I've often considered the van Gogh route, and just doing away with my ears altogether to buy myself some peace. :p

dividebyzero, trainee n00b, said:

It's the nature of the enthusiast to push to the limits. I think that is a given.

What I (and quite a few others) find bizarre is that the overclocking/performance gain parameters of the GF110 are already well known. W1zzard himself noted as much when saying that performance gains amount to nil once you start nearing 1.1v (and this on a GPU that has a 5+% higher stock voltage).

So, riddle me this, Batman... why would you push 1.2v when you already have this evidence on the same GPU? What evidence is there from this data that 1.2v is going to offer a positive outcome?

On a related note, w1zzard overclocked the HD6990 using stock (1.12v) voltage -- didn't bother to even try 1.175v. Why? Using the same metric applied to the 590 article he could/should have had 1.47v as his upper (or starting) limit -- why not try for 1100 core?

> Having two 470's OC'd, I've often considered the van Gogh route, and just doing away with my ears altogether to buy myself some peace. :p

I tried that already. You'd really have to love gaming and/or benching is all I can say.

red1776, Omnipotent Ruler of the Universe, said:

> It's the nature of the enthusiast to push to the limits. [...] Why would you push 1.2v when you already have this evidence on the same GPU? What evidence is there from this data that 1.2v is going to offer a positive outcome?

Ahh, maybe the easiest question you've asked me. From what I've been able to tell from follow-ups and phone calls from customers, it's because "I am the one doing it" -- just ego. They all think they have the magic touch, and if only they can get their hands on the controls... they can hit that magic 1GHz core, or that 4.8GHz CPU frequency... on air.

> On a related note, w1zzard overclocked the HD6990 using stock (1.12v) voltage -- didn't bother to even try 1.175v. Why? Using the same metric applied to the 590 article he could/should have had 1.47v as his upper (or starting) limit.

That I have no idea. But then I go up in .02 increments, so what do I know?

dividebyzero, trainee n00b, said:

> That I have no idea. But then I go up in .02 increments, so what do I know?

I have a tendency to do likewise. Any fool can overclock by throwing voltage around. The art of overclocking is getting the best performance gain (not necessarily the highest clock either) for the lowest possible stable voltage.

Maybe w1zz has been in the game so long he can tell what settings need to be applied simply by looking at the pcb. Wish I was that talented.
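For what it's worth, that search is basically a loop. A rough pseudo-Python sketch -- with a hypothetical is_stable() standing in for an actual stress-test run (Furmark, a gaming loop, whatever), and the clock/voltage constants being my assumptions, not anyone's published numbers:

    # Sketch only: step the clock, and add voltage only when a step fails.
    # is_stable(clock, voltage) is hypothetical -- in reality it's you,
    # a stress test, and an afternoon of patience.

    STOCK_CLOCK = 607      # MHz -- GTX 590 reference core clock (assumed)
    STOCK_VOLTAGE = 0.96   # V -- upper end of the stock range
    MAX_VOLTAGE = 1.00     # V -- hard stop, well short of the 1.21v slider
    MAX_CLOCK = 775        # MHz -- don't chase numbers past this
    CLOCK_STEP = 13        # MHz per attempt
    VOLTAGE_STEP = 0.02    # the ".02 increments" mentioned above

    def find_best_oc(is_stable):
        clock, voltage = STOCK_CLOCK, STOCK_VOLTAGE
        best = (clock, voltage)
        while clock + CLOCK_STEP <= MAX_CLOCK:
            clock += CLOCK_STEP
            # try the new clock at the current voltage first
            while not is_stable(clock, voltage):
                voltage = round(voltage + VOLTAGE_STEP, 3)
                if voltage > MAX_VOLTAGE:
                    return best  # out of safe headroom -- keep the last good pair
            best = (clock, voltage)
        return best

Point being: the voltage only ever moves when a clock step actually fails, and the whole thing gives up long before the slider's ceiling.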

red1776, Omnipotent Ruler of the Universe, said:

> Any fool can overclock by throwing voltage around. The art of overclocking is getting the best performance gain (not necessarily the highest clock either) for the lowest possible stable voltage.

> Maybe w1zz has been in the game so long he can tell what settings need to be applied simply by looking at the pcb. Wish I was that talented.

Right, it's like palm reading. You just read the traces!

Guest said:

The SweClockers reviewer denies the 1.2 volt statement from nVidia. In the video he was running the card at 1.02v and at the same core speed as the GTX580. So with new drivers and maybe water cooling you can probably push this card beyond the 580 clocks. Time will tell ;)

dividebyzero, trainee n00b, said:

So, of the eight known GTX 590 fatalities, TPU, Tbreak, Tweaktown and Hardware.fr have all stated that they pushed their cards to 1.21v. The other four are pleading innocence or no comment. Although it sounds as if Lab501's card was DOA -- nvidia successfully tripping itself up yet again.

So, what's the excuse for Neoseeker's HD6990 blowing up?

> ...however, when we were testing the performance of Dragon Age II the HD 6990 died on us. At the time of its demise the card was set at the stock 830MHz setting and the BIOS switch was in the default position. The fact that it died could have been that we tested the graphics card at both the 375W and 450W settings, but since the review we have left the settings at default level.

> Presently this leaves Neoseeker without a HD 6990 for future testing. AMD will not warranty the card so we are left with no choice but to reach out to their partners to see if we can get a sample.

[source]

Is it possibly the same reason that the [link]?*

And this considering AMD's solution is hardware-based, and supposedly designed with twice the power-handling capability of the GTX590.

I think we can safely say that both AMD and Nvidia have effectively found what is, and isn't, feasible to stick onto one pcb.

And while the tribulations of the HD6990 shouldn't have any bearing on failing GTX590 cards, it seems fairly strange that the HD6990 seems immune from criticism. So the GTX590 will in all likelihood be reworked with beefier VRM's and/or a BIOS-locked/driver-locked voltage cap, but what precisely is the fix for the HD6990... assuming it needs one?

*Chances that these facts become mainstream news I'd guesstimate at 0-10% (based on these posts: #68, #70 from OCC's main guy). I would assume that if OCC/Neoseeker were spreading FUD then the AIB's and AMD would be fairly quick to refute the claims, since I would (again) assume that the dead cards would be returned to the vendor.

Can't wait to see who turns out the first three-GPU card.
