GeForce driver v267.52 could overheat GTX 590

Matthew DeCarlo


SweClockers, along with other tech review sites, has discovered a flaw that could cause Nvidia's new dual-GPU GeForce GTX 590 to burn up in a plume of smoke. After frying two sample cards, the Swedish site blamed the driver they were supplied with. When overclocking the GTX 590 with GeForce driver version 267.52, the card supposedly draws more power than it can actually handle, causing it to overheat. SweClockers has documented the issue on video:

The solution? Avoid driver v267.52 like the plague -- especially if you intend to overclock your GTX 590. Nvidia has released a new driver (v267.85) with improved overcurrent protection and SweClockers has confirmed that this solves the problem. It may be common sense, but Nvidia says that anyone who's overclocking or running stress tests should use the latest driver available on the company's website instead of the disc-supplied driver, which is inevitably outdated.

Download: Windows XP 32-bit | Windows XP 64-bit | Windows Vista/7 32-bit | Windows Vista/7 64-bit

Although driver v267.85 has enhanced overcurrent protection, Nvidia notes that it's still risky to raise your card's default clocks and voltages. The company recommends leaving the GTX 590 at its stock voltage when overclocking with air cooling, while folks with liquid cooling should remain within 12.5-25mV of the default voltage. Nvidia believes SweClockers pushed the GTX 590 beyond its limits by running it as high as 1.2V instead of the default 0.91V to 0.96V.
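To put those numbers in perspective: dynamic power in a chip scales roughly with the square of voltage, so the jump from the 0.96V default to the 1.2V SweClockers reportedly used is far bigger than it looks. A quick back-of-the-envelope sketch (the voltages come from Nvidia's figures above; the V-squared scaling is a general rule of thumb, not a measured value):

```python
# Why 1.2 V on a ~0.96 V part is a big deal: dynamic power scales roughly
# with V^2 (times frequency). Voltages are from the article; the scaling
# law is a general rule of thumb, not a measured figure.

def relative_power(v_new, v_old, f_ratio=1.0):
    """Power ratio implied by a voltage change (and optional clock change)."""
    return (v_new / v_old) ** 2 * f_ratio

stock_v = 0.96   # upper end of the GTX 590's default voltage range
pushed_v = 1.20  # voltage SweClockers reportedly applied

extra_pct = (relative_power(pushed_v, stock_v) - 1) * 100
print(f"{pushed_v} V vs {stock_v} V: ~{extra_pct:.0f}% more power")
# -> ~56% more power through the same VRMs, before any clock increase
```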


 
I'm not so sure about it being a driver issue. TechPowerUp blew up their review GTX 590 using driver set
"GTX 590: 267.71", stating that:

"Card blew up during testing, power limiting system does not work reliably"
Here is that review; the quote is taken from its conclusions page:

http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/1.html

I had also read that a third site blew up a card as well.
 
I was an early adopter of NVIDIA 3D Vision, and it stopped working with 3D Vision CD 1.45. You can't update GeForce drivers without the matching 3D Vision CD package or it will stop working, but this time it stopped working -with- the CD download.

Not to mention how far behind Tegra 2 is compared to the rest. I would be selling NVIDIA stock today if I hadn't yesterday.
 
Five sites blew up cards afaik, including Hardware.fr and Tweaktown.
Nvidia's review notes apparently limited OC to 25mV over stock, with the emphasis on no increase in core voltage if possible (Steve Walton may be able to confirm this).
From W1zzard's TPU review:
[emphasis added]
Tweaktown used MSI Afterburner...and it seems as though MSI don't know the parameters of Afterburner's core voltage options - and that's just for starters. It doesn't seem to be driver related...it seems more related to reviewers pushing heavy voltage through the card.

Seems pretty stupid to throw 0.3v on top of a <1.0v GPU on a dual-GPU, stock-cooling card if you ask me...Doubly so when the same reviewer used only stock voltage in his HD6990 overclocking test. Maybe W1zz is angling for an AMD PR position.
Maybe this can become a trend...I look forward to seeing w1zzard's attempts at forcing 1.4v through an HD6970, or 1.6-1.8v through a Bulldozer CPU in upcoming months.

FYI: 1.2v is enough to kill a bog-standard GTX580, let alone a dual card.
@Red1776
I think w1zz read your Minnesotan OC guide!
 

I didn't think it was driver related. I don't know why you would OC either of these cards, period, let alone a voltage-limited (cutoff) card.

@Red1776
I think w1zz read your Minnesotan OC guide!

Geez, I guess. Either that or your customer's 'Equator' OC'ing guide:

https://www.techspot.com/gallery/member-galleries/p4219-it-burnsit-burns-21-21-21.html
 
Yet another troll fest.

BTW: it's spelt Karma. I know you're in the U.S. and English is probably a second language, but you won't help your cause by sounding illiterate. You at least have the assurance of knowing that keeping the comments short puts you light years ahead of spydercanopus...who seems to check the NASDAQ ticker in 3D - well done, that man! You must be the hippest player in the market.
 
Geez, I guess. either that or your customers 'Equator' OC'ing guide
Yup, that's what happens when the idi0ts get hold of the important stuff: they either twiddle all the knobs and buttons like autistic lab research monkeys, or pen poorly formed trolling posts and purport to be stock market analysts. Go figure.

BTW: spydercanopus
If you really were affected by the 3D Vision 1.45 'stopped working' bug, then you obviously didn't read up much on how to fix it. But I suspect the only 3D glasses you own are cardboard and free with every viewing of Piranha. ;)
 
Yea, I like to invest money as opposed to letting it stagnate in the bank or blowing it on every new upgrade. NVDA was on a roll for quite a while after it dipped to $7; I rode it up to $15 and sold.

"Divide by zero", you're really clever, dude. Learn that in 5th grade? You're picking personal jabs on a forum about technology...who's the troll?
 
I posted as SpyderCanopus on that exact thread, dummy. That fix didn't work. NVIDIA claims it's a known issue to be fixed in the next CD release.
 
Well, if the fix didn't work for you and you don't believe Nvidia are going to fix the problem, then your best bet is to migrate to an AMD, VIA or Intel graphics solution. Things in that thread would have been somewhat clearer if you had posted after the fix was posted (post #11) - regardless of whether it worked or not.

Quite what Tegra 2 has to do with the posted story is also up for debate....or do you see TI or Qualcomm offering a desktop 3D graphics solution in the near future?

@Matthew, red1776
Seems the problem - as far as Hardware.fr (one of those who turned their GTX590 into a pyrotechnics display) is concerned, and backed by Nvidia if the translator is doing its job - is one of installing the new drivers over the top of existing ones. That doesn't seem to allow the software-based OCP to become active. Article >>here<< - use your favourite translator.
So an uninstall, driver sweep and clean install seem to be the order of the day - insane voltages aside.
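If the OCP really is armed by software, that would explain the failure mode: install new drivers over the top of old ones and the protection may simply never kick in. For illustration only - this is a hypothetical sketch of how a driver-level power limiter works in principle, with made-up function names and simulated telemetry, not Nvidia's actual code:

```python
import random

# Hypothetical sketch of a software-based overcurrent-protection (OCP) loop.
# A real driver reads VRM telemetry and switches P-states; both sides are
# simulated here. 365 W is the GTX 590's published board TDP; everything
# else (names, thresholds, behaviour) is illustrative.

POWER_LIMIT_W = 365
STOCK_CLOCK_MHZ = 607

def read_board_power_w():
    """Stand-in for a VRM sensor read: random draw around the limit."""
    return random.uniform(300, 450)

def ocp_tick(clock_mhz):
    """One polling iteration: shed ~10% clock when power exceeds the limit."""
    power = read_board_power_w()
    if power > POWER_LIMIT_W:
        clock_mhz = int(clock_mhz * 0.9)
    return clock_mhz, power

# If this loop never arms (e.g. new drivers installed over the top of old
# ones, as Hardware.fr describes), the card runs with no limit at all.
clock = STOCK_CLOCK_MHZ
for _ in range(5):
    clock, power = ocp_tick(clock)
    print(f"board power {power:5.1f} W -> core clock {clock} MHz")
```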

Somehow, I don't think it's rocket science to predict that if a GTX580's max voltage is 1.21v, then a lower-Vcore-binned dual card using the same 4+1 power delivery might not take too kindly to the same voltage.
"Up to around 1.125 V the card responds well to voltage" - w1zzard's GTX580 review. So of course the obvious answer is to up the voltage to 1.2v.
 
Just a reminder not to flame, name call or make personal comments. Restraint in this area is appreciated. Thanks.
 
It's the very first revision of the 3D emitter that experiences issues with the new driver. Not interested in continuing dialog unrelated to the topic.
 
The stupid intern forgot to put the sticky notes that say "don't overclock plz, kthxbye lol ^-^ <3" onto the cards.

Good thing there are tech sites; they get to blow up their free cards before we do.
 
Overclocking always runs the risk of frying the card, hence why manufacturers don't do it out of the box. Why is this so shocking?
 
....shocking....I get it...

Bad pun aside, I don't ever remember five cards on four sites being fried on launch day. This was the launch of the 'so-called' flagship of the company's discrete graphics card lineup, and they take great care to make sure that the creme de la creme is sent out. The fact that a misfiring voltage regulator sends them up in smoke is a rather large PR gaffe, especially as this card (and the 6990) are nothing more than limited-availability PR stunts to begin with.
Not to mention that the vast majority of components will take a fair amount of abuse and survive the novice idea that "the voltage slider goes to 1.4v....so it must be ok" and still function.
 
I think it's obvious that partners will launch with much more robust VRMs, so if you're an overclocker, best to wait and pay the premium ;) I'd hazard a guess that on those burnt cards the GPUs are healthy. A shame and a waste really....
 
The reference run should be fairly small in any case I would imagine...just enough to qualify the card as a bona fide reference (rather than AIB) product. The DCII/UD/Lightning/TwinFrozrIII's should start rolling fairly soon, as should non-reference HD 6990's with the same solutions...assuming either AMD/Nvidia or their AIB's have any long-term interest in either card- which could be a matter for debate.
As with the HD6990's cooler/fan, the GTX590's voltage situation should be a minor rework - either through a VGA BIOS and/or a more robust power delivery (more than 4+1 in any case). Since it's already well proven that anything above 1.1v for a GTX580/570 becomes a case of diminishing returns at best (negative performance at worst), it would seem prudent to keep voltage to 1.05-1.1v - which from what I've seen would allow for the stock GTX590 clocks to be attained (GPU). vRAM I think is still going to be hampered by the single-phase memory voltage regulation and the complexity of 2 x 384-bit buses.

Either way, both the GTX590 and HD6990 are dinosaurs watching the meteor get closer if previous driver support for dual and quad GPU's is any indication.
 
If this series of burned-up cards is due to the voltage limiter being software based, it probably won't matter if the partners beef up the VRMs. It may not matter anyway, as Nvidia made a lot of noise about this card not being a paper launch and having launch-day availability. A cursory look around shows, for example, that Newegg's listings for the 590 were accompanied by "out of stock" almost as soon as they went up. Over at Ebuyer.com, they have a total of 4 in stock, and a total of 10...for pre-order. They apparently are not expecting to get very many in the future either. Secondly, with a single GF110 having a TDP of 244W, the binning process had to be extremely selective and constraining, whereas AMD has 8 different flavors and incarnations of the HD 6990 listed. I think this is one that Nvidia wants to go away. I also predict that Nvidia's response to this is going to be more about the virtues of the GTX 580 than the GTX 590. A rather big to-do over two cards (590/6990) that don't make sense in any category.
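To put rough numbers on how tight that binning had to be: 244W per GF110 is from the post above, 365W is the GTX 590's published board TDP, 607MHz/772MHz are the published 590/580 core clocks, and ~1.0V for the GTX 580 is approximate; the V-squared-times-frequency scaling is just a rule of thumb.

```python
# Back-of-the-envelope on why GTX 590 binning had to be so selective.
# 244 W per GF110 is from the post; 365 W is the 590's published board TDP;
# clocks/voltages are the published specs (580 voltage approximate).
# V^2 * f scaling is a rule of thumb, not a measurement.

GTX580_TDP_W = 244
GTX590_BOARD_TDP_W = 365

print(f"Two full-power GF110s: ~{2 * GTX580_TDP_W} W "
      f"vs a {GTX590_BOARD_TDP_W} W board budget")

# Scale each GPU from GTX 580 spec (772 MHz, ~1.0 V) down to GTX 590
# spec (607 MHz, ~0.92 V):
scale = (0.92 / 1.0) ** 2 * (607 / 772)
per_gpu = GTX580_TDP_W * scale
print(f"Estimated per-GPU draw: ~{per_gpu:.0f} W each, ~{2 * per_gpu:.0f} W total")
# -> roughly 162 W each, ~325 W total, squeezing under the 365 W budget
```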
I will be curious about two things-
1) How the 6990 holds up when the owners remove the yellow sticker
and..
2) if a GTX 590 manufactured 90 days from now performs at the same level as the review samples.
 
I think the U.S. market is entirely an Asus and EVGA domain.
EVGA have their cards going in and out of stock all the time. I'm a member of the EVGA forums (as well as XS and oc.uk), and a fair few people already have their orders en route. I think Tiger Direct was the first seller of the cards and the main provider for those who already have theirs. Down in NZ and Aus I think only the Asus card is available - but anyone looking to buy one is going to get torn a new one ($NZ1799 = $US1350 !!!). The European markets are probably somewhat better served thanks to the PoV/TGT resurgence.

BTW: The Asus card has a review (verified owner) for the GTX590 a day after launch...so that makes it somewhat quicker to market than the HD6990, I believe - if we're splitting hairs:p

AMD probably need to keep the 6990 in stock if only to keep the GTX580 from being the "world's fastest graphics blah blah" by default. So as long as the 6990 remains in stock, then so should the 590. Hopefully it gets replaced by a dual GTX560 with 900+ clocks and a friendlier price point soon.
 

I get that it's the opening reference volley; I just think the partners will get...oh, about a dozen apiece to work with. Or....see #2 in my previous post. Awfully cynical, I know, but this "dual GPU" war has been rather silly.

Now here is a card that makes sense:rolleyes:
http://www.fudzilla.com/graphics/item/22093-evga-shows-its-geforce-gtx-460-2win
 
I'd expect that HD6990 owners shouldn't be overly affected by the BIOS switch. By all accounts fan noise precludes getting too carried away with any big overclocking adventures. PowerTune should keep things in check for people OC'ing on the 1.12v setting.

AIB non-reference models usually use a higher grade of componentry, so I doubt there will be too much commonality with what's available today. No doubt Kyle/Brent will be doing an Asus DirectCu II GTX590/HD6990 article when both come to fruition. But since it will be, in essence, two cards sporting full-fat GTX580 SLI/6970 CFX, those benches should be a foregone conclusion, with the exception of quad-GPU scaling and driver support.

Something tells me that w1zzard was aiming at being top-dog reviewer for the day. Not being satisfied with 815MHz (34% OC) leads me to believe he had decided beforehand that the 0.963v guideline was going to be adhered to by most (if not all) reviewers, and he wanted to make a big splash. The fact that Anand managed 750 (23% OC) and PureOC/Hilbert managed 775 (std GTX580 clock) with minimal voltage increases would tend to support the assumption. He also seems to be copping quite a bit of flak in his and other OC'ing forums for his methodology - rather unsurprisingly, especially after some statements attributed to him and other reviewers claimed that the cards blew on POST/OS startup (i.e. 2D clocks). If the cards blew at or under 0.963v (or even 1-1.05v), then I think there is real cause for Nvidia to recall the whole batch. I haven't heard anything definitive from SweClockers, but t-Break, Tweaktown, TPU and Lab501 all seem to have had pyrotechnics after pushing the voltage once Afterburner allowed the card to be overvolted to 1.21v.
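For what it's worth, those overclock percentages line up with the GTX 590's published 607MHz stock core clock - a quick arithmetic check (the stock clock is the published spec; the rest follows from the figures quoted above):

```python
# Checking the quoted overclock percentages against the GTX 590's
# published 607 MHz stock core clock.
STOCK_MHZ = 607

for reviewer, clock_mhz in [("w1zzard/TPU", 815),
                            ("Anand", 750),
                            ("PureOC/Hilbert", 775)]:
    print(f"{reviewer}: {clock_mhz} MHz = +{(clock_mhz / STOCK_MHZ - 1) * 100:.1f}%")
# w1zzard/TPU: 815 MHz = +34.3%
# Anand: 750 MHz = +23.6%
# PureOC/Hilbert: 775 MHz = +27.7%
```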
 
I suppose they were thinking it has a voltage limit, and the vast majority of cards will survive an over-the-top overclock, so why not!? And went ahead and applied the Minnesota or Equator OC to it.
By all accounts fan noise precludes getting too carried away with any big overclocking adventures.

I'm not so sure; I did a machine for a guy who "had to have" a GTX 480. He likes to OC the hell out of it and it sounds like an F-14. His answer.....headphones! :haha:
 