Capacitor issues are causing RTX 3080/3090 crashes

I hear that due to driver delays, hardware that was already in production couldn’t be properly tested and went out the door nonetheless to meet the launch deadline. Rush, rush, rush...
 
I'm really confused by all this. Here, nVidia has the awesome RTX 3080 (and it really is awesome), but it's very obvious that they rushed the launch so much that problems were inevitable. Why didn't they just take another month to make sure they had all of their ducks in a row (and stock on the shelves)? It's not like people would have died if the launch were delayed by a month. They also could've avoided the stock and bot issues, and customers wouldn't have ended up with crash-prone cards.

It's not just nVidia that does this, and it's not just the tech industry. All industries pull this stupid crap, and it makes me wonder when they'll finally learn to get the damn product right before asking people to pay for it. It really shows how little regard companies have for the customers who keep them solvent.
Because in business, delaying isn't good business. While not all companies take that approach, a lot do. Is it worth the gamble? To some, yes. To Nvidia, probably, because of all the hype and favorable reviews they got, not to mention the millions they're going to make. This issue won't stop the hype train, especially with EVGA coming out and saying what they said. In the end, everything will apparently be fine. Let this be a lesson to people buying new tech: let someone else go first.
 
I'm sorry, but Nvidia is a big boy and this is unacceptable. Surely it has a team dedicated just to supplying clean, accurate power.

They license this out, so I'm sure there must be a set of standards to adhere to.

I mean, read some PSU reviews; a few test them to extremes. Surely these big companies would test the hell out of these things.

If you were a project engineer on a two-billion-dollar construction project in Dubai, would you trust the suppliers' data on steel and cement? You would at least get a number of samples analysed.
 
A new driver claiming stability fixes is out today.

The Reddit thread on this topic includes contributions from professionals in this space, and I tend to buy their input that the capacitors are part of the overall board design and that it isn't a one-size-fits-all situation. You need the right capacitors for your board's overall design, not one magic configuration that would be right for any board.
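
As a rough illustration of why there's no single "right" capacitor (this is just the textbook series-RLC model of a real capacitor, not anything specific to these cards): the impedance a cap presents to a transient depends on frequency, so a part that filters one band well can be mediocre in another.

```latex
% Effective impedance of a real capacitor (series-RLC model).
% ESR = equivalent series resistance, ESL = equivalent series inductance.
Z(f) = \sqrt{\mathrm{ESR}^2 + \left(2\pi f\,\mathrm{ESL} - \frac{1}{2\pi f C}\right)^{2}}
```

Bulk polymer caps bring lots of C but more ESL, so they shine at lower frequencies; small MLCCs have little C but very low ESL, so they take over at high frequencies. Which mix a given board needs depends on the rest of its power delivery, which is exactly the point those professionals were making.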

As to the bottom line here, that this "launch" was rushed beyond what should be acceptable: absolutely. There was not enough time for validation, drivers, or production, and I fail to see what benefit anyone is getting from this circus, other than the sneaker scalpers.
 

Early reports are that the new driver improves things. It really has me wondering what the issue is, then. If it really is a cap issue, then the only way to "fix" it driver-side would be to reduce max clocks.
 
A fix is coming. It will be a 1 GB download of new drivers that transparently underclock your card for you. Done.
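
Not that anyone should need to, but if you wanted to apply that kind of clock cap yourself rather than wait for a driver, NVML already exposes a clock lock. Here's a minimal sketch using the pynvml bindings; it assumes `nvidia-ml-py` is installed, the driver is new enough to support locked clocks, and the script runs with admin/root rights. The 1860 MHz ceiling is purely an illustrative number, not anything Nvidia has published.

```python
# Minimal sketch: cap the maximum graphics clock via NVML (pynvml).
# Assumes `pip install nvidia-ml-py`, a driver that supports locked
# clocks, and admin/root privileges. 1860 MHz is an illustrative cap.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    # Highest graphics clock the card reports.
    max_gfx = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    ceiling = min(max_gfx, 1860)  # hypothetical cap below peak boost

    # Lock graphics clocks to the range [210, ceiling] MHz
    # (210 MHz is a typical idle clock; adjust for your card).
    pynvml.nvmlDeviceSetGpuLockedClocks(handle, 210, ceiling)
    print(f"Graphics clock capped at {ceiling} MHz (card max: {max_gfx} MHz)")

    # To undo the cap later:
    # pynvml.nvmlDeviceResetGpuLockedClocks(handle)
finally:
    pynvml.nvmlShutdown()
```

It's a blunt instrument, but it's roughly what a driver-side "fix" that trims peak boost would amount to.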
 
I watched Jayz's video on it and yeah, he is somehow trying to frame it as not Nvidia's fault. It is 100% Nvidia's fault; they set the spec that AIBs follow. There are people with the FE having the issue too.

He also pointed out that everyone likes to pile on when a card has issues, including people who don't even have the product. That made me chuckle a bit. It's true, and yet he didn't seem to care when the same happened to Navi. Jayz is as much of an Nvidia homer as he's ever been.
Jay has become such a pathetic caricature of himself that I can't even bear to watch him any more. In that video to which you refer, he has a gigantic bookshelf that's chock-full of nVidia card boxes. It looks like an unbroken green line.

A few years back, he made the single most pathetic tech video that I'd ever seen. It was about recommending a video card for a first-time gamer. That fanboy fool recommended the GTX 1080 Ti (back when it was current and $700USD) for the FIRST TIME GAMER?! I know damn well that it was because the actual best card at the time for the first time gamer was the RX 580 and his masters at nVidia wouldn't allow him to recommend a Radeon.

He has this BS line that he's not a fan of nVidia, he's a fan of EVGA. Right, so what's the difference? He doesn't even know what he's doing half the time and it's obvious that he won't do anything to jeopardise all the free stuff that Intel and nVidia give him. I even saw him say in a more recent video, "Doesn't anyone use Intel any more?".

He's a useless tool.
 

That more or less lines up with what I'd expect of him. I stopped watching him years back because his reviews were more fanboying than actual reviews.
 
This is not a new problem. There was a time when a very wide range of motherboard brands were failing because of really cheap Chinese capacitors. Again, what part of a motherboard really operates at high voltage? Tantalum capacitors cost at least ten times more than ceramic ones. So anything done to cut costs without real thorough testing in the actual operating environment is nothing but a disaster.
 
I couldn't agree more. The really stupid thing is, after this happened with motherboards, you'd think they'd know that this wouldn't be a good idea for high-end 300W video cards. I was really shocked to see that Zotac was using all cheap capacitors (which is actually BELOW the nVidia reference spec), because they're the same company as Sapphire, and Sapphire would NEVER do that to a Radeon card. After all, Sapphire is to Radeon what EVGA is to GeForce, and I find it puzzling that PC Partner is so committed to quality when it comes to Sapphire Radeon cards but not so much when it comes to Zotac GeForce cards.
 
My, there are a lot of back-seat EEs here. Let's put a few things into perspective, shall we?

a. The majority of the cards seem just fine.
b. It's possible, even probable, that most or all of the affected cards can be corrected by a simple driver update. For those that aren't, you'll get a new card from the manufacturer. Yawn.
c. There isn't an EE on the planet who doesn't want to overengineer his product to well above the required specifications. If a manufacturer does so, though, the products cost far more, and 999 times out of 1000 you, the end user, get absolutely no benefit from that extra cost.

This is the 1 time out of 1000 where that's not true. Don't condemn the entire system because of what truly is a rather small hiccup.
 
For all you know, they may be getting special deals from AMD, since they also buy AM4 motherboard chipsets. Tell me honestly, which company does not have something cooking under the table? I remember buying a Zotac Mini-ITX motherboard with a dual-core Atom and an nVidia ION graphics chipset instead of the Intel one. Intel immediately redesigned the chipset to move graphics to the CPU and jacked up the price.
 
If AMD did the same thing, people would be talking so much **** about them, but NVIDIA always seems to get a pass from the fanboys. And this is worse than any driver issue! Period! LOL
 
That's also true. PC Partner created the Sapphire brand five years before they created Zotac. Five years isn't a long time to a human but to the tech industry, it might as well be a century.

As for the lengths that Intel will go to in their ongoing efforts to swindle every last penny out of you, nothing surprises me anymore.
 
I blame the add-on card manufacturers too.
Many years ago I found an ATI graphics card with a built-in TV tuner in a UK magazine. It was offered at twice the price of the same model and part number being sold in the USA. I figured it too would have a universal tuner installed, just like ALL the TVs sold in India, so I got it from the US. Lo and behold, it would not tune to and display a single TV channel in India. It turned out that the tuner chip used was strictly an NTSC version, while the one sold in the UK was an EU-standard PAL/SECAM version. So much for their honesty. Why have the same part and ordering number if the compatibility is miles apart?
 
I'm sorry but I'm going to have to say that was your fault. It is (and was) common knowledge that North America uses NTSC and Europe uses PAL. You made an assumption without making any attempt to find out. I would never have made that mistake and if I didn't understand why it was so much cheaper in the USA, I would have made inquiries. Clearly, they weren't expecting someone in India to try to do what you did.

The ordering and part numbers are unique to each region, which means that an ATi All-In-Wonder in North America will be an NTSC model while the same ATi All-In-Wonder in the UK will be the PAL model. It's not cheaper in the USA because ATi is Canadian; it's probably because the UK has VAT, there's some currency exchange involved, and there was probably less demand for it in PAL regions than in NTSC regions. As a result, more NTSC models were made, which drove the price down further.

There's a difference between being frugal and cheaping yourself to death. You didn't do your due diligence in making sure that it would work and you got burned. Calling ATi dishonest is ridiculous. You made a false assumption, the fault is yours. It's not that complicated.
 
My argument is that when the two were distinctly different, they should have been assigned different part/ordering numbers. I sent it back to my daughter and she used it on her PC.

UK prices are overall substantially higher than those in the US. I'll give you an example. I have an acquaintance who makes three or four trips to the UK and US for work. On every trip he buys the latest Canon camera system. He uses it for three or four months in India. Then, on his next trip, he sells it on eBay in London as used and recoups his entire original investment. He is happy and the buyer is happy. From there he goes to the US, again for work, and picks up the latest models. But for the current Covid-19 travel restrictions, he has been following this regime for more than six years that I know of and has never been out of pocket on his upgrades.
 
This is an older story at this point, but after the GPU shortages I was just recently (in January 2023) able to pick up a Zotac RTX 3090 Trinity OC used on Facebook Marketplace. In Borderlands 3, the entire system crashes with a green screen, indicating an issue with the GPU (Googling suggests many AMD users can confirm). I thought I had been ripped off with a faulty card, as I was running it at factory clocks and it wasn't running hot. Then I happened to stumble across this news story, and it seems to explain everything. Following experimentation from a Reddit thread, I set the power limit to 90% in Afterburner while keeping the clocks at default, and sure enough, that seems to consistently keep the card stable. I suspect pulling the power down slightly is just enough for the capacitors to be able to keep the voltages safely stable, and I've not seen any reduction in performance with my usage.

Just wanted to share my story for the next guy to Google the problem.
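
For anyone who'd rather not run Afterburner, the same 90% cap can be set programmatically. This is a minimal sketch using the pynvml NVML bindings; it assumes `nvidia-ml-py` is installed, admin/root rights, and a card/driver that permits changing the power limit. The 0.90 factor just mirrors the setting described above, not a recommended value.

```python
# Minimal sketch: set the GPU power limit to 90% of its default via NVML.
# Assumes `pip install nvidia-ml-py` and admin/root privileges.
# The 0.90 factor mirrors the Afterburner setting described above.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)  # milliwatts
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

    target_mw = max(min_mw, int(default_mw * 0.90))  # clamp to the allowed range
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

    print(f"Power limit set to {target_mw / 1000:.0f} W "
          f"(default {default_mw / 1000:.0f} W, allowed {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
finally:
    pynvml.nvmlShutdown()
```

Note the limit doesn't survive a reboot, so it has to be re-applied (or just keep the Afterburner profile).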
 