Nvidia RTX 5090 graphics card power cable melts at both ends, bulge spotted at PSU side

Maybe video cards shouldn't be 600 W power monsters; maybe scale it back a little.
Then everyone will cry and whine about not getting any uplift between generations. Actually, when you figure frames per watt, the 4090 and 5090 pretty much rule in efficiency. Sure, an RX 9070 can match them, but it does so by lowering the clocks 400+ MHz more than it needs to, leaving a lot of performance on the table. You could do the same thing with a 4090 or 5090, beat a 9070 in efficiency, and still smoke every other GPU on the market. It would also lower the power used by 75 to 100 W, and anyone who has one can do that themselves in Afterburner, but they won't, because they want every last ounce of power, and that comes at the cost of high wattage.
 
12VHPWR is a toy connector. Form (cable management) over function (safety). Just look at the tiny thing. It should never carry 600 W without an active amp-balancing mechanism or some kind of failsafe.

Also, the thin wires and the dense area around the connectors do not help. This is not a PSU or cable problem; it's Nvidia. The GPU has no power management and can't tell if a pin is well over its amp limit; it will continue to ask for full power from the PSU and draw it through the cables and connectors on both the GPU and PSU side. And Nvidia ignored wire safety margins and cheaped out on shunt resistors (beginning with the 4090).

All six 12V wires of this standard come together in one single shunt resistor on the 5090, as Buildzoid showed. This was already problematic at the 4090's 450 W and is a stupid engineering decision at 600 W. The 5090 can't adjust power and won't stop running if things go south, and they sometimes do. Your nose or your smoke detector will be the first line of defense if there is an imbalance in amps across the 12V connector pins.
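To see why one shared shunt across all six 12V wires can't catch a per-pin overload, here is a minimal Python sketch. The pin currents and the 9.5 A per-pin limit are illustrative assumptions, not measured values:

```python
# Illustrative per-pin currents (amps) on the six 12V wires.
# The total is within budget, but two pins carry far more than their share.
pin_currents = [2.0, 2.5, 3.0, 3.5, 22.0, 17.0]  # sums to 50 A (600 W at 12 V)

PER_PIN_LIMIT_A = 9.5  # commonly cited per-pin rating (assumption)

# What a single shared shunt "sees": only the total current.
total = sum(pin_currents)
single_shunt_ok = total <= PER_PIN_LIMIT_A * len(pin_currents)

# What per-pin sensing (e.g. one shunt per wire, or a per-pin readout
# like the one ASUS put on the Astral) would see:
per_pin_ok = all(i <= PER_PIN_LIMIT_A for i in pin_currents)

print(f"total current: {total:.1f} A")
print(f"single-shunt check passes: {single_shunt_ok}")  # True: imbalance invisible
print(f"per-pin check passes:      {per_pin_ok}")       # False: two pins overloaded
```

The single-shunt check passes even though two wires are carrying double their limit, which is exactly the failure mode the card can't react to.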

The mating force, the tiny AWG16 toy wires and the overall design of the connector are not meant for this kind of power draw. It's not the fault of the cable or PSU makers. Also, 12V-2x6 or original cables are not a fix for the underlying issues. This is a complete design fail from Nvidia. The design operates at its absolute limit with the 5090. Things may work or they won't, but without any readout you can't tell if something is wrong (unless you are using the 5090 Astral with its per-pin readout; ASUS felt there was a problem, too).

This is unacceptable for $3,000 cards.

But Nvidia couldn't care less. Around 90% of their revenue comes from data center products and enterprise services. They won't recall or redesign this connector for their gaming products, or even comment on, let alone deny, the melting incidents of the 5090.

If you want to read more about this issue and have lots of time on your hands, I recommend this thread from A to Z (41 pages) to grasp the level of bullshit of the 12VHPWR standard:
https://www.techpowerup.com/forums/...again-melting-12v-high-pwr-connectors.332311/

BTW, I am not a hater. I own a 4090. I am just observing the various issues with the 5090 (not only the occasional melting), and I am baffled at how Nvidia lost its way. It took them only 2-3 years to completely forget their root business (gaming) and shift all focus to data center (because of AI).
 
It's nothing to do with the GPU's power draw. It's the pathetically designed cable connector that is out of spec and has a woeful safety margin of ~10%. AMD and Intel rejected this crap design for good reason. An 8-pin Molex has a 90% safety margin. Three 8-pin Molex connectors can provide 600 W alone with still a 40% safety margin; add 75 W from the PCIe slot and you can provide 675 W reliably.

Why not develop a 12-pin Molex-type connector with, say, a 250 W rating and an 80-90% margin of safety?
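The margin arithmetic in this post can be sketched as follows. The physical capacity figures (roughly 285 W for an 8-pin and 660 W for 12VHPWR) are rough assumptions based on commonly cited ratings, not official spec values:

```python
# Rough safety-margin comparison: the load a connector carries vs. what
# its pins and wires can roughly handle. Capacity numbers are assumptions.
CAP_8PIN_W = 285      # approx. physical capacity of one 8-pin (assumption)
CAP_12VHPWR_W = 660   # approx. physical capacity of 12VHPWR (assumption)

def margin(load_w, capacity_w):
    """Headroom above the load, as a percentage of the load."""
    return (capacity_w - load_w) / load_w * 100

# One 8-pin at its 150 W rating:
print(f"8-pin at 150 W:    ~{margin(150, CAP_8PIN_W):.0f}% margin")
# 12VHPWR at its 600 W rating:
print(f"12VHPWR at 600 W:  ~{margin(600, CAP_12VHPWR_W):.0f}% margin")
# Three 8-pins sharing a 600 W load, as proposed above:
print(f"3x 8-pin at 600 W: ~{margin(600, 3 * CAP_8PIN_W):.0f}% margin")
```

Under these assumptions the numbers come out around 90%, 10% and 40%, matching the figures in the post.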
 
Then everyone will cry and whine about not getting any uplift between generations. Actually, when you figure frames per watt, the 4090 and 5090 pretty much rule in efficiency. Sure, an RX 9070 can match them, but it does so by lowering the clocks 400+ MHz more than it needs to, leaving a lot of performance on the table. You could do the same thing with a 4090 or 5090, beat a 9070 in efficiency, and still smoke every other GPU on the market. It would also lower the power used by 75 to 100 W, and anyone who has one can do that themselves in Afterburner, but they won't, because they want every last ounce of power, and that comes at the cost of high wattage.


The 5xxx series' lack of performance uplift is less about power and more about Nvidia not changing the architecture, meaning no IPC uplift. They brute-forced the 5090 with more CUDA cores and higher power draw. The AMD 9070 XT, on the same process node as Nvidia, delivered almost 7900 XTX raster performance and better ray tracing with 33% fewer cores, showing RDNA4 provided large IPC uplifts at similar power to the previous gen.

Nvidia clearly did not give a toss about desktop, doing as little as possible this gen and then doubling down on their contempt with poor drivers, even worse stock levels and insane pricing. You could not make up this level of incompetence.
 
Then everyone will cry and whine about not getting any uplift between generations. Actually, when you figure frames per watt, the 4090 and 5090 pretty much rule in efficiency. Sure, an RX 9070 can match them, but it does so by lowering the clocks 400+ MHz more than it needs to, leaving a lot of performance on the table. You could do the same thing with a 4090 or 5090, beat a 9070 in efficiency, and still smoke every other GPU on the market. It would also lower the power used by 75 to 100 W, and anyone who has one can do that themselves in Afterburner, but they won't, because they want every last ounce of power, and that comes at the cost of high wattage.
That's fine, but I refuse to accept this level of power demand. A few years ago we made fun of video cards for having this kind of power draw; does anybody remember how much fun we made of the R9 290X? I simply refuse to accept a market where a $3,000 video card that runs at 600 W is considered an acceptable product, because it's not, and I will never accept that. For the cost of that video card I can buy a used car, and not a crappy one either; at least where I'm at in Texas, I can buy one for three grand with working AC that will reliably get me to and from work. It's also not going to be a fire hazard.

Now, when these kinds of cards catch fire in somebody's computer, I also don't feel any sympathy for them, because a fool and their money are easily parted. You chose to be a fool when you spent this kind of money on a video card, and you get what you deserve.
 
Pretty soon they'll need ceramic connectors and sockets with the high-temperature electrical wires used in stoves and ovens. No more plastic and PVC insulation.
 
Nvidia should just make you pay a subscription to use their card (after buying it at $4000 of course) and if you don't it self-immolates and burns you and your house down.
 
Maybe video cards shouldn't be 600 W power monsters; maybe scale it back a little.
Come on, it's a 5090, it deserves all the power you can provide and more, it deserves all the money you can afford and more, it deserves all the conditions it requires to run unconditionally! Giving it anything less than what it deserves is like trying to put 87 gas in a Rolls-Royce....
 
Come on, it's a 5090, it deserves all the power you can provide and more, it deserves all the money you can afford and more, it deserves all the conditions it requires to run unconditionally! Giving it anything less than what it deserves is like trying to put 87 gas in a Rolls-Royce....
Yeah, I'm not the kind of guy that's going to buy a Rolls-Royce either. I'm the kind of guy that would rather be rich and drive an old beat-up pickup truck and live in a double-wide so nobody knows I've got the money.
 
The 5090 is a card that 0.001% of people would buy, so it's not a big loss in the grand scheme of things, BUT

cards possibly scorching or even catching fire because someone said F the rules of common sense (who designs a PCB to suck 600 W+?), with nobody held responsible, is another story.

Nvidia is becoming more and more CRAPvidia.
 
JayzTwoCents ran voltage tests on a series of cables in a recent YouTube video. He discovered that in cases where the pins were slightly pushed in, the voltage was low on those cables. He had a special microscope to magnify and identify problems. Some cables were consistently in spec; many others were not. An interesting watch.

I can understand the appeal of this channel, but he's really lacking in a lot of his knowledge and (by default as an influencer) integrity.

If you've watched his channel, you know he's got a relationship with Nvidia. Nvidia has been trying to drive the cable narrative, but there are other, more knowledgeable folks out there (like Roman) who've done better investigation and found the issue lies with Nvidia's decision to remove load regulation after the 30 series. I myself had three 4090s (with both official adapters and ATX 3.1 cables) with melting issues.

So I'd encourage anyone reading this to check out other content on the matter. Cheers!
 