Nvidia investigating cases of melting RTX 4090 power cables, RDNA 3 won't use 12VHPWR

Daniel Sims

Staff
Why it matters: Nvidia's new flagship RTX 4090 is a beast in multiple aspects. On top of its unprecedented gaming performance, consumers have expressed worry over its power requirements. Those fears may now be coming true, weeks after warnings about the GPU's power connectors.

Nvidia said it's investigating at least two cases of burning and melting power adapters for its new flagship RTX 4090 graphics card. The GPU's requirement for a new power connector standard has already caused some controversy.

Two Reddit users shared photos of burnt and melted RTX 4090 power connectors on the end of the cable and on the card itself. Nvidia contacted one of the owners in hopes of diagnosing the problem. It's unclear if the incidents are related to issues the Peripheral Component Interconnect Special Interest Group (PCI-SIG) highlighted in September.

The core of the controversy is the new power connector standard the RTX 4000 series graphics cards use. The GPUs employ the 16-pin 12VHPWR cable standard for ATX 3.0 power supplies. Those looking to upgrade without replacing their ATX 2.0 PSUs can use adapters bundled with the cards that connect three or four 8-pin connectors to one 12VHPWR cable.
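
For a rough sense of what those adapters have to deliver, a back-of-the-envelope budget helps. The sketch below is only an illustration using the nominal spec ratings (150W per PCIe 8-pin, up to 600W over 12VHPWR, 75W through the slot); real limits depend on the PSU, the adapter's sideband wiring, and the card's configured power target.

```python
# Back-of-the-envelope power budget for the bundled 8-pin -> 12VHPWR adapters.
# The ratings below are nominal spec values used purely for illustration:
#   PCIe 8-pin connector: 150 W, 12VHPWR: up to 600 W, PCIe slot: 75 W.

EIGHT_PIN_W = 150       # nominal rating of one PCIe 8-pin connector
SLOT_W = 75             # power the card can also pull through the PCIe slot
HPWR_CEILING_W = 600    # maximum a 12VHPWR connector can be configured for
RTX_4090_TGP_W = 450    # Nvidia's stated total graphics power for the 4090

for n in (3, 4):
    via_cable = n * EIGHT_PIN_W
    with_slot = via_cable + SLOT_W
    print(f"{n}x 8-pin adapter: {via_cable} W over the cable, "
          f"{with_slot} W including the slot "
          f"(4090 TGP: {RTX_4090_TGP_W} W, 12VHPWR ceiling: {HPWR_CEILING_W} W)")
```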

Last month, PCI-SIG warned that certain situations could put the adapters in danger of over-current and over-power. For example, pushing the PSU to its limit in a hot case could burn the adapter connectors, but Nvidia claims it fixed the issue before the official launch. Cable vendor CableMod warns that bending the cable within 35mm of the connector can threaten the connections, since the 12VHPWR connectors are smaller than those of previous generations.

It's unclear if either of these concerns is behind the Reddit users' incidents. In any case, it's probably safest to use RTX 4000 GPUs with ATX 3.0 PSUs if you can afford the upgrade.

In response to the burnt connector photos, AMD confirmed that its upcoming RTX 4000 competitors, the RDNA 3 GPUs, won't use 12VHPWR. Many users will likely celebrate the new cards retaining full compatibility with their existing PSUs.

Worries over 12VHPWR came in addition to early concerns that the RTX 4090 might draw 800W. Nvidia confirmed that it only needed 450W, recommending an 850W PSU for systems using the card. However, some AIB partners suggest PSUs over 1,000W out of caution for users attempting to pair the GPUs with cheap PSUs.
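
As a rough illustration of where an 850W recommendation comes from, the sketch below adds up a plausible system load. Only the 450W GPU figure comes from Nvidia; the CPU, peripheral, and transient-spike numbers are assumptions chosen for the example.

```python
# Illustrative PSU-headroom estimate. Only the GPU figure is Nvidia's stated
# 450 W TGP; the other numbers are assumptions picked for this example.

GPU_W = 450              # RTX 4090 total graphics power
CPU_W = 200              # assumed high-end CPU under gaming/rendering load
REST_W = 75              # assumed motherboard, RAM, storage, and fans
SPIKE_FACTOR = 1.2       # assumed allowance for brief transient power spikes

steady_state = GPU_W + CPU_W + REST_W
with_spikes = steady_state * SPIKE_FACTOR
print(f"Steady-state draw: ~{steady_state} W")
print(f"With transient allowance: ~{with_spikes:.0f} W")
# Roughly 725-870 W, which is why 850 W (or more, per some AIBs) gets recommended.
```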

Rumors indicate Nvidia intended to launch an even more powerful RTX 4000 card, the Lovelace Titan, but canceled it because it drew too much power. Reports say it melted PSUs and tripped breakers in internal tests. However, the monster card could return with faster 27Gbps memory or GDDR7.


 
Nvidia's 4090 wanted the crown regardless of the power and the price; now they've got the burning crown for sure.
They need to lower the power, the price, and add a fire extinguisher in every 4090 box. :)
 
I won't buy any card that pulls more than 300W when gaming, or that costs more than $700. And it will be Team Red when I do upgrade. But if I had been considering a 4090, this would stop me cold. From the facts so far, it's a serious problem, it involves major finger-pointing, and it's hard to be sure of a solid fix. I wouldn't want to buy into that situation, even potentially.
 
I gotta say, I recently bought a 4090 and this whole fiasco's making me regret my decision...
You bought a $1500 card without research or knowledge about the card. Info has been around for over a month, maybe much longer on the rumor side. Not the smartest thing on your part. Don't blame a company for your mistake. If your card has no issues, just be cautious is all, or use an ATX 3.0 PSU, which has been the recommendation from the beginning.
 
Nvidia's new flagship RTX 4090 is a beast in multiple aspects. On top of its unprecedented gaming performance, consumers have expressed worry over its power requirements.

Never fear, Techspot is here to protect poor and defenseless Nvidia!

Man, why are media outlets pushing Nvidia cr@p so hard?

That sentence will apply to EVERY SINGLE HALO PRODUCT!

Yet, here we are…


Ok, rant over. I am not an electrical engineer, but I had concerns about that connector from the get-go.

Convenient? Yes, but I can't shake the feeling that it's simply too much power going through very few, thin cables…
 
However, some AIB partners suggest PSUs over 1,000W out of caution for users attempting to pair the GPUs with cheap PSUs.
JFC this is insane. You don't pair a power-hungry card like the 4090 with a "cheap" PSU, period. Giving a cheap PSU the ability to draw more power just means it's going to explode and start an electrical fire twice as fast.

If I didn't know any better, I'd say Nvidia was intentionally setting the board up so that this generation of GPUs is a safety and environmental disaster, so that when regulatory retribution comes around they're the ones in the prime position to do the heel-face turn and become the responsible white knight, while cracking down harder on their partners.
 
This might be a combination of cable and connector specs that cut it too close (it really looks rather small for the power it transports), cheap materials and manufacturing of said cables and connectors, and cards so large that you cannot install them without bending connectors and cables.

If this were AMD, you can be sure the message we'd be hearing would be 'every card is affected' and 'this has been an issue for years'… yet somehow physically dying Nvidia GPUs ('Bumpgate' that killed 7000 and 8000 series cards, 2080 Ti cards dying, 3080/3090 cards being bricked by games…) are quickly forgotten and painted as a rare non-issue that only affects very few.
 
Jay also covered this recently:

I'd have to agree that this connector is poorly designed in terms of mechanical durability, which is troubling considering it's meant to carry high wattage.
 
I feel yesterday's article on rising power requirements should have mentioned this. It's a perfect example of why the trend in high-end CPU/GPU power consumption is a bad thing, and it's only going to become more of a problem.
 
You bought a $1500 card without research or knowledge about the card. Info has been around for over a month, maybe much longer on the rumor side. Not the smartest thing on your part. Don't blame a company for your mistake. If your card has no issues, just be cautious is all, or use an ATX 3.0 PSU, which has been the recommendation from the beginning.

Without research or knowledge. Of course, let me bring up my Excel sheet right now.
 
Good thing it didn't need a large part of that 800 watts.
Probably a bit overblown by the users though.
The 800W thing could most likely be true, but it wasn't the RTX 4090 that was drawing this much. It might have been an early engineering sample of the 4090 Ti/Titan clocked as high as they could push it. (The 4090 can draw 600W in some situations when pushed to 3GHz.)
 
Bad connector. Too small of pins, cheaping out trying to keep costs down. There's a reason the 8-pins are so big when they officially only put out 150 watts.

Three 8-pins could have easily fed the 4090; they were fine for the hungrier 295X2.
 
With a 7900X and a 4090, total system power draw can definitely hit 800W.

But apparently, with a bit of tweaking you can get away with running that same system with a 550W power supply if you're willing to give up about 10% avg FPS:

 
With a 7900X and a 4090, total system power draw can definitely hit 800W.

But apparently, with a bit of tweaking you can get away with running that same system with a 550W power supply if you're willing to give up about 10% avg FPS:

That 550W power supply is too close to the limit. It's just for demonstrative purposes (and they clearly had issues with it). 650W is what should be used after undervolting and power limiting the CPU and GPU.
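
If anyone wants to sanity-check their own setup after tweaking, here's a minimal read-only sketch using the nvidia-ml-py (pynvml) bindings. It assumes the package is installed and an Nvidia driver is present, and it only reads the enforced limit and live draw rather than changing anything.

```python
# Minimal sketch: read the GPU's enforced power limit and live board draw
# via the nvidia-ml-py (pynvml) bindings. Assumes `pip install nvidia-ml-py`
# and a working Nvidia driver; nothing here changes any limits.

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)               # first GPU

limit_mw = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle)   # enforced cap, in mW
usage_mw = pynvml.nvmlDeviceGetPowerUsage(handle)           # current draw, in mW

print(f"Enforced power limit: {limit_mw / 1000:.0f} W")
print(f"Current board draw:   {usage_mw / 1000:.0f} W")

pynvml.nvmlShutdown()
```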
 