Paying $1,600 only to worry that bending your GPU power cable the wrong way will turn it into a fire hazard...
Quote: "I'm sure it was a typo, but @ 600 watts, the amperage is 5."
Amps is watts divided by volts (and volts × amps equals watts), which at 600 W is 50 amps. At 120 V, 600 watts is indeed 5 amps, but since graphics cards run on 12 V it's 50 amps, which needs a thick pipe to supply.
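To put numbers on that, here is a minimal Python sketch of the same arithmetic (the 600 W figure and the voltages are just the ones discussed in these comments, and the amps() helper is only for illustration, not anything from a real tool):

```python
# I = P / V: current drawn at a given power and supply voltage
def amps(watts: float, volts: float) -> float:
    return watts / volts

power_w = 600.0  # the worst-case card draw discussed above

print(amps(power_w, 120.0))  # 5.0 A from a 120 V wall outlet
print(amps(power_w, 240.0))  # 2.5 A on a 240 V circuit
print(amps(power_w, 12.0))   # 50.0 A on the card's 12 V rail
```

Same power, very different current: the 12 V side carries ten times the amps of the 120 V side, which is why it needs the "thick pipe".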
"Purchase shaming?" That's a thing now?You bought a $1500 card without research or knowledge about the card. Info has been around for over a month, maybe much longer on rumor side. Not the smartest thing on your part. Don't blame a company for your mistake. If your card has no issues, just be cautious is all or use a atx 3.0 psu which has been recommendation from the beginning.
Quote: "What's a little puzzling here is that I don't recall (but I'm more than happy to be shown otherwise!) the 3090 Ti, which also used the 12VHPWR connector (via an adapter with three 8-pin sockets), showing any signs of burning out connectors. That's a 450W card too."
Some of the cards can reach 600 watts with overclocking.
Quote: "Never fear, Techspot is here to protect poor and defenseless Nvidia!
Man, why are media outlets pushing Nvidia cr@p so hard?
That sentence will apply to EVERY SINGLE HALO PRODUCT!
Yet, here we are…
OK, rant over. I am not an electrical engineer, but I had concerns about that connector from the get-go.
Convenient? Yes, but I can't shake the feeling that it's simply too much power going through very few, thin cables…"
Ehh. I've seen higher voltage and current over smaller cables.
You bought a $1500 card without research or knowledge about the card. Info has been around for over a month, maybe much longer on the rumor side. Not the smartest thing on your part. Don't blame a company for your mistake. If your card has no issues, just be cautious, or use an ATX 3.0 PSU, which has been the recommendation from the beginning.
Quote: "Some of the cards can reach 600 watts with overclocking."
That's not the point here, as the two initial reports of the 12VHPWR connector melting (here and here) concern cards with TDPs in the same ballpark as the 3090 Ti (i.e. 450W or higher). There wasn't, as far as I'm aware, the same kind of issue with that card's power connector.
Quote: "It's just another proprietary thing from nVidia to make the noobs think that it's special."
It's not proprietary to Nvidia, though. They may well have pushed for its inclusion with PCI-SIG, but it's not exclusive to them.
Quote: "All graphics cards draw current at 12V, be it through the PCIe slot or the power connectors. So 600W total is indeed 50A of current (600/12 = 50)."
And once again my lack of technological depth creeps into my life.
Quote: "And once again my lack of technological depth creeps into my life. I always thought the stated CPU/GPU wattage was at the wall. When you (anyone) have the time, I'm curious how the industry actually measures this. Let's say I turn on a desktop and load it down, CPU and GPU, with all the subsystems associated with it. Checking the draw at the wall reveals the system is drawing 600 watts total. That is 5 amps from the wall (2.5 amps on a 240-volt circuit). Amperage = watts / volts."
You're correct in looking at wall voltage and currents for total system load and the heat going into your room.
Quote: "When you (anyone) have the time, I'm curious how the industry actually measures this."
They use systems that can record the current draw on the various supply pins/cables. Sites such as TechPowerUp use these for their power charts in GPU reviews.
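As a rough illustration of that per-rail approach (not any particular site's actual tooling; the rail names and current readings below are invented for the example), board power is just voltage × current summed over each supply path:

```python
# Hypothetical current readings (amps) taken on each supply path to the card
rail_current_a = {
    "PCIe slot 12 V": 4.5,
    "12VHPWR 12 V": 32.0,
    "PCIe slot 3.3 V": 1.0,
}
rail_voltage_v = {
    "PCIe slot 12 V": 12.0,
    "12VHPWR 12 V": 12.0,
    "PCIe slot 3.3 V": 3.3,
}

board_power_w = sum(rail_voltage_v[r] * i for r, i in rail_current_a.items())
print(f"{board_power_w:.1f} W")  # card-only draw, before PSU conversion losses
```

A wall meter sits on the other side of the PSU, so its reading also includes PSU losses plus everything else in the system, which is why card power and wall power are different numbers.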
Quote: "It's not proprietary to Nvidia, though. They may well have pushed for its inclusion with PCI-SIG, but it's not exclusive to them."
Well, it may as well be, because Radeons are just using the normal connectors. I don't really understand the logic behind it, because normal PCI-Express supplementary power connectors work just fine, so this is just a gimmick to me.
Quote: "The PCI-SIG specification states a peak sustained draw of 55A, so it absolutely shouldn't be melting with 450W (38A) cards, irrespective of the manufacturer."
I agree with you there... but nevertheless, here we are.
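A quick sanity check of the figures in that quote, assuming a 12 V rail (this is just arithmetic on the numbers quoted above, not anything pulled from the spec document itself):

```python
rail_v = 12.0
spec_limit_a = 55.0           # the peak sustained current mentioned above

print(spec_limit_a * rail_v)  # 660.0 W implied ceiling at 55 A and 12 V
print(450.0 / rail_v)         # 37.5 A -- the ~38 A figure for a 450 W card
```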
Quote: "I don't really understand the logic behind it, because normal PCI-Express supplementary power connectors work just fine, so this is just a gimmick to me."
Having a single connector that allows up to 600W, plus offers additional sensing and signalling pins, in a format that's way more compact than three 8-pin PCIe connectors, is definitely of interest to the GPU industry. It's cheaper, for one.
Quote: "Having a single connector that allows up to 600W [...] is definitely of interest to the GPU industry. It's cheaper, for one."
Some of the cards can reach 600 watts with overclocking.
Quote: "It would be four of the 8-pin PCIe power connectors for the 600 watt power limit cards. That's why they switched to the newer connector."
Correct. I'd only mentioned three because that's how many Nvidia used, via the 12VHPWR adapter, for the 3090 Ti (and probably for the 4090 too, but I've not checked).
Quote: "Good thing it didn't need a large part of that 800 watts. Probably a bit overblown by the users, though."
It's kinda hard to overblow something like this. That's like saying that the Gigabyte PSUs that poofed were overblown by Steve Burke. If the connector melted, it melted. It might not be a fire hazard, but it's still pretty damn bad, especially considering that it cost US$1,600 for this melting privilege!
Quote: "Nvidia's 4090 wanted the crown regardless of the power and the price; now they've got the burning crown for sure. They need to lower the power and the price, or add a fire extinguisher in every 4090 box."
What is it about GeForce cards that start with the number 4 after the letter prefixes?
Quote: "I won't buy any card that pulls more than 300W when gaming, or that costs more than $700. And it will be Team Red when I do upgrade. But if I had been considering a 4090, this would stop me cold. From the facts so far, it's a serious problem, it involves major finger-pointing, and it's hard to be sure of a solid fix. I wouldn't want to buy into that situation, even potentially."
Can't say that I blame you. To pay US$1,600 for the privilege of melting a piece of your card would be a tough pill to swallow for me too.
Quote: "I gotta say, I recently bought a 4090 and this whole fiasco is making me regret my decision..."
You bought a halo card from nVidia mere weeks before the launch of the new Radeon cards? Methinks that you're going to regret a lot more on November 3. A $1,600 impulse purchase is just insane unless you're rich enough that you don't need a job.
Quote: "Without research or knowledge. Of course, let me bring up my Excel sheet right now."
What he means is that there were whispers all over the internet about this problem occurring with one of the prototypes. I'm guessing that you're new to this, and everyone's entitled to royally screw up the first time. What's important now is that you don't repeat this mistake and, above all else, don't just listen to what other people say about things: watch reviews on YouTube, on channels like Hardware Unboxed, Gamers Nexus and Paul's Hardware.
Quote: "Bad connector. Pins that are too small, cheaping out to keep costs down; there's a reason the 8-pins are so big when they officially only put out 150 watts. Three 8-pins could have easily fed the 400; they were fine for the hungrier 295X2."
I agree. I don't know what nVidia was smoking when they came up with this idea. I'd say I want some, but I don't ever want to become that stupid!
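For a sense of why pin size and count matter: a 12VHPWR plug carries its load over six 12 V pins, versus three 12 V pins on a classic 8-pin PCIe connector. A back-of-the-envelope Python comparison (the pin counts are the standard ones for each connector, the wattages are the nominal ratings mentioned in this thread, and the even current split is an assumption):

```python
def amps_per_pin(watts: float, power_pins: int, volts: float = 12.0) -> float:
    # Assumes the load splits evenly across the connector's 12 V pins
    return watts / volts / power_pins

print(amps_per_pin(600, 6))  # 12VHPWR at its full 600 W rating: ~8.3 A per pin
print(amps_per_pin(450, 6))  # a 450 W card on the same plug: ~6.25 A per pin
print(amps_per_pin(150, 3))  # 8-pin PCIe at its official 150 W: ~4.2 A per pin
```

Roughly double the current through each (smaller) contact, so a pin that isn't fully seated or well made has much less margin before it starts to heat.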
Quote: "With a 7900X and a 4090, total system power draw can definitely hit 800W."
Just imagine what an i9-13900K would do!
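For context on that 800 W figure, a rough budget from published maximum power numbers (board power, PPT and PL2 respectively; the "rest of system" line is a guess, and real sustained draw varies a lot by workload and settings):

```python
# Nominal maximum power figures in watts; not measurements
gpu_w = 450             # RTX 4090 board power
ryzen_7900x_w = 230     # AM5 PPT for the 170 W TDP Ryzen 9 7900X
core_13900k_w = 253     # Core i9-13900K PL2 / maximum turbo power
rest_of_system_w = 100  # motherboard, RAM, storage, fans -- rough estimate

print(gpu_w + ryzen_7900x_w + rest_of_system_w)  # ~780 W with the 7900X
print(gpu_w + core_13900k_w + rest_of_system_w)  # ~803 W with the i9-13900K
```

And that is before PSU conversion losses, which push the number seen at the wall higher still.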
Quote: "'Purchase shaming?' That's a thing now?"
No, it's properly referred to as "Frankly Speaking" or "A Reality Check".
Quote: "I'll bet that nVidia and/or the AIBs cheaped out on the connector of the included adapter cable, and either the pins are too loose, don't align properly, require too much insertion force, and/or the overall connection (clip and plastic housing combined) doesn't have enough retention force. It's also possible that PSU manufacturers cheaped out on this new connector too, because of a 'no one will use this' attitude from management."
Not exactly what one would expect from one of the most expensive video cards ever released, eh?
Quote: "Way back in the Socket A days, a friend of mine decided to change his thermal paste. When he was reinserting the cooler with those damn clamps, one didn't lock all the way. He started the PC and had to go out; someone was at the front door. Guess what: in five minutes his room was on fire, and after all the investigation the police and firefighters concluded the PC was the cause. That was back in the days when AMD CPUs didn't have thermal shutdown like they do now."
It's just mind-boggling the things that weren't considered imperative back then, eh?
Quote: "And once again my lack of technological depth creeps into my life. I always thought the stated CPU/GPU wattage was at the wall. When you (anyone) have the time, I'm curious how the industry actually measures this. [...]"
Gamers Nexus has some fancy equipment for just that purpose, but I have no idea how it works. All I know is that it cost them thousands.
Quote: "It would be four of the 8-pin PCIe power connectors for the 600 watt power limit cards. That's why they switched to the newer connector."
They should have just used the four PCI-Express connectors. That would've been A LOT simpler (and evidently, more reliable too).
Quote: "You definitely sound like one of those amazing people who love to blame the victim, or a person who got raped because of their clothes!!"
Wow, you are a genius, using rape to compare product knowledge.