I posted this already, here it is again:
The best cables I've seen use 16 AWG wire. A quick calculation for a solid 16 AWG conductor shows that if you feed it 12 V at 8 A at the PSU end, a standard 3 ft run delivers about 11.9 V at the other end. Those wires aren't single-core solid wire but stranded, so the effective copper cross-section is slightly smaller than solid and the voltage drop slightly worse (skin effect only matters at high frequency, not at DC). Since the card is effectively a constant-power load, it compensates for the lower voltage by drawing slightly more current, which increases the drop further. That's Ohm's law at work, and that is your design flaw, at least partially.
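Here's the back-of-envelope calculation, if anyone wants to check my numbers. The resistance figure is the standard value for solid 16 AWG copper at 20 °C (~4.016 Ω per 1000 ft); the 3 ft length and 8 A per conductor are the same assumptions as above:

```python
# One-way voltage drop along a single 16 AWG conductor.
# ~4.016 ohm / 1000 ft is the standard figure for solid copper at 20 C.
OHMS_PER_FT_16AWG = 4.016 / 1000

def delivered_voltage(v_in, current_a, length_ft, ohms_per_ft=OHMS_PER_FT_16AWG):
    drop = current_a * length_ft * ohms_per_ft  # V = I * R
    return v_in - drop

print(round(delivered_voltage(12.0, 8.0, 3.0), 2))  # -> 11.9
```

Note this is the one-way drop only; the ground-return conductors add their own drop on top, and stranded wire is slightly worse than the solid-wire figure used here.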
I'm currently holding one of those 12VHPWR cables, proudly displaying a 600W label, and it is made from 16 AWG wire, which is inadequate for this sort of current draw: the safety factor is simply too low. To be precise, if the wires were a few inches longer, we would see more than 1% voltage drop. Sure, an individual 16 AWG wire will take a 9A draw, but not without heating up, and stranding makes it slightly worse than the solid-wire numbers suggest.
And then you look at the actual connector's pin size, the actual bit which has to transmit those 8 amps: it is simply too small. It barely has enough contact area for that current capacity if everything is ideal and the contact is almost perfect. Reality is different. Contacts may not be perfectly clean, or may carry a thin layer of oxidation, which is not uncommon, so they heat up as the current draw increases. That heat promotes further oxidation, and so on, until the temperature caused by the rising contact resistance melts the connector.
As a Senior Automation Engineer I'm quite familiar with power requirements and how power distribution works. When connectors melt, it can be due to short-term peak current draw, but most often it is because the RMS current is beyond what the connector is rated to carry continuously. The only logical conclusion one can draw is that the wires and connectors are undersized. The hard evidence points that way.
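The peak-vs-RMS point is worth spelling out, because heating goes as I²R: a spiky load heats the contact according to its RMS current, which is always higher than its average. The waveform below is hypothetical (a card alternating between 5 A and 13 A per pin at 50% duty), just to show the gap:

```python
import math

# Hypothetical per-pin current waveform: 50% duty between 5 A and 13 A.
samples = [5.0, 13.0] * 50

avg = sum(samples) / len(samples)
rms = math.sqrt(sum(i * i for i in samples) / len(samples))

print(avg)            # 9.0  -- the average current
print(round(rms, 2))  # 9.85 -- what the contact actually "feels" thermally
```

So a connector sized against the average draw is already undersized against the RMS draw before you even account for contact degradation.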