The article is a summary of der8auer's video. His measurements show 22 A / ~260 W, nearly half of the card's power draw, going through a single wire and heating it to around 150 °C on an open-air test bench.
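For reference, a quick back-of-the-envelope check of those figures (assuming a nominal 12 V rail and the card's 575 W rated draw; the actual draw in the video may differ):

```python
# Rough sanity check of the numbers quoted from der8auer's measurements.
# Assumed values: ~12 V rail, 22 A on one wire, ~575 W total board power.

RAIL_VOLTAGE = 12.0      # volts, nominal 12VHPWR rail
WIRE_CURRENT = 22.0      # amps measured on the single hot wire
CARD_POWER   = 575.0     # watts, assumed total card draw

wire_power = RAIL_VOLTAGE * WIRE_CURRENT
print(f"Power carried by that one wire: {wire_power:.0f} W")        # ~264 W
print(f"Share of total card power: {wire_power / CARD_POWER:.0%}")  # ~46%, i.e. nearly half
```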
I don't understand how the ATX standard hasn't adopted higher voltage rails for GPUs. Higher volts mean fewer amps, which means less heat. 44 A through an 8-pin connector is going to make a lot of heat; 11 A at 48 V would be much better.
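A rough sketch of that argument: for the same delivered power, cable heating scales with the square of the current, so going from 12 V to 48 V cuts resistive losses by roughly 16x. The 575 W load and the 10 mΩ cable resistance below are assumed illustrative values:

```python
# Resistive loss in a cable is P_loss = I^2 * R, so for the same delivered power,
# quadrupling the voltage cuts the current to 1/4 and the cable heating to 1/16.

POWER = 575.0        # watts to deliver (assumed card draw)
R_CABLE = 0.010      # ohms (10 mΩ), hypothetical total cable + connector resistance

for voltage in (12.0, 48.0):
    current = POWER / voltage
    loss = current ** 2 * R_CABLE
    print(f"{voltage:>4.0f} V -> {current:5.1f} A, cable dissipation ≈ {loss:5.1f} W")
# 12 V -> ~47.9 A, ~23 W lost in the cable
# 48 V -> ~12.0 A, ~1.4 W lost in the cable
```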
There wasn't even anything wrong with the Mini-Fit Jr 8-pin PCIe connectors; the official 150 W spec is massively derated. If Nvidia really wanted to cap the number of plugs and not change any specs, the 8-pin EPS connector is rated at 288 W.
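A quick per-pin comparison under those official specs, assuming a 12 V rail and the usual pin counts (3 current-carrying 12 V pins on 8-pin PCIe, 4 on 8-pin EPS); the Mini-Fit Jr terminal rating mentioned in the comments is a commonly quoted ballpark, not a datasheet value checked here:

```python
# Per-pin current at the official connector power specs, assuming a 12 V rail.

specs = {
    "8-pin PCIe (150 W)": (150.0, 3),   # 3 current-carrying 12 V pins
    "8-pin EPS  (288 W)": (288.0, 4),   # 4 current-carrying 12 V pins
}

for name, (watts, pins) in specs.items():
    total_current = watts / 12.0
    per_pin = total_current / pins
    print(f"{name}: {total_current:.1f} A total, {per_pin:.1f} A per 12 V pin")
# PCIe: ~12.5 A total, ~4.2 A per pin -- well under typical terminal ratings (the derating)
# EPS:  ~24.0 A total, ~6.0 A per pin
```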
If they did that, they would likely end up with melted 8-pin connectors, since that derating is protection against one pin being overloaded due to the small, expected differences in resistance between pins.
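A toy model of that imbalance argument, with made-up per-pin resistances, to show how one slightly better-seated (lower-resistance) pin ends up hogging current:

```python
# Six parallel 12 V pins share the total current; each pin's share is proportional
# to its conductance. The milliohm values are invented to illustrate the effect.

TOTAL_CURRENT = 48.0                               # amps, ~575 W at 12 V (assumed)
pin_resistance = [6.0, 6.0, 6.0, 6.0, 6.0, 2.0]    # milliohms; last pin seats better (lower R)

conductances = [1.0 / r for r in pin_resistance]
total_g = sum(conductances)

for i, g in enumerate(conductances, start=1):
    share = TOTAL_CURRENT * g / total_g
    print(f"pin {i}: {share:4.1f} A")
# The low-resistance pin ends up carrying ~3x the current of the others (18 A vs 6 A),
# which is exactly the scenario a generous derating is meant to absorb.
```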
Even in that scenario, using custom cables would still lead to cables melting.
It’s from increasing the cable length, and therefore the current the card has to draw to get the required power delivered to the device. It already happens with 8-pin systems as well; this isn’t unique to this delivery system.
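For context, a rough sketch of the effect being described here: a GPU behaves roughly like a constant-power load, so extra series resistance from a longer cable or an added connector drops the voltage at the card, pushing the current up slightly and dissipating extra heat in the cable itself. All resistance values below are assumed, not measured:

```python
import math

SUPPLY_VOLTAGE = 12.0    # volts at the PSU side
LOAD_POWER = 575.0       # watts the card insists on drawing (assumed)

def current_and_cable_loss(series_resistance_ohms):
    """Constant-power load: solve R*I^2 - V*I + P = 0 and take the physical (lower) root."""
    v, p, r = SUPPLY_VOLTAGE, LOAD_POWER, series_resistance_ohms
    current = (v - math.sqrt(v * v - 4 * r * p)) / (2 * r)
    return current, current ** 2 * r

for label, r in [("short/stock cable (5 mΩ)", 0.005), ("with extension (15 mΩ)", 0.015)]:
    amps, loss = current_and_cable_loss(r)
    print(f"{label}: {amps:.1f} A drawn, {loss:.1f} W dissipated in the cable")
# ~48.9 A / ~12 W vs ~51.2 A / ~39 W under these made-up resistances
```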
Using too many extension cords has the exact same effect… it’s like blaming the heater because the cords melted. But no, let’s not blame physics, let’s blame the GPU (heater) -.-
It's mostly just one loud person who very apparently has not actually engaged with the case presented here, either through ignorance or to purposely deflect blame from Nvidia despite the evidence weighing against them.
I thought blocking .ml would end the troll madness, but apparently it didn't. Well, at least the rest of the comments here show me that I'm not crazy or smth :3
Two. Two guys. One of whom had cables melting during casual gaming, the other of whom measured 150 °C (and it would have gone higher given more time) in an open environment.
And even if it's only those two, that's two too many.
Wasn’t that one using a custom cable? Most of these are user error.
Edit: Jesus guys, these are basic electricity principles…
Adding length to electrical cables increases the power draw, and adding connections increases resistance, which also increases power draw.
So yeah, let’s add cables and connections that add draw and then be shocked when it fucks up. The dude who made the video is just an idiot who doesn’t understand the base issue. It’s not the cards or the adapter, it’s the fricken cable, even though the guy apparently doesn’t understand how voltage works…
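A quick I²R check on the part of this that is uncontroversial: every added connection is another contact resistance dissipating heat right inside the connector housing. The contact resistances below are hypothetical ballpark figures for a healthy vs. a poorly seated contact, not measured values:

```python
# Heat dissipated at a single connector junction at the measured per-wire current.

WIRE_CURRENT = 22.0   # amps through the hot wire/pin (from der8auer's measurement)

for label, contact_resistance in [("good contact (2 mΩ)", 0.002),
                                  ("degraded contact (10 mΩ)", 0.010)]:
    heat = WIRE_CURRENT ** 2 * contact_resistance
    print(f"{label}: ~{heat:.1f} W dissipated right at that one junction")
# ~1.0 W vs ~4.8 W, concentrated in a tiny plastic housing, which is why a few
# extra milliohms per added connection matter at these currents.
```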
Why? Products fail all the time. Consumers who spend entire paychecks and stand in line for hours to buy the newest product don't have the right to complain that the product hasn't been vetted yet. Spend your money on something important, not some trendy new gadget for your gaming PC. You can wait one year until the product has been tested and vetted, or until it's been pulled from shelves because it was faulty to begin with, and the price will have come down. Your assertion that "two is too many" suggests that these consumers should be protected from their own poor spending decisions, and it's wrong to do so.
The cards aren’t affected*; it’s the custom cables people are using.
I wonder why he didn’t do a temp test without the extensions and with just Nvidia’s adapter. Those parts aren’t hot and haven’t caused any issues in any reports so far; there’s ALWAYS been a custom extra cable involved. Increasing the length of cables increases power draw, it’s the basics of electricity. So yeah… increasing cable length and adding extra connections will make it waste more power and draw more… it’s not surprising it’s only happening on systems with the extra cable length and connections that cause loss.
Also, the headline is misleading: he’s blaming the cards, when the only pieces that have ever melted are custom, non-approved parts.