Maximum Wattage for 6pin and 8pin PCIe Cables
Everywhere I look online it says 6pin PCIe can carry only 75W and the 8pin PCIe is maxed out at 150W. Why do you claim the PCIe cables you carry can handle a higher wattage?
We have been contacted, politely and not-so-politely, about this ever since we started selling crypto mining equipment in 2015. It's long overdue for us to put the matter of PCIe maximum wattage to rest.
Wattage Limitation as Publicly Rated
PCI-SIG, the standards body behind the 6pin and 8pin PCIe connector designs, charges an arm and a leg for access to its full specifications. As few people can afford to be privy to that information, businesses go by the official public rating: 75W for 6pin and 150W for 8pin. PCI-SIG publishes these numbers for legal and liability reasons, not because they are tested hard limits; they state that drawing more than 75W or 150W, respectively, is at your own risk. That said, how many companies have invested the time and money to verify that the stated limits aren't massively underquoted? I assume very few have found it a worthwhile endeavor.
As the GPU manufacturers have surely noticed, it makes very little electrical sense that turning a 6pin into an 8pin would somehow double the maximum wattage the cable can carry. The two added pins are a ground and a sense wire; the three 12V supply wires that actually deliver the power are identical in both connectors, so the cable's current-carrying capacity on the 12V side doesn't change at all. Consequently, we know 75W/150W is not a hard limitation based on that fact alone. Below are some GPUs that don't follow PCI-SIG's wattage guidelines as further examples:
- the RX 480 can draw up to 180W from a single 6pin
- the 8GB version of the AMD Radeon RX 480 draws 150W from a single 6pin
- the R9 390X draws 290W from a 6pin and an 8pin
- the RTX 3070 draws 220W from a single 8pin
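For a rough sense of the headroom involved, here is a quick sketch comparing each card's draw against its cables' official rating. It is a simplification: it treats the full draw as coming over the cables and ignores the up-to-75W a PCIe x16 slot can also supply.

```python
# Simplified illustration: listed draw vs. PCI-SIG's official cable ratings.
# Ignores the up-to-75W the PCIe slot itself can supply to the card.
OFFICIAL_RATING = {"6pin": 75, "8pin": 150}

cards = [
    ("RX 480 (8GB)", 150, ["6pin"]),
    ("R9 390X", 290, ["6pin", "8pin"]),
    ("RTX 3070", 220, ["8pin"]),
]

for name, draw_w, connectors in cards:
    rated = sum(OFFICIAL_RATING[c] for c in connectors)
    print(f"{name}: draws {draw_w}W, cables officially rated {rated}W "
          f"({draw_w - rated:+d}W vs. rating)")
```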
Further, when stating the 75/150W limit, PCI-SIG makes zero mention of the quality of the build materials, copper versus aluminum, solid copper versus stranded copper, or the American Wire Gauge (AWG) size. Those are all crucial pieces of information when trying to calculate the maximum safe wattage. They leave out too many important variables in their official rating for it to be a hard and fast rule.
Wattage Limitation in Practice
Our high-quality stranded-copper 16AWG PCIe cables (both the 6pin and 6+2pin) are tested and rated for up to 350W, while the thinner 18AWG splitters are tested and rated for up to 250W. (I personally prefer to only use the splitters with risers.)
- The short-run amp rating for 16AWG wire is 13A. With three pairs of 12V wires in our cables, the calculation becomes: (13A x 12volt) x 3 = 468W maximum per PCIe cable.
- The math for the 18AWG splitters is the same, just with 18AWG's lower short-run amp rating: (Amp_Rating x 12volt) x 3 = Maximum watt per PCIe cable.
To be safe, we recommend using only 80% of the above results, so further multiply by 0.80.
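The arithmetic above can be wrapped in a small sketch. The 13A figure for 16AWG, the three wire pairs, and the 80% safety derate all come from this section; the function name is just for illustration.

```python
def max_safe_wattage(amp_rating, pairs=3, voltage=12.0, derate=0.80):
    """Recommended maximum wattage for a PCIe cable.

    amp_rating: short-run amp rating of one conductor (e.g. 13A for 16AWG)
    pairs:      number of 12V wire pairs in the cable (3 for PCIe)
    derate:     safety factor -- we recommend using only 80% of the raw result
    """
    raw = amp_rating * voltage * pairs
    return raw * derate

# 16AWG: (13A x 12V) x 3 = 468W raw, x 0.80 safety factor = 374.4W
print(max_safe_wattage(13))
```

The 374.4W result is comfortably above the 350W we actually rate the 16AWG cables for, which is the point: the rating is set by testing and real-world use, below what the raw math allows.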
Mathematically, our cables should be able to handle even higher wattage. However, through testing and real-world usage, we know that 350W and 250W are the respective upper limits for safe, long-term use of our cables. The PCIe ports on our breakout boards can handle those loads. Related side note: the ports on our boards will show cosmetic discoloration when they continuously push over 300W long-term.
When I reviewed the American Wire Gauge (AWG) charts from multiple sources, including Wikipedia, for the maximum amperage copper cables can carry, I noticed that 18AWG cables are capped at 16A! There is conflicting data on whether 16AWG cables are capped at 22A or 18A. Taking the higher figure, a 16AWG PCIe cable of the highest-caliber build materials could theoretically carry (22A x 12volt) x 3 = 792W under an ideal usage environment. That's a far cry from PCI-SIG's stated 75/150W limits!
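Plugging those chart amperages into the same three-pair formula gives the theoretical ceilings below. This is a back-of-the-envelope sketch under ideal conditions, not a recommendation.

```python
# Theoretical upper bounds from AWG chart amp ratings (ideal conditions,
# highest-caliber build materials -- NOT a safe operating rating).
PAIRS, VOLTAGE = 3, 12

for awg, amps in [(18, 16), (16, 22)]:
    print(f"{awg}AWG at {amps}A: {amps * VOLTAGE * PAIRS}W theoretical max")
```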
Final Words on PCIe Maximum Wattage Limits
The ultimate test for PCIe cables is to use them. If a cable is overloaded beyond what it can handle, it will get hot! If the cable is merely warm, not hot, it is comfortably handling the current passing through it.
At the end of the day, we leave it up to you to decide if you think a PCIe cable is too warm to use. We're not here to talk you into using components that you feel uncomfortable with; we're here to answer questions so you make a well-informed decision about what is best for your needs. So, please, don't ever hesitate to reach out to us.