PCIe Gen5 12VHPWR power connection to have 150W, 300W, 450W, and 600W outputs

Is the industry just following USB's example of making this stupid as hell? One connector but many possible outputs is a terrible idea. It's one connector, and it should have to do everything by design. Make it a 600W minimum standard and be done with it. Stop duping people with fake/deceptive specs. I can see it already - "PCIe Gen 5 compliant!" - with 150W connectors/cables and the poor guy/girl with a 600W GPU.
450W and 600W cables just to power up GPUs seem excessive to me... Companies should be prioritizing energy efficiency, not the other way around.
kcajjones:

Is the industry just following USB's example of making this stupid as hell? One connector but many possible outputs is a terrible idea. It's one connector, and it should have to do everything by design. Make it a 600W minimum standard and be done with it. Stop duping people with fake/deceptive specs. I can see it already - "PCIe Gen 5 compliant!" - with 150W connectors/cables and the poor guy/girl with a 600W GPU.
Agreed. While the wattage label on the connector helps, not everyone is going to know the wattage of a GPU. With the old connectors, you didn't have to know; it either fit or it didn't. This will also either drive up the cost of cheap PSUs, or they could be a potential hazard. The average consumer is an idiot. "hUrR dUrR it fits and the computer powers on so surely it works!" except when they try to put the GPU under any load. I expect a lot of products being unnecessarily returned because of this. Why couldn't this just be a 350W connector? Nothing more, nothing less. The PSU must deliver that wattage or else it shouldn't be included. At 350W, it doesn't have any overlap with the old connectors, so each one is still distinctly different. 350W is pretty much the upper limit of what can be dissipated in a dual-slot HSF card. If you need any more than 350W for a GPU, it's a stupid product, so just use a 2nd connector.
schmidtbag:

Agreed. While the wattage label on the connector helps, not everyone is going to know the wattage of a GPU. With the old connectors, you didn't have to know; it either fit or it didn't. This will also either drive up the cost of cheap PSUs, or they could be a potential hazard. The average consumer is an idiot. "hUrR dUrR it fits and the computer powers on so surely it works!" except when they try to put the GPU under any load. I expect a lot of products being unnecessarily returned because of this. Why couldn't this just be a 350W connector? Nothing more, nothing less. The PSU must deliver that wattage or else it shouldn't be included. At 350W, it doesn't have any overlap with the old connectors, so each one is still distinctly different. 350W is pretty much the upper limit of what can be dissipated in a dual-slot HSF card. If you need any more than 350W for a GPU, it's a stupid product, so just use a 2nd connector.
A single cable is a better solution than some 320W GPU that comes with 3 power connectors. Are you gonna use 3 separate PCIe power cables for this (no piggyback)?
[image: small_red-devil-6900-xt-power.jpg - Red Devil 6900 XT with three 8-pin power connectors]
schmidtbag:

Agreed. While the wattage label on the connector helps, not everyone is going to know the wattage of a GPU. With the old connectors, you didn't have to know; it either fit or it didn't. This will also either drive up the cost of cheap PSUs, or they could be a potential hazard. The average consumer is an idiot. "hUrR dUrR it fits and the computer powers on so surely it works!" except when they try to put the GPU under any load. I expect a lot of products being unnecessarily returned because of this. Why couldn't this just be a 350W connector? Nothing more, nothing less. The PSU must deliver that wattage or else it shouldn't be included. At 350W, it doesn't have any overlap with the old connectors, so each one is still distinctly different. 350W is pretty much the upper limit of what can be dissipated in a dual-slot HSF card. If you need any more than 350W for a GPU, it's a stupid product, so just use a 2nd connector.
The most stupid thing is to make a GPU that needs 600W when the whole world is working hard to reduce power use everywhere (except crypto farmers)... Oops... my bad... those GPUs might be made for them, at least for those that are left: if you want a sports car, a luxury sofa or even a well-placed house (everything that is bling bling), right now they sell everything again...
schmidtbag:

Agreed. While the wattage label on the connector helps, not everyone is going to know the wattage of a GPU. With the old connectors, you didn't have to know; it either fit or it didn't. This will also either drive up the cost of cheap PSUs, or they could be a potential hazard. The average consumer is an idiot. "hUrR dUrR it fits and the computer powers on so surely it works!" except when they try to put the GPU under any load. I expect a lot of products being unnecessarily returned because of this. Why couldn't this just be a 350W connector? Nothing more, nothing less. The PSU must deliver that wattage or else it shouldn't be included. At 350W, it doesn't have any overlap with the old connectors, so each one is still distinctly different. 350W is pretty much the upper limit of what can be dissipated in a dual-slot HSF card. If you need any more than 350W for a GPU, it's a stupid product, so just use a 2nd connector.
That might be true, but the problem is that industries are trying to take advantage of their ignorance instead of helping them by making things easier. A normal person shouldn't be forced to spend hours on forums and tech sites in order to make a simple and educated guess about what to buy... And the same way some guys are ignorant about tech-related stuff, I'm also ignorant about other stuff, and I also don't like to be confused about products that should all be the same but instead have some small differences that in the end make all the difference!...
Krizby:

A single cable is a better solution than some 320W GPU that comes with 3 power connectors. Are you gonna use 3 separate PCIe power cables for this (no piggyback)?
Did you not read what I said? I said this connector should be strictly 350W, because it would solve issues precisely like the example you provided. I'm fine with the connector existing. What I don't like is how it's going to 450W+ and supports such a wide range of wattages.
schmidtbag:

Did you not read what I said? I said this connector should be strictly 350W, because it would solve issues precisely like the example you provided. I'm fine with the connector existing. What I don't like is how it's going to 450W+ and supports such a wide range of wattages.
Why would you want 2 cables when 1 cable with thicker wiring can do the job? The current PCIe spec says each 8-pin PCIe connector is rated for 150W, but in fact a high quality PCIe cable can handle >300W, so piggybacking is perfectly safe (using 2x 8-pin connectors from the same cable). So yeah, the new PCIe spec just makes it clearer by specifying the max wattage per cable. GPU manufacturers can just put an advisory "make sure you have a 450W or above power cable for this GPU" instead of people having to use 2 or 3 power cables unnecessarily (sure it's safer, but it's a clunky looking setup).
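To put rough numbers on that headroom argument, here is a back-of-the-envelope sketch; the per-pin current figures are assumed typical terminal ratings, not values taken from the spec:

```python
# Back-of-the-envelope only: why a well-built 8-pin PCIe cable has headroom
# beyond its official 150W rating, and roughly where 600W comes from for 12VHPWR.
# Assumptions: 3 live +12V pins on an 8-pin connector, 6 on 12VHPWR, and
# per-pin terminal ratings somewhere around 8-9.5A depending on crimp and wire gauge.

RAIL_VOLTAGE = 12.0   # volts on the +12V rail

def electrical_capacity(live_pins: int, amps_per_pin: float) -> float:
    """Raw electrical capacity in watts, with no safety margin applied."""
    return live_pins * amps_per_pin * RAIL_VOLTAGE

for amps in (8.0, 9.5):
    print(f"8-pin   @ {amps}A/pin: {electrical_capacity(3, amps):.0f}W (rated 150W)")
    print(f"12VHPWR @ {amps}A/pin: {electrical_capacity(6, amps):.0f}W (rated up to 600W)")
```

The large gap between the 8-pin's 150W rating and its raw electrical capacity is exactly why a piggybacked 2x 8-pin lead on a quality cable tends to be fine in practice.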
kcajjones:

Is the industry just following USB's example of making this stupid as hell? One connector but many possible outputs is a terrible idea. It's one connector and it should have to do everything by design. Make it 600w minimum standard and be done with it. Stop duping people with fake/deceptive specs. I can see it already - "PCIE gen 5 compliant!" - with 150w connectors/cables and the poor guy/girl with a 600w gpu.
Assuming you still want the same wattages, you would be stuck with 4 different connectors though, and your PSU either has to have a whole load of extra cables, or adapters - and adapters are generally not ideal, as any connection point is a point of failure, and also of loss. And just enforcing support for the full 600W obviously doesn't work, or you are automatically killing all PSUs with lower total power. Even if you enforce only 350W as suggested above, it still means you need another low-power connector, as 350W is still too much for e.g. a 500W PSU, which is plenty for a small system with something like a 150W GPU in it.
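A minimal sketch of that sizing problem, using made-up system numbers rather than anything from the spec:

```python
# Hypothetical example: a connector tier only makes sense if the PSU can back
# its full rating on top of whatever the rest of the system draws.

def connector_fits(psu_watts: float, connector_watts: float,
                   rest_of_system_watts: float = 200.0) -> bool:
    """True if the PSU budget covers the connector's rating plus the rest of the build."""
    return connector_watts + rest_of_system_watts <= psu_watts

print(connector_fits(500, 600))  # False: a mandatory 600W tier excludes a 500W PSU outright
print(connector_fits(500, 350))  # False: even a single 350W tier is too much for that unit
print(connector_fits(500, 150))  # True: the low-power tier is the one that still fits
```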
Krizby:

Why would you want 2 cables when 1 cable with thicker wiring can do the job? The current PCIe spec says each 8-pin PCIe connector is rated for 150W, but in fact a high quality PCIe cable can handle >300W, so piggybacking is perfectly safe (using 2x 8-pin connectors from the same cable).
Again: are you not reading what I said? I'm saying it should be a single 350W connector.
schmidtbag:

Again: are you not reading what I said? I'm saying it should be a single 350W connector.
Your particular usage scenario might not fit others; there are plenty of AIB 6900 XT and 3090 cards out there that already use >420W at stock settings.
Wait, this connector is actually interesting: lower-tier GPUs that get modded will no longer run into a power bottleneck from only having one 6-pin, because they could suck up to 600 watts with just one of these connectors. You just need to ground the 2 sense pins. Quite exciting.
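For anyone curious how that works, here is a rough sketch of the sideband signalling as I understand it; double-check the exact SENSE0/SENSE1 table against the PCIe CEM 5.0 / ATX 3.0 documents before actually bridging anything:

```python
# Sketch of 12VHPWR sense-pin signalling (my recollection of the table, not a
# verified copy of the spec): each sense pin is either left open or tied to
# ground, and the combination advertises the power limit the cable/PSU supports.

OPEN, GND = "open", "ground"

# (SENSE0, SENSE1) -> advertised power limit in watts.
SENSE_TO_WATTS = {
    (OPEN, OPEN): 150,   # both floating: lowest tier
    (GND,  OPEN): 450,   # mixed rows: 450W vs 300W may be swapped -- verify against the spec
    (OPEN, GND):  300,
    (GND,  GND):  600,   # both grounded: full 600W, the mod described above
}

def advertised_limit(sense0: str, sense1: str) -> int:
    """Power limit the card is allowed to assume, given the two sense-pin states."""
    return SENSE_TO_WATTS[(sense0, sense1)]

print(advertised_limit(GND, GND))   # 600 -- "just ground the 2 sense pins"
```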
Krizby:

Your particular usage scenario might not fit others; there are plenty of AIB 6900 XT and 3090 cards out there that already use >420W at stock settings.
I'm well aware. I think such situations are stupid and should be avoided. If you need more than 350W, then you should use additional power connectors. If a single connector for 450W+ systems is really that desirable, then make a new connector. Chip manufacturers should be motivated to attain performance levels within a certain power envelope. If they just say "screw it" or move the goal posts, that doesn't incite innovation. It promotes laziness of the engineers and it's a copout to whichever company claims to have the fastest GPU. A GPU being the fastest isn't impressive when it consumes more power than every other electronic device in the room combined.
schmidtbag:

I'm well aware. I think such situations are stupid and should be avoided. If you need more than 350W, then you should use additional power connectors. If a single connector for 450W+ systems is really that desirable, then make a new connector. Chip manufacturers should be motivated to attain performance levels within a certain power envelope. If they just say "screw it" or move the goal posts, that doesn't incite innovation. It promotes laziness of the engineers and it's a copout to whichever company claims to have the fastest GPU. A GPU being the fastest isn't impressive when it consumes more power than every other electronic device in the room combined.
Well, if the 600W GPU has the same efficiency (or FPS/watt) as the 300W one, why should you care? I can't handle a >400W GPU either because I live in a tropical climate, but who am I to force others not to use a >400W GPU when they live in Alaska, for example 😀? Hell, I have seen lots of "enthusiast" 4-GPU SLI/Xfire setups that use 1000W in total in the past.
Krizby:

Well, if the 600W GPU has the same efficiency (or FPS/watt) as the 300W one, why should you care? I can't handle a >400W GPU either because I live in a tropical climate, but who am I to force others not to use a >400W GPU when they live in Alaska, for example 😀? Hell, I have seen lots of "enthusiast" 4-GPU SLI/Xfire setups that use 1000W in total in the past.
I personally hate it, but to each their own. I've also done SLI/Xfire, and living in Texas during the summer it would suck. I needed an air conditioner added to my room to make it bearable. I'm not looking to do that again, so I plan to keep the GPUs under 250 watts going forward.
A fine solution to a problem we didn't really have.
Stairmand:

A fine solution to a problem we didn't really have.
Except the problem has existed for years; you've just been ignorant of it.
Krizby:

Well, if the 600W GPU has the same efficiency (or FPS/watt) as the 300W one, why should you care? I can't handle a >400W GPU either because I live in a tropical climate, but who am I to force others not to use a >400W GPU when they live in Alaska, for example 😀? Hell, I have seen lots of "enthusiast" 4-GPU SLI/Xfire setups that use 1000W in total in the past.
It isn't the same efficiency. Processor efficiency is not linear with clock speeds. They tend to really lose efficiency once you push them to a certain point; they also lose efficiency when you underclock them to a certain point. Architectures have different "sweet spots" for efficiency. That's why Intel struggles to compete with ARM's performance-per-watt and vice versa. They both have the same exact problem, except x86 scales down poorly and ARM scales up poorly. If you want a 400W+ GPU, fine, but the underlying point is standards should not be bent to satisfy either a niche need or to be an excuse for lazy product releases. It's not that big of a deal to have a 2nd power connector. It is a big deal when companies make excuses to ignore power consumption. The quad SLI/Xfire setups would require many power connectors too, because they're all separate GPUs. Clearly, enthusiasts didn't have much of a problem with that back then.
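A toy model of that sweet-spot behaviour; the constants and the voltage/frequency curve below are invented purely for illustration, not measured from any real chip:

```python
# Illustrative only: dynamic power scales roughly with C * V^2 * f, and higher
# clocks need more voltage, so performance per watt peaks at a "sweet spot"
# and falls off on both ends (static/idle power dominates at very low clocks).

STATIC_W = 5.0          # made-up fixed power (leakage, memory, fans, ...)
C = 10.0                # made-up capacitance-like constant

def total_power(freq_ghz: float) -> float:
    voltage = 0.7 + 0.25 * freq_ghz           # assumed voltage/frequency curve
    return STATIC_W + C * voltage**2 * freq_ghz

for f in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
    p = total_power(f)
    print(f"{f:.1f} GHz: {p:6.1f} 'W', {f / p:.4f} perf per watt")
```

With these made-up numbers, perf per watt peaks around 1.0 GHz and drops off both when underclocking (fixed power dominates) and when chasing higher clocks (voltage-driven power grows much faster than performance).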
schmidtbag:

It isn't the same efficiency. Processor efficiency is not linear with clock speeds. They tend to really lose efficiency once you push them to a certain point; they also lose efficiency when you underclock them to a certain point. Architectures have different "sweet spots" for efficiency. That's why Intel struggles to compete with ARM's performance-per-watt and vice versa. They both have the same exact problem, except x86 scales down poorly and ARM scales up poorly. If you want a 400W+ GPU, fine, but the underlying point is standards should not be bent to satisfy either a niche need or to be an excuse for lazy product releases. It's not that big of a deal to have a 2nd power connector. It is a big deal when companies make excuses to ignore power consumption. The quad SLI/Xfire setups would require many power connectors too, because they're all separate GPUs. Clearly, enthusiasts didn't have much of a problem with that back then.
Nah, as long as it's a monolithic design, a bigger GPU will have almost the same efficiency as a smaller chip from the same uarch. Like here:
[image: energy-efficiency.png - GPU energy efficiency (performance per watt) comparison chart]
Except for Ampere (well, due to how inefficient GDDR6X is), the biggest chips like TU102 and Navi 21 actually have superior efficiency compared to smaller chips. Also, GPU power consumption is not whole-PC power consumption, which is the only thing that matters. Say a PC with a 3080 that uses 500W total is 30% faster than one with a 3070 that uses 400W; can you really say that the PC with the 3080 is inefficient? If you worry about power consumption all that much, maybe consider switching to a Steam Deck?
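For reference, the arithmetic behind that 3080-vs-3070 example, using the hypothetical whole-PC wattages from the post:

```python
# Whole-system performance per watt, with performance normalised to the 3070 box.
systems = {
    "3070 PC": {"watts": 400, "perf": 1.0},
    "3080 PC": {"watts": 500, "perf": 1.3},   # 30% faster for 100W more at the wall
}

for name, s in systems.items():
    print(f"{name}: {s['perf'] / s['watts'] * 1000:.2f} relative perf per kW")
# 3070 PC: 2.50, 3080 PC: 2.60 -- the faster box is marginally MORE efficient at the wall here.
```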
This is getting silly now!