AMD announces driver that reduces RX 480 PCIe power usage

But they tout the card as having a 150W TDP; shouldn't it perform within that envelope and not exceed it unless absolutely necessary? Otherwise they should have given it a 170W TDP.
But they tout the card as having a 150W TDP; shouldn't it perform within that envelope and not exceed it unless absolutely necessary? Otherwise they should have given it a 170W TDP.
When you have a dud and you need to present it as the new cure for cancer, you start hyping unrealistic features. Sometimes it works, sometimes it backfires. I would very much like to see the "reduced performance" numbers. Hope Hilbert will update the review (with at least 1-2 additional tests) once this new driver emerges.
Cheaper than a recall.
People can't reproduce what was happening, and running 4K on this card seems kind of silly because it can't run 4K anyway. How does performance increase with less power if it's not throttling, I wonder. I read that performance per watt increased 20% while maintaining the same performance. I would like to see how close it is now to Nvidia's 16nm FinFET in efficiency, now that it's closed the gap.
But they tout the card as having a 150W TDP; shouldn't it perform within that envelope and not exceed it unless absolutely necessary? Otherwise they should have given it a 170W TDP.
Since when has any company ever given realistic TDP estimates? Ever seen Intel's estimates? Sometimes they're almost uselessly vague.
Since when has any company ever given realistic TDP estimates? Ever seen Intel's estimates? Sometimes they're almost uselessly vague.
TDP is not even a measurement of total power used anyway. It is "Thermal Design Power", or rather, how much power is dissipated as heat. https://en.wikipedia.org/wiki/Thermal_design_power So those complaining that it uses more than 150W and referencing a stated TDP don't know what they are talking about anyway.
TDP is not even a measurement of total power used anyway. It is "Thermal Design Power", or rather, how much power is dissipated as heat. https://en.wikipedia.org/wiki/Thermal_design_power So those complaining that it uses more than 150W and referencing a stated TDP don't know what they are talking about anyway.
Thank you. I've gotten to the point that when I see people talking about TDP as power draw, I don't even correct them, because I spend all day doing it and it wouldn't help. And to another poster's comment: they said the new driver is supposed to add a 3% performance boost, so it would probably negate any performance loss from lowering the power, if you choose to do that (or you can keep the power and add the 3%).
Thank you. I've gotten to the point that when I see people talking about TDP as power draw, I don't even correct them, because I spend all day doing it and it wouldn't help. And to another poster's comment: they said the new driver is supposed to add a 3% performance boost, so it would probably negate any performance loss from lowering the power, if you choose to do that (or you can keep the power and add the 3%).
Sometimes I wonder if it is indeed difficult for some people to come to the conclusions you outlined, or if they are somehow driven by base motives to play dumb.
From what I hear they undervolt great. AMD always gives more voltage than what's needed, a little too much really. Like my R9 290: I could reduce that by quite a bit and still overclock. So undervolt the GPU to reduce power consumption, which reduces temps, and technically boost should reach higher speeds. Still have to see it in action though, because I doubt they can reduce the voltage by that much, but who knows.
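A rough sketch of why undervolting helps at all: dynamic power in CMOS logic scales roughly with V² × f, so even a small voltage drop at the same clock pays off. The voltages and wattages below are illustrative assumptions, not measured RX 480 figures.

```python
# Back-of-the-envelope: dynamic power ~ V^2 * f.
# All numbers here are hypothetical, for illustration only.

def dynamic_power(base_power_w, v_base, v_new, f_base, f_new):
    """Scale a baseline dynamic power by the classic V^2 * f relationship."""
    return base_power_w * (v_new / v_base) ** 2 * (f_new / f_base)

# Assume ~120 W of dynamic power at 1.15 V stock, undervolted to 1.075 V
# at the same 1266 MHz clock.
stock = 120.0
undervolted = dynamic_power(stock, 1.15, 1.075, 1266, 1266)
print(f"~{undervolted:.0f} W after undervolt "
      f"({100 * (1 - undervolted / stock):.0f}% saving)")
```

With these assumed numbers, a ~6.5% voltage drop cuts dynamic power by roughly 13%, which is why modest undervolts can move a card meaningfully within its power budget.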
Thank you. I've gotten to the point that when I see people talking about TDP as power draw, I don't even correct them, because I spend all day doing it and it wouldn't help.
Am I wrong to assume that when TechPowerUp measured 165W average draw from the combined PCIe slot and 6-pin, they were talking about how much the card was drawing? They were instead talking about how much heat was being dissipated? Like, yeah, we get it, TDP is thermal design power; it's been mentioned on this forum plenty of times. Regardless of how people are using the term, the RX 480 still typically draws 165W in gaming, while the GTX 1070 typically draws 145W and yet is 40% faster.

And it wouldn't really be that big of an issue, except that AMD is playing weird marketing games with the TDP numbers, differentiating TDP and GPU core power, plus the whole 2.8x perf/W claim. The Fury Nano typically draws 185W and is faster than the RX 480, so how they got 2.8x perf/W is beyond me; probably some specific tessellation-based benchmark.

I mean, Polaris was marketed for the past 6 months as some breakthrough in power for AMD. It's pretty ironic that the chip ships with a power problem, especially when they had hardware as early as March, or whenever they ran that Hitman demo on it.

And it's not like this is one-sided. The same questions about TDP and perf/W numbers came up with the release of both Maxwell and Pascal. The only difference is Nvidia actually puts the metrics they are measuring with on the side of their graphs; with Pascal it was something like HGEMM/W, a number no gamer cares about. Unfortunately, most of the tech press and forum users just see a graph and run to the forums to post their nonsense, good or bad.
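The perf/W comparison in the post above is simple arithmetic; here is a sketch of it using exactly the numbers quoted (165W RX 480 as the baseline, 145W GTX 1070 at ~1.4x the performance). These are the thread's figures, not independent measurements.

```python
# Relative performance-per-watt from the numbers quoted in the thread.

def perf_per_watt(relative_perf, power_w):
    """Performance per watt, with performance expressed relative to a baseline."""
    return relative_perf / power_w

rx480 = perf_per_watt(1.0, 165)    # baseline card, 165 W typical gaming draw
gtx1070 = perf_per_watt(1.4, 145)  # ~40% faster at 145 W typical draw
print(f"GTX 1070 perf/W advantage: {gtx1070 / rx480:.2f}x")
```

With those inputs the 1070 comes out around 1.6x the perf/W, which is the gap the poster is pointing at when questioning the marketing numbers.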
Guilty as charged, your honor. We have to watch these big corporations like a hawk or they'd slip in a lot more deception if they thought they could get away with it. Luckily for us, people are alert and watching, and will not allow this behavior to escape unpunished. It boggles my mind how incompetent these companies are sometimes. Just a tiny mistake from AMD could lead to a loss in sales and more hatred towards them; it's almost like self-harm over small possible mistakes in power compliance. Surely the cost of an 8-pin over the 6-pin is minimal, but AMD was never going to put an 8-pin on there no matter what, because that would mean being truthful about the real TDP and not the fake TDP of 150W.
Guilty as charged, your honor. We have to watch these big corporations like a hawk or they'd slip in a lot more deception if they thought they could get away with it. Luckily for us, people are alert and watching, and will not allow this behavior to escape unpunished. It boggles my mind how incompetent these companies are sometimes. Just a tiny mistake from AMD could lead to a loss in sales and more hatred towards them; it's almost like self-harm over small possible mistakes in power compliance. Surely the cost of an 8-pin over the 6-pin is minimal, but AMD was never going to put an 8-pin on there no matter what, because that would mean being truthful about the real TDP and not the fake TDP of 150W.
Or maybe the auto-calibration of the Vcore in some cards was a bit off, and everyone made a mess out of it.
TDP is not even a measurement of total power used anyway. It is "Thermal Design Power", or rather, how much power is dissipated as heat.
I'll answer that. 100% of the total electrical power consumed by a GPU or CPU is emitted as heat. In terms of energy, all energy consumed by a PC is dissipated as heat. Your PC does not store energy, does it? Energy in = energy out.

Oh, but what about the useful work, you say? Every arithmetical operation is nothing but a manipulation of states and electrical currents, and is therefore subject to the same ohmic heating. Every electron does no work other than moving under the effect of the EM force in an EM potential. In other words, your PC is no less efficient as a space heater than a proper space heater. Here is a crude but effective experiment demonstrating this: https://www.pugetsystems.com/labs/articles/Gaming-PC-vs-Space-Heater-Efficiency-511/ https://abload.de/img/pic_dispzquxd.jpg
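The "PC as space heater" point can be put in numbers: whatever a PC draws at the wall ends up as heat in the room. The unit conversion below is exact; the 300W load is just an example figure, not a measurement from the thread.

```python
# Every watt drawn at the wall becomes a watt of heat in the room.
# 1 W = 3.412142 BTU/h (exact unit conversion).

def watts_to_btu_per_hour(watts):
    """Convert an electrical load into its heat output in BTU/h."""
    return watts * 3.412142

load_w = 300  # hypothetical gaming load measured at the wall
print(f"{load_w} W -> {watts_to_btu_per_hour(load_w):.0f} BTU/h of heat")
```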
Thank you. I've gotten to the point that when I see people talking about TDP as power draw, I don't even correct them, because I spend all day doing it and it wouldn't help.
Same. But the exact opposite: I still try to correct those who don't see that the two are basically the same 🤓
Same. But the exact opposite: I still try to correct those who don't see that the two are basically the same 🤓
Your argument is entirely correct, except you're missing one small point. Thermal Design Power is not a measure of the energy that will be consumed; it is the maximum heat output that the card's cooling is expected to handle. But since you're not going to have FurMark running in the background 24/7, it is only a good reference point. It is very much a case of "your mileage may vary".
And it wouldn't really be that big of an issue, except that AMD is playing weird marketing games with the TDP numbers
That's the part that actually makes sense about this whole overblown debacle: they advertised it as a 150W card because of the 75W PCIe slot + 75W 6-pin. If they acknowledged the slightly higher 160-165W draw, they would in turn be obligated to put 8-pin connectors on the reference design. (Of course we know that a 6-pin GPU connection can handle a lot more than 75W, but that's another thing.) Considering they already pushed so hard to make sure this wasn't a paper launch by sending out some 4GB models with 8GB on board but disabled in BIOS, they obviously couldn't afford to revise the reference design this late in the game.
At over 80°C the power phases are very hot. The power FETs' junction temp will be approx. 10-15°C higher.
The VRMs are rated to at least 100 degrees; I think it's ~110 max, IIRC. Also, redistributing the 1:1 ratio slightly so that three of the six MOSFETs are working a bit harder isn't going to increase their temp by even 10 degrees. You can quote me on that.
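For a feel of the per-phase load being debated, here is a rough current calculation. The core power, core voltage, and phase count are assumed round numbers for illustration, not measurements from the reference board.

```python
# Rough per-phase current for a multi-phase VRM.
# All inputs are assumed figures, not RX 480 board measurements.

def per_phase_current(core_power_w, vcore, phases):
    """Average current per VRM phase: I = P / V, split evenly across phases."""
    return core_power_w / vcore / phases

# ~110 W of GPU core power at 1.15 V across 6 phases, evenly loaded...
even = per_phase_current(110, 1.15, 6)
# ...vs 3 of those phases carrying a 20% heavier share (60% of the total).
heavier = per_phase_current(110 * 0.6, 1.15, 3)
print(f"even split: {even:.1f} A/phase, skewed split: {heavier:.1f} A/phase")
```

Even with the skew, the per-phase delta is only a few amps, which is the poster's point about the temperature impact being small.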
But they tout the card as having a 150W TDP; shouldn't it perform within that envelope and not exceed it unless absolutely necessary? Otherwise they should have given it a 170W TDP.
Can you tell us what TDP stands for? And then think about the card having a blower which eats up to 20W and converts that energy into motion, not into heat. Edit: I see people here figured it out already. Sometimes I am proud of... anyway. The RX 480 has power delivery close to Fury X level. I am sure having a bit higher load on 3 sets of MOSFETs is not a problem, especially since I have already seen a card OCed to "max" with a +50% power limit and a modded vBIOS eating 260W.
I'll answer that. 100% of the total electrical power consumed by a GPU or CPU is emitted as heat. In terms of energy, all energy consumed by a PC is dissipated as heat. Your PC does not store energy, does it? Energy in = energy out. Oh, but what about the useful work, you say? Every arithmetical operation is nothing but a manipulation of states and electrical currents, and is therefore subject to the same ohmic heating. Every electron does no work other than moving under the effect of the EM force in an EM potential. In other words, your PC is no less efficient as a space heater than a proper space heater. Here is a crude but effective experiment demonstrating this: https://www.pugetsystems.com/labs/articles/Gaming-PC-vs-Space-Heater-Efficiency-511/ https://abload.de/img/pic_dispzquxd.jpg Same. But the exact opposite: I still try to correct those who don't see that the two are basically the same 🤓
Not really. TDP is used for determining cooling design. Intel lists TDP per CPU family, and the actual power draw can vary by over 50% within the same TDP category. I notice your chart doesn't show the TDP of the PC or the wattage of the heater, so it's pretty much a pointless comparison. Who doesn't know that electronics generate heat?
And if I go with that overclock.net MSI Afterburner tweak, it's just a matter of power reconfiguration: use more from the 6-pin instead of the PCIe slot, which can also easily be done at driver level. No loss in perf. Anyway, IMO all this was just some stunt to blacklist RX 480 sales.
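The rail-shifting idea in the thread reduces to simple arithmetic: keep the PCIe slot at or under its 75W spec by pushing the remainder of the board power onto the 6-pin. The ~165W total comes from the numbers quoted earlier; the split logic is an illustrative assumption, not the actual driver behavior.

```python
# Sketch of redistributing board power between the PCIe slot and the 6-pin.
# Total-draw figure is from the thread; the split model is an assumption.

PCIE_SLOT_LIMIT_W = 75.0

def split_load(total_w, slot_share):
    """Split total board power between slot and 6-pin by a given fraction."""
    slot = total_w * slot_share
    six_pin = total_w - slot
    return slot, six_pin

total = 165.0
# A 50/50 split puts the slot over its 75 W spec...
slot, six_pin = split_load(total, 0.5)
print(f"50/50: slot {slot:.1f} W (limit {PCIE_SLOT_LIMIT_W} W), 6-pin {six_pin:.1f} W")
# ...shifting the ratio keeps the slot in spec with ~10% headroom.
slot, six_pin = split_load(total, PCIE_SLOT_LIMIT_W / total * 0.9)
print(f"Rebalanced: slot {slot:.1f} W, 6-pin {six_pin:.1f} W")
```

Under this model the 6-pin ends up carrying well over its nominal 75W, which is exactly the "a 6-pin can handle a lot more than 75W" point made earlier in the thread.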
And if I go with that overclock.net MSI Afterburner tweak, it's just a matter of power reconfiguration: use more from the 6-pin instead of the PCIe slot, which can also easily be done at driver level. No loss in perf. Anyway, IMO all this was just some stunt to blacklist RX 480 sales.
Not really. AMD had to push the clock too high and apparently didn't do proper testing after that. Some samples have rather high leakage, which makes matters worse. While a minor issue, things can go bad if all the stars align. They wouldn't fix things if they didn't need fixing. Also, they will introduce a mode that keeps the power usage below 150W, and I assume it will enforce 75W for both connectors. That mode will lower the performance a bit, though. It's a good thing that AMD acted and made the fix to make sure things will work well. They handled this pretty well.