AMD: graphics cards with consumption above 700W TDP before 2025

Haha, very funny: he talks about Nvidia needing 1.2V for high clocks and AMD not needing that. But why then is my 6900XT running at 1.2V out of the box!? o_O
lowrider_05:

Haha, very funny: he talks about Nvidia needing 1.2V for high clocks and AMD not needing that. But why then is my 6900XT running at 1.2V out of the box!? o_O
I'd have to check my 6900XT, but I'm pretty sure I hit 2.6GHz on a fair bit less than 1.2V.
It doesn't matter if it's an Nvidia, AMD, or Intel GPU. 700 watts is absolutely a no-go for me. My RTX 3080, which can go up to 400 watts, is already bad enough, and I have undervolted my card. Definitely worth it: it saves between 50 and 120 watts of consumption and gives me more performance per watt, at the cost of slightly lower peak performance. But that again depends on how high a GPU clock you want and how low a voltage you go to. It's not only the electric bill, but also the pollution; we already see dramatic changes around the world on that front. So 700 watts is absolutely a no-go. It will also be a space heater like no card before it. I guess that would be good in cold environments, but in hot climates, yikes. I really hope his forecast isn't going to come true.
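As a rough illustration of the performance-per-watt math in that claim, here is a minimal sketch; the 400 W stock draw and the roughly 100 W saving are taken from the comment above, while the frame rates are assumed purely for illustration:

```python
# Hypothetical perf-per-watt comparison for an undervolted RTX 3080.
# The 400 W stock draw and ~100 W saving come from the comment above;
# the frame rates are made-up numbers, used only for illustration.

def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

stock = perf_per_watt(fps=100.0, watts=400.0)       # stock voltage and clocks
undervolted = perf_per_watt(fps=96.0, watts=300.0)  # ~4% slower, ~100 W saved

print(f"stock:       {stock:.3f} fps/W")            # 0.250 fps/W
print(f"undervolted: {undervolted:.3f} fps/W")      # 0.320 fps/W
print(f"efficiency gain: {(undervolted / stock - 1) * 100:.0f}%")  # 28%
```

With these assumed numbers, giving up 4% of peak performance buys a roughly 28% improvement in efficiency, which matches the general shape of the trade-off the comment describes.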
The summertime heat generation alone will keep me from ever considering this new era of simply not giving a shit about efficiency on high end cards. On that front, how are they planning to cool these power hungry beasts? AIO kits on all of them?
0blivious:

The summertime heat generation alone will keep me from ever considering this new era of simply not giving a crap about efficiency on high end cards. On that front, how are they planning to cool these power hungry beasts? AIO kits on all of them?
My Nvidia 7800 GTX in 2005 was my first water-cooled card, and I've water-cooled every one since. Modern cards are still OCable, but they are also very UVable. I can't remember the figures off the top of my head, but comparing my 6900XT at stock voltage and clocks vs. when OCed and UVed, there's a significant difference in performance and temps. It will hit 36C here this week; I can dial back my clocks a bit or turn on the aircon, either/or.
Sure, enthusiasts will water-cool them, but not everyone has the skill/inclination to do that. I'm just curious what the out-of-the-box solution will be from the OEMs.
0blivious:

Sure, enthusiasts will water-cool them, but not everyone has the skill/inclination to do that. I'm just curious what the out-of-the-box solution will be from the OEMs.
I wouldn't be surprised if cards with AIOs become way more common. I've no idea, but I wonder in what percentage of sales the buyer finds temps the biggest restriction on performance. Central Europe is warm for what, two months of the year at most, and most places that are commonly warm have aircon. It would be interesting to know.
So they're telling us to buy a PS5 Pro or a new XSX instead...
Isn't it the current that has a quadratic effect on consumption?
It's not about how I cool it. The issue is that if they sell all the cards they want to sell and everyone is gaming 8 hours a day, we are consuming a shit-ton of energy for gaming. I mean, some of you in the US may hate to hear this, but this stuff should be regulated when it's outside meaningful fields (research, industrial applications, and so on).
asturur:

Isn't it the current that has a quadratic effect on consumption?
It's the voltage, actually, that has the quadratic effect on power. Here is the full quote from Sam Naffziger:
We’ve driven the frequency up, and that is something unique to AMD. Our GPU frequencies are 2.5 GHz plus now, which is hitting levels not before achieved. It’s not that the process technology is that much faster, but we’ve systematically gone through the design, re-architected the critical paths at a low level, the things that get in the way of high frequency, and done that in a power-efficient way.

Frequency tends to have a reputation of resulting in high power. But in reality, if it’s done right, and we just re-architect the paths to reduce the levels of logic required, without adding a bunch of huge gates and extra pipe stages and such, we can get the work done faster.

If you know what drives power consumption in silicon processors, it’s voltage. That’s a quadratic effect on power. To hit 2.5 GHz, Nvidia could do that, and in fact they do it with overclocked parts, but that drives the voltage up to very high levels, 1.2 or 1.3 volts. That’s a squared impact on power. Whereas we achieve those high frequencies at modest voltages and do so much more efficiently.

We analyze our design pre-silicon, as we’re in the process of developing it, to assess that efficiency. We absolutely analyzed heavily the Nvidia designs and what they were doing, and of course targeted doing much better.

Sam Naffziger
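To make the quadratic relationship in that quote concrete, here is a minimal sketch using the first-order CMOS dynamic-power approximation P ≈ C·V²·f; the 1.2 V figure comes from the quote, while the 1.0 V "modest voltage" and the capacitance constant are assumptions for illustration:

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# The 1.2 V figure comes from the Naffziger quote; the 1.0 V "modest
# voltage" and the capacitance constant are assumed for illustration.

def dynamic_power(c_eff: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power: effective switched capacitance * V^2 * f."""
    return c_eff * volts ** 2 * freq_ghz

FREQ_GHZ = 2.5  # same 2.5 GHz target clock for both designs
C_EFF = 1.0     # effective switched capacitance, relative units (assumed)

modest = dynamic_power(C_EFF, volts=1.0, freq_ghz=FREQ_GHZ)  # assumed modest voltage
high = dynamic_power(C_EFF, volts=1.2, freq_ghz=FREQ_GHZ)    # 1.2 V, from the quote

# (1.2 / 1.0)^2 = 1.44: a 20% voltage increase costs ~44% more dynamic power
print(f"relative power at 1.2 V vs 1.0 V: {high / modest:.2f}x")
```

At a fixed clock, the 44% gap here is entirely the V² term; in practice, higher clocks usually demand higher voltage too, which compounds the cost.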
AIB cards usually have greater power consumption because they want higher clocks.
Well, I prefer to stay with sub-300W parts myself.
I guess the customer still has a choice... don't buy new cards if that TDP is an issue. At least with Nvidia, that seems to be the basic choice you have.
Why do people from AMD keep mentioning 25x20?! I have no idea if they are trying to tell us that we are dumb, or that they are...
It's nice to see the lack of people here saying "who cares? Electricity is cheap" and "so what if it uses a lot of power if it has better performance-per-watt?". Seems like they're finally understanding the underlying problems. There is hope yet that AMD and Nvidia will take the hint that this is an unreasonable and petty way to achieve top performance. The prices and power draw of GPUs are rapidly killing interest in PC gaming. Miners and scalpers contribute a lot toward this but we can't blame them entirely.
It'd be nice if we had more moderate clocks and power consumption. Time was, the highest-end GPU out there (the 5970) had a power draw of 300 W, WITH TWO GPUs ON THE CARD!!!
lowrider_05:

Haha, very funny: he talks about Nvidia needing 1.2V for high clocks and AMD not needing that. But why then is my 6900XT running at 1.2V out of the box!? o_O
The voltage needed and the voltage applied by AIBs aren't always the same thing...
schmidtbag:

It's nice to see the lack of people here saying "who cares? Electricity is cheap" and "so what if it uses a lot of power if it has better performance-per-watt?". Seems like they're finally understanding the underlying problems. There is hope yet that AMD and Nvidia will take the hint that this is an unreasonable and petty way to achieve top performance. The prices and power draw of GPUs are rapidly killing interest in PC gaming. Miners and scalpers contribute a lot toward this but we can't blame them entirely.
In general, I'm not too concerned with power consumption, given the market segment I usually reside in. But the power consumption of high-end cards starts to concern me when it means the power consumption of mid-range and budget cards is going to go up as well. If a high-end card is going to consume 700 watts, what's the chance of a mid-range card staying below 300 watts? I don't want to have to move down to entry-level cards to stay below 300 watts...