AMD: graphics cards with consumption above 700W TDP before 2025
lowrider_05
Haha, very funny. He talks about Nvidia needing 1.2V for high clocks and AMD not needing that. But why then is my 6900 XT running at 1.2V out of the box!? o_O
pegasus1
skylineGTR34
It doesn't matter if it's an Nvidia, AMD or Intel GPU. 700 watts is absolutely a no-go for me. My RTX 3080, which can draw up to 400 watts, is already bad enough, and I have undervolted my card. Definitely worth it: it saves between 50 and 120 watts and gives me more performance per watt, at a slightly lower peak performance. But that again depends on how high a GPU clock you want and how low a voltage you go to.
Not only for the electric bill, but also the pollution. We already see dramatic changes around the world on that front. So 700 watts is absolutely a no-go. It will also be a space heater like no card before it. I guess that would be good in cold environments, but in hot climates, yikes.
I really hope his forecast doesn't come true.
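The undervolting trade-off above (fewer watts, slightly lower peak performance, better efficiency) is easy to sanity-check with arithmetic. A minimal sketch, using purely illustrative numbers in the same ballpark as the figures quoted in the comment (an assumed 400 W stock draw, an assumed 320 W undervolted draw at 97% of stock framerate; none of these are measured values):

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Efficiency metric: frames per second per watt of board power."""
    return fps / watts

# Hypothetical operating points for an undervolting comparison:
stock = perf_per_watt(100.0, 400.0)       # 0.25 fps/W at stock settings
undervolted = perf_per_watt(97.0, 320.0)  # ~0.303 fps/W undervolted

gain = undervolted / stock - 1  # relative efficiency improvement
print(f"Efficiency gain: {gain:.0%}")  # ~21% better fps/W for ~3% less peak fps
```

So even a modest undervolt can buy a double-digit efficiency gain for a few percent of peak performance, which matches the general shape of the comment's claim.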
0blivious
The summertime heat generation alone will keep me from ever considering this new era of simply not giving a shit about efficiency on high end cards. On that front, how are they planning to cool these power hungry beasts? AIO kits on all of them?
pegasus1
0blivious
Sure, enthusiasts will water cool them, but not everyone has the skill/inclination to do that. I'm just curious what the out of the box solution will be from the OEMs.
pegasus1
Memorian
So they're telling us to buy a PS5 Pro or a new XSX instead...
asturur
Isn't it the current that has a quadratic effect on consumption?
asturur
It's not about how I cool it.
The issue is that if they sell all the cards they want to sell, and everyone is gaming 8 hours a day, we are consuming a shit ton of energy for gaming.
I mean, some of you in the US may hate to hear this, but this stuff should be regulated outside of meaningful fields (research, industrial applications and so on).
Horus-Anhur
cucaulay malkin
Are they really stupid?
https://tpucdn.com/review/amd-radeon-rx-6900-xt/images/clock-vs-voltage.png
1.175 V for 2500 MHz
pegasus1
Horus-Anhur
AIB cards usually have greater power consumption, because they want higher clocks.
Catspaw
Well, I prefer to stay with sub-300 W parts myself.
fantaskarsef
I guess the customer still has a choice: just don't buy new cards if that TDP is an issue. At least with Nvidia, that seems to be the basic choice you have.
Crazy Serb
Why do people from AMD keep mentioning 25x20?! I have no idea if they're trying to tell us that we're dumb, or that they are...
schmidtbag
It's nice to see the lack of people here saying "who cares? Electricity is cheap" and "so what if it uses a lot of power if it has better performance-per-watt?". Seems like they're finally understanding the underlying problems.
There is hope yet that AMD and Nvidia will take the hint that this is an unreasonable and petty way to achieve top performance. The prices and power draw of GPUs are rapidly killing interest in PC gaming. Miners and scalpers contribute a lot toward this but we can't blame them entirely.
BLEH!
It'd be nice if we had more moderate clocks and power consumption. Time was, the highest-end GPU out there (the 5970) had a power draw of 300 W, WITH TWO GPUs ON THE CARD!!!
sykozis