AMD: graphics cards with consumption above 700W TDP before 2025

https://forums.guru3d.com/data/avatars/m/250/250418.jpg
I won't go over 200W TDP for a GPU, ever. Summer here is hot and electricity is expensive. Make it efficient or gtfo; it just doesn't make sense otherwise. PS: My RX 580 is undervolted to 1.0V and clocked at 1250MHz, partly for temps/consumption and partly because these cards have a tendency to blow up and she's already artifacting at stock. Otherwise it works mostly fine, with much lower heat and power consumption.
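For a rough sense of what an undervolt like that saves: dynamic power scales roughly with frequency times voltage squared, so the minimal sketch below just applies that rule to the numbers above. The stock figures (about 1.15V at 1340MHz) are assumed reference RX 580 values for illustration, not measurements.

```python
# Minimal sketch, assuming dynamic power ~ C * f * V^2 (leakage and memory power ignored).
# Stock values below are assumed reference RX 580 figures, not measured data.
def dynamic_power_ratio(v_new, f_new, v_stock, f_stock):
    """Relative dynamic power versus stock for a given voltage/clock pair."""
    return (f_new / f_stock) * (v_new / v_stock) ** 2

ratio = dynamic_power_ratio(v_new=1.00, f_new=1250, v_stock=1.15, f_stock=1340)
print(f"~{(1 - ratio) * 100:.0f}% lower dynamic power than stock")  # ~29% with these assumed numbers
```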
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
What surprises me in general: isn't a smaller node supposed to save power? Yes, I'm wondering about Nvidia's jump first to Samsung 8nm and now to TSMC 5nm.
https://forums.guru3d.com/data/avatars/m/262/262085.jpg
700 watts? Wow. I have to keep an eye on my consumption now due to soaring electricity prices, so I have both CPU and GPU turned down to the minimum that doesn't hurt games. Stock is 1.068V at 1950MHz and uses 245W, but my clocks are set to 1875MHz at 0.836V for 113-150W power usage. The CPU (3700X) is locked to 4.00GHz at 1.125V, or 3.6GHz at 1V for light gaming. All savings help in these times; 700W is ridiculous.
data/avatar/default/avatar24.webp
I can see this for mining, where all they care about is hash rate per watt. If a single card has a great hash rate per watt but uses 700W, that's great, as it's more compact than needing several lower-power cards to do the same thing. For gamers who aren't making money off their cards it's a lot less compelling. Do you want a 350W card that runs the latest game at very high settings, or double that to a 700W card for ultra settings which, other than bragging rights, looks basically identical?
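To put toy numbers on that: at equal efficiency, one big card and several smaller ones draw the same total power for the same hash rate, so the only real win for the single card is density. The hash rates and wattages below are made up purely for illustration, not real card specs.

```python
# Toy comparison with made-up hash rates and wattages (not real card specs).
big_card   = {"hash_mh": 200, "watts": 700}   # hypothetical single 700 W card
small_card = {"hash_mh": 100, "watts": 350}   # hypothetical 350 W card at the same MH/W

cards_needed = big_card["hash_mh"] / small_card["hash_mh"]        # small cards to match hash rate
total_small_watts = cards_needed * small_card["watts"]
print(f"{cards_needed:.0f} small cards at {total_small_watts:.0f} W total "
      f"vs one card at {big_card['watts']} W for the same hash rate")
# At equal MH/W the electricity bill is identical; the single card only saves slots, risers and space.
```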
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
sykozis:

In general, I'm not usually too concerned with power consumption given the market segment I usually reside in. But the power consumption of high-end cards starts to concern me when it means the power consumption of mid-range and budget cards is going to go up as well. If a high-end card is going to consume 700 watts, what's the chance of a mid-range card being below 300 watts? I don't want to have to move down to entry-level cards to stay below 300 watts...
I agree, though as long as the architecture's performance-per-watt is improving and prices per performance tier don't keep going up, I don't necessarily think that's a problem yet. For example, let's say the RX 9900 XT is 700W and can play the latest AAA titles at max detail in 8K at 60FPS. A 9600 could maybe play the same games in 4K at 300W, and a 9300 could play in 1080p at 150W. I'd be fine with that, assuming the prices of those performance tiers don't go up much (as I've griped about before, 1080p GPUs have been sold for about $300 for the past 8 years). Honestly, the wide selection of GPUs we get per generation is pretty stupid anyway, considering the very subtle differences. Would you like a GPU that can reliably play in 1440p, or another one that can also play in 1440p at an extra 10FPS? So it'd be nice if each performance tier actually offered a compelling difference. Anyway, to your point: if the goalpost of what peak power draw should be keeps moving, soon enough we might find that a simple office GPU driving a few monitors draws more than 75W, which would be absurd. It doesn't help that a lot of gaming laptops are giant bricks consuming more power than a console. So, your concern is valid.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
fantaskarsef:

What surprises me in general: isn't a smaller node supposed to save power? Yes, I'm wondering about Nvidia's jump first to Samsung 8nm and now to TSMC 5nm.
They are saving power, and they're even doing further work to maximize chip density: https://developer.nvidia.com/blog/designing-arithmetic-circuits-with-deep-reinforcement-learning/ It's just that their approach has changed: it's now one generation per node, pushed to the limit of what a given node can take. The next gen will probably be made on 3nm/2nm.
https://forums.guru3d.com/data/avatars/m/72/72830.jpg
No chance in hell I'll even come close to 500W. I'd be OK with the EU stepping in to stop this "development" too.
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
From now on I will be sticking with a sub-300W GPU.
lowrider_05:

Haha, very funny: he talks about Nvidia needing 1.2V for high clocks and AMD not needing that. But why then is my 6900XT running at 1.2V out of the box!? o_O
You have an AMD RX 6900 XTXH LC... that's why it uses 1.2V out of the box. It's a water-cooled GPU with everything pushed to its limits. My 6800 XT uses 1.15V because I have manually overclocked it. At stock settings I can undervolt to 0.920V and still hit 2350MHz sustained. For 2500-2600MHz sustained I need 1.15V, and that's on a 7nm node, so on a newer, more efficient 5nm node I can easily see them hitting 3GHz+ at under 1.2V.
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Silva:

I won't go over 200W TDP for a GPU, ever. Summer here is hot and electricity is expensive. Make it efficient or gtfo; it just doesn't make sense otherwise. PS: My RX 580 is undervolted to 1.0V and clocked at 1250MHz, partly for temps/consumption and partly because these cards have a tendency to blow up and she's already artifacting at stock. Otherwise it works mostly fine, with much lower heat and power consumption.
That, and I will also stick to 60 FPS gaming; it helps with power draw, heat and noise too.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
The BHP of performance cars keeps going up, yes, as does performance. You can't expect to go up against your competitors with less power and lower performance with each generation of marque car. So many of you guys look at this only through the eyes of the end user, rather than of a business. I wonder how many people, and by that I mean the vast majority who just look at Reddit benchmarks, buy purely on a card's FPS. The vast majority, I'm sure, and they won't care about power draw if they see one manufacturer's card can run some game at 87.3fps against another that can 'only' get 86fps. Bigger numbers count when it comes to sales of anything. NV or AMD don't care about the views of anybody on here; they care about the bulk of consumers who just see that big numbers = better everything. Being kinder to the environment doesn't increase share prices or the biannual dividend, and whether or not you like it, that's the way it is.
https://forums.guru3d.com/data/avatars/m/225/225084.jpg
Man, I feel like I'm back in the old days when I used to be in game arcades spending all my money on games. If I game for 3 hours a day, I need to put an extra £30 a month into electricity. When I don't game for a while I really see a difference in costs. 6-12 months ago I would hardly see any difference, but that's not the case now. So it's like a pound a day extra, which doesn't sound like much, but remember that's just 2-3 hours of gaming.
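As a back-of-the-envelope check on that figure: a pound a day over a roughly three-hour session implies a fairly hefty average system draw. The unit rate of £0.34/kWh in the sketch below is an assumed UK-ish tariff, not the poster's actual bill.

```python
# Back-of-the-envelope check of "about a pound a day" for ~3 hours of gaming.
# The 0.34 GBP/kWh unit rate is an assumed tariff, not taken from the post above.
price_per_kwh = 0.34        # GBP per kWh, assumed
extra_per_day = 1.00        # GBP per day, roughly 30 GBP a month
hours_per_day = 3

kwh_per_day = extra_per_day / price_per_kwh
avg_watts = kwh_per_day / hours_per_day * 1000
print(f"{kwh_per_day:.1f} kWh/day -> ~{avg_watts:.0f} W average system draw while gaming")
# ~2.9 kWh/day and roughly 980 W with these assumptions, which shows how quickly big GPUs add up.
```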
https://forums.guru3d.com/data/avatars/m/224/224952.jpg
Power use is already silly, and this projection is ridiculous. They are restricting their own markets; much more effort needs to be put into keeping power use down. There's no way in hell I'm getting a 700W card, for many reasons. My current 450W card gets used at 350W (or less) and I consider even that excessive, especially in the summer, such that it doesn't get used until the room cools down at night. Things are moving back assward.
https://forums.guru3d.com/data/avatars/m/224/224952.jpg
Kiriakos-GR:

RTX 3000 came out and was advertised as a 3D developer workstation; only the youth think of it as a gamers' card.
Please can you stop posting rubbish.
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
Kiriakos-GR:

RTX 3000 came out and was advertised as a 3D developer workstation; only the youth think of it as a gamers' card.
That is bullshit. If you watch the RTX 3000 presentation with Jensen Huang, you see him talking a lot about game performance. The cards that are really for workstations are things like the RTX A6000.
https://forums.guru3d.com/data/avatars/m/259/259654.jpg
This is basically an admission that silicon is done.
data/avatar/default/avatar06.webp
You don't have to run stock. It's simple enough to lower the power target and run the cards at a practical power draw.
https://forums.guru3d.com/data/avatars/m/90/90667.jpg
If true, that's very bad. High-end GPUs at their stock settings are not efficient at all, mainly the RTX 3080 and 3090; 320W is already a F**** oven.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Martin5000:

You don't have to run stock. It's simple enough to lower the power target and run the cards at a practical power draw.
Sure, but it kinda sucks to pay the high premium these will most likely carry, only to downgrade them for the sake of a reasonable/practical wattage. At 700W stock, configuring it down to a sensible wattage would defeat the purpose of getting it; you'd be better off just getting a lower-tier product.
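To put rough numbers on that trade-off: performance typically falls off far more slowly than power near the top of the curve, so a power-limited flagship keeps most of its speed, which is exactly why paying flagship money and then capping it can feel wasteful. The exponent in the sketch below is an assumed illustrative value, not measured data.

```python
# Hand-wavy sketch of performance versus power limit near the top of the efficiency curve.
# The 0.3 exponent is an assumed illustrative figure; real curves vary by card and workload.
def relative_perf(limit_w: float, stock_w: float, exponent: float = 0.3) -> float:
    """Assume performance scales as (power ratio) ** exponent when power-limiting."""
    return (limit_w / stock_w) ** exponent

stock = 700.0
for limit in (700, 500, 350):
    print(f"{limit:>3} W -> ~{relative_perf(limit, stock) * 100:.0f}% of stock performance")
# With this assumed curve, capping to 350 W keeps ~81% of the 700 W card's performance.
```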