AMD: graphics cards with consumption above 700W TDP before 2025
Silva
I won't go over 200W TDP for a GPU, ever.
Summer here is hot and electricity is expensive.
Make it efficient or GTFO, it just doesn't make sense.
PS: My RX580 is undervolted to 1.0V and clocked at 1250MHz, partly because of temps/consumption and partly because these cards have a tendency to blow up and she's already artifacting at stock. Otherwise it works mostly fine, with much lower heat and power consumption.
fantaskarsef
What surprises me in general: isn't a smaller node supposed to save power? Yes, I'm wondering about Nvidia's jump, first to Samsung 8nm and now to TSMC 5nm.
Passus
700 watts? Wow. I have to keep an eye on my consumption now due to soaring electricity prices, and I have both CPU and GPU turned down to the minimum, just enough not to hurt games.
Stock is 1.068V at 1950MHz and uses 245W, but my clocks are set to 1875MHz @ 0.836V for 113-150W of power usage.
CPU is locked to 4.00GHz at 1.125V, or 3.6GHz at 1.0V for light gaming (3700X).
All savings help in these times.
700W is ridiculous.
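A rough sanity check of those numbers, assuming the usual dynamic-power approximation P ≈ C·f·V² (a sketch only; leakage, memory and board power are ignored):

```python
# Rough check of the undervolt figures above using the common
# dynamic-power approximation P_dyn ~ C * f * V^2.
# Leakage and board/memory power are ignored, so treat this as a sketch.

stock_v, stock_f, stock_w = 1.068, 1950, 245   # volts, MHz, watts (as quoted)
uv_v, uv_f = 0.836, 1875                       # undervolted operating point

# Scale the stock power by the frequency ratio and the voltage ratio squared.
predicted_w = stock_w * (uv_f / stock_f) * (uv_v / stock_v) ** 2

print(f"Predicted draw after undervolt: ~{predicted_w:.0f}W")
```

This lands at roughly 144W, in the same ballpark as the 113-150W reported above, which is why dropping voltage saves far more than dropping clocks alone.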
Dribble
I can see this for mining - all they care about is hash rate per watt. If a single card has a great hash rate per watt but uses 700W, that's great, as it's more compact than needing several lower-power cards to do the same thing.
For gamers who aren't making money off their cards it's a lot less compelling. Do you want a 350W card that runs the latest game at very high settings, or double that to a 700W card for ultra settings which, other than bragging rights, looks basically identical?
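To put the compactness argument in numbers (a hypothetical illustration; the efficiency figure below is made up, only the ratio matters):

```python
# Hypothetical illustration of why miners like one big card: if hash rate
# per watt is the same, a 700W card simply replaces several smaller ones.
# The efficiency number is invented for the sake of the comparison.

efficiency = 0.5                 # MH/s per watt, assumed equal for both options
big_card_w = 700
small_card_w = 230

big_hashrate = efficiency * big_card_w                       # MH/s from one card
small_cards_needed = big_hashrate / (efficiency * small_card_w)

print(f"One {big_card_w}W card ~= {small_cards_needed:.1f} x {small_card_w}W cards")
```

At equal efficiency that is roughly three smaller cards' worth of slots, risers and PSU connectors replaced by one board, which is exactly the compactness miners care about and gamers don't.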
schmidtbag
cucaulay malkin
https://developer.nvidia.com/blog/designing-arithmetic-circuits-with-deep-reinforcement-learning/
It's just that their approach has changed. Now it's one generation per node, pushed to the limit of what a given node can take.
The next gen will probably be made on 3nm/2nm.
They are saving power, and they're even doing further work to maximize chip density.
Kaleid
No chance in hell I'll even go close to 500W.
I'd be OK with EU stepping in to stop this "development" too.
CPC_RedDawn
From now on I will be sticking with a sub-300W GPU.
You have an AMD RX 6900 XTXH LC... this is why it uses 1.2V out of the box. It is a water-cooled GPU with everything pushed to its limits.
My 6800XT uses 1.15V because I have manually overclocked it. At stock settings I can undervolt to 0.920V and still hit 2350MHz sustained. For 2500-2600MHz sustained I need 1.15V, and this is on a 7nm node, so on a newer, more efficient 5nm node I can easily see them hitting 3GHz+ with under 1.2V.
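For what it's worth, a crude linear read of those two operating points shows why the node matters (a sketch only; real V/f curves bend sharply upward near the limit, so this understates the voltage needed):

```python
# Crude linear extrapolation of the two 6800 XT points quoted above
# (0.920V @ 2350MHz and 1.15V @ ~2550MHz). Real V/f curves are steeper
# near the top, so the real answer is even worse than this estimate.

v1, f1 = 0.920, 2350
v2, f2 = 1.150, 2550
target_f = 3000

slope = (f2 - f1) / (v2 - v1)              # MHz gained per extra volt (~870 MHz/V)
needed_v = v2 + (target_f - f2) / slope

print(f"Naive voltage for {target_f}MHz on the same 7nm part: ~{needed_v:.2f}V")
```

That comes out around 1.67V, far beyond anything sane on 7nm, so the headroom for 3GHz+ under 1.2V has to come from the 5nm node and architecture rather than from more voltage.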
TheDeeGee
pegasus1
The BHP of performance cars keeps going up, yes, as does performance. You can't expect to go up against your competitors with less power and lower performance in each new generation of a marque's cars.
So many of you guys look at this only through the eyes of the end user, rather than of a business.
I wonder how many people, and by that I mean the vast majority who just look at Reddit benchmarks, buy purely on a card's FPS. The vast majority, I'm sure, and they won't care about power draw if they see one manufacturer's card can run some game at 87.3fps against another that can 'only' get 86fps.
Bigger numbers count when it comes to sales of anything. NV and AMD don't care about the views of anybody on here; they care about the bulk of consumers who just see that big numbers = better everything.
Being kinder to the environment doesn't increase share prices or the biannual dividend; whether or not you like it, that's the way it is.
Reddoguk
Man, I feel like I'm back in the old days when I used to be in game arcades spending all my money on games.
If I game for 3 hours a day then I need to put an extra £30 a month into electricity. When I don't game for a while I really see a difference in costs.
6-12 months ago I would hardly see any difference, but that's not the case now. So it's like a pound a day extra, which doesn't sound like much, but remember that's just 2-3 hours of gaming.
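For reference, the arithmetic behind that kind of figure (the system draw and unit price below are assumptions for illustration, not the actual numbers from this post):

```python
# Back-of-the-envelope gaming electricity cost. The whole-system draw and
# the unit price are assumptions, roughly matching 2022 UK prices.

system_draw_kw = 0.75        # whole PC under load, in kW (assumed)
hours_per_day = 3
price_per_kwh = 0.34         # GBP per kWh (assumed)

daily_cost = system_draw_kw * hours_per_day * price_per_kwh
monthly_cost = daily_cost * 30

print(f"~£{daily_cost:.2f} per day, ~£{monthly_cost:.0f} per month")
```

At these assumptions that is about £0.77 a day and £23 a month from gaming alone; a 700W GPU on its own would push the whole-system figure well past a pound per gaming day.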
Mufflore
Power use is already silly; this projection is ridiculous.
They are restricting their own markets; much more effort needs to be put into keeping power use down.
There's no way in hell I'm getting a 700W card, for many reasons.
My current 450W card gets used at 350W (or less) and I consider that excessive, especially in the summer, such that it doesn't get used until the room cools down at night.
Things are moving back assward.
Mufflore
Horus-Anhur
PrMinisterGR
This is basically an admission that silicone is done.
Horus-Anhur
[SPOILER]https://kmdecorativesurfaces.com/wp-content/uploads/2019/09/Clear-Silicone-Box-of-12.jpg[/SPOILER]
Sorry, I couldn't help it. 😀
Martin5000
You don't have to run stock.
It's simple enough to lower the power target and run the cards at a practical power draw.
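As a concrete example of lowering the power target, this can be scripted on an NVIDIA card by wrapping nvidia-smi (a sketch only: it needs admin rights, the 250W figure is arbitrary, and AMD cards use different tooling such as Radeon Software or rocm-smi):

```python
# Sketch: cap an NVIDIA GPU's board power limit by calling nvidia-smi.
# Needs administrator/root rights; the 250W target is an arbitrary example.
import subprocess

TARGET_WATTS = 250   # must be within the range the card supports

# Show the current draw and limit before changing anything.
subprocess.run(
    ["nvidia-smi", "--query-gpu=power.draw,power.limit", "--format=csv"],
    check=True,
)

# Apply the new power limit (resets on reboot unless reapplied).
subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)
```

The same effect is available from Afterburner or the driver's power-limit slider; the point is just that the stock target is not a fixed property of the card.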
cucaulay malkin
https://www.upwork.com/hire/english-to-greek-translators/
Man, you need an English to Greek interpreter to watch those presentations.
Here's a short list.
moab600
If true, that's very bad.
High-end GPUs at their stock settings are not efficient at all, mainly the RTX 3080 and 3090; 320W is already a F**** oven.
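The oven point in plain numbers, since essentially every watt a GPU draws ends up as heat in the room (a straight unit conversion, nothing more):

```python
# Every watt the card draws is eventually dumped into the room as heat,
# so board power converts directly into heater output.
WATTS_PER_BTU_PER_HOUR = 0.29307107   # 1 BTU/h expressed in watts

for gpu_watts in (320, 700):
    btu_per_hour = gpu_watts / WATTS_PER_BTU_PER_HOUR
    print(f"{gpu_watts}W ≈ {btu_per_hour:.0f} BTU/h of heating")
```

That is about 1,090 BTU/h for a 320W card and nearly 2,400 BTU/h at 700W, in the territory of a small space heater running flat out, before counting the CPU and the rest of the system.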
schmidtbag