Twice as powerful but double energy usage - Nvidia GeForce RTX 40 graphics cards
NiceHair
They're nuts
Endymion
I don't want to switch my 850 Titanium PSU.
But with those trends - I might soon.
That sucks.
Huggi
I mean, there must have been a reason that the new PCIe 12-pin power connector is rated for 600W...
schmidtbag
Reminiscent of the GTX 400 series, oddly enough.
Though in this case I might be a little less annoyed if it really is MCM, and if that means lowering the price.
Fediuld
WTH. That means no process efficiency gains, just adding more power like Intel does.
Which means a bigger monolithic GPU that costs more, requires a more expensive PCB, and ofc higher prices!
kakiharaFRS
So my 1200 W PSU already tripped over the 3090, potentially damaging it (it was set up with multi-rail OCP, and apparently the RTX 3080/3090 don't like that at all),
and now they think we can run 700-800 W GPUs? Because that's what double is... not even 600 W.
Great, I can blow up my computer again, this time for 2000 fps in the Forza Horizon 6 menus 🙄
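A quick sanity check of the "double" claim above — a minimal sketch, assuming the RTX 3090's roughly 350 W stock board power (partner OC cards draw more):

```python
# Doubling the RTX 3090's ~350 W stock board power (an assumed figure;
# factory-OC cards run higher) lands right in the 700-800 W range quoted above.
rtx_3090_board_power_w = 350
doubled_w = rtx_3090_board_power_w * 2
print(doubled_w)  # 700
```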
Nekrosleezer
That's not the way to go...
I don't want to turn on an "electric kettle" every time I game...
LEEc337
With power efficiency out the window does this mean miners will leave it alone?
Kaarme
Yeah, that's definitely not the proper way to go. I hope lower tiers are more power efficient. But then again, if the graphics card prices don't drop to sensible levels, I'm never going to buy a new one and will eventually stop gaming altogether. So, this might or might not matter to me.
Lloyd_Braun
Interesting, what day should I get in line to buy one?
reix2x
Thanks to Siemens, I have my power supply ready for the new Nvidia cards:
https://assets.new.siemens.com/siemens/assets/api/uuid:2d9f60fe-fa5a-4045-9f13-f4010755bc39/width:1024/im2017110057em_300dpi.jpg
icedman
I'll have to finally retire my 850 W PSU; I think I've had it for 11-ish years now.
Alessio1989
https://c.tenor.com/oxt9EhZVzdEAAAAC/epic-facepalm.gif
cucaulay malkin
no thanks
I run my 3060 Ti at 200-205 W peak, 1995 MHz core (UV), 16 Gbps VRAM, and I've already accepted that next gen will require a large power consumption increase if I'm gonna make a meaningful jump, but I'll be aiming for a 300-320 W card that I could bring down to 275-280 W max.
I refuse to be part of this 400-500 W craziness. I had a 980 Ti OC'd to 1500 MHz; I didn't care about measuring power draw back then, but it must've been close to 280 W if stock was 240-250. Hell, I just checked some footage, and an OC'd 980 Ti in Witcher 3 pulls 300 W — no wonder I couldn't bear the noise. Lol, I didn't even realize I had a 300 W card six years ago already.
Ironically, this gen's 3060 Ti is my first card with a single 8-pin since I had a 7870 with 2x 6-pin.
I'm waiting for PCIe 5.0 compatible ones; I kinda like the idea of running a 300 W card off a single connector, even though I dislike 300 W cards in general.
Solfaur
This is literally crazy if true. I'm already sweating even in autumn because of the heat output from my 3080 Ti (and that's "only" like 350-370 W). Adding, say, a new Alder Lake i9 and an RTX 4000 would mean an absolutely insane amount of heat and power consumption. I'm really surprised that in an age where people are more and more aware of energy waste, this is the way forward...
skija
Double performance, double power consumption, DOUBLE PRICE.
alanm
If it's double the power draw and double the performance, then it's the same efficiency as Ampere. Pretty sure RDNA 3 will have similar power draw if they want to keep up.
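The efficiency point above can be checked with a few lines — a minimal sketch using illustrative numbers (the 350 W figure and normalized performance scores are assumptions, not measured data):

```python
# If both performance and power double, performance per watt is unchanged.
# Numbers are illustrative: perf is a normalized score, power in watts.
ampere_perf, ampere_power_w = 100, 350   # assumed Ampere baseline
next_perf, next_power_w = 200, 700       # "double performance, double power"

ampere_eff = ampere_perf / ampere_power_w  # perf per watt
next_eff = next_perf / next_power_w        # perf per watt

print(next_eff == ampere_eff)  # True — the same efficiency, just scaled up
```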