GeForce RTX 4090 could consume up to 750W and may arrive in September

No way in hell will I ever buy a card at that TDP, even though I have solar power. This is going to be a halo product and will immediately have snob appeal. I don't even think e-sports sponsors (except maybe ...Nvidia?) would go for buying one of these, as that money would otherwise buy several GPUs even at inflated prices.
Okay, I'm still getting 300W max. They're still making them on TSMC 5nm, which means there are going to be some really power-efficient SKUs in the upper mid-range.
A 750W GPU for streaming kids' games on YouTube 12 hours a day, worth it.
Ampere on steroids. It's the only way they can push more performance out of it. AMD will beat the crap out of them with a more performant and efficient RDNA3.
Undying:

Ampere on steroids. It's the only way they can push more performance out of it. AMD will beat the crap out of them with a more performant and efficient RDNA3.
♪ A Tale As Old As Time ♪
At this rate I fear even something like the RTX 4060 will be at least a 350W card... And I thought 550W for the 4090 was bad...
Oh dear, this is getting way out of hand. 😏 750W~800W TDP? Nope, no sir, not for me.
Lol, unless Nvidia ships these GPUs with an active phase-change cooler, there is no conventional way to cool a ~700mm² die that uses 500W by itself (the 3090's die uses ~220W), so these leaks are just full of BS. That, or Lovelace uses MCM, which is not that exciting. I will just stick to a single-chip design that uses <350W.
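For what it's worth, a minimal back-of-the-envelope sketch of that thermal-density point, using the comment's own figures plus an assumed ~628 mm² GA102 die size (a commonly cited number, not something from this article):

```python
# Rough W/mm^2 comparison using the comment's figures.
# The GA102 die size (~628 mm^2) is an assumption for illustration.
chips = {
    "GA102 (3090)": (220, 628),   # ~220W attributed to the die, ~628 mm^2
    "Rumored die":  (500, 700),   # ~500W on a ~700 mm^2 die, per the leak
}

for name, (power_w, area_mm2) in chips.items():
    print(f"{name}: {power_w / area_mm2:.2f} W/mm^2")

# GA102 (3090): 0.35 W/mm^2
# Rumored die:  0.71 W/mm^2  -> roughly double the thermal density
```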
4090 users will not be cold next winter.
LOL, what? Well, I'm gonna have to look into purchasing one... hopefully by then I'll be finished dropping this 109,000-horsepower Wärtsilä RT-flex96C engine into my Honda Civic.
PS5 it is for the foreseeable future. No way in hell would I buy a 250W TDP GPU, let alone one that is 750W.
I'm still interested in the MSRP, and I almost wish it will be sold for 10k USD or Euro or more, just to see how the streamers can manage it for streaming.
For a MIDRANGE Intel/Nvidia PC you'll need a 1250W PSU...
So glad my 3060 pulls less than 200W, and most of the time less than 150W.
How on earth will this be able to dissipate heat effectively? Thermal density will be too high, unless maybe this is a chiplet design that offers a greater surface area.
I guess my next system will be a legit 1000W+ setup. Might consider a solar setup just for my home office, even though my electric rates are locked in.
In the past, people on forums mostly said that your PC's power consumption, even under load, was negligible on your power bill, but is that still true? What these high-end CPUs and GPUs pull nowadays just seems ridiculous compared to older hardware. If I use my PC for 6-10 hrs a day (I use my PC for work and gaming, so I'm on the computer a lot most days), I have to wonder what the monthly cost difference ends up being. Not an area I really know anything about, but if they keep pushing power consumption up and up, at some point I expect the difference is going to be meaningful for heavy users, no?
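A rough sketch of that math, assuming a hypothetical $0.15/kWh rate, an extra ~400W of draw over a current high-end card, and full-load use for the hours mentioned (all three are assumptions for illustration, not figures from the article):

```python
# Back-of-the-envelope extra electricity cost for a higher-TDP GPU.
# Rate, extra draw, and hours are illustrative assumptions; real desktop/work
# use will sit well below full load most of the time.
extra_draw_w  = 750 - 350   # rumored card vs. a ~350W card
hours_per_day = 8           # middle of the 6-10 hrs/day mentioned above
rate_per_kwh  = 0.15        # assumed electricity price, $/kWh

extra_kwh = extra_draw_w / 1000 * hours_per_day * 30
print(f"~{extra_kwh:.0f} kWh/month extra, about ${extra_kwh * rate_per_kwh:.2f}/month")
# ~96 kWh/month extra, about $14.40/month at full load
```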
Loobyluggs:

So glad my 3060 pulls less than 200W, and most of the time less than 150W.
It's funny looking back how disappointed I was in the 2080 Super I ended up picking up, since other than the new RT features and DLSS (neither of which were any good at launch) the card was basically not much better than the 1080 Ti. After what happened with the GPU market, I realize in hindsight I upgraded at a very fortunate time. DLSS in particular has become pretty nuts, and a few games even have smashing RT implementations, like Metro Exodus Remaster / Cyberpunk. If I could go back, I think I'd opt for the 2070 S + 3700X instead of the 2080 S + 3900X though. Going back over the performance reviews, the 2070 S and 2080 S shared the same die (the 2070 S just downclocked / with more cores disabled for yield, iirc), and the 3700X performs in games essentially the same as the 3900X, so I wonder if I wasted more money than I needed to, looking back.
Just because a card could theoretically pull X amount via the available power connectors doesn't mean it will pull the absolute maximum possible.
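That distinction is easy to show with the standard connector ratings (75W from the PCIe slot, 150W per 8-pin, up to 600W for the 16-pin 12VHPWR connector); the card configuration and 450W board limit below are hypothetical examples, not figures from the article:

```python
# Theoretical maximum draw implied by the connectors vs. the actual board-power limit.
# Connector ratings are the standard spec figures; the card config is hypothetical.
CONNECTOR_W = {"pcie_slot": 75, "8pin": 150, "12vhpwr": 600}

card_connectors   = ["pcie_slot", "12vhpwr"]  # hypothetical next-gen card
theoretical_max_w = sum(CONNECTOR_W[c] for c in card_connectors)
board_limit_w     = 450                       # example vendor-set power limit

print(f"Connectors allow up to {theoretical_max_w}W, "
      f"but the card is capped at {board_limit_w}W by its firmware.")
# Connectors allow up to 675W, but the card is capped at 450W by its firmware.
```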
Mpampis:

How on earth will this be able to dissipate heat effectively? Thermal density will be too high, unless maybe this is a chiplet design that offers a greater surface area.
All you have to do is buy an 18k BTU air-conditioning unit and point it directly at your case from about 1 meter away or less, and you will be fine :P