GeForce RTX 4090 running at 3.0 GHz and 616 Watt running GPU stress tool

https://forums.guru3d.com/data/avatars/m/246/246171.jpg
H83:

I love this idea/concept. For me, Intel, AMD and Nvidia should sit together and agree on maximum thermal values that could be used by their CPUs and GPUs. And if anyone breached those values, they would have to pay a penalty to the others who managed to stay within the agreed limits. I know this is almost impossible to happen, but it would be really good.
@Ven0m I thought it was the Japanese government that was cracking down on them? That's why kei vehicles are so weird: the restrictions were put in place by the government and manufacturers tried to make the most of those restrictions. The Subaru Sambar is possibly the best example of this.

In any case, the biggest difference here is that those car manufacturers made pretty much everything in their cars. AMD and Nvidia pretty much just make the chips themselves but none of the rest of the PC. They have no incentive to agree upon such things. The market determines the demand, and being a duopoly, one of them just has to be slightly better than the other (Nvidia has a better overall platform, AMD has better prices), and if you don't like it then that sucks for you. PSU and motherboard manufacturers aren't going to care because it just helps drive sales of more valuable components. OEMs are probably thrilled about it because by cheaping out on cooling, the parts will surely die prematurely, which means people will be forced to buy upgrades/replacements sooner.

So, I think a large government will have to step in and basically tax the AIB manufacturer for making components that fall below a certain performance-per-watt. Since the AIB partners have razor-thin margins, any additional costs would incentivize them to either optimize or otherwise lower the performance of the GPU. Since the chip manufacturer doesn't want to see their stats lowered, they are then incentivized to make chips more efficient.
https://forums.guru3d.com/data/avatars/m/156/156348.jpg
gQx:

Anatoli Diatlov: More power.
Random Comrade: But sir...
Anatoli Diatlov: I said. More POWER.

Did you see what you did, sir? F**k this s**t, I'm out.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
@ 600W it's not just a GPU stress tool. It's also testing PSU, case, case fans, CPU cooler etc etc
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
southamptonfc:

@ 600W it's not just a GPU stress tool. It's also testing PSU, case, case fans, CPU cooler etc etc
It also tests your circuit breaker (you have to account for the whole PC, the input wattage, and other devices connected to the same breaker), your air conditioning performance, and your wife's patience with electric bills and fan noise.
https://forums.guru3d.com/data/avatars/m/163/163068.jpg
Well, if you look at all of the incandescent light bulbs you replaced in your home with LEDs, a 600 Watt GPU is palatable.
data/avatar/default/avatar05.webp
That power draw tests your house's electrical installation. Some older houses (25-30 years) have thinner wiring on some outlets, and equipment drawing high amounts of power/current can burn out that wiring over the years. My father, being an electrician, did wiring replacements in many houses because of that.
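As a rough illustration of the current involved, here is a tiny back-of-the-envelope sketch in Python; the 800 W whole-system figure, 90% PSU efficiency, and mains voltages are assumptions for the example, not measurements from the article:

# Back-of-the-envelope: current pulled from the wall by a high-end gaming PC.
# All figures below are assumptions for illustration.
def wall_current_amps(dc_load_w, psu_efficiency, mains_v):
    ac_draw_w = dc_load_w / psu_efficiency  # PSU losses add to the wall draw
    return ac_draw_w / mains_v

print(f"230 V mains: {wall_current_amps(800, 0.90, 230):.1f} A")  # ~3.9 A
print(f"115 V mains: {wall_current_amps(800, 0.90, 115):.1f} A")  # ~7.7 A

On a 115 V circuit that is getting close to half of a typical 15 A breaker from one device, which is why aged or undersized wiring can become a problem.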
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
umeng2002:

Well, if you look at all of the incandescent light bulbs you replaced in your home with LEDs, a 600 Watt GPU is palatable.
600W of light bulbs spread throughout a house that might not all be on simultaneously isn't a big deal. Having 600W of incandescent bulbs all plugged into the same single outlet is. Remember too: that's 600W for the GPU alone, not the whole system. Nobody is going to pair a GPU like this with a 12400 or 5600G, so we're talking more like 800W for the whole PC in a realistic workload. Add another ~40W for AC to DC conversion losses. The average incandescent bulb is 60W, so we're talking 14 light bulbs being on simultaneously in the same room.
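For what it's worth, the arithmetic behind that comparison is simple enough to write down; the 600 W GPU, ~200 W for the rest of the system, and ~40 W of conversion losses are the assumptions from the post above:

# Whole-PC wall draw expressed in 60 W incandescent bulbs (assumed figures).
gpu_w = 600              # GPU alone, per the stress-test figure
rest_of_system_w = 200   # assumed CPU, board, drives, fans
conversion_loss_w = 40   # assumed AC-to-DC losses in the PSU

wall_draw_w = gpu_w + rest_of_system_w + conversion_loss_w
bulbs = wall_draw_w / 60  # average incandescent bulb wattage

print(f"{wall_draw_w} W at the wall is about {bulbs:.0f} sixty-watt bulbs")  # ~14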
data/avatar/default/avatar09.webp
schmidtbag:

600W of light bulbs spread throughout a house that might not all be on simultaneously isn't a big deal. Having 600W of incandescent bulbs all plugged into the same single outlet is. Remember too: that's 600W for the GPU alone, not the whole system. Nobody is going to pair a GPU like this with a 12400 or 5600G, so we're talking more like 800W for the whole PC in a realistic workload. Add another ~40W for AC to DC conversion losses. The average incandescent bulb is 60W, so we're talking 14 light bulbs being on simultaneously in the same room.
Let me tell you, this is not a realistic workload. My entire 12700K PC with a 3080, 4K TV, peripherals, and router peaks at 550W in gaming (but usually averages 350-450W depending on the game). I know this because I have an enterprise-grade UPS that always shows power usage. So if I added a 4090 to my setup, power usage would remain relatively similar, or even lower depending on the game, because it's a much more efficient card. Only if I stressed the 4090 with an uncapped framerate would I guesstimate my total setup's power usage peaking at about 650 watts for short bursts. The 616 watt figure in the article is from a GPU stress tool, not a real-world gaming scenario.
https://forums.guru3d.com/data/avatars/m/294/294076.jpg
If I remember correctly, the maximum for the new 16-pin power connector is 660W. So, that's pretty close.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Nopa:

If I remember correctly, the maximum for the new 16-pin power connector is 660W. So, that's pretty close.
It can take more if they keep splashing some liquid nitrogen on it!
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Glottiz:

Let me tell you, this is not a realistic workload. My entire 12700K PC with a 3080, 4K TV, peripherals, and router peaks at 550W in gaming (but usually averages 350-450W depending on the game). I know this because I have an enterprise-grade UPS that always shows power usage. So if I added a 4090 to my setup, power usage would remain relatively similar, or even lower depending on the game, because it's a much more efficient card. Only if I stressed the 4090 with an uncapped framerate would I guesstimate my total setup's power usage peaking at about 650 watts for short bursts. The 616 watt figure in the article is from a GPU stress tool, not a real-world gaming scenario.
We're talking about an overclocked 4090 here, not a presumably stock 3080. So yes, it is a realistic workload. GPUs are commonly the bottleneck in games, so games would be pushing this to 600W, plus or minus a dozen watts. Note that if the CPU were under full load along with the GPU, the total wattage would probably get closer to 900W, which is why I was saying that realistically, it would be lower.
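A rough sketch of the totals being argued over here, just to make the estimate explicit; every component figure below is an assumption for illustration, not a measurement:

# Rough whole-system estimates under two load patterns (assumed numbers).
gpu_stress_w = 616     # overclocked 4090 under the GPU stress tool (from the article)
gpu_gaming_w = 600     # assumed draw for the overclocked card when GPU-bound in a game
cpu_gaming_w = 100     # assumed CPU draw while the GPU is the bottleneck
cpu_full_load_w = 250  # assumed CPU draw under an all-core load
platform_w = 50        # assumed motherboard, RAM, drives, fans

gaming_estimate = gpu_gaming_w + cpu_gaming_w + platform_w      # ~750 W
combined_stress = gpu_stress_w + cpu_full_load_w + platform_w   # ~916 W

print(f"GPU-bound gaming estimate: ~{gaming_estimate} W")
print(f"GPU and CPU both fully loaded: ~{combined_stress} W")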
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
Guess it's time to build that custom loop with a car radiator that I've always thought about.
https://forums.guru3d.com/data/avatars/m/199/199386.jpg
I think it's about time discussions were had on the design of modern computers, scrapping what exists now and getting new technology, because this is an appalling situation. Buying one of these? I think I would rather watch a 3-hour documentary about how Gorgonzola Larson does yoga in preparation for pretending to be an actress in Captain Marvel.
data/avatar/default/avatar24.webp
Notice that new, hot-running hardware is released after the summer, when temps in the home are dropping. People need the extra heat anyhow, so they don't care as much.
data/avatar/default/avatar37.webp
schmidtbag:

We're talking about an overclocked 4090 here, not a presumably stock 3080. So yes, it is a realistic workload. GPUs are commonly the bottleneck in games, so games would be pushing this to 600W, plus or minus a dozen watts. Note that if the CPU were under full load along with the GPU, the total wattage would probably get closer to 900W, which is why I was saying that realistically, it would be lower.
Are you being deliberately obtuse, or do you just enjoy spreading hysteria? In any case, you'll be proven wrong when reviews come out in a few days and the 4090 won't be running at 600W in games.
https://forums.guru3d.com/data/avatars/m/224/224952.jpg
Glottiz:

Let me tell you, this is not a realistic workload. My entire 12700K PC with a 3080, 4K TV, peripherals, and router peaks at 550W in gaming (but usually averages 350-450W depending on the game). I know this because I have an enterprise-grade UPS that always shows power usage. So if I added a 4090 to my setup, power usage would remain relatively similar, or even lower depending on the game, because it's a much more efficient card. Only if I stressed the 4090 with an uncapped framerate would I guesstimate my total setup's power usage peaking at about 650 watts for short bursts. The 616 watt figure in the article is from a GPU stress tool, not a real-world gaming scenario.
The "3GHz" in the title already gave the context, welcome to the thread 😉
data/avatar/default/avatar37.webp
Glottiz:

Yup, and everyone memeing about power usage fails to realize that the 4090 will actually be much more power efficient. Let's say you play a game at 4K with a 60fps cap. On a 3090 the power usage is about 400W, but a 4090 playing that same game at 4K 60 with the same settings will only use ~200W.
Lol, no one with a functioning brain will buy a 4090 to run the games at the same settings and fps as you did with your old gpu... that's like the most retarded notion ever.
data/avatar/default/avatar04.webp
Dragam1337:

Lol, no one with a functioning brain will buy a 4090 to run the games at the same settings and fps as you did with your old gpu... that's like the most retarded notion ever.
What, are you going to remaster the game yourself and invent higher settings, or buy an 8K TV? Not all games stress the GPU at 100% all the time. If I play a game that already runs at max settings on my display, at least the benefit from the 4090 would be a smaller power bill.