During the stress tests we measure power consumption based on the power draw of the entire PC.
The methodology is simple: a device constantly monitors the power draw of the PC. After we have run all our tests and benchmarks, we look at the recorded maximum peak, and that peak is the number you need to observe, as it is extremely important. Bear in mind that you are not looking at the power consumption of the graphics card alone, but at the consumption of the entire PC.
Our test system is a power-hungry Core i7 965 / X58 based setup overclocked to 3.75 GHz. In addition, the energy-saving functions of this motherboard and processor are disabled (to ensure consistent benchmark results).
I'd say that on average we are using roughly 50 to 100 Watts more than a standard PC due to the higher CPU clock settings, water-cooling, additional cold cathode lights and so on. Keep that in mind: our normal system power consumption is a little higher than that of your average system. With multi-GPU setups, however, that difference fades away real fast. Check it out.
With 2-way SLI we noticed that our power consumption peaked at roughly 719 Watts. That's quite something really, actually above two GTX 480 cards, which is strange to see from a card that supposedly has a lower TDP. We do need to mention, however, that one of the boards used (an engineering sample) had an older BIOS, and that board's power consumption might be a tad higher as a result.
But let's take the worst case scenario just to play it safe:
Measured power consumption
System in IDLE = 237W
System wattage with GPUs under FULL stress = 719W
Difference (GPU load) = 482W
Add average IDLE wattage (~20W x 2) = 40W
Subjectively obtained GPU power consumption = ~522 Watts
Mind you that the system wattage is measured at the wall socket and covers the entire PC. Below is a chart of measured wattages per card. Overall the numbers are much higher than reference; this is due to the increased GPU voltage (to allow easy overclocking) and the higher standard clock frequencies.
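The subtraction above can be sketched as a few lines of Python. The 237W and 719W figures are our measurements; the ~20W idle draw per card is the approximation noted in the list:

```python
# Estimate multi-GPU power draw from wall-socket measurements.
# The idle_per_card_w value (~20W) is an approximation, as noted above.

def gpu_power_estimate(idle_w, full_load_w, cards, idle_per_card_w=20):
    """Subtract system idle from the stress peak, then add back each
    card's idle draw (which was part of the system idle baseline)."""
    difference = full_load_w - idle_w  # load placed by the GPUs
    return difference + idle_per_card_w * cards

print(gpu_power_estimate(237, 719, 2))  # -> 522
```

The add-back step matters: the cards were not drawing zero Watts during the idle measurement, so subtracting idle from peak alone would understate their total consumption.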
Power Consumption Cost Analysis
Based on the wattage we can now check how much a card like this will cost you per year and per month. We calculate with 0.23 EUR (or dollars) per kWh, which is the standard rate here.
[Chart: measured graphics card TDP in kWh, with cost at 2 and 4 hours of use per day, per month and per year (5 days per week / 4 hrs per day)]
We estimate and calculate here based on four hours of GPU-intensive gaming per day, five days a week, with this card.
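As a rough sketch of how the chart figures come together, the yearly cost follows from the measured wattage and the assumptions above (0.23 EUR per kWh, 4 hours a day, 5 days a week); the 522W input is the SLI pair's measured consumption:

```python
# Yearly power cost of GPU-intensive gaming, using the rate and usage
# pattern assumed in the article.

PRICE_PER_KWH = 0.23  # EUR (or USD)

def yearly_cost(watts, hours_per_day=4, days_per_week=5, weeks=52):
    """Convert a sustained draw in Watts into a yearly energy bill."""
    kwh_per_year = watts / 1000 * hours_per_day * days_per_week * weeks
    return kwh_per_year * PRICE_PER_KWH

print(round(yearly_cost(522), 2))  # -> 124.86
```

Divide by twelve for the monthly figure; halving the hours per day roughly halves the bill, since the relationship is linear.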
This is the high-end game, though; you can expect a lot of power to be consumed, and that has never been any different. In fact, I remember GTX 285 cards set up in 3-way SLI consuming the very same, yet you get way more performance out of the GTX 580s alright.