Liquid Cooling and Overclocking the GTX 580 with Danger Den


Power Consumption

In our recent GTX 480 and GTX 580 reviews we noted that power consumption rose quite a bit once the reference fan started to spin harder (higher RPM). We were therefore curious whether we would see a decrease in power consumption with that cooler removed.

So let's have a look at how much power draw we measure with this graphics card installed.

The methodology: we have a device that constantly monitors the power draw of the PC. We stress only the GPU, not the processor. The difference between the before and after wattage tells us roughly how much power the graphics card consumes under load.

Note: lately there has been a lot of discussion about using FurMark as a stress test to measure power load. FurMark is so punishing on the GPU that it does not represent an objective power draw compared to even really heavy gaming. If we take a very GPU-intensive game title, measure power consumption, and then compare it with FurMark, the FurMark figure can be 50 to 100W higher on a high-end graphics card.

After long deliberation we decided to move away from FurMark and are now using a game-like application that stresses the GPU 100%, yet is much more representative of real-world power consumption and heat levels. We are not disclosing what application that is, however, as we do not want AMD/NVIDIA to 'optimize & monitor' our stress test, for obvious objectivity reasons.

Our test system is based on a power-hungry Core i7 965 / X58 platform, overclocked to 3.75 GHz. In addition, energy-saving functions are disabled on this motherboard and processor (to ensure consistent benchmark results). On average we use roughly 50 to 100 watts more than a standard PC due to the higher CPU clocks, water cooling, additional cold cathode lights, etc.

We'll be calculating the GPU power consumption here, not the total PC power consumption.

Measured power consumption, card at default clocks

  1. System in IDLE = 199W
  2. System wattage with GPU in FULL stress = 395W
  3. Difference (GPU load) = 196W
  4. Add the card's average IDLE wattage = ~25W
  5. Estimated GPU power consumption = ~221 watts

Measured power consumption, card overclocked

  1. System in IDLE = 199W
  2. System wattage with GPU in FULL stress = 458W
  3. Difference (GPU load) = 259W
  4. Add the card's average IDLE wattage = ~25W
  5. Estimated GPU power consumption = ~284 watts

Bear in mind that system wattage is measured at the wall socket and covers the entire PC. Below, a chart of measured wattages per card. Overall this card draws much more than the reference design, due to the increased GPU voltage (applied to allow easy overclocking) and the higher factory clock frequencies.
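The subtraction method described above can be sketched in a few lines. Note that the ~25W idle correction is our estimate of the card's own idle draw, which the idle/load difference by itself does not capture:

```python
# Sketch of the wall-socket subtraction method used in the lists above.
# The gpu_idle_estimate_w value (~25W) is the estimated idle draw of the
# card itself, added back because it is hidden inside the system idle figure.

def gpu_power_draw(system_idle_w, system_load_w, gpu_idle_estimate_w=25):
    """Approximate GPU load power from whole-system wall measurements."""
    load_delta = system_load_w - system_idle_w   # extra draw under GPU stress
    return load_delta + gpu_idle_estimate_w      # add back the GPU's idle share

print(gpu_power_draw(199, 395))  # default clocks -> 221
print(gpu_power_draw(199, 458))  # overclocked   -> 284
```

This is only a rough approximation: PSU efficiency and CPU load variation during the stress test both add uncertainty to the wall-socket reading.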

Power Consumption Cost Analysis

Based on this wattage we can now check how much a card like this will cost you per month and per year. We use a rate of 0.23 EUR (or USD) per kWh, which is the (high) standard rate here.

Power consumption (default)                 kWh      Price/kWh   2 hrs/day   4 hrs/day
Graphics card measured TDP                  0.221    0.23        0.10        0.20

Cost per week (5 days / 4 hrs per day)      1.02
Cost per month                              4.41
Cost per year (5 days/week, 4 hrs/day)      52.86 EUR / $69.78

 

Power consumption (overclocked)             kWh      Price/kWh   2 hrs/day   4 hrs/day
Graphics card measured TDP                  0.284    0.23        0.13        0.26

Cost per week (5 days / 4 hrs per day)      1.31
Cost per month                              5.66
Cost per year (5 days/week, 4 hrs/day)      67.93 EUR / $89.67

Our estimate here assumes four hours of GPU-intensive gaming per day, five days a week, with this card.
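The yearly figures in the tables above follow from a simple multiplication, which can be sketched as follows (the default parameter values mirror our assumed schedule and rate):

```python
# Sketch of the cost calculation behind the tables above:
# energy per week = power (kW) x hours/day x days/week, then scaled
# to a year at the assumed 0.23 EUR (or USD) per kWh rate.

def yearly_cost(gpu_kw, price_per_kwh=0.23, hours_per_day=4, days_per_week=5):
    """Yearly energy cost for a gaming schedule, in the chosen currency."""
    weekly_kwh = gpu_kw * hours_per_day * days_per_week
    return weekly_kwh * 52 * price_per_kwh

print(round(yearly_cost(0.221), 2))  # default clocks -> 52.86
print(round(yearly_cost(0.284), 2))  # overclocked   -> 67.93
```

Swapping in your local electricity rate for `price_per_kwh` gives the figures for your own situation.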
