Let's have a look at how much power draw we measure with this graphics card installed.
The methodology: we have a device that constantly monitors the power draw of the PC. We stress only the GPU, not the processor. The before and after wattage tells us roughly how much power a graphics card consumes under load.
Note: Lately there has been a lot of discussion about using FurMark as the stress test to measure power load. FurMark is so harsh on the GPU that it does not represent the power draw of even really demanding gaming. If we take a very GPU-intensive game title, measure power consumption, and then compare the very same card under FurMark, the power consumption can be 50 to 100W higher on a high-end graphics card solely because of FurMark.
After long deliberation we decided to move away from FurMark and are now using a game-like application which stresses the GPU 100% yet is much more representative of real-world power consumption and heat levels coming from the GPU. We are, however, not disclosing what application that is, as we do not want AMD/ATI/NVIDIA to 'optimize & monitor' our stress test whatsoever, for obvious objectivity reasons.
Our test system is based on a power-hungry Core i7 965 / X58 setup, overclocked to 3.75 GHz. In addition, the energy-saving functions of the motherboard and processor are disabled (to ensure consistent benchmark results). On average this system draws roughly 50 to 100 Watts more than a standard PC due to the higher CPU clocks, water cooling, additional cold cathode lights, etc.
We'll be calculating the GPU power consumption here, not the total PC power consumption.
Measured power consumption
System in IDLE = 201W
System Wattage with GPU in FULL Stress = 449W
Difference (GPU load) = 248W
Add the card's average IDLE wattage = ~50W
Subjectively obtained GPU power consumption = ~298 Watts
Mind you that the system wattage is measured at the wall socket and is for the entire PC. Below is a chart of measured wattages per card.
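The calculation above can be sketched in a few lines of Python. The function name and structure are our own illustration of the method described, not a Guru3D tool: the difference between the stressed and idle wall readings removes everything except the card's load delta, after which the card's own estimated idle draw (~50W) is added back.

```python
def estimate_gpu_power(system_idle_w, system_load_w, card_idle_w=50):
    """Rough GPU power draw derived from whole-system wall-socket measurements."""
    load_delta = system_load_w - system_idle_w  # extra draw with the GPU stressed
    return load_delta + card_idle_w             # add back the card's own idle share

print(estimate_gpu_power(201, 449))  # -> 298
```

With the measured 201W idle and 449W load figures this reproduces the ~298 Watt estimate from the chart above.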
With 2-way SLI we noticed our power consumption for the cards peaked at roughly 298 Watts; that's JUST the two cards, not the entire PC.
[Chart: cost of ownership per card. Columns: measured TDP in kWh at 2 and 4 hours per day, cost per month, and cost per year (5 days a week / 4 hrs a day) in two currencies, including $.]
We estimate and calculate here based on four hours of GPU intensive gaming per day / 5 days a week with this card.
Above is a chart of relative power consumption. Because we scrapped all FurMark results, we are re-measuring all temperature/dBA/power tests with the new stress software. As such, the results are limited to a handful of cards right now.
Again, the wattages displayed are for the cards with the GPU(s) stressed 100% and the CPU(s) left near idle.
Here is Guru3D's power supply recommendation:
GeForce GTX 560 Ti
On an average system the card requires you to have a 500 Watt power supply unit.
GeForce GTX 560 Ti in 2-way SLI
A second card requires you to add another ~175 Watts. You need a 700+ Watt power supply unit if you use it in a high-end system (800 Watts up to a Kilowatt is recommended if you plan on any overclocking).
For each additional card (3-way SLI and up), add another 200 Watts and 20A on the 12V rails as a safety margin. What would happen if your PSU can't cope with the load? Here are some pointers:
bad 3D performance
spontaneous resets or sudden shutdowns of the PC
freezing during gameplay
overloading the PSU can cause it to break down
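The sizing guideline above can be condensed into a small helper. This is purely our illustration of the rule of thumb stated in the text (~500W baseline for one card, ~175W extra for a second, ~200W for each card beyond that); the function name is hypothetical.

```python
def recommended_psu_watts(num_cards):
    """Minimum PSU wattage per the rule of thumb for this card in (multi-)GPU setups."""
    if num_cards < 1:
        return 0
    watts = 500                            # single-card baseline on an average system
    if num_cards >= 2:
        watts += 175                       # second card in 2-way SLI
    watts += 200 * max(0, num_cards - 2)   # each further card (3-way SLI and up)
    return watts

print(recommended_psu_watts(2))  # -> 675, hence the 700+ Watt recommendation
```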
There are many good PSUs out there; please do have a look at our many PSU reviews, as we have loads of recommended PSUs for you to check out.