Installing any of the NVIDIA GeForce graphics cards is really easy. Once the card is seated in the PC, hook up the monitor and of course any external power connectors, like the 6- and/or 8-pin PEG power connectors. Preferably get yourself a power supply that has these PCIe PEG connectors natively (converting them from a Molex peripheral connector, anno 2010, we feel is a no-go).
Once done, we boot into Windows, install the latest drivers, and after a reboot all should be working. No further configuration is required unless you'd like to tweak settings, for which you can open the driver's control panel.
Let's have a look at how much power draw we measure with this graphics card installed.
The methodology: we have a device constantly monitoring the power draw from the PC. We stress the GPU only, not the processor. The before and after wattage tells us roughly how much power a graphics card consumes under load.
Note: lately there has been a lot of discussion about using FurMark as a stress test to measure power load. FurMark is so punishing on the GPU that it does not represent an objective power draw compared to really hefty gaming. If we take a very-harsh-on-the-GPU gaming title, measure power consumption, and then compare that with FurMark, the power consumption can be 50 to 100W higher on a high-end graphics card solely because of FurMark.
After long deliberation we decided to move away from FurMark and are now using a game-like application which stresses the GPU 100%, yet is much more representative of the power consumption and heat levels coming from the GPU. We are, however, not disclosing what application that is, as we do not want AMD/NVIDIA to 'optimize & monitor' our stress test, for obvious reasons.
Our test system is based on a power-hungry Core i7 965 / X58 platform, overclocked to 3.75 GHz. Next to that, energy-saving functions are disabled on this motherboard and processor (to ensure consistent benchmark results). On average we use roughly 50 to 100 Watts more than a standard PC due to the higher CPU clocks, water-cooling, additional cold cathode lights, etc.
We'll be calculating the GPU power consumption here, not the total PC power consumption.
Measured power consumption, one card:
System in IDLE = 213W
System wattage with GPU in FULL stress = 503W
Difference (GPU load) = 290W
Add average IDLE wattage = ~50W
Subjectively obtained GPU power consumption = ~340W
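The arithmetic above can be sketched as a quick calculation. The values come from our measurements; the ~50W idle correction is our estimate of what the card itself draws at idle:

```python
# Derive approximate GPU power draw from wall-socket measurements.
idle_system_w = 213       # whole PC, GPU idle
stress_system_w = 503     # whole PC, GPU under full stress
gpu_idle_estimate_w = 50  # estimated idle draw of the card itself

# The system delta isolates the extra draw caused by loading the GPU;
# adding the idle estimate back in approximates total card consumption.
gpu_load_delta_w = stress_system_w - idle_system_w    # 290 W
gpu_power_w = gpu_load_delta_w + gpu_idle_estimate_w  # ~340 W

print(f"GPU load delta: {gpu_load_delta_w} W")
print(f"Estimated GPU power consumption: ~{gpu_power_w} W")
```

This remains a rough wall-socket method; PSU efficiency losses are included in the numbers.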
Bear in mind that the system wattage is measured at the wall socket and covers the entire PC. Below is a chart of measured wattages per card. Overall this is much higher than the reference design, due to an increased GPU voltage (to allow easy overclocking) and the higher standard clock frequencies.
Power Consumption Cost Analysis
Based on the wattage we can now check what a card like this will cost you per year and per month. We use a rate of 0.23 EUR (or USD) per kWh, which is the (high-end) standard rate here.
[Table: graphics card measured TDP converted to kWh, with cost per day at 2 and 4 hours of use, cost per month, and cost per year (5 days a week / 4 hours a day), in both EUR and USD.]
We estimate and calculate here based on four hours of GPU-intensive gaming per day, 5 days a week, with this card.
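As a sketch of how the cost figures are derived, under the assumptions stated above (~340W measured draw, 0.23 EUR per kWh, 4 hours a day, 5 days a week):

```python
# Estimate the yearly energy cost of gaming on this card.
# Assumptions: ~340 W under load (our measurement above),
# 0.23 EUR/kWh, 4 hours per day, 5 days per week, 52 weeks per year.
gpu_power_kw = 0.340
price_per_kwh = 0.23
hours_per_year = 4 * 5 * 52  # 1040 gaming hours per year

kwh_per_year = gpu_power_kw * hours_per_year  # 353.6 kWh
cost_per_year = kwh_per_year * price_per_kwh  # ~81.33 EUR
cost_per_month = cost_per_year / 12           # ~6.78 EUR

print(f"{kwh_per_year:.1f} kWh/year -> {cost_per_year:.2f} EUR/year "
      f"({cost_per_month:.2f} EUR/month)")
```

Note this covers only the graphics card's share of the power bill, not the rest of the PC.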
Recommended Power Supply
Here is Guru3D's power supply recommendation on the GeForce 500 series:
GeForce GTX 590 - On your average system the card requires you to have a 700 Watt power supply unit.
GeForce GTX 590 SLI - On your average system the card requires you to have a 1000+ Watt power supply unit.
If you are going to overclock the CPU or GPU, then we do recommend that you purchase a unit with some more stamina.
There are many good PSUs out there; please have a look at our many PSU reviews, which include loads of recommended units. What would happen if your PSU can't cope with the load? Here are a few possible issues:
Bad 3D performance
Spontaneous resets or imminent shutdown of the PC
Freezing during gameplay
PSU overload, which can cause it to break down
Let's move to the next page where we'll look into GPU heat levels and noise levels coming from this graphics card.