Moving to a 28nm process means you can do more with less as the chip shrinks; the GK110 GPU packs roughly 7.1 billion transistors. As such we expected the board power draw / TDP (maximum board power) to sit at a high number, but at a very respectable 250 Watts you can hardly complain. The GeForce GTX Titan comes with two power connectors to draw enough current, with a little headroom to spare for overclocking. This boils down to: 1 x 8-pin PEG (150W) + 1 x 6-pin PEG (75W) = 225W, plus 75W from the PCIe slot, for 300W available (in theory). We'll measure all that later on in the article, but directly related to the power design is the following chapter on NVIDIA's Boost technology, which has advanced to revision 2.0.
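As a quick sanity check, that theoretical budget is just the sum of the PCI Express power delivery limits; a minimal sketch (the connector wattages are the spec limits, not measured values):

```python
# Theoretical power budget for a card with one 8-pin and one 6-pin PEG
# connector, per the PCI Express specification limits.
PCIE_SLOT_W = 75    # power delivered through the x16 slot itself
PEG_6PIN_W = 75     # 6-pin PCI Express Graphics connector
PEG_8PIN_W = 150    # 8-pin PCI Express Graphics connector

connectors_w = PEG_8PIN_W + PEG_6PIN_W    # 225 W from the cables
total_w = connectors_w + PCIE_SLOT_W      # 300 W available, in theory
tdp_w = 250                               # Titan's rated board power

headroom_w = total_w - tdp_w              # 50 W of theoretical overclocking headroom
print(connectors_w, total_w, headroom_w)  # 225 300 50
```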
Ever since the launch of the Kepler GPUs, the mainstream and high-end SKUs feature Dynamic Clock Adjustment technology, and we can explain it without much complexity. Typically when your graphics card idles, its clock frequency goes down... yes? Kepler architecture cards obviously do this as well, yet it now works vice versa too. If in a game the GPU has headroom left, it will increase the clock frequency a little and add some extra performance. You could say that the graphics card is maximizing its available power threshold and target.
This is all managed by dedicated hardware circuitry that monitors various aspects of the graphics card, from the GPU's power consumption and temperature to the actual GPU and memory utilization. All that information is processed by the GPU Boost algorithm to determine what changes, if any, should be made to the GPU and memory clock speeds and voltages.
GPU Boost 2.0
So what's new? New starting with the GeForce GTX Titan is a temperature target: basically, Titan monitors a temperature (which you can define) and will try to meet that target. The nominal baseline temperature is 80 degrees Celsius, the balance between an acceptable temperature and low noise levels. If you configure the temperature target at 90 degrees and the power target has room left, then Titan will increase the GPU voltage a little bit. It will then clock up to a higher Turbo frequency until it reaches the temperature and power targets.
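The behaviour described above can be sketched as a simple control loop. This is purely illustrative pseudo-logic of how a temperature/power-target boost scheme works, not NVIDIA's actual algorithm; the step size and clock limits below are made-up round numbers:

```python
def boost_step(clock_mhz, temp_c, power_w,
               temp_target_c=80, power_target_w=250,
               step_mhz=13, base_mhz=837, max_boost_mhz=1100):
    """One iteration of an illustrative Boost-2.0-style control loop.

    If both the temperature and power targets have headroom, nudge the
    clock up; if either target is exceeded, back the clock down.
    """
    if temp_c < temp_target_c and power_w < power_target_w:
        return min(clock_mhz + step_mhz, max_boost_mhz)   # room left: boost
    if temp_c > temp_target_c or power_w > power_target_w:
        return max(clock_mhz - step_mhz, base_mhz)        # over target: throttle
    return clock_mhz                                      # sitting on target

# A cool, lightly loaded GPU gets boosted one step...
print(boost_step(993, temp_c=70, power_w=200))   # 1006
# ...while one running over the 80 C target gets throttled back.
print(boost_step(1006, temp_c=85, power_w=240))  # 993
```

Raising the temperature target (say to 90 C, as in the example above) simply gives the first branch more room before the throttle branch kicks in.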
Overclocking works the same way, as GPU Boost continues to operate while overclocking; it stays restricted within the TDP bracket. We'll show you that in our overclocking chapter. Overclocking tools like MSI Afterburner and EVGA Precision will be updated during GeForce GTX Titan launch week, allowing you to tweak and overclock based on power, voltage and temperature targets.
We've had Titan clocked at over 1.1 GHz in our own testing already. Can you imagine what that means with liquid cooling?
With the soon to be released MSI Afterburner 3.0.0 you will be able to control many new settings, like an updated power limiter, a temperature limiter and priority features. In the monitor on the right side you can observe that the GPU is kept at 80 degrees C at all times. The card will lower voltage, clocks or whatever is needed to match the temperature target.
All these settings are configurable, so you may set a temperature target of 90 degrees C as well, which will get you more performance at the cost of slightly more noise. We'll discuss all this in our overclocking article though.
Unlocked GPU core Voltage
The GeForce GTX Titan is designed to be overclocked. Nvidia received a lot of heat when they started limiting voltages; obviously they did so in order to prevent high RMA rates. For the GeForce GTX Titan this changes. At default your card will be locked at a maximum core voltage of 1.162 V.
Now read this very carefully: board partners like MSI, EVGA and others get to decide whether or not you may unlock voltage control. Inside the Nvidia driver you can opt to unlock the voltage by agreeing to an EULA. That EULA will try to make you understand that applying higher voltages will decrease the lifespan of the product. So if the GeForce GTX Titan has been built for a theoretical 5 years of productive life at 1.162 V, then tweaking the voltage towards 1.250 V could (in theory) halve that lifespan. Now here's the good news: unlocking the voltage will not result in losing your warranty, let me be very clear about that. However, if you have a 2-year warranty and after 3 years the card dies as a result of voltage tweaking and thus the reduced lifespan... that would be the consequence, and at your own risk.
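To make that theoretical example concrete (the 5-year figure and the halving are the hypothetical illustration from the paragraph above, not measured data):

```python
rated_voltage_v = 1.162     # V, default maximum core voltage
tweaked_voltage_v = 1.250   # V, unlocked overvolt
rated_lifespan_years = 5.0  # theoretical design lifetime at the rated voltage

# In the example above, running at the higher voltage halves the lifespan,
# which would push the failure point past a typical 2-year warranty window
# but well short of the original design lifetime.
tweaked_lifespan_years = rated_lifespan_years / 2
print(tweaked_lifespan_years)  # 2.5
```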
Some board partners thus might leave out the option and disable voltage unlocking completely (albeit we doubt it). Still, if you plan to voltage tweak with your Titan, be sure to check out the particular SKU and its ability to make that happen. From what we have seen, most boards will get voltage control, yet limited to 1.20 Volts.
Display Overclocking
Now here's where a couple of you guys will say "huh?". Pretty much all monitors have a maximum refresh rate of 60 Hz and thus can only show 60 frames per second. But did you know that a lot of monitors can actually take higher refresh rates like 70, 80, maybe even 100 Hz?
The higher your monitor's refresh rate, the better your gaming experience, as you can output more rendered frames to your screen. And yes, it does matter. Typically what many of you did was to hack the EDID information of your monitor and "overclock" it that way. EDID (Extended Display Identification Data) is a data-set that the monitor sends to the display adapter; the graphics card reads out that information and then outputs a video signal at a compatible screen resolution, refresh rate and color depth.
Nvidia will now allow you to overclock the display by making custom profiles, so you can check whether the monitor can take, say, 70 or 80 Hz (or whatever). If you find a compatible refresh rate you can apply it and keep using it, breaking away from the 60 Hz limitation.
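To get a feel for what a higher refresh rate asks of the display link: the required pixel clock scales linearly with the refresh rate. A small sketch (the 2080 x 1111 total timings below are hypothetical round numbers for a 1920x1080 panel including blanking, not any real monitor's EDID values):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock required for the given total timings (active + blanking)."""
    return h_total * v_total * refresh_hz / 1e6

# Hypothetical 1920x1080 panel with 2080 x 1111 total timings.
for hz in (60, 70, 80):
    print(hz, round(pixel_clock_mhz(2080, 1111, hz), 1))
# 60 138.7
# 70 161.8
# 80 184.9
```

This is why a monitor's electronics may cope with 70 Hz but give up somewhere above that: every extra hertz costs a proportionally faster pixel clock.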
On a personal note, I have no idea what effect this will have on the lifespan of your monitor. As with any kind of tweak or overclock, use it with caution and only if you deem it absolutely necessary, weighing the risks versus the benefits.