Moving to a 28nm process means you can do more with less as the die shrinks; the GK110 GPU packs roughly 7.1 billion transistors. As such, we really expected the board power draw / TDP (maximum board power) to sit at a high number, yet at a very respectable 250 Watts you can hardly complain. The GeForce GTX Titan comes with two power connectors to draw enough current, with a little spare for overclocking. That boils down to: 1x 8-pin PEG (150W) + 1x 6-pin PEG (75W) + PCIe slot (75W) = 300W available (in theory). We'll measure all that later on in the article, but directly related to the power design is the following chapter on NVIDIA's Boost technology, which has advanced to revision 2.0.
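To put that budget in numbers, here's a quick back-of-the-envelope sketch in Python (the connector ratings are PCI Express spec values, not measurements of this card):

```python
# PCI Express spec ratings; the 250 W TDP is NVIDIA's stated board power.
PCIE_SLOT_W = 75    # power delivered through the PCIe x16 slot
PEG_6PIN_W  = 75    # 6-pin PEG connector
PEG_8PIN_W  = 150   # 8-pin PEG connector

available = PCIE_SLOT_W + PEG_6PIN_W + PEG_8PIN_W  # 300 W in theory
tdp = 250
print(f"Available {available} W, TDP {tdp} W, headroom {available - tdp} W")
```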
Ever since the launch of Kepler GPUs, the mainstream and high-end SKUs have featured a Dynamic Clock Adjustment technology, and it is easy to explain. Typically, when your graphics card idles, its clock frequency goes down... yes? Kepler architecture cards obviously do this as well, yet it now also works the other way around. If in a game the GPU has headroom left, it will increase the clock frequency a little and add some extra performance. You could say that the graphics card is maximizing its available power threshold and target.
This is all managed by dedicated hardware circuitry that monitors various aspects of the graphics card, from the GPU's power consumption and temperature to the actual GPU and memory utilization. All that information is processed by the GPU Boost software algorithm to determine what changes, if any, should be made to the GPU and memory clock speeds and voltages.
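NVIDIA doesn't publish the actual algorithm, but conceptually it behaves like a simple control loop. Here's a hypothetical Python sketch; all names and the step size are our own assumptions, not NVIDIA's implementation:

```python
# Hypothetical sketch of a GPU Boost style control loop; the real logic
# lives in hardware and the driver. The 13 MHz step is an assumed bin size.
def boost_step(power_w, power_target_w, clock_mhz, base_mhz, boost_max_mhz,
               step_mhz=13):
    """Return the next core clock: raise it while the board sits under its
    power target, lower it when the target is exceeded."""
    if power_w < power_target_w and clock_mhz + step_mhz <= boost_max_mhz:
        return clock_mhz + step_mhz   # headroom left: step one bin up
    if power_w > power_target_w and clock_mhz - step_mhz >= base_mhz:
        return clock_mhz - step_mhz   # over target: step one bin down
    return clock_mhz                  # on target (or at a limit): hold
```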
GPU Boost 2.0
So what's new? New starting with the GeForce GTX Titan is a temperature target. Basically, Titan monitors a temperature (that you can define) and will try to meet that target. The nominal baseline temperature is 80 degrees Celsius, the balance between an acceptable temperature and low noise levels. If you configure the temperature target at 90 degrees and the power target has room left, then Titan will increase the GPU voltage a little bit. It will then clock higher on the Turbo frequency until it reaches the temperature and power targets.
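Conceptually, Boost 2.0 just adds a second gate to that same loop. A hypothetical sketch, extending the one above (again our own illustration, not NVIDIA's code; the 80 degrees C default comes straight from the paragraph above):

```python
# Hypothetical GPU Boost 2.0 sketch: the clock is now gated by a
# user-configurable temperature target as well as the power target.
def boost2_step(power_w, power_target_w, temp_c, temp_target_c,
                clock_mhz, base_mhz, boost_max_mhz, step_mhz=13):
    over_budget = power_w > power_target_w or temp_c > temp_target_c
    if not over_budget and clock_mhz + step_mhz <= boost_max_mhz:
        return clock_mhz + step_mhz   # room on BOTH targets: boost higher
    if over_budget and clock_mhz - step_mhz >= base_mhz:
        return clock_mhz - step_mhz   # either target exceeded: back off
    return clock_mhz
```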
Overclocking works the same way: GPU Boost continues to operate while you overclock, but it stays restricted within the TDP bracket. We'll show you that in our overclocking chapter. Overclocking tools like MSI Afterburner and EVGA Precision will be updated during GeForce GTX Titan launch week, allowing you to tweak and overclock based on power, voltage and temperature targets.
We've already had Titan clocked at over 1.1 GHz in our own testing. Can you imagine what severe coolness liquid cooling would bring?
Unlocked GPU core Voltage
The GeForce GTX Titan is designed to be overclocked. NVIDIA received a lot of heat when they started limiting voltages; obviously they did so to prevent high RMA rates. For the GeForce GTX Titan this changes. By default your card will be locked at a maximum core voltage of 1.162 V.
Now read this very carefully: board partners like MSI, EVGA and others get to decide whether or not you may unlock voltage control. Inside the NVIDIA driver you can opt to unlock the voltage by agreeing to an EULA. That EULA will try to make you understand that applying higher voltages will decrease the lifespan of the product. So if the GeForce GTX Titan has been built for a theoretical five years of productivity at 1.162 V, then tweaking the voltage towards 1.250 V could (in theory) halve that lifespan. Now here's the good news: unlocking the voltage will not void your warranty, let me be very clear about that. However, if you have a two-year warranty and after three years the card dies as a result of voltage tweaking and thus the reduced lifespan... that is the consequence, and at your own risk.
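To make that warranty logic concrete, here's the arithmetic from the example above (illustrative figures only, not a real reliability model):

```python
# Illustrative arithmetic using the article's own example numbers.
rated_years    = 5.0               # theoretical lifespan at the stock 1.162 V
tweaked_years  = rated_years / 2   # "could (in theory) halve" at 1.250 V
warranty_years = 2.0

# The card could then fail after ~2.5 years: outside the 2-year warranty,
# so the shortened lifespan is entirely the tweaker's own risk.
print(tweaked_years, tweaked_years > warranty_years)  # 2.5 True
```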
Some board partners thus might leave out the option and disable voltage unlocking completely (albeit we doubt it). Still, if you plan to voltage tweak your Titan, be sure to check out the particular SKU and its ability to make that happen.
Display Overclocking
Now here's where a couple of you guys will say 'huh?'. Pretty much all monitors run at a maximum of 60 Hz and thus can only show 60 frames per second. But did you know that a lot of monitors can actually take higher refresh rates like 70, 80, maybe even 100 Hz?
The higher your monitor's refresh rate, the better your gaming experience, as you can output more rendered frames to your screen. And yes, it does matter. Typically what many of you did was hack the EDID information of your monitor to 'overclock' it that way. EDID (Extended Display Identification Data) is a data-set that the monitor sends to the display adapter; the graphics card reads out that information and then drives the screen at a compatible resolution, refresh rate and color depth.
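To give you an idea of what's actually in there, here's a minimal Python sketch of how a refresh rate falls out of an EDID's first detailed timing descriptor (a real parser would validate the checksum and descriptor type first):

```python
import struct

def dtd_refresh_hz(edid: bytes) -> float:
    """Refresh rate encoded in the first detailed timing descriptor of a
    128-byte EDID base block (bytes 54-71). Minimal sketch, no validation."""
    dtd = edid[54:72]
    pixel_clock_hz = struct.unpack_from("<H", dtd, 0)[0] * 10_000  # 10 kHz units
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # horizontal active pixels
    h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)   # horizontal blanking pixels
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)   # vertical active lines
    v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)   # vertical blanking lines
    return pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
```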
NVIDIA will now allow you to overclock the display by creating custom profiles, so you can check whether your monitor can take, say, 70 or 80 Hz (or whatever). Once you have found a compatible refresh rate you can apply it and keep using it, breaking away from the 60 Hz limitation.
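To see why one panel copes with 75 Hz while another doesn't, consider the pixel clock a higher refresh rate demands. A quick sketch, using the standard 1080p total timing of 2200x1125 pixels (active plus blanking) as the example:

```python
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock (MHz) needed to drive a given total timing at a refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(2200, 1125, 60))  # ~148.5 MHz at 60 Hz
print(pixel_clock_mhz(2200, 1125, 75))  # ~185.6 MHz at 75 Hz
```

That extra ~37 MHz is what the monitor's electronics (and the cable) have to cope with, which is why some screens take the overclock gracefully and others simply blank out.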
A personal note: I have no idea what effect this will have on the lifespan of your monitor. As with any kind of tweak or overclock, use it with caution and only if you deem it absolutely necessary, weighing the risks against the benefits.