As most of you know, with most videocards you can apply a few simple tricks to boost overall performance a little. You can do this at two levels: tweaking by enabling registry or BIOS hacks, or simply tampering with image quality settings. And then there is overclocking, which will give you the best possible results by far.
What do we need? One of the best tools for overclocking NVIDIA and ATI videocards is our own RivaTuner, which you can download here. If you own an ATI or NVIDIA graphics card, the manufacturer actually has some very nice built-in options for you that can be found in the display driver properties. Alternatively, you can use the RivaTuner-based MSI AfterBurner, which works with roughly 90% of the graphics cards out there. We can really recommend it; download here.
Where should we go? Overclocking: by increasing the frequency of the videocard's memory and GPU, we can make the videocard perform more calculation cycles per second. It sounds hard, but it really can be done in less than a few minutes. I always recommend that novice users and beginners not increase the frequency any higher than 5% on the core and memory clocks. Example: if your card runs at 600 MHz (which is pretty common these days), I suggest you don't increase the frequency by more than roughly 30 MHz.
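The 5% rule of thumb above boils down to very simple arithmetic. A minimal sketch, assuming a hypothetical helper of our own naming (this is not part of AfterBurner or RivaTuner):

```python
# Illustration of the 5% beginner rule described above.
# safe_overclock is our own hypothetical helper, not a real tool's API.
def safe_overclock(base_mhz, pct=5):
    """Return a conservative maximum clock for a given base frequency."""
    return base_mhz + base_mhz * pct / 100.0

# A 600 MHz core under the 5% rule tops out around 630 MHz.
print(safe_overclock(600))  # 630.0
```

So for the 600 MHz example card, the conservative ceiling is about 630 MHz on the core.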
More advanced users often push the frequency way higher. When you overclock too hard, your 3D graphics will start to show artifacts such as white dots ("snow"), empty polygons, or the card will even freeze. Carefully find that limit, then back down at least 20 MHz from the clock at which you first notice an artifact. Look carefully and observe well. I really don't know why you would need to overclock today's tested card anyway, but we'll still show it.
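The "raise until artifacts, then back off" routine above can be sketched as a simple loop. This is purely illustrative; the function and the artifact check are our own stand-ins (in reality the check is you watching the screen during a 3D test):

```python
# Hypothetical sketch of the manual procedure described above.
# shows_artifacts is a stand-in for visually spotting snow/empty polygons.
def find_stable_clock(start_mhz, step, shows_artifacts, backoff=20):
    """Step the core clock up until artifacts appear, then back down."""
    mhz = start_mhz
    while not shows_artifacts(mhz):
        mhz += step
    # mhz is the first clock that produced artifacts; retreat from it
    return mhz - backoff

# Pretend the card starts artifacting at 950 MHz:
print(find_stable_clock(780, 10, lambda mhz: mhz >= 950))  # 930
```

In practice you would also re-run a stress test at the final clock for a while, since some instability only shows up after the card heats up.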
All in all... do it at your own risk.
Core Clock: 675 MHz
Core Clock: 780 MHz
Core Clock: 952 MHz
Shader Clock: 1350 MHz
Shader Clock: 1904 MHz
Memory Clock: 3600 MHz
Memory Clock: 4400 MHz
We left the fan RPM control at default in all circumstances and still reached a very decent overclock, guaranteeing better results. We tweaked the GPU voltage a little with AfterBurner. Obviously, if you'd like to give it a go, go grab AfterBurner (download here); VRM control is fully supported, and the HAWK card allows PLL and MEM voltage tweaks as well. We increased the GPU voltage by 100 mV (really, don't overdo it) and maxed out the memory and AUX voltages.
With the overclock, the temperature now rises to roughly 77 degrees C under load. Noise levels rose to 44 dBA under full stress, and power consumption went up from 342 W to 435 W. When we applied the full maximum GPU/AUX/MEM voltages, we noticed the power draw rising to 503 W (for the entire system, crazy really). But let's re-check some numbers, shall we?
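A quick back-of-envelope check on the measured whole-system power figures quoted above (the variable names are our own labels for the numbers in the text):

```python
# System power draw measured in this review, in watts.
stock_w, oc_w, maxv_w = 342, 435, 503

print(oc_w - stock_w)                       # 93 W extra from the OC + 100 mV bump
print(round((maxv_w / stock_w - 1) * 100))  # ~47% higher with all voltages maxed
```

That near-50% jump in system power draw for the fully volted configuration is why we keep warning not to overdo the voltage tweaks.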
3DMark Vantage - setup in Performance mode
Another massive gain in performance alright. Unfortunately we did not manage to pull off a 1 GHz overclock; that would have been something. And please do have a peek at the reference performance.
SOC model with COD: Modern Warfare 2, maxed out image quality settings as before with 4xAA 16xAF