As most of you know, with most video cards you can apply a few simple tricks to boost overall performance a little. You can do this at two levels: tweaking by enabling registry or BIOS hacks, or simply lowering image quality. And then there is overclocking, which will give you the best results by far.
What do we need? One of the best tools for overclocking NVIDIA and ATI video cards is our own AfterBurner, which works with roughly 90% of the graphics cards out there. We can really recommend it; download it here.
Where should we go? Overclocking: by increasing the frequency of the video card's memory and GPU, we make the card perform more calculation cycles per second. It sounds difficult, but it really can be done in a few minutes. I always recommend that novice users and beginners not raise the core and memory clocks by more than 5%. Example: if your card runs at 600 MHz (which is pretty common these days), then I suggest you don't increase the frequency by more than 30 to 50 MHz.
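The 5% guideline above is simple arithmetic; here is a minimal sketch of it in Python. The function name `safe_target` is our own illustration, not part of any tuning tool:

```python
# Hypothetical helper illustrating the +5% guideline for beginners.
def safe_target(base_mhz, pct=5):
    """Return the base clock plus a pct% safety margin, rounded down."""
    return base_mhz + int(base_mhz * pct / 100)

print(safe_target(600))   # 630  -> a +30 MHz bump on a 600 MHz core
print(safe_target(1000))  # 1050 -> a +50 MHz bump on a 1000 MHz core
```

In other words, a card at 600 MHz gets roughly 30 MHz of conservative headroom, which matches the example in the text.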
More advanced users often push the frequency much higher. When you overclock too hard, your 3D graphics will start to show artifacts such as white dots ("snow") or empty polygons, or the card may even freeze. Carefully find that limit, then back down at least 20 MHz from the point where you first noticed an artifact, and leave it at that. Look carefully and observe well. I honestly wouldn't know why you'd need to overclock today's tested card anyway, but we'll still show it.
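The procedure just described (step up until artifacts appear, then back off) can be sketched as a small loop. Note that `is_stable` is a placeholder for an actual artifact check (run a 3D stress test and watch for snow or empty polygons); it is not a real API:

```python
# Sketch of the stepping procedure described above, under the assumption
# that `is_stable(clock)` reports whether a test run at that clock is
# artifact-free. The names and numbers here are illustrative only.
def find_stable_clock(base_mhz, step=15, limit_mhz=200, is_stable=None):
    clock = base_mhz
    # Raise the clock in small steps while the card stays artifact-free
    # and we remain within a sane overall limit.
    while clock - base_mhz < limit_mhz and is_stable(clock + step):
        clock += step
    # Back down ~20 MHz from the edge for a safety margin.
    return clock - 20

# Example: pretend artifacts appear above 1060 MHz.
print(find_stable_clock(1000, is_stable=lambda c: c <= 1060))  # 1040
```

The point of the final subtraction is exactly the advice in the text: a clock that is barely stable today may not be stable on a hot day, so you settle a bit below the observed limit.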
All in all... do it at your own risk.
Core Clock: 822 MHz → 1000 MHz → 1044 MHz
Shader Clock: 1644 MHz → 2088 MHz
Memory Clock: 4008 MHz → 4950 MHz
We left fan control at its default, self-regulating setting, and during the overclock the card did not get noisy at all. Our stable end result was not a huge overclock, as the product already ships with a substantial factory clock, yet still a good 1044 MHz on the core and 4950 MHz on the memory. Temperatures went up by merely a few degrees C, and we did not apply any voltage tweaks.
Here's what that does for overall game performance.
Above: Call of Duty: Modern Warfare 2, maxed-out image quality settings as before, with 4xAA and 16xAF.
Above: Battlefield: Bad Company 2, maxed-out image quality settings as before, with 8xAA and 16xAF.