As most of you know, with most videocards you can apply a simple series of tricks to boost overall performance a little. You can do this at two levels: tweaking by enabling registry or BIOS hacks, or, more simply, by tampering with image quality. And then there is overclocking, which by far gives the best possible results.
What do we need? One of the best tools for overclocking NVIDIA and ATI videocards is our own RivaTuner, which you can download here. If you own an ATI or NVIDIA graphics card, the manufacturer also offers some very nice built-in options that can be found in the display driver properties. Alternatively you can use MSI Afterburner, which is based on RivaTuner and works with roughly 90% of the graphics cards out there. We can really recommend it; download here.
Where should we go? Overclocking: by increasing the frequency of the videocard's memory and GPU, we make the card complete more calculation cycles per second. It sounds hard, but it really can be done in a matter of minutes. I always recommend that novice users and beginners not increase the frequency any higher than 5% on the core and memory clock. Example: if your card runs at 600 MHz (which is pretty common these days), then I suggest you don't increase the frequency by more than 30 to 50 MHz.
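As a rough sketch of that rule of thumb, the little helper below computes a conservative beginner ceiling from a base clock. The function name and the exact 5% figure as a parameter are our own illustration, not part of RivaTuner or Afterburner:

```python
def conservative_oc(base_mhz, headroom=0.05):
    """Suggested beginner ceiling: base clock plus roughly 5% headroom."""
    return base_mhz + round(base_mhz * headroom)

# A 600 MHz core with 5% headroom tops out around 630 MHz.
print(conservative_oc(600))
```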
More advanced users often push the frequency way higher. Usually when your 3D graphics start to show artifacts such as white dots ("snow"), you should back down 10-15 MHz and leave it at that. When you overclock too hard, the card will start to show artifacts and empty polygons, or even freeze. Carefully find that limit and then back down at least 20 MHz from the point where you first notice an artifact. Look carefully and observe well. Granted, I really wouldn't know why you would need to overclock today's tested card anyway, but we'll still show it.
All in all... do it at your own risk.
                 Reference    Overclocked    Overclocked (with GPU V)
Core Clock:      675 MHz      815 MHz        949 MHz
Shader Clock:    1350 MHz     1630 MHz       1876 MHz
Memory Clock:    3600 MHz     4000 MHz       4300 MHz
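For perspective, the relative gains over the reference clocks listed above work out as simple arithmetic:

```python
def gain_pct(reference, overclocked):
    """Percentage increase of the overclocked figure over reference."""
    return round((overclocked / reference - 1) * 100, 1)

print(gain_pct(675, 949))    # core, with voltage tweak
print(gain_pct(3600, 4300))  # memory, effective clock
```

So the voltage-tweaked result is roughly a 40% core and 19% memory overclock.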
Now, we left the fan RPM control at its default setting in all circumstances. Even so, we reached a very nice overclock that yields measurably better results.
Without voltage tweaking your limit will be roughly 825~850 MHz on the core (1650~1700 MHz on the shader processors). Memory can be clocked at 4300 MHz effective. Anything higher results in an automated clock-down of the GPU/memory.
However, when you tweak the GPU voltage a little with an application like Afterburner (download here) and set it to 1.088V, you can take it up another notch; our stable end result was 949 MHz on the core and 4300 MHz on the memory.
In terms of the GPU overclock we were really pushing it, though; for a long-term overclock I'd recommend stepping down to 900 MHz.
Here's what that does to your overall performance.
3DMark Vantage - setup in Performance mode
An impressive gain in performance alright. Please do have a peek at the reference GTX 460 1024MB performance as well; that's a pretty big difference right there.
COD: Modern Warfare 2, maxed out image quality settings as before with 4xAA 16xAF
Battlefield BC2: maxed out image quality settings as before with 8xAA 16xAF