Galaxy GeForce 8800 GT HDMI w/ Xtreme Tuner -
Page 6 - Overclocking & Test bed
Overclocking & Tweaking
As most of you know, with most videocards you can apply a simple series of tricks to boost overall performance a little. You can do this at two levels: tweaking, by enabling registry or BIOS hacks, or, very simply, by tampering with image quality. And then there is overclocking, which will give you the best possible results by far.
What do we need?
One of the best tools for overclocking NVIDIA and ATI videocards is our own RivaTuner, which you can download here. If you own an ATI or NVIDIA graphics card, the manufacturer also offers very decent built-in options that can be found in the display driver properties.
Where should we go?
Overclocking: by increasing the frequency of the videocard's memory and GPU, we can make the videocard perform more calculation clock cycles per second. It sounds hard, but it really can be done in less than a few minutes. I always tend to recommend that novice users and beginners not increase the frequency any higher than 5% over the core and memory clock. Example: if your card runs at 500 MHz (which is pretty common these days), then I suggest you don't increase the frequency by any more than 25 to 50 MHz.
More advanced users often push the frequency way higher. Usually, when your 3D graphics start to show artifacts such as white dots ("snow"), you should back down 10-15 MHz and leave it at that. When you overclock too hard, the card will start to show artifacts, render empty polygons, or even freeze. Carefully find that limit and then back down at least 20 MHz from the point where you first notice an artifact. Look carefully and observe well. I really wouldn't know why you would need to overclock the cards tested today anyway, but we'll still show it ;)
All in all... do it at your own risk.
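To make the procedure above a little more concrete, here is a minimal Python sketch of that step-and-back-off logic. The shows_artifacts() helper and the step sizes are purely illustrative assumptions; in practice you raise the clock in a tuner utility and judge artifacts by eye while a 3D test loops.

```python
# Minimal sketch of the step-and-back-off routine described above.
# shows_artifacts() is a hypothetical placeholder: in reality you watch a looping
# 3D test for snow, empty polygons or freezes while raising the clock in a tuner.

STOCK_CORE_MHZ = 600.0                   # reference 8800 GT core clock
NOVICE_CAP_MHZ = STOCK_CORE_MHZ * 1.05   # the 5% rule of thumb for beginners
STEP_MHZ = 5.0                           # raise the clock in small steps
BACKOFF_MHZ = 20.0                       # back down at least this much after the first artifact

def shows_artifacts(clock_mhz: float) -> bool:
    """Placeholder: run a 3D test at this clock and report artifacts (judged by eye)."""
    raise NotImplementedError

def find_stable_clock(stock_mhz: float) -> float:
    clock = stock_mhz
    # Keep stepping up until the next step would produce artifacts.
    while not shows_artifacts(clock + STEP_MHZ):
        clock += STEP_MHZ
    # Back off at least 20 MHz from the first clock that showed artifacts.
    return (clock + STEP_MHZ) - BACKOFF_MHZ

print(f"Beginner ceiling (5% rule): {NOVICE_CAP_MHZ:.0f} MHz")
```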
- Any generic 8800 GT at reference clocks runs at 600 / 1500 / 1800 (core / shaders / memory).
- The standard factory clock for this card is 600 / 1512 / 2000 (core / shaders / memory).
- We overclocked it towards 700 / 1750 / 2400 (core / shaders / memory).
That is quite an excessive overclock already, and the memory could likely have been pushed heaps further still.
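For perspective, a quick back-of-the-envelope calculation (a small Python sketch, clock values taken from the list above) shows how far that overclock sits above the reference and factory clocks.

```python
# Percentage gains of our overclock over the reference and factory clocks listed above.
reference = {"core": 600, "shaders": 1500, "memory": 1800}  # generic 8800 GT
factory   = {"core": 600, "shaders": 1512, "memory": 2000}  # this card out of the box
our_oc    = {"core": 700, "shaders": 1750, "memory": 2400}  # our overclock

for domain in reference:
    over_ref = (our_oc[domain] / reference[domain] - 1) * 100
    over_fac = (our_oc[domain] / factory[domain] - 1) * 100
    print(f"{domain:7s}: +{over_ref:4.1f}% over reference, +{over_fac:4.1f}% over factory")
# core   : +16.7% over reference, +16.7% over factory
# shaders: +16.7% over reference, +15.7% over factory
# memory : +33.3% over reference, +20.0% over factory
```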
As you can see, the result is a notably faster performing card. The game you are looking at is Call of Duty 4.
Image quality settings:
- 4x Anti Aliasing
- 16x anisotropic filtering
- All settings maxed out
Hardware and Software Used
Now we begin the benchmark portion of this article, but first let me show you our test system plus the software we used.
nVIDIA nForce 680i SLI (eVGA)
Core 2 Duo X6800 Extreme (Conroe)
Various GeForce Series 8 cards
Corsair Dominator 2048 MB (2x1024 MB) DDR2 CAS4 @ 1142 MHz
Power Supply Unit
Enermax Galaxy 1000 Watt
Dell 3007WFP - up to 2560x1600
OS related Software
DirectX 9/10 End User Runtime
NVIDIA ForceWare 169.09
NVIDIA nForce 590/680i platform driver 9.53
Software benchmark suite
Call of Duty 4
World in Conflict
Ghost Recon: Advanced Warfighter 2
War Front: Turning Point
A word about "FPS"
What are we looking for in gaming, performance-wise? First off, Guru3D obviously tends to think that all games should be played at the best image quality (IQ) possible. There's a dilemma, though: IQ often interferes with the performance of a graphics card. We measure this in FPS, the number of frames a graphics card can render per second; the higher it is, the more fluidly your game will play.
A game's frames per second (FPS) figure is a measured average over a series of tests. That test is often a timedemo, a recorded part of the game that is a 1:1 representation of the actual game and its gameplay experience. After forcing the same image quality settings, this timedemo is then used for all graphics cards so that the measurement is as objective as possible.
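As a rough illustration of how such an average comes about, the sketch below derives an FPS figure from per-frame render times; the frame_times_ms values are made up for the example and are not our actual benchmark data.

```python
# Sketch: deriving an average FPS figure from the per-frame render times of a timedemo run.
frame_times_ms = [16.7, 18.2, 15.9, 22.4, 17.1, 19.8, 16.3, 21.0]  # illustrative values only

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds
print(f"{len(frame_times_ms)} frames in {total_seconds:.3f} s -> {average_fps:.1f} FPS average")
```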
| Frames per second | Gameplay |
| --- | --- |
| <30 FPS | very limited gameplay |
| 30-40 FPS | average yet very playable |
| 40-60 FPS | good gameplay |
| >60 FPS | best possible gameplay |
- So if a graphics card manages less than 30 FPS, the game is not very playable; we want to avoid that at all cost.
- With 30 FPS up to roughly 40 FPS you'll be perfectly able to play the game, with perhaps a tiny stutter in certain graphically intensive parts. Overall a very enjoyable experience. Match the best possible resolution to this result and you'll have the best possible rendering quality versus resolution; you want both of them to be as high as possible.
- When a graphics card is doing 60 FPS on average or higher, you can rest assured that the game will likely play extremely smoothly at every point, with every possible in-game IQ setting turned on.
- Over 100 FPS? You either have a MONSTER of a graphics card or a very old game.
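Put into code, the brackets from the table boil down to a simple classification; this is just a small Python sketch of that mapping, nothing more.

```python
def gameplay_rating(avg_fps: float) -> str:
    """Map an average FPS figure to the gameplay brackets from the table above."""
    if avg_fps < 30:
        return "very limited gameplay"
    if avg_fps < 40:
        return "average yet very playable"
    if avg_fps <= 60:
        return "good gameplay"
    return "best possible gameplay"

for fps in (25, 35, 55, 90):
    print(f"{fps:3d} FPS -> {gameplay_rating(fps)}")
```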