Page 6 - Overclocking & ForceWare
Media Playback

As always, we have to mention the oft-forgotten item in reviews nowadays: multimedia/movie playback. In an old review, I believe it was during the GeForce 3 era, I stated that NVIDIA should take a good look at ATI when it comes to dual-monitor setups and movie playback through the S-Video connector. Well, NVIDIA did just that, more than ever since they released the GeForce 4 family. Seriously, movie playback and video output have improved tremendously over previous generations. I could tell stories about what has changed and why: the new driver "tweaks," bob and weave de-interlacing, and so on. But basically it's all way too much blah blah.
Images taken with GeForce FX from the movie: Red Dragon - JPEG compression makes it a bit fuzzy.
Basically, all you need to know is that playback is now truly at a level that compares equally with NVIDIA's arch rival, ATI's Radeon series. This is where the 5900's real value lies.
Performance & Overclocking
Before we dive into an extensive series of tests, we need to discuss overclocking. With most videocards, we can do some easy tricks to boost overall performance a little. It's called overclocking the videocard. By increasing the frequency of the videocard's memory and GPU, we make the videocard perform more calculation clock cycles per second. It sounds hard but it really can be done in less than a few minutes. I always tend to recommend that novice users and beginners not increase the frequency any higher than 5-10% over the core and memory clock. Example: if your card runs at 300 MHz, then I suggest you don't increase the frequency any higher than 330 MHz.
More advanced users often push the frequency way higher. Usually when your 3D graphics start to show artifacts such as white dots ("snow"), you should go down 10 MHz and leave it at that. The core can behave somewhat differently. Usually when you overclock it too hard, it'll start to show artifacts or empty polygons, or it will even freeze. I recommend that you back down at least 15 MHz from the moment you notice an artifact. Look carefully and observe well.
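The guidelines above boil down to some simple arithmetic. Here is a minimal sketch of that math in Python; the function names are hypothetical and purely illustrative, since the actual tuning is done in a utility such as RivaTuner, not in code.

```python
# Sketch of the conservative overclocking rules of thumb described above.
# These helpers are hypothetical; they only illustrate the arithmetic.

def novice_limit(stock_mhz, headroom=0.10):
    """Suggested upper limit for a beginner: 5-10% over the stock clock."""
    return round(stock_mhz * (1 + headroom))

def backoff(artifact_mhz, step_mhz):
    """Once artifacts appear, step the clock back down:
    10 MHz for memory "snow", at least 15 MHz for core artifacts."""
    return artifact_mhz - step_mhz

# Example from the text: a card at 300 MHz should not go past ~330 MHz.
print(novice_limit(300))     # 330
# Memory showing "snow" at 980 MHz: settle 10 MHz lower.
print(backoff(980, 10))      # 970
# Core artifacting at 500 MHz: back off at least 15 MHz.
print(backoff(500, 15))      # 485
```

The point of the back-off step is simply to leave a stability margin below the point where the first artifact was observed.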
All in all... do it at your own risk. Overclocking your card too far or constantly to its limit might damage your card and it's usually not covered by your warranty.
You will benefit most from overclocking with a product that is limited or, as you might call it, "tuned down." We know that this graphics core is often held back by clock frequency or bandwidth limitations, so by increasing the memory and core frequency we should be able to witness some higher performance results. A simple trick to get some more bang for your buck.
Overclocking a card like the GeForce FX series is a tricky task, as NVIDIA has a built-in safety feature: if the core reaches a certain temperature threshold, it will automatically lower the clock speed to prevent the GPU from overheating. It's a nice feature, yet it makes overclocking a tad harder, as it becomes difficult to figure out where the maximum is.
The PixelView GeForce FX 5900 XT Golden Limited from Prolink, with 128 MB of 256-bit DDR memory, runs at a default 390 MHz for the core and 700 MHz (2x350) for its memory. This particular 5900 was a magnificent overclocker, reaching 485 MHz for the core frequency and, check this out, 950 MHz for its memory. Now that's going to make a noticeable difference in our tests, performance-wise.
By the way, for good overclocking software, grab our tweak utility RivaTuner 2.0.
Benchmark Software Suite:
* For four of these games we are making use of a custom time demo. Neither NVIDIA nor ATI knows what time demo we are using. These are non-public tests which were recorded for us only. We are not going to make them public either, as they are and will remain internal material. Therefore the chipset manufacturer will not have the chance of optimizing for that specific benchmark time-demo. We will do our very best, now and in the future, to keep a close eye on optimizations and cheats, as we need to be able to show you objective results. However, in the end this should be the responsibility of the chipset designer; if that entity fails here, then it'll lose consumers' trust and dig its own grave.