It's been a while since we've done an extensive overview of ForceWare driver performance. I had some time to kill last weekend and thus decided to do a little ForceWare performance comparison...
ForceWare drivers are the reference graphics card drivers that NVIDIA distributes to the public and to graphics card manufacturers. In general, all NVIDIA graphics cards are supported by these drivers; most of the drivers, however, should be considered beta. Every driver you will see today can be found in our download section, so if you'd like to give a certain driver a go, just grab it. Mind you that some, if not most, drivers are beta, meaning some systems or games can have issues with them. All the drivers in today's test at least passed the benchmarks, otherwise they would not have been included. What also needs to be mentioned is that this test was aimed specifically at performance. You will find no image quality tests here, as including them would take at least a week.
In a recent poll we noticed that almost all of you run your games with optimizations enabled, and why shouldn't you? In real-time gameplay you will definitely not see the differences with the naked eye. Therefore we opted to test the drivers with optimizations enabled. We test each driver twice: once with image quality settings disabled, thus 0xAA and 0xAF (no antialiasing and no anisotropic filtering), and then a run with 4xAA and 8xAF, to determine whether you can see performance differences among the drivers.
I'd like to note here that drivers are of course not developed solely for performance. NVIDIA has several driver groups, for example for its Quadro series and its consumer series, but also different development lines: group A might focus on the newest upcoming product, while group B develops SLI, group C works on game optimizations and group D on bug fixes. In the past you may have seen drivers released with a huge difference in build number; this is a hypothetical explanation for that. Down the line the code comes together, targeted at a specific audience or purpose. We focus on only one small part of that today: performance.
The software used in today's article will be Doom 3, Far Cry, Halo: Combat Evolved, Splinter Cell, Unreal Tournament 2004 and one synthetic benchmark, 3DMark05 (Business Edition). As you can see, this is a selection of today's popular games, and most of these newer titles push the graphics card to its maximum.
All software is tested with the highest possible in-game quality setting at:
800x600 @ 32-Bit (0xAA - 0xAF)
800x600 @ 32-Bit (4xAA - 8xAF)
1024x768 @ 32-Bit (0xAA - 0xAF)
1024x768 @ 32-Bit (4xAA - 8xAF)
1280x1024 @ 32-Bit (0xAA - 0xAF)
1280x1024 @ 32-Bit (4xAA - 8xAF)
1600x1200 @ 32-Bit (0xAA - 0xAF)
1600x1200 @ 32-Bit (4xAA - 8xAF)
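For reference, the full test matrix above (four resolutions, each with and without the image quality preset) can be generated programmatically. A minimal sketch in Python; the names here are illustrative and not taken from any benchmark tool:

```python
from itertools import product

# The four resolutions and two image-quality presets used in this test.
resolutions = ["800x600", "1024x768", "1280x1024", "1600x1200"]
iq_presets = ["0xAA - 0xAF", "4xAA - 8xAF"]

def test_matrix():
    """Return all resolution/IQ combinations as display strings."""
    return [f"{res} @ 32-Bit ({iq})" for res, iq in product(resolutions, iq_presets)]

for setting in test_matrix():
    print(setting)
```

With six games and eight settings per game, each driver means 48 benchmark runs, which is why a comparison across many drivers takes a full weekend.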
To be able to do such a test you need a current and, above all, up-to-date system. The test system used is therefore the new PCI-Express Intel 915G based Albatron Mars PX915G Pro with 512 MB DDR1 memory running at 400 MHz and an Intel Pentium 4 560 (3.6 GHz Prescott). The graphics card used is NVIDIA's GeForce 6800 GT reference model.
The numbers (FPS = Frames Per Second)
Now, what you need to observe is simple: the numbers versus the screen resolution. The higher, the better.
The numbers represent what we call FPS, frames per second. A game's frames-per-second figure is a measured average over a series of tests. That test is often a timedemo, a recorded part of the game which is a 1:1 representation of the actual gameplay. After forcing the same image quality settings, this timedemo is then used for all graphics cards so that the actual measuring is as objective as it can be for all graphics cards; in today's article that card is a GeForce 6800 GT PCI-Express.
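As a rough illustration of how an average FPS figure comes out of a timedemo run (a hypothetical sketch, not how any particular benchmark tool works internally):

```python
def average_fps(frame_times_ms):
    """Average FPS over a timedemo: frames rendered divided by total time.

    frame_times_ms: per-frame render times in milliseconds, as a
    benchmark tool might record them while playing back a timedemo.
    """
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Example: four frames at 20 ms and two at 40 ms -> 6 frames in 0.16 s
print(round(average_fps([20, 20, 20, 20, 40, 40]), 1))  # 37.5
```

Note that an average can hide brief stutters; that is why the next paragraphs talk about ranges rather than a single magic number.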
If a card stays below 30 FPS, it is barely able to play the game. From 30 FPS up to roughly 40 FPS you'll be very able to play the game, with perhaps a tiny stutter at parts that are especially intensive on the graphics card.
When a graphics card averages 60 FPS or higher, you can rest assured that the game will likely play extremely smoothly at every point in the game.
You are always aiming for the highest possible FPS versus the highest resolution versus the highest image quality.
Frames per second:
below 30 FPS: very limited gameplay
30 to 40 FPS: average yet playable
60 FPS and up: best possible gameplay
We have chosen such a high-end system because otherwise the results would be influenced by bottlenecks. For example, if we used a GeForce FX 5500, the results would be rendered absolutely useless, as that graphics card is too slow for current games; the graphics card itself would become the bottleneck, making objective benchmarking impossible. The same goes for the CPU: if we used too slow a processor, you'd see a flat cap on performance, as the graphics card would like to go faster yet the CPU can't process game data fast enough.
So by using a high-end system we are simply making sure that all variables are fine-tuned to each other.
To maintain consistency, the BIOS settings were left at default and, apart from the normal driver settings, vsync was disabled. After each test run with a driver, I re-ghosted the system to get a clean operating system.
In this little test it all boils down to Frames per second...
The graphics card used for this test - GeForce 6800 GT - PCI-Express
ForceWare 71.20 Performance Comparison: in this article we compare the newer ForceWare 71.20 driver against older builds with the help of software like Doom 3, 3DMark05, Splinter Cell, Halo and more.