Bundled Items In the box we find a reasonable software bundle. Let's have a look:
PowerDVD 5, Thief and Joint Operations are all full-version software titles that you get for free with this graphics card. Alongside them we find a driver CD.
Next to the manual we find an S-Video to Composite output cable and, last but not least, a DVI to CRT/VGA adapter, so you are able to hook up two CRT VGA monitors to the card. The package lacked a Molex power supply cable, which I think really should be included.
So you have a pretty nice bundle to start with right there.
The Installation It's really not hard to install a graphics card yourself nowadays, especially with brands like ATI and NVIDIA, who use unified driver sets. If you have a brand-new product, make sure you have the latest drivers on your hard drive first. Then carefully uninstall your current graphics card's drivers; this is very important, especially if the old card came from a different chipset manufacturer. Now power down the PC and pull out the power cable. Insert the graphics card into the slot, secure it with a screw, connect the monitor and boot into Windows, run the driver installation, then restart and you are set to go. That's all. Also important: make sure you have the latest version of DirectX (9.0c) installed.
Power Supply - Gossip Control One of the most discussed topics about the 6800 series in our forums is its power consumption. Initially NVIDIA recommended testing the 6800 with a 480 Watt power supply. It really does not need that much, folks! In fact, any decent 350 Watt PSU will be sufficient. The 6800 Ultra actually uses only about 15 Watts more than the 5950 Ultra. It's more, but nothing that scary.
We used a standard 350 Watt PSU together with a Pentium 4 2.8 GHz, 1 GB test rig and a DVD drive. The GeForce 6800 GT never crashed once, and neither did the Ultra for that matter...
UltraShadow II Last year we got acquainted with NVIDIA's UltraShadow technology. Shadows, nothing new you might say. Yet precise and accurate shadows are very important to a 3D scene. They contribute to the scene's atmosphere; it's a step closer to the real thing, I guess... except for the performance cost, and that's where UltraShadow II comes in.
UltraShadow II allows the game programmer to define a bounded portion of the scene (often called depth bounds), which limits the calculation of lighting effects to objects within that specified area. By limiting calculations to the area most affected by a light source, the overall shadow generation process is greatly accelerated. Stenciled shadow volumes do not require texturing or color, so UltraShadow II hardware can double the rendering horsepower and generate stenciled shadow volumes at up to twice the standard pixel-processing rate. Did you know that this allows the upcoming and highly shadow-intensive Doom III to run 3 to 4 times faster in terms of shadow processing?! That's not made-up stuff, guys.
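The core idea of a depth-bounds test can be shown with a small sketch. This is purely conceptual (not NVIDIA's hardware implementation, and the function and tuple layout are illustrative): fragments whose stored scene depth falls outside the range a light can affect are culled before any shadow-volume work is done on them.

```python
# Conceptual sketch of a depth-bounds test: discard shadow-volume
# fragments whose scene depth lies outside [z_min, z_max], the range
# that the current light source can actually affect.

def depth_bounds_cull(fragments, z_min, z_max):
    """Keep only fragments whose scene depth lies inside [z_min, z_max].

    fragments: list of (x, y, scene_depth) tuples; layout is illustrative.
    """
    return [f for f in fragments if z_min <= f[2] <= z_max]

fragments = [(10, 20, 0.15), (11, 20, 0.45), (12, 20, 0.90)]
# Only the fragment at depth 0.45 falls inside the light's depth range:
survivors = depth_bounds_cull(fragments, 0.3, 0.6)
print(survivors)  # [(11, 20, 0.45)]
```

The win is that the culled fragments never enter the stencil update at all, which is where the shadow-processing speedup comes from.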
NV4x Video Architecture Besides being a gaming card, the GeForce Series 6 introduces some high-quality video options as well. First off, Series 6 has a fully programmable video processor; you can actually use it in paint programs or software like Adobe After Effects. And hey, it carries the first ever on-chip GPU video encoder.
Among the features are motion-adaptive de-interlacing, an integrated TV encoder, and complete HDTV support (720p, 1080i, 480p, CGMS). We could really use HDTV support here in Europe. Not only that, the NV40 can now also decode and encode MPEG-1/2 and 4, and yes, even perform real-time DivX decoding and encoding!
Shader Model 3.0 If you program or play computer games, or even recently attempted to purchase a video card, then you will no doubt have heard the terms "Vertex Shader" and "Pixel Shader".
The step from 2.0 to 3.0 is a small one, and most Shader Model 2.0 games can easily be upgraded to Model 3.0, which means better performance. DirectX 9 has now been updated and we are going to see more support for 3.0 shaders. Is it a huge visual advantage over 2.0? Not in the slightest. Yet any technological advance is always welcome and preferable to the previous generation. The general consensus among developers is to use as low a shader version as possible; Shader 3.0 will be used only in a few critical places where it gives a performance boost. Since I've started to ramble on about shader technology, I just realized that some of you may not even have a clue what I'm talking about. Sorry, that happens when you get a bit excited. Let's do a quick shader course.
What do we need to render a three-dimensional object in 2D on your monitor? We start off by building some sort of structure that has a surface; that surface is built from triangles. Why triangles? They are quick to calculate. How is each triangle processed? Each triangle has to be transformed according to its position and orientation relative to the viewer: each of the three vertices the triangle is made up of is transformed to its proper view-space position. The next step is to light the triangle by taking the transformed vertices and applying a lighting calculation for every light defined in the scene. Lastly, the triangle needs to be projected to the screen in order to rasterize it. During rasterization the triangle will be shaded and textured.
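The steps above can be sketched for a single vertex. This is a minimal illustration, not a real renderer; the translation-only transform, the single Lambert light and the pinhole projection are simplifying assumptions.

```python
import math

# Sketch of the pipeline steps for one vertex: transform to view space,
# apply one diffuse light, then project to 2D screen coordinates.

def transform(vertex, offset):
    """Move a vertex into view space (here: a simple translation)."""
    return tuple(v + o for v, o in zip(vertex, offset))

def diffuse(normal, light_dir):
    """Lambert diffuse term: dot(N, L), clamped to zero."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)

def project(vertex, focal=1.0):
    """Pinhole perspective projection: divide x and y by depth."""
    x, y, z = vertex
    return (focal * x / z, focal * y / z)

v = transform((1.0, 2.0, 0.0), (0.0, 0.0, 4.0))           # view space: (1, 2, 4)
brightness = diffuse((0.0, 0.0, -1.0), (0.0, 0.0, -1.0))  # 1.0 (facing the light)
screen = project(v)                                       # (0.25, 0.5)
```

A real GPU does the same three stages, just with full 4x4 matrix math and for millions of vertices per frame.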
Graphics processors like the GeForce series are able to perform a certain number of these tasks. The first generation was able to draw shaded and textured triangles in hardware. The CPU still had the burden of feeding the graphics processor with transformed and lit vertices, triangle gradients for shading and texturing, etc. Integrating the triangle setup into the chip logic was the next step, and finally even transformation and lighting (T&L) was possible in hardware, reducing the CPU load considerably (GeForce 256). The big disadvantage was that a game programmer had no direct (i.e. program-driven) control over transformation, lighting and pixel rendering, because all the calculation models were fixed on the chip.
And now we finally get to the stage where we can explain Shaders. Vertex and Pixel shaders allow developers to code customized transformation and lighting calculations as well as pixel coloring functionality. Each shader is basically nothing more than a relatively small program executed on the graphics processor to control either vertex or pixel processing.
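To make that concrete, here is a toy model of the idea: the developer writes a small per-vertex program, and the pipeline simply runs it for every vertex. The wave effect and all names here are illustrative; real shaders run on the GPU in a shading language, not in Python.

```python
import math

# Toy model of a vertex shader: a small user-supplied program that the
# (fixed) vertex stage runs once for every vertex in the mesh.

def wave_shader(vertex, time):
    """Displace each vertex vertically with a sine wave (e.g. water)."""
    x, y, z = vertex
    return (x, y + 0.5 * math.sin(x + time), z)

def run_vertex_stage(vertices, shader, time):
    """The pipeline itself stays generic: it just calls the shader."""
    return [shader(v, time) for v in vertices]

mesh = [(0.0, 0.0, 0.0), (math.pi / 2, 0.0, 0.0)]
out = run_vertex_stage(mesh, wave_shader, time=0.0)
# The second vertex is lifted to y = 0.5 by the wave.
```

Swapping in a different shader function changes the whole effect without touching the pipeline, which is exactly the flexibility fixed-function T&L hardware lacked.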
Medal of Honor Pacific Assault demo - With Shader Model 3 enabled you can boost its performance a bit.
Shader Model 3.0 titles that we can expect soon, or that are here already: Lord of the Rings: Battle for Middle-Earth, Stalker, Vampire: Bloodlines, Splinter Cell X, Driver 3, Grafan, Painkiller, Far Cry and more...