As you can see from the table above, NVIDIA has high hopes for the Series 6 graphics processors; they announced Series 6 in April 2004, and look at how it is still stacking up in the retail and OEM market. Quite startling. As you will have noticed from my endless ramblings about clock speeds, this GeForce 6800 XT card comes with a 425 MHz core clock and 500 MHz (1000 MHz effective) GDDR3 memory on a 256-bit bus (256 MB), with a suggested price tag of 179 USD.
The GeForce 6600 (GT) product line has eight pixel pipelines and three vertex processors; this 6800 XT has one vertex processor more (four), which we confirmed to be true and fully working with Rivatuner. Rivatuner also reported the core_id as an NV41 revision A2 chip.
This is a Dual BIOS version of the 6800 XT and it is available in a PCI-Express version. Notable is the fact that it has dual DVI outputs and that it is a Zalman-cooled graphics card. Cool and quiet is the name of their game, and that definitely seems to be the case on the 6800 XT. We'll get into that in a short bit.
Dual BIOS. It is designed specifically for overclockers and people that are really, really afraid ... Let's say you want to overclock or fool around with the BIOS to get some pretty interesting overclocking results, eh? You can now do that risk free. If your overclock fails miserably, you simply flip the switch and the card is hardwired to the second, backup BIOS. Once you boot up you can re-flash BIOS A and start over, again and again, until you are satisfied. A pretty interesting idea that has been used on several mainboards for quite some time now.
That jumper selects BIOS A or B.
The GDDR3 memory is covered by ramsinks. To your lower right you can see a small speaker; when the card overheats or the fan stops rotating, it'll beep.
Shaders and HDR
What is a shader?
What do we need to render a three-dimensional object in 2D on your monitor? We start off by building some sort of structure that has a surface; that surface is built from triangles. Why triangles? Because they are quick to calculate. How is each triangle processed? Each triangle has to be transformed according to its relative position and orientation to the viewer. Each of the three vertices the triangle is made up of is transformed to its proper view-space position. The next step is to light the triangle by taking the transformed vertices and applying a lighting calculation for every light defined in the scene. Finally the triangle needs to be projected to the screen in order to rasterize it. During rasterization the triangle will be shaded and textured.
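The transform-and-project steps above can be sketched in a few lines of Python. This is a simplified illustration of the math, not how the GPU actually implements it; the view matrix, triangle coordinates, screen size and field of view are made-up example values:

```python
import math

def transform(m, v):
    """Multiply a 4x4 row-major matrix with a vertex (x, y, z, 1)."""
    x, y, z = v
    return tuple(m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(4))

def project(v, width=640, height=480, fov=90.0):
    """Perspective-project a view-space vertex to screen pixels."""
    x, y, z, w = v
    f = 1.0 / math.tan(math.radians(fov) / 2.0)
    # Perspective divide: the farther away a vertex is (larger z),
    # the closer it lands to the center of the screen.
    sx = (x * f / z) * 0.5 * width + width / 2
    sy = (y * f / z) * 0.5 * height + height / 2
    return (sx, sy)

# Identity view matrix: the "camera" sits at the origin looking down +z.
view = [[1,0,0,0], [0,1,0,0], [0,0,1,0], [0,0,0,1]]

# One triangle, four units in front of the viewer.
triangle = [(-1.0, -1.0, 4.0), (1.0, -1.0, 4.0), (0.0, 1.0, 4.0)]
screen = [project(transform(view, v)) for v in triangle]
print(screen)
```

After these two steps the rasterizer takes over, filling the pixels between the three projected points, which is where the shading and texturing from the text above happen.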
Graphics processors like the GeForce series are able to perform a certain amount of these tasks. The first generation was able to draw shaded and textured triangles in hardware. The CPU still had the burden of feeding the graphics processor with transformed and lit vertices, triangle gradients for shading and texturing, etc. Integrating the triangle setup into the chip logic was the next step, and finally even transformation and lighting (TnL) was possible in hardware, reducing the CPU load considerably (GeForce 256). The big disadvantage was that a game programmer had no direct (i.e. program-driven) control over transformation, lighting and pixel rendering because all the calculation models were fixed on the chip. And now we finally get to the stage where we can explain shaders. Vertex and pixel shaders allow developers to code customized transformation and lighting calculations as well as pixel coloring functionality. Each shader is basically nothing more than a relatively small program executed on the graphics processor to control either vertex or pixel processing.
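As a mental model, a vertex shader is just a tiny function the GPU runs once per vertex. Real shaders are written in languages like HLSL or GLSL and run on the graphics chip; the hypothetical Python equivalent below, with an invented light direction and color, only illustrates the idea of a small per-vertex program doing a custom lighting calculation:

```python
def normalize(v):
    """Scale a vector to unit length."""
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def vertex_shader(position, normal, light_dir=(0.0, 0.0, -1.0),
                  color=(1.0, 1.0, 1.0)):
    """Per-vertex Lambert (diffuse) lighting: brightness falls off with
    the angle between the surface normal and the light direction."""
    intensity = max(0.0, dot(normalize(normal), normalize(light_dir)))
    lit = tuple(c * intensity for c in color)
    # A real vertex shader would also transform the position here.
    return position, lit

pos, rgb = vertex_shader((0.0, 1.0, 4.0), (0.0, 0.0, -1.0))
print(rgb)  # normal faces the light head-on, so full brightness
```

Because the developer writes this little program instead of relying on fixed chip logic, any lighting model can be substituted, which is exactly the flexibility the fixed-function hardware lacked.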
As always, after the technical specs we're at the point where we quickly discuss Shader Model 3 and another hot feature of the GeForce Series 6 & 7 products.
Talking about Shader Model 3
If you program or play computer games, or have recently attempted to purchase a video card, then you will no doubt have heard the terms "Vertex Shader" and "Pixel Shader". The step from 2.0 to 3.0 was a small one, and most Shader Model 2.0 games can easily be upgraded to Model 3.0, which can bring more performance to that gaming experience. DirectX 9 was recently updated, and we are going to see more and more support for 3.0 Shaders.
Is SM 3.0 technology a huge visual advantage over 2.0? Nope, not even the slightest bit. Yet any technological advantage is always welcome and preferred over a previous generation's development. What you need to remember about Shader Model 3.0 is that it can and will be used only in several critical places where it can give a performance boost, and graphics cards are all about performance, my friends. Both ATI and NVIDIA now offer Shader Model 3 support in their new products: GeForce Series 6 and upwards for NVIDIA, and the X1000 series and upwards for ATI.
Talking about HDR
Another big, trendy feature that will bring games closer to a movie-like quality experience is HDR.
Both ATI and NVIDIA have been focusing extremely hard on HDR. They put a lot of money into their technology to support HDR in the best possible way, and they should, as it is simply a fantastic effect that brings so much more to your gameplay experience. HDR is something you all know from games like Far Cry: extremely bright lighting that brings a really cool cinematic effect to gaming. This effect is becoming extraordinarily popular.
Valve recently released a new HL2 level in the form of Half Life 2: Lost Coast. Go download it, as it will show you, and amaze you with, what HDR can do. The difference is obvious. HDR means High Dynamic Range. HDR facilitates the use of color values way beyond the normal range of the color palette in an effort to produce a more extreme form of lighting rendering. Typically this trick is used to contrast really dark scenery. Extreme sunlight, over-saturation or over-exposure are good examples of what exactly is possible. The simplest way to describe it would be controlling the amount of light present at a certain position in a 3D scene.
Half Life 2 - Lost Coast level. If you bought the game it's available for free via Steam.
HDR is already present in Far Cry, Splinter Cell: Chaos Theory, Serious Sam 2 and in Half Life 2: Lost Coast. It will be available in Unreal 3 and likely a large number of other games, not to mention 3DMark06. Let the screenshots do the talking and click away.
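Under the hood, the core HDR trick is simple: light intensities are stored well beyond the usual 0..1 palette range and are only squeezed back into displayable range at the end of the frame, a step called tone mapping. A minimal sketch of that idea using the well-known Reinhard operator follows; the intensity values are made up for illustration:

```python
def reinhard(hdr_value):
    """Map an HDR intensity in [0, inf) down to the displayable [0, 1) range.
    Bright values are compressed far more than dark ones, which is what
    preserves detail in both blinding sunlight and deep shadow."""
    return hdr_value / (1.0 + hdr_value)

# Intensities far beyond the usual palette: shadow, midtone, blinding sun.
for lum in (0.05, 1.0, 50.0):
    print(lum, "->", reinhard(lum))
```

Note how the 50.0 value, fifty times brighter than a "white" pixel, still comes out just below 1.0 instead of clipping, which is why an over-exposed sky in Lost Coast keeps some detail rather than becoming a flat white blob.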