GeForce 6800 GT preview


PSU, Shaders and Technology

Recommended Power Supply Gossip
One of the things I notice most about the 6800 series in our forums is the discussion of its power consumption. Initially NVIDIA recommended testing the 6800 with a 480 Watt power supply. It really does not need that much, folks! In fact, any decent 350 Watt PSU will be sufficient. The 6800 Ultra actually consumed only 15 Watts more than the 5950 Ultra. That's more, but nothing scary.

We used a standard 350 Watt PSU together with a Pentium 4 2.8 GHz test rig with 1 GB of memory and a DVD drive. The GeForce 6800 GT never crashed once, and neither did the Ultra for that matter...

UltraShadow II
Last year we got acquainted with NVIDIA's UltraShadow technology. Shadows, nothing new, you might say. Yet precise and accurate shadows are very important to a 3D scene. They contribute to the scene's atmosphere; it's a step closer to the real thing, I guess... except for the performance cost, and that's where UltraShadow II comes in.

UltraShadow II allows the game programmer to define a bounded portion of the scene (often called depth bounds), which limits the calculation of a light source's effects to objects within that specified area. By limiting calculations to the area most affected by a light source, the overall shadow-generation process is greatly accelerated. Stenciled shadow volumes do not require texturing or color, so UltraShadow II hardware can generate stenciled shadow volumes at up to twice the standard pixel-processing rate. Did you know that this allows the upcoming and highly shadow-intensive Doom III to run 3 to 4 times faster in its shadow processing?! That's not made-up stuff, guys.
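The idea behind depth bounds can be sketched in a few lines of illustrative Python. This is a hypothetical software stand-in, not NVIDIA's actual hardware logic: pixels whose stored scene depth falls outside the range a light can reach are simply skipped during shadow-volume rasterization.

```python
# Toy sketch of depth bounds testing (illustration only, not hardware code):
# skip stencil-shadow work for pixels whose depth-buffer value lies
# outside the depth range a given light can actually influence.

def affected_by_light(fragment_depth, depth_min, depth_max):
    """True if the stored scene depth at this pixel falls inside the
    light's depth bounds, so shadow-volume work is worth doing there."""
    return depth_min <= fragment_depth <= depth_max

# Hypothetical numbers: depth-buffer values for three pixels, and the
# depth range one light can reach.
depths = [0.1, 0.45, 0.9]
bounds = (0.3, 0.6)

# Only the pixels that pass the test get shadow-volume rasterization;
# fill rate is saved everywhere else.
shadow_pixels = [d for d in depths if affected_by_light(d, *bounds)]
```

In this sketch only the middle pixel survives the test, so two-thirds of the shadow fill work is skipped, which is exactly the kind of saving the depth-bounds feature is after.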

NV4x Video Architecture
Next to being a gaming card, the GeForce Series 6 also brings some high-quality video options. First off, Series 6 has a fully programmable video processor; you can actually use it in paint programs or software like Adobe After Effects. And hey, it has the first-ever on-chip GPU video encoder.

Among the features are motion-adaptive de-interlacing, an integrated TV encoder, and complete HDTV support (720p, 1080i, 480p, CGMS). We so need HDTV support here in Europe. Not only that, the NV40 can now also decode and... encode MPEG-1/2 and 4, and yes, even real-time DivX decoding and encoding!

Shaders Model 3.0
If you program or play computer games, or even recently attempted to purchase a video card, then you will no doubt have heard the terms "Vertex Shader" and "Pixel Shader".

Funny story: I've been following some threads in our forums lately, watching the ATI versus NVIDIA discussions regarding Shader Model 3.0 technology. It's now known that the X800 series from ATI (which we recently tested) does not support DirectX 9.0c Shader Model 3.0. The argument we see from a lot of ATI fans goes like this: "Hey, who needs Pixel and Vertex Shader 3.0? There is not one game that supports it, and that won't happen this year either."

That's so wrong, people. The step from 2.0 to 3.0 is a small one, and most Shader Model 2.0 games can easily be upgraded to Model 3.0, which means more candy for the eyes. Once DirectX 9 is updated we are going to see a lot of support for 3.0 shaders. Is it a huge visual advantage over 2.0? Personally, I question that. Still, any technological advantage is welcome and preferable to the previous generation. The general consensus among developers is to use as low a shader version as possible; Shader 3.0 will be used only in a few critical places where it gives a performance boost or significantly improves image quality.

ATI would have you believe that Shader Model 3.0 is not important right now. True... it's not that important right now. But consider this: product A is extremely fast with SM2 and has awesome SM3 support, while product B has only SM2 support yet is only slightly faster. Looking at the future, we will see more and more games using SM3, even if purely for better performance. Which product has the advantage then? Product A, of course. In this case the 6800 outclasses the Radeon X800 from ATI.

Since I've started to ramble on about shader technology, I just realized that some of you may not even have a clue what I'm talking about. Sorry, that happens when you get a bit excited. Let's do a quick shader course.

What do we need to render a three-dimensional object in 2D on your monitor? We start off by building some sort of structure that has a surface; that surface is built from triangles. Why triangles? Because they are quick to calculate. How is each triangle processed? First, each triangle is transformed according to its relative position and orientation to the viewer: each of the three vertices the triangle is made of is transformed to its proper view-space position. The next step is to light the triangle by taking the transformed vertices and applying a lighting calculation for every light defined in the scene. And lastly, the triangle is projected to the screen in order to rasterize it. During rasterization the triangle is shaded and textured.
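The three steps above (transform, light, project) can be sketched in plain Python. This is a toy software version with made-up numbers, purely to illustrate the per-triangle math the GPU performs; it is not how the hardware is actually coded.

```python
import math

def transform(vertex, angle_y, translate_z):
    """Rotate a vertex around the Y axis and push it into view space."""
    x, y, z = vertex
    c, s = math.cos(angle_y), math.sin(angle_y)
    return (c * x + s * z, y, -s * x + c * z + translate_z)

def light(normal, light_dir):
    """Simple Lambertian (diffuse) lighting term: N . L, clamped at zero."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

def project(vertex, focal=1.0):
    """Perspective-project a view-space vertex onto the 2D screen."""
    x, y, z = vertex
    return (focal * x / z, focal * y / z)

# One hypothetical triangle, pushed 5 units in front of the viewer,
# lit by a light shining straight at it.
triangle = [(0.0, 1.0, 0.0), (-1.0, -1.0, 0.0), (1.0, -1.0, 0.0)]
view_space = [transform(v, 0.0, 5.0) for v in triangle]
brightness = light((0.0, 0.0, -1.0), (0.0, 0.0, -1.0))  # = 1.0, fully lit
screen = [project(v) for v in view_space]  # 2D screen positions
```

A real pipeline does this with 4x4 matrices and many lights at once, but the flow per vertex is the same: view-space transform, lighting, then projection to the screen for rasterization.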

And now we finally get to the stage where we can explain Shaders. Vertex and Pixel shaders allow developers to code customized transformation and lighting calculations as well as pixel coloring functionality. Each shader is basically nothing more than a relatively small program executed on the graphics processor to control either vertex or pixel processing.
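To make "a small program run per pixel" concrete, here is a hypothetical software stand-in for a pixel shader: a tiny function the renderer evaluates once per pixel, here producing a simple coordinate-based gradient. None of this is real GPU shader code, just an illustration of the concept.

```python
# Toy "pixel shader": a small function executed once per pixel.
# It tints every pixel by its normalized screen position (u, v),
# the classic gradient test pattern.  Purely illustrative.

def pixel_shader(u, v):
    """Return an (r, g, b) color for normalized screen coordinates u, v."""
    return (u, v, 0.5)

# The "rasterizer" loop: call the shader for every pixel of a tiny image.
width, height = 4, 2
image = [[pixel_shader(x / (width - 1), y / (height - 1))
          for x in range(width)]
         for y in range(height)]
```

A vertex shader is the same idea one stage earlier: a small program run once per vertex instead of once per pixel, replacing the fixed transform-and-light step described above.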

Copyright Guru3D.com
Far Cry

Shader Model 3.0 titles that we can expect soon: Lord of the Rings: The Battle for Middle-earth, Stalker, Vampire: Bloodlines, Splinter Cell X, Driver 3, Grafan, Painkiller, Far Cry and more...

Lord of the Rings: The Battle for Middle-earth - Shader Model 3.0 all the way.
