GeForce 8800 GTS & GTX review


Page 5 - Internal Architecture

So what you need to understand is that DirectX 10 doesn't introduce a colossal, fundamental change in capabilities; rather, it brings expanded and new features into DirectX that will enable game developers to optimize their games more thoroughly, and thus deliver incrementally better visuals and better frame rates, which obviously is great. How fast will it be adopted? Well, Microsoft is highlighting the DX10 API as God's gift to the gaming universe, yet what they forget to mention is that all developers who support DX10 will have to continue supporting DirectX 9 as well, and thus maintain two versions of the rendering code in their engine, as DirectX 10 is only available on Windows Vista and not XP. Which is such a drama.

You may have heard the rumors, and they are false: DirectX 9.0L will NOT make Windows XP DX10 compatible. It is the other way around; if you have DirectX 9 hardware, you will be using DirectX 9.0L as your API in Windows Vista. With that statement you also need to realize that a DX10 card like the G80 is fully DX9 compatible!

However, you can understand that from a game developer's point of view this brings a considerable amount of additional workload and cost to PC game development until Vista finally becomes mainstream, as sketched below.
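To make that two-render-path burden concrete, here is a minimal, purely hypothetical sketch of the kind of abstraction an engine ends up carrying. The Renderer, DX9Renderer and DX10Renderer names (and the single drawFrame call) are our own invention for illustration; real engines obviously involve far more than this.

    // Purely illustrative: a hypothetical engine-side abstraction that lets one
    // game ship with both a DX9 and a DX10 render path. Every effect behind
    // this interface now has to be written, maintained and tested twice.
    #include <cstdio>
    #include <memory>

    struct Renderer {                      // common interface both paths implement
        virtual void drawFrame() = 0;
        virtual ~Renderer() = default;
    };

    struct DX9Renderer : Renderer {        // path for Windows XP / DX9-class hardware
        void drawFrame() override { std::puts("rendering via the DirectX 9 path"); }
    };

    struct DX10Renderer : Renderer {       // path for Vista with DX10 hardware
        void drawFrame() override { std::puts("rendering via the DirectX 10 path"); }
    };

    // Pick a path once at startup, based on OS and hardware capability.
    std::unique_ptr<Renderer> createRenderer(bool vistaWithDX10Card) {
        if (vistaWithDX10Card) return std::make_unique<DX10Renderer>();
        return std::make_unique<DX9Renderer>();
    }

    int main() {
        auto renderer = createRenderer(/*vistaWithDX10Card=*/true);
        renderer->drawFrame();
    }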

Regardless of the immense marketing hype, DirectX 10 simply is not extraordinarily different from DirectX 9. You'll mainly see good performance benefits from DirectX 10 rather than vastly prominent visual differences, with obviously a good number of exceptions here and there; but DirectX is evolving into something better and faster.

Stretchy skin: geometry shaders let you alter surface vertices on the fly. Poor Froggy.

With the introduction of Unified Shader technology the industry will also make you believe that GPUs no longer have a pixel pipeline. That's true, but not entirely; we obviously are still dealing with a pixel pipeline, yet the dynamics simply have changed.

Stating that a product has 24 pixel pipelines no longer applies, and that by itself forces a shift in how we need to look at new GPU microarchitectures. So I'm afraid that from now on we can't say, ooh, this product has 24 pixel pipelines. The new way of describing what we are talking about, and relating that to performance, will simply be the cumulative number of shader processors.

Just remember this: we have moved from a fixed-pipeline-based architecture to a processor-based architecture.

With that in mind, the number of processors will be our new, easier-to-understand way of conveying how fast a product "can" be. I know this is tricky to explain.

Brace for impact: the GeForce 8800 GTS has 96 shader processors (stream processors) and the GeForce 8800 GTX has 128 of these unified processor units. Think for a moment about the GeForce 7900 GTX and relate that to its 8 vertex and 24 pixel processors. See the parallel already?
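Because those 128 units are generic scalar processors rather than fixed pixel pipes, they can run any sufficiently parallel workload, not just shading. As a rough illustration, here is a minimal kernel in NVIDIA's CUDA programming model, which arrived with this G80 generation; the kernel and variable names are our own sketch, not anything from NVIDIA's drivers.

    // A minimal sketch of what "stream processors" mean in practice: the same
    // scalar units that shade pixels can run an arbitrary data-parallel kernel.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void scale(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
        if (i < n) data[i] *= factor;                   // each thread does scalar work
    }

    int main() {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        float *h = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) h[i] = 1.0f;

        float *d;
        cudaMalloc(&d, bytes);
        cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);

        // 256 threads per block; the hardware spreads the blocks across however
        // many stream processors the chip has (96 on the GTS, 128 on the GTX).
        scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
        cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);

        printf("h[0] = %f\n", h[0]);  // expect 2.000000
        cudaFree(d);
        free(h);
    }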

The internal GPU clocks have changed quite a bit as well. A year or two ago our own Alexey (the RivaTuner programmer) made a discovery: NVIDIA's architecture was suddenly exposing registers for multiple clocks coming from the graphics processor. So at that time it became clear that, for example, a GeForce 7900 GTX has three (and likely more) different internal clocks. This is something you need to get used to, as the G80 series has many clock domains within the graphics processor, and everything seems to be asynchronous, which is quite interesting, as historically everything inside a graphics processor has run in lockstep off a single clock.

So, as contradictory as it may sound, the GeForce 8800 GTX has a "generic" 575 MHz core clock, yet its memory is running at 900 MHz (x2) and, get this, the shader processors are clocked at 1350 MHz. And I'm pretty confident that we can find a few other clocks in there as well. Memory is a tad weird, too. The GTX, for example, has no less than 768 MB of memory on a 384-bit wide bus. Now this is where things can get a little tricky to understand: there is no such thing as a 384-bit wide memory chip, and the GTS sits at 320 bits. It's a bit of a trick that I'll explain to you in our photo shoot.
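To put those numbers together, a quick back-of-the-envelope calculation (our own arithmetic, using only the memory clock and bus width quoted above) shows what the GTX's peak memory bandwidth works out to:

    384 bits / 8            =  48 bytes per transfer
    48 bytes x 900 MHz x 2  =  86.4 GB/s peak memory bandwidth

The GTS, on its narrower 320-bit bus, ends up with correspondingly less bandwidth.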

The G80 at work in the new nForce 680i SLI platform
