XFX GeForce 6600 256 MB GDDR2


Technical joyfulness

Quite honestly, if you have played around with a GeForce 7800 GTX or Radeon X1800 XT for a while, you feel a bit spoiled when you go back to a card from this segment. Make no mistake though: cards like these offer you way more bang for your buck than such a high-end card. The card tested today, as you have been able to observe, will become available in a 256 MB configuration. Let's have a look at NVIDIA's GeForce Series 6 product line with the most important retail cards; yes, even in this list a couple are missing.

NVIDIA GeForce Series 6 Product Lineup Specifications

Product Name         Pixel processors   Vertex processors   Bus width    Memory type/amount   GPU speed   RAM speed
GeForce 6800 Ultra   16                 6                   256-bit      GDDR3/256MB          400MHz      1100MHz
GeForce 6800 GT      16                 6                   256-bit      GDDR3/256MB          350MHz      1000MHz
GeForce 6800         12                 5                   256-bit      GDDR/128MB           325MHz      700MHz
GeForce 6800 LE      8                  4                   256-bit      GDDR/128MB           320MHz      700MHz
GeForce 6600 GT      8                  3                   128-bit      GDDR3/128/256MB      500MHz      1000MHz
GeForce 6600         8                  3                   128-bit      GDDR2/128/256MB      400MHz      400 (800)MHz
GeForce 6600         8                  3                   64/128-bit   GDDR/128MB           300MHz      275 (550)MHz
GeForce 6200         4                  3                   64/128-bit   GDDR/128MB/256MB     300MHz      275 (550)MHz

A product for any budget! As you can see from the table above, NVIDIA can offer you a product at any price point with the Series 6 graphics processor. They announced Series 6 in April 2004, and look, it is still here and offering good performance. This 6600 still has eight pixel pipelines and three vertex processors, confirmed to be working 100% with RivaTuner. The 6600 series can write only four color pixels per clock and has a fragment crossbar. The NV43 does appear to have eight pixel shader/texture units, so it's not an "8x1" design or a "4x1" design; it's more of a hybrid, and it works quite well. Next to that we notice four ROPs. ROP is short for Raster Operation, and it is a portion of the pipeline responsible for anti-aliasing, blending and Z-buffer compression. Simply stated, a ROP is basically the output engine of a pixel shader pipeline.
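To make the pipeline arithmetic above concrete, here is a small back-of-the-envelope sketch (my own illustration, not an NVIDIA figure): the pixel fill rate is capped by the four ROPs that write color pixels, while the texel rate follows from the eight pixel shader/texture units.

```python
# Rough theoretical fill-rate arithmetic for the GeForce 6600 (NV43).
# Illustrative only; numbers taken from the specification table above.
core_clock_mhz = 400    # stock GeForce 6600 core clock
rops = 4                # four ROPs -> only four color pixels written per clock
texture_units = 8       # eight pixel shader/texture units

# Pixel fill rate is limited by the ROPs (the pipeline's output engines).
pixel_fill_mpix = core_clock_mhz * rops           # 1600 Mpixels/s
# Texel rate: all eight texture units can sample once per clock.
texel_fill_mtex = core_clock_mhz * texture_units  # 3200 Mtexels/s

print(pixel_fill_mpix, texel_fill_mtex)
```

This is why the hybrid design still makes sense: multi-textured pixels keep all eight units busy even though only four pixels leave the pipeline per clock.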

What you should not forget is that when you buy an NVIDIA Series 6 product, you get full support for HDR, PureVideo (optional, as you still have to cough up 20 bucks to get it supported), and Shader Model 3.0 alive, active and working on that GPU.

As our test will show you today, even the newer titles can be played with this card around 1024x768 (Quake 4, Far Cry, Half-Life), and that's where the word value kicks in again: reasonable performance versus a low price. These cards should be hitting the market very shortly, as I have a full retail XFX version sitting right here in the test rig. When we take a look at the card we can clearly see it is based on the reference design, with the nice blue colored PCB and the fan being the only differences. What I really like is that even for this somewhat cheaper product a better cooling solution was used in comparison to the reference cooling: a nice and shiny heatsink with an active fan that's not at all noisy, yet seems to be very effective.

Let's fire up RivaTuner and see what the BIOS registers fire back at us:

$ffffffffff Display adapter information
$ffffffffff ---------------------------------------------------
$0000000000 Description : NVIDIA GeForce 6600
$0000000001 Vendor ID : 10de (NVIDIA)
$0000000002 Device ID : 0141
$0000000003 Location : bus 5, device 0, function 0
$0000000004 Bus type : PCIE
$000000000f PCIE link width : 16x supported, 16x selected
$ffffffffff ---------------------------------------------------
$ffffffffff NVIDIA specific display adapter information
$ffffffffff ---------------------------------------------------
$0100000000 Graphics core : NV43 revision A4 (8x1,3vp)
$0100000001 Hardwired ID : 0141 (ROM strapped to 0141)
$0100000002 Memory bus : 128-bit
$0100000003 Memory type : DDR (RAM configuration 02)
$0100000004 Memory amount : 262144KB
$0100000005 Core clock : 441.000MHz
$0100000006 Memory clock : 420.750MHz (841.500MHz effective)

As you can see, the core is clocked at 441 MHz. That's real close to the 6600 GT. I think this product will overclock beautifully. ;)
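As a sketch of where RivaTuner's "effective" memory figure comes from: DDR memory transfers data on both clock edges, so the effective rate is double the physical clock, and memory bandwidth follows from that rate times the bus width. This is my own arithmetic from the readout above, not a vendor figure.

```python
# Memory bandwidth estimate for this card's 128-bit DDR2 bus (illustrative).
mem_clock_mhz = 420.75             # physical clock reported by RivaTuner
effective_mhz = mem_clock_mhz * 2  # DDR: two transfers per clock -> 841.5 MHz
bus_width_bits = 128
bytes_per_transfer = bus_width_bits / 8  # 16 bytes moved per transfer

bandwidth_gbs = effective_mhz * 1e6 * bytes_per_transfer / 1e9
print(round(bandwidth_gbs, 2))  # roughly 13.46 GB/s
```

Compare that against the 6600 GT's 1000 MHz effective GDDR3 on the same 128-bit bus (16 GB/s) and you can see where this card gives up some of its performance.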

Shader Model 3.0
If you program or play computer games, or even recently attempted to purchase a video card, then you will no doubt have heard the terms "Vertex Shader" and "Pixel Shader".

The step from 2.0 to 3.0 is a small one, and most Shader Model 2.0 games can easily be upgraded to Model 3.0, which means more performance. DirectX 9 has now been updated and we are going to see more support for 3.0 shaders. Is it a huge visual advantage over 2.0? Not even the slightest bit. Yet any technological advantage is always welcome and preferred over a previous generation development. The general consensus among developers is to use as low a shader version as possible; Shader Model 3.0 will be used only in several critical places where it can give a performance boost.

Medal of Honor Pacific Assault demo - With Shader Model 3 enabled you can boost performance a bit.

Shader Model 3.0 titles: Lord of the Rings: Battle for Middle-Earth, Stalker, Vampire: Bloodlines, Splinter Cell 3, Driver 3, Grafan, Painkiller, Far Cry, and these days many more...

Powerrrr!!
As usual we try to give you an overview of power consumption. We know this test is rather crude, as we measure between the power outlet and the PSU, but nonetheless we think it's a nice way of showing what you can expect.

The total peak power draw from the power supply unit was not very high at all. On an Athlon 64 4000+ based system with two HDDs, a DVD-ROM drive, and 1024 MB of memory, the PC pulled a maximum of roughly 228 watts in-game. However, you always need some reserve, so a 350 watt PSU is sufficient. NVIDIA actually recommends a 300 watt power supply for the 6600 series, and basically everyone has that in their system these days.
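The reserve reasoning can be sketched like this; the 30% margin is my own illustrative assumption, not an NVIDIA or XFX recommendation.

```python
# PSU sizing sketch: measured whole-system peak plus a safety margin.
# Illustrative only; the margin value is an assumption for this example.
measured_peak_w = 228   # peak draw measured at the wall during gaming
margin = 0.30           # assumed ~30% reserve for spikes, aging, upgrades

recommended_w = measured_peak_w * (1 + margin)  # about 296 W
print(round(recommended_w))
```

That lands comfortably inside both NVIDIA's 300 watt recommendation and the 350 watt unit suggested above.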

Bundled Items
In the box we see that XFX in no way goes el cheapo on you. It comes with the standard stuff like a driver CD, but it's also bundled with 3D-Edit (editing software that makes use of the GPU) and a full version of the game Far Cry. Next to the manuals we find an S-Video output cable and a DVI-to-D-Sub adapter. For a product priced at this level this is really good, and in particular the inclusion of a full copy of Far Cry brings a smile to my face.

Copyright 2005 - Guru3D.com

The Installation
It's really not hard to install a graphics card yourself nowadays, especially with brands like ATI and NVIDIA, who use unified driver sets. If you have a really new product, make sure you have the latest drivers on your HD. First uninstall your current graphics card's drivers carefully; this is very important, especially if the older graphics card was from a different chipset manufacturer. Now power down the PC and pull out the power cable. Insert the graphics card in the slot, secure it with a screw, connect the monitor, boot up Windows, run the driver installation, then restart and you are set to go. That's all. Also make sure you have the latest version of DirectX (9.0c) installed. If you experience compatibility issues, please make sure you have installed the latest version of your mainboard drivers; have a look in our extensive download section, I'd say.
