Quite honestly, if you have played around with a GeForce 7800 GTX or Radeon X1800 XT for a while, you feel a bit spoiled when you go back to a card from this segment. Make no mistake though: cards like these offer way more bang for your buck than such a high-end card. The card tested today will, as you have been able to observe, become available in a 256 MB configuration. Let's have a look at NVIDIA's GeForce Series 6 product line with the most important retail cards; and yes, even in this list a couple are missing.
NVIDIA GeForce Series 6 Product Lineup Specifications

  Card                 # pixel processors   # vertex processors
  GeForce 6800 Ultra           16                    6
  GeForce 6800 GT              16                    6
  GeForce 6800 LE               8                    4
  GeForce 6600 GT               8                    3
A product for any budget! As you can see from the table above, NVIDIA can offer you a product at any price point with the Series 6 graphics processor. They announced Series 6 in April 2004, and look, it is still here and offering good performance. This 6600 still has eight pixel pipelines and three vertex processors, which RivaTuner confirms are working 100%. The 6600 series can write only four color pixels per clock and has a fragment crossbar. The NV43 does appear to have eight pixel shader/texture units, so it's not an "8 x 1" design or a "4 x 1" design. It's more of a hybrid, and it works quite well. Next to that we notice four ROPs. ROP is short for Raster Operation, and it's a portion of the pipeline responsible for AA, blending and Z-buffer compression. Simply stated, a ROP is basically the output engine of a pixel shader pipeline.
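The pipeline layout above translates directly into theoretical fillrates. Here is a minimal sketch of that arithmetic, using the 441 MHz core clock RivaTuner reports for this sample (the ROP and shader-unit counts come from the text above):

```python
# Theoretical fillrates for the GeForce 6600 tested here.
# Core clock taken from the RivaTuner readout of this review sample.
CORE_CLOCK_MHZ = 441   # measured on the review card
ROPS = 4               # color pixels written per clock
PIXEL_UNITS = 8        # pixel shader/texture units

pixel_fillrate = ROPS * CORE_CLOCK_MHZ / 1000        # Gpixels/s
texture_fillrate = PIXEL_UNITS * CORE_CLOCK_MHZ / 1000  # Gtexels/s

print(f"Pixel fillrate:   {pixel_fillrate:.3f} Gpixels/s")   # 1.764
print(f"Texture fillrate: {texture_fillrate:.3f} Gtexels/s") # 3.528
```

This is why the "hybrid" design matters: the card can shade and texture eight pixels per clock, but the four ROPs cap how many finished color pixels actually reach the framebuffer each cycle.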
What you should not forget is that when you buy an NVIDIA series 6 product it means full support of HDR, PureVideo (optional as you still have to cough up 20 bucks to get it supported), and Shader Model 3.0 alive, active and working on that GPU.
As our test will show you today, even the newer titles can be played with this card around 1024x768 (Quake 4, Far Cry, Half-Life), and that's where the word value kicks in again: reasonable performance versus a low price. These cards should be hitting the market very shortly, as I have a full retail XFX version sitting right here in the test rig. When we take a look at the card we can clearly see it is reference-based, with the nice blue-colored PCB and the fan being the only differences. What I really like is that even for this somewhat cheaper product a better cooling solution was used in comparison to the reference cooling: a nice and shiny heatsink with an active fan that's not at all noisy yet seems to be very effective.
Let's fire up RivaTuner and see what the BIOS registers fire back at us:
$ffffffffff Display adapter information
$ffffffffff ---------------------------------------------------
$0000000000 Description      : NVIDIA GeForce 6600
$0000000001 Vendor ID        : 10de (NVIDIA)
$0000000002 Device ID        : 0141
$0000000003 Location         : bus 5, device 0, function 0
$0000000004 Bus type         : PCIE
$000000000f PCIE link width  : 16x supported, 16x selected
$ffffffffff ---------------------------------------------------
$ffffffffff NVIDIA specific display adapter information
$ffffffffff ---------------------------------------------------
$0100000000 Graphics core    : NV43 revision A4 (8x1,3vp)
$0100000001 Hardwired ID     : 0141 (ROM strapped to 0141)
$0100000002 Memory bus       : 128-bit
$0100000003 Memory type      : DDR (RAM configuration 02)
$0100000004 Memory amount    : 262144KB
$0100000005 Core clock       : 441.000MHz
$0100000006 Memory clock     : 420.750MHz (841.500MHz effective)
As you can see the core is clocked at 441 MHz. That's real close to the 6600 GT. I think this product will overclock beautifully. ;)
Shader Model 3.0 If you program or play computer games, or have even recently attempted to purchase a video card, then you will no doubt have heard the terms "Vertex Shader" and "Pixel Shader".
The step from 2.0 to 3.0 is a small one, and most Shader Model 2.0 games can easily be upgraded to Model 3.0, which means more performance. DirectX 9 has now been updated and we are going to see more support for 3.0 shaders. Is it a huge visual advantage over 2.0? Not even the slightest bit. Yet any technological advantage is always welcome and preferred over a previous-generation development. The general consensus among developers is to use as low a shader version as possible; Shader 3.0 will be used only in the several critical places where it can give a performance boost.
Medal of Honor Pacific Assault demo - With Shader Model 3 enabled you can boost performance a bit.
Shader Model 3.0 titles: Lord of the Rings: Battle for Middle-Earth, Stalker, Vampire: Bloodlines, Splinter Cell 3, Driver 3, Grafan, Painkiller, Far Cry and these days many more ...
Powerrrr!! As usual we try to give you an overview of power consumption. We know this test is rather subjective, as we measure between the power outlet and the PSU, but nonetheless we think it's a nice way of showing what you can expect.
The total power draw (peak) from the power supply unit was not very high at all. On an Athlon 64 4000+ based system with two HDDs, a DVD-ROM and 1024 MB of memory, the PC pulled a maximum of roughly 228 Watts in-game. However, you always need some reserve, so a 350 Watt PSU is sufficient.
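The reserve reasoning above is simple arithmetic; here is a quick sketch of the headroom calculation, using the 228 W peak measured at the wall (note that a wall-side measurement already includes PSU inefficiency, so the actual DC load is lower still):

```python
# Rough PSU headroom check based on the measured whole-system peak draw.
peak_draw_w = 228    # measured at the wall during gaming, whole system
psu_rating_w = 350   # the supply we call sufficient above

headroom = (psu_rating_w - peak_draw_w) / psu_rating_w
print(f"Headroom on a {psu_rating_w} W PSU: {headroom:.0%}")  # roughly 35%
```

A third or so of spare capacity is a comfortable margin for aging capacitors, extra drives, or a future CPU upgrade.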
NVIDIA actually recommends a 300 Watt Power Supply for the 6600 series, and basically everyone has that in their system these days.
Bundled Items In the box we see that XFX in no way goes el cheapo on you. It comes with the standard stuff like a driver CD, but it's also bundled with 3D-Edit (editing software that makes use of the GPU) and a full version of the game Far Cry. Next to the manuals we find an S-Video output cable and a DVI-to-D-Sub adapter. For a product priced at this level this is really good, and in particular the inclusion of a full copy of Far Cry brings a smile to my face.
The Installation It's really not hard to install a graphics card yourself nowadays, especially with brands like ATI and NVIDIA, who use unified driver sets. If you have a really new product, make sure you have the latest drivers on your HDD. First uninstall your current graphics card's drivers carefully; this is very important, especially if the older graphics card was from a different chipset manufacturer. Now power down the PC and pull out the power cable. Insert the graphics card in the slot, secure it with a screw, connect the monitor, boot up Windows, run the driver installation, then restart and you are set to go. That's all. Also make sure you have the latest version of DirectX (9.0c) installed. If you experience compatibility issues, make sure you have installed the latest version of your mainboard drivers; have a look in our extensive download section, I'd say.