BFG GeForce 9800 GX2 1024MB review


Reference & BFG 9800 GX2

The reference 9800 GX2

Since we covered the basics of the G92 processor, we can now talk a bit about the actual GX2 itself. It's the year 2008, so you guys are familiar with abbreviations like SLI and Crossfire. In a short layman's explanation, that is what NVIDIA is applying to the 9800 GX2: it's a very fast and relatively cheap-to-develop method of producing a fast-performing product.

There are however some drawbacks, which we'll discuss later on in this chapter. Basically, what NVIDIA did is take two 8800 GTS 512MB graphics cards, merge them together and make sure the result works as a "single" functional graphics card. Why do I say that so specifically? Because with SLI you are forced to use an nForce series mainboard, whereas the GX2 works just as well on, for example, an Intel mainboard.

So when you open up the casing of that huge-looking graphics card, you'd notice two PCBs, two GPUs, and two sets of 512MB memory. Yes, everything is doubled up.

Two PCBs, two sets of memory, one PCB flipped around so a single cooler can serve both. Engineering-wise, this is art.

Let's get a misconception out of the way first though. At CeBIT pretty much every board partner out there was showing off the GeForce 9800 GX2, and misbehaving as they were, almost all of them listed the memory specification as 512-bit. This is not true: the memory connected to each G92 runs 100% over a 256-bit wide controller. But since everything is doubled up (two sets of memory and thus two controllers), some board partners would "love" to make you believe it's 512-bit. [Ed - Marketing again then, huh?]

Now there's no shame in 256-bit, people. If anything, the Radeon cards with a 512-bit memory controller proved that 512-bit is not quite yet the answer: it means a lot more wires in the PCB, it's very complex, and it's very hard to see a real gain in performance. Alright, so remember... 256-bit.

Two GPUs create a lot of heat and consume a lot of power. That might be the reason the GPUs on the 9800 GX2 are clocked lower than the G92 on the 8800 GTS 512MB. It beats me, but the core is running at 600 MHz, the shader processors at 1500 MHz and the memory at a beefy 2000 MHz (effective). These specs are lower than an 8800 GTS 512MB's. Also, I know for a fact that for the first week or two, NVIDIA's board partners are not allowed to sell pre-overclocked versions of the 9800 GX2. Very interesting.
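
To put that 256-bit number in perspective, here's a quick back-of-the-envelope bandwidth calculation per GPU (our own arithmetic based on the 2000 MHz effective data rate, not an NVIDIA-quoted figure):

\[ \frac{256\ \text{bit}}{8\ \text{bit/byte}} \times 2000\ \text{MT/s} = 32\ \text{bytes} \times 2 \times 10^{9}\,/\text{s} = 64\ \text{GB/s} \]

So each G92 gets its own 64 GB/s pool of bandwidth; the card moves 2 x 64 GB/s in total, but it never behaves like a single 512-bit, 128 GB/s bus.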

Let's compare:

|   | GeForce 8800 Ultra | GeForce 8800 GTX | GeForce 8800 GTS 512MB | Old GeForce 8800 GTS | GeForce 9800 GX2 |
|---|---|---|---|---|---|
| Stream (shader) processors | 128 | 128 | 128 | 96 | 128 x2 |
| Core clock (MHz) | 612 | 575 | 650 | 500 | 600 |
| Shader clock (MHz) | 1512 | 1350 | 1625 | 1200 | 1500 |
| Memory clock (MHz, x2 for effective) | 1080 | 900 | 800 | 800 | 1000 |
| Memory amount | 768 MB | 768 MB | 512 MB | 320/640 MB | 512 MB x2 |
| Memory interface | 384-bit | 384-bit | 256-bit | 320-bit | 256-bit |
| HDCP | Yes | Yes | Yes | Yes | Yes |
| Two dual-link DVI | Yes | Yes | Yes | Yes | Yes |

BFG GeForce 9800 GX2

Next to the reference model we'll be testing today, we also have the pleasure of playing around with a rather similar-looking card from the fine folks at BFG. The BFG card comes nicely bundled and with a stack-load of warranty. Heck, I even heard that sometime soon they will introduce a step-up program, which will probably make the folks at eVGA go "boo". But hey, we as consumers love that competition. Board partners: you guys can drive each other insane; we consumers will sit back, relax, grab a can of soda and watch how that develops :)

On a serious note: obviously, if that does happen, it's great stuff for us consumers. The 9800 GX2 they sent out for us to review is not an OC model. NVIDIA made a policy that at launch each partner should be equal, fair play, and they may not introduce any overclocked model... for two weeks. Um, yeah, I don't get that either. Suffice to say that the BFG GeForce 9800 GX2 is therefore 100% identical to the reference specification. There is not a thing different, nor a single rendered frame faster. Though the BFG GeForce 9800 GX2 does have a nice sticker slapped on each side of the card :)

As stated, BFG will introduce this product at standard clocks. It does however come with the luxuries they always offer: inside the USA a lifetime warranty, which is just a bitching nice feature. Outside the US you are limited to a 10-year warranty, which is still bloody fantastic. Mind you, if you purchase a BFG product, please register at the BFG website within 30 days in order to activate that warranty. This is a policy they recently introduced.


Bundle-wise BFG is sticking to the old routine, though two new additions have made it into the package. The first is an SPDIF cable. See, the GX2 comes with an HDMI connector, and HDMI can transport sound. So you simply take the SPDIF output from your soundcard or mainboard and lead it to the GX2 (I'll show you this in the photo session), and you have the ability to output digital audio over HDMI.

Now then, since the paragraph above already had the word HDMI in it, BFG figured... let's throw in an HDMI cable as well. Fuckin' A, BFG! And it's a good-quality one too. Two very big thumbs up here. Other than that you get the usual driver CDs, manual, quick reference guide and one DVI-to-VGA converter. Last but not least you'll find a 6-pin to 8-pin power converter, as the GX2 requires an 8-pin connector feed.

We stumbled onto a rather nasty problem (that all GX2's have, by the way). We'll show it to you in our chapter called "Power Connector Issues". If you are in the market for this card, then for your own sake, please read that chapter.

So there you go, those are the ins and outs you need to know about the technology. Fact is that given the right circumstances (resolution, image quality, overall PC performance) this GX2 will beat a single GeForce 8800 Ultra by a significant margin. And that makes things really interesting. What's also interesting is that the 9800 GX2 is SLI compatible. That's right, it is possible to combine two GX2's and set them up in SLI. That way you will have four GPUs rendering your games.


That by itself is a discussion all of its own, as with the GeForce 7950 GX2 quad-GPU rendering flopped miserably due to two important things: the lack of driver support on NVIDIA's side, and Windows XP (which had a back-buffer issue allowing only two GPUs). Times have changed, and with the recent "shoved down your throat" release of Windows Vista this issue has been solved for the most part. For the first time in the history of 3D rendering, quad-GPU rendering might take off. I've seen the preliminary work on the drivers; 3-way SLI is actually on the same development path as the new Quad SLI drivers.

This by the way is also good news for Quad SLI GeForce 7950 GX2 owners: after two years, in combination with Vista, you can "finally" benefit from these drivers as well. And yes, a little sarcasm in that line was intended. NVIDIA must stay on track with Quad SLI support in its drivers from now on, otherwise they'll fail wretchedly the next time they release a quad-GPU compatible product.

Since we touched on the topic of drivers: with a dual-GPU based product there is always the risk of incompatibility with new games. NVIDIA's SLI support is based on driver profiles, meaning that for new games you'll have to download and install a new NVIDIA driver. NVIDIA's driver support is still mediocre when it comes to release schedules; they are not updating drivers fast enough, and for the GX2 that might become an issue. See, when a game is not recognized by the driver, the card falls back to single-GPU rendering. That's half the performance, meaning you would literally be rendering the game at half the capacity of the card. With a bit of luck you can enable an SLI profile for the game by adding it manually, yet that remains a little tricky.
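
For what it's worth, here is roughly how adding a profile by hand goes in a ForceWare driver of this era; this is a sketch from memory, so the exact menu labels may differ per driver version:

1. Open the NVIDIA Control Panel and go to Manage 3D Settings.
2. Switch to the Program Settings tab and click Add to point the driver at the game's executable.
3. Set the "SLI performance mode" option to one of the "force alternate frame rendering" modes and apply.

No guarantee the game will scale properly without an official profile, but it's worth a shot while you wait for a driver update.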

Anyway, back to the review itself! We'll slowly start up the actual tests: first we'll look at power consumption, heat levels and noise. Then we'll show you a thing or two in our PlayGuru nekked photo-shoot, after which we'll obviously fire off the hottest games of 2008 at the card to see if it can actually deliver what it's supposed to.

Next page please.
