We have to admit, the GeForce GTX 280 is, simply put, a Greek powerhouse. Spaaaartaaans! Year after year it astonishes me how many transistors chip manufacturers can squeeze onto a die ... apparently NVIDIA can do no less than 1.4 billion. The result is a huge chip with huge performance. Obviously AMD will say, hey, we can achieve the same performance by placing two Series 4000 GPUs on one card. True ... but so can NVIDIA. It's a little brute-force, but a GX2 version of the GTX 280 could be made as well, who knows.
Strangely enough, there is another product out there that can actually compete with the GeForce GTX 280 rather well. It's NVIDIA's own GeForce 9800 GX2. Yes, you heard that right. Now don't go thinking, hey, my GX2 will be faster in most circumstances. No ... it's really way more complicated than that. See, the GX2 has 2x 512MB of cloned memory to function in SLI mode. While that works just fine, the GTX 280 has 1024MB of dedicated memory. So in memory-bound situations, like applying a lot of extremely high-res textures or high AA levels at high resolutions, the GTX 280 will simply perform much better thanks to its 512-bit memory bus and larger framebuffer, especially in future games which will have much larger memory requirements.
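To put the memory argument into numbers, here's a quick back-of-the-envelope sketch. The clocks used below are the published reference specs (1107 MHz GDDR3, 2214 MHz effective, for the GTX 280; 1000 MHz, 2000 MHz effective, per GPU on the 9800 GX2), which are not listed in this conclusion itself, so treat them as assumed inputs.

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes times effective clock."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# GTX 280: one GPU, 512-bit bus, 2214 MHz effective GDDR3,
# addressing a single dedicated 1024 MB framebuffer.
gtx280 = bandwidth_gb_s(512, 2214)

# 9800 GX2: 256-bit bus per GPU at 2000 MHz effective, but each GPU
# only sees its own cloned 512 MB copy of the framebuffer in SLI.
gx2_per_gpu = bandwidth_gb_s(256, 2000)

print(f"GTX 280:          {gtx280:.1f} GB/s over one 1024 MB framebuffer")
print(f"9800 GX2 per GPU: {gx2_per_gpu:.1f} GB/s over a cloned 512 MB framebuffer")
```

Roughly 141.7 GB/s versus 64.0 GB/s per GPU, which is why the GTX 280 pulls ahead once textures and AA start saturating memory.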
When we reverse things and look at the 9800 GX2, however, the GX2 has 2x 128 = 256 shader cores compared to the 240 on the GTX 280 ... that gives it a bit more bite, and the GX2's shader domain is clocked a tad faster as well. So as this article has shown, the two products each win and lose a little against each other. But what about overclocking, you ask? We just do not overclock reference samples, as there's no point. Be aware, though, that this week we'll post a review of a retail sample, and that card we did overclock. And yes, you will gain a little extra performance from that as well.
Now then, if you ask me, the final word as far as I'm concerned is that the GTX 280 is the way to go here. Yes, it'll be more (too) expensive, but with the GX2 you also get the problems that come with SLI, a lot of heat (two GPUs) and a fair amount of power consumption. When games start using that excessive amount of framebuffer and complex shaders, the GTX 280 will seriously kick in. I mean, just look at the FEAR results, which show this perfectly. But granted ... the existence of the GeForce 9800 GX2 makes this conclusion a little more knotty.
Other than that, let me just say: if the 9800 GX2 did not exist, this product would be impressive by far (which it really is anyway). But the 9800 GX2 is on the market, and though I expect it to go EOL real soon, it is challenging the GeForce GTX 280 quite well.
Alright, back to the actual product. Quick tip: we typically test in an open environment, meaning no directed airflow. Please remember that this is a card that will get warm, so a properly ventilated PC chassis is definitely a requirement. The release of the GeForce GTX 200 series is all about overall performance and power consumption, and I know that a lot of you were hoping to see DirectX 10.1 support. It was not implemented, and that's a bit of a loss as far as I'm concerned.
Pricing then; we have to take into account that the GeForce GTX 280 will be introduced at launch at a price of no less than an astounding 649 USD, which is roughly 500-549 EUR here in Europe. That's a stack load of money for sure, which obviously I don't like any more than you do. The GeForce GTX 260 (which we hope to review soon) will launch at a far more interesting 399 USD retail price.
Anyway, with a fresh-off-the-shelves high-end product like this I always need to filter out the price, so let's do that for a minute. See, if you have a stack load of cash to spend, then undoubtedly this is by far the best and fastest single-GPU based graphics card that money can buy. In that respect the GTX 280 is screamingly fast, achieves super-high framerates in combination with the best image quality settings, incorporates PureVideo high-definition decoding, and then of course there are the new features we find under CUDA. And though CUDA applications are still a work in progress, I bet that soon we'll see heaps of software applications that make use of that GPU -- a truly nice development.
With this release we also see the introduction of GeForce PhysX, and all I can say is ... it's about time, folks. Though I'd like to see it become a standard in DirectX itself, physics integration through CUDA adds a new dimension to gaming. It feels and looks like a more dynamic experience. Hard to explain, but once this is fully integrated, please have a go with it, as I'm sure you'll absolutely love it.
Rounding it up, the GeForce GTX 280 is a frickin' beast that cranks the performance ladder up another notch or two. And though I can loathe the price as much as I want to, this is the high-end game ... If you want the best and have the cash to spend, look no further. You found it, man.
Thanks go out to NVIDIA for their continued support on these reference reviews; we really appreciate it.
Mighty impressive product. Pure unadulterated brute gaming fun.