Albatron GeForce 8800 GTS 512MB



2 - G92 & PureVideo HD

 

Alright, let's get cracking. To understand the GeForce 8800 GTS 512MB we must first go back to the GPU embedded in this product, so first a quick chat about the graphics processor developed under the codename "G92". If you think you recognize that codename, you are definitely onto something: G92 is the GPU used in the product you better know as the GeForce 8800 GT, specifically the 512MB models. That product alone already comes pretty close, performance-wise, to the GeForce 8800 GTX, so imagine what a GTS can do.
The G92 silicon is built on NVIDIA's newer 65nm fabrication process. Is this merely a respin chip? Yes and no. While this GPU architecture is roughly 95% the same as the good old G80 (GTS/GTX/Ultra) core, there are some very distinct differences.

The first difference is the fabrication process: they made the chip "smaller". It was moved from 90nm to 65nm, meaning a smaller die size, likely resulting in lower core voltages, better energy efficiency and perhaps better clock speeds. The second positive is that the product has more shader processors than the previous GTS 320/640MB series. Let me explain: an 8800 GTX has 128 of these shader processors embedded in the graphics core, the old 8800 GTS (320/640MB) has 96 shader processors, the new 8800 GT has 112 activated shader processors, yet importantly the 8800 GTS 512MB has the full 128 shader processors enabled; the 9800 GX2 is based on that same configuration, which is why both of its GPUs carry 128 shader processors as well.
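As a quick back-of-the-envelope illustration of what that process shrink means for die area (assuming ideal scaling, which real chips never fully achieve):

```python
# Ideal area scaling when moving logic from a 90nm to a 65nm process node.
# Real-world die-size savings are smaller; this is only a ballpark figure.
old_node_nm, new_node_nm = 90.0, 65.0
area_scale = (new_node_nm / old_node_nm) ** 2
print(f"The same logic would occupy roughly {area_scale:.0%} of the original area")
# -> roughly 52%, i.e. close to half the silicon area per transistor
```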

Interesting, because if you take a peek at the clock speeds you might even think that a G92 GPU could beat a GTX. Well, shader processors are not the only part of the equation; at the end of the pipeline there's this thing called ROPs, and that's where the old GTS (20 ROPs) and this G92 (16 ROPs) come up a little short compared to the GTX (24 ROPs). The G92 does apply a newly optimized compression algorithm in its ROPs though, which makes them a bit more efficient.
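To show why the ROP count still matters despite the higher core clock, here's a simple theoretical pixel fill-rate calculation. The reference core clocks used below (575 MHz for the 8800 GTX, 650 MHz for the 8800 GTS 512MB) are assumptions for the sake of the example; individual boards may be clocked differently:

```python
# Theoretical pixel fill rate = number of ROPs * core clock.
# Reference clock speeds assumed; actual boards may be clocked differently.
cards = {
    "GeForce 8800 GTX":       {"rops": 24, "core_mhz": 575},
    "GeForce 8800 GTS 512MB": {"rops": 16, "core_mhz": 650},
}
for name, spec in cards.items():
    gpixels = spec["rops"] * spec["core_mhz"] / 1000.0  # gigapixels per second
    print(f"{name}: {gpixels:.1f} Gpixels/s")
# GTX ~13.8 Gpixels/s versus GTS 512MB ~10.4 Gpixels/s:
# the higher clock does not fully make up for the eight missing ROPs.
```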

Next to that, the G92 utilizes memory differently: the GTX, for example, addresses a wider 384-bit memory bus, as opposed to the G92's 256-bit bus, which means less memory bandwidth to play around with. The performance differential between the two processors is small though, as our benchmarks have already shown.
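The bandwidth gap is easy to work out yourself. A small sketch, assuming the reference GDDR3 memory clocks of 900 MHz on the 8800 GTX and 970 MHz on the 8800 GTS 512MB (double data rate in both cases):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective memory clock.
# GDDR3 is double data rate, so the effective clock is twice the memory clock.
def bandwidth_gb_s(bus_bits, mem_clock_mhz):
    bytes_per_clock = bus_bits / 8
    effective_clock_hz = 2 * mem_clock_mhz * 1e6
    return bytes_per_clock * effective_clock_hz / 1e9  # bytes/s -> GB/s

print(f"8800 GTX       (384-bit, 900 MHz): {bandwidth_gb_s(384, 900):.1f} GB/s")  # ~86.4
print(f"8800 GTS 512MB (256-bit, 970 MHz): {bandwidth_gb_s(256, 970):.1f} GB/s")  # ~62.1
```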

A logical question on your end would be: why is the 9800 GX2 a Series 9 product, while the 8800 GTS 512MB, based on the same G92 graphics core, is a Series 8 product?

So in summary, the new features of the G92 compared to the G80 are: a smaller 65nm fabrication process, silicon optimizations (compression algorithms), use of a 256-bit memory bus and, compared to the older GTS models, a higher shader processor count at 128 sub-cores.

Also new in the G92 is an improved video decoding engine. Let's have a little chat about that.

New PureVideo Enhancements

PureVideo HD is a video engine built into the GPU of your graphics card (dedicated core logic). It allows for dedicated GPU-based video processing to accelerate, decode and enhance the image quality of low- and high-definition video in the following formats: H.264, VC-1, WMV/WMV-HD, and MPEG-2 (HD). Speaking more generically, your graphics card can be used to handle SD/HD material in two categories:

HD Acceleration
The more your graphics card can decode, the better, as it lowers the overall CPU load of your PC. We'll measure with the two most popular codecs used on Blu-Ray and HD-DVD movies: VC-1 is without a doubt the most used format, and secondly there's the hefty, but oh so sweet, H.264 format. We'll fire off a couple of movies and let the graphics card decode the content; meanwhile, like a vicious minx, we'll be monitoring and recording the CPU load of the test PC.
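For the curious, logging that CPU load can be automated with a few lines of script. The snippet below is a hypothetical helper (not our actual test setup), using the psutil package to sample the system-wide CPU load while the movie plays:

```python
# Minimal sketch: sample the system-wide CPU load while a movie is playing.
# Hypothetical helper, not the actual harness used for the benchmarks here.
import psutil

def average_cpu_load(duration_s=60, interval_s=1.0):
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # cpu_percent() with an interval blocks and averages over that window.
        samples.append(psutil.cpu_percent(interval=interval_s))
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"Average CPU load: {average_cpu_load(duration_s=60):.1f}%")
```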

HD Quality
Not only can the graphics card help offload processing from the CPU, it can also improve (enhance) image quality, as it should. So besides checking out the performance of AMD's Avivo HD and NVIDIA's PureVideo HD video engines, we want to see how they affect image quality, i.e. how they post-process and enhance the movie.

Basically, in the entire GeForce Series 8 and obviously the new Series 9 range we see a 10-Bit display processing pipeline and also new post-processing options like:

  • VC-1 & H.264 HD Spatial-Temporal De-Interlacing
  • VC-1 & H.264 HD Inverse Telecine
  • HD Noise Reduction
  • HD Edge Enhancement
  • HD Dynamic Contrast Enhancement
  • HD Dynamic Color Enhancement

Today's newly added features (the two Dynamic enhancements at the bottom of the list) will be available for all GeForce Series 8 & 9 products. The 8800 GTS 512MB comes with the exact same VP2 decoding engine as found on the 8800 GT, so you'll have your low-CPU, post-processed 1080P decoding and image quality options with HD-DVD and Blu-Ray, just like the 8800 GT, thanks to the new Video Processor 2 engine built in. I quickly verified: the HQV-HD score is at its maximum of 100 points and thus working like a charm.

Now, read this topic carefully, as with this new release we mainly talk about enhancements. In the upcoming drivers you'll notice the addition of two new features: Dynamic Contrast and Dynamic Color Enhancement.

It does pretty much what the name says: dynamic contrast enhancement technology improves the contrast ratio of videos in real-time, on the fly. It's a bit of a tricky thing to do, as there are certain situations where you do not want your contrast increased. Think, for example, of a scary thriller in a dark environment ... and all of a sudden your trees light up. With that in mind, the implementation has been done very delicately. It does work pretty well, but personally I'd rather tweak the contrast ratio myself and leave it at that.
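Purely to give you a feel for what such a filter does conceptually, here's a toy per-frame contrast stretch in NumPy; NVIDIA's driver-side implementation is, of course, far more subtle than this:

```python
# Toy illustration of per-frame contrast stretching; not NVIDIA's algorithm.
import numpy as np

def stretch_contrast(frame, low_pct=2, high_pct=98):
    """Linearly remap pixel values so the given percentiles span 0..255."""
    lo, hi = np.percentile(frame, [low_pct, high_pct])
    if hi <= lo:
        return frame  # flat frame, nothing to stretch
    stretched = (frame.astype(np.float32) - lo) * 255.0 / (hi - lo)
    return np.clip(stretched, 0, 255).astype(np.uint8)

# Example: a dim grayscale frame becomes noticeably punchier.
dark_frame = np.random.randint(40, 120, size=(1080, 1920), dtype=np.uint8)
print(stretch_contrast(dark_frame).max())  # now close to 255
```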

To the right you can see a screenshot where the new options are located, and yes sorry for the Dutch language.

** Idea for NVIDIA's driver team: selectable languages in the ForceWare drivers.

The second feature is Dynamic Color Enhancement. It's pretty much a color tone enhancement feature that applies a slight color correction where needed. We'll show you that in a bit, as I quite like this feature; it makes certain aspects of a movie a little more vivid.
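Again, just as a conceptual illustration (and definitely not NVIDIA's actual method), a basic color/saturation boost can be as simple as pushing each pixel's channels away from its own gray value:

```python
# Toy saturation boost: push each pixel's channels apart from its gray value.
# Conceptual sketch only, not the algorithm used by the driver.
import numpy as np

def boost_saturation(frame_rgb, factor=1.25):
    f = frame_rgb.astype(np.float32)
    gray = f.mean(axis=-1, keepdims=True)   # per-pixel luminance proxy
    vivid = gray + (f - gray) * factor      # amplify the colour component
    return np.clip(vivid, 0, 255).astype(np.uint8)

frame = np.zeros((720, 1280, 3), dtype=np.uint8)
frame[...] = (90, 140, 180)                 # a rather dull test colour
print(boost_saturation(frame)[0, 0])        # channels pushed apart -> more vivid
```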

There's also a small new addition for Vista Aero enthusiasts. Previously, when you played back a movie that utilized the graphics processor with software like PowerDVD, Vista would shut down Aero and revert to the basic Vista theme; this has now been solved, and window transparency, thumbnail previews etc. all keep working as intended.

Also new is a feature called Dual-Stream decode. It pretty much boils down to being able to decode and display two video streams simultaneously. Pretty handy if you watch a Blu-Ray movie with a small director's commentary window on the lower part of your screen.

(Screenshot: split-frame comparison with edge enhancement and noise reduction enabled.)

Let's split the frame in two and compare with all the interesting tweaks enabled. Two older features, edge enhancement and noise reduction, are obviously also at your disposal. To the left is the baseline (first) image, to the right the final result. Once we enable these as well and combine them with the Dynamic Contrast and Color Enhancement options, we see a distinct difference in image quality. Thanks to edge enhancement the frame is noticeably sharper.
