The one thing I want to touch on again, as I respect this move from NVIDIA, is image quality. This is a quick copy/paste from our original GeForce 8800 article last year, as nothing has changed in this segment.
One of the things you'll notice in the new Series 8 products is that a number of pre-existing features have become much better, and I'm not only talking about the overall performance improvements and new DX10 features. Nope, NVIDIA also had a good look at image quality. Image quality is significantly improved on GeForce 8800 GPUs over the prior generation with what NVIDIA calls the Lumenex engine.
You will now have the option of 16x full-screen antialiasing quality at near 4x multisampled antialiasing performance on a single GPU, with the help of a new AA mode called Coverage Sampled Antialiasing (CSAA). We'll get into this later, but in short it is a math-based approach: the new CSAA mode computes and stores boolean coverage at 16 subsamples, and yes, this is the point where we lost you, right? We'll drop it.
So what you need to remember is that CSAA enhances application antialiasing modes with higher-quality antialiasing. The new modes are called 8x, 8xQ, 16x, and 16xQ. The 8xQ and 16xQ modes provide first-class antialiasing quality, to be honest.
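For the curious, the boolean-coverage idea above can be sketched in a few lines of Python. This is a hedged illustration, not NVIDIA's actual hardware algorithm: we assume 16 boolean coverage samples of which only 4 carry stored colors (as in the 16x mode), and the resolve simply weights each stored color by the share of coverage samples associated with it. All names are made up for illustration.

```python
# Illustrative sketch of a CSAA-style resolve (NOT NVIDIA's actual hardware
# algorithm): 16 coverage samples, but only 4 stored color samples per pixel.
def csaa_resolve(stored_colors, coverage_slots):
    """stored_colors: list of 4 (r, g, b) tuples actually stored per pixel.
    coverage_slots: 16 indices, one per coverage sample, telling which
    stored color slot that coverage sample belongs to."""
    n = len(coverage_slots)
    # Weight each stored color by its share of the 16 coverage samples.
    weights = [coverage_slots.count(i) / n for i in range(len(stored_colors))]
    return tuple(
        sum(c[ch] * w for c, w in zip(stored_colors, weights))
        for ch in range(3)
    )

# A triangle edge: half the coverage samples see red, half see blue.
colors = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
slots = [0] * 4 + [1] * 4 + [2] * 4 + [3] * 4
print(csaa_resolve(colors, slots))  # (0.5, 0.0, 0.5) -- a 50/50 blend
```

The point of the extra coverage samples is that the edge gradient gets 16 steps of granularity while the memory footprint stays close to that of 4x multisampling.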
If you pick up a GeForce 8800 GTS/GTX/Ultra then please remember this: each new AA mode can be enabled from the NVIDIA driver control panel and requires the user to select an option called Enhance the Application Setting. Users must first turn on ANY antialiasing level within the game's control panel for the new AA modes to work, since they need the game to properly allocate and enable antialiased rendering surfaces.
If a game does not natively support antialiasing, a user can select an NVIDIA driver control panel option called Override Any Applications Setting, which allows any control panel AA setting to be used with the game. You also need to know that in a number of cases (such as the edges of stencil shadow volumes) the new antialiasing modes cannot be enabled, and those portions of the scene will fall back to 4x multisampled mode. So there definitely is a bit of a tradeoff going on, as it is a "sometimes it works, sometimes it doesn't" kind of feature.
So I agree, a rather confusing method. I would simply like to select in the driver which AA mode I prefer, something like "Force CSAA when applicable"; yes, something for NVIDIA to focus on.
But 16x quality at almost 4x performance cost, with really good edges at really good framerates, that obviously is always lovely.
One of the most heated issues with the previous-generation products compared to the competition was the fact that NVIDIA graphics cards could not render AA+HDR at the same time. Well, that was not entirely true though, as it was possible with the help of shaders, as exactly four games have demonstrated. But it was a far from efficient method; a very far cry (Ed: please, no more puns!) you might say.
So what if I were to tell you that you can now not only push 16xAA with a single G80 graphics card, but also do full 128-bit FP (floating point) HDR? To give you a clue: the previous architecture could not do HDR + AA, though it could technically do 64-bit HDR (just like the Radeons). So NVIDIA got a good wakeup call and noticed that a lot of people were buying ATI cards just so they could do HDR & AA the way it was intended. Now the G80 will do the same, and it's even better. Look at 128-bit wide HDR as a palette of brightness/color range that is just amazing. Obviously we'll see this in games as soon as they adopt it, and believe me, they will. 128-bit precision (32-bit floating point values per component) permits almost real-life lighting and shadows. Dark objects can appear extremely dark and bright objects blindingly bright, with visible detail present at both extremes, in addition to completely smooth gradients in between.
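To make that "detail at both extremes" point concrete, here is a toy example (ours, not NVIDIA's pipeline): in an 8-bit style pipeline anything brighter than white clamps to the same value, while a floating-point HDR value keeps its distinctions until a tone-mapping operator, here the classic Reinhard curve purely as an illustration, compresses it for display.

```python
# Toy illustration (ours, not NVIDIA's pipeline): clamping vs. HDR tone mapping.
def clamp_ldr(luminance):
    # An 8-bit style pipeline: everything above 1.0 becomes plain white.
    return min(luminance, 1.0)

def reinhard(luminance):
    # Classic Reinhard tone-mapping curve: compresses HDR into [0, 1)
    # while keeping bright values distinguishable.
    return luminance / (1.0 + luminance)

# Two very bright lights: clamping loses the difference, tone mapping keeps it.
print(clamp_ldr(100.0), clamp_ldr(1000.0))  # 1.0 1.0  (identical: detail lost)
print(reinhard(100.0), reinhard(1000.0))    # still two distinct values below 1.0
```

That, in a nutshell, is why a wide floating-point pipeline matters: the detail survives all the way until the final step down to the display.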
As stated, HDR lighting effects can now be used together with multisampled antialiasing on GeForce 8 Series GPUs, with angle-independent anisotropic filtering added on top. The antialiasing can be used in conjunction with both FP16 (64-bit color) and FP32 (128-bit color) render targets.
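The practical difference between those FP16 and FP32 render targets is easy to demonstrate. A quick sketch of our own, using Python's IEEE 754 half-precision pack format rather than anything GPU-specific: FP16 carries only 11 significant bits, so around a brightness value of 2048 it can no longer distinguish steps of 1, while FP32 still can.

```python
import struct

def to_fp16(x):
    # Round-trip a float through IEEE 754 half precision
    # (struct's 'e' format, available since Python 3.6).
    return struct.unpack('e', struct.pack('e', x))[0]

def to_fp32(x):
    # Same round-trip, but through single precision.
    return struct.unpack('f', struct.pack('f', x))[0]

# FP16 has 11 significant bits: near 2048 its step size is already 2.0,
# so a brightness of 2049 collapses back down to 2048.
print(to_fp16(2049.0))  # 2048.0 -- the +1 of detail is gone
print(to_fp32(2049.0))  # 2049.0 -- FP32 still holds it exactly
```

So FP16 is already a huge step up from 8-bit integers, but FP32 is what keeps very bright and very dark values precise at the same time.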
Improved texture quality
It's just something we must mention. We have all been complaining about shimmering effects and filtering quality that trailed the Radeon products; that's a thing of the past. NVIDIA listened and added raw horsepower for texture filtering, making it really darn good. Well... we can actually test that!
Allow me to show you. See, I have this little tool called D3D AF Tester which helps me determine image quality in terms of anisotropic filtering. Basically, we knew that ATI has always been better at IQ compared to NVIDIA.
GeForce 7900 GTX 16xAF (HQ)
Radeon X1900 XTX 16xHQ AF
GeForce 8800 16xAF Default
Now have a look at the images above and let it sink in. It would go too far to explain in detail what you are looking at, but the more perfectly round the colored circle in the middle is, the better the image quality will be. A perfectly round circle is perfect IQ.
Impressive, to say the least. The AF patterns are just massively better compared to previous-generation hardware. Look at that, that is default IQ; that's just really good...