GeForce FX 5800 Ultra review

The Technology #1

This new GPU is being positioned as a cinematic GPU, as it is capable of bringing cinematic visual effects to your PC through a combination of brute-force power and an excellent feature set. The CineFX GPU is, of course, capable of handling DirectX 9 Pixel Shader 2.0+ and Vertex Shader 2.0+, as well as OpenGL.

Such a GPU of course deserves good memory bandwidth, and that rumor turned out to be true: the GeForce FX GPU makes use of DDR2 local memory running at 500 MHz, for an effective 1 GHz frequency! And although NVIDIA has not announced the core clock frequency yet, you can expect it to be in the 400-500 MHz range. In theory, a graphics card like that should be able to score 20,000-22,000+ 3DMarks.

We can always get a rough idea of where a card will land performance-wise by looking at its memory bandwidth. Based on the DDR2 frequency we can work it out: a 128-bit bus is 16 bytes wide, and at an effective 1000 MHz that gives 16 bytes x 1000 MHz = roughly 16,000 MB/s of memory bandwidth. It's not all about memory bandwidth, of course.
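To make that arithmetic explicit, here is a minimal sketch of the bandwidth calculation. The bus width and effective clock are the figures quoted above; everything else is just unit conversion.

```python
# Theoretical peak memory bandwidth of the GeForce FX 5800 Ultra,
# based on the figures quoted above (128-bit bus, 500 MHz DDR2 = 1 GHz effective).
bus_width_bits = 128          # memory interface width
effective_clock_mhz = 1000    # 500 MHz DDR2, doubled data rate

bytes_per_transfer = bus_width_bits / 8              # 16 bytes per clock
bandwidth_mb_s = bytes_per_transfer * effective_clock_mhz

print(f"Peak memory bandwidth: {bandwidth_mb_s:.0f} MB/s (~{bandwidth_mb_s / 1000:.1f} GB/s)")
# -> Peak memory bandwidth: 16000 MB/s (~16.0 GB/s)
```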

Realtime Cinematic Effects

Developers on GeForce FX - John Carmack (Id Software)
NVIDIA is the first of the consumer graphics chip companies to firmly understand what is going to be happening with the convergence of consumer realtime and professional offline rendering. The architectural decision in the NV30 to allow full floating point precision all the way to the framebuffer and texture fetch, instead of just in internal paths, is a good example of far sighted planning. It has been obvious to me for some time how things are going to come together, but Nvidia has made moves on both the technical and company strategic fronts that are going to accelerate the timetable over my original estimations.

My current work on Doom is designed around what was made possible on the original GeForce, and reaches an optimal implementation on the NV30. My next generation of work will be designed around what is made possible on the NV30.

The GeForce FX has Raw Computing Power

First let's take a look at how and where the GeForce FX gets its speed from. With a GPU as strong and powerful as the GeForce FX, you'll of course be able to play any of today's and tomorrow's games at frighteningly fast speeds and high resolutions. To achieve all that raw speed, there are some key factors in the graphics core that make it possible:

  • 8 pixels per clock cycle. The high graphics bandwidth facilitates the application of complex textures, lighting, and other effects to an entire scene, without limiting cinematic realism to a portion of the screen or to the main characters. Games and other desktop applications take on a film-like appearance, in real time. The GeForce FX has 4 pipelines with 2 texture memory units each, which results in 8 textures per clock, but only when multitexturing. Many games, including Unreal Tournament 2003, use both single and multitexturing. In cases where you don't need multitexturing and can build your scene with single texturing, you want to do that, as every bit of bandwidth you can save really counts. (A rough fill-rate calculation follows this list.)

  • NVIDIA Intellisample technology. The NVIDIA GeForce FX GPU is the first GPU to include both Z and color compression, automatically providing a boost to antialiasing. These advances in compression and antialiasing techniques ensure realistic color and smooth edges at all resolutions, without any loss in performance. Users will see the most fluid frame rates possible for a truly realistic experience.

  • DDR2 - Innovative interfaces to the latest 1GHz DDR2 memories. The next generation NVIDIA memory interface technology maximizes the bandwidth that can be achieved using the latest high-speed memory chips.

  • AGP 8X bus implementation. The newest specification of the AGP bus doubles the theoretical bandwidth between the graphics engine and the rest of the system, accelerating transfers to main memory and minimizing the overhead associated with storing and retrieving textures using main memory.

  • 0.13-micron semiconductor fabrication, which allows very high clock speeds and lower heat output. A higher clock frequency of course means a higher framerate and more performance. I think we can expect a core clock somewhere in the 400-500 MHz range!
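As a rough illustration of what those numbers mean for throughput, here is a minimal sketch of the theoretical pixel and texel fill rates, using the 4x2 pipeline/TMU configuration described above. Note that the 500 MHz core clock is only my estimate from the range above, not an announced figure.

```python
# Theoretical fill-rate estimate for the GeForce FX, based on the 4 pipelines
# with 2 TMUs each described above. The 500 MHz core clock is an assumption;
# NVIDIA has not announced the final clock yet.
core_clock_mhz = 500          # assumed, from the 400-500 MHz estimate
pixel_pipelines = 4           # pixels written per clock
tmus_per_pipeline = 2         # textures sampled per pipeline per clock

pixel_fill_mpix_s = pixel_pipelines * core_clock_mhz
texel_fill_mtex_s = pixel_pipelines * tmus_per_pipeline * core_clock_mhz

print(f"Pixel fill rate: {pixel_fill_mpix_s} Mpixels/s")   # -> 2000 Mpixels/s
print(f"Texel fill rate: {texel_fill_mtex_s} Mtexels/s")   # -> 4000 Mtexels/s
```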

Let's put all that data in a table and compare it with the previous generation, the GeForce4:

(Table: GeForce FX vs. GeForce4 specifications)

When we step away from the raw power that the GeForce FX will bring, we move on to the concept of quality.

  • 128-bit studio precision pixel processing throughout the entire graphics pipeline for uncompromised precision levels without visual artifacts.

  • A seamless design that takes advantage of high-level shading languages, including Cg. These high-level languages make it easier for artists and developers to create stunning content and exploit the potential of the GeForce FX GPU's capabilities.

This commitment to engineering excellence has made the NVIDIA GeForce FX GPU the best development platform and the best playback platform for next-generation cinematic-quality games. When a programmer uses specialized graphics programs called shaders, the NVIDIA CineFX engine allows artists and designers to easily convert their artistic visions into digital content. By simplifying the creation of shaders and building in support for shader execution at the hardware level, the NVIDIA GeForce FX GPUs inspire a new generation of special effects programming. The NVIDIA GPUs also offer programmers the power of the Cg high-level graphics programming language, and the choice of programming environments. So basically, the programmer of a game is given a specialized set of tools to create whatever he or she wants. The new GPU offers the most complete hardware implementation for both the OpenGL and Microsoft DirectX APIs.

GeForce FX enables cinematic-quality real-time rendering

All of these programming advancements are complemented by significantly increased levels of precision. The new graphics pipeline of the CineFX engine has the built-in capacity to deliver true 128-bit color, that is, 32-bit floating-point components for the red, green, blue, and alpha values. With 128-bit color, literally millions of values exist for each color component, compared to only 256 levels per component with conventional 32-bit color.
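To give a feel for why that extra per-component precision matters, here is a minimal, purely illustrative sketch of how rounding error creeps in when a color value is repeatedly processed at 8 bits per component, whereas a floating-point pipeline keeps the value intact. It is not an NV30 simulation, just a demonstration of the quantization principle.

```python
# Illustration of 8-bit-per-component quantization versus floating-point
# precision: scale a color value down and back up repeatedly, as a chain
# of render passes might do.
def quantize_8bit(value):
    """Round a [0, 1] color value to the nearest of 256 levels."""
    return round(value * 255) / 255

color_fp = 0.5      # kept at full floating-point precision
color_8bit = 0.5    # stored as an 8-bit level after every operation

for _ in range(20):  # 20 successive "passes"
    color_fp = (color_fp * 0.9) / 0.9
    color_8bit = quantize_8bit(quantize_8bit(color_8bit * 0.9) / 0.9)

print(f"float result: {color_fp:.6f}")    # 0.500000 (value preserved)
print(f"8-bit result: {color_8bit:.6f}")  # 0.501961 (quantization error has crept in)
```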
