
GeForce GTX 280 review - 2 - A Crash Course on Shaders

by Hilbert Hagedoorn on: 06/15/2008 01:00 PM

The quick 101 - Shaders explained

Dude .. what is a shader?
To understand what is going on inside that graphics card of yours, please allow me to explain what actually happens inside the graphics processor and what shaders are, in very easy-to-understand terminology (I hope), and how it all relates to rendering that gaming goodness on your screen (the short version).

What do we need to render a three-dimensional object as 2D on your monitor? We start off by building some sort of structure that has a surface, and that surface is built from triangles. Triangles are great as they are really quick and easy to compute. Next we need to process each triangle: each triangle has to be transformed according to its relative position and orientation to the viewer.
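To make that transform step a bit more concrete, here is a minimal sketch in CUDA-flavoured C++ of what happens to every vertex of every triangle before lighting and projection: it is multiplied by a combined world/view matrix. The struct and function names are made up for illustration; this is not driver or API code.

    // Minimal sketch of the per-vertex transform step (hypothetical names).
    struct Vec4 { float x, y, z, w; };
    struct Mat4 { float m[4][4]; };   // row-major 4x4 world/view transform

    __host__ __device__ Vec4 transformVertex(const Mat4 &M, const Vec4 &v)
    {
        Vec4 r;
        r.x = M.m[0][0]*v.x + M.m[0][1]*v.y + M.m[0][2]*v.z + M.m[0][3]*v.w;
        r.y = M.m[1][0]*v.x + M.m[1][1]*v.y + M.m[1][2]*v.z + M.m[1][3]*v.w;
        r.z = M.m[2][0]*v.x + M.m[2][1]*v.y + M.m[2][2]*v.z + M.m[2][3]*v.w;
        r.w = M.m[3][0]*v.x + M.m[3][1]*v.y + M.m[3][2]*v.z + M.m[3][3]*v.w;
        return r;
    }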

The next step is to light the triangle by taking the transformed vertices and applying a lighting calculation for every light defined in the scene. Finally, the triangle needs to be projected to the screen in order to rasterize it. During rasterization the triangle will be shaded and textured.
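A hedged sketch of what that per-vertex lighting calculation boils down to for simple directional lights (again, hypothetical names, and vectors assumed to be normalized): for every light in the scene we add a diffuse Lambert term based on the angle between the surface normal and the light direction.

    // Sketch of per-vertex diffuse lighting; names are hypothetical.
    #include <math.h>

    struct Vec3  { float x, y, z; };
    struct Light { Vec3 dir; Vec3 color; };   // directional light, dir points towards the light

    __host__ __device__ float dot3(const Vec3 &a, const Vec3 &b)
    {
        return a.x*b.x + a.y*b.y + a.z*b.z;
    }

    __host__ __device__ Vec3 lightVertex(const Vec3 &normal, const Light *lights, int numLights)
    {
        Vec3 result = {0.0f, 0.0f, 0.0f};
        for (int i = 0; i < numLights; ++i) {
            float diffuse = fmaxf(dot3(normal, lights[i].dir), 0.0f);  // clamp back-facing contribution to zero
            result.x += lights[i].color.x * diffuse;
            result.y += lights[i].color.y * diffuse;
            result.z += lights[i].color.z * diffuse;
        }
        return result;
    }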

Graphics processors like the GeForce series are able to perform a large share of these tasks in hardware. Actually, the first generation (roughly ten years ago) was already able to draw shaded and textured triangles in hardware, which was a revolution.

The CPU still had the burden of feeding the graphics processor with transformed and lit vertices, triangle gradients for shading and texturing, and so on. Integrating triangle setup into the chip logic was the next step, and finally even transformation and lighting (TnL) became possible in hardware, reducing the CPU load considerably (surely everyone remembers the GeForce 256, right?).

The big disadvantage at that time was that a game programmer had no direct (i.e. program driven) control over transformation, lighting and pixel rendering because all the calculation models were fixed on the chip. This is the point in time where shader design surfaced.

We now finally get to the stage where we can explain Shaders.

In the year 2000 DirectX 8 was released; vertex and pixel shaders arrived and allowed software and game developers to program tailored transformation and lighting calculations as well as pixel coloring functionality. This gave the gaming experience a new graphics dimension, and things started to look much more realistic.

Each shader is basically nothing more than a relatively small program (programming code) executed on the graphics processor to control either vertex, pixel or geometry processing. So a shader unit is in fact a small floating-point processor inside your GPU.
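To give a feel for just how small such a program typically is, here is a purely illustrative "pixel shader"-style routine written as a CUDA device function. Real shaders are written in HLSL, GLSL or Cg, and every name here is made up; the point is only the shape of the job: take a few inputs, return one pixel colour.

    // Illustrative only: the scale of a tiny "pixel shader", expressed as a
    // C-style device function rather than real HLSL/GLSL.
    struct Vec3 { float x, y, z; };   // repeated here so the snippet stands on its own

    __host__ __device__ Vec3 toyPixelShader(const Vec3 &texColor, const Vec3 &lighting)
    {
        // modulate the sampled texture colour by the interpolated vertex lighting
        Vec3 out;
        out.x = texColor.x * lighting.x;
        out.y = texColor.y * lighting.y;
        out.z = texColor.z * lighting.z;
        return out;
    }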

When we advance to the year 2002, we see the release of DirectX 9. DX9 brought the advantage of much longer shader programs than before, with pixel and vertex shader version 2.0.

In the past, graphics processors had dedicated units for the different types of operations in the rendering pipeline, such as vertex processing and pixel shading.

Last year, with the introduction of DirectX 10, it was time to move away from that rather inefficient fixed pipeline and create a new unified shader architecture.

So each time we mention a shader processor, we mean one of the many such processors inside your GPU. When we mention a shader, that's the program executed on the shader engine (the accumulated shader processor domain).

NVIDIA likes to call these stream processors. Same idea, slightly different context. GPUs are stream processors – processors that can operate in parallel on many independent vertices and fragments at once. A stream is simply a set of records that require similar computation.
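That stream idea maps almost one-to-one onto how CUDA (which comes up again below) exposes the hardware: the same small routine runs once per record, and the GPU schedules thousands of those invocations in parallel. A minimal, hypothetical sketch, here brightening a stream of fragment colours:

    // One thread per record: the "stream" is the array of colours, and the
    // computation is the same for every element. Names are hypothetical.
    #include <cuda_runtime.h>

    struct Color { float r, g, b; };

    __global__ void brightenStream(Color *stream, int count, float gain)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < count) {
            stream[i].r *= gain;
            stream[i].g *= gain;
            stream[i].b *= gain;
        }
    }

    // launched roughly as: brightenStream<<<(count + 255) / 256, 256>>>(devPtr, count, 1.2f);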

Okay, we now have several types of shaders: pixel, vertex, geometry and, with the coming of DX11, likely a compute shader, which is a pretty cool one, as you can run physics acceleration without needing to go through a graphics API. Fun thing: this is already possible through the CUDA engine, but we'll talk about that later on.
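That "physics without a graphics API" point is easy to picture with a toy CUDA kernel (hypothetical names, a plain Euler integration step): the particle data never touches Direct3D or OpenGL, yet the same shader processors do the number crunching.

    // Toy physics step on the GPU, no graphics API involved. Names are made up.
    #include <cuda_runtime.h>

    struct Particle { float x, y, z; float vx, vy, vz; };

    __global__ void integrate(Particle *p, int count, float dt, float gravity)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= count) return;
        p[i].vy -= gravity * dt;     // apply gravity to the velocity
        p[i].x  += p[i].vx * dt;     // advance the position
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }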

There. I do hope you now understand the concept of shaders and shader processors. Let's talk a little about the GeForce 200 series, shall we?

GeForce GTX 280 - GeForce GTX 200 Series



