Quick test: Wolfenstein II: The New Colossus Adaptive Shading Benchmarks
A new patch has been released for Wolfenstein II: The New Colossus, and interestingly enough, it adds support for NVIDIA Turing's Adaptive Shading. The claim is enticing: higher performance thanks to this new technology.
With adaptive shading, an algorithm analyzes the previous frame and determines which parts of the current frame can be rendered with less detail without noticeable image quality loss. In essence, the areas your eyes focus on are shaded at full quality and the rest a little less; in regions you hardly look at, shading is no longer performed per pixel but once per block of four pixels.
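To make the idea concrete, here is a minimal C++ sketch of the kind of heuristic such a technique could use; it is not the game's actual implementation, which runs on Turing's variable rate shading hardware with NVIDIA's own heuristics. The function name BuildShadingRateMap, the 16x16 tile size and the variance threshold are illustrative assumptions: each tile of the previous frame's luminance is checked for contrast, and flat, low-detail tiles are marked to be shaded once per 2x2 pixel block instead of per pixel.

#include <algorithm>
#include <vector>

// Conceptual sketch only: choose a shading rate per screen tile based on how
// much detail the previous frame showed there.
enum class ShadingRate { Full1x1, Coarse2x2 };  // per-pixel vs. one result per 2x2 block

// prevLuma holds the previous frame's luminance, row-major, width*height values in [0,1].
// Returns one shading rate per 16x16 tile (edge tiles are simply smaller).
std::vector<ShadingRate> BuildShadingRateMap(const std::vector<float>& prevLuma,
                                             int width, int height,
                                             float varianceThreshold = 0.02f)
{
    const int tile = 16;
    const int tilesX = (width + tile - 1) / tile;
    const int tilesY = (height + tile - 1) / tile;
    std::vector<ShadingRate> rates(tilesX * tilesY, ShadingRate::Full1x1);

    for (int ty = 0; ty < tilesY; ++ty) {
        for (int tx = 0; tx < tilesX; ++tx) {
            // Measure luminance variance inside this tile.
            float sum = 0.0f, sumSq = 0.0f;
            int count = 0;
            for (int y = ty * tile; y < std::min((ty + 1) * tile, height); ++y) {
                for (int x = tx * tile; x < std::min((tx + 1) * tile, width); ++x) {
                    const float l = prevLuma[y * width + x];
                    sum += l;
                    sumSq += l * l;
                    ++count;
                }
            }
            const float mean = sum / count;
            const float variance = sumSq / count - mean * mean;

            // Flat areas (walls, fog, dark corners) tolerate coarser shading;
            // detailed areas keep the full per-pixel rate.
            rates[ty * tilesX + tx] = (variance < varianceThreshold)
                                          ? ShadingRate::Coarse2x2
                                          : ShadingRate::Full1x1;
        }
    }
    return rates;
}

In a real renderer a map like this would be rebuilt every frame and handed to the GPU's variable rate shading path, which is why the technique needs data from the previous frame in the first place.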
-- Patch info --
A new patch has been deployed to implement NVIDIA Adaptive Shading, improving the performance of Wolfenstein II: The New Colossus. We’ve been working with NVIDIA to make sure the game runs great on NVIDIA RTX hardware. Wolfenstein II: The New Colossus debuts the first implementation of NAS.
Additional patch notes:
- Added support for NVIDIA Adaptive Shading on NVIDIA RTX series GPUs. (Improves frame rate by dynamically adjusting the shading resolution in different areas of the screen, without affecting fidelity).
- Ensured that, on multiple GPU systems, the discrete GPU is preferred over an integrated GPU.
- Players can now choose to ignore/suppress warnings when the selected video settings exceed the amount of dedicated VRAM available on the GPU.
- Fixes for skinning issues on GTX 970
The patch was released yesterday, and I decided to see how much of a difference it really makes; the results below show the performance differences. The test was conducted on our eight-core Haswell-E platform at 4200 MHz on all cores, paired with a GeForce RTX 2080. Adaptive shading can be selected in the settings at several stages: Off, Quality, Balanced and Performance, and a custom profile can be configured as well. Since Full HD is CPU limited on the GeForce RTX 2080 we skipped that resolution; below you will find average framerates at 2560x1440 (WQHD) and 3840x2160 (Ultra HD), measured over a 30-second recorded run per preset and resolution.
We can confirm that there is very little image quality degradation, if you can notice it at all. We rendered at the Mein Leben! preset, customized with adaptive shading. Going from Off to the adaptive Performance mode you are looking at roughly a 5% performance differential, in a game that already offers breathtaking framerates on any Turing-based graphics card.
And per request, the same result set at a normalized scale:
Senior Member
Posts: 15258
Joined: 2008-08-28
Bender, that sh1t is funny, but you should stop it.

Senior Member
Posts: 115
Joined: 2014-08-19
We might even need a semi-NSFW picture, with RTX on and a Brazzers logo, you know, 'cause people took it tenfold in the as*... as always when you buy an Nvidia product.
Senior Member
Posts: 16009
Joined: 2004-08-18
Seems a bit pointless to me, as the performance increase is so minimal that you will almost certainly not notice any difference while playing. The RTX 2080 (and presumably the RTX 2080 Ti) already runs this game at such ridiculously high framerates that a few more FPS at 1440p and 4K is really not that worthwhile.
This technology might be more useful on lower-end Turing cards, such as an RTX 2050/2060, which I assume will be out at some point.
I have not really been impressed with the RTX range so far. Sure, the performance increase of the RTX 2080 Ti over my own GTX 1080 Ti is nice, but that is par for the course with any new range of GPUs, and that card isn't really offering a huge leap over what we've had before, just the usual 25-40% average increase over the previous generation's equivalent GPU. The RTX features so far have been very underwhelming and definitely not worth the price hike in my view.
As with any new technology, though, the first generation is always a proof of concept that doesn't quite run as expected, but I'm sure that future hardware will see significant improvements and lower prices as well. I am more than happy to wait, as there is no game out there that demands anything more powerful than my current GTX 1080 Ti, and the RTX enhancements are so sparse that there is absolutely no reason to upgrade.