NVIDIA Brings DXR To Pascal GPUs With New Driver (+ some benchmarks)
NVIDIA's RTX features have been the topic of many a debate. Is hybrid ray tracing worth the extra money, and why hasn't the core technology behind RTX, the DXR API, been brought to other cards? Well, today that changes with a new driver release from NVIDIA.
Want to run Battlefield V or Metro Exodus with ray tracing enabled? Until today you needed a GeForce RTX card. Today's driver release opens up DXR support to NVIDIA Pascal GPUs in the GeForce GTX 10 series (6GB and higher) as well as the GTX 1660 series.
Above: the GTX 1080 - the performance hit is quite substantial.
There's a bit of a conundrum here, though. DXR technically only requires a DirectX 12 compatible graphics card with appropriate drivers, but without dedicated hardware the ray tracing calculations fall back to the GPU's compute capabilities, which invokes a massive performance hit. The new driver released today makes it possible to use real-time ray tracing in games via DXR, as part of DirectX 12, without dedicated RT cores, instead utilizing the traditional compute cores. With DXR effects enabled, however, performance will be significantly lower than on Turing counterparts with specialized RT cores.
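To put that in context: a game never asks for RT cores directly, it simply asks the DirectX 12 runtime whether the device exposes a raytracing tier, and the installed driver decides how that work actually gets executed - on dedicated RT cores or on the regular compute units. A minimal sketch of that capability check (assuming the Windows 10 SDK headers and d3d12.lib are available) could look like this:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a D3D12 device on the default adapter (requires a DX12-capable GPU and driver).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
    {
        std::printf("No DirectX 12 device available.\n");
        return 1;
    }

    // Ask the runtime which raytracing tier the device/driver exposes. Whether the work
    // later runs on dedicated RT cores (Turing RTX) or on a compute-shader fallback
    // (Pascal / GTX Turing) is entirely up to the driver.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("DXR is exposed by this device (tier %d).\n",
                    static_cast<int>(options5.RaytracingTier));
    }
    else
    {
        std::printf("DXR is not exposed by this device/driver.\n");
    }
    return 0;
}
```

The point being that the same check succeeds on both Turing RTX and, with today's driver, on Pascal; only the performance differs.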
The GTX 1660 Ti actually holds its ground pretty well when you compare it with the GTX 1080.
NVIDIA has implemented DXR through compute shaders running on its CUDA (shader) cores. For recent GeForce GTX 1660 series adopters that brings a performance benefit: Turing includes separate INT32 cores that execute alongside the FP32 cores, whereas Pascal handles both on the same units. As a result, the 1660 cards will run DXR significantly better than the 10 series (Pascal). They still take a massive performance hit, though, as they obviously lack dedicated RT cores.
The dark-colored charts below come from NVIDIA itself. The benchmark results as presented by NVIDIA are fairly close to what we are seeing ourselves.
DLSS remains exclusive to Turing RTX
From now on, ray tracing will no longer be exclusive to GeForce RTX cards at NVIDIA. With DLSS (Deep Learning Super Sampling) it is a different story: the AI-based anti-aliasing alternative will continue to be offered only through the dedicated Tensor cores of the RTX family, NVIDIA said in a Q&A.
So, in short: you can enable DirectX Raytracing (DXR) on GeForce GTX 1060 6GB and higher graphics cards via today's Game Ready Driver update. DLSS, however, remains a no-go.
Below you can see a number of slides with performance numbers that NVIDIA produced; we did not have an early driver to test with. In the course of today we'll run some numbers internally on our side and add them to this article, so check back for updates. The bad news is that ray tracing performance will be poor on cards without RT hardware; the good news is that you can now check visually whether you like RTX ray tracing in your games, and decide whether it should factor into your upgrade path. As a bonus, NVIDIA will also be making three demos available today: the Star Wars Reflections demo, the Justice tech demo and the Atomic Heart tech demo, which should be a really nice way to play around with and immerse yourself in a ray-traced environment.
Meanwhile, you can download and try the new driver here.
Senior Member
Posts: 13234
Joined: 2004-05-16
Yes, but I think it's worth it.
Digital artists are essentially rubbing up against the wall of what's physically possible using traditional rasterization techniques. They have all these hacks, gimmicks and methods strung together, all to create the illusion of what light should be doing. All those things are getting increasingly complex, increasingly harder to integrate, and the result of them, the "image quality", is essentially plateauing. RT kind of supersedes all that - right now it's complicated because we're still just on the cusp of performance with it. We still need a few tricks to reduce the impact; that's where machine learning comes in (use fewer ray casts and denoise the scene) along with other optimization methods (decrease quality with distance, simplify the BVH representation, single bounce, etc.) - but eventually RT will just be the default method, and as performance increases those optimizations will drop away one by one. Quality will go up way further than raster could ever bring us, and the complexity and integration "cost" will go down, because the algorithms simply mirror what light would do in real life. Grab some good materials, shove them in a scene and boom, the thing looks like real life. You're not spending 3 months writing some hacked-up GI/diffuse shader or playing with lightmaps all day because your outside directional light is leaking through your geometry in a bunch of places.
Plus there will be advancements in hardware, ray-cast collision algorithms and AI denoising that will make the impact less noticeable. Eventually you won't even get raster-based games anymore, so you won't even know what the performance impact is regardless.
Yeah, it sucks now because DXR/ray tracing is somewhat limited to Nvidia only and the cost of those cards is restrictive, but 5-6 years from now it's just going to be like AO or tessellation or any other technique that was once novel but became an expected feature.
Senior Member
Posts: 12508
Joined: 2010-05-22
No problem on my end, we're all just communicating and voicing our own ideas and opinions; does there need to be a problem? I'm confused.

Speaking of which, has anyone got any numbers for SLI and DXR? Does it scale?
You need DX11 for SLI, which doesn't support DXR.
DX12 only supports mGPU, which needs explicit support in the game; there is no way to enable it via the video driver.
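For what it's worth, here is a minimal, purely illustrative sketch (assuming the Windows 10 SDK, with d3d12.lib and dxgi.lib linked) of what "explicit support" means under DX12: the application itself enumerates every adapter and creates a device per GPU, and any splitting of work across them is the engine's job rather than the driver's.

```cpp
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // One explicit D3D12 device per physical GPU; nothing here is automatic.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                        IID_PPV_ARGS(&device))))
        {
            std::wprintf(L"Adapter %u usable: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }

    // Splitting frames or passes across 'devices' (cross-adapter resources, copies
    // and synchronization) has to be written by the game/engine itself, which is
    // why DX12 mGPU only works when the title explicitly supports it.
    return 0;
}
```

Contrast that with DX11 SLI, where the driver could apply AFR behind the game's back.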
Senior Member
Posts: 1463
Joined: 2011-02-17
I might have missed it if it has been mentioned already, but Shadow of the Tomb Raider supports DX12 multi-GPU, so that's one game Pascal SLI users could try with DXR enabled. There's even a free demo with the first mission of the game.
Senior Member
Posts: 14606
Joined: 2008-08-28
Well, in the shop where I work an RTX 2080 purchase is very rare, and the 2080 Ti basically none... maybe 2-3 in 3 months.
Mostly 1660, 2060 and a few 2070 here and there...
That's not so strange. The 2070 looks very appealing. It's the only card I would consider from NVIDIA. It has enough horsepower and enough VRAM.
I think some of the cheaper models can be found below 500 EUR. How are the prices in your shop?