Raytracing without RTX: Nvidia Pascal receives DXR support via driver
XenthorX
Interesting move. I wonder why some suggest it might be a move to counter the Crytek presentation; it's not like these kinds of things are decided overnight.
Fact is, they realized some ray tracing features run on the GTX 10xx series. Keep in mind you can run the ray-traced pass at a lower resolution, so lots of GPUs can use ray tracing, but only the RTX cards will be able to perform at the higher resolutions.
The Crytek presentation is inspiring because it shows that you can already get incredible results without dedicated hardware; just imagine how far you can go with those RTX cards, then.
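A rough back-of-the-envelope sketch of the point about running the ray-traced pass at a lower resolution: primary ray count scales with pixel count, so halving the resolution quarters the ray budget. The numbers below are purely illustrative, not vendor figures.

```python
# Illustrative only: primary rays per frame scale with pixel count,
# so a half-resolution ray-traced pass needs a quarter of the rays.

def rays_per_frame(width, height, samples_per_pixel=1):
    """Primary rays needed for one frame of a ray-traced pass."""
    return width * height * samples_per_pixel

full = rays_per_frame(1920, 1080)  # full 1080p ray-traced pass
half = rays_per_frame(960, 540)    # same pass at half resolution

print(full, half, full / half)     # half-res pass needs 4x fewer rays
```

This is why a Pascal card can run a reduced-resolution ray-traced pass at all: the workload shrinks quadratically with resolution, even before any hardware acceleration enters the picture.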
RooiKreef
It amazes me to see how Nvidia will fcuk consumers over time after time like this. I bet if we had never seen the Crytek demo, where AMD could do DXR on current hardware with no special cores, then Nvidia would never have made a driver update like this. Oh well, it's a good thing, because now everybody knows you don't need special hardware to execute DXR on a DX12-compatible GPU. I never understood the reasoning behind special cores that can only do RT and AI when you could have just built a bigger GPU with a higher total core count and done everything on it.
HWgeek
https://www.nvidia.com/en-us/geforce/news/geforce-gtx-ray-tracing-coming-soon/
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/geforce-rtx-gtx-dxr/geforce-rtx-gtx-dxr-shadow-of-the-tomb-raider-performance-850px.png
Looking at the test images: could it be that the performance uplift of the Turing RTX 2080 (without RT cores) over the Pascal GTX 1080 Ti comes from FP16 at a 2:1 rate? (20,137 GFLOPS FP16 on the RTX 2080 vs 11,340 GFLOPS FP32 on the 1080 Ti.)
Also, we already know that RT needs very fast memory with low latency, so let's wait and see how this limited RT effect performs on Pascal vs Vega 56/64 and the Radeon VII with HBM and FP16 2:1.
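Taking the GFLOPS figures quoted above at face value, a quick ratio check (illustrative only; real DXR performance depends on far more than raw peak throughput):

```python
# Ratio of the peak-throughput numbers quoted in the comment above.
# Real-world DXR performance also depends on memory latency, BVH
# traversal, scheduling, etc., so treat this purely as an upper bound
# on what FP16 2:1 could contribute to the gap.

rtx2080_fp16_gflops = 20137    # RTX 2080 peak FP16 (2:1 rate), as quoted
gtx1080ti_fp32_gflops = 11340  # GTX 1080 Ti peak FP32, as quoted

ratio = rtx2080_fp16_gflops / gtx1080ti_fp32_gflops
print(f"peak-throughput ratio: {ratio:.2f}x")  # roughly 1.78x on paper
```

So on paper the FP16 path could explain an uplift of up to ~1.8x, but only if the DXR workload actually runs in half precision, which Nvidia has not confirmed.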
For me, just give me reflections in games like Rainbow Six Siege so I can see Caveira sneaking up on me, and I am happy 🙂.
Glidefan
Moderator
Yes! GPU baking 😀
HardwareCaps
So basically, Turing's main unique feature, "ray tracing", is going to be supported by the previous generation as well?
Why would anyone get an RTX card now? It literally makes no sense.
Does that mean the 1660/1660 Ti will also support DXR? Haha, so awkward.
nevcairiel
Richard Nutman
To me it seems the reason for this decision is simply to get more game developers to add DXR ray tracing to their games, even if it's only a few small effects, so that frame rates stay acceptable on the 10 and 16 series.
Then the RTX cards will naturally see significant performance increases in those games compared to AMD's and NV's previous cards.
HardwareCaps
Also, since developers know that Pascal is far more popular than Turing, they can simply use DXR very lightly for minor effects, focusing on the main consumer base (AMD will probably also support DXR without dedicated acceleration hardware),
and they might offer better visuals for RTX users.
But that completely removes the main feature and selling point of Turing, which is ray tracing.
nicugoalkeper
To me this is not a hilarious move, but a normal one. Nvidia knew about this, but they were hoping it would happen later so they could squeeze some money out of customers. Maybe this is also one of the reasons they launched the 16xx series.
All graphics cards that fully support DirectX 12 support Microsoft's DXR API (the API responsible for real-time ray tracing); we only need to know the impact on performance, but the rest is already there.
And Nvidia needs to bring support for this tech simply because it will be supported by every game developer. The new consoles will support it too.
The 2xxx series brings some nice things to the table, but until we see a real comparison we can't say whether it's fine, whether it has limitations, and so on.
Now the real race for DXR real-time ray tracing starts.
HardwareCaps
They did it because they had to get RTX rolling and Turing failed to do so. They are forcing ray tracing as hard as they can.
It will fail.
moo100times
@DrKeo - Ok, that is what Nvidia is saying, which is turning out to be different from what is possible and what might be.
If they are saying the same workload can be done on the INT32 part of the GPU, then what do the RT cores offer over that, and what extra hurdles do developers need to overcome to make this work without tanking gaming performance? If these are features native to DX12 and its general implementation, why would you spend the extra effort developing for these cores at all, and even more so if you are planning to port games to consoles, which are AMD-based hardware?
Similar to what HardwareCaps says, I think this completely undercuts the whole RTX line. Not only is ray tracing possible another way, it can also be implemented in a more developer-friendly way, and across more hardware. In many ways, Nvidia might have done better to push the RTX narrative longer and lock out other approaches, a bit like PhysX, before giving up the ghost, rather than doing this six months after the release of an expensive new line. If I had bought a $1,000 graphics card and then saw this announcement, I would be pretty annoyed.
You talk about ray tracing being better with a dedicated hardware implementation, but RT core use doesn't yet seem compatible with a proper, decent-FPS gaming experience, which makes this a moot point, as the cores are of no real benefit to the end user yet. If this DX12 feature can be applied to those cores and bring a massive increase in performance, then fair play to Nvidia, they will save themselves, but they have not demonstrated that yet.
Also, Nvidia did not offer a comparison between the 1660 Ti and their RTX cards, so there is no way to support or refute this. Comparing against a 1080 Ti does not really support the claims, as it has little to do with what was announced for their new budget cards, which supposedly lacked this feature natively and yet can get it with a driver update.
The paranoid side of me wonders whether ray-tracing performance on these cards will be carefully balanced so as not to outperform their other GPUs. But if AMD cards end up performing well here, there will be serious confusion about which cards offer real bang for buck.