NVIDIA Announces RTX Technology, a Raytracing API
Noisiv
https://www.geforce.com/whats-new/articles/nvidia-rtx-real-time-game-ray-tracing
Note: a ray-tracing denoiser module is coming to the GameWorks SDK, which will enable developers to remove film grain-like noise without any additional time-sapping development work
It's noise. The information has to be reconstructed.
And it can be reconstructed as cheaply or as expensively as your denoising algorithm is cheap or expensive. It's the old (aliasing) problem. There is nothing trivial about it.
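To illustrate the cheap end of that cost/quality trade-off: a naive box-filter denoise over a noisy 1-sample-per-pixel estimate. This is just a sketch in plain NumPy (the image and noise levels are made up); it has nothing to do with NVIDIA's actual GameWorks denoiser, which spends far more compute to preserve edges.

```python
import numpy as np

def box_denoise(noisy, radius=2):
    """Naive spatial denoiser: average each pixel with its neighbours.

    Cheap but blurry; real denoisers (bilateral, AI-based) spend more
    compute to remove the same noise while keeping edges sharp.
    """
    h, w = noisy.shape
    padded = np.pad(noisy, radius, mode="edge")
    out = np.empty_like(noisy)
    for y in range(h):
        for x in range(w):
            # Average the (2*radius+1)^2 window centred on this pixel.
            window = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            out[y, x] = window.mean()
    return out

# A flat grey surface "rendered" at 1 sample/pixel: true value 0.5 plus noise.
rng = np.random.default_rng(0)
noisy = 0.5 + rng.normal(0.0, 0.2, size=(64, 64))
denoised = box_denoise(noisy)
print(np.abs(noisy - 0.5).mean(), np.abs(denoised - 0.5).mean())
```

The average error drops by roughly the square root of the window size, which is exactly why low-sample raytracing plus denoising is attractive: the filter is far cheaper than shooting 25x more rays.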
pharma
On Wednesday Unreal will be demoing this technique at GTC 2018; other development groups will as well, but I'm not sure of their schedules.
https://www.unrealengine.com/en-US/blog/unreal-engine-4-supports-microsoft-s-directx-raytracing-and-nvidia-rtx
-Tj-
RzrTrek
Nvidia is always at the forefront of technology.
Fox2232
@-Tj- : It would be doable if they used per-frame data transfers. Let's say you have a raytracing density of 1 ray per pixel, as that's what your compute HW manages in 16 ms (and you want 60 fps). Then you bake that data. Worse looking than regular rendering, right?
But 16 ms later, you calculate another ray per pixel, read the averaged value from the baked data around that point, and bake it again for the altered point of view. Looks a bit better now.
Within a second you are already at something that resembles 60 rays per pixel. That works very well as long as the data you generate in real time for the scenery are not dropped.
The only issue comes from flashing effects, as they would have a prolonged (though quickly diminishing) afterglow.
Maybe even use a weighted value of the calculated result: the last frame's data count for only 50% of the current frame, data from 2 frames back only 25%, 3 frames back 12.5%, and so on, quickly killing old data.
I am sure there are many mathematical models allowing preservation of data from previous iterations for use in the next frame for improved IQ and reduced HW requirements.
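The 50%/25%/12.5% weighting described above is an exponential moving average. A minimal sketch of that idea (assuming a static scene so reprojection can be skipped; all names and numbers here are hypothetical, not any engine's actual API):

```python
import numpy as np

def accumulate(history, new_sample, alpha=0.5):
    """Blend this frame's 1-spp result into the running history.

    With alpha = 0.5 the newest frame carries 50% weight, the previous
    one 25%, the one before 12.5%, and so on -- old data decays quickly,
    so flashing effects linger only briefly.
    """
    if history is None:
        return new_sample
    return alpha * new_sample + (1.0 - alpha) * history

# Simulate 60 frames of noisy 1-ray-per-pixel estimates of a 0.5-grey image.
rng = np.random.default_rng(1)
history = None
for _ in range(60):
    frame = 0.5 + rng.normal(0.0, 0.2, size=(64, 64))  # one noisy sample
    history = accumulate(history, frame)

# After a second at 60 fps the error is well below a single frame's.
print(np.abs(history - 0.5).mean())
```

Note the trade-off the post hints at: a plain running average over 60 frames really would resemble 60 rays per pixel, while this fast-decaying weighting keeps only a few frames' worth of effective samples in exchange for killing stale data (and ghosting) quickly.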
Noisiv
-Tj-
I'm speaking specifically about Unreal Engine here; I quoted that as well..
All those games you mentioned are close to this, but still not it. Not by a mile. If Cyberpunk 2077 really looks like that trailer, then hats off.
I personally would rather use those tensor cores for more direct compute, e.g. smoke, cloth, hair, trees/branches, something to make the world richer, not 2D textures/sprites everywhere, kind of like old Tomb Raider SM2.0 vs next-gen SM3.0.. and this is what keeps actual games behind CGI movies. Some newer games have more cloth and real hair, etc., imo just enough to keep our mouths shut lol
Edit: do you think new TR will look like this?
[youtube=YXS-BGW-uhU]
CGI-quality gfx.. Well, I don't think it will. lol I've been waiting for such gfx, hmm... ever since I saw that NVIDIA Nalu demo back in 2005?
Mundosold
The innovative part is the AI denoiser running on Tensor cores; if it's handled automatically by GameWorks, it makes raytracing viable and accessible without the impossible performance barrier you have today. Raytracing is the path to photorealism.
sammarbella
tsunami231
-Tj-
Well, they explicitly said once that they are aiming for that kind of gfx or nothing. Guess they're also waiting for better GPUs; I doubt even full Volta can run that beyond 1080p..
Imo The Witcher would have looked similar to the first leaked videos, but NVIDIA pressured them with that hair crap.. and then we got a rushed & crippled gfx version with "fancy" hair..
Yes, but why use it just for raytracing?
Why not use it for other stuff that's relevant now or will be in the next 1-2 years? Raytracing is nice, yes, but they used it to mimic better shadows, AO and reflections; anything more would be overkill..
And this is my point: I would rather use it for richer game worlds, not some after-touches.
Noisiv
We'll see 1 or 2 Ray-tracing games this year, and then pretty much nothing for at least 2-3 years.
This, I believe, is a reasonably optimistic outlook.
Robbo9999
One of the most interesting parts of this article for me is the implication that the next generation of GeForce cards will have tensor cores. Previous rumours suggested that the next generation of gaming cards would not be Volta, but would instead be called Turing or Ampere and would be a tensor-less architecture. The implication of this article is that the next generation of GeForce gaming cards is gonna be more Volta-like than had been rumoured.
fantaskarsef
Well... can't say many positive things, except that maybe they finally got the idea of doing something for DX12, or it won't ever lift off.
ruthan
So, simply better cutscenes for now, and not realtime raytracing on consumer cards yet?
Prince Valiant
KissSh0t
Shots fired xD
[youtube=C9eMciZGN4I]
JonasBeckman
https://www.resetera.com/threads/metro-exodus-to-utilise-nvidia-rtx-ray-tracing.31012/
Was wondering which title would be first to use the effect; it might be Metro: Exodus, as it looks like they're bringing the tech into the game.
Spets
UE4 using Nvidia RTX
[youtube=J3ue35ago3Y]
Spets
[youtube=tjf-1BxpR9c]