NVIDIA Announces RTX Technology, a Raytracing API

Fox2232:

This grainy look can be cheaply fixed. Once this low-sample-density raytracing completes its cycles, all of that just has to be written to mapped textures (the information is there anyway) and then flatly rendered with high-quality AF.
It's noise. The information has to be reconstructed, and it can be reconstructed as cheaply or as expensively as your denoising algorithm is cheap or expensive. It's the old (aliasing) problem, and there is nothing trivial about it. https://www.geforce.com/whats-new/articles/nvidia-rtx-real-time-game-ray-tracing Note: a ray-tracing denoiser module is coming to the GameWorks SDK, which will enable developers to remove film-grain-like noise without any additional time-sapping development work.
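To make the cheap-vs-expensive reconstruction point concrete, here is a minimal, hypothetical sketch (my own illustration, not the GameWorks denoiser or anything from the linked article): the cheapest possible spatial reconstruction of a noisy 1-sample-per-pixel image is a plain box filter, which removes grain but also blurs detail, which is exactly why real ray-tracing denoisers are far more elaborate.

```python
# Minimal sketch, assuming a noisy 1-spp ray-traced frame stored as a
# float32 HxWx3 NumPy array; all names here are illustrative, not from any real SDK.
import numpy as np

def box_denoise(noisy, radius=2):
    """Cheapest possible 'denoiser': average each pixel over its
    (2*radius + 1)^2 neighbourhood. Kills grain, but loses detail."""
    h, w = noisy.shape[:2]
    padded = np.pad(noisy, ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    k = 2 * radius + 1
    out = np.zeros_like(noisy)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

# Toy usage with pure noise standing in for a 1-spp ray-traced frame.
noisy_1spp = np.random.rand(270, 480, 3).astype(np.float32)
smoothed = box_denoise(noisy_1spp)
```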
pharma:

On Wednesday Unreal will be demoing this technique at GTC 2018, along with other development groups, but I'm not sure of their schedules. https://www.unrealengine.com/en-US/blog/unreal-engine-4-supports-microsoft-s-directx-raytracing-and-nvidia-rtx
Funny how they always show this stuff, but actual games come nowhere near that fidelity. Again, just like that Samaritan tech demo, or Elemental - we got a crippled PS4 variant on PC in the end.. [youtube=wdwHrCT5jr0] Fast forward from 2011 to 2018 and nada, maybe a tiny bit of it here and there..
Nvidia is always at the forefront of technology.
@-Tj- : It would be doable if they used per-frame data transfers. Let's say you have a raytracing density of 1 ray per pixel, because that's what your compute HW manages to do in 16 ms (and you want 60 fps), and then you bake that data. Worse looking than regular rendering, right? But 16 ms later, you calculate another ray per pixel, read the averaged value from the baked data around that point, and bake it again into that altered point of view. It looks a bit better now. Within a second you are already at something that resembles 60 rays per pixel. That works very well as long as the data you generate in real time for the scenery is not dropped. The only issue comes from flashing effects, as they would have a prolonged (though quickly diminishing) effect. Maybe even use a weighted value of the calculated result: say the last frame's data counts for only 50% of the current frame, data from 2 frames back only 25%, 3 frames back 12.5%, and so on, quickly killing old data. I am sure there are many mathematical models that allow preserving data from previous iterations for use in the next frame, for improved IQ and reduced HW requirements.
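A minimal sketch of the accumulation idea described above (my own illustration, not Fox2232's code; it ignores reprojection to the altered point of view and simply blends in place): each new 1-ray-per-pixel frame is blended into a history buffer so that older frames' weights halve every frame, roughly the 50% / 25% / 12.5% weighting mentioned in the post.

```python
# Minimal sketch, assuming frames arrive as float32 HxWx3 NumPy arrays;
# names and sizes are illustrative only.
import numpy as np

def accumulate(history, new_frame, blend=0.5):
    """Blend the newest 1-spp frame into the running history buffer.
    With blend = 0.5 each older frame's weight halves every frame,
    so flashing effects linger only briefly and old data dies off quickly."""
    return blend * new_frame + (1.0 - blend) * history

# Toy usage: after ~60 frames (one second at 60 fps) the buffer behaves like
# an average over many samples per pixel, as long as it is never dropped.
history = np.zeros((270, 480, 3), dtype=np.float32)
for _ in range(60):
    traced_1spp = np.random.rand(270, 480, 3).astype(np.float32)  # stand-in for a traced frame
    history = accumulate(history, traced_1spp)
```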
-Tj-:

Again, just like that Samaritan tech demo, or Elemental - we got a crippled PS4 variant on PC in the end.. Fast forward from 2011 to 2018 and nada, maybe a tiny bit of it here and there..
In 2011 the Samaritan demo barely ran on the best hardware of the day, in SLI (!). Nowadays my old 290 runs games with similar or better graphics (for example Hellblade: Senua's Sacrifice). In 2011 Crysis 2 was the king of PC graphics, The Witcher 2 blew our minds, BF3 was the pinnacle of what could be done in a multiplayer game, and everyone stared at screenshots of modded Skyrim. If you don't see the progress since 2011, you should REALLY go to an eye doctor.
I'm speaking specifically about Unreal Engine here, I quoted that as well.. all those games you mentioned are close to this, but still not it, not by a mile. If Cyberpunk 2077 really looks like that trailer, then hats off. I personally would rather use those tensor cores for more direct compute, e.g. smoke, cloth, hair, trees/branches, something to make the world richer instead of 2D textures/sprites everywhere, kind of like old Tomb Raider SM2.0 vs. next-gen SM3.0.. and this is what's keeping actual games behind CGI movies.. Some newer games have more cloth and real hair, etc., imo just enough to keep our mouths shut lol. Edit: do you think the new TR will look like this? [youtube=YXS-BGW-uhU] CGI-quality graphics.. Well, I don't think it will. lol. I've been waiting for graphics like that ever since I saw that Nvidia Nalu demo back in 2005?
The innovative part is the AI denoiser running on Tensor cores, and if it's handled automatically by GameWorks, it makes raytracing viable and accessible without the impossible performance barrier that you have today. Raytracing is the path to photorealism.
-Tj-:

If Cyberpunk 2077 really looks like that trailer, then hats off.
IF Cyberpunk 2077 is released ONLY for the next console gen (Xbox TWO/PS5), maybe it could look similar. If Cyberpunk 2077 is a cross-generational game, there is a 0% chance of getting that look. Since TW3, CD Projekt RED has set the limit by the consoles' hardware capacity, and the PC version needs to be "fair" with them (AKA downgraded on purpose...). 😡
-Tj-:

If Cyberpunk 2077 really looks like that trailer, then hats off.
I would not count on that. That was nothing more than a CG movie.
Well, they explicitly said once that they are aiming for that kind of graphics or nothing. Guess they're also waiting for better GPUs; I doubt even full Volta can run that beyond 1080p.. Imo The Witcher would have looked similar to the first leaked videos, but NV pressured them with that hair crap.. and then we got a rushed & crippled graphics version with "fancy" hair..
Mundosold:

The innovative part is the AI denoiser running on Tensor cores, and if it's handled automatically by GameWorks, it makes raytracing viable and accessible without the impossible performance barrier that you have today. Raytracing is the path to photorealism.
Yes, but why use it just for raytracing? Why not use it for other stuff that's relevant now or will be in the next 1-2 years.. Raytracing is nice, yes, but then they used it to mimic better shadows, AO and reflections; anything more and it would be overkill.. And this is my point: I would rather use it for richer game worlds, not some finishing touches.
We'll see 1 or 2 ray-tracing games this year, and then pretty much nothing for at least 2-3 years. This, I believe, is a reasonably optimistic outlook.
One of the most interesting parts of this article for me is the implication that the next generation of GeForce cards will have tensor cores. Previous rumours suggested that the next generation of gaming cards would not be Volta, but would instead be called Turing or Ampere and would be a tensor-less architecture. The implication of this article is that the next generation of GeForce gaming cards is going to be more Volta-like than had been rumoured.
Well... can't say many positive things, except that maybe they finally got the idea of doing something for DX12, or it won't ever lift off.
So simply better cutscenes for now, and not real-time raytracing on consumer cards yet?
-Tj-:

Well, they explicitly said once that they are aiming for that kind of graphics or nothing. Guess they're also waiting for better GPUs; I doubt even full Volta can run that beyond 1080p.. Imo The Witcher would have looked similar to the first leaked videos, but NV pressured them with that hair crap.. and then we got a rushed & crippled graphics version with "fancy" hair.. Yes, but why use it just for raytracing? Why not use it for other stuff that's relevant now or will be in the next 1-2 years.. Raytracing is nice, yes, but then they used it to mimic better shadows, AO and reflections; anything more and it would be overkill.. And this is my point: I would rather use it for richer game worlds, not some finishing touches.
Most developers don't want to do squat these days if there isn't a button that does it for them. I get that games are more complicated, but there's not much zeal for setting the graphical or audio bar these days :\
Shots fired xD [youtube=C9eMciZGN4I]
UE4 using Nvidia RTX [youtube=J3ue35ago3Y]
[youtube=tjf-1BxpR9c]