NVIDIA Releases SDK allowing Global Illumination through Ray-Tracing for any GPU supporting DXR

CPC_RedDawn:

Watch Dogs was a mess, AK had many other issues too and GameWorks just added to them. Witcher 3's HairWorks was terrible unless you had a high-end GPU; they should have used TressFX instead. Metro had PhysX and you took roughly a 20% performance hit, and the same goes for Metro Exodus: disable all the Nvidia features for about 40% more performance with barely any difference in IQ, or enable RT for a 50-60% drop in performance. Crysis 2 had an insane amount of tessellation left in, which Nvidia knew about, and it killed performance on AMD cards. Thank god it's finally open source; nice to see open source always wins in the end.
TressFX was never used on more than one asset in a game, at the time doing even one character took tons of development time, and it had no fur support - it couldn't have been used in Witcher 3. To this day HairWorks instances better than TressFX (there's a rough guide-strand sketch at the end of this post), which is why basically no games use TressFX and the ones that do limit it to very few characters. RT is in plenty of games with good results and it keeps improving - I don't think you can compare RT to the other GameWorks features. What do you mean "Nvidia knew about"? The underwater tessellation claim was debunked (it gets culled). The rest is explained here:
At the time Crysis 2 came out, Crytek hadn't yet improved their parallax occlusion mapping technique. Parallax occlusion mapping, or POM, is a shader trick used to make a surface look like it has depth to it (a small sketch of the idea is at the end of this post). An example of it being used in the Crysis 2 CE3 build: http://www.simonfuchs.net/folio/gfx/tutorials/02_pom/13_pom_animation.gif and http://docs.cryengine.com/display/SDKDOC2/Silhouette+POM
After all this time, they finally improved their POM to influence the silhouette. It's still an insanely expensive effect to use, but it has two major advantages. 1: It runs much, much faster than tessellation on very noisy surfaces. 2: Gamers have yet to figure out what a shader actually is. You can get people forcing on wireframe mode and poking around in there, but the average gamer still has zero concept of what a shader effect is or what it does. So, if performance goes down, they'll just write it off as something else influencing the performance.
Including this link for the sake of completeness: https://lh3.googleusercontent.com/ABCa7x6lZTEJLdFh-dFqOfrffLn6azgd57ekud_FyliE0ZknHHvuIBwgOjJKOYiyR05B8YaLA_Lo95KEg7N80BAKM-Ky7y6TyB4DB9QLQ0-srsFRFkbhXFD70YsdlUJ3PhEOT3o
This is Epic's POM integration into UE4. Of course, like any sensible developer, they warn against overusing it since it is a very expensive shader effect. Image source: https://forums.unrealengine.com/showthread.php?72647-Engine-Features-Preview-6-11-2015
That has nothing to do with Nvidia. There have been some Nvidia GameWorks features that ruin performance with questionable benefits, but again that's easily explained by the divergence in architectures or by Nvidia just trying things (like the voxel-based tracing in The Division). People have to realize that these things aren't developed in a vacuum, though. Throughout the years I've seen people say "Nvidia intentionally made this to sabotage AMD's performance", but in the case of HairWorks it was in development for eight years. Here is a 2008 SIGGRAPH presentation that is the basis for HairWorks: https://developer.download.nvidia.com/presentations/2008/SIGGRAPH/RealTimeHairRendering_SponsoredSession2.pdf Nvidia felt GPU architectures were going to move more towards geometry, but they didn't - AMD pushed compute hard, won all the console contracts, got compute into consoles, and all the tech Nvidia had been developing for its own architectures became somewhat obsolete. In the meantime AMD's tessellation performance was lackluster, and they've since made lots of attempts to improve it.
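To make the POM explanation above a bit more concrete, here is a minimal CPU-side sketch of the idea in C++. It is only an illustration with made-up names (heightAt, parallaxOcclusionUV); a real implementation runs in a pixel shader and samples a height texture rather than a procedural function.

    // Minimal CPU-side sketch of parallax occlusion mapping (POM).
    // Hypothetical names for illustration only; real POM runs in a pixel
    // shader and samples a height texture instead of heightAt() below.
    #include <cstdio>
    #include <cmath>

    struct Vec2 { float x, y; };
    struct Vec3 { float x, y, z; };

    // Stand-in heightmap lookup: returns surface depth in [0, 1] at a UV.
    static float heightAt(Vec2 uv) {
        return 0.5f + 0.5f * std::sin(uv.x * 40.0f) * std::cos(uv.y * 40.0f);
    }

    // March along the view direction through the height field and return the
    // UV where the view ray first dips below the surface; shading with that
    // offset UV is what fakes the depth.
    Vec2 parallaxOcclusionUV(Vec2 uv, Vec3 viewTangent, float heightScale, int steps) {
        float layerDepth = 1.0f / steps;
        Vec2 delta = { viewTangent.x / viewTangent.z * heightScale / steps,
                       viewTangent.y / viewTangent.z * heightScale / steps };
        float rayDepth = 0.0f;
        Vec2 cur = uv;
        float surfaceDepth = heightAt(cur);
        // Each iteration is another height lookup per pixel, which is why the
        // effect is expensive and why step counts are kept as low as possible.
        while (rayDepth < surfaceDepth && rayDepth < 1.0f) {
            cur.x -= delta.x;
            cur.y -= delta.y;
            rayDepth += layerDepth;
            surfaceDepth = heightAt(cur);
        }
        return cur;
    }

    int main() {
        Vec3 view = { 0.4f, 0.2f, 0.89f };  // view direction in tangent space
        Vec2 uv = parallaxOcclusionUV({ 0.5f, 0.5f }, view, 0.05f, 32);
        std::printf("offset UV: %f %f\n", uv.x, uv.y);
        return 0;
    }

That per-pixel step loop is where the cost comes from, which is why both Crytek and Epic warn against slapping the effect on everything.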
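On the "instances better" point earlier in this reply: the general idea behind GPU hair like HairWorks is that only a small set of guide curves is simulated, and the thousands of visible strands are generated by interpolating between neighbouring guides with a bit of per-strand jitter. The C++ below is only a rough sketch of that general idea with made-up names (GuideStrand, makeRenderStrands), not the actual HairWorks or TressFX API.

    // Rough sketch of guide-strand interpolation: simulate a few guides,
    // generate many render strands by blending between them. Names here are
    // hypothetical and only illustrate the idea, not any real hair API.
    #include <cstdio>
    #include <vector>
    #include <random>

    struct Vec3 { float x, y, z; };
    using GuideStrand = std::vector<Vec3>;  // a chain of simulated control points

    static Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
        return { a.x + (b.x - a.x) * t,
                 a.y + (b.y - a.y) * t,
                 a.z + (b.z - a.z) * t };
    }

    // Build many render strands between two neighbouring guides; on a GPU this
    // happens per frame in the tessellation or compute stage, so only the
    // guides ever need to be simulated.
    std::vector<GuideStrand> makeRenderStrands(const GuideStrand& g0,
                                               const GuideStrand& g1, int count) {
        std::mt19937 rng(42);
        std::uniform_real_distribution<float> jitter(0.0f, 1.0f);
        std::vector<GuideStrand> out;
        out.reserve(count);
        for (int i = 0; i < count; ++i) {
            float t = jitter(rng);  // where this strand sits between the guides
            GuideStrand s(g0.size());
            for (size_t p = 0; p < g0.size(); ++p)
                s[p] = lerp(g0[p], g1[p], t);
            out.push_back(std::move(s));
        }
        return out;
    }

    int main() {
        GuideStrand a = { { 0, 0, 0 }, { 0, 1, 0 }, { 0, 2, 0 } };        // simulated guide
        GuideStrand b = { { 1, 0, 0 }, { 1, 1, 0.2f }, { 1, 2, 0.5f } };  // simulated guide
        auto strands = makeRenderStrands(a, b, 10000);
        std::printf("render strands generated: %zu\n", strands.size());
        return 0;
    }

The simulation cost then scales with the handful of guides, not with the thousands of strands actually drawn.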
Denial:

TressFX was never used on more than one asset in a game, at the time doing even one character took tons of development time, and it had no fur support...
Excuse me, most of those links didn't work for me. I don't get your point. First, you accused AMD of not supporting an indie game that was already sponsored by Nvidia, because the developers couldn't get their own software working on Radeon hardware. And it was AMD's fault because they had no money for that project, and we know that could be true. Then you excused Nvidia for their many broken GameWorks games, which already worked nicely (30 fps 🙄) on x86 consoles. The only thing those ports had in common was Nvidia sponsorship, because the rest of the console ports (most games) worked much better. Arkham Knight was taken off the market and repaired for almost a year... I can't believe the only problem here was that Nvidia planned to use more geometry, because they never had trouble offering strong compute performance, and their sponsored games only offered overkill geometry in hair, fur and godrays. Fog, smoke and fire were so uncommon that I don't remember seeing them since Arkham Asylum's unnecessarily stacked volumetric fog... it looked so unoptimized that only the strongest PhysX card could run it... it worked really well most of the time, but in some places it was so overdone that it crippled performance, dropping from more than 100 fps to almost 20. Everybody knows that Nvidia can be the strongest graphics player in the market without dirty tricks, and I expect the same from any contender. I hope I didn't sound harsh, that isn't my mood 😛
The in-game AA in Batman: Arkham Asylum didn't work on AMD intentionally, by the studio's choice, because AMD didn't supply any devrel to verify the AA actually worked on AMD parts.
People will argue over anything. Look at it this way: thanks, Nvidia, for releasing source code and software, G-Sync on FreeSync monitors, and G-Sync VRR over HDMI on Nvidia GPUs. I am all for Nvidia doing anything to help make PC gaming better in any way at all. Kumbaya, MOFOs.