NVIDIA Releases SDK allowing Global Illumination through Ray-Tracing for any GPU supporting DXR

I call it a preemptive strike. I wonder if developers working on the Series X will be caught by it. It's very tempting to use already-developed code... don't take that bait.
This is GameWorks all over again.
Good news indeed.
David3k:

This is GameWorks all over again.
No it isn't. The only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
I see this as a positive move.
This is basically Nvidia reacting to AMD supporting ray tracing on the two big next-gen consoles.
I hope it will really be optimized, and not the same old story of forcing every consumer to buy an expensive new card, no matter the brand, just to be allowed to see lighting effects or shadows...
Astyanax:

No it isn't. The only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
I always thought that standard open libraries are the best way to build visuals across a wide range of games, instead of the usual marketing campaigns that lock common graphics effects into a few sponsored games (like anti-aliasing or physics), make even their strongest last-gen cards look like toys, and leave a pile of downgrades and bugs behind... But if the GTX 10 series can deliver enough performance along with the visuals, maybe everyone can enjoy ray-traced effects in new games, at least at 1080p. It's quite clear now that David Wang from AMD already knew what would happen and that they were waiting for the next-gen consoles to offer DXR on every new card. Let's see what comes next... fingers crossed for physics and AI on every CPU and ray-traced effects on every GPU 😀
Luc:

I hope it will really be optimized, and not the same old story of forcing every consumer to buy an expensive new card, no matter the brand, just to be allowed to see lighting effects or shadows... I always thought that standard open libraries are the best way to build visuals across a wide range of games, instead of the usual marketing campaigns that lock common graphics effects into a few sponsored games (like anti-aliasing or physics), make even their strongest last-gen cards look like toys, and leave a pile of downgrades and bugs behind... But if the GTX 10 series can deliver enough performance along with the visuals, maybe everyone can enjoy ray-traced effects in new games, at least at 1080p. It's quite clear now that David Wang from AMD already knew what would happen and that they were waiting for the next-gen consoles to offer DXR on every new card. Let's see what comes next... fingers crossed for physics and AI on every CPU and ray-traced effects on every GPU 😀
Yes please!!! I've got CPU cores for days, baby!!
Astyanax:

No it isn't. The only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
nVidia's HW is not up to the task either, and CPUs are not up to the task for effects run on the CPU. The developer is left with a choice: either a mediocre visual implementation or bad performance. It never comes close to the tech demos nVidia shows on video only... and those get released four years later, once high-end HW can run them.
Astyanax:

No it isn't. The only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
Lots of Nvidia users hated GameWorks too. I did, and I was running 2x GTX 970s at the time. My 290X CrossFire ran gimpworks just fine. It's the fact that Nvidia forced developers to lock AMD out of the game's source code and engine if the game used gimpworks. AMD couldn't even have a driver ready for the game until after launch.
Don't confuse the old closed-binary GameWorks with the GameWorks that is completely open. Most of GameWorks is now completely open.
Agonist:

It's the fact that Nvidia forced developers to lock AMD out of the game's source code and engine if the game used gimpworks. AMD couldn't even have a driver ready for the game until after launch.
Source it. I've never seen such a thing. For a while Nvidia didn't provide source for GameWorks libraries, but they do now. I've never seen them lock AMD out of a game's source or engine, ever. In fact, the two times AMD did mention GameWorks, it didn't even make sense. With Project Cars, AMD responded to a poster who claimed GPU PhysX was destroying AMD's performance (the game didn't even have GPU PhysX); the developer responded and said AMD had access to the game for months with no communication whatsoever to the developer:
With the complaints flowing in thick and fast, Project Cars developer Slightly Mad Studios joined the fray and proceeded to place the blame for the game's issues squarely on AMD. "We’ve provided AMD with 20 keys for game testing as they work on the driver side," said Slightly Mad Studios' Ian Bell. "But you only have to look at the lesser hardware in the consoles to see how optimised we are on AMD based chips. We’re reaching out to AMD with all of our efforts. We’ve provided them 20 keys as I say. They were invited to work with us for years, looking through company mails the last I can see [AMD] talked to us was October of last year. Categorically, Nvidia have not paid us a penny. They have though been very forthcoming with support and co-marketing work at their instigation."
With Witcher 3 Richard Huddy said:
"We've been working with CD Projeckt Red from the beginning," said Huddy. "We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal."
But there are videos of the game with HairWorks a year before it shipped. The whole thing was demoed at GDC over a year before it launched. They knew it had HairWorks, yet they never requested builds with HairWorks (presumably, or they did and just did nothing), and they knew their geometry performance was trash in their architecture. So I guess they decided to just handwave the whole thing away with "ehh, we need source code!" And now all the source is available for all the GameWorks libraries, and yet no magical drivers came out and fixed the performance or issues AMD had.
Denial:

Source it. I've never seen such a thing. For a while Nvidia didn't provide source for GameWorks libraries, but they do now. I've never seen them lock AMD out of a game's source or engine, ever. In fact, the two times AMD did mention GameWorks, it didn't even make sense. With Project Cars, AMD responded to a poster who claimed GPU PhysX was destroying AMD's performance (the game didn't even have GPU PhysX); the developer responded and said AMD had access to the game for months with no communication whatsoever to the developer: With Witcher 3 Richard Huddy said: But there are videos of the game with HairWorks a year before it shipped. The whole thing was demoed at GDC over a year before it launched. They knew it had HairWorks, yet they never requested builds with HairWorks (presumably, or they did and just did nothing), and they knew their geometry performance was trash in their architecture. So I guess they decided to just handwave the whole thing away with "ehh, we need source code!" And now all the source is available for all the GameWorks libraries, and yet no magical drivers came out and fixed the performance or issues AMD had.
The evidence: https://www.techpowerup.com/104868/batman-arkham-asylum-enables-aa-only-on-nvidia-hardware-on-pcs And every old PhysX game I played didn't run well enough even on their own cards, for fairly common physics that even my crappy GT 710 DDR3 can push nowadays... Nvidia says they don't pay as a sponsor, that they only give away graphics cards and engineering time (a lot of money) to implement their "GameWorks", but we should always remember what happened with the Watch Dogs and Batman: Arkham Knight ports to PC... I want to think things were a little different with CD Projekt Red, because they are a reliable studio and they didn't allow too much crap into their game, but again, 64x tessellation on hair... it looks like a desperate move to cripple everything for no reason. Every GPU maker looks for an advantage over rivals: AMD does it through consoles (and in the past TressFX, Mantle, etc.), trying to improve performance on their hardware but sharing that progress with everyone, while Nvidia always found a way to make every card on the market feel poor except their new-gen flagship, while pushing prices higher than ever for a consumer part... I know what to expect, but I hope for an open implementation in new games.
Luc:

The evidence: https://www.techpowerup.com/104868/batman-arkham-asylum-enables-aa-only-on-nvidia-hardware-on-pcs And every old PhysX game I played didn't run well enough even on their own cards, for fairly common physics that even my crappy GT 710 DDR3 can push nowadays... Nvidia says they don't pay as a sponsor, that they only give away graphics cards and engineering time (a lot of money) to implement their "GameWorks", but we should always remember what happened with the Watch Dogs and Batman: Arkham Knight ports to PC... I want to think things were a little different with CD Projekt Red, because they are a reliable studio and they didn't allow too much crap into their game, but again, 64x tessellation on hair... it looks like a desperate move to cripple everything for no reason. I know what to expect, but I hope for an open implementation in new games.
How is this a source? Where in this article does it say that the developer locked AMD out of the engine and/or source code for the game but gave Nvidia access? Furthermore:
"Batman AA is not our property. It is owned by Eidos. It is up to Eidos to decide the fate of a feature that AMD refused to contribute too and QA for their customers, not NVIDIA. If it is relatively trivial, Mr. Huddy should have done it himself. The Unreal engine does not support in game AA, so we added it and QAed it for our customers. As Eidos confirmed (Not allowed to post links here, but check PCper for Eidos' statement) AMD refused the same opportunity to support gamers with AA on AMD GPUs. I'm sure Mr. Huddy knows how important QA is for game developers. I recommend AMD starts working with developers to make their HW work in a proper way. That's not our job. We added functionality for NVIDIA GPUs into the game. We did not lock anything out. AMD just did not do their work."
From PCPer:
The developer relations team at NVIDIA is significantly larger, has a significantly larger budget and in general works with more developers than AMD's. In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money/time to get it working on their hardware. Eidos told us in an email conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined.
Throughout this whole period of time, from when AMD purchased ATi to honestly pretty recently, AMD was notorious for not supporting game developers. S2 Games, for example, made the biggest stink about it with Savage 2, claiming they actually had Nvidia help them get the game working on AMD hardware because AMD didn't have the resources to help indie game developers. Also Richard Huddy just straight up lies left and right anyway. Remember when he said 4GB of HBM = 12GB of GDDR5? Fury X owners remember.
Every GPU maker looks for an advantage over rivals: AMD does it through consoles (and in the past TressFX, Mantle, etc.), trying to improve performance on their hardware but sharing that progress with everyone, while Nvidia always found a way to make every card on the market feel poor except their new-gen flagship, while pushing prices higher than ever for a consumer part...
TressFX had nearly the same problem that AMD claimed with HairWorks: https://www.pcgamer.com/tomb-raiders-geforce-performance-issues-being-looked-at-by-nvidia-and-crystal-dynamics/ Notice how, unlike AMD, they just said they were working on drivers and with the developer to fix the issue instead of pointing fingers. Oh, and they did it without the source code, because TressFX didn't release source until months later. Don't get me wrong - I think AMD is a completely different company under Lisa Su... most of the people from that time are long gone. But a lot of these things people bring up are from a time when AMD's and Nvidia's architectures were moving in radically different directions and both companies were trying to develop features optimized for their own products. I think it's asinine to keep bringing them up, especially when most of the issues with GameWorks (for example, it being closed source) are no longer issues.
Sadly, Richard Huddy is still around and still lying.
Denial:

How is this a source? Where in this article does it say that the developer locked AMD out of the engine and/or source code for the game but gave Nvidia access?
From the source: "AMD's Ian McNaughton in his recent blog thread said that they had confirmed this by an experiment where they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the option is not available for the retail game as there is a secure-rom." I can confirm that the developer changed the game's core and deleted/modified their whitelist to allow AMD to use AA, after months of criticism from the whole gaming community. In the end, they did it amid poor excuses, in an attempt to clean up their public image, and it looks like they accomplished it.
Denial:

Throughout this whole period of time, from when AMD purchased ATi to honestly pretty recently, AMD was notorious for not supporting game developers. S2 Games, for example, made the biggest stink about it with Savage 2, claiming they actually had Nvidia help them get the game working on AMD hardware because AMD didn't have the resources to help indie game developers.
I didn't know about that game, or whether it was popular or niche in its time, so I can't really discuss it, or whether AMD had the money or it was their fault at all, but I'd like to trust you.
Denial:

Also Richard Huddy just straight up lies left and right anyway. Remember when he said 4GB of HBM = 12GB of GDDR5? Fury X owners remember.
Yeah, a big fail by everyone there; they lied trying to hide the problem and the hype made things worse.
Denial:

TressFX had nearly the same problem that AMD claimed with HairWorks: https://www.pcgamer.com/tomb-raiders-geforce-performance-issues-being-looked-at-by-nvidia-and-crystal-dynamics/ Notice how, unlike AMD, they just said they were working on drivers and with the developer to fix the issue instead of pointing fingers. Oh, and they did it without the source code, because TressFX didn't release source until months later.
Yes, AMD did the same thing they blame others for. Nvidia handled it well: first they improved stability, then they improved performance once they had the source code. TressFX seemed rushed; I remember that even Radeon drivers weren't performing well when the game was released. But there is a big difference in timing: releasing the source code only after launching the game (dirty) isn't the same as releasing the PhysX code openly after 10 years of being closed, and I don't know how long GameWorks was closed. They must avoid those strategies; that's why this announcement is so encouraging, but we know how those old dogs used to be...
Denial:

Don't get me wrong - I think AMD is a completely different company under Lisa Su... most of the people from that time are long gone. But a lot of these things people bring up are from a time when AMD's and Nvidia's architectures were moving in radically different directions and both companies were trying to develop features optimized for their own products. I think it's asinine to keep bringing them up, especially when most of the issues with GameWorks (for example, it being closed source) are no longer issues.
I hope you are right, because we don't know how AMD RDNA2 will cope with RTX on, and there's a possibility it will only work in next-gen console ports. Lisa's leadership is great, but she already threw the Radeon VII at gamers after David Wang said that chip wasn't for gaming. Maybe Intel can balance the situation between them; it could be a funny story. PS: sorry for the long post and my repetitive and/or bad English 😳
Astyanax:

No it isn't. The only hate for GameWorks was because AMD had hardware that wasn't up to the task; RDNA is.
Neither did Nvidia. GameWorks broke nearly every game it was implemented in, even on Nvidia cards:
Watch Dogs: broken
Witcher 3: broken
Arkham Knight: broken
Metro series: broken
Crysis 2: broken
Those are just a few I can remember, but nearly every time these features were added they destroyed the frame rate on AMD cards, and on Nvidia cards too. This is why it's common knowledge that if you want a quick fix for better performance, you ALWAYS disable these stupid features first and forget they even exist. Open standards are always better; this walled garden Nvidia keeps locking itself into does nothing but hurt the PC gaming industry. The sooner this stuff goes away, the better for us all.
^^Don't forget Mafia 2. I'd say that was probably the worst implementation of physics. Witcher 3 and Arkham Knight weren't that bad at release, imo.
CPC_RedDawn:

Neither did Nvidia. GameWorks broke nearly every game it was implemented in, even on Nvidia cards:
Watch Dogs: broken
Witcher 3: broken
Arkham Knight: broken
Metro series: broken
Crysis 2: broken
Those are just a few I can remember, but nearly every time these features were added they destroyed the frame rate on AMD cards, and on Nvidia cards too. This is why it's common knowledge that if you want a quick fix for better performance, you ALWAYS disable these stupid features first and forget they even exist. Open standards are always better; this walled garden Nvidia keeps locking itself into does nothing but hurt the PC gaming industry. The sooner this stuff goes away, the better for us all.
I played more than half of these games fine at launch (I never played WD or AK). No idea what you're talking about with the others - Crysis 2 I played on an AMD card; the game was bad, but it ran fine. Also, all of Nvidia's stuff is open now, including the item mentioned in the OP.
Denial:

I played more than half of these games fine at launch (I never played WD or AK). No idea what you're talking about with the others - Crysis 2 I played on an AMD card; the game was bad, but it ran fine. Also, all of Nvidia's stuff is open now, including the item mentioned in the OP.
Watch Dogs was a mess, and AK had many other issues too; GameWorks just added to them. Witcher 3's HairWorks was terrible unless you had a high-end GPU; they should have used TressFX instead. Metro had PhysX and you took about a 20% performance hit. The same goes for Metro Exodus: disable all the Nvidia features for 40% more performance and barely any difference in IQ, or enable RT for a 50-60% drop in performance. Crysis 2 had an insane amount of tessellation left in, which Nvidia knew about, and it killed performance on AMD cards. Thank god it's finally open source; nice to see that open source always wins in the end.