A light has been cast - AMD files patent combining hardware and software for raytracing

Meanwhile the preview of the Windows SDK shows updated DirectX Raytracing APIs...
This looks really cool if I've understood it correctly. You have fixed-function hardware for just part of the full ray trace, the single node. You can have as many of these as you want, and they will each do one full node in hardware. But the shaders will control the full ray trace, using the fixed-function hardware if and when needed, and bypassing it when not. So technically you keep both fed optimally, and there should be fewer idle transistors. One per CU maybe? Aren't the CUs smaller on Navi, compared to Vega/VII?
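A rough CPU-side sketch of that split, assuming the fixed-function block only answers "does this ray hit this one BVH node?" while the shader code owns the traversal loop and may bypass the unit. Every name in it is made up for illustration; nothing is taken from the patent or any driver.

```cpp
// Sketch of a hybrid traversal: the "shader side" owns the loop and decides,
// per node, whether to offload the test or run a pure-shader fallback.
#include <cstdint>
#include <functional>
#include <stack>
#include <vector>

struct Ray  { float orig[3]; float invDir[3]; float tMax; };   // invDir = 1/dir
struct Node { float bmin[3]; float bmax[3]; std::int32_t left; std::int32_t right; bool leaf; };

// A per-node test: on real hardware one implementation would be the dedicated
// unit, the other a shader-ALU fallback; here both are just callables.
using NodeTest = std::function<bool(const Node&, const Ray&)>;

void traverse(const std::vector<Node>& bvh, const Ray& ray,
              const NodeTest& fixedFunctionTest, const NodeTest& shaderTest,
              bool useFixedFunctionUnit)
{
    if (bvh.empty()) return;
    std::stack<std::int32_t> todo;
    todo.push(0);                                         // start at the root
    while (!todo.empty()) {
        const Node& n = bvh[todo.top()]; todo.pop();
        const bool hit = useFixedFunctionUnit ? fixedFunctionTest(n, ray)   // offloaded
                                              : shaderTest(n, ray);         // bypassed
        if (!hit) continue;                               // prune this subtree
        if (n.leaf) { /* record / shade the hit */ continue; }
        todo.push(n.left);                                // the shader picks the
        todo.push(n.right);                               // traversal order, not the unit
    }
}
```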
Funny thing with RTX: Nvidia says "look what we can do" with big shows and everything, while AMD just goes "yeah, we've got it, hardware and software" in half a paragraph, with no show or anything. Maybe they have to keep quiet because of the unreleased consoles.
That is a paragraph from the patent filing. You don't show anything there; you register the method you use to do it, in case someone wants to copy it. It doesn't have to be a detailed description, as long as you have a working prototype that matches the description.
It would be funny if AMD masters ray tracing and Nvidia is left behind and can't do anything about it because the best method is patented.
Maybe it's similar to that ReShade mod that enables RT in games that don't have it?
Whoops, Nvidia?... It will be interesting to see how this performs, for sure.
I really like the idea that the CPU will also participate in the calculation (hopefully not resulting in micro-stutter).
I knew there was no way Nvidia could keep this proprietary for long. They were crazy for thinking it would only be possible with dedicated special hardware (aka RT cores). http://i67.tinypic.com/29urfkg.jpg
jbscotchman:

I knew there was no way Nvidia could keep this proprietary for long. They were crazy for thinking it would only be possible with dedicated special hardware (aka RT cores). http://i67.tinypic.com/29urfkg.jpg
It never was proprietary Nvidia technology, nor did they think it could only be done with special hardware. Software ray tracing has already been in use for decades. Nvidia just did the R&D and made their own hardware cores that help do it faster than software can. Just look at the GTX 10-series cards: no hardware RT cores or anything, and yet they can still do ray tracing in software... albeit VERY SLOWLY. Up until RT cores, ray tracing for movies usually took giant server farms.
jbscotchman:

I knew there was no way Nvidia could keep this proprietary for long. They were crazy for thinking it would only be possible with dedicated special hardware (aka RT cores). http://i67.tinypic.com/29urfkg.jpg
"Proprietary" you what? i really doubt half you people know what BVH traversal actually is, BVH is still most optimally done on a fixed function unit (which the patent indicates is required but not necessarily used) This is just AMD catching up to the RTX capabilities.
Just to have it stated here: anything Nvidia has right now is based on DXR, DirectX Raytracing. It's a standard and an API that AMD could adopt. That said, they chose not to. AMD is not running this over the DXR API, are they? So this time I have to say: way to go for fracturing the market, AMD. This time it's not Nvidia's fault. And they're fracturing it between consoles and PC now, creating a messy situation for devs who have to put in work to adapt it for consoles and then again for PC. It could have been otherwise, but AMD chose not to.

Also: "AMD has announced in a published patent how its proprietary software and hardware solutions are supposed to make raytracing better." This is NOT based on the API both could support, and it's proprietary to AMD hardware. So where's the uproar now? Where's the protest about another piece of proprietary tech? Nowhere to be heard? Thanks for the double standards on this one 😀

Honestly, I've yet to see what this even does. But I have a sour aftertaste, since they did exactly what people always complained about with Nvidia; we didn't like it then, so I can't like it this time around either. The only positive is that it could be supported without the DX12 / DXR API, but I've yet to see that happen. As of now we have two RT systems, Nvidia (PC) and AMD (consoles), and that is only bad for the market.

Edit: This "rant" is meaningless as (see below) they are using DXR as a back end as well. I'm just leaving it here to show I was wrong.
Evildead666:

This looks really cool if I've understood it correctly. You have fixed-function hardware for just part of the full ray trace, the single node. You can have as many of these as you want, and they will each do one full node in hardware. But the shaders will control the full ray trace, using the fixed-function hardware if and when needed, and bypassing it when not. So technically you keep both fed optimally, and there should be fewer idle transistors. One per CU maybe? Aren't the CUs smaller on Navi, compared to Vega/VII?
They share some things; that's why I call them dual CUs. It is not really fixed-function raytracing hardware. It is a slight addition to the TMU which enables it to do BVH intersection.

@Astyanax: What you missed is that while Nvidia beefed up their SMs a lot to do RT, AMD needs just a small increase in transistor count per CU if they use the same arrangement as in the RX 5700 XT. But in reality, AMD can use different CU arrangements which make the CU smaller at the cost of shading power, but not TMU (raytracing) power. Getting more CUs at the same transistor count means higher raytracing performance. (The current GCN-compatible arrangement is the worst-case scenario for RDNA and new features, but understandable, since the RX 5700 does not look like it has the capability described in this patent.) I do expect certain trade-offs, but AMD's GPUs had too much compute for their actual gaming performance until RDNA anyway. And in 5 days it will be seen that shader count is not as important as the ability to use the shaders properly.

As for the patent: it describes "software" solutions as an introduction to the problem, saying they are ineffective and generally bad even if done at the HW level via shaders alone. This is not CPU related. That "software" is the control method running inside the CU. (Which means there is programmable logic there, in contrast to fully fixed-function hardware.)
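For what it's worth, the kind of per-node work such a TMU addition would hardwire is the standard ray/AABB "slab" test sketched below. This is generic textbook code, not anything lifted from the patent, and it would slot in as the "fixed-function" callable in the earlier traversal sketch.

```cpp
// Standard ray/AABB slab test: the sort of single-node intersection a small
// fixed-function addition could hardwire. Textbook code, not from the patent.
#include <algorithm>
#include <cstdint>
#include <utility>

struct Ray  { float orig[3]; float invDir[3]; float tMax; };   // invDir = 1/dir per axis
struct Node { float bmin[3]; float bmax[3]; std::int32_t left; std::int32_t right; bool leaf; };

bool rayHitsNode(const Node& n, const Ray& r)
{
    float tNear = 0.0f, tFar = r.tMax;
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (n.bmin[axis] - r.orig[axis]) * r.invDir[axis];
        float t1 = (n.bmax[axis] - r.orig[axis]) * r.invDir[axis];
        if (t0 > t1) std::swap(t0, t1);                // entry before exit
        tNear = std::max(tNear, t0);
        tFar  = std::min(tFar,  t1);
        if (tNear > tFar) return false;                // slabs don't overlap: miss
    }
    return true;                                       // ray passes through the box
}
```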
fantaskarsef:

Just to have it stated here: anything Nvidia has right now is based on DXR, DirectX Raytracing. It's a standard and an API that AMD could adopt. That said, they chose not to. AMD is not running this over the DXR API, are they?...
Of course they are - PCs use the DXR API and the next Xbox will use the DXR API. This is just a back end that implements it, just like RTX. As for how well it works - well, we'll have to see whenever it comes out.
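For reference, a minimal sketch of what "just a back end" looks like from the application side, assuming a Windows SDK with DXR support in its headers: the app only queries the vendor-neutral D3D12 API for a raytracing tier and never has to care which vendor's driver answers. Error handling is trimmed for brevity.

```cpp
// Queries the vendor-neutral D3D12 API for DXR support; whether the tier is
// reported by an Nvidia or an AMD driver back end is invisible at this level.
#include <windows.h>
#include <d3d12.h>

bool DeviceSupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;                 // older runtime/driver: no raytracing info
    // Tier 1.0 or higher means the driver exposes a DXR implementation,
    // regardless of whose GPU sits underneath.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```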
Dribble:

Of course they are - PCs use the DXR API and the next Xbox will use the DXR API. This is just a back end that implements it, just like RTX. As for how well it works - well, we'll have to see whenever it comes out.
Thank you for confirming this! That indeed takes away most of my doubt and, I have to admit, my displeasure. If they're all using DXR, then at least everybody can use the same things. For anyone reading this thread: most of my "rant" above is now meaningless, since they put their stuff on top of DXR like RTX does. Then at least we can all enjoy ray tracing in the near future!
Interesting stuff, we will have to wait and see!
Not to rain on anybody's (AMD die-hard fans') parade, but this patent is from 2017 (over a year and a half ago), and AMD said at the launch of the Navi-based graphics cards that they didn't include any hardware devoted to ray tracing because they believe it will be a few years before real-time ray tracing catches on anyway. So, if anything, they were wrong in that assumption ("a few years"), and maybe this announcement is actually a bad thing for AMD, demonstrating that they are - again - on the back foot regarding new technologies in video games. They still need a hardware implementation of some kind, and maybe their next-gen graphics cards will have something like this. In the meantime, if many AAA titles include ray tracing optimized for Nvidia RTX, the damage is done, because somebody at AMD chose not to develop this patent on the assumption that "it will be a few years before it catches on"...
barbacot:

Not to rain on anybody's (AMD die-hard fans') parade, but this patent is from 2017 (over a year and a half ago), and AMD said at the launch of the Navi-based graphics cards that they didn't include any hardware devoted to ray tracing because they believe it will be a few years before real-time ray tracing catches on anyway. So, if anything, they were wrong in that assumption ("a few years"), and maybe this announcement is actually a bad thing for AMD, demonstrating that they are - again - on the back foot regarding new technologies in video games. They still need a hardware implementation of some kind, and maybe their next-gen graphics cards will have something like this. In the meantime, if many AAA titles include ray tracing optimized for Nvidia RTX, the damage is done, because somebody at AMD chose not to develop this patent on the assumption that "it will be a few years before it catches on"...
The next-gen consoles will have some form of ray tracing, the PlayStation 5 at least. With AMD making all the hardware for the consoles, I don't think developing games with AMD's hardware in mind will be a problem.
That was pretty fast getting that patent filed. What I think is especially interesting is how, as far as I can tell, this should allow you to use a secondary GPU as a discrete ray tracer. I wouldn't mind using the M.2 slot for some cheap low-end GPU for this; I'm not going to use it for an NVMe drive, not for a while anyway.
barbacot:

So, if anything, they were wrong in that assumption ("a few years"), and maybe this announcement is actually a bad thing for AMD, demonstrating that they are - again - on the back foot regarding new technologies in video games.
Not to rain on your one-man parade, but by your logic both AMD and Nvidia are always on the back foot (who isn't, then? Intel? Bwahahahaha!). Surely you don't need reminding about things like async compute, which Nvidia lacked but quickly tried to implement in their next generation when it became something of a thing. Neither AMD nor Nvidia has been able to predict everything, which is natural. Dedicated RT cores were a big sacrifice for Nvidia, which they could afford because AMD hasn't been able to offer much competition in the GPU world. Then there are the blur cores as well. If the competition had been fierce, I doubt Nvidia would have been as willing to risk it, unless they knew for a fact AMD was planning it as well.