AMD will not implement DirectX ray tracing anytime soon

Keeps raytracing out of consoles. Another DOA feature 😀
Good, focus on raster + performance.
Geryboy:

they don't have the hardware for it anyway, and they won't compete with 1080ti + performance for a while, so yeah...
That you don't know.
Given the lack of actual software and the performance penalty even on cards with native acceleration, that seems like a reasonable thing to say. The way things are going, I'd be surprised if any of the ray-tracing-supporting titles have patches for it by Christmas.
IMO, ray tracing on Nvidia cards is only there to justify the presence of all those tensor cores, which are only useful in the pro AI and deep-learning sector. They didn't want to develop a gaming-specific chip without them, so they created a false dawn of "real-time ray tracing" using all these otherwise useless tensor cores that are adding hundreds of euros to the cost of every chip. Who in their right mind is going to game at 1080p/30 fps on a card costing €1,200? And that's if game studios even implement it. Some will, with massive backing from team green, so they can justify this overly complicated and way overpriced latest generation of cards.
RonanH:

IMO, ray tracing on Nvidia cards is only there to justify the presence of all those tensor cores, which are only useful in the pro AI and deep-learning sector. They didn't want to develop a gaming-specific chip without them, so they created a false dawn of "real-time ray tracing" using all these otherwise useless tensor cores that are adding hundreds of euros to the cost of every chip.
That's nonsense. They've stripped pro functions from gaming GPUs in the past, and considering how many different dies they actually produce, it would be easy for them to make one without all those functions. It's not like they use a single identical die for all GPUs, which is what that argument would require to have any merit whatsoever.

Contrary to popular belief, making products super expensive doesn't actually result in that much more money. A product at a more affordable price would easily offset the price difference with increased sales, provided the product isn't too expensive to produce. So why would they try to sell a chip that's so expensive to make at such a high price? Surely it's because they were too stupid to make it smaller by cutting out features it doesn't need, you know, like they've done in the past? Oh wait. The only explanation that makes any sense is that they actually want this new technology to succeed.

As with all new technology, the introduction is painful and slow, but someone has to get it going. And believe you me, game developers are excited about ray tracing; it just takes time to implement it in their engines and for the hardware to become more widespread. The big engines are getting on board with this, which will bring it to a wide range of games automatically in the future.

It's a shame most people around here don't remember the last time a major, groundbreaking GPU hardware change was introduced. The comments were much the same: barely any software used it, but everyone involved with the technology knew it was the future.
That's really disappointing to hear that from an AMD person. I guess if it does take off, that will put AMD even more on the back foot. Just what they need.
Hardware RT should have been kept exclusive to the professional market (Quadro, Tesla, etc.) until it matured and was refined, not shoved into consumer SKUs right away. It is too expensive to gain mass traction and to justify game developers allocating time and resources for a proper implementation. Instead, hardware vendors should press on with adopting better memory tech, since fast random memory access (both latency and bandwidth) is the fundamental key to real-time RT; there's just no way around it.
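For what it's worth, the random-memory-access point can be illustrated with a toy sketch: traversing a bounding-volume hierarchy is a chain of data-dependent node fetches, so which node a ray touches next is unpredictable and cache-unfriendly. This is a hedged illustration only (a seeded coin flip stands in for the real ray/box intersection test; no vendor's hardware looks like this):

```python
import random

def traverse(nodes, ray_seed):
    """Walk a toy binary 'BVH': the child visited next depends on a
    per-ray test, so the sequence of node indices is data-dependent.
    Here a coin flip models the ray/box hit test."""
    rng = random.Random(ray_seed)
    idx, visited = 0, []
    while idx < len(nodes):
        visited.append(idx)
        left, right = 2 * idx + 1, 2 * idx + 2
        idx = left if rng.random() < 0.5 else right
    return visited

# Two nearby rays can touch completely different nodes, i.e. the
# access pattern is incoherent, unlike a rasterizer's streaming reads.
nodes = list(range(127))            # a 7-level complete binary tree
path_a = traverse(nodes, ray_seed=1)
path_b = traverse(nodes, ray_seed=2)
print(path_a)
print(path_b)
```

A rasterizer mostly streams through buffers linearly; this pointer-chasing pattern is why latency, not just bandwidth, matters for RT.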
Maddness:

That's really disappointing to hear that from an AMD person. I guess if it does take off, that will put AMD even more on the back foot. Just what they need.
Why focus on technology that is irrelevant as of right now? There are zero titles that support it; the technology is too fresh and barely works at acceptable framerates. If AMD provides strong DX11/DX12 raster performance and 4K 60+ fps at non-extortion prices, that will be a win.
It seems to me like PhysX at the beginning. It's nice to have, but you don't need it, and even Nvidia can't use it reliably, judging from the footage we have now. Ray tracing might be the future, but the future isn't now. Yet. So AMD has plenty of time.
You don't have to follow every DX function. Do you remember DX10? Nearly no one implemented it completely, since it brought nothing at the time; despite that failure, the same features reappeared in DX11 and everyone was fine with it. RT is a good idea, but it has only just begun in games (of course it has been used in offline work for a long time)... If you are a company that is succeeding but could still fail (as AMD is right now), it is prudent to wait and see...
fellix:

Hardware RT should have been kept exclusive to the professional market (Quadro, Tesla, etc.) until it matured and was refined, not shoved into consumer SKUs right away
Not at all; we can do it right now (and Nvidia has done it). Real-time RT is mature enough to get on our screens (I remember waiting days with my Amiga A2000 for a ray trace to finish...). It's like starting a race in uncertain weather: some will fit rain tires and some will stay on slicks... neither is wrong, it depends on their tactics. Nvidia is taking a risk on slicks and AMD is being prudent on rain tires.
cryohellinc:

Why focus on technology that is irrelevant as of right now? There are zero titles that support it; the technology is too fresh and barely works at acceptable framerates. If AMD provides strong DX11/DX12 raster performance and 4K 60+ fps at non-extortion prices, that will be a win.
That's not going to happen until at least 2020. AMD have stated that 2019's Navi is a mid-range chip. We won't see 4K 60 fps from AMD for quite some time. The news about ray tracing makes this even more disappointing, imo. There may not be any games with it yet, but make no mistake, it is the future. They can ignore it at their own peril.
Nothing wrong with that. 4K is far from mainstream, and RTX is even further away. The fact that zero games support it now, with support only promised for later (and such patches get delayed all the time), just shows RT is not a priority. Heck, 90% or more of players won't even use it, since they don't have those 2080 cards; even those with a 2080 Ti will probably try it once and move on. RT will be viable in the future, but it's still far from mainstream.
cryohellinc:

Why focus on technology that is irrelevant as of right now? There are zero titles that support it; the technology is too fresh and barely works at acceptable framerates. If AMD provides strong DX11/DX12 raster performance and 4K 60+ fps at non-extortion prices, that will be a win.
Yep, make a good-performing product at a reasonable price (insert RTX Off/On joke here) and it will sell. I'll gladly switch back to AMD then. I may have an RTX card now, but rest assured I didn't get it for whatever ray tracing may or may not be possible in the near or distant future.
Maddness:

That's not going to happen until at least 2020.
According to whom?
Maddness:

AMD have stated that 2019's Navi is a mid-range chip.
I'm not talking about Navi, but about whatever else they have in development; let's call it Vega 2.
Maddness:

We won't see 4K 60 fps from AMD for quite some time.
An assumption. Vega 64 can reach the 45-50 fps range at 4K, so I can safely assume Vega 2 will manage 60+ fps.
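As a rough sanity check on that assumption, here is the naive linear-scaling arithmetic (the 45-50 fps figures are the ones quoted above; real generational scaling is of course not linear):

```python
# How much faster than Vega 64 would a successor need to be to turn
# the quoted 4K numbers into 60 fps, assuming linear scaling?
vega64_fps_4k = (45, 50)   # range quoted in the comment
target_fps = 60

for fps in vega64_fps_4k:
    uplift = target_fps / fps - 1
    print(f"{fps} fps -> needs {uplift:.0%} more performance")

# 45 fps needs ~33% more, 50 fps needs ~20% more; an uplift in that
# range from a 7nm shrink is the kind of bet the comment is making.
```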
Maddness:

The news about Ray Tracing makes this even more disappointing imo.
Ray tracing is an overhyped, early-in-development, currently irrelevant technology. There are other tracing approaches (path tracing, for example) that can be more effective. I predict that Nvidia's attempt to yet again create a "unique", Nvidia-only solution will end up in a situation similar to PhysX: cool when it works, but not widely adopted. In any case, all things considered, this tech is still in its infancy for the consumer market.
Maddness:

There may not be any games with it yet, but make no mistake, it is the future. They can ignore it at their own peril.
Only an idiot would ignore that, and they have some bright heads working there. Nvidia made a first cash-grab attempt with an unfinished, barely implemented technology. Bottom line from me: AMD should focus on strong raster performance at high resolutions, power efficiency, and optimization for the 7nm process, meanwhile working on a ray/path tracing solution.
rl66:

You don't have to follow every DX function. Do you remember DX10? Nearly no one implemented it completely, since it brought nothing at the time; despite that failure, the same features reappeared in DX11 and everyone was fine with it. RT is a good idea, but it has only just begun in games (of course it has been used in offline work for a long time)... If you are a company that is succeeding but could still fail (as AMD is right now), it is prudent to wait and see... Not at all; we can do it right now (and Nvidia has done it). Real-time RT is mature enough to get on our screens (I remember waiting days with my Amiga A2000 for a ray trace to finish...). It's like starting a race in uncertain weather: some will fit rain tires and some will stay on slicks... neither is wrong, it depends on their tactics. Nvidia is taking a risk on slicks and AMD is being prudent on rain tires.
Technically speaking, it is mature enough to cover a tiny part of your 1080p screen, something like 1/7th of it while reaching 60 fps. So barely enough for shadows and basic reflections. In the same way you could say that a six-year-old GCN GPU can do ray tracing too, but it would barely cover 1/50th of the screen. The next RTX generation may start being interesting, but with this one, performance will not be good enough for good enough IQ. Screen-space reflections can do almost the same with a much smaller performance impact. And I think there is already a pattern of RTX-based effects being shown but never released. Does anyone remember how cool the Portal effect was?
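For context, screen-coverage estimates like these can be sanity-checked with back-of-the-envelope ray-budget arithmetic. The numbers below are assumptions: the 10 Gigarays/s figure is Nvidia's peak marketing number for the top Turing card, sustained in-game rates are far lower, and usable effects need several rays per pixel (shadows, reflections, bounces):

```python
# Rays/second needed to trace every pixel of a 1080p/60 screen,
# at a few assumed sample counts per pixel.
width, height, fps = 1920, 1080, 60
pixels_per_sec = width * height * fps        # ~124M pixel-updates/s

for rays_per_pixel in (1, 4, 16, 64):
    rays_needed = pixels_per_sec * rays_per_pixel
    print(f"{rays_per_pixel:>2} rays/px -> {rays_needed / 1e9:.1f} Grays/s")

# Even the ~10 Grays/s *peak* figure is exhausted at roughly 80
# rays/pixel for full 1080p60, and sustained throughput is much lower,
# which is roughly where "a fraction of the screen" estimates come from.
```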
This is AMD-speak for "We don't want to screw up our ray-tracing implementation like we did with our tessellation engine."
cryohellinc:

Why focus on technology that is irrelevant as of right now? There are zero titles that support it; the technology is too fresh and barely works at acceptable framerates. If AMD provides strong DX11/DX12 raster performance and 4K 60+ fps at non-extortion prices, that will be a win.
It's not irrelevant. As Hilbert stated, "technology needs to move forward." Of course there are no titles that support it; you have to have the hardware before the software can be released. You say it's too fresh and barely works at acceptable framerates; the same thing could have been said in the mid-90s about 3D acceleration. If companies like 3dfx, Nvidia, and Matrox had thought like that, look where we would be today.
@Margalus Have you even read what I wrote? "Why focus on technology that is irrelevant as of right now?"
AMD also mentioned they will not support ray tracing until they can offer it across their whole GPU line. That will take a while, but at least it's a bit more realistic for them. We might see a high-end GPU at the end of 2019; they have something in the works.
It makes a lot of sense, in AMD's very particular position, not to jump onto ray tracing right away just because Nvidia does it. If you have something to catch up to, it's not wise to make two jumps in one and try to push DXR onto AMD cards in hardware; they have plenty of time to bring it to a usable level (which AMD stated too) on all cards across resolutions.

These days you can argue about the value of implementing any graphical improvement (not that they're not working on bringing it to us) as long as it can't run on consoles... sad but true. Other than that, it's merely pushing PC adoption, which again is of little use in general, since it's hardly coming to every engine or game soon. It's in the works, and personally I think it's pretty cool, but for now it is not standard. Just because I can buy an 8K TV doesn't mean 8K is the standard. And it doesn't make sense to push more R&D money into a feature that isn't standard if you have to spend that R&D money on something else, like general performance.

If AMD can catch up in general performance, they can jump onto DXR with future hardware easily enough, just not necessarily now. It's pretty clear that even with RTX cards we're far from being able to say ray tracing is here... it's more or less not ready for use, as it's below 60 fps at 1080p right now. Unless we at some point reach 4K 30 fps with DXR, it's a tech that won't form a new standard. Yet.