Video: Crytek CryEngine shows raytracing technology demo (runs on Radeon RX Vega 56)

https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Alright! Bring it on, great work! More of this and we will finally have some really innovative times ahead! Can't wait for some benchmarks to be run with this way of doing ray tracing 😀
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
So Unreal and now CryEngine can both support raytracing without the need for RT cores? Wtf Nvidia?
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
Undying:

So Unreal and now CryEngine can both support raytracing without the need for RT cores? Wtf Nvidia?
*Nvidia slightly triggered*
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Undying:

So Unreal and now CryEngine can both support raytracing without the need for RT cores? Wtf Nvidia?
Why is this surprising? RT cores accelerate BVH traversal, and BVHs were utilized long before RT cores existed. It's not like they brought some new technology to the fold; they just accelerate an existing data structure. Also, FWIW, this implementation uses SVOGI and not DXR at all.
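To illustrate the point above: a BVH is just a tree of nested bounding boxes, and "traversal" is an ordinary loop of box tests that any processor can run; RT cores simply execute that loop in fixed-function hardware. A rough sketch (node layout and names are illustrative, not any engine's actual API):

```python
# Hypothetical sketch of the BVH traversal that RT cores accelerate:
# an iterative ray-vs-AABB walk over a bounding volume hierarchy.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Node:
    lo: Tuple[float, float, float]       # AABB min corner
    hi: Tuple[float, float, float]       # AABB max corner
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    tri_ids: Optional[List[int]] = None  # leaf payload: triangle indices

def hit_aabb(orig, inv_dir, lo, hi):
    """Slab test: does the ray enter the box before it exits?"""
    tmin, tmax = 0.0, float("inf")
    for o, inv, l, h in zip(orig, inv_dir, lo, hi):
        t1, t2 = (l - o) * inv, (h - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(root, orig, direction):
    """Collect candidate triangles along a ray. RT cores run exactly this
    inner loop (box tests + stack management) in dedicated hardware."""
    inv = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    hits, stack = [], [root]
    while stack:
        node = stack.pop()
        if not hit_aabb(orig, inv, node.lo, node.hi):
            continue                       # prune this whole subtree
        if node.tri_ids is not None:
            hits.extend(node.tri_ids)      # leaf: hand triangles to shading
        else:
            stack += [node.left, node.right]
    return hits
```

Running this per-pixel in shader code works on any GPU; the point of RT cores is only that the hardware does the box tests and stack handling for you.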
KissSh0t:

*Nvidia slightly triggered*
What's funny is that the technique (SVOGI) was created by Cyril Crassin from Nvidia. It was originally going to be used as UE4's default lighting technique but got scrapped due to the performance requirements.
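For context on how SVOGI (voxel cone tracing) differs from triangle ray tracing: the scene is pre-filtered into a mip-mapped voxel representation, and "tracing" means marching cones that sample coarser mips as they widen. A heavily simplified 1D sketch of the accumulation loop (illustrative only; real implementations use 3D clipmaps or sparse octrees):

```python
# Very rough sketch of the voxel cone tracing idea behind SVOGI:
# march a cone through pre-filtered (mip-mapped) voxel data, sampling
# coarser mips as the cone widens, accumulating occlusion front-to-back.
import math

def build_mips(voxels):
    """Pre-filter step: each mip halves resolution by averaging pairs."""
    mips = [voxels]
    while len(mips[-1]) > 1:
        prev = mips[-1]
        mips.append([(prev[i] + prev[i + 1]) / 2
                     for i in range(0, len(prev) - 1, 2)])
    return mips

def sample(mips, pos, lod):
    """Point-sample the mip chain; lod 0 = finest level."""
    level = min(int(lod), len(mips) - 1)
    m = mips[level]
    i = min(int(pos * len(m)), len(m) - 1)
    return m[i]

def cone_trace(mips, origin, aperture, steps=16):
    """Accumulate occlusion along a widening cone, front to back."""
    occlusion, t = 0.0, 0.01
    for _ in range(steps):
        radius = max(t * aperture, 1.0 / len(mips[0]))
        lod = math.log2(radius * len(mips[0]))  # wider cone -> coarser mip
        a = sample(mips, origin + t, max(lod, 0.0))
        occlusion += (1.0 - occlusion) * a      # front-to-back blending
        t += radius                             # step grows with the cone
        if t >= 1.0 or occlusion >= 0.99:
            break
    return min(occlusion, 1.0)
```

Because a handful of wide cones approximate thousands of rays, this trades accuracy for speed, which is how it runs on ordinary compute hardware like a Vega 56.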
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
Undying:

So Unreal and now CryEngine can both support raytracing without the need for RT cores? Wtf Nvidia?
RT cores offload the workload from the GPU, which can still help. Problem is, it doesn't seem like RT cores can keep up with the main GPU cores...
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Denial:

Why is this surprising? RT cores accelerate BVH traversal. BVH was utilized prior to RT cores. It's not like they brought some new technology to the fold they just accelerate an existing data structure. Also FWIW this implementation uses SVOGI and not DXR at all.
I'm merely surprised by this happening so fast to be honest. But I like it!
https://forums.guru3d.com/data/avatars/m/260/260114.jpg
Crysis 4 :P
https://forums.guru3d.com/data/avatars/m/274/274789.jpg
At least Nvidia did their job by punching the game industry in the right way.
https://forums.guru3d.com/data/avatars/m/67/67544.jpg
They should focus on making better games first. Do they even make games still or is it just licensing out their engine now?
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Amaze:

They should focus on making better games first. Do they even make games still or is it just licensing out their engine now?
They do. Hunt: Showdown is a very underrated game. You should try it.
https://forums.guru3d.com/data/avatars/m/219/219428.jpg
Strange Times:

At least Nvidia did their job by punching the game industry in the right way.
Punching what? There were people experimenting with raytracing way before it was a thing. I'd rather see proper collision detection, AI and more interaction in games before we get to raytracing.
https://forums.guru3d.com/data/avatars/m/273/273838.jpg
Correct me if I'm wrong, but real-time ray tracing is not something only Tensor or RT cores can do. All other cores can do it, just not nearly as effectively. So the real question is: if real-time ray tracing were implemented in most games tomorrow, would it actually be worth paying extra for dedicated ray tracing cores? A comparison between an RTX 2060 and a Vega 56 on the above demo would probably provide some answers.
https://forums.guru3d.com/data/avatars/m/267/267787.jpg
To my knowledge there is no requirement for RT cores to do real-time ray tracing. The RT cores only help with it, so the GPU has more resources left for other graphics-related tasks. From what I know, any DX12 GPU should be able to execute real-time ray tracing. The only problem is performance on a low-end GPU.
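That point is easy to demonstrate: ray tracing is ordinary arithmetic that any compute-capable GPU (or even a CPU) can execute; dedicated cores only make it faster. A minimal example is the classic ray-sphere intersection (a sketch, not any particular engine's code):

```python
# Ray tracing as plain arithmetic: intersect a ray with a sphere by
# solving |o + t*d - c|^2 = r^2, a quadratic in t. Nothing here needs
# special hardware; RT cores just run this kind of math faster.
import math

def ray_sphere(orig, direction, center, radius):
    """Return the distance t to the nearest hit, or None on a miss."""
    oc = [o - c for o, c in zip(orig, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                        # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a) # nearer of the two roots
    return t if t > 0 else None
```

Doing this for millions of rays per frame is where performance, not capability, becomes the bottleneck on hardware without dedicated acceleration.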
data/avatar/default/avatar22.webp
No thanks... probably a big performance loss, and most of the lighting is still done with normal techniques... ray tracing requires more compute than what we have today...
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Denial:

What's funny is that the technique (SVOGI) was created by Cyril Crassin from Nvidia. It was originally going to be used as UE4's default lighting technique but got scrapped due to the performance requirements.
Performance requirements in what part of the GPU? I guess compute/shaders, right? Because that's where Nvidia was relatively weak until the 20x0 series came along. (AMD had an overblown shader count in comparison to TMUs/ROPs.) One could say that Nvidia has brilliant engineers who are controlled by the finance/marketing departments. If they come up with something good that runs better on non-Nvidia HW, it gets canned until Nvidia's HW is better at it.
HardwareCaps:

No thanks..... probably big performance loss and most of the lighting is still done with normal techniques.... ray tracing requires more compute than what we have today...
Still looks much better than what BF5 does. And its GI is better than Metro's.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
Undying:

They do. Hunt Showdown is very underrated game. You should try it.
Yeah, and it got a huge update not long ago. Looks like it's getting to final release soon; they are promising an Xbox One release in spring, Q1 2019.
data/avatar/default/avatar28.webp
lol with what's happening in the news the last 1-2 years it's gotta suck to be 1.) INTEL customer, 2.) NVidia customer, 3.) a Democrat.
https://forums.guru3d.com/data/avatars/m/118/118854.jpg
Very, very impressive reflections here. On a Vega 56?? WOW, pretty darn impressed here, thought it would run a lot slower. Does anyone know what internal resolution was used in this demo? Amazing reflections, GOOD JOB!!
https://forums.guru3d.com/data/avatars/m/56/56004.jpg
ALRIGHT! My Vega 64s and Vega II are so ready for this! Bring it on!!! :D