Gears 5 Developer mentions dedicated ray-tracing cores from AMD in next-gen Xbox

Keep in mind the RT cores can be used for other things - @Astyanax already mentioned a new AA method that Nvidia is exploring, but ray calculations can also be used for sound, physics collisions, etc. It will be interesting to see what kinds of things can be accelerated and what new ideas devs come up with when playing with the hardware.
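As a rough illustration of the kind of non-graphics query being described here, below is a minimal CPU-side sketch of an audio occlusion test built from a ray-sphere intersection. Every name in it (soundOccluded, Sphere, etc.) is hypothetical; a real engine would issue the equivalent query against the GPU's acceleration structure through DXR or Vulkan ray queries rather than looping on the CPU.

```cpp
// Hypothetical sketch: the same ray/geometry intersection used for rendering
// can decide whether a sound source is occluded by level geometry.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)  { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float length(Vec3 v)       { return std::sqrt(dot(v, v)); }

struct Sphere { Vec3 center; float radius; };  // stand-in for scene geometry

// Returns true if the segment from 'listener' to 'source' hits any blocker.
bool soundOccluded(Vec3 listener, Vec3 source, const std::vector<Sphere>& blockers) {
    Vec3 d = sub(source, listener);
    float maxT = length(d);
    Vec3 dir = {d.x / maxT, d.y / maxT, d.z / maxT};

    for (const Sphere& s : blockers) {
        // Standard ray-sphere intersection: solve |o + t*dir - c|^2 = r^2 for t.
        Vec3 oc = sub(listener, s.center);
        float b = dot(oc, dir);
        float c = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - c;
        if (disc < 0.0f) continue;               // ray misses this sphere
        float t = -b - std::sqrt(disc);          // nearest hit along the ray
        if (t > 0.0f && t < maxT) return true;   // blocker sits between the two points
    }
    return false;
}

int main() {
    std::vector<Sphere> wall = {{{0.0f, 0.0f, 5.0f}, 2.0f}};
    bool occluded = soundOccluded({0, 0, 0}, {0, 0, 10}, wall);
    std::printf("sound occluded: %s\n", occluded ? "yes" : "no");
}
```

The point is simply that the same intersection math serves graphics, audio and physics alike, which is why hardware that accelerates ray traversal could be useful outside of rendering.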
schmidtbag:

Dedicated in the same way as Nvidia's, or dedicated as in "X amount of cores will be reserved for RT when RT is in use"? Because if they're taking the Nvidia route, I'm not so sure I'm a fan of that. But, perhaps there isn't another option.
Better to keep expectations low and expect the latter, IMO. He knows his target audience is the mainstream and won't go into details like that. The general opinion still seems to be that raytracing in games isn't impressive yet, and it's an open question whether the next-gen consoles' AMD GPUs will bring RTX 2070-ish power as well as more capable raytracing performance. People want next-gen graphics as well as raytracing, not current-gen with raytracing. Red Dead Redemption 2 has good lighting, and that's running on six-year-old GCN. Most engineers I've seen talking about hardware wish lists over the years prefer more compute that they can allocate to whatever they want rather than fixed-function hardware, unless they know 100% that the fixed function will be useful in all cases. A common example is the Xbox 360's tessellator unit being considered worthless.
The problem is that regardless of whether it's RT or raster, getting "next-gen graphics" has insanely high diminishing returns from both a performance perspective and a development perspective. You can't just pump up polygon counts anymore or add some simple effect like a godray - you need to spend hundreds of hours tweaking shadows/lights/shaders that simulate light through skin, etc. It all needs to be placed perfectly, it needs to be extremely well optimized, and the end result is like a 5% perceived improvement in graphics over just winging it. It gets more and more difficult the further you go.

The idea with RT, from a paradigm perspective, is that the actual manual work is "offloaded" to the engineering teams. In theory you don't have to have an artist tweaking hidden/fake lights in a level to get simulated bounces, or tweaking models/shadows/etc. all day so it looks perfect - the underlying tech itself should just handle it, and handle it to a degree that's better than an artist can manage, since artists often use RT scenes as a reference anyway.

Which is another thing people typically don't bring up - RT as a reference. The ability to see the scene in RT, then set up your raster to try to mimic it and iterate quickly, should theoretically improve graphics regardless of whether RT is being used in the final product or not.
The other side of this is that if the next-gen consoles ray trace something, then they won't have done a rasterized version of that effect. They probably won't spend hundreds of hours adding a really good-quality rasterized effect just for PCs that have dated GPUs that can't ray trace. Hence you might find that the quality goes down on the PC port for those whose GPUs can't do RT.
Honestly, I feel like that kind of already happens - in Metro, for example, the GI without RT is pretty bad, definitely worse than in other games. It's like they spent their time optimizing the RTX GI and just phoned in the non-RTX GI. It's kind of a bummer, but I don't know how else you'd transition without spending a ton of money to appease people. I guess the longer it's stretched out, the more people will be on RT-capable cards anyway. Perhaps this is why the CEO of Nvidia said it's a mistake to buy a non-RTX GPU if you plan on keeping it for more than two years. They know the adoption rate of RTX cards and probably have a better understanding of what's coming down the pipeline with RT from Microsoft/AMD/Intel.
Hypernaut:

Like motion blur and V-sync, everyone will disable it lol. Well, anyone who plays at a high level. Which isn't me, btw.
Anyone with G-Sync/FreeSync/Adaptive-Sync should be using V-sync for a proper implementation. Motion blur can make people sick; I am glad I play on a PC and have choices. As for ray tracing, people can say what they want about how crappy it is, or that they can't tell the difference, or that it's too expensive or too shiny or whatever - which are valid points. I can say, though, that someone like myself enjoys that RTX stuff. I have Wolfenstein: Youngblood sitting in my game library and it will not be played unless an RTX patch is released. Something I will never turn off in a game is RTX - as Henry from Kingdom Come: Deliverance would say, RTX In Nomine Patris, Et Filii, Et Spiritus… Etcetera, Etcetera! Jesus Be Praised, RTX (mumbles)... like Mom's pie.
Maddness:

If you have seen it in action in games like Metro and Control, it's a hell of a lot more than that.
I have seen it in Cyberpunk (not released yet), BF5 and Metro... It looks too glossy; it's a big step forward but not so impressive on screen... (Or maybe I am too used to seeing pro rendering.) It will take time before RT looks as good as pre-calculated renders; for now it is more of a selling point for early adopters. The real change will come with more capable GPUs... right now, even an RTX 2080 Ti seems too weak.
The RTX 2080 Ti is definitely not powerful enough for what RT is bringing to the table. However, we'll be looking back on it the way we look at first-gen 4K monitors, comparing their price and specs to what we have now. It's going to take a few years for the tech to be figured out and for implementations to improve, but I think they're going to get it sorted out a lot quicker than people think.
So RT with HDR and 4K in the next Xbox Forza 8 or FH5 games... that is going to be something special. 🙂