NVIDIA’s Morgan McGuire: “First triple-A game to require a ray tracing GPU will be released in 2023”

Required? If it's required for the story or serves a purpose for storytelling, then I understand. If it's for nothing more than reflections, then it's not welcome. Just my opinion. By the way, 2034? You mean 2024?
"every gaming platform" Yea right... Nintendo will never do ray tracing.
Clawedge:

Required? If it's required for the story or serves a purpose for storytelling, then I understand. If it's for nothing more than reflections, then it's not welcome. Just my opinion.
It would be required because it's much easier for devs to set up the lighting for a game with ray tracing. Once the hardware is good enough that they can use it across the board, they will stop using older methods for lighting effects altogether, to make things easier.
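A rough sketch of that workflow argument (my illustration, not from the post; every name in it is hypothetical): with rasterization, shadows and indirect lighting are typically precomputed offline into lightmaps that have to be re-baked after every level or lighting change, while a ray-traced renderer answers the same visibility question at runtime with a shadow ray, so the offline bake step and its tooling can eventually be retired.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Standard ray/sphere intersection test via the quadratic discriminant;
    # 'direction' is assumed normalized, so the quadratic's a-term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4 * c >= 0

def shade_baked(point, lightmap):
    # Rasterization path: the answer was precomputed offline and stored.
    return lightmap.get(point, 0.0)

def shade_ray_traced(point, light, blocker_center, blocker_radius):
    # Ray-traced path: ask the scene directly at runtime with a shadow ray.
    d = [l - p for l, p in zip(light, point)]
    n = math.sqrt(sum(x * x for x in d))
    d = [x / n for x in d]
    return 0.0 if ray_hits_sphere(point, d, blocker_center, blocker_radius) else 1.0

point, light = (0.0, 0.0, 0.0), (0.0, 10.0, 0.0)
print(shade_baked(point, {point: 0.0}))                      # needs a prior bake
print(shade_ray_traced(point, light, (0.0, 5.0, 0.0), 1.0))  # 0.0: in shadow
```

The point isn't speed (the baked lookup is far cheaper); it's that the second path needs no offline bake and no per-level data to maintain, which is exactly the workflow saving described above.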
@loser7 OK, you have a good point. The current ray tracing situation is much like pixel shaders when they came out, with the GeForce FX series.
So just like Halo 2 "required" DX10 + Windows Vista, and Quantum Break "required" DX12 + Windows 10. It will be another artificially created problem to push hardware and OS sales.
So Dr. Lisa Su got it right? There was no reason to make an RTRT GPU in 2018 🙂. Also, if AMD announces a 64 CU Navi in the coming month that challenges the RTX 2080 Ti at ~$800, will that force Nvidia to make a big GTX card again, without RTX, so they can compete on pricing?
Watch people mod it out :P
I remember when I watched the first RTX demo, all I could think was: where are all the fingerprints? I can't wait for 2030, when we finally get more realistically dull worlds, like reality.
Clawedge:

Required? If it's required for the story or serves a purpose for storytelling, then I understand. If it's for nothing more than reflections, then it's not welcome.
Yeah. How dare people need to buy new consoles for something that isn't required for the story. If you remove backwards compatibility, you cut a lot of excess work that the vast majority of people may not need. Even just for PC games, how many people are going to be running a five-year-old or older GPU to play brand-new games?
So an NVIDIA employee said that the 20** series is useless and that the 16** series is sufficient 🙂 NVIDIA shot itself in the foot in the main consumer market... Also, as the 16** series doesn't have a high-end GPU (does that mean they will do a 1670 or 1680? lol), the RX 5*00 is the best choice in performance per price. That post is holy bread for AMD.
rl66:

So an NVIDIA employee said that the 20** series is useless and that the 16** series is sufficient 🙂 NVIDIA shot itself in the foot in the main consumer market... Also, as the 16** series doesn't have a high-end GPU (does that mean they will do a 1670 or 1680? lol), the RX 5*00 is the best choice in performance per price. That post is holy bread for AMD.
That's not what happened; you're viewing the post through your own tainted perception.
Clawedge:

Required? If it's required for the story or serves a purpose for storytelling, then I understand. If it's for nothing more than reflections, then it's not welcome.
This is why NVIDIA (or current games) sort of shot themselves in the foot... The problem is that the hardware is not fast enough to do proper real-time ray-traced lighting + shadows + reflections + refraction + occlusion across ALL pixels in a scene. So the game devs simply cut out most of the best / slowest parts of ray tracing and did JUST the easiest: reflections in small patches of reflective surfaces. Now the uninformed believe that "reflections in puddles" is all ray tracing adds to games. When the hardware can do lighting + shadows + reflections + refraction + occlusion across ALL pixels at >60 FPS, then people will understand.

The other problem is that current rasterized rendering is very good at faking a lot of this (with pre-baked maps/lighting/shadows, for example), so it can be difficult to tell the faked from the real thing even if you know what you are looking for. And many people can't see the difference in a lot of scenes at all.
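Some back-of-the-envelope numbers on that "ALL pixels at >60 FPS" point (my own assumed figures, purely for illustration): even at 1440p, with a single ray per pixel for each of the five effects listed and no bounces at all, the ray budget is already enormous.

```python
# All figures below are assumptions for illustration, not from the post.
WIDTH, HEIGHT = 2560, 1440   # assumed 1440p render target
FPS = 60                     # the ">60 FPS" target mentioned above
RAYS_PER_PIXEL = 1 + 4       # primary + shadow, reflection, refraction, occlusion

rays_per_frame = WIDTH * HEIGHT * RAYS_PER_PIXEL
rays_per_second = rays_per_frame * FPS
print(f"{rays_per_frame / 1e6:.1f} million rays per frame")    # ~18.4 million
print(f"{rays_per_second / 1e9:.2f} billion rays per second")  # ~1.11 billion
```

And that is with one sample per effect and zero recursion; usable image quality needs multiple samples or heavy denoising, which multiplies those numbers. That is the arithmetic behind cutting everything except reflections in small patches.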
HWgeek:

So Dr. Lisa Su got it right? There was no reason to make an RTRT GPU in 2018 🙂. Also, if AMD announces a 64 CU Navi in the coming month that challenges the RTX 2080 Ti at ~$800, will that force Nvidia to make a big GTX card again, without RTX, so they can compete on pricing?
Depends on the cost of adding RT and how much of an advantage having it in your hardware four years in advance gives you. They could make a big "GTX" card that keeps RT but drops the tensor cores (DLSS), since DLSS is currently the only thing the tensor cores are used for.
Denial:

Depends on the cost of adding RT and how much of an advantage having it in your hardware four years in advance gives you. They could make a big "GTX" card that keeps RT but drops the tensor cores (DLSS), since DLSS is currently the only thing the tensor cores are used for.
They really ought to make a big GTX card though; a lot more people would be interested in that than in an overpriced RTX card.
Dragam1337:

They really ought to make a big GTX card though; a lot more people would be interested in that than in an overpriced RTX card.
They aren't going to cut parts out of their top-end products just for dinosaurs who want to stay on the old stuff. They didn't do it with tessellation, or with SM3.0.
Astyanax:

They aren't going to cut parts out of their top-end products just for dinosaurs who want to stay on the old stuff. They didn't do it with tessellation, or with SM3.0.
Tessellation didn't require additional hardware. SM3.0 helped improve performance. DXR requires additional hardware and significantly hampers performance. People want better performance, not worse, which is why barely anyone uses DXR even if they have an RTX GPU, and why GTX cards without RT cores etc. would sell better.
Dragam1337:

Tessellation didn't require additional hardware.
? It definitely does. I don't know if it's a good example compared to DXR; I think DXR is a more complicated feature to measure the "impact" of, but both require additional hardware.
GeForce GTX 400 GPUs are built with up to fifteen tessellation units, each with dedicated hardware for vertex fetch, tessellation, and coordinate transformations. They operate with four parallel raster engines which transform newly tessellated triangles into a fine stream of pixels for shading. The result is a breakthrough in tessellation performance—over 1.6 billion triangles per second in sustained performance. Compared to the fastest competing product, the GeForce GTX 480 is up to 7.8x faster as measured by the independent website Bjorn3D.