NVIDIA’s Morgan McGuire: “First triple-A game to require a ray tracing GPU will be released in 2023”

Hilbert Hagedoorn:

Ray tracing is a technology that NVIDIA is currently pushing hard, yet the first triple-A game that will require a ray-tracing graphics card will not be coming out anytime soon. According to NVIDIA...
So is anyone else seeing this guy from Nvidia as basically coming forward and saying, by the way, you don't actually need to buy one of our all-new, all-singing, all-dancing GPUs for at least 3 years?
The Laughing Ma:

So is anyone else seeing this guy from Nvidia as basically coming forward and saying, by the way, you don't actually need to buy one of our all-new, all-singing, all-dancing GPUs for at least 3 years?
But we will, won't we, each one of those years 🙄
The Laughing Ma:

So is anyone else seeing this guy from Nvidia as basically coming forward and saying, by the way, you don't actually need to buy one of our all-new, all-singing, all-dancing GPUs for at least 3 years?
If you want more performance you currently don't have a choice so, no?
MonstroMart:

Yet another revision of the 1080Ti? o_O
No... a GTX 2080 Ti for $650/$700 is what I meant. The RTX cards simply cost too much versus a powerful GTX model of the same thing, minus the RTX hardware. In other words... a GTX.
buhehe:

I find it kinda odd that next-gen consoles would focus on RT rather than high refresh rates
Visuals, a.k.a. eye candy, sell better than framerate! That's why every generation we get increased visual quality, but most games still run at 30fps. They could perfectly well limit the graphics level in favor of framerate, but the impact would be much smaller. If Sony and MS gave people the choice, "do you want the next generation with more bling at 30fps, or today's graphics with everything at 60fps or more?", I bet the vast majority would vote for the graphics boost and stay at 30fps.
By then I'm sure the prices will have dropped by 50%?
Isn't the "problem" with game development companies? ...where there would really have to be some sort of raytracing gameplay to make this a requirement. Otherwise, developers could just start doing barebones alternatives to raytracing effects like shadows/lighting or ambient occlusion or reflections. Honestly, this is probably going to be an outlier even 10-15 years from now. Why would you *require* RTRT on a game, shrinking your customer base even by a little, unless it is required for game mechanics?
Caesar:

Why would you *require* RTRT in a game, shrinking your customer base even a little, unless it's required for the game mechanics?
Because it's 500x easier and faster to get good results as opposed to rasterized lighting - especially when the entire scene is utilizing it.
Well, RT is and will be cheaper for game devs to implement than faking it, which is a lot more work. I imagine game development shifting to focus on the RT version first, with the pure rasterization version worked on after the RT version is in place.
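To illustrate the "cheaper to implement than faking it" point, here is a toy sketch (my own, not from the thread) of a ray-traced shadow test. To know whether a point is lit, you just ask "does the segment from the point to the light hit anything?", with no shadow maps, bias tuning, or other rasterization workarounds. The scene here is an assumed list of spheres, each a (center, radius) pair.

```python
import math

def ray_hits_sphere(origin, direction, center, radius, max_t):
    # Solve |origin + t*direction - center|^2 = radius^2 for t in (0, max_t).
    # direction is assumed unit length, so the quadratic's a coefficient is 1.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < max_t  # small epsilon avoids self-intersection

def in_shadow(point, light, spheres):
    # Cast one shadow ray from the shaded point toward the light.
    to_light = tuple(l - p for l, p in zip(light, point))
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = tuple(v / dist for v in to_light)
    return any(ray_hits_sphere(point, direction, c, r, dist) for c, r in spheres)

# One occluder sphere sits between the first point and the light:
spheres = [((0.0, 2.0, 0.0), 0.5)]
print(in_shadow((0.0, 0.0, 0.0), (0.0, 4.0, 0.0), spheres))  # True (blocked)
print(in_shadow((3.0, 0.0, 0.0), (0.0, 4.0, 0.0), spheres))  # False (clear path)
```

The same one-query pattern extends to reflections and global illumination, which is the sense in which ray tracing is conceptually simpler for developers than stacking per-effect rasterization tricks.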
kings:

Visuals, a.k.a. eye candy, sell better than framerate! That's why every generation we get increased visual quality, but most games still run at 30fps. They could perfectly well limit the graphics level in favor of framerate, but the impact would be much smaller. If Sony and MS gave people the choice, "do you want the next generation with more bling at 30fps, or today's graphics with everything at 60fps or more?", I bet the vast majority would vote for the graphics boost and stay at 30fps.
That's simply because most people plug their console into a TV with a limited refresh rate, no? So a high framerate isn't that much of a benefit if your display can't keep up, right? Correct me if I'm wrong.
Denial:

Because it's 500x easier and faster to get good results as opposed to rasterized lighting - especially when the entire scene is utilizing it.
So developers pick the easy way and let fewer people play/buy their game? In other words, lower development cost in exchange for fewer sales? Even with a good adoption rate over the next 4 years, unless they can make RT work on older GPUs, making a game for RT-capable GPUs only will still cut into sales. Unless the game is sponsored/invested in by either NVIDIA or AMD, I don't think any developer will go that way.
This basically confirms what annoyed me about billing the 20 series as "RTX" instead of just the new GTX (with new ray-tracing features). I mean, other than the pricing being absurd, which was also annoying. Ray tracing is really cool, but we aren't even close to being "there" yet. Perhaps in 4 years we will be.
slyphnier:

So developers pick the easy way and let fewer people play/buy their game? In other words, lower development cost in exchange for fewer sales? Even with a good adoption rate over the next 4 years, unless they can make RT work on older GPUs, making a game for RT-capable GPUs only will still cut into sales. Unless the game is sponsored/invested in by either NVIDIA or AMD, I don't think any developer will go that way.
Windows XP dropped under 40% marketshare in 2012. How many DX10/11 only games were out in 2012? 6% of the steam market currently doesn't even have a DX10 capable GPU. You think that's stopping developers?
Clawedge:

Required? If it's required for the story or serves a purpose for storytelling, then I understand.
Required means all lighting will be done with RT only, with rasterization deprecated. Considering that both AMD and Intel plan hardware ray-tracing acceleration by 2021, it's highly possible that two years after that, developers will begin forgetting about rasterization.
coth:

Required means all lighting will be done with RT only, with rasterization deprecated. Considering that both AMD and Intel plan hardware ray-tracing acceleration by 2021, it's highly possible that two years after that, developers will begin forgetting about rasterization.
As long as the performance penalty is minimal, I am game
slyphnier:

So developers pick the easy way and let fewer people play/buy their game? In other words, lower development cost in exchange for fewer sales? Even with a good adoption rate over the next 4 years, unless they can make RT work on older GPUs, making a game for RT-capable GPUs only will still cut into sales. Unless the game is sponsored/invested in by either NVIDIA or AMD, I don't think any developer will go that way.
You'd have to assume that most GPUs won't have RT capability by then for it to be a negative. However, that's not the case. Both the PS5 and X2 already have RT-capable hardware, so that rules out any problems with those two consoles. The Switch (or next-gen Switch) likely won't be getting such a game, so it's irrelevant. That leaves only the PC market, where by that time RTX/RT cards will have been around for 4 1/2 years already. If only 10% of the PC market bought the game, it would still be successful.
Clawedge:

As long as the performance penalty is minimal, I am game
It won't be... ever. Ray tracing, especially full ray tracing where the entire scene is ray traced, is SIGNIFICANTLY more demanding than rasterization. Ray tracing will perhaps be better looking and easier for the game developer to implement, but it will come at a drastic performance penalty.
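A back-of-envelope count shows why fully ray tracing a scene is so demanding: where rasterization resolves primary visibility with roughly one depth-buffer test per pixel, a path tracer casts multiple rays per pixel per bounce. The resolution, sample count, and bounce depth below are illustrative assumptions, not benchmark data.

```python
# Rough ray budget for a fully ray-traced 4K frame (assumed parameters):
width, height = 3840, 2160     # 4K resolution
samples_per_pixel = 4          # assumed samples per pixel
bounces = 2                    # assumed secondary bounces per sample

pixels = width * height
# Each sample casts 1 primary ray plus one ray per bounce.
rays_per_frame = pixels * samples_per_pixel * (1 + bounces)

print(f"{pixels:,} pixels")                          # 8,294,400
print(f"{rays_per_frame:,} rays per frame")          # 99,532,800
print(f"{rays_per_frame * 60:,} rays/s at 60 fps")   # 5,971,968,000
```

Even with these modest assumptions that is on the order of billions of ray-scene intersection queries per second, before shadow rays or denoising, which is why the performance penalty relative to rasterization is structural rather than something a driver update fixes.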
Dragam1337:

It won't be... ever. Ray tracing, especially full ray tracing where the entire scene is ray traced, is SIGNIFICANTLY more demanding than rasterization. Ray tracing will perhaps be better looking and easier for the game developer to implement, but it will come at a drastic performance penalty.
That's exactly what I am afraid of
Game studios might eventually make games that demand an RT-capable GPU, but AMD/Nvidia can't make a GPU that's capable only of that, as people would still want to play all the games in their library/on the market, including old ones. Sounds like a permanent, obligatory price jump. Conversely, a great many studios/indie devs would keep sticking to the old ways, knowing they remain viable. Just like we still get new DX11 games (even if they have a nominal DX12/Vulkan mode for marketing or whatever purposes).
Hence why I bought a 1660 Ti. I almost bought a 2060, but since RT is such a performance killer, it would have just pissed me off knowing that the card was too weak to handle it. Kinda like back in the day when I had a GeForce FX 5900. It was supposed to be a DX9 card, but when the Half-Life 2 performance benchmarks came out, it sucked ass!