NVIDIA Announces Support for lots of RTX ON based games at Gamescom

https://forums.guru3d.com/data/avatars/m/273/273678.jpg
Turing was designed for Samsung 10nm; they just couldn't get fab time.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Fox2232:

What's the power draw and clock of an RTX 2080 Ti when its rasterization and other parts are fully used at the same time? Is it some very low clock with very high power draw, or is the card on the good side of power efficiency, given that "12nm" is not just a marketing term over the previously used 16nm? It was ready. AMD would have released around that time too if they hadn't had two "oops" moments. It's just that nVidia opted out of 7nm at the beginning.
It's 290W with no RT - turning on RT actually reduces the power consumption. I disagree that it was ready - Nvidia shipped their entire lineup in October 2018. AMD could barely yield Instinct/Vega 7nm months later (MI60s were back-ordered until March), and those are smaller chips than what you're arguing for, with the advantage of being sold at significantly higher margins.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
I'm having a really hard time understanding why Nvidia keeps showcasing the worst possible examples for RTX. Flat puddles are pretty much the ONLY thing RTX is completely unnecessary for, and yet they seem to be the only thing games are using. I really want RT to be successful (and that says a lot considering I have no intention of buying an RTX GPU in the foreseeable future), but as long as they keep using flat shiny surfaces, they're basically just shooting themselves in the foot.
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
Reflections are an easy comparison: screen-space reflections hit limits with what's visible to the camera and then have to fall back on cube maps. Shadows can be impressive, but the comparison is trickier for finer details like transitions, edges, overall distance and accuracy, including improved self-shadowing and indirect shadowing via ambient occlusion. GI would be the big one, but it is likely also much more demanding in turn; we'll see when these next RTX games come out, with Control first up, and then benchmarks and on/off graphical comparisons and such. 🙂

I like it. I'm not a fan of the divide between AMD and NVIDIA GPUs again, but ray tracing still has a couple of years before it can become a standard, and far longer before it can start replacing rasterization, since both will be needed for now, at least until consoles can do it as a standard in who knows how many generations.

EDIT: The videos might not show it, but in any game with SSR, just moving the camera around and watching the reflection change depending on what information is available is telling. With cube maps it's how often they update, their resolution and other details. Newer games have gotten better at utilizing SSR, but the same flaws and drawbacks are still there even when masked by additional shaders or effects. (Not that different from checking for issues with AO or shadows, though those tend to stand out less, even if distance and angles can make them more noticeable.) But it is quite fast, and the errors aren't too bad for the performance and for how well developers are now using the effect.
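To make the screen-space limitation concrete, here is a toy sketch only (the 8x8 "G-buffer" and the function names are made up for illustration, not real engine code) of why SSR misses off-screen geometry: the march can only sample pixels that were already rasterized, so a ray that leaves the screen has to fall back to a pre-baked cube map, whereas hardware ray tracing has no such restriction.

```cpp
#include <array>
#include <cstdio>

constexpr int W = 8, H = 8;
std::array<std::array<float, W>, H> depth;   // per-pixel depth from the rasterized frame
std::array<std::array<int,   W>, H> colour;  // per-pixel colour id from the rasterized frame

int sample_cubemap() { return -1; }  // stand-in for the pre-baked fallback probe

// March a reflection ray in screen space from (x, y) along (dx, dy); return the
// colour of the first pixel the ray passes behind, or the cube-map fallback if
// the ray exits the screen without finding any on-screen data.
int ssr_trace(int x, int y, int dx, int dy, float ray_depth, float depth_step) {
    for (int step = 0; step < W + H; ++step) {
        x += dx; y += dy; ray_depth += depth_step;
        if (x < 0 || x >= W || y < 0 || y >= H)
            return sample_cubemap();   // off-screen: SSR simply has no data here
        if (depth[y][x] < ray_depth)
            return colour[y][x];       // screen-space hit: reuse that pixel
    }
    return sample_cubemap();
}

int main() {
    // Toy frame: everything far away except a "wall" of close pixels in column 5.
    for (auto& row : depth) row.fill(1000.0f);
    for (int y = 0; y < H; ++y) { depth[y][5] = 0.2f; colour[y][5] = 42; }

    std::printf("ray towards wall: %d\n", ssr_trace(2, 6, 1, -1, 0.5f, 0.05f));  // finds 42
    std::printf("ray off screen:   %d (-1 = cube-map fallback)\n",
                ssr_trace(2, 6, -1, -1, 0.5f, 0.05f));
    return 0;
}
```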
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
schmidtbag:

I'm having a really hard time understanding why Nvidia keeps showcasing the worst possible examples for RTX. Flat puddles are pretty much the ONLY thing RTX is completely unnecessary for, and yet they seem to be the only thing games are using. I really want RT to be successful (and that says a lot considering I have no intention of buying an RTX GPU in the foreseeable future), but as long as they keep using flat shiny surfaces, they're basically just shooting themselves in the foot.
I mean, I think that's showcased more because it's the one thing that looks like a visible improvement. Look at the Call of Duty photos, for example: sure, the RTX render is more accurate to real life, but if you gave me those two pictures and said "which one is more accurate" without the RTX logo, I wouldn't know - they look like the same thing, just with different shadows. Maybe if I sat there and analyzed the image for an hour I could tell, but just glancing I have no idea. That being said, they showed quite a few games not doing reflections: https://babeltechreviews.com/ray-tracing-news-from-gamescom-2019/ Hilbert just didn't post them all.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
schmidtbag:

I'm having a really hard time understanding why Nvidia keeps showcasing the worst possible examples for RTX. Flat puddles are pretty much the ONLY thing RTX is completely unnecessary for, and yet they seem to be the only thing games are using. I really want RT to be successful (and that says a lot considering I have no intention of buying an RTX GPU in the foreseeable future), but as long as they keep using flat shiny surfaces, they're basically just shooting themselves in the foot.
I think that's because of the limited performance of the current RTX cards. RT may be the holy grail of real-time graphics, but Nvidia's cards are simply too weak to display its true potential, so Nvidia can only show (very) small glimpses of RT's true potential, and they have to do it in a way that games don't become interactive slideshows. Solution: let's show gamers/buyers some super realistic puddles!....
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
JonasBeckman:

Reflections are an easy comparison: screen-space reflections hit limits with what's visible to the camera and then have to fall back on cube maps. Shadows can be impressive, but the comparison is trickier for finer details like transitions, edges, overall distance and accuracy, including improved self-shadowing and indirect shadowing via ambient occlusion. GI would be the big one, but it is likely also much more demanding in turn; we'll see when these next RTX games come out, with Control first up, and then benchmarks and on/off graphical comparisons and such. 🙂

I like it. I'm not a fan of the divide between AMD and NVIDIA GPUs again, but ray tracing still has a couple of years before it can become a standard, and far longer before it can start replacing rasterization, since both will be needed for now, at least until consoles can do it as a standard in who knows how many generations.

EDIT: The videos might not show it, but in any game with SSR, just moving the camera around and watching the reflection change depending on what information is available is telling. With cube maps it's how often they update, their resolution and other details. Newer games have gotten better at utilizing SSR, but the same flaws and drawbacks are still there even when masked by additional shaders or effects. (Not that different from checking for issues with AO or shadows, though those tend to stand out less, even if distance and angles can make them more noticeable.) But it is quite fast.
Planar reflections instead of SSR or a hybrid. As for the demonstration: global illumination would be the best thing in those games with a lot of shiny/glowing objects. Shadows are part of it, but they are not bad without RT. I must say that when Path of Exile got GI, it was a really good IQ improvement. But here is where real RT comes in -> materials: surface roughness/light diffusion, translucency + transparency (reflections and refractions both on the surface and inside the material).
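As a rough illustration of those material properties, here is a toy sketch only (hypothetical struct fields and numbers, not any engine's real material system): roughness widens the reflection lobe, transparency plus an index of refraction drive refraction, and a Schlick Fresnel term splits reflection from transmission.

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

struct Material {
    const char* name;
    float roughness;     // 0 = perfect mirror, 1 = fully diffused reflection
    float transparency;  // 0 = opaque, 1 = clear glass
    float ior;           // index of refraction used for refraction rays
};

// Schlick's approximation: fraction of light reflected at a dielectric boundary.
float fresnel_schlick(float cos_theta, float ior) {
    float r0 = (1.0f - ior) / (1.0f + ior);
    r0 *= r0;
    return r0 + (1.0f - r0) * std::pow(1.0f - cos_theta, 5.0f);
}

int main() {
    Material wet_asphalt{"wet asphalt", 0.15f, 0.0f, 1.33f};
    Material frosted_glass{"frosted glass", 0.40f, 0.90f, 1.50f};

    float cos_theta = 0.15f;  // grazing view angle, e.g. looking along a wet street
    for (const Material& m : {wet_asphalt, frosted_glass}) {
        float kr      = fresnel_schlick(cos_theta, m.ior);        // reflected share
        float refract = (1.0f - kr) * m.transparency;              // transmitted share
        float diffuse = (1.0f - kr) * (1.0f - m.transparency);     // scattered/absorbed share
        std::printf("%-14s reflect %3.0f%%  refract %3.0f%%  diffuse %3.0f%%  reflection blur ~%2.0f deg\n",
                    m.name, 100 * kr, 100 * refract, 100 * diffuse, m.roughness * 90.0f);
    }
    return 0;
}
```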
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Denial:

I mean, I think that's showcased more because it's the one thing that looks like a visible improvement. Look at the Call of Duty photos, for example: sure, the RTX render is more accurate to real life, but if you gave me those two pictures and said "which one is more accurate" without the RTX logo, I wouldn't know - they look like the same thing, just with different shadows. Maybe if I sat there and analyzed the image for an hour I could tell, but just glancing I have no idea.
I understand that, but there are plenty of examples with night-and-day differences that aren't just a bunch of shiny surfaces. Some of the screenshots in the link you provided have very noticeable differences. You might not be able to tell what the differences are by looking at each screenshot just once for a few seconds, but compare them back and forth and it's very obvious. I think this post is a great demo of RTX: https://forums.guru3d.com/goto/post?id=5701513#post-5701513
H83:

I think that's because of the limited performance of the current RTX cards. RT may be the holy grail of real-time graphics, but Nvidia's cards are simply too weak to display its true potential, so Nvidia can only show (very) small glimpses of RT's true potential, and they have to do it in a way that games don't become interactive slideshows. Solution: let's show gamers/buyers some super realistic puddles!....
That sort of makes sense, but I think it's actually more computationally taxing to render a reflection than a hued glow reflected off an object. Remember, reflections require the GPU to basically render the same visuals twice in the same scene. Take the Dying Light 2 scene in the link Denial posted: the light is bouncing off the red paint on the generator and creating a faint red glow on the background objects. I can't imagine that would be more taxing for the GPU to render than a puddle, but it's still a very distinct difference and a better use case for RTX.
https://forums.guru3d.com/data/avatars/m/260/260855.jpg
Screenshots like these only push me further away from Nvidia right now. I don't want to subsidize the development of RTX over the next 5 years. If I'm going to put money towards it, I want a meaningful benefit for that money.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
rm082e:

Screenshots like these only push me further away from Nvidia right now. I don't want to subsidize the development of RTX over the next 5 years. If I'm going to put money towards it, I want a meaningful benefit for that money.
You're paying for the extra fps the card brings. Every card ever had some introductory tech that didn't run well on that generation and was turned off until later cards could handle it. There is absolutely nothing new about what is going on here, no matter how much sawdust you have in your ears.
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
I'm worried about implementation and the divide more than anything. If AMD and NVIDIA end up with their own exclusive, separate solutions for D3D12 (even if it all goes through DXR down to each vendor's own ray tracing back end), and Vulkan support comes through extensions, it's going to be weird even if AMD decides to support ray tracing. For it to become a standardized feature there needs to be a unified solution, though for now these early games give NVIDIA a way to showcase RTX, the areas they have focused on initially, and what might expand further as stronger hardware becomes available and the code matures - I keep forgetting the actual SDK name and suite here, but "RTX 2.0" for simplicity, kinda.

So far we've not heard what AMD might be doing with ray tracing for the consumer and gaming segment, even if Navi 20 might have some form of GPU hardware support for speeding up the effect; it could also be more related to productivity, or to fields such as audio, which I believe was also mentioned for the upcoming console generation that is probably close to being formally announced now.

Will be interesting to see how it goes. These early RTX titles will be NVIDIA-only, but for the broader continuation of ray tracing it seems unlikely devs would implement effects twice through what might be two very different APIs and instruction sets, so that's going to be a bit weird, or whatever to call it, and I can't see it overtaking the current way of doing things until there is a better standard solution. But we'll see - not that it's a huge concern for these early games showcasing a couple of nice effects, and this is still just the start of it, which is important too.

EDIT: Meh, seems I kinda mentioned that in the previous post already. Hi there, brain busy with some other important stuff as usual, eh? (Yeah, something like that.)
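For what it's worth, the D3D12 side of that is already vendor-neutral at the API level. A minimal sketch (Windows/MSVC assumed, error handling trimmed) of the DXR capability check an engine does before turning any of this on, regardless of whose GPU is installed:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("no D3D12 device available\n");
        return 1;
    }

    // Ask the runtime whether the installed driver/GPU exposes a ray tracing tier,
    // rather than talking to an NVIDIA- or AMD-specific API.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    bool dxr = SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                     &opts5, sizeof(opts5)))
               && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    std::printf("DXR %s on this device/driver\n", dxr ? "supported" : "not supported");
    return 0;
}
```

The Vulkan side was messier at the time, since the available ray tracing extension (VK_NV_ray_tracing) was vendor-specific, which is exactly the kind of divide being described.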
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
schmidtbag:

As if I didn't need one more reason to dislike Huang... If he left Nvidia, I'd most likely buy the next GPU he wasn't involved in. I avoid buying Nvidia almost entirely out of principle, because of him. I think they make a better product but I'm not willing to fuel his ego.
If he left Nvidia, progress in PC graphics would hit a wall.
https://forums.guru3d.com/data/avatars/m/63/63215.jpg
As long as all the big games implement it, I don't care. The 3000 series is going to be optimised for RTX games. I can't wait to see how Nvidia balances performance on their next-gen cards.
https://forums.guru3d.com/data/avatars/m/260/260855.jpg
Astyanax:

You're paying for the extra fps the card brings. Every card ever had some introductory tech that didn't run well on that generation and was turned off until later cards could handle it. There is absolutely nothing new about what is going on here, no matter how much sawdust you have in your ears.
Comparing the 5700 XT and the 2070 Super, I don't see enough extra fps to justify the added price of the 2070 Super. That was my point. The 5700 XT looks like a better value right now.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
And then you factor in the drivers and what you want to do with your system.

Custom AA hacks, emulation and legacy OpenGL gaming: Nvidia is king.
Surround gaming: AMD has an edge here.
God forbid... SLI: Nvidia is the only one showing up to the multi-GPU party lately (AMD discontinued CrossFire for new boards and went DX12 mGPU only).
Streaming and encoding: if you ignore the half-arsed vendor-provided apps, both are pretty much on par, and I don't believe either supports AV1 yet.

Pick based on your needs, but effectively the price difference is VCR vs DVD, old tech vs new tech, and AMD can't force Nvidia's hand to bring prices down until they have RT either.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
Fox2232:

I would not say that the problem is adaptation speed. I would say the problem from the start was that the HW did not have anywhere near the compute strength required to do meaningful RT.

As for the term "nGreedia", it is understandable. When nVidia introduced RTX I wrote about this being a market-shift strategy for the next 10 years or more. And people are not doing themselves any service by cheering for RTX here or anywhere else! Why? Simply because GPU power would have become a non-limiting factor at 4K if nVidia had taken Pascal and scaled it to Turing's 18.6B transistors. Yes, the 1080 Ti is only an 11.8B-transistor GPU. In other words, if nVidia decided to sell 18.6B-transistor Pascals, people would have no need to upgrade in the next ~5 years, even at 4K. So they did a paradigm shift, reset performance at all resolutions for this "new" thing, and people are going to chase fps again, and will be upgrading at much more regular intervals. Why? Because RT demands can be several orders of magnitude higher at mere 1080p than they are now in 4K Battlefield, and it will take a few clicks to increase those demands. RT is a bottomless pit that gobbles any kind of performance.

Going forward, improvements to GPU performance will be relatively big. But even doubling RT performance per generation at the same price points will not result in happy gamers. Quite the contrary: new RT settings for a game from a given year will put last-generation cards on their knees, since their RT capability is only half. Top cards will be almost useless for RT by the time two more GPU generations are released. People should imagine that nVidia delivers 4x the RT performance with the next generation: how will RT games made for those new cards run on first-generation Turing?

And what other option is there? nVidia brings smaller RT performance improvements per generation, like 50%. Top cards from the last generation will survive through the next, but the time it takes for RT to actually become a good IQ improvement turns into half a decade or more.

Either way it is a great milking cow. A thing like "enough RT" does not and will not exist in our realm of discrete GPUs. But "enough rasterization" is a very real thing, because sucking power out of a GPU takes actual effort on the developer's side: much more complex shader code, additional effects, post-processing, ... This entire thing is planned obsolescence level 9000 + 1.
I think you hit the nail on the head. Not that there wasn't a reset when 4K became a thing - even the high-end cards are still chasing that and will be for years - RT is just worse. I think it will be 5 years at minimum before RT is really usable. As for AMD and what @Astyanax said about RT: AMD obviously has it, they're just not implementing it for a reason, and I don't expect them to until after the next PS5/Xbox comes out, seeing as it's claimed they both will have it and both are AMD hardware. By the time those are released the hit might not be so massive, maybe.