NVIDIA confirmed that five titles will feature DLSS 3.0 within the next week.

One thing that's interesting, just as a side note, is how Nvidia has captured what made AMD the pick a few years ago: buying something that "gets better in the long run, once the drivers mature" and so on. These days it's actually DLSS that "keeps getting better" with more games and more time. Or that's the impression their marketing leaves on me, anyway, since DLSS 3 isn't available in a single game I'd actually want to play right now.
fantaskarsef:

One thing that's interesting, just as a side note, is how Nvidia has captured what made AMD the pick a few years ago: buying something that "gets better in the long run, once the drivers mature" and so on. These days it's actually DLSS that "keeps getting better" with more games and more time. Or that's the impression their marketing leaves on me, anyway, since DLSS 3 isn't available in a single game I'd actually want to play right now.
Do you even play games anymore? That's the question 😛
nizzen:

Do you even play games anymore? That's the question 😛
Super People - yet another battle royale shooter
Justice ‘Fuyun Court’ - a graphics demo...
Loopmancer - a platformer
Flight Simulator - what it says, nomen est omen
F1 - a racing game
A Plague Tale: Requiem - a story-driven game

Of this particular list of games getting DLSS 3, I'm pretty sure the only one I'm interested in, if at all, is A Plague Tale's second game. I still need to play the first one, but yeah, I'm not the target group for racing and simulation games, platformers, or the 16th battle royale shooter, and Fuyun Court isn't even a real game 😀
Fortunately, game studios don't need to hurry with DLSS 3.0. It's not like the 4090 needs it with all the horsepower it's got.
Kaarme:

Fortunately, game studios don't need to hurry with DLSS 3.0. It's not like the 4090 needs it with all the horsepower it's got.
True, but Nvidia's portfolio is bigger than just the 4090, and I'm fairly sure they want to sell some cards like the 4060 down the road as well. Entry-level and midrange cards like those make good use of DLSS, imho.
They need to fix the stuttering in F1 22 first before adding more stuff.
Nvidia, keep your bullshit to yourself and go back to making strong cards and drivers like in the old days.
Off to a bit of a rough start with DLSS 3. In general, it doesn't matter what the potential is for the technology if it requires developers to put in extra work to implement it, especially if it only applies to a single vendor's hardware. I'd say the only truly successful Nvidia-specific technology was CUDA, and that's because they made it sooo much better than the alternative that it became the only obvious choice for most developers. So long as Nvidia locks technologies to their platform, we're never going to see a lot of use-cases for them.
schmidtbag:

we're never going to see a lot of use-cases for them.
You won't, but the rest do.
Krizby:

You won't, but the rest do.
"the rest" meaning who and based on what evidence? Look at all of Nvidia's exclusive technologies and tell me how many of them found wide adoption. Again, other than CUDA, I can't think of a single one that was implemented by the majority of the target audience. GPU-accelerated PhysX, OptiX, G-sync, DLSS, SLI, etc - they all barely made a dent. Obviously it's even worse for AMD, which is probably why they gave up trying to make their own exclusive features since I think TressFX.
schmidtbag:

"the rest" meaning who and based on what evidence? Look at all of Nvidia's exclusive technologies and tell me how many of them found wide adoption. Again, other than CUDA, I can't think of a single one that was implemented by the majority of the target audience. GPU-accelerated PhysX, OptiX, G-sync, DLSS, SLI, etc - they all barely made a dent. Obviously it's even worse for AMD, which is probably why they gave up trying to make their own exclusive features since I think TressFX.
LOL, every time you want to downplay something new, you just call it "niche"; it gets tiring, you know. Well, whatever: any new tech Nvidia introduces will see more adoption than the competitors', and that's all that matters. I couldn't care less what the majority of people in the world own or do anyway.
loracle:

Nvidia, keep your bullshit to yourself and go back to making strong cards and drivers like in the old days.
What are you talking about? The drivers are rock solid. Maybe your OS needs a fresh install.
Krizby:

LOL, every time you want to downplay something new, you just call it "niche"; it gets tiring, you know.
Every time anything slightly negative is said about Nvidia, you get butthurt. It gets tiring, y'know. These features, including DLSS 3.0, are not niche; to be niche means appealing to a narrow market. Nvidia's technologies are desirable even to the haters, and most of them would improve most games. The problem is how they're implemented, and in turn, that they aren't implemented. Each of the technologies I mentioned yields great results or is a game-changer (literally), but that doesn't matter if only a select few titles implement them. Time is money, and game developers see more profit in optimizing a game for more platforms than in implementing vendor-exclusive features.
Well, whatever: any new tech Nvidia introduces will see more adoption than the competitors', and that's all that matters. I couldn't care less what the majority of people in the world own or do anyway.
I really can't comprehend your logic there - in what way does it matter that their adoption rate is higher than the competitors' if it's still crappy? And you say that as though the competitors should see less adoption of their technologies, as in, Nvidia monopolizing desirable features is an ideal world to you. In any case, it does matter, because you're paying extra for it and these features distract from further optimizing the drivers. Nvidia could easily knock AMD out of the market if they lowered their prices by spending less time and money on exclusive features, especially ones that could take a more open approach and/or don't require developer/engineer intervention. A bit ironic, of course, because in such a world those features would be accessible to pretty much everyone.
schmidtbag:

Every time anything slightly negative is said about Nvidia, you get butthurt. It gets tiring, y'know. These features, including DLSS 3.0, are not niche; to be niche means appealing to a narrow market. Nvidia's technologies are desirable even to the haters, and most of them would improve most games. The problem is how they're implemented, and in turn, that they aren't implemented. Each of the technologies I mentioned yields great results or is a game-changer (literally), but that doesn't matter if only a select few titles implement them. Time is money, and game developers see more profit in optimizing a game for more platforms than in implementing vendor-exclusive features.
Let me ask you then: MS owns the majority of the OS market share, yet only 25% of users have moved to Windows 11. How can Nvidia introduce a new technology and have everyone adopt it in a short time? You are asking for the impossible and acting disappointed when it never happens. Heck, even DX12 is still a niche API, and it came out 7 years ago; Vulkan is even more niche despite being the "superior" API.
schmidtbag:

In any case, it does matter, because you're paying extra for it and these features distract from further optimizing the drivers. Nvidia could easily knock AMD out of the market if they lowered their prices by spending less time and money on exclusive features, especially ones that could take a more open approach and/or don't require developer/engineer intervention. A bit ironic, of course, because in such a world those features would be accessible to pretty much everyone.
You don't understand what you're talking about. Nvidia doesn't want to kill off RTG, because that would put Nvidia under US antitrust scrutiny; instead, Nvidia and RTG will maintain a duopoly with healthy profit margins for both. What you're advocating is a monopoly, and a monopoly kills off any incentive for technological advancement.
Krizby:

Let me ask you then: MS owns the majority of the OS market share, yet only 25% of users have moved to Windows 11. How can Nvidia introduce a new technology and have everyone adopt it in a short time? You are asking for the impossible and acting disappointed when it never happens.
Your logic is getting increasingly broken, because the two can't be compared... at all. Games will run just fine on either W10 or W11. The vast majority of games will even work on W7, just maybe not to their full potential. Pretty much the only reason Windows has a large userbase is compatibility of software, hardware, and features, and MS actually deliberately breaks some of that compatibility by making specific features available only in newer versions of Windows. This is very different from what Nvidia is doing: unlike MS, Nvidia implements features people actually want. While some of Nvidia's features could be backported (and sometimes are), many of them can't be, because the hardware literally can't do it. I think the GTX 1000 series getting DXR support was Nvidia trying to prove a point about this.

New technologies can be adopted rather quickly, and Nvidia themselves have succeeded at that. The way to do it is to either:
A. Make it so compelling and easy to implement that it becomes the obvious or perhaps the only viable option (like with CUDA).
B. Integrate it with a widely used platform that competitors can still use, if they can figure out how (like with DXR).
C. Make it either open source or an open standard (like with Freesync*).
D. Make it so existing software can use the feature without further modifications or updates (like with FSR).
And depending on what the thing is, you need more than one of these.

Nvidia tends to make their technologies proprietary, closed-source, or heavily dependent upon their drivers/hardware. AMD tends to undermine Nvidia's coolest technologies with something that doesn't yield results as good but appeals to a wider market (whether that means platform-agnostic, royalty-free, or not requiring developers to do more work).

* Adaptive sync monitors are a niche product, but Freesync was quickly adopted since it was compatible across platforms and royalty-free.

TL;DR: It's not impossible, but Nvidia can't have their cake and eat it too. I get it - they put a lot of R&D into their features and they do seem to legitimately care about improving the user experience, so they deserve credit for that. But all they have to do is design their architecture to favor whatever the new feature is. That's often how the CPU market works.
Heck, even DX12 is still a niche API, and it came out 7 years ago; Vulkan is even more niche despite being the "superior" API.
DX12 isn't niche anymore. It took a while to be adopted because a lot of people refused to use W10, and DX12 was deliberately made incompatible with W7. Vulkan is niche because, from what I understand, it's harder to implement (but still easier than OpenGL). Meanwhile, Apple used their own API (Metal), and it took a while for mobile chips to support Vulkan. Since DX12 is easier and applies to a much wider audience, Vulkan just didn't have much of a chance to grow.
You don't understand what you're talking about. Nvidia doesn't want to kill off RTG, because that would put Nvidia under US antitrust scrutiny; instead, Nvidia and RTG will maintain a duopoly with healthy profit margins for both.
*sigh* I know that... My point is Nvidia could do that.
schmidtbag:

Nvidia tends to make their technologies proprietary, closed-source, or heavily dependent upon their drivers/hardware. AMD tends to undermine Nvidia's coolest technologies with something that doesn't yield results as good but appeals to a wider market (whether that means platform-agnostic, royalty-free, or not requiring developers to do more work).

* Adaptive sync monitors are a niche product, but Freesync was quickly adopted since it was compatible across platforms and royalty-free.
Nvidia wants to maintain its premium brand recognition, which means its products are guaranteed to work as intended. G-sync and G-sync Compatible monitors are guaranteed to work with Nvidia GPUs, whereas Freesync branding on a monitor means nothing (since AMD doesn't do any testing). For example, LG OLED TVs have Freesync VRR branding, yet VRR doesn't work on the AMD RX 6000 series (another one from guru3d); what a joke. Let's say DLSS 3 could be made to work on current hardware, but either there's no benefit (no FPS gain) or the results are so bad they would tarnish the Nvidia brand - why would Nvidia ever allow that?
Krizby:

Nvidia wants to maintain its premium brand recognition, which means its products are guaranteed to work as intended. G-sync and G-sync Compatible monitors are guaranteed to work with Nvidia GPUs, whereas Freesync branding on a monitor means nothing (since AMD doesn't do any testing). For example, LG OLED TVs have Freesync VRR branding, yet VRR doesn't work on the AMD RX 6000 series; what a joke.
Except Nvidia didn't have to make G-Sync for that scenario to happen; at the very least, it didn't need special hardware. AMD does a poor job of guaranteeing results for FreeSync, but nothing is preventing them from doing so other than laziness/cheapness/negligence. Nvidia has the ability to use generic VESA AdaptiveSync (with no stupid extra hardware) and still be the more appealing option, by running a scrutinizing certification program so the consumer knows with confidence which displays will work correctly with Nvidia products. It might cost a little extra, but at least it isn't vendor-locked. The evidence for this is that there are FreeSync displays that do have good results.

And that's where Nvidia can have their cake and eat it too: they have the means to push for a technology, make it an open or royalty-free standard, and make it easily adoptable, yet still be the best choice. Generally speaking, you can take away all the extra features and Nvidia still has the better overall platform, at least on release day. For years they've been the preferred choice, not because of all the seldom-used features, but because they offered a more stable and performant product. There is nothing preventing Nvidia from continuing to do this with other technologies.

I think Nvidia approached DXR perfectly: it's a cool technology that they managed to pioneer, and they didn't vendor-lock it. They didn't need to, because their implementation was so much better that it became the obvious choice. It works in Nvidia's favor to let a technology be platform-agnostic, because it increases the chances of it being adopted, and when something becomes widely adopted but one vendor handles it consistently better, that makes the vendor's product look more appealing. In other words, which do you think is the better situation to be in:
A. Having a list of exclusive but seldom-used features.
B. Having a list of widely-used features where one vendor is a lot better at implementing them.
Let's say DLSS 3 could be made to work on current hardware, but either there's no benefit (no FPS gain) or the results are so bad they would tarnish the Nvidia brand - why would Nvidia ever allow that?
Ironically, the reality is the opposite: things like the frame generation feature would keep outdated products appealing a little while longer, thereby holding back new sales. But for the sake of argument, let's go with your example: as I previously mentioned, Nvidia already did this by allowing raytracing support on the GTX 1000 series. It performs so badly that you might as well just turn it off. It didn't tarnish the brand at all*; it got people to realize that the RT cores aren't a gimmick and that there is a reason to upgrade.

* People hellbent on hating Nvidia would say it tarnished the brand, but such people also think raytracing is pointless. They're also the same people who exaggerate how much fidelity DLSS 2.0 loses. They don't count.
Espionage724:

https://www.pcgamer.com/dlss-3-on-older-GPUs/
It sounds like DLSS 3 will work on older GPUs, but without the frame interpolation feature?
DLSS 3 is three different technologies: Super Resolution, Frame Generation, and NVIDIA Reflex. Super Resolution and Reflex will work on older GPUs.
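As a rough illustration of that split (a minimal sketch only - the struct, function name, and architecture strings below are hypothetical and not the actual NVIDIA NGX/Streamline API), an engine could gate the three components per GPU generation like this:

```cpp
// Hypothetical capability check, not NVIDIA's real API: gates the three DLSS 3
// components described above by GPU architecture.
#include <cstdio>
#include <string>

struct DlssFeatureSet {
    bool superResolution;  // DLSS upscaling - available on any RTX GPU
    bool frameGeneration;  // assumed Ada-only (RTX 40 series optical flow hardware)
    bool reflex;           // latency reduction - also works on older GeForce GPUs
};

// "architecture" stands in for whatever GPU query a real engine would use.
DlssFeatureSet queryDlss3Support(const std::string& architecture) {
    DlssFeatureSet f{};
    const bool isRtx = architecture == "Turing" || architecture == "Ampere" ||
                       architecture == "Ada";
    f.superResolution = isRtx;                    // Super Resolution: any RTX card
    f.frameGeneration = (architecture == "Ada");  // Frame Generation: RTX 40 only
    f.reflex = true;                              // Reflex: not limited to RTX 40
    return f;
}

int main() {
    for (const std::string arch : {"Ampere", "Ada"}) {
        const DlssFeatureSet f = queryDlss3Support(arch);
        std::printf("%s: SuperResolution=%d FrameGeneration=%d Reflex=%d\n",
                    arch.c_str(), f.superResolution, f.frameGeneration, f.reflex);
    }
    return 0;
}
```

In a real integration the per-feature flags would come from the vendor SDK's own capability queries rather than a hard-coded architecture name.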
schmidtbag:

Except Nvidia didn't have to make G-Sync for that scenario to happen; at the very least, it didn't need special hardware. AMD does a poor job of guaranteeing results for FreeSync, but nothing is preventing them from doing so other than laziness/cheapness/negligence. Nvidia has the ability to use generic VESA AdaptiveSync (with no stupid extra hardware) and still be the more appealing option, by running a scrutinizing certification program so the consumer knows with confidence which displays will work correctly with Nvidia products. It might cost a little extra, but at least it isn't vendor-locked. The evidence for this is that there are FreeSync displays that do have good results.

And that's where Nvidia can have their cake and eat it too: they have the means to push for a technology, make it an open or royalty-free standard, and make it easily adoptable, yet still be the best choice. Generally speaking, you can take away all the extra features and Nvidia still has the better overall platform, at least on release day. For years they've been the preferred choice, not because of all the seldom-used features, but because they offered a more stable and performant product. There is nothing preventing Nvidia from continuing to do this with other technologies.

I think Nvidia approached DXR perfectly: it's a cool technology that they managed to pioneer, and they didn't vendor-lock it. They didn't need to, because their implementation was so much better that it became the obvious choice. It works in Nvidia's favor to let a technology be platform-agnostic, because it increases the chances of it being adopted, and when something becomes widely adopted but one vendor handles it consistently better, that makes the vendor's product look more appealing. In other words, which do you think is the better situation to be in:
A. Having a list of exclusive but seldom-used features.
B. Having a list of widely-used features where one vendor is a lot better at implementing them.

Ironically, the reality is the opposite: things like the frame generation feature would keep outdated products appealing a little while longer, thereby holding back new sales. But for the sake of argument, let's go with your example: as I previously mentioned, Nvidia already did this by allowing raytracing support on the GTX 1000 series. It performs so badly that you might as well just turn it off. It didn't tarnish the brand at all*; it got people to realize that the RT cores aren't a gimmick and that there is a reason to upgrade.

* People hellbent on hating Nvidia would say it tarnished the brand, but such people also think raytracing is pointless. They're also the same people who exaggerate how much fidelity DLSS 2.0 loses. They don't count.
There are additional features that come with the dedicated G-sync module, like the Ultra Low Motion Blur mode back when it launched in 2014, that AdaptiveSync can't replicate; newer G-sync monitors also have the Reflex analyzer. These features are really only worth it for esports gamers (G-sync Ultimate features). If you think AMD is playing nice, how about Smart Access Memory - did AMD release any source code for that tech? Nope, Nvidia and Intel figured out on their own that it's Resizable BAR, which had existed in the PCIe spec all along. AMD only open-sources their tech after Nvidia has introduced it first (like FSR). In the end, all these decisions are for the good of their own respective companies. Well, for proprietary software, DLSS sure turned out popular enough that almost all new AAA games come with DLSS support. Surely devs must have realized that the majority of their target audience has RTX GPUs.
schmidtbag:

"the rest" meaning who and based on what evidence? Look at all of Nvidia's exclusive technologies and tell me how many of them found wide adoption. Again, other than CUDA, I can't think of a single one that was implemented by the majority of the target audience. GPU-accelerated PhysX, OptiX, G-sync, DLSS, SLI, etc - they all barely made a dent. Obviously it's even worse for AMD, which is probably why they gave up trying to make their own exclusive features since I think TressFX.
Meanwhile, with the free-for-all FSR 2.0, most of the work is being done by modders, not developers, and it needs the DLSS DLL to work in the first place.