Cyberpunk 2077 NVIDIA DLSS 3 Update Is Out Now

Would be cool to get FSR 2.2 with less ghosting too...
RTX 3000 maybe dead
It's frustrating that the units enabling this have existed in cards since Turing, and I've never understood why this isn't possible, even with a two-frame delay, on any card out there.
PrMinisterGR:

It's frustrating that the units enabling this have existed in cards since Turing, and I've never understood why this isn't possible, even with a two-frame delay, on any card out there.
I've had raging arguments on Reddit telling people this exact same thing, but they won't have it. They just start raging hard and shilling for Nvidia. AMD proved Nvidia fanboys wrong with FSR, and more importantly FSR 2.x, showing that upscaling can be done without bespoke hardware and can have excellent results (à la FSR 2). TVs have been doing black frame insertion for years now on chips that are insanely low-power. Consoles and game engines have had chequerboard rendering for nearly a decade too. But still Nvidia fanboys will defend Nvidia at every chance they get. I hope AMD is working hard on their FSR 3 implementation and I hope to god they open it up to more GPUs like they have with FSR 1 & 2. I hope Intel does the same, but they are in a mess at the moment from their earnings call... yikes...
CPC_RedDawn:

I hope AMD is working hard on their FSR 3 implementation and I hope to god they open it up to more GPUs like they have with FSR 1 & 2.
Yet you buy the most expensive Nvidia card there is. Nuff said.
aufkrawall2:

Yet you buy the most expensive Nvidia card there is. Nuff said.
My 4090 was one of the "cheapest" models there is, and I can spend my money on whatever I want. That doesn't mean I can't say anything negative about Nvidia; there was no contract inside the box when I got the card. I had a 7900XTX but suffered from the heatsink issue and had to return it. There were no AIB 7900XTX cards in stock and I wasn't going to buy a 4080, so I just caved and got a 4090. If I'd had no issues with the 7900XTX I would still own that card, as its performance was excellent. My argument would actually give you more features on your 3060 and make my own purchase look stupid; how can you not see that?
aufkrawall2:

Yet you buy the most expensive Nvidia card there is. Nuff said.
I have the 4090 too, but it doesn't mean I'm rooting for Nvidia. We need a competitive AMD to keep giving us great alternatives and to keep Nvidia from creating a monopoly. I hope AMD embarrasses Nvidia like they did when they released FreeSync, which today has made G-Sync all but irrelevant.
PrMinisterGR:

It's frustrating that the units enabling this have existed in cards since Turing, and I've never understood why this isn't possible, even with a two-frame delay, on any card out there.
You had me curious, so I looked into it, and there is indeed support for optical flow going back to Turing, but under the hood the capabilities have evolved: https://docs.nvidia.com/video-technologies/optical-flow-sdk/nvofa-application-note/ Ampere might be able to get some form of frame generation going, while Turing has capability limitations. Once frame generation for Ada is robust enough, they might release a DLSS 2.5 with frame gen for Ampere? Who knows.
CPC_RedDawn:

My argument would actually give you more features on your 3060 and make my own purchase look stupid, how can you not see that?
And what if it was as useless as XeSS on non-Intel dGPUs? Since Nvidia's doc pretty much indicates that:
In the NVIDIA Turing and NVIDIA Ampere architecture generation GPUs, most of these algorithms use a compute engine to perform the required tasks. As a result, when the compute engine workload is high, the performance of the NVIDIA Optical Flow Accelerator (NVOFA) could be affected. On NVIDIA Ada-generation GPUs, most of these algorithms are moved to dedicated hardware within the NVOFA, reducing the dependency on the compute engine significantly. In addition, NVIDIA Ada-generation GPUs bring several other optimizations related to reducing the overhead of interaction between driver and hardware. This increases the overall performance and context switches between various hardware engines on the GPU. With these changes, the speed of the NVIDIA Ada Lovelace architecture NVOFA is improved ~2x compared to the NVIDIA Ampere architecture NVOFA.
https://developer.nvidia.com/blog/harnessing-the-nvidia-ada-architecture-for-frame-rate-up-conversion-in-the-nvidia-optical-flow-sdk/ No, I don't want uselessly slow features on my 3060. More DLSS 2 improvements are much more welcome, and I'm still getting those (so far), just as I'm getting Reflex in more titles, so Ampere and Turing are aging much better than RDNA 1 & 2.
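To make the compute-versus-dedicated-hardware point concrete: optical-flow frame generation boils down to estimating motion between two frames and warping halfway along it. The following is a toy NumPy sketch of that idea, not the NVOFA API; the function names are made up, and it only estimates a single global translation via phase correlation, whereas real frame generation needs a dense per-pixel flow field, which is exactly the expensive part Ada offloads to dedicated hardware instead of the compute engine.

```python
import numpy as np

def estimate_global_flow(prev, nxt):
    """Estimate a single (dy, dx) translation from prev to nxt via phase
    correlation. A toy stand-in for the dense per-pixel flow field that a
    hardware optical-flow unit (like NVIDIA's NVOFA) produces."""
    f1 = np.fft.fft2(prev)
    f2 = np.fft.fft2(nxt)
    # Normalized cross-power spectrum: its inverse FFT peaks at the shift.
    cross = f2 * np.conj(f1)
    cross /= np.abs(cross) + 1e-9
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap circular shifts larger than half the frame into negative motion.
    h, w = prev.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def generate_midframe(prev, nxt):
    """Synthesize an intermediate frame by warping prev halfway along the
    estimated motion, the core idea behind optical-flow frame generation.
    Real implementations (e.g. DLSS 3) use dense per-pixel flow plus game
    motion vectors and an AI network to handle occlusions."""
    dy, dx = estimate_global_flow(prev, nxt)
    return np.roll(prev, (int(round(dy / 2)), int(round(dx / 2))), axis=(0, 1))
```

The sketch handles none of the hard cases (rotation, parallax, disocclusion), which is where the AI reconstruction in DLSS 3 comes in; it only shows why motion estimation quality and cost sit at the heart of the Turing/Ampere/Ada debate above.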
aufkrawall2:

And what if it was as useless as XeSS on non-Intel dGPUs? Since Nvidia's doc pretty much indicates that: https://developer.nvidia.com/blog/harnessing-the-nvidia-ada-architecture-for-frame-rate-up-conversion-in-the-nvidia-optical-flow-sdk/ No, I don't want uselessly slow features on my 3060. More DLSS 2 improvements are much more welcome, and I'm still getting those (so far), just as I'm getting Reflex in more titles, so Ampere and Turing are aging much better than RDNA 1 & 2.
Look, it's entirely possible that FSR 3 frame gen is terrible on Nvidia, or just terrible altogether, but using that link as a reason is kind of moot. I assume AMD won't be using the NVOFA in their implementation and will probably have to find ways to solve the compute issue. Kind of reminds me of HairWorks, where Nvidia built a ton of work into a geometry simulation of hair because A) their hardware was good at geometry and B) it used a bunch of technology they already had... then AMD came along and arguably did it better with TressFX, which abandoned geometry processing entirely in favor of a pure vertex solution. Now I'm not saying AMD will come out with some non-compute variant of frame generation, but it's entirely possible they take a completely different approach and it's successful. And it's not like it costs you anything; they need it for their own architecture, so what's the downside for us?
aufkrawall2:

Yet you buy the most expensive Nvidia card there is. Nuff said.
Pointless comment.
"I have the 4090 too but it doesn't mean I'm rooting for Nvidia" ...oh yeah, that was the best joke of the day here.
Denial:

Now I'm also not saying AMD comes out with some non-compute variant of frame generation but it's entirely possible they take a completely different approach and it's successful. And it's not like it costs you anything - they need it for their own architecture, so what's the downside for us?
It is not realistic to expect Nvidia not to push features for which they invested dedicated transistors.
pegasus1:

Pointless comment.
Another hypocrite.
aufkrawall2:

It is not realistic to expect Nvidia not to push features for which they invested dedicated transistors.
I'm not saying Nvidia shouldn't push its own technologies; in fact I give Nvidia credit for doing so, because it forces competition. I'm just saying it's entirely within the realm of possibility to develop said features without said hardware, which AMD has shown multiple times: TressFX, FreeSync and FSR are all examples of this. And, as an Nvidia user, there's no downside to AMD developing those technologies; it's just a bonus for us.
aufkrawall2:

Another hypocrite.
I don't get how any of this makes them hypocrites. You can simultaneously buy things and advocate for others to get said technologies without being hypocritical lol..
aufkrawall2:

Another hypocrite.
You're being ridiculous, and accomplishing nothing by saying this stuff. Would you rather they just sing Nvidia's praises? He had the cash and wanted a video card now rather than waiting an unknown amount of time. That's not unreasonable; I'm still waiting on my 7900XTX because seemingly not a single AIB model has been shipped to Canada yet, despite my ordering over a month ago, not long after launch.
I thought the same, bought AMD, even went Linux for the most part. It doesn't work, or it's severely inferior (as FSR 3 will be), and I don't need to be bothered with this in my leisure time. It's really a nerd issue that points to a lack of more important things to care about.
Well, after waiting for months I've finally tried it, and I'm sad to say I don't like it. DLSS 3 works really well in Portal RTX: it turns 25 fps into something playable, and because the game doesn't require millisecond reflexes, it stays responsive enough for a decent experience. I had hoped that with Cyberpunk, already running at 85 fps, the experience would be enhanced by DLSS 3, but for whatever reason it isn't. It's hard to describe; it just feels a bit strange when enabled. I also don't get the feeling of an improved framerate like with Portal RTX.
I am 100% sure DLSS 3.0 could work on the 3000 series; NGreedia simply didn't turn it on so people would buy the overpriced 4000 series.
half_empty_soul:

I am 100% sure DLSS 3.0 could work on the 3000 series; NGreedia simply didn't turn it on so people would buy the overpriced 4000 series.
Maybe, but you could say that about literally anything... So I think it's better to show evidence; that way we avoid a bunch of posts like this for every single piece of technology that comes out with every single card ever.
CPC_RedDawn:

My 4090 was one of the "cheapest" models there is, and I can spend my money on whatever I want. That doesn't mean I can't say anything negative about Nvidia; there was no contract inside the box when I got the card. I had a 7900XTX but suffered from the heatsink issue and had to return it. There were no AIB 7900XTX cards in stock and I wasn't going to buy a 4080, so I just caved and got a 4090. If I'd had no issues with the 7900XTX I would still own that card, as its performance was excellent. My argument would actually give you more features on your 3060 and make my own purchase look stupid; how can you not see that?
Very good comment. Just because we buy products from companies doesn't mean we should agree with everything those companies do. We can like the products, and even the companies, and still be critical of certain decisions they make.