NVIDIA Demos Marvel's Spider-Man Remastered at 4K / 200 FPS with DLSS 3

kanenas:


[two screenshots attached]
There is definitely some fuzz around the character in motion, easy to spot even in that video.
The multi-frame generation is AI-based; it looks like it needs more time in the oven at the moment.
Denial:

The Digital Foundry video shows significantly more artifacting. Pause at 1:33+ in this video: [youtube=qyGWFI1cuZQ]
https://i.imgur.com/35JT7Sq.png - DLSS3
https://i.imgur.com/Ht39bOS.png - NoDLSS
https://i.imgur.com/pYFeUmk.png
Need to see a real review of the tech from independent sources, imo.
I remember when people used to care about image quality, even going as far as comparing image quality between driver versions (which even became a meme). But now it seems like the artifacting and visual degradation introduced by all these upscaling techniques is widely accepted, because everyone's eyes are glued to the FPS counter and not the actual game.
Glottiz:

I remember when people used to care about image quality, even going as far as comparing image quality between driver versions (which even became a meme). But now it seems like the artifacting and visual degradation introduced by all these upscaling techniques is widely accepted, because everyone's eyes are glued to the FPS counter and not the actual game.
Quite wrong. I compare IQ with Nvidia ICAT all the time to find the best balance between FPS and IQ (mainly for choosing between DLSS modes) for the best gaming experience. If you want to stick to native, best to stick with a 1440p screen; meanwhile, 4K DLSS on a 4K screen offers superior IQ and even better FPS.
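For anyone who wants to run that kind of side-by-side check without ICAT, here is a minimal sketch of the idea in Python, assuming two same-resolution screenshots captured from the game (the file names are hypothetical). It computes PSNR as a crude stand-in for the visual comparison ICAT is designed for:

```python
# Crude image-quality check between two same-size captures, e.g. native vs DLSS.
# Only a rough stand-in for inspecting frames side by side in a tool like ICAT.
import numpy as np
from PIL import Image

def psnr(path_a: str, path_b: str) -> float:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 20 * np.log10(255.0 / np.sqrt(mse))

# Hypothetical capture file names; a higher PSNR means closer to the native frame.
print(psnr("native_4k.png", "dlss_quality_4k.png"))
print(psnr("native_4k.png", "dlss_performance_4k.png"))
```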
Glottiz:

I remember when people used to care about image quality, even going as far as comparing image quality between driver versions (which even became a meme). But now it seems like the artifacting and visual degradation introduced by all these upscaling techniques is widely accepted, because everyone's eyes are glued to the FPS counter and not the actual game.
Yeah, I kind of agree, but I think it's inevitable. Getting real performance gains from "brute force" hardware is becoming increasingly difficult. As we see with this gen, power requirements are starting to massively increase for the same die area as previous gens. Dennard scaling is basically dead. So what do you do? Do you just keep increasing power requirements? Or do you start researching new ways to get jumps that maybe start off poorly but get better over time? I prefer the latter. I'd imagine in a few years the Super Resolution part of DLSS (DLSS 2) will basically be flawless. It's already massively improved since DLSS 2 launched. And maybe a few years after that the interpolation feature will also be flawless. I just think you need to start somewhere and tweak/improve as you go.

My issue is that Nvidia likes to pretend these technologies are perfect from launch. The official Spider-Man trailer barely shows any artifacts, this one shows a little more, and the 3rd-party preview (Digital Foundry), who most people consider Nvidia-biased, shows even more. Nvidia is also doing that thing again where it's selling its cards on the promise of DLSS 3 being in x number of games. Then you go down the list and half those games don't have release dates. The other half aren't even games. Atomic Heart is the biggest joke: it was a DLSS 1 launch title but never came out, and now it's here again.

I would rather they just admit the technology has flaws but it's the only way forward, say they're going to keep working on improving it, bring some kind of way to officially update older games with newer DLLs, blah blah. You know, be honest, instead of just being super vague and pretending like it's all perfect.
Krizby:

meanwhile, 4K DLSS on a 4K screen offers superior IQ and even better FPS.
Sure, if you've been drinking the Kool-Aid from Nvidia and their YouTube shilling channel DigitalFoundry. There are many games where DLSS is inferior, introduces unwanted artifacting, oversharpening, haloing, changes how certain effects look, etc.
Denial:

Yeah, I kind of agree, but I think it's inevitable. Getting real performance gains from "brute force" hardware is becoming increasingly difficult. As we see with this gen, power requirements are starting to massively increase for the same die area as previous gens. Dennard scaling is basically dead. So what do you do? Do you just keep increasing power requirements? Or do you start researching new ways to get jumps that maybe start off poorly but get better over time? I'd imagine in a few years the Super Resolution part of DLSS (DLSS 2) will basically be flawless. And maybe a few years later the interpolation feature will also be flawless. I just think you need to start somewhere. My issue is that Nvidia likes to pretend these technologies are perfect from launch. The official Spider-Man trailer barely shows any artifacts, this one shows a little more, and the 3rd-party preview (Digital Foundry), who most people consider Nvidia-biased, shows even more.
The one from DF is comparing Native (60 FPS) vs DLSS Performance + Frame Generation, meanwhile Nvidia is showing DLSS Performance vs DLSS Performance + Frame Generation; of course you will see fuzziness when using DLSS Performance.
Glottiz:

Sure, if you've been drinking the Kool-Aid from Nvidia and their YouTube shilling channel DigitalFoundry. There are many games where DLSS is inferior, introduces unwanted artifacting, oversharpening, haloing, changes how certain effects look, etc.
Lol, yeah sure. I can do the comparison myself, and have for quite a while too; ICAT just makes that a whole lot easier though.
Glottiz:

Sure, if you've been drinking the Kool-Aid from Nvidia and their YouTube shilling channel DigitalFoundry. There are many games where DLSS is inferior, introduces unwanted artifacting, oversharpening, haloing, changes how certain effects look, etc.
The better-IQ debate really comes down to how crap the game's TAA is; most of the time it's garbage, and something like DLAA/DLSS/FSR can handle the motion vectors and aliasing better.
Looks a bit bad. DLSS 3.0 seems more and more like snake oil, rather than a killer feature.
Horus-Anhur:

Looks a bit bad. DLSS 3.0 seems more and more like snake oil, rather than a killer feature.
I think Denial is right, it'll come out flawed like DLSS 1 was but eventually get better.
Spets:

I think Denial is right, it'll come out flawed like DLSS 1 was but eventually get better.
I wonder if we can choose the DLSS mode with DLSS 3, instead of the default DLSS Perf + Frame Generation. DLSS Quality + Frame Generation would obviously look better.
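For context on why the Quality mode should look better with frame generation on top: each DLSS preset renders at a lower internal resolution before upscaling to the output. A quick sketch using the commonly cited per-axis scale factors (treat the exact figures as approximate):

```python
# Approximate internal render resolution for each DLSS preset at a 4K output,
# using the commonly cited per-axis scale factors.
OUTPUT = (3840, 2160)
SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

for mode, s in SCALES.items():
    w, h = round(OUTPUT[0] * s), round(OUTPUT[1] * s)
    print(f"{mode:<17} {w}x{h}")
```

Performance mode therefore starts from roughly a 1080p image, which is why artifacts are easier to provoke there than in Quality mode (roughly 1440p internally).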
Spets:

I think Denial is right, it'll come out flawed like DLSS 1 was but eventually get better.
But DLSS 1 never got better. It just died and was replaced by a different paradigm with DLSS 2
Krizby:

I wonder if we can choose the DLSS mode with DLSS 3, instead of the default DLSS Perf + Frame Generation. DLSS Quality + Frame Generation would obviously look better.
I'd say you would still get to pick the quality modes; it still runs like DLSS 2 to output one image, then goes through the optical multi-frame generator for the next image.
Horus-Anhur:

But DLSS 1 never got better. It just died and was replaced by a different paradigm with DLSS 2
The technology evolved into DLSS 2 over time; you make it sound like it was completely scrapped 😀
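A minimal sketch of the pipeline described above, assuming frame generation sits after the DLSS 2-style upscale and inserts one generated frame between each pair of upscaled frames; the upscale/interpolate helpers here are hypothetical stand-ins, not NVIDIA's actual API:

```python
from typing import Iterable, Iterator, List


def upscale(frame: List[float]) -> List[float]:
    # Stand-in for DLSS 2-style super resolution (here: identity).
    return frame


def interpolate(prev: List[float], cur: List[float]) -> List[float]:
    # Stand-in for optical-flow frame generation (here: a simple average).
    return [(a + b) / 2 for a, b in zip(prev, cur)]


def present(rendered: Iterable[List[float]]) -> Iterator[List[float]]:
    previous = None
    for frame in rendered:
        current = upscale(frame)
        if previous is not None:
            yield interpolate(previous, current)  # generated in-between frame
        yield current
        previous = current


# Two rendered frames in -> three presented frames out, hence the FPS boost.
print(list(present([[0.0, 0.0], [1.0, 1.0]])))
```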
Spets:

Wonder if they used 2.4.6 😛
I just tested that out with 2.4.6
Spets:

I'd say you would still get to pick the quality modes; it still runs like DLSS 2 to output one image, then goes through the optical multi-frame generator for the next image. The technology evolved into DLSS 2 over time; you make it sound like it was completely scrapped 😀
It was scrapped in the sense that games which launched with DLSS 1 never got anything else, and they look TERRIBAD. Not that DLSS 2 is amazing - it is always a downgrade in image quality, it's just a question of how much.
Dragam1337:

It was scrapped in the sense that games which launched with DLSS 1 never got anything else, and they look TERRIBAD. Not that DLSS 2 is amazing - it is always a downgrade in image quality, it's just a question of how much.
Wasn't that Control, which was updated with 2.2.11? I really can't remember the first few games that had it. But yeah, all that said, I don't think we'll see anything special out of the optical frame generation feature until it improves.