NVIDIA Demos Marvel Spider-Man Remastered at 4k / 200 FPS with DLSS 3

https://forums.guru3d.com/data/avatars/m/248/248291.jpg
Spets:

The technology evolved into DLSS 2 over time, you make it sound like it was completely scrapped 😀
But it was scrapped. DLSS 1 used spatial reconstruction, with an AI model trained separately for each game. DLSS 2 uses temporal reconstruction, with a generalized AI model trained to work across all games.
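To make the spatial-vs-temporal distinction concrete, here is a minimal toy sketch (my own illustration with made-up function names, not NVIDIA's actual DLSS code): spatial reconstruction upscales each frame in isolation, while temporal reconstruction reprojects the previous high-res result with motion vectors and accumulates it with the new frame.

import numpy as np

# Toy sketch only: real DLSS replaces the naive steps below with trained networks.
def spatial_upscale(lowres, scale=2):
    # "DLSS 1 style": each low-res frame is handled on its own.
    # Nearest-neighbour repetition stands in for the per-game trained network.
    return np.repeat(np.repeat(lowres, scale, axis=0), scale, axis=1)

def temporal_accumulate(lowres, history, motion, scale=2, alpha=0.1):
    # "DLSS 2 style": reproject last frame's high-res history using per-pixel
    # motion vectors, then blend in the newly upscaled samples.
    # history: (h, w) array; motion: (h, w, 2) array of per-pixel offsets.
    upscaled = spatial_upscale(lowres, scale)
    h, w = history.shape
    ys, xs = np.indices((h, w))
    prev_y = np.clip(ys - motion[..., 0], 0, h - 1).astype(int)
    prev_x = np.clip(xs - motion[..., 1], 0, w - 1).astype(int)
    reprojected = history[prev_y, prev_x]
    # Exponential blend: mostly history, a little new data every frame.
    return (1 - alpha) * reprojected + alpha * upscaled

The per-game training in DLSS 1 would correspond to swapping in a different network for spatial_upscale, whereas the temporal accumulation path is the same generalized logic in every game.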
https://forums.guru3d.com/data/avatars/m/235/235224.jpg
Horus-Anhur:

But it was scrapped. DLSS 1 used spatial reconstruction, with an AI model trained separately for each game. DLSS 2 uses temporal reconstruction, with a generalized AI model trained to work across all games.
I'll see if I can find the article, but from what I've read the AI algorithm was improved to the point it could be generalised. Edit - I can't find it with a quick search. So long as the outcome is improved, they can scrap/improve the code all they want 🙂
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Horus-Anhur:

But it was scrapped. DLSS 1 used spatial reconstruction, with an AI model trained separately for each game. DLSS 2 uses temporal reconstruction, with a generalized AI model trained to work across all games.
DLSS1 becomes DLDSR 🙂
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
Krizby:

DLSS1 becomes DLDSR 🙂
Did it? I never saw anything about that. But could be. But then again, it's not DLSS 2.x
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Horus-Anhur:

Did it? I never saw anything about that. But could be. But then again, it's not DLSS 2.x
Everything has to start at 1, do you think DLSS2 just spawned out of nowhere?
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
Krizby:

Everything has to start at 1, do you think DLSS2 just spawned out of nowhere?
That is just the number. Not the tech.
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Horus-Anhur:

That is just the number. Not the tech.
sometimes evolution just takes a different route 😀
data/avatar/default/avatar07.webp
I think it's a matter of preference. Stable 4K at 120 FPS still requires extremely expensive monitors and GPUs, so for me it's rushed. DLSS is a way to get there sooner, but I don't feel the need. These cards let you play at 1440p/165 FPS, and I think that's what I prefer overall. Especially since I'm getting older, and from the sofa I can't really tell whether I'm playing at 4K or not on a large TV. Biology is saving me from crazy expenses.
https://forums.guru3d.com/data/avatars/m/220/220214.jpg
There is something wrong, I think, with paying literally thousands of euro/dollars to build a high-end PC running a two grand 4090... to just play a game like Overwatch... which was designed to run on cheap older consoles costing a few hundred... at 400FPS o_O
data/avatar/default/avatar09.webp
"reduced image quality for more fps" "come and get it now" -Nvidia 2022
https://forums.guru3d.com/data/avatars/m/260/260828.jpg
Denial:

The Digital Foundry video shows significantly more artifacting. Pause at 1:33+ in this video:
https://i.imgur.com/35JT7Sq.png - DLSS3
https://i.imgur.com/Ht39bOS.png - NoDLSS
https://i.imgur.com/pYFeUmk.png
Need to see a real review of the tech from independent sources imo
I wonder if they will say anything about this like they did with FSR in God of War
Dragam1337:

It was scrapped in the sense that games which launched with DLSS 1 never got anything else, and look TERRIBAD. Not that DLSS 2 is amazing - it is always a downgrade to image quality, it's just a question of how much.
If I'm not wrong, I think you can force DLSS 2 in DLSS 1 games
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
geogan:

There is something wrong, I think, with paying literally thousands of euro/dollars to build a high-end PC running a two grand 4090... to just play a game like Overwatch... which was designed to run on cheap older consoles costing a few hundred... at 400FPS o_O
If you play Overwatch or any other competitive game for thousands of hours, you will want to run it at 400 FPS+; higher FPS --> better competitive edge. I'm playing PUBG at 200 FPS+, still not enough 🙄
data/avatar/default/avatar21.webp
Picolete:

I wonder if they will say anything about this like they did with FSR in God of War
If I'm not wrong, I think you can force DLSS 2 in DLSS 1 games
You can't, sadly.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Krizby:

If you play Overwatch or any other competitive game for thousands of hours, you will want to run it at 400 FPS+; higher FPS --> better competitive edge. I'm playing PUBG at 200 FPS+, still not enough 🙄
Don't you have a 120Hz screen? What's the point then?
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Undying:

Don't you have a 120Hz screen? What's the point then?
Higher FPS --> lower input latency. I tried capping FPS and couldn't hit, LOL
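For context, the FPS-to-latency link is simple frame-time arithmetic; a rough back-of-the-envelope sketch (my own numbers, purely illustrative - real input latency also includes the mouse, game loop, driver queue and display):

def frame_time_ms(fps):
    # Frame interval in milliseconds at a given frame rate.
    return 1000.0 / fps

for fps in (120, 200, 400):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
# 120 FPS -> 8.3 ms per frame
# 200 FPS -> 5.0 ms per frame
# 400 FPS -> 2.5 ms per frame

So capping a 400 FPS game to 120 FPS adds roughly 5-6 ms of render latency per frame, even on a 120Hz screen that can't display every frame.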
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Krizby:

Higher FPS --> lower input latency. I tried capping FPS and couldn't hit, LOL
Then get a high refresh rate monitor and get rid of the TV you're using.
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Undying:

Then get a high refresh rate monitor and get rid of the TV you're using.
High refresh doesn't mean jack, OLED still offers the best input latency out there
oled.jpg
No way I'm buying VA shit LOL
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Krizby:

High refresh doesn't mean jack, OLED still offers the best input latency out there
oled.jpg
No way I'm buying VA crap LOL
There are some OLED monitors from Alienware, Asus and Gigabyte, 34-42" with DisplayPort. I think they should be high refresh rate.
https://forums.guru3d.com/data/avatars/m/108/108389.jpg
Undying:

There are some OLED monitors from Alienware, Asus and Gigabyte, 34-42" with DisplayPort. I think they should be high refresh rate.
138Hz is the max for 4K OLED right now (Asus PG48UQ); not sure if 18Hz higher is worth the 5000 USD. The Alienware 34in QD-OLED is 175Hz, but I'm not going back to 34in just to get a slightly higher refresh rate.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Denial:

Yeah, I kind of agree, but I think it's inevitable. Getting real performance gains from "brute force" hardware is becoming increasingly difficult. As we see with this gen, the power requirements are starting to massively increase for the same die area as previous gens. Dennard scaling is basically dead.
Well, that's part of a greater underlying problem: IMO, most (not all) devs are getting lazy, even on consoles. Why optimize when you can simply buy more/better/faster hardware to compensate? The problem is that this laziness is starting to outpace what's physically possible with today's technology, now that Dennard scaling is dead. Nvidia doesn't want to seem like they're the problem (and honestly, they're not), but there are people who expect them to be the ones to do all the heavy lifting. So you're right, we can't keep relying on brute force - we can only work more intelligently. And since we're too lazy to use our own intelligence, we depend on it artificially.