The technology evolved into DLSS2 over time; you make it sound like it was completely scrapped 😀
But it was scrapped.
DLSS1 used spatial reconstruction, with an AI algorithm trained separately for each game.
DLSS2 used temporal reconstruction, with a generalized AI algorithm trained to work across all games.
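Loosely, the difference can be sketched like this (a toy illustration only, not NVIDIA's actual code; the nearest-neighbour resampling and the fixed blend factor are stand-ins for the trained networks):

```python
import numpy as np

def spatial_upscale(frame: np.ndarray, scale: int) -> np.ndarray:
    """DLSS1-style in spirit: reconstruct each output frame from the
    current low-res frame alone (naive nearest-neighbour resampling
    stands in for the per-game trained network)."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

def temporal_accumulate(history: np.ndarray, current: np.ndarray,
                        blend: float = 0.1) -> np.ndarray:
    """DLSS2-style in spirit: blend the current frame with accumulated
    history from previous frames (motion-vector reprojection omitted),
    so detail builds up over time. A fixed blend factor stands in for
    the generalized network's per-pixel history weighting."""
    return (1.0 - blend) * history + blend * current

# Toy usage: a noisy low-res signal gets steadier as history accumulates.
rng = np.random.default_rng(0)
history = np.zeros((8, 8))
for _ in range(30):
    low_res = rng.random((4, 4))  # stand-in for a rendered low-res frame
    history = temporal_accumulate(history, spatial_upscale(low_res, 2))
print(round(history.std(), 3))  # far below a single frame's std of ~0.29
```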
I'll see if I can find the article, but from what I've read, the AI algorithm was improved to the point that it could be generalized.
Edit - I can't find it with a quick search. As long as the outcome is improved, they can scrap/improve the code all they want 🙂
I think it's a matter of preference. Stable 4K 120 FPS still requires extremely expensive monitors and GPUs, so for me it's rushed. DLSS is a way to get there sooner, but I don't feel the need. These cards let you play at 1440p 165 FPS, and I think that's what I prefer overall. Especially since I'm getting older, and from the sofa I can't really tell if I'm playing at 4K or not on a large TV.
Biology is saving me from crazy expenses
There is something wrong, I think, with paying literally thousands of euros/dollars to build a high-end PC running a two-grand 4090... just to play a game like Overwatch, which was designed to run on cheap older consoles costing a few hundred... at 400 FPS o_O
I wonder if they will say anything about this like they did with FSR in God of War
Dragam1337:
It was scrapped in the sense that games which launched with DLSS1 never got anything else, and they look TERRIBAD.
Not that DLSS2 is amazing - it's always a downgrade in image quality; it's just a question of how much.
If I'm not wrong, I think you can force DLSS2 in DLSS1 games.
If you play Overwatch or any other competitive game for thousands of hours, you will want to run it at 400 FPS+; higher FPS --> better competitive edge.
I'm playing PUBG at 200 FPS+, and it's still not enough 🙄
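For a sense of scale, frame time is just the reciprocal of frame rate, so plain arithmetic (no game-specific assumptions) shows how the per-frame delay shrinks:

```python
# Frame time is the reciprocal of frame rate, so each FPS doubling
# halves the time a frame spends on screen before the next update.
for fps in (60, 144, 200, 400):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {frame_time_ms:5.2f} ms per frame")
# Output:
#  60 FPS -> 16.67 ms per frame
# 144 FPS ->  6.94 ms per frame
# 200 FPS ->  5.00 ms per frame
# 400 FPS ->  2.50 ms per frame
```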
There are some OLED monitors from Alienware, Asus, and Gigabyte at 34-42" with DisplayPort. I think they should be high refresh rate.
138 Hz is the max for 4K OLED right now (Asus PG48UQ); not sure if 18 Hz more is worth the 5000 USD.
The Alienware 34" QD-OLED is 175 Hz, but I'm not going back to 34" just to get a slightly higher refresh rate.
Yeah, I kind of agree, but I think it's inevitable. Getting real performance gains from "brute force" hardware is becoming increasingly difficult. As we see with this gen, power requirements are starting to increase massively for the same silicon area as previous gens. Dennard scaling is basically dead.
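To put rough numbers on the Dennard point: dynamic power scales roughly as P ≈ C·V²·f, and classic Dennard scaling kept power density flat only because voltage shrank with each node. A back-of-the-envelope sketch (the idealized textbook scaling factors and the simplified model are assumptions for illustration):

```python
# Idealized dynamic-power model: P ~ C * V^2 * f (activity factor omitted).
def dynamic_power(c: float, v: float, f: float) -> float:
    return c * v**2 * f

k = 1.4  # one classic node shrink: linear dimensions scale by ~1/k

# Classic Dennard scaling: C and V scale down by k, f scales up by k,
# and area shrinks by k^2, so power per unit area stays ~constant.
p_old = dynamic_power(c=1.0, v=1.0, f=1.0)
p_dennard = dynamic_power(c=1.0 / k, v=1.0 / k, f=k)
print(p_dennard / p_old * k**2)  # ~1.0: same power density

# Post-Dennard reality: voltage barely scales anymore.
p_now = dynamic_power(c=1.0 / k, v=1.0, f=k)
print(p_now / p_old * k**2)      # ~k^2: power density roughly doubles
```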
Well, that's part of a greater underlying problem: IMO, most (not all) devs are getting lazy, even on consoles. Why optimize when you can simply buy more/better/faster hardware to compensate? The problem is that this laziness is starting to outpace what's physically possible with today's technology, now that Dennard scaling is dead. Nvidia doesn't want to seem like they're the problem (and honestly, they're not), but there are people who expect them to be the ones doing all the heavy lifting.
So you're right, we can't keep relying on brute force - we can only work more intelligently. And since we're too lazy to use our own intelligence, we rely on the artificial kind.
No way I'm buying VA shit LOL