NVIDIA adds DLSS in four new games

Nice, nVidia acknowledges that I am not "everyone". Seems that there are still no true 4K GPUs and game developers are not focusing on 4K. But when an FPS game has that kind of framerate...
Fox2232:

Seems that there are still no true 4K GPUs and game developers are not focusing on 4K.
That is the right thing to do. IQ is not just cramming more pixels into a square inch of your monitor. At some point you have to think about diminishing returns and focus on more realistic rendering techniques. 4K was a thing mainly because of weak antialiasing options, not because image quality is too low at 1440/1800p.
cucaulay malkin:

That is the right thing to do. IQ is not just cramming more pixels into a square inch of your monitor. At some point you have to think about diminishing returns and focus on more realistic rendering techniques. 4K was a thing mainly because of weak antialiasing options, not because image quality is too low at 1440/1800p.
Exactly, who cares about 4K? I'd rather have better overall effects at 1080/1440p than sacrifice a huge chunk of performance for better "AA". 1440p is optimal and the way to go.
"the 3070 is faster than the 2080ti" 'but not when you use our proprietary tech that we use as a selling point for the 3070'
kapu:

Exactly, who cares about 4K? I'd rather have better overall effects at 1080/1440p than sacrifice a huge chunk of performance for better "AA". 1440p is optimal and the way to go.
Speaking of 1440p here in this comment section, that is my primary resolution. Should I keep maxing out AA in options or go for a weaker AA solution or no AA at all? What's everyone's experience/opinion on this?
mentor07825:

Speaking of 1440p here in this comment section, that is my primary resolution. Should I keep maxing out AA in options or go for a weaker AA solution or no AA at all? What's everyone's experience/opinion on this?
Depends on quality; some AA modes are very bad and produce a blurry image, so always go for a SHARP image (at least that is my opinion). When the quality setting is good, always max AA if you have performance to spare.
mentor07825:

Speaking of 1440p here in this comment section, that is my primary resolution. Should I keep maxing out AA in options or go for a weaker AA solution or no AA at all? What's everyone's experience/opinion on this?
Basically, the higher the resolution, the less AA you need. I still think AA is beneficial at 1440p. Just try dropping it down a notch and see if you're happy with the results.
Richard Nutman:

Basically, the higher the resolution, the less AA you need. I still think AA is beneficial at 1440p. Just try dropping it down a notch and see if you're happy with the results.
kapu:

Depends on quality; some AA modes are very bad and produce a blurry image, so always go for a SHARP image (at least that is my opinion). When the quality setting is good, always max AA if you have performance to spare.
Thank you both, I appreciate the response!
mentor07825:

Thank you both, I appreciate the response!
kapu:

Depends on quality; some AA modes are very bad and produce a blurry image, so always go for a SHARP image (at least that is my opinion). When the quality setting is good, always max AA if you have performance to spare.
I have learned that a sharpening filter on top of a blurry AA like TAA provides some excellent results. The sharpening built into NVIDIA's driver works really well for this; I believe it's made with machine learning and all that.
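The idea of "sharpening on top of TAA" can be sketched with a classic unsharp mask: subtract a blurred copy from the image and add the difference back, scaled. This is a minimal NumPy illustration of the general technique, not NVIDIA's or AMD's actual filter; the 3x3 box blur and the `amount` parameter are assumptions for demonstration.

```python
import numpy as np

def box_blur3(img):
    """3x3 box blur via edge padding -- the low-pass step of an unsharp mask."""
    p = np.pad(img, 1, mode="edge")
    acc = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            acc += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / 9.0

def unsharp_mask(img, amount=0.6):
    """Sharpen: original + amount * (original - blurred), clamped to 8-bit range."""
    img = img.astype(float)
    return np.clip(img + amount * (img - box_blur3(img)), 0, 255)

# A soft edge gets steeper: bright pixels next to the edge overshoot upward,
# dark pixels next to it are pushed down (and clamped).
edge = np.zeros((6, 6))
edge[:, 3:] = 200.0
print(unsharp_mask(edge)[2])
```

Real driver-level sharpeners (NVIDIA's filter, AMD CAS) are adaptive, varying the strength per pixel based on local contrast to avoid amplifying noise, but the core "boost the high frequencies TAA smeared away" idea is the same.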
ACEB:

"the 3070 is faster than the 2080ti" 'but not when you use our proprietary tech that we use as a selling point for the 3070'
I mean, if you want to spend $1000+ on a 2080 Ti to have better DLSS performance than an RTX 3070, I say go right ahead, but I'd very much wonder why you wouldn't try for a 3080 (not talking stock issues here). I'm pretty certain no one with a 2080 Ti right now is interested in buying a 3070, so the fact that it works better in DLSS than a 3070, and sometimes gets overall better performance than a 3070, is pretty much a moot point; people looking to buy a 3070 are not the ones who would have been looking to buy a 2080 Ti before, or now.
Yes, I've recently discovered that AMD Contrast Adaptive Sharpening injected with ReShade on top of TAA gives great results. AMD CAS is better than nVidia's driver solution. Having moved to 1440p for the past two years, I can say that I have no interest in jumping to 4K any time soon.
umeng2002:

Having moved to 1440p for the past two years, I can say that I have no interest in jumping to 4K any time soon.
Avoid the PS5 if you're interested in consoles, then.
Fox2232:

Nice, nVidia acknowledges that I am not "everyone". Seems that there are still no true 4K GPUs and game developers are not focusing on 4K. But when an FPS game has that kind of framerate...
There is: the RTX 3000 cards are 4K GPUs. DLSS is useful mostly in ray-traced games to stay above 60 fps. Games without ray tracing don't need DLSS, but it can serve as a boost if implemented.
Also, I might add that DLSS 2.0 looks significantly better than 1.0, so it's almost certainly worth using if available.
Aura89:

I mean, if you want to spend $1000+ on a 2080 Ti to have better DLSS performance than an RTX 3070, I say go right ahead, but I'd very much wonder why you wouldn't try for a 3080 (not talking stock issues here). I'm pretty certain no one with a 2080 Ti right now is interested in buying a 3070, so the fact that it works better in DLSS than a 3070, and sometimes gets overall better performance than a 3070, is pretty much a moot point; people looking to buy a 3070 are not the ones who would have been looking to buy a 2080 Ti before, or now.
You missed my point. It's what Jensen said and all the promos said: the 3070 is faster than the 2080 Ti. But they also claim that the technology behind RTX is great and should be used with their amazing DLSS, and when you do what they say it actually makes the 3070 slower than the 2080 Ti, which is of course the opposite of everything they have been saying. It's called calling out the BS.
RED.Misfit:

There is, RTX 3000 are 4K GPUs, DLSS is useful mostly on ray tracing games to stay above 60fps. Games without ray tracing doesn't have the need for DLSS, but it can serve as a boost if implemented.
The same logic says: RTX 3000 are 8K GPUs. nVidia showed them that way, right? Except that their showcases were rendered at 2560x1440 and DLSSed to 8K. (They used the 3x3 upscaling mode. Or if you want: "Every ninth pixel was rendered.")
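The "every ninth pixel" claim above is simple arithmetic: 8K UHD is exactly 3x the width and 3x the height of 2560x1440, so the internally rendered frame contains 1/9 of the output pixels. A quick sanity check:

```python
# Rendered fraction when DLSS upscales 2560x1440 to 8K (7680x4320),
# the 3x3 mode described in the comment above.
render_w, render_h = 2560, 1440      # internal render resolution
output_w, output_h = 7680, 4320      # 8K UHD output

scale_x = output_w / render_w        # 3.0
scale_y = output_h / render_h        # 3.0
fraction_rendered = (render_w * render_h) / (output_w * output_h)

print(scale_x, scale_y)              # 3.0 3.0
print(fraction_rendered)             # 0.111... -> one pixel in nine is rendered
```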
Fox2232:

The same logic says: RTX 3000 are 8K GPUs. nVidia showed them that way, right? Except that their showcases were rendered at 2560x1440 and DLSSed to 8K. (They used the 3x3 upscaling mode. Or if you want: "Every ninth pixel was rendered.")
I agree about the 8K but not the 4K. The 3080 is definitely a 4K GPU. There are shit games that won't quite hit 60, but in previous gens I could find shit games, completely maxed, that wouldn't hit 60 at QHD. No one said a 1080 Ti wasn't a "QHD GPU" because it could barely do Ghost Recon Wildlands at 60 fps.
Denial:

I agree about the 8K but not the 4K. The 3080 is definitely a 4K GPU. There are crap games that won't quite hit 60, but in previous gens I could find crap games, completely maxed, that wouldn't hit 60 at QHD. No one said a 1080 Ti wasn't a "QHD GPU" because it could barely do Ghost Recon Wildlands at 60 fps.
Times changed. The crap games of the 1080 Ti era are now the DXR games for RTX/RDNA2 cards. And while you could say that the crap games which defeated the 1080 Ti were bad products, DXR games are, in the opinion of way too many people, the future of gaming, and they will be pretty common sooner than some of us would like. On the other hand, I will not push onto you the idea that you have to play games on maximum details. That would be stupid, as some of those "details" are stupid; only you should decide which details are worth it. But if you need upscaling at the details of your choice, then your card is not capable enough for native resolution...

In my opinion, 1080p is the budget and high-fps resolution. 1440p is the sweet spot where you can have a reasonable frame rate in heavy games and great fps in lighter games. 4K is just a waste of money. You can make a list of the cards nVidia and AMD declared "true 4K", then show how long each managed a stable 60+ fps at 4K. Then you can rename it: "List of failures and short-time successes."
Denial:

I agree about the 8K but not the 4K. The 3080 is definitely a 4K GPU. There are crap games that won't quite hit 60, but in previous gens I could find crap games, completely maxed, that wouldn't hit 60 at QHD.
Case in point: the very War Thunder from the OP. I can easily DSR 3840x2160 on my 25x16 monitor for 150+ FPS. Better yet, turn on DLSS on top of DSR for even better IQ and better FPS. Alternatively, Turing/Ampere-less pilots can add HQ TAA, which works nicely as well 🙂
The 3060 Ti does better with DLSS than a 3080 at native 4K. I always thought 4K was a waste, but if the adoption rate for DLSS 2.0 stays as good as we've seen recently for triple-A's, then I might replace my secondary 1440p display with 4K once I get my hands on a 3060 Ti.

DLSS-Q looked super nice upscaled from 960p; I played Control and Youngblood. Before the 2.0 patch, Control could miss some objects on screen, leaving them in lower res, but after the 2.0 patch it was flawless. Youngblood was spot on; every screenshot I took comparing native and DLSS-Q looked better with DLSS.

I think it's no baloney coming from every major tech site that at 4K DLSS can look even better than native, since it upscales from a higher internal res. What a sweet card this 3060 Ti is. https://i.imgur.com/LKQpWm6.jpeg
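The "upscaled from 960p" figure above matches the commonly reported DLSS 2.x per-axis scale factors (roughly 2/3 for Quality, 0.58 for Balanced, 1/2 for Performance, 1/3 for Ultra Performance). These factors are from third-party reporting rather than an official NVIDIA table, so treat the sketch as approximate:

```python
# Approximate DLSS 2.x internal render resolutions, assuming the
# commonly reported per-axis scale factors (not an official table).
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def dlss_input(width, height, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = MODES[mode]
    return round(width * s), round(height * s)

print(dlss_input(2560, 1440, "Quality"))      # (1707, 960) -- the 960p mentioned above
print(dlss_input(3840, 2160, "Performance"))  # (1920, 1080)
```

So 1440p with DLSS Quality renders internally at 960p, and 4K Performance mode renders at plain 1080p, which is why the per-frame cost drops so sharply.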