NVIDIA Shows Comparison Benchmarks for DLSS Optimized 4K Rendering

Of course performance with DLSS is higher, because DLSS is just a fancy name for upscaling from 1440p to 4K. Basically this chart is just showing the difference in performance between 1440p and 2160p.
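For a rough sense of what that claim would imply, here is a quick back-of-the-envelope check in Python. The 2560x1440 internal resolution is the commenter's assumption (echoed later in the thread via Digital Foundry's pixel counting), not an NVIDIA-confirmed figure:

# Pixel-count comparison, assuming DLSS renders internally at 2560x1440
# before reconstructing a 3840x2160 output (the claim made above).
native_4k = 3840 * 2160        # 8,294,400 pixels shaded per frame at native 4K
internal_1440p = 2560 * 1440   # 3,686,400 pixels shaded at the assumed internal resolution
print(f"Native 4K shades {native_4k / internal_1440p:.2f}x as many pixels")  # -> 2.25x

If shading cost scaled roughly with pixel count, that 2.25x gap alone would account for a large share of the difference shown in NVIDIA's chart.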
Wait... they're not even testing the same graphics cards. Why are there no like-for-like benchmarks, i.e. 2080 DLSS vs 2080 TAA?! Those results just exaggerate the DLSS effect even more, making it look better than it is!
Glottiz:

Of course performance with DLSS is higher, because DLSS is just a fancy name for upscaling from 1440p to 4K. Basically this chart is just showing the difference in performance between 1440p and 2160p.
Except that this is wrong. DLSS is just a practically "free" anti-aliasing method; it does not in fact change the rendering resolution. Where does this misconception even come from? The performance is higher because you practically get anti-aliasing for free; this frees up a lot of GPU performance, especially at higher resolutions. TAA is actually quite expensive.
nevcairiel:

Except that this is wrong. DLSS is just a practically "free" anti-aliasing method; it does not in fact change the rendering resolution. Where does this misconception even come from? The performance is higher because you practically get anti-aliasing for free; this frees up a lot of GPU performance, especially at higher resolutions.
"Free anti-aliasing method", hah, how naive are you? Nothing is free. I guess Nvidia's marketing campaign really worked on you. It's not free AA and it's not native 4K res. Digital Foundry did in depth DLSS analysis and pixel counted that DLSS is just running games at 1440p internal resolution and upscaling to 4K.
It's a new upscaling technique. Just think about it: this is a new architecture, after several years of the best one on the market, and we still can't have a GPU capable of pushing 4K 60 fps as standard; there's always something to trick you, like DLSS, checkerboarding, etc. 4K is simply not worth it. I think that DLSS blurs images. It's similar to FXAA, but here with less computational cost (wattage???), fewer jaggies, more blur. I personally prefer aliasing to blur. And it's only compatible with a handful of games that haven't been released yet, while 1800p + AA is possible with almost every game currently available and to come. Nvidia is bringing complex and overpriced new technology to achieve what was already possible for cheaper. Revolutionary indeed!
Glottiz:

"Free anti-aliasing method", hah, how naive are you? Nothing is free.
It costs die space, and as such money, because it uses the Tensor Cores. But it frees up resources on the CUDA cores. PS: The Digital Foundry article hardly reads as very in-depth.
"nVidia shows" - oh, like they showed everything else about this launch. With absolutely meaningless and misleading numbers and graphs that make things look good because they're in a vacuum.
nevcairiel:

It costs die space, and as such money, because it uses the Tensor Cores. But it frees up resources on the CUDA cores. PS: The Digital Foundry article hardly reads as very in-depth.
This. It is free in terms of raster computing power, which is why FPS are higher when DLSS is enabled. The real problem with DLSS is not the tech but the implementation. Devs have to send their games to some AI compute centre at NVIDIA just to enable it. Dead end, IMHO. Until NVIDIA can make this an engine-wide implementation by licensing the algorithm (will never happen), the only added value of the 2000 series is the Ray Tracing cores.
These are not benchmarks; that's a PR slide that looks just like the ones we saw from the Turing announcement event in August...
PolishRenegade:

This. It is free in terms of raster computing power, which is why FPS are higher when DLSS is enabled. The real problem with DLSS is not the tech but the implementation. Devs have to send their games to some AI compute centre at NVIDIA just to enable it. Dead end, IMHO. Until NVIDIA can make this an engine-wide implementation by licensing the algorithm (will never happen), the only added value of the 2000 series is the Ray Tracing cores.
They don't have to send it - Nvidia already stated they can train it themselves, it just costs a ton of compute power. On the flip side, most devs already send their game code to Nvidia and they'll do the training for free - so it's a no-brainer, and it's the reason why several indie games are pledging support for it. People should also realize that it's an iterative process, as all AI applications are: the more training data, the more accurate you can make the model and the better DLSS gets. I personally don't care for the idea of paying for "future" performance, but the tech itself as a value-add is pretty nifty and I don't think it's going to be a one-and-done thing. I think the use of deep learning to accelerate visual applications is only starting.
nevcairiel:

It costs die space, and as such money, because it uses the Tensor Cores. But it frees up resources on the CUDA cores. PS: The Digital Foundry article hardly reads as very in-depth.
Yes, resources are shifted more towards the Tensor Cores, but this is not the main secret behind DLSS performance. Upscaling from a 1440p internal resolution is how Nvidia magically gets this performance boost. This is why Nvidia only compares DLSS vs TAA: if they showed 4K without AA vs DLSS, everyone would realize that something doesn't add up. Why would native 4K without AA perform so much worse than DLSS?
We need games with RT and DLSS, not demos and benchmarks...
So DLSS gives you free AA at 4K? Then I don't need DLSS, as I don't use AA at 4K.
Caesar:

It's a new upscaling technique. Just think about it: this is a new architecture, after several years of the best one on the market, and we still can't have a GPU capable of pushing 4K 60 fps as standard; there's always something to trick you, like DLSS, checkerboarding, etc. 4K is simply not worth it.
The 2080 Ti does 4K ultra at 60 fps with no problem, with or without DLSS.
Barry J:

So DLSS gives you free AA at 4K? Then I don't need DLSS, as I don't use AA at 4K.
This isn't how it works, no.
Denial:

They don't have to send it - Nvidia already stated they can train it themselves, it just costs a ton of compute power. On the flip side, most devs already send their game code to Nvidia and they'll do the training for free - so it's a no-brainer, and it's the reason why several indie games are pledging support for it. People should also realize that it's an iterative process, as all AI applications are: the more training data, the more accurate you can make the model and the better DLSS gets. I personally don't care for the idea of paying for "future" performance, but the tech itself as a value-add is pretty nifty and I don't think it's going to be a one-and-done thing. I think the use of deep learning to accelerate visual applications is only starting.
You need to contact NVIDIA, or they need to contact you, to "train" your game; it ends up being a process of approval. And if it's not, I would be very surprised as a dev to "suddenly" see my game supported with a driver update. The worst part is that the SDK for it is STILL not available, a month after launch. The whole thing, from a dev point of view, feels like a walled garden of proprietary tech... circa the 2005-ish era of closed gaming technologies and engines. Like having to buy a PhysX card for your machine for game X or Y.
Not the most useful of things. We need to see values for 4K without any DLSS or TAA, then with TAA, and then with DLSS, as well as full-resolution screenshots to compare the quality.
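For the screenshot side of that comparison, one simple objective starting point is a PSNR measurement between a capture and a high-quality reference of the same frame. A minimal sketch using NumPy and Pillow; the file names are placeholders, and PSNR is only a rough proxy for perceived quality:

# Rough PSNR comparison between two same-resolution captures of one frame.
# File names are placeholders; ideally the reference would be a heavily
# supersampled capture rather than just the native+TAA shot.
import numpy as np
from PIL import Image

def psnr(path_a: str, path_b: str) -> float:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(psnr("frame_reference.png", "frame_dlss.png"))  # higher = closer to the reference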
Glottiz:

"Free anti-aliasing method", hah, how naive are you? Nothing is free. I guess Nvidia's marketing campaign really worked on you. It's not free AA and it's not native 4K res. Digital Foundry did in depth DLSS analysis and pixel counted that DLSS is just running games at 1440p internal resolution and upscaling to 4K.
That was known from day one. DLSS works by reconstructing a 4K image from a lower-resolution one using AI, eliminating aliasing in the process; DLSS 2X starts from a 4K image and uses AI to just remove aliasing. That's how NVIDIA describes it.
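To make "reconstructing a 4K image from a lower-resolution one" a bit more concrete, here is a toy learned upscaler in PyTorch. It only illustrates the general idea (interpolate up, then let a trained network repair the result); the real DLSS network has not been published, and everything below is a stand-in:

# Toy image-reconstruction network: upsample a low-res frame by 1.5x (1440p -> 2160p)
# and apply a learned residual correction. Illustration only, not NVIDIA's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, x):
        up = F.interpolate(x, scale_factor=1.5, mode="bilinear", align_corners=False)
        return up + self.refine(up)  # network predicts a correction on top of the upscale

frame = torch.rand(1, 3, 144, 256)   # small stand-in crop; a real frame would be 1440x2560
print(ToyUpscaler()(frame).shape)    # torch.Size([1, 3, 216, 384]), i.e. 1.5x in each axis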
Caesar:

It's a new upscaling technique. Just think about it: this is a new architecture, after several years of the best one on the market, and we still can't have a GPU capable of pushing 4K 60 fps as standard; there's always something to trick you, like DLSS, checkerboarding, etc. 4K is simply not worth it. I think that DLSS blurs images. It's similar to FXAA, but here with less computational cost (wattage???), fewer jaggies, more blur. I personally prefer aliasing to blur. And it's only compatible with a handful of games that haven't been released yet, while 1800p + AA is possible with almost every game currently available and to come. Nvidia is bringing complex and overpriced new technology to achieve what was already possible for cheaper. Revolutionary indeed!
It's not even similar to FXAA, and it requires much, much higher computational resources, as it reconstructs the image using AI; the incredible computational density of the Tensor Cores is what allows it to look "free".
PolishRenegade:

This. It is free in terms of raster computing power. Which is why FPS are higher when DLSS is enabled. The real problem with DLSS is not the tech but the implementation. Devs have to send their games to some AI compute centre at NVIDIA et even enable it. Dead-end IMHO. Until NVIDIA can make this an engine-wide implementation by licensing the algorithm (will never happen), the only +value of the 2000 series is the Ray Tracing cores.
It's AI; there's nothing to license. The algorithm literally writes itself using tons of data. You could do it yourself with the same open-source tools NVIDIA is using, if you can afford to build a supercomputer.
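For what "the algorithm writes itself" amounts to in practice, here is a minimal supervised-training sketch with the kind of open-source tooling the commenter means (PyTorch). NVIDIA has described training DLSS against very heavily supersampled reference frames; everything else below (model size, loss, data) is a stand-in:

# Sketch of the supervised training loop behind a learned upscaler: low-res frames
# in, loss against high-quality reference frames out. Only the general recipe is
# real; the model, loss, and random data here are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(               # toy reconstruction net, not the real DLSS model
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for step in range(100):                                   # real training runs far longer
    low_res = torch.rand(4, 3, 135, 240)                  # stand-in for rendered low-res frames
    reference = torch.rand(4, 3, 270, 480)                # stand-in for high-quality reference frames
    upscaled = F.interpolate(low_res, scale_factor=2, mode="bilinear", align_corners=False)
    prediction = upscaled + model(upscaled)               # residual correction
    loss = F.l1_loss(prediction, reference)               # penalize differences from the reference
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()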