NVIDIA Shows Comparison Benchmarks for DLSS Optimized 4K Rendering

Very well done as usual from NVIDIA, really helpful comparison right there. Great job! Also, now I'm really looking forward to the Infiltrator release... oh wait.
PolishRenegade:

You need to contact NVIDIA, or they need to contact you, to "train" your game; it ends up being a process of approval. And if it's not, I would be very surprised as a dev to "suddenly" see my game supported with a driver update. The worst part is that the SDK for it is STILL not available, a month after launch. The whole thing, from a dev point of view, feels like a walled garden of proprietary tech... circa the 2005-ish era of closed gaming technologies and engines. Like having to add a PhysX card to your machine for game X or Y.
They don't train your game (that doesn't even make sense). Approval? lol. Devs don't have to send the code; no one actually sends their source code to any company (despite what AMD said in the past). They just give them a build, and NVIDIA uses its supercomputer to train an AI algorithm for that game. A developer could do that themselves if they could afford to build or rent a supercomputer.
I, like many others, am missing a reference point. That reference point is FPS with all AA off.
Stefem:

They don't train your game (that doesn't even make sense). Approval? lol. Devs don't have to send the code; no one actually sends their source code to any company (despite what AMD said in the past). They just give them a build, and NVIDIA uses its supercomputer to train an AI algorithm for that game. A developer could do that themselves if they could afford to build or rent a supercomputer.
Never said I need to send the source. 1) Implement their SDK, send them the game, get in the queue for their supercomputer ($$$ probably, because NVIDIA) and give them approval to support my game ($$$ probably, because NVIDIA) in their drivers. 2?) Implement their SDK, get a supercomputer running NVIDIA Teslas ($$$ because NVIDIA), train their AI on my game and then send them the results to be implemented in their drivers ($$$ probably, because NVIDIA). Am I missing something?
Stefem:

It's AI, there's nothing to license; the algorithm literally writes itself using tons of data. You can do it yourself with the same open-source tools NVIDIA is using, if you can afford to build a supercomputer.
There are no open-source tools for implementing the DLSS AI yourself. Nor have I read anything that points to this being free. Maybe I am wrong; I would like a reference for that statement.
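As an aside on what "training" means here: NVIDIA's actual DLSS pipeline is not public, but the general idea is ordinary supervised learning on frame pairs captured from a game build, with a low-resolution render as the input and a very high quality render of the same frame as the target. Below is a minimal sketch of that idea in PyTorch, assuming nothing about NVIDIA's real tooling; the toy model and the random stand-in frames are purely hypothetical.

```python
# Purely illustrative sketch of supervised super-resolution training.
# NVIDIA's real DLSS pipeline is not public; model, shapes and data
# below are hypothetical stand-ins.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Toy 2x upscaler: a few convolutions followed by pixel-shuffle."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 = 2x2 upscale factor
        )

    def forward(self, lowres):
        return F.pixel_shuffle(self.body(lowres), 2)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    # Stand-in for captured game frames: a high-quality "ground truth"
    # crop and a half-resolution input derived from it.
    high = torch.rand(4, 3, 360, 640)
    low = F.interpolate(high, scale_factor=0.5, mode="bilinear",
                        align_corners=False)

    pred = model(low)              # upscale the low-res input
    loss = F.l1_loss(pred, high)   # push the output toward the target frame
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The tools themselves are generic; the expensive part is capturing enough frame pairs per game and paying for the training compute, which is exactly what this exchange is arguing about.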
DLSS... looks like we have a catchy name for fake anti-aliasing and useless image upscaling to 4K. Instead of this "DLSS" we need GPUs that can really do 4K @ 60 FPS, hopefully when the 7 nm GPU process tech arrives.
tensai28:

The 2080 Ti does 4K ultra @ 60 FPS with no problems, with or without DLSS.
It does... LOL... Which game???? Forza? Or Shadow of the Tomb Raider? It has limits... I say again, the RTX is NOT a standard for 4K gaming @ 60 FPS.
Given how Nvidia's previous slides skewed the truth, I seriously don't think I'd be publishing any more graphs etc. from them... until we can properly validate the data ourselves at the same time. If all they're doing is upscaling lower-res textures then I'd feel a little short-changed, especially if the developer included actual 4K textures.
nevcairiel:

The performance is higher because you practically get anti-aliasing for free, which frees up a lot of GPU performance, especially at higher resolutions. TAA is actually quite expensive.
The thing is, the die space used for the Tensor cores could have been used to pack in more CUDA cores instead (which might have lifted AA performance up to DLSS levels). This is why I don't see DLSS as free anti-aliasing, but a trade - Nvidia has traded greater overall performance (CUDA cores) for a faster anti-aliasing method (Tensor cores). Whether it's a fair trade or not remains to be seen.
Glottiz:

Of course performance with DLSS is higher, because DLSS is just a fancy name for upscaling from 1440p to 4K. Basically this chart is just showing the difference in performance between 1440p and 2160p.
1440p is faster than 1440p + DLSS. DLSS is not free. DLSS "4K" perf should be somewhere between 1440p and native 4K.
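For a rough sense of that relationship, the raw pixel counts explain most of it. A back-of-the-envelope sketch follows; the 15% figure for the DLSS reconstruction pass is an assumed placeholder for illustration, not a published number.

```python
# Back-of-the-envelope pixel math for the "1440p internal vs native 4K" claim.
native_4k = 3840 * 2160   # 8,294,400 pixels per frame
internal  = 2560 * 1440   # 3,686,400 pixels per frame

print(f"internal / native ratio: {internal / native_4k:.2f}")  # ~0.44

# If shading cost scaled purely with pixel count, the internal render would
# cost ~44% of a native 4K frame. The DLSS reconstruction pass then adds its
# own per-frame cost (assumed 15% here), which is why measured DLSS "4K"
# performance lands between 1440p and native 4K rather than matching 1440p.
dlss_overhead = 0.15  # assumed, for illustration only
print(f"estimated DLSS frame cost vs native 4K: "
      f"{internal / native_4k + dlss_overhead:.2f}")
```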
DLSS is definitely going to be a game changer. I mean, you get a super crisp image (and free performance) vs. regular anti-aliasing methods. I can only imagine how much better all of my modern racing games would look when playing at 1080p while also maintaining 144 Hz.
"
RzrTrek:

DLSS is definitely going to be a game changer. I mean, you get a super crisp image (and free performance) vs. regular anti-aliasing methods. I can only imagine how much better all of my modern racing games would look when playing at 1080p while also maintaining 144 Hz.
" Let's not stoke Nvidia's propaganda machine at the moment, that is, until reputable sites like guru3d get a chance to thoroughly test and benchmark this and all other RTX brand features for themselves. On that note, my own thoughts still remain: that Nvidia have shot themselves in the foot more than once with this series release. They've managed to dissuade me from upgrading this cycle.
Would love to see someone posting the info where NV said "at native 4K", as far as I can remember it was never said like that. They might have compared it to native 4K in image quality. But anything to make them look bad, since they dared not to release a $500 card that can beat the 1080 Ti... And about proprietary: please show me a business model (that works/worked) where company A invented something and gave it away for company B to make money with...
Mesab67:

Let's not stoke Nvidia's propaganda machine at the moment; that is, until reputable sites like Guru3D get a chance to thoroughly test and benchmark this and all the other RTX-brand features for themselves. On that note, my own thoughts still remain: Nvidia have shot themselves in the foot more than once with this series release. They've managed to dissuade me from upgrading this cycle.
https://www.guru3d.com/articles_pages/asus_turbo_geforce_rtx_2070_8gb_review,23.html
fry178:

Would love to see someone posting the info where NV said "at native 4K", as far as I can remember it was never said like that. They might have compared it to native 4K in image quality. But anything to make them look bad, since they dared not to release a $500 card that can beat the 1080 Ti... And about proprietary: please show me a business model (that works/worked) where company A invented something and gave it away for company B to make money with...
Microsoft's Direct3D.
Glottiz:

"Free anti-aliasing method", hah, how naive are you? Nothing is free. I guess Nvidia's marketing campaign really worked on you. It's not free AA and it's not native 4K res. Digital Foundry did in depth DLSS analysis and pixel counted that DLSS is just running games at 1440p internal resolution and upscaling to 4K.
Keep going, keep going, why have you stopped? In the same Digital Foundry video, they found 4K DLSS to have better image quality than native 4K + TAA.
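For anyone wondering what "pixel counting" involves: on a near-horizontal aliased edge, each one-pixel stair step in the internal render gets stretched by the upscale factor, so measuring the step length in output pixels gives an estimate of the internal resolution. A hypothetical sketch of that arithmetic:

```python
def estimate_internal_height(output_height: int, step_len_px: float) -> float:
    """Estimate internal render height from the measured length (in output
    pixels) of one stair step on a near-horizontal aliased edge.

    A 1-pixel step in the internal image spans (output / internal) pixels
    after upscaling, so internal = output / step_length.
    """
    return output_height / step_len_px

# Example: a 2160-line output where aliasing steps measure ~1.5 output pixels
# suggests a ~1440-line internal render (2160 / 1.5 = 1440).
print(estimate_internal_height(2160, 1.5))  # 1440.0
```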
Caesar:

It does... LOL... Which game???? Forza? Or Shadow of the Tomb Raider? It has limits... I say again, the RTX is NOT a standard for 4K gaming @ 60 FPS.
I have no trouble keeping 60 FPS in Shadow of the Tomb Raider at 4K. I haven't tried Forza.
@Fox2232 Nope. Besides the fact that it more or less requires Windows, it's not free (it's licensed), and they didn't write the code and give it away for (only) company B to use. And even so, it's ONE company with ONE (software) product, so it's an example of the complete opposite when I look at how many companies exist.
These benchmark results aren't very impressive.
Now if only they bothered to make a higher-quality version of DLSS that focuses on quality instead of squeezing the fastest possible performance out of it, which clearly causes the quality to suffer, as it obviously can't handle complex shader/specular aliasing problems. Which is a shame, because overall it seems far preferable to the average TAA, which is similarly over-focused on performance and brings a number of unacceptable compromises to image quality in a large portion of cases (there are many exceptions). It's hardly blurry, and you need your eyes checked if you think it is, though some surfaces clearly sport a slightly smudgy look akin to a deep-learning upsampler (because obviously that is part of what is happening here). The overall image quality is obviously very sharp. If sharpness is all you care about, you have no business talking about anti-aliasing, honestly.
fry178:

@Fox2232 Nope. Besides the fact that it more or less requires Windows, it's not free (it's licensed), and they didn't write the code and give it away for (only) company B to use. And even so, it's ONE company with ONE (software) product, so it's an example of the complete opposite when I look at how many companies exist.
Actually, it is a perfect example. You just did not see it the way it is. MS makes D3D. That enables the huge gaming ecosystem most of us have lived in for over 20 years. GPU manufacturers benefit, because without D3D there would not be nearly as many games. And it worked very well for MS, because people buy their OS to play games on it. MS could have been below Linux in popularity if it was not for games. It is pretty similar to nVidia's G-Sync, but much more sneaky. If G-Sync was cheaper (sneakier), people would have many more G-Sync screens and nVidia's grasp would be tighter. Instead, AMD promoted FreeSync. Since it is just Adaptive Sync adoption, companies decided to provide this extra value to clients, as it is not very costly. FreeSync is now almost common, and that gives AMD back value in terms of mindshare. So a guy buying a new GPU for a living-room 52''+ TV will be like: "Should I buy AMD's GPU since I have FreeSync, or buy a new G-Sync TV? What does a G-Sync TV cost? Is it even available? ...Oh sh*7!?!... OK, buying from AMD." Giving something away often results in adoption of the technology, which in turn creates a much larger ecosystem, allowing the original author to benefit in the long term as the technology does not die.