Nvidia introduces official DLSS plugin for Unreal Engine

XenthorX:

There's been dynamic resolution and temporal upsampling for a couple of years now (almost mandatory for console development), but DLSS is a welcome addition for sure!
Yeah, but a) like 3 PC games had dynamic res, b) DLSS support is new and DLSS kills any other upscaling, c) this is UE4, you can't get more exposure than that. Still not quite sure how DLSS dynamic resolution works, because DLSS input resolutions are very discrete (what, like 4 different ones?), giving very step-function-like performance differences.
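For context on why the steps are so coarse: each DLSS 2.0 quality preset renders at a fixed fraction of the output resolution per axis, so the internal resolution jumps between a handful of discrete points rather than scaling smoothly. A minimal sketch using the commonly cited scale factors (treat the exact Balanced value as approximate):

```cpp
#include <cstdio>

// Each DLSS 2.0 preset renders at a fixed fraction of the output
// resolution per axis, which is why performance moves in steps.
struct DlssMode { const char* name; double scale; };

int main() {
    const int outW = 3840, outH = 2160;      // 4K output
    const DlssMode modes[] = {
        {"Quality",           2.0 / 3.0},    // ~66.7% per axis
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},    // ~33.3% per axis
    };
    for (const auto& m : modes) {
        // Internal render resolution fed to the DLSS network.
        printf("%-17s -> %4d x %4d\n", m.name,
               (int)(outW * m.scale), (int)(outH * m.scale));
    }
    return 0;
}
```

At 4K output that gives roughly 2560x1440 (Quality), ~2227x1252 (Balanced), 1920x1080 (Performance), and 1280x720 (Ultra Performance), which matches the step-function behaviour described above.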
dampflokfreund:

You don't see the point? My 2060 laptop struggles with the game at 1080p60, yet with DLSS it can do 1440p60 at max settings easily. And with DLSS Performance and volumetrics on medium I can even add RT reflections + translucency to that at 1440p60 and get a superior experience compared to the stronger PS5 and Xbox Series X, because it runs at higher settings and a higher framerate. Or I can reach 144 FPS on my internal laptop display at 1080p with DLSS without using ray tracing. It's not blurry, btw; the image quality of DLSS 2.0 is excellent in most games. If it appears blurry in some games like Cyberpunk, it's because the devs did not implement it correctly. In Cyberpunk, for example, the TAA is extremely oversharpened while DLSS has no sharpening enabled at all, which makes it appear blurry in comparison. Again, not the fault of DLSS 2.0; it's the fault of the developer, and it can be fixed by adding sharpening like alanm did. Then even DLSS Performance at 1440p (which is rendering at just 720p) looks close to or even better than native rendering. Check this out: https://imgsli.com/MzM5MjA You can see added detail like the number under the left brake light and additional details on the upper-left metal wall.
You're wasting your time on that guy, only AMD can do right in his eyes.
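As an aside, the "add sharpening" fix mentioned in the post quoted above boils down to a simple post-process pass. Here's a minimal CPU-side sketch of the idea (an unsharp-mask filter over a grayscale buffer); real implementations, such as AMD's open-source FidelityFX CAS or a game's own sharpen pass, run on the GPU and adapt the strength per pixel, but the principle is the same:

```cpp
#include <algorithm>
#include <vector>

// Minimal unsharp-mask sharpening sketch: boost each pixel by its
// difference from a local average, restoring high-frequency detail
// that an upscaler's resolve can soften.
std::vector<float> Sharpen(const std::vector<float>& img, int w, int h,
                           float strength /* 0 = no-op */) {
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float center = img[y * w + x];
            // Average of the four direct neighbours acts as a cheap blur.
            float blur = (img[(y - 1) * w + x] + img[(y + 1) * w + x] +
                          img[y * w + x - 1] + img[y * w + x + 1]) * 0.25f;
            // Add back the detail, clamped to the valid [0, 1] range.
            out[y * w + x] =
                std::clamp(center + strength * (center - blur), 0.0f, 1.0f);
        }
    }
    return out;
}
```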
nVidia actually does things when there's the threat of competition possibly showing up in the not-too-distant future... who would have thought. It would have been great if nVidia hadn't spent all the time they were unchallenged with their thumbs up their asses, laughing as they charged $1K for cards that choked and died at 1440p.
itpro:

The Medium has better quality on XSX than a 3060 Ti with 4K DLSS. DLSS is the opium of PC gamers, it seems. The XSX is the absolute winner vs. a mid/low-end PC, and AMD is behind this miracle (Microsoft included). I do not believe anyone would want to pay ~$500 for a GPU only to lose to a mere console.
Doesn't this run at 30 fps?
GREGIX:

So it is yet another upscaler, but with a fancy name...
Yeah, but no.
Seemed to work not long after AMD made their image sharpening open source.
Undying:

Not for the price of blurring the image.
Please list one DLSS 2.0 game where DLSS blurs the image. Watch the HUB video on it from last year, where they reviewed DLSS 2.0 and basically said that it is free performance and it makes no sense to disable it when available.
DLSS 2.0 is freaking amazing... Everyone that talks trash about it either can't get a 30xx, can't afford a 30xx, or has an AMD card and just feels frustrated because they can't use it... Only when you see it with your own eyes will you understand how close it looks to the actual resolution... at 4K it's indistinguishable from the actual resolution.
ViperAnaf:

DLSS 2.0 is freaking amazing... Everyone that talks trash about it either can't get a 30xx, can't afford a 30xx, or has an AMD card and just feels frustrated because they can't use it... Only when you see it with your own eyes will you understand how close it looks to the actual resolution... at 4K it's indistinguishable from the actual resolution.
I actually really liked 2.0 at 1440p Quality (960p).
Neo Cyrus:

nVidia actually does things when there's the threat of competition possibly showing up in the not-too-distant future... who would have thought. It would have been great if nVidia hadn't spent all the time they were unchallenged with their thumbs up their asses, laughing as they charged $1K for cards that choked and died at 1440p.
Nvidia has been working closely with Epic Games for over 15 years, and especially since UE4 went license-free. They've had a dedicated Unreal Engine branch with several of their technologies available: global illumination, GPU-based destruction, fluid simulation, hair simulation... it's been going on for years.
The beauty of an in-engine implementation is that it opens up built-in customizations and optimizations. This, for me, is a major step for adoption. Nvidia have done well, and now the value of their hardware has hope of solidifying itself. DLSS is still really new and it needs work. However, eventually there's going to be widespread support, with problems ironed out or reduced to insignificance. It's the same old story for any new tech in its early years. Plenty of naysayers, from the same branch who were against colour TV, and we laugh about that in hindsight, but that s*** really happened! Anyway, I wonder if it's possible to boost the performance of other AA techniques using the same cores? In my head I designed the next gen to have not only more RT and Tensor/AI cores, but also to have these components take up a higher percentage of the GPU die. When it feels like it's almost free performance, it'll become standard and accepted.
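On the in-engine point: once DLSS ships as a UE4 plugin, it can be driven like any other engine feature, for example through console variables from game code. A minimal sketch of what that could look like, assuming the plugin exposes cvars under an r.NGX.DLSS.* namespace (the exact cvar names and the quality enum value below are assumptions, not confirmed plugin API; the IConsoleManager calls themselves are standard UE4):

```cpp
#include "HAL/IConsoleManager.h"

// Hypothetical sketch: toggle DLSS and pick a quality preset via cvars,
// the same way any other built-in rendering feature is controlled.
void EnableDlssQualityMode()
{
    // FindConsoleVariable returns null if the plugin isn't loaded,
    // so this code path is safe on hardware without DLSS.
    if (IConsoleVariable* Enable =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Enable")))
    {
        Enable->Set(1);
    }
    if (IConsoleVariable* Quality =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Quality")))
    {
        Quality->Set(1); // assumed enum value for the "Quality" preset
    }
}
```

Because the lookup simply fails when the plugin isn't present, a game could ship the same code path everywhere and light DLSS up only where it's available.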
Lol at the fanboys who don't have DLSS STILL maintaining that ignorance. Oh yeah, "DLSS and RTX are ONLY to be used together!" Wtf? They are totally separate features that happen to play very nicely with one another. DLSS image reconstruction looks better than a native resolution render at times.
DLSS 2.1 could be a game changer for VR. That is where I want to see how it performs. Nvidia approved it for VR use a few months ago, I believe, yet no games use it yet. I was hoping Medal of Honor would, and it still might, but with Facebook all but abandoning PCVR for the mobile Quest as it has taken off, they were really the only ones funding AAA VR games on PC outside of Half-Life, Valve's one gift, ha.

Apple is entering the VR market soon, albeit with a very expensive premium headset, so that's good. And if Carmack can actually get wireless PCVR added to the Quest 2 without needing 3rd-party software, that could still help the PC market, as the Quest 2 is now 2nd only to the Rift S in Steam market share, beating HTC and the Index, and that's with only like 10 percent of Quest users even figuring out how to connect it to a PC, I read, ha.

The tech is finally getting here for lifelike VR and the userbase is growing faster than ever with the Quest 2, but it's still not big enough for AAA devs to roll in, and Facebook was funding things for the last 4 years at a huge loss to grow the market, and they are done. About the only good thing Zuckerberg has ever done. But of course they disabled the ability to block all the Facebook spying telemetry in the latest firmware update, but that's another story...
cucaulay malkin:

Doesn't this run at 30 fps?
Yes, but it holds stable frames and fluid frametimes compared to a 3060 Ti without DLSS.
itpro:

Yes, but it holds stable frames and fluid frametimes compared to a 3060 Ti without DLSS.
But doesn't the 3060 Ti run at more than 30 fps? https://gamegpu.com/action-/-fps-/-tps/the-medium-test-gpu-cpu shows 33/36 min/avg at native 4K, max PC settings.

Also, 30 fps = https://www.reactiongifs.us/wp-content/uploads/2013/07/puking_brian.gif I can't play at 30 or 35 fps, even with a controller and G-Sync, and believe me, I've seen that on a GTX 1070 at 1440p these days. 45+ fps or bust. So as much as I like the hardware in the XSX, when you're locking it to 30 I'm gonna strongly opt out. At least DLSS gives you playable fps with minimal, if any, visual loss at 4K DLSS Quality.

And it still uses dynamic resolution on console: at 8:00, it's 1440p-4K in single-screen mode and goes as low as 900p in split-screen mode on Series X. [youtube=cm4uNK5ydGw]

So yeah, the 3060 Ti wins every way. Pause this at 1:45 to see 4K DLSS low (Performance) absolutely trounce the XSX in quality. Watch further: RT on the XSX is a quarter of the PC's ray tracing resolution, missing opaque reflections and translucency reflections, with lower quality shadows and lower quality DoF... [youtube=zfMpYDE-ab4]

Anyway, I watched both videos and the game is a mess, both on PC and consoles. The XSX stutters all the time; so does a PC without adaptive half-refresh V-Sync enabled in NVCP. Frame drops are all over the place. The smoothest way to play is gonna be DLSS + Nvidia's half-refresh V-Sync locked to 30, which is still terrible by my standards.
XenthorX:

Nvidia has been working closely with Epic Games for over 15 years, and especially since UE4 went license-free. They've had a dedicated Unreal Engine branch with several of their technologies available: global illumination, GPU-based destruction, fluid simulation, hair simulation... it's been going on for years.
nVidia accomplished jackshit in the same galaxy as bringing DLSS to the masses over the last 15 years... and then did exactly that within a tiny window from the moment it was given the green light. Don't act like they haven't kicked it into high gear now that they feel like their monopoly is broken, the moment AMD stock exists and Intel is entering the market. Where's the advancement of PhysX? What happened to the destruction, fluid simulation, and hair simulation? Oh yeah, they stagnated into obscurity, and when implemented they eat your GPU alive because they've hardly been improved in a decade. PhysX gets an update once in a galactic rotation, and it's usually a bugfix. I don't remember the last time it was improved or added anything that any dev used even once. nVidia, like any major corporation, only makes noteworthy progress when they feel they're forced to / it's optimal for profits. Jensen can't let filthy AMD or Intel get mind share; his leather jacket would weep.
The fact that DLSS is a successful project that meets a demand right now, given the push for ray tracing, can't overshadow the fact that Nvidia has been sharing code and advanced tech implementations inside Unreal Engine for years, at least openly back to 2015. Saying that it's just a recent trend from Nvidia is just misreading the situation to fit a narrative; a bunch of games have shipped on UE4 using Nvidia's Turbulence effects, HBAO/HBAO+, TXAA, and Nvidia's ray-tracing-optimized Unreal Engine branches. The Nvidia Blast destruction tech integrated into UE4 felt like the groundwork for Epic Games' refactor of destruction in the engine using Chaos, for destruction and, more globally, physics management.
I'm using an LG 34WK95U (5120x2160). With my old 3090, DLSS "1.0" and "2.0" were blurry as hell 🙁. Nvidia told me DLSS doesn't support this monitor, so I put the card in storage and got a new 6900 XT 😕 while waiting for them to fix it. BTW, maybe my eyes have some problems, but I can see the 6900 XT's colors better, and somehow clearer, than my 3090's o_O (I turned on 10 bpc and full RGB in the Nvidia panel).
XenthorX:

Saying that it's just a recent trend from Nvidia is just misreading the situation to fit a narrative
It's delusional to think "nVidia are pushing way harder because competition may now exist" is a narrative. Yeah, they're totally going into overdrive because they think they'll still be competing with no one but themselves. It's far beyond naïve; believe what you want.
Neo Cyrus:

It's delusional to think "nVidia are pushing way harder because competition may now exist" is a narrative. Yeah, they're totally going into overdrive because they think they'll still be competing with no one but themselves. It's far beyond naïve; believe what you want.
In 2018, Microsoft showcased DirectML doing the same thing DLSS does (albeit using Nvidia's work). Why are we still not seeing it in action? Where is AMD/MS/Sony/Intel's response? Could it be that it's HARD? Could it be that you can't turn on a dime and release something as disruptive as DLSS the moment you feel threatened?