Unity Adding NVIDIA DLSS Support to Their Game Engine

Good, bring more, that will pressure AMD. Competition = progress.

kapu:

Good, bring more, that will pressure AMD. Competition = progress.
I think they've been under enough pressure to show their own image reconstruction technology since DLSS 2.0 came out more than a year ago. Adding DLSS to all major engines would ensure fantastic longevity for RTX 20/30 cards: when Pascal owners have to start dropping resolution scaling, Turing owners can use DLSS instead. Frankly, this is amazing. I wish my 1070 had it.

cucaulay malkin:

I think they've been under enough pressure to show their own image reconstruction technology since DLSS 2.0 came out more than a year ago. Adding DLSS to all major engines would ensure fantastic longevity for RTX 20/30 cards: when Pascal owners have to start dropping resolution scaling, Turing owners can use DLSS instead. Frankly, this is amazing. I wish my 1070 had it.
Imagine if the 1080 Ti had it. That would be nuts, probably the longest-lasting high-performance card in history 😀 DLSS 2.0 is not enough pressure. It needs wide support. Progress is progress, anything is good.

kapu:

DLSS 2.0 is not enough pressure.
🙄 OF COURSE IT ISN'T. I mean, what kind of AMD owner would want a feature like that?

kapu:

Good, bring more, that will pressure AMD. Competition = progress.
DLSS is the main reason why I didn't move to AMD in my last upgrade. And I actually wanted to move.

It's a pity the main AI neural net has to be trained on images from the game that needs it, so it doesn't work as well in other, untrained games or applications. Or does it? Not sure. It sounds like that from what I read.

geogan:

It's a pity the main AI neural net has to be trained on images from the game that needs it, so it doesn't work as well in other, untrained games or applications. Or does it? Not sure. It sounds like that from what I read.
It looks like anything can be reconstructed with DLSS 2.0, but to get satisfactory results you need training.

kapu:

Imagine if the 1080 Ti had it. That would be nuts, probably the longest-lasting high-performance card in history 😀 DLSS 2.0 is not enough pressure. It needs wide support. Progress is progress, anything is good.
AMD's version, which is supposedly going to work even on GCN cards, will very likely work on the 1080 Ti too.

cucaulay malkin:

🙄 OF COURSE IT ISN'T. I mean, what kind of AMD owner would want a feature like that?
I think any user would like that. For now I don't need it, but in 2 years the card will age and that would help a lot.

Fox2232:

AMD's version, which is supposedly going to work even on GCN cards, will very likely work on the 1080 Ti too.
If something seems too good to be true, it probably is. At this point, AMD still doesn't know which direction they're going (DirectML or no DirectML?). My two cents: if FSR can be used on all hardware, the image quality will turn out terrible (maybe better than upscaling + CAS, but nowhere near DLSS). There are also other upscaling techniques, like Unreal's Temporal Upsampling, which works out pretty well but still loses to DLSS.
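For anyone wondering what upscaling + CAS actually amounts to: it's just a plain resize followed by a sharpening pass whose strength adapts to local contrast, so no new detail is ever recovered. Here's a toy grayscale sketch of the idea in Python (my own simplification, nothing like AMD's actual CAS shader):

```python
import numpy as np

def upscale_nearest(img, scale):
    # Naive nearest-neighbour upscale: repeat each pixel `scale` times per axis.
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

def adaptive_sharpen(img, strength=0.5):
    # Toy contrast-adaptive sharpen: sharpen less where local contrast is
    # already high, so edges don't ring. Values are assumed to be in [0, 1].
    padded = np.pad(img, 1, mode="edge")
    n, s = padded[:-2, 1:-1], padded[2:, 1:-1]   # north/south neighbours
    w, e = padded[1:-1, :-2], padded[1:-1, 2:]   # west/east neighbours
    c = img
    contrast = np.maximum.reduce([n, s, w, e, c]) - np.minimum.reduce([n, s, w, e, c])
    amount = strength * (1.0 - contrast)         # high contrast -> less sharpening
    blur = (n + s + w + e + 4.0 * c) / 8.0       # cheap local average
    return np.clip(c + amount * (c - blur), 0.0, 1.0)  # unsharp-mask style push

# A random grey image stands in for a rendered frame.
frame = np.random.rand(90, 160)                  # low-res render, values in [0, 1]
out = adaptive_sharpen(upscale_nearest(frame, 2))
print(out.shape)                                 # (180, 320)
```

No extra information enters that pipeline anywhere, which is why temporal methods like DLSS, which pull real detail out of previous frames, can look so much better.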
This is the perfect example of how "small" AMD is compared to both Nvidia and Intel: they can afford to create software features alongside their hardware, something that's much more difficult for AMD because it doesn't possess the same financial resources.

H83:

This is the perfect example of how "small" AMD is compared to both Nvidia and Intel: they can afford to create software features alongside their hardware, something that's much more difficult for AMD because it doesn't possess the same financial resources.
First of all, it takes a lot of planning in advance. DLSS itself was just an afterthought: it started with putting tensor cores on Volta, then using them on Turing, then improving them on Ampere to cut the number of tensor cores while keeping the same performance available. The 2080 Ti has 544 tensor cores; the 3070 has 180-something. Even mobile entry-level cards like the 3050 are going to get it this generation. It was a long-term investment that will pay off. AMD too often has to whip something up fast, and there is no guarantee they won't have to search for other solutions for later generations.
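For reference, the napkin math behind those counts; the per-SM figures are my assumption (8 tensor cores per Turing SM, 4 per Ampere SM, with each Ampere core roughly doubling per-core throughput):

```python
# Napkin math on tensor core counts; per-SM figures assumed as stated above.
turing_sms, ampere_sms = 68, 46           # RTX 2080 Ti and RTX 3070 SM counts
tc_2080ti = turing_sms * 8                # 8 tensor cores per Turing SM -> 544
tc_3070 = ampere_sms * 4                  # 4 tensor cores per Ampere SM -> 184
print(tc_2080ti, tc_3070)                 # 544 184
# Effective ratio if each Ampere core does ~2x the work, ignoring clocks:
print(round(tc_3070 * 2 / tc_2080ti, 2))  # 0.68
```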
Every time an AMD fanboy mentions upscaling + CAS as an equivalent to DLSS, I'm ROFL...

ViperAnaf:

Every time an AMD fanboy mentions upscaling + CAS as an equivalent to DLSS, I'm ROFL...
Yeah, I think you missed your mark by 180°. I doubt @Krizby is an AMD fanboy. He seems to have an nVidia GPU and mentioned them in a rather negative tone, even though CAS is actually good for its performance impact, in contrast to the trashscaling available on AMD's side for DX11 and lower (DX12 replaced it with autonomous VRS). So, you made an almost good trash post... almost.

I almost bought a 5700 XT last year, but I decided to spring the few $ more for a 2070 Super (I suppose I could have done a plain 2070) because of its RT cores, and I don't regret it.

Noisiv:

Sure, it's equivalent, that's why they hit an all-time rock bottom:
Honestly, the 6000 series is a good product, and I think its market share would be better than the 5000 series'. I just think that, given the shortage/demand situation, AMD would rather spend its capacity at TSMC on Ryzen dies, which net them way larger profit margins than the GPUs.

After I bought a 3080 last December and saw for myself what DLSS 2.0 can do, I replaced it with a 6800 XT. Good riddance. DLSS is a gimmick! I prefer RAW.

valentyn0:

After I bought a 3080 last December and saw for myself what DLSS 2.0 can do, I replaced it with a 6800 XT. Good riddance. DLSS is a gimmick! I prefer RAW.
Lol

geogan:

It's a pity the main AI neural net has to be trained on images from the game that needs it, so it doesn't work as well in other, untrained games or applications. Or does it? Not sure. It sounds like that from what I read.
DLSS 2 does not use per-game training.
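It's a single generalised network that takes jittered low-resolution frames plus motion vectors and accumulates detail across frames, rather than one network per game. Here's a crude toy of just the accumulation part, with no neural net and a static scene (my own illustration, not NVIDIA's actual algorithm):

```python
import numpy as np

# Toy temporal accumulation: the non-neural skeleton behind DLSS-2-style
# reconstruction. Real DLSS reprojects history with motion vectors and uses a
# trained network to decide per pixel how much history to trust; this toy just
# scatters jittered low-res samples of a static scene into a high-res buffer.
SCALE = 2                              # render at half resolution per axis
H, W = 64, 64                          # output resolution
truth = np.random.rand(H, W)           # stand-in for the "true" high-res scene

history = np.zeros((H, W))             # accumulated high-res estimate
alpha = 0.1                            # exponential blend factor

for frame in range(64):
    # Sub-pixel jitter: each frame renders a different phase of the pixel grid.
    jy, jx = frame % SCALE, (frame // SCALE) % SCALE
    low_res = truth[jy::SCALE, jx::SCALE]          # jittered half-res "render"
    # Blend each sample into the high-res position it was taken from.
    view = history[jy::SCALE, jx::SCALE]
    history[jy::SCALE, jx::SCALE] = (1 - alpha) * view + alpha * low_res
    if frame % 16 == 15:
        print(frame + 1, np.abs(history - truth).mean())  # error keeps shrinking
```

Once things move, the history has to be reprojected and selectively thrown away, and deciding that per pixel is exactly the hard part the trained network handles.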