Battlefield 5 gets DLSS support starting with the 12th of February patch

Bye bye RVII's performance advantage. Ouch. Improved DXR ray tracing performance will be nice too; with DLSS this should make quite a big difference.
Alberto:

@Hilbert Hagedoorn why have you ignored this news?
There are like a million articles posted a day that don't get posted here - why do you think he's ignoring this specific one?
metagamer:

Bye bye RVII's performance advantage. Ouch.
Arguable. DLSS in FFXV looked pretty average and introduced all kinds of weird visual artifacts. Hopefully it's better here but I'm not holding my breath.
metagamer:

Bye bye RVII's performance advantage. Ouch. Improved DXR ray tracing performance will be nice too; with DLSS this should make quite a big difference.
And hello image quality degradation. Hah.
Undying:

And hello image quality degradation. Hah.
quite the opposite.
metagamer:

quite the opposite.
How is it the opposite when you haven't seen it in BF5, and in the only game we've seen it in, it introduces a bunch of weird issues?
Undying:

And hello image quality degradation. Hah.
If you had 90fps in some repeatable place before (spawn of the test range), you can go there and increase the internal render resolution until you have the same 90fps again. And at that point, you have better IQ at the same fps than without DLSS. (Hopefully.)
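A rough sketch of the iso-fps comparison described above (not from the thread; the measure_fps() helper and its toy numbers are made up for illustration): with DLSS on, raise the internal resolution scale until the frame rate drops back to the non-DLSS baseline, then compare image quality at that matched frame rate.

```python
def measure_fps(resolution_scale: float, dlss: bool) -> float:
    # Toy stand-in for benchmarking a repeatable spot (e.g. the test-range spawn):
    # DLSS buys some headroom, and cost grows with pixel count (scale squared).
    base = 120.0 if dlss else 90.0
    return base / (resolution_scale ** 2)

def find_iso_fps_scale(baseline_fps: float, lo: float = 1.0, hi: float = 2.0) -> float:
    """Binary-search the resolution scale at which fps with DLSS matches the baseline."""
    for _ in range(40):
        mid = (lo + hi) / 2.0
        if measure_fps(mid, dlss=True) > baseline_fps:
            lo = mid   # still faster than the baseline: push resolution higher
        else:
            hi = mid   # now slower than the baseline: back off
    return (lo + hi) / 2.0

baseline = measure_fps(1.0, dlss=False)   # e.g. ~90 fps without DLSS
scale = find_iso_fps_scale(baseline)
print(f"DLSS matches {baseline:.0f} fps at ~{scale:.2f}x resolution scale")
```

In practice this would just be a few manual passes over the in-game resolution scale slider rather than a script, but the idea is the same: match the frame rate first, then judge the image quality.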
Denial:

How is it the opposite when you haven't seen it in BF5, and in the only game we've seen it in, it introduces a bunch of weird issues?
Have you seen it in BFV? From what we know, it has some niggles; shimmering is one. If it's too pronounced, I won't be using DLSS. But it has some pros, such as producing a sharper image than TAA. It's early days for DLSS; things may start out rough. We'll have to wait and see.
Denial:

There are like a million articles posted a day that don't get posted here - why do you think he's ignoring this specific one? Arguable. DLSS in FFXV looked pretty average and introduced all kinds of weird visual artifacts. Hopefully it's better here but I'm not holding my breath.
From what I saw in the DF test, it degraded the image quality. All the signs and car licence plates were blurry using DLSS while TAA was sharp. The same goes for the wires/ropes/fences, which were invisible if you were too far away from them; weird, like you said. Also, that indoor scene with the trader was a mess with DLSS. Everything behind him was a blurry mess, where again TAA was much sharper.
Administrator
Alberto:

@Hilbert Hagedoorn why have you ignored this news?
Haven't ignored anything, I am just now reading this. Why do some of you guys automatically assume the negative?
Why do you assume DLSS is a downgrade out of the box? So DirectML, AMD's hope, is the omen? The perfection? The greatness? I find DLSS great, better than stuttering combined with low fps and inferior AA.
warlord:

Why do you assume DLSS is a downgrade out of the box? So DirectML, AMD's hope, is the omen? The perfection? The greatness? I find DLSS great, better than stuttering combined with low fps and inferior AA.
As already stated in the thread, the only example we have of DLSS is not great - it arguably makes image quality worse and introduces a bunch of weird artifacts. That's not to say it will be bad here, but taking the default stance of "quite the opposite of degrade" is kind of weird. No one even mentioned DirectML in here, and honestly, no - any implementation using DirectML will have the same challenges, probably more, considering AMD can't seem to get any of their technologies into more than a few games and I doubt AMD is going to pay to do the training like Nvidia is.
Nice, we'll finally see some real dynamic comparison with an engine that can do other AA techniques properly. Still, the time it takes for a DLSS implementation seems way too long.
Fox2232:

If you had 90fps in some repeatable place before (spawn of the test range), you can go there and increase the internal render resolution until you have the same 90fps again. And at that point, you have better IQ at the same fps than without DLSS. (Hopefully.)
That's not how AI works. There's the training part, which requires massive computation and memory (rerunning the algorithms again and again, changing a massive number of weights), and there's inferencing, which is a compressed, summarized algorithm that uses a lot less memory and compute. Meaning the algorithm is a finished product once Nvidia has shipped it; it doesn't continue to learn on your machine.
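For illustration only, a minimal sketch of that training/inference split, written as a generic PyTorch workflow (the tiny network, the file name, and the random data are stand-ins, not Nvidia's actual DLSS pipeline): the expensive weight updates happen offline on the vendor's hardware, while the shipped model is only ever evaluated, never trained, on the user's machine.

```python
import torch
import torch.nn as nn

# Stand-in "upscaler" network; the real DLSS model is far larger.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)

# Vendor side (offline): heavy training loop that keeps updating the weights.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
low_res, high_res = torch.rand(1, 3, 64, 64), torch.rand(1, 3, 64, 64)  # stand-in pair
for _ in range(10):  # real training runs far longer, on far more data
    loss = nn.functional.mse_loss(model(low_res), high_res)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
torch.save(model.state_dict(), "shipped_weights.pt")  # what actually gets distributed

# User side (in-game): the weights are loaded and frozen; no learning happens here.
deployed = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)
deployed.load_state_dict(torch.load("shipped_weights.pt"))
deployed.eval()
with torch.no_grad():  # inference only: no gradients, no weight updates
    upscaled = deployed(torch.rand(1, 3, 64, 64))
```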
HardwareCaps:

That's not how AI works. There's the training part, which requires massive computation and memory (rerunning the algorithms again and again, changing a massive number of weights), and there's inferencing, which is a compressed, summarized algorithm that uses a lot less memory and compute. Meaning the algorithm is a finished product once Nvidia has shipped it; it doesn't continue to learn on your machine.
Read the text I wrote. Comprehend. Reiterate if you came to the same conclusion as in your post above.
Fox2232:

Read the text I wrote. Comprehend. Reiterate if you came to the same conclusion as in your post above.
Pretty obvious it was sarcastic; I just wanted to clarify. I'm not a sarcastic guy myself....
It'll be interesting to see if textures will take a hit with this too; I'm guessing they will.
Yet, they still don't have an option to turn off TAA....
HardwareCaps:

Meaning the algorithm is a finished product once Nvidia has shipped it; it doesn't continue to learn on your machine.
Not true. The algorithm can continue to be refined on the server and either updated in future drivers or shipped down to GFE if there are any DLSS updates.
I have been looking forward to this feature. DLSS and RT in Battlefield V; hopefully 1440p goodness with my RTX 2080.
metagamer:

quite the opposite.
Not that I have seen. Besides FF, there is also the Unreal 4 engine Infiltrator demo, and that has weird things going on with DLSS; far scenes are washed out really badly. But it does have higher fps. Nothing is ever perfect.
Not true. The algorithm can continue to be refined on the server and either updated in future drivers or shipped down to GFE if there are any DLSS updates.
Maybe I'm just being negative here, but I feel like whatever optimization they've done has likely been running for months now, and whatever implementation they release will be considered their best effort. This being their flagship title, if this is bad, I wouldn't expect anything else to be better. Either way, I can't imagine upscaling ever looking better than native, especially up close like on a computer monitor.