Battlefield V: DLSS PC Performance Update

Great job adding the comparisons! DLSS is definitely blurrier; look at the vegetation and the trees. So blurry for a 4K output... Also, the vehicle on fire is far more detailed at native 4K.
What about 3440x1440? Is it supported?
Thanks Hilbert! About DLSS: it's not just blurry, it simply has much less detail, so of course the FPS will improve. It looks like DLSS can do OK on simple surfaces, but on very complicated objects/textures it isn't useful; the effect is just too poor. Sorry, NV fans :-(. https://i.postimg.cc/Y0Y9DCWx/DLSSBFV.jpg
For what it is, I'm surprisingly impressed. Better article than the previous one.
Netherwind:

What about 3440x1440? Is it supported?
I have an AW3418DW and an MSI 2080 Gaming X Trio. It works at 3440x1440, but the image quality loss isn't worth it. I prefer to play with DLSS and RTX off.
@Hilbert Hagedoorn: Can we get a screenshot with DLSS on and "resolution scale" set high enough to bring the fps down to match the DLSS-off screenshot? (So we can compare image quality at equal fps.) Side note: not allowing DLSS at 4K for weaker cards is like saying "Pay extra!"
Limiting DLSS per resolution on different cards is intentional and disgusting. I agree with Fox; it is like saying "pay more". Such a heavily marketed feature turns out, in the end, to be another useless one.
Little joke time 🙂. How would Thanos look if he were rendered with DLSS on? [spoiler='Click to View'] https://i.postimg.cc/NG1vbjfL/25-thanos.jpg [/spoiler] I think I captured the DLSS effect on quality accurately; am I right?
HWgeek:

Little joke time 🙂. How would Thanos look if he were rendered with DLSS on? [spoiler='Click to View'] https://i.postimg.cc/NG1vbjfL/25-thanos.jpg [/spoiler]
DLSS turns him from intimidating and angry into an elderly man whose expression suggests he's struggling to understand something in front of him.
It is absolutely illogical that I can't use DLSS at 1440p on my 2080 Ti. GeForce Experience says it should be available, but it's greyed out in game. Another thing no one ever mentions: with ray tracing on, I get severe mouse lag. Anyone else?
Hilbert, I'd very much like to see the following: a comparison between DLSS on, DLSS off, 75% resolution scaling at the same resolution, and 75% resolution scaling at the same resolution on a Radeon VII. I think that would make for a very interesting comparison.
Blurry mess with missing detail; looks terrible.
The upscaling from 1440p to 2160p costs about 10 fps, from 65 to 55... so now I'm interested in seeing the differences between native 1440p and DLSS 2160p, trying to find something to blame Nvidia for 😛 Btw, great work Hilbert, I like the sliders a lot; another toy to lose time with.
Nvidia RTX: a feature set that keeps on pissing off RTX owners...
Enabling DLSS only for 4K resolutions makes no sense, obviously... but perhaps it looks a lot less blurry at 4K; that would be my guess here. Restricting its use is certainly no oversight. Ah, yes, the nVidia I remember so well is back! Pulling image-quality tricks, deceits, and sleights of hand once again...! It's Back to the Future, alright....;) It's indeed gratifying to know that when blurriness increases, so does frame rate, even if the Tensor cores kill the frame rate by nearly half. (That in itself is weird coming from nVidia, because the company has traditionally been all about benchmarked frame rates and very little else; it fought for years against FSAA until it learned how to do it in hardware, etc.) The one thing I didn't really understand here is the difference in the comparative screenshots: the DLSS On shots show vegetation close up, while the DLSS Off shots put the vegetation in the far distance, obscured by fog. Scratching my head on that one...?
waltc3:

Enabling DLSS only for 4K resolutions makes no sense, obviously... but perhaps it looks a lot less blurry at 4K; that would be my guess here. Restricting its use is certainly no oversight. Ah, yes, the nVidia I remember so well is back! Pulling image-quality tricks, deceits, and sleights of hand once again...! It's Back to the Future, alright....;) It's indeed gratifying to know that when blurriness increases, so does frame rate, even if the Tensor cores kill the frame rate by nearly half. (That in itself is weird coming from nVidia, because the company has traditionally been all about benchmarked frame rates and very little else; it fought for years against FSAA until it learned how to do it in hardware, etc.) The one thing I didn't really understand here is the difference in the comparative screenshots: the DLSS On shots show vegetation close up, while the DLSS Off shots put the vegetation in the far distance, obscured by fog. Scratching my head on that one...?
DLSS isn't restricted to 4K in BF5? Also, you keep confusing Tensor cores with RT cores. Tensor cores do DLSS, and DLSS does not "cut frame rate by half"; RT cores do RTX/DXR, and that is what cuts the frame rate. Nvidia intended the Tensor cores to denoise the RT image, but DICE doesn't use them at all for RT in BF5.
My main complaint about DLSS: if nVidia had marketed it as "better quality at the same performance", and enabling DLSS in game had indeed produced a deep-learning upscaled image with much better quality and no performance loss, that would be great and worth the $$. But using fancy tech that reduces image quality to improve performance? If I'm not mistaken, that was called "cheating" back in the day. Also, limiting the options for when and how you can use DLSS just proves my point.
HWgeek:

My main complaint about DLSS: if nVidia had marketed it as "better quality at the same performance", and enabling DLSS in game had indeed produced a deep-learning upscaled image with much better quality and no performance loss, that would be great and worth the $$. But using fancy tech that reduces image quality to improve performance? If I'm not mistaken, that was called "cheating" back in the day. Also, limiting the options for when and how you can use DLSS just proves my point.
Well, they specifically marketed it as an improved TAA. If you go and read the SIGGRAPH whitepaper/slides from 2017 on the early versions of DLSS (a convolutional auto-encoder), it was basically designed as a TAA variant that gets rid of some of TAA's motion-artifact issues. The thing about TAA, though, is that it's not a standard solution; each dev integrates it differently, with varying degrees of quality. FFXV, for example, has terrible TAA, same with Port Royal. DICE typically does a good job with their variant, but they are also notorious for not letting you disable it. Further, the issues with TAA are mostly motion issues, so people comparing single frames of games aren't really looking at the things DLSS was supposed to solve in the first place. The limitations on how/when you can use DLSS are more an intrinsic property of how DLSS training works than nefariousness by Nvidia: they spend weeks training each resolution, and RTX on vs. off has to be trained in separate sessions. I should also point out that the training is ongoing, and Nvidia says they'll be delivering DLSS updates to games via GeForce Experience. Also, I don't think an option to turn something on that can potentially reduce quality for a performance improvement is considered cheating.
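To make the "convolutional auto-encoder" idea above a bit more concrete, here is a minimal toy sketch in PyTorch: a small encoder/decoder that takes a low-resolution frame and is trained to reproduce a higher-resolution reference frame. The layer sizes, the 2x scale factor, the L1 loss, and the random tensors standing in for frames are all illustrative assumptions; this is a sketch of the general technique, not NVIDIA's actual network or training pipeline.
[code]
# Toy convolutional auto-encoder upscaler (illustrative only, not NVIDIA's DLSS network).
import torch
import torch.nn as nn

class ConvAutoEncoderUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        # Encoder: compress the aliased low-resolution frame into a feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: expand the features back out and predict a frame at `scale` times the input size.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, kernel_size=2 * scale, stride=scale, padding=scale // 2),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = ConvAutoEncoderUpscaler(scale=2)
low_res = torch.rand(1, 3, 270, 480)          # stand-in for a 480x270 rendered frame
high_res_target = torch.rand(1, 3, 540, 960)  # stand-in for the 960x540 reference frame
output = model(low_res)
loss = nn.functional.l1_loss(output, high_res_target)  # train against the high-quality reference
print(output.shape, loss.item())
[/code]
Training such a network means running many frame pairs like this through an optimizer per game and per resolution, which is consistent with the point above about why each resolution needs its own lengthy training run.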
Thanks for the info. I do hope that in the coming months we're going to see better results. I'm just worried that NV will go the AI/DL route and claim FPS boosts from the technology instead of improving the silicon (as they did, for example, with the performance jumps between the GTX 680/780/980/1080).
We all know that bitmaps cannot be enlarged arbitrarily, because even the most perfect filter cannot extract/display sub-pixel information that was never saved (it simply does not exist). The same applies to upscaling: you can create the illusion of sharpness at first glance, but no anti-aliasing technique will let you display details that were never rendered, because the rendered frame's resolution was too low. Perhaps it's worth checking whether using high-resolution textures even makes sense with DLSS enabled? Maybe smaller textures won't worsen the image quality and will boost fps?
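The point about detail that was never stored can be illustrated with a few lines of Python (numpy only). The 2x factor, the noise image standing in for a detailed frame, and the box/nearest-neighbour filters are arbitrary choices for this little demo, not a claim about any particular upscaler:
[code]
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a detailed native-resolution frame: high-frequency noise is the
# worst case for reconstruction, since every pixel carries unique detail.
native = rng.random((512, 512))

# "Render" at half resolution by averaging 2x2 blocks (box-filter downscale)...
low_res = native.reshape(256, 2, 256, 2).mean(axis=(1, 3))
# ...then upscale back to native size by repeating pixels (nearest-neighbour).
upscaled = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)

# The remaining difference is the sub-pixel detail that was never stored.
err = np.abs(native - upscaled)
print(f"mean absolute error after down/upscale: {err.mean():.3f} (signal range 0-1)")
[/code]
Swapping in a smarter reconstruction filter changes how pleasant the result looks, but on content like this the error never goes to zero, which is exactly the argument made above.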