Battlefield V: DLSS PC Performance Update

data/avatar/default/avatar13.webp
Denial:

Well, they specifically marketed it as an improved TAA. If you go and read the SIGGRAPH whitepaper/slides from 2017 on the early versions of DLSS (a convolutional auto-encoder), it was basically designed as a TAA variant that gets rid of some of the TAA motion-artifact issues (see the toy sketch below). The thing about TAA, though, is that it's not a standard solution. Each dev integrates it differently with varying degrees of quality. FFXV, for example, has terrible TAA; same with Port Royal. DICE typically does a good job with their variant, but they are also notorious for not allowing you to disable it. Further, the issues with TAA are mostly motion issues, so people comparing single frames of games aren't really looking for the things DLSS was supposed to solve in the first place. The limitations on how/when you can use DLSS are more an intrinsic property of how DLSS training works than nefariousness by Nvidia. They spend weeks training each resolution, and RTX on vs. off has to be done in separate training sessions. Should also point out that the training is ongoing and Nvidia says they'll be delivering DLSS updates to games via GeForce Experience. Also, I don't think an option to turn something on that can potentially reduce quality for a performance improvement is considered cheating.
New anti-aliasing techniques are always welcomed, giving the user freedom of choice and flexibility. The issue is that DLSS requires dedicated hardware and is only supported by a single architecture (so far). You're basically paying in terms of thermals, die size and transistors for DLSS. If DLSS doesn't prove itself to be very close in terms of visuals, then it makes no sense; that die space can be used instead for CUDA cores that will negate the performance "increase" gained by DLSS while still being 100% useful in all available games.
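For anyone curious what the "convolutional auto-encoder" Denial mentions above refers to, here is a toy PyTorch sketch of a learned upscaler. It is not NVIDIA's network and none of the layer sizes come from the whitepaper; it only shows the general shape of the technique: encode a low-resolution frame into features, then decode to a higher output resolution.

```python
# Toy convolutional auto-encoder upscaler -- purely illustrative, not NVIDIA's DLSS network.
# Layer counts, channel widths and the 2x scale factor are arbitrary choices for this sketch.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    def __init__(self, channels=16):
        super().__init__()
        # Encoder: compress the low-res frame into a feature map at half its resolution.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample 4x from the feature map, i.e. 2x the original input resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(channels, channels, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(channels, 3, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# A 360x640 RGB "frame" in, a 720x1280 frame out. Training against ground-truth
# high-resolution frames is what would make this sharper than a plain upscale.
frame = torch.randn(1, 3, 360, 640)
print(ToyUpscaler()(frame).shape)  # torch.Size([1, 3, 720, 1280])
```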
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
HardwareCaps:

New anti-aliasing techniques are always welcomed, giving the user freedom of choice and flexibility. The issue is that DLSS requires dedicated hardware and is only supported by a single architecture (so far). You're basically paying in terms of thermals, die size and transistors for DLSS. If DLSS doesn't prove itself to be very close in terms of visuals, then it makes no sense; that die space can be used instead for CUDA cores that will negate the performance "increase" gained by DLSS while still being 100% useful in all available games.
Volta is essentially the same architecture as Turing, and the Titan V has significantly more CUDA cores than a 2080 Ti yet performs identically (even when both are overclocked) because the entire architecture is essentially TDP limited. Regardless, Tensor 'cores' exist within the SM - they aren't discrete cores but operational modes of the SM/ALUs - same as RT 'cores'. They add some negligible amount of die space, but for the most part they reuse existing transistors within the SM. People keep alluding to the fact that they take up all this die space, but no one has been able to explain to me how Nvidia managed to get the same CUDA-per-mm² density as GP100, on the same process density, with double the cache, dispatch units and RT/Tensor that supposedly "take up a ton of space". Further, AMD exposed INT4/INT8 for ML inferencing on the Radeon VII, presumably to support some type of GPUOpen variant of DLSS on DirectML - which they alluded to doing. DirectML supports tensors, so even if there were a vendor-agnostic standard it would still utilize the hardware.
https://forums.guru3d.com/data/avatars/m/156/156348.jpg
Going by the screenshots posted, I think it looks better with DLSS off. It looks less detailed and slightly blurry with DLSS, as if the game is running at a lower resolution than the monitor's native resolution and being upscaled with a good upscaler. No reason to do that if you are running an expensive 4K monitor. Might as well spend the extra money to get a 2080 Ti and run the game without DLSS.
data/avatar/default/avatar35.webp
Denial:

Volta is essentially the same architecture as Turing, and the Titan V has significantly more CUDA cores than a 2080 Ti yet performs identically (even when both are overclocked) because the entire architecture is essentially TDP limited. Regardless, Tensor 'cores' exist within the SM - they aren't discrete cores but operational modes of the SM/ALUs - same as RT 'cores'. They add some negligible amount of die space, but for the most part they reuse existing transistors within the SM. People keep alluding to the fact that they take up all this die space, but no one has been able to explain to me how Nvidia managed to get the same CUDA-per-mm² density as GP100, on the same process density, with double the cache, dispatch units and RT/Tensor that supposedly "take up a ton of space". Further, AMD exposed INT4/INT8 for ML inferencing on the Radeon VII, presumably to support some type of GPUOpen variant of DLSS on DirectML - which they alluded to doing. DirectML supports tensors, so even if there were a vendor-agnostic standard it would still utilize the hardware.
Not sure where you got that info (Nvidia didn't reveal much), but in their blog they clearly state that Tensor cores take up a decent amount of the SM while RT cores take additional die space. They managed to fit more cache in because the dies are much bigger on Turing (also, 12nm does improve density slightly). https://devblogs.nvidia.com/wp-content/uploads/2018/09/image11-601x1024.jpg https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/ Also, a clear indication is that the transistor count has grown between Pascal and Turing at the same CUDA count:
RTX 2060 - https://www.techpowerup.com/gpu-specs/geforce-rtx-2060.c3310 - 10,800 million transistors, 445 mm² die
GTX 1070 - https://www.techpowerup.com/gpu-specs/geforce-gtx-1070.c2840 - 7,200 million transistors, 314 mm² die
A huge difference.
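For reference, a quick back-of-the-envelope pass over the TechPowerUp figures quoted above (my own arithmetic, not part of the original post): by these numbers the transistor budget grew by roughly 50% at the same enabled CUDA-core count, while transistors per mm² changed far less.

```python
# Back-of-the-envelope numbers from the TechPowerUp pages linked above
# (transistor counts in millions, die sizes in mm^2, CUDA cores as listed).
rtx_2060 = {"transistors_m": 10_800, "die_mm2": 445, "cuda": 1920}  # TU106 die
gtx_1070 = {"transistors_m": 7_200,  "die_mm2": 314, "cuda": 1920}  # GP104 die

for name, gpu in (("RTX 2060", rtx_2060), ("GTX 1070", gtx_1070)):
    # Note: both cards use cut-down dies, so the per-core figure is only a rough comparison.
    print(f"{name}: {gpu['transistors_m'] / gpu['die_mm2']:.1f} M transistors/mm^2, "
          f"{gpu['transistors_m'] / gpu['cuda']:.2f} M transistors per enabled CUDA core")

ratio = rtx_2060["transistors_m"] / gtx_1070["transistors_m"]
print(f"Total transistor count: {ratio:.2f}x")  # ~1.50x at the same enabled CUDA-core count
```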
data/avatar/default/avatar12.webp
Thanks for the update, Hilbert. I don't buy this. Any feature that effectively lowers final output resolution, particularly of textures, just to lower processing cost / meet FPS acceptability is a worrying trend. What's the point in developing high/"ultra" res textures (and later 'texture packs') in the first place? I'm sure the game artists will be getting pretty pissed off. We see the same with the RT feature - scene RT complexity reduction to meet FPS targets. I do wish Nvidia placed a higher weighting on their QA department than on marketing... get your act together, Nvidia!! I purchased a 1440p monitor for one primary reason - to be able to benefit from higher-resolution, crisp graphics. I don't want the artists' hard work smeared.
https://forums.guru3d.com/data/avatars/m/260/260103.jpg
Mesab67:

I don't buy this. Any feature that effectively lowers final output resolution, particularly of textures, just to lower processing cost / meet FPS acceptability is a worrying trend. What's the point in developing high/"ultra" res textures (and later 'texture packs') in the first place? I'm sure the game artists will be getting pretty pissed off. I purchased a 1440p monitor for one primary reason - to be able to benefit from higher-resolution, crisp graphics. I don't want the artists' hard work smeared.
You're not forced to use it. Luckily, if you have a 20-series card you can disable it in the settings. Now, it would be a worry if it were forced on. So far, what we have seen of DLSS, it has to be said, is rather disappointing. After reading up on it, I had high hopes for it, but it certainly needs a lot of work if it is to be taken seriously moving forward.
data/avatar/default/avatar28.webp
"You're not forced to use it. Luckily if you have a 20 series card you can disable it in the settings. Now it would be a worry if it was forced on. So far what we have seen of DLSS, it has to be said is rather disappointing. After reading up on it, I had high hopes for it, but it certainly needs a lot of work if it is to be taken seriously moving forward." Yes, it would be a big issue if it was forced on and they would have been daft to have done so. But, It does bring us back to Marketing overriding QA (push it out, we'll fix it later...we promise 😉 ) - never a good move.
data/avatar/default/avatar19.webp
Denial:

They spend weeks training each resolution, and RTX on vs. off has to be done in separate training sessions. Should also point out that the training is ongoing and Nvidia says they'll be delivering DLSS updates to games via GeForce Experience.
If training each resolution took weeks, SaturnV would be one of the slowest supercomputers in the world. Fortunately, it is ranked 28th fastest in the world, so depending on the number of images provided by the game developers, it shouldn't take more than a few hours to produce the initial algorithm required for any game.
data/avatar/default/avatar09.webp
Metro will get a day one patch which will improve DLSS over the review code:
- Tuned DLSS sharpness to improve image quality
- Updated learned data for DLSS to improve image quality with DLSS on
- Fixed blurred UI when DLSS is enabled
https://www.metrothegame.com/news/patch-notes-summary/
https://forums.guru3d.com/data/avatars/m/260/260103.jpg
pharma:

Metro will get a day one patch which will improve DLSS over the review code:
- Tuned DLSS sharpness to improve image quality
- Updated learned data for DLSS to improve image quality with DLSS on
- Fixed blurred UI when DLSS is enabled
https://www.metrothegame.com/news/patch-notes-summary/
This will be interesting. Stay tuned. Not sure if I would get my hopes up just yet, though.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
HardwareCaps:

Not sure where you got that info (Nvidia didn't reveal much), but in their blog they clearly state that Tensor cores take up a decent amount of the SM while RT cores take additional die space. They managed to fit more cache in because the dies are much bigger on Turing (also, 12nm does improve density slightly). https://devblogs.nvidia.com/wp-content/uploads/2018/09/image11-601x1024.jpg https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/ Also, a clear indication is that the transistor count has grown between Pascal and Turing at the same CUDA count:
RTX 2060 - https://www.techpowerup.com/gpu-specs/geforce-rtx-2060.c3310 - 10,800 million transistors, 445 mm² die
GTX 1070 - https://www.techpowerup.com/gpu-specs/geforce-gtx-1070.c2840 - 7,200 million transistors, 314 mm² die
A huge difference.
Those images are artist renditions - they aren't how the physical chip is laid out at the transistor level. Compare any of Nvidia's "die shots" to an actual microscope die shot of the same GPU - it's often pretty different. Also, that picture doesn't even show the FP64 units included for compatibility on Turing. The 7.5T library they are using on 12nm doesn't improve density - only the 6.5T one does. I compared GP100 to the Titan RTX for CUDA-per-mm² because GP100 is the only Pascal die that features FP16x2, which requires a number of registers; GP104 doesn't utilize FP16x2. Cache size and the number of dispatch units doubled with Turing, cache being one of the most transistor-dense units on the GPU. The ALUs do the actual computing and, when clustered, can function in different operational modes. Some transistors are required to add these modes, but for the most part the actual silicon in the ALU is reused. If every single marketed 'core' had a separate ALU with its own dedicated silicon, the die sizes would be astronomical. If you want further proof, run PyTorch on the GPU and then hit it with an FP32 workload and watch the performance of both drop - you have a fixed number of ALUs and both workloads are sharing them. It's not like you can do full INT/RTX/FP16/FP32 simultaneously.
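The PyTorch experiment described above can be reproduced with a short script. This is only a sketch under my own assumptions (matrix sizes, iteration counts, and the use of CUDA streams to overlap the two workloads are arbitrary choices); the point is simply that an FP16 (tensor-core-eligible) matmul and an FP32 matmul issued concurrently come out slower than either run alone, because they contend for the same SMs.

```python
# Rough illustration of FP16 and FP32 workloads sharing the same SMs.
import time
import torch

assert torch.cuda.is_available()
n = 4096
a16 = torch.randn(n, n, device="cuda", dtype=torch.half)
b16 = torch.randn(n, n, device="cuda", dtype=torch.half)
a32 = torch.randn(n, n, device="cuda", dtype=torch.float)
b32 = torch.randn(n, n, device="cuda", dtype=torch.float)

def bench(fn, iters=50):
    """Average wall-clock time per iteration, with proper GPU synchronisation."""
    torch.cuda.synchronize()
    t0 = time.time()
    for _ in range(iters):
        fn()
    torch.cuda.synchronize()
    return (time.time() - t0) / iters

fp16_alone = bench(lambda: a16 @ b16)
fp32_alone = bench(lambda: a32 @ b32)

# Issue both matmuls into separate CUDA streams so they can overlap on the GPU.
s1, s2 = torch.cuda.Stream(), torch.cuda.Stream()
def both():
    with torch.cuda.stream(s1):
        a16 @ b16
    with torch.cuda.stream(s2):
        a32 @ b32

overlapped = bench(both)
print(f"fp16 alone: {fp16_alone * 1e3:.2f} ms, fp32 alone: {fp32_alone * 1e3:.2f} ms, "
      f"both overlapped: {overlapped * 1e3:.2f} ms")
```

If the two workloads really had fully independent silicon, the overlapped time would approach the slower of the two alone; in practice it lands well above that because the ALUs are shared.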
https://forums.guru3d.com/data/avatars/m/224/224952.jpg
Well, this looks like a bum deal. You can be sure they made it look as good as they can to prevent backlash, and it's still not up to the job. They spent a long time preparing, so it must be pretty close to as good as it gets. The question is, how much better/worse does running 1440p on a UHD screen look? Fingers crossed Metro runs fast enough not to need DLSS.
https://forums.guru3d.com/data/avatars/m/34/34585.jpg
Netherwind:

What about 3440x1440? Is it supported?
Yes it is, well, on the RTX 2080 at least; I don't know about the RTX 2080 Ti. I get around 80 fps with RTX on ultra and DLSS. Before, I had to turn RTX down to medium to get around the same performance; ultra used to make it choke as I would run out of VRAM. However, the new patch seems to make the game unstable and it randomly crashes, e.g. exiting a single-player game to go to the main menu causes it to crash. Also, at long distances there's shimmering.
https://forums.guru3d.com/data/avatars/m/260/260103.jpg
Mufflore:

Well, this looks like a bum deal. You can be sure they made it look as good as they can to prevent backlash, and it's still not up to the job. They spent a long time preparing, so it must be pretty close to as good as it gets. The question is, how much better/worse does running 1440p on a UHD screen look? Fingers crossed Metro runs fast enough not to need DLSS.
Metro will get a day one patch which will improve DLSS over the review code:
- Tuned DLSS sharpness to improve image quality
- Updated learned data for DLSS to improve image quality with DLSS on
- Fixed blurred UI when DLSS is enabled
https://www.metrothegame.com/news/patch-notes-summary/
https://forums.guru3d.com/data/avatars/m/224/224952.jpg
Maddness:

Metro will get a day one patch which will improve DLSS over the review code:
- Tuned DLSS sharpness to improve image quality
- Updated learned data for DLSS to improve image quality with DLSS on
- Fixed blurred UI when DLSS is enabled
https://www.metrothegame.com/news/patch-notes-summary/
I'm not sure how that helps when we don't know what it looks like to start with or what it looks like after.
https://forums.guru3d.com/data/avatars/m/271/271585.jpg
I played a few rounds with DLSS enabled at 4K on a 2080 Ti. Previously I had been playing at 1440p ultra with RTX enabled. Some takeaways:

Pros:
- Random frame drops which had happened once or twice a match on RTX-heavy maps (Rotterdam) at 1440p are either no longer present or at least not noticeable.
- DLSS is excellent as a means of AA. Lines that give definition to 3D objects with curves are noticeably smoothed out.
- Subtle blur makes some objects look more "real" IMO (mostly with objects further away, enemies in motion). It's hard to describe and difficult to show in a still image.

Cons:
- Feels a bit heavy-handed. Like there needs to be a DLSS Low / DLSS High setting.
- FPS spikes over 60 in scenes where blur is noticeable and detracting from picture quality.
- As mentioned in the review, only available for 4K with a 2080 Ti (why?).
- Subtle blur makes some objects look muddy - items in the "foreground" or close to the player can be noticeably worse looking.
- Anywhere the blurring results in textures with less detail - see previous bullets.

I honestly don't know which of these is preferable now, 1440p RTX or 4k RTX / DLSS. I personally like the RTX effect in BFV enough to leave it on and DLSS does make RTX a better experience from an FPS standpoint. That said, I hope it is improved/optimized further so that the worst texture blur is less noticeable and/or we get some additional options.
https://forums.guru3d.com/data/avatars/m/163/163032.jpg
Romulus_ut3:

Hilbert, I'd very much like to see the following: a comparison between DLSS on, DLSS off, 75% resolution scaling at the same resolution, and 75% resolution scaling at the same resolution on a Radeon VII. I think that would make for a very interesting comparison.
This would be interesting.
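For context on what such a comparison would actually pit against each other, here is the pixel-count arithmetic. These are my own numbers, not from the review: the 2560x1440 internal resolution for DLSS at 4K is the figure commonly reported for BF V, and I assume the resolution-scale slider scales each axis.

```python
# Pixel budgets for the comparison suggested above (my own arithmetic).
native_4k = (3840, 2160)
scale_75 = (int(3840 * 0.75), int(2160 * 0.75))  # 75% per-axis scale -> 2880 x 1620
dlss_4k_input = (2560, 1440)                     # internal resolution commonly reported for DLSS at 4K

for label, (w, h) in [("native 4K", native_4k), ("75% scale", scale_75), ("DLSS input", dlss_4k_input)]:
    share = 100 * w * h / (3840 * 2160)
    print(f"{label:>11}: {w} x {h} = {w * h / 1e6:.2f} MP ({share:.0f}% of native)")
```

So a 75% scale renders roughly 56% of the native pixels, while DLSS at 4K reportedly starts from about 44%, which is why the comparison would be interesting.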
https://forums.guru3d.com/data/avatars/m/263/263487.jpg
Fox2232:

DLSS turns him from intimidating and angry into an elderly man with an expression showing he has trouble understanding something in front of him.
More like DLSS turns him into a Hollywood celeb asking for a smooth over 😛
https://forums.guru3d.com/data/avatars/m/106/106401.jpg
LOL! Indeed - DLSS could be very welcome in celebs' Photoshopped albums; DLSS will make all their wrinkle/skin problems disappear and make their skin smooth like a baby's face 🙂 DLSS should be an Instagram/Photoshop plug-in for skin smoothing / anti-aging effects! And I don't want to mention a certain big film-making industry that will very much appreciate such great fine-skin-detail removal algorithms for their 4K/8K videos in the future ;-). https://i.postimg.cc/YCtpvMyL/before-after-photoshop-celebrities-38-57d1347c1b9c5-700.jpg P.S. I don't remember where I read this, but in some comments on DLSS in BFV, some claimed that DLSS made some of the enemies too blurry to see (far away?), so it just hurts the game and you should disable it. Did you notice it too?
https://forums.guru3d.com/data/avatars/m/241/241158.jpg
Oh, great. Fake 4K, like on consoles. Lower the resolution to gain frames and upscale.