The Division 2: PC graphics performance benchmark review

The 1660 is slower than the 590 and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards. Overall the game seems to run well on decent hardware.
The DX12 issues I encountered in the beta are still present: occasional micro-freezes here and there, and the lighting going crazy. At least they fixed the random crashes.
I think disabling the Vignette effect could fix most of the lighting issues.
Undying:

The 1660 is slower than the 590 and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards. Overall the game seems to run well on decent hardware.
And the Radeon VII is destroyed by the 2080. Vega is so old news.
The Radeon VII is the same as the 1080 Ti at 1080p and better at anything above. Not bad, considering the 1080 Ti was the last OK card from Nvidia in terms of performance and perf/buck ratio. Too bad you can't buy either the VII or the 1080 Ti now... LOL! Too expensive for me anyway. But to me, the VII (16 GB HBM2) looks better than the 1080 Ti. That's not THAT bad and not THAT old news. Also, it's barely released and will work better in time. The game runs "ok"...
Define "destroyed"? 2-4 fps at 1440p and 4K?
Undying:

The 1660 is slower than the 590 and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards. Overall the game seems to run well on decent hardware.
What part of... "This is an AMD sponsored title so there are of course optimizations in play." ...do you not understand?
Embra:

Define "destroyed"? 2-4 fps at 1440p and 4K?
You do realize there are competitive gamers playing at 1080p 144 Hz+? That 10 fps difference in an AMD-sponsored title is quite laughable.
spajdrik:

I think disabling the Vignette effect could fix most of the lighting issues.
I can try that; my issue is that the light becomes so bright I get blinded.
warlord:

And the Radeon VII is destroyed by the 2080. Vega is so old news.
It's strange, because it's an AMD game and you'd expect the RVII to perform better, especially as the RVII and 2080 Ti are listed as the 4K60 cards for this game and the 2080 is listed as the 1440p card.
Undying:

The 1660 is slower than the 590 and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards. Overall the game seems to run well on decent hardware.
You're looking at it completely wrong, but that doesn't surprise me in the slightest. The 1660 Ti beats the 590 and the 1660 beats the 580. In an AMD-sponsored game.
The hyperbole in these threads is hilarious.
I put down my pitchfork the day before yesterday, but as always, thanks for the benchmarks.
If anyone is interested, here you can see how it runs on a Ryzen 2700X paired with an R9 380: [youtube=bgrN5QWzuzo] #waitingForNavi

I also wanted to make a DX11 vs DX12 comparison, but for some reason I'm unable to make this game work in fullscreen on DX11; anyone else having similar problems? 19.3.2 driver installed. After switching to DX11 the game starts in a window and I can't go fullscreen in any possible way: in-game settings, config edit, nor Alt+Enter. When I set fullscreen in the menu settings the image freezes but the game continues to work; after restarting, the game is in 1920x1061 windowed mode. When trying to switch to fullscreen with Alt+Enter the image freezes, the game is still windowed and continues to work.
Moderator
warlord:

You do realize there are competitive gamers playing at 1080p 144 Hz+? That 10 fps difference in an AMD-sponsored title is quite laughable.
This depends on the game completely. CSGO, for example: https://prosettings.net/cs-go-best-settings-options-guide/ Fortnite: https://prosettings.net/best-fortnite-settings-options-guide/ Overwatch: https://prosettings.net/overwatch-best-settings-options-guide/ Look at the settings in those guides, though; they're typically lower, and both cards would have no problem hitting that.
Dx11 lookin good! πŸ˜€
I wish we had 1% and 0.1% low charts too; the average alone is kinda not enough, because some cards will tank more than others.
metagamer:

I wish we had 1% and 0.1% low charts too; the average alone is kinda not enough, because some cards will tank more than others.
Especially those with a lower amount of VRAM. Try playing Apex with the 8 GB insane texture setting and a 6 GB card will choke and stutter.
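[Editor's note: for readers unfamiliar with the "1% / 0.1% low" charts being requested above, here is a minimal sketch of how such numbers are typically derived from per-frame render times. This is an illustration of the common method, not Guru3D's actual benchmarking tool.]

```python
# Sketch: average FPS plus 1% / 0.1% "lows" from per-frame times (ms).
# The "1% low" is the average FPS over the worst 1% of frames, so it
# captures the stutters that a plain average hides.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)

    # Sort slowest frames first, then average the worst slice.
    worst_first = sorted(frame_times_ms, reverse=True)

    def low(pct):
        k = max(1, int(n * pct / 100))  # at least one frame
        return 1000.0 * k / sum(worst_first[:k])

    return avg_fps, low(1.0), low(0.1)
```

For example, 990 frames at 10 ms with 10 stutter frames at 50 ms averages about 96 fps, but the 1% low comes out at 20 fps, which is exactly the kind of tanking the average alone hides.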
Isn't Apex still running on the Source engine like Titanfall 2? If that's the case, chances are it's using the same configuration, and the texture setting is a cache value defined in megabytes where ultra is 8192; lowering it a bit doesn't really do that much. 🙂 (It just keeps more textures in VRAM, so there's less swapping around.)

For this game there are a few additional settings in the config file, like the neutral lighting from the first game ("full bright", though not quite that drastic), and it's possible to disable a few things like temporal anti-aliasing, plus some minor tweaking, though a bit less than in the first game. Similar glitches too, such as upping reflection quality to ultra (3 in the config file) breaking several reflective surfaces, most notably on the character models.

Well, time to give this a more thorough read; always fun to see a performance comparison, and now there's D3D11 and D3D12 too. It's not just CPU-limited systems either, since even higher-end systems reportedly see a 10-15 percent gain. I think Turing and also Pascal handle async fairly well, so AMD probably sees some competition here, since the 1080 Ti and the upper-end NVIDIA cards outperform Vega, although the VII probably does alright; the 2080 and 2080 Ti should be faster overall.

No NVIDIA- or AMD-specific effects, and many things mentioned in the tech feature video also applied to the first game, but I think CPU usage has improved further, and probably a bigger focus on async compute as well. (I wonder if that's D3D11 and D3D12 or just D3D12; Crackdown 3 has a toggle for it in DX11, though I expect Division 2 to focus heavily on DX12. Some users are reporting crashes, so it might be sensitive to third-party software, overclocks, and of course the display driver itself.)

I might try it, but I like using ReShade, so a 10% perf hit isn't that bad of a compromise for going with D3D11, though I should probably compare them at some point. (RAM and CPU are hitting their limits though; Ryzen 3000 and DDR4 next perhaps, but eh, who knows. 😀 )

EDIT: With realistic visuals like this, I wonder what ray tracing can eventually accomplish once it's developed further and more common. That's probably beyond my lifetime, but I am curious to see the natural limits of screen-space effects and rasterization, and of what is approximated for global illumination, light, shadows and all that stuff; yeah, decades until that can be phased out in full. (And at that point it's all about streaming and renting and the power of some cloud somewhere. 😛 Well, perhaps not quite, but that's also been quite in focus recently, though still quite a ways off.) The game isn't bad looking, but it's going to be a balance of visual quality and performance, and of course feature parity between console and PC and the whole thing, so I'm kinda curious to see what happens over the next gen or two, although it will take time. Oh well, this isn't bad either. (AI? Physics? NO! Shiny visuals better. Ha ha! Well, maybe that too can catch up eventually. Maybe.)
JonasBeckman:

Isn't Apex still running on the Source engine like Titanfall 2? If that's the case, chances are it's using the same configuration, and the texture setting is a cache value defined in megabytes where ultra is 8192; lowering it a bit doesn't really do that much. 🙂 (It just keeps more textures in VRAM, so there's less swapping around.)
So that means less texture pop-in and stutter. That translates into a better experience even if you have a slower GPU, like an RX 580 compared to a 1660.
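[Editor's note: the texture-cache behavior discussed in the last two posts can be illustrated with a toy model. This is not the game's actual streaming code, just a generic LRU cache with a budget in megabytes, where every cache miss stands in for the pop-in or stutter of streaming a texture mid-frame.]

```python
from collections import OrderedDict

def count_misses(budget_mb, accesses, sizes_mb):
    """Replay a sequence of texture accesses through an LRU cache of
    budget_mb megabytes and count the misses (streaming hitches)."""
    cache = OrderedDict()  # texture id -> size, most recent at the end
    used = 0
    misses = 0
    for tex in accesses:
        if tex in cache:
            cache.move_to_end(tex)  # mark as recently used
            continue
        misses += 1
        size = sizes_mb[tex]
        # Evict least-recently-used textures until the new one fits.
        while cache and used + size > budget_mb:
            _, evicted_size = cache.popitem(last=False)
            used -= evicted_size
        cache[tex] = size
        used += size
    return misses
```

With ten 1 MB textures accessed in a repeating cycle, a 10 MB budget misses only on the first pass, while a 5 MB budget thrashes and misses on every single access — the same "choke and stutter" effect a smaller VRAM pool produces when the cache setting exceeds it.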