Sniper Elite 4 PC graphics performance benchmark review


Administrator
Correct, still working on the article and need to rewrite a thing or two. That is on the to-do list.
Looks like Nvidia just released their own "Game Ready" driver for this (and for For Honor and Halo Wars 2). I wonder if there are any performance gains, though from what I've read both AMD and Nvidia perform pretty well in this game. Multi-GPU support for DX12 is a bit lacking, but that seems to be pretty common, so perhaps SLI and CrossFire support will be patched in later on?

As far as driver performance gains go, I think AMD mentioned a 4-5% increase with their 17.2.1 driver over 17.1.1, so driver optimizations might not add much (unless something's completely broken), but it's still a gain, so it'll probably be similar for Nvidia. Game profiles are usually ready in advance, though smaller tweaks are probably still possible for some additional gains. πŸ™‚ The game isn't too demanding this time either: visually it's almost identical to Sniper Elite 3, but it now supports DX12 as a low-level API alternative to DX11, with a focus on performance (and the addition of async compute support).

EDIT: I wonder if there will be a For Honor test too, though that game focuses more on online play and also uses (enforces?) EAC, or Easy Anti-Cheat, so some utilities might not be completely functional, since it's a pretty restrictive anti-cheat solution with a stricter whitelist system. (Afterburner 4.3.0 and RTSS 6.5.0 / 6.5.1 should be OK, though, since those got whitelisted shortly after Watch_Dogs 2 was released.)

EDIT: Huh, so Sniper Elite 4 actually supports multi-GPU setups already. Different from what I'd heard, but that's really nice to see. πŸ™‚
Request
As per request, GeForce GTX 970 added.
Kindly add the R9 290X card as well. I'm still running the 290X Lightning and I'm very curious to know how DX12 + async performs on these older cards. Much appreciated!
Yes, indeed it does, and it's faster than DX11 SLI.
😱 OMG, that's really happening? :banana:
What about DX11 vs DX12?
Administrator
What about DX11 vs DX12?
Last page in our review deals with DX11/DX12/DX12+ASYNC.
Hilbert, would you mind throwing in a quick CPU test with an FX-series CPU as well to see if it's much behind Intel or if it's doing alright in this game? Hope it's not too much trouble.
Administrator
Article updated with some CPU scaling results. I added FCAT-based results with 2-, 4-, 6- and 8-core setups. You'll notice that the PCGH results differ from ours big-time. When I looked closer, I noticed PCGH is using 1280x720 with low image-quality settings to demonstrate CPU differences. I measure the way you guys play at home, thus proper image-quality settings and a 2560x1440 resolution. Neither methodology is erroneous here, but I tend to test the way you guys play games on your PC, and 720p does not fit that pattern. Brisse: not today, mate. Perhaps I'll have a look with an FX tomorrow.
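For readers unfamiliar with FCAT-style measurement: it captures per-frame display times, and the CPU-scaling numbers above boil down to statistics over that frametime trace. A minimal illustrative sketch (not Guru3D's actual FCAT pipeline; the function name and sample trace are made up for the example):

```python
# Illustrative sketch of reducing a frametime trace (milliseconds per frame)
# to the two figures a scaling chart typically plots: average FPS and a
# worst-case percentile frametime.
def summarize_frametimes(frametimes_ms):
    """Return (average_fps, p99_frametime_ms) for a frametime trace."""
    if not frametimes_ms:
        raise ValueError("empty trace")
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    ordered = sorted(frametimes_ms)
    # 99th percentile by nearest rank: the frametime exceeded by ~1% of frames.
    idx = min(len(ordered) - 1, int(0.99 * len(ordered)))
    return avg_fps, ordered[idx]

# Hypothetical example: a steady ~60 fps trace with one 40 ms stutter.
fps, p99 = summarize_frametimes([16.7] * 99 + [40.0])
```

The point of pairing the percentile with the average is that a single long frame barely moves the average FPS but shows up immediately in the p99 figure, which is why frametime-based tools are preferred over plain FPS counters.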
Last page in our review deals with DX11/DX12/DX12+ASYNC.
Thanks, I rarely look at the last page. I think DX11 vs DX12 deserves its own page in the article; a lot of people are interested in it, especially when DX12 actually works.
Thanks for the review! It seems like an exemplary port.
Nice review! Will you be doing an update of the Nvidia GPU results with their latest Game Ready driver?
Very pleased that ports are getting better; it's about time. Hopefully they will release a better texture pack for the PC. I have looked at several reviews/videos and the textures need improving: they're OK, but not great.
Brisse: not today mate. Perhaps I'll have a look with FX tomorrow.
That's okay. I managed to find what I was looking for on a Russian hardware site. The FX series looked quite pathetic even against Intel's lower-end CPUs in both DX11 and DX12, but framerates were still perfectly smooth, so it's not like it's an issue. Interestingly, the FX-8350 paired with a GTX 1080 @ Ultra 1080p performed better in DX11 (avg. 99 fps) than in DX12 (avg. 91 fps). This is exactly the opposite of what I expected, as DX12 tends to come as a blessing for us FX owners. Not so in this case, it seems. http://gamegpu.com/action-/-fps-/-tps/sniper-elite-4-test-gpu
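To put a number on the gap quoted above (gamegpu's FX-8350 + GTX 1080, 1080p Ultra figures), a quick back-of-the-envelope calculation:

```python
# Relative difference between the two averages quoted above:
# DX11 averaged 99 fps, DX12 averaged 91 fps on the same hardware.
dx11_fps, dx12_fps = 99, 91
dx11_advantage_pct = (dx11_fps - dx12_fps) / dx12_fps * 100  # roughly 8.8%
```

So DX11 came out just under 9% ahead on that CPU, which is the wrong direction entirely if DX12's lower driver overhead were helping the FX's weak per-core performance.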
I swear that Nvidia doesn't optimize drivers as well as they could once a card isn't in the newest series :3eyes:.
Article updated with some CPU scaling results. I added FCAT-based results with 2-, 4-, 6- and 8-core setups. You'll notice that the PCGH results differ from ours big-time. When I looked closer, I noticed PCGH is using 1280x720 with low image-quality settings to demonstrate CPU differences. I measure the way you guys play at home, thus proper image-quality settings and a 2560x1440 resolution. Neither methodology is erroneous here, but I tend to test the way you guys play games on your PC, and 720p does not fit that pattern. Brisse: not today, mate. Perhaps I'll have a look with an FX tomorrow.
Thanks Hilbert. I facepalm at sites that do ultra-low-res CPU tests while leaving out today's common resolutions. This silly practice is a carry-over from 20 years ago that completely ignores how PC gaming has evolved. I know this is still a WIP, and I fully appreciate your inclusion of 1440p CPU tests, but you may face some criticism (especially outside of G3D) if it's the only resolution tested. After all, 1080p is still predominant, and CPU performance will likely show more variance with it. Just trying to help you avoid any criticism directed towards you/G3D on this πŸ™‚.
1440p:
GTX Titan X - 116
GTX 1080 - 90
GTX 1070 - 72
R9 Fury X - 71
GTX 980 Ti - 67
R9 Fury - 66
R9 Nano - 63

4K:
GTX Titan X - 72
GTX 1080 - 55
R9 Fury X - 49
R9 Fury - 45
R9 Nano - 42
GTX 980 Ti - 42
GTX 1070 - 40

How exactly do the cards with more than 4GB jump ahead? The GTX 1080 and GTX Titan X were already on top. Going from 1440p to 4K, the GTX 1070 (8GB) dropped below the R9 Nano (4GB), the R9 Fury (4GB) jumped ahead of the 980 Ti (6GB), and both the R9 Fury (4GB) and R9 Fury X (4GB) jumped ahead of the GTX 1070 (8GB). Not to mention the R9 Fury is 100 euros cheaper than the GTX 1070 (comparing the cheapest versions available: Sapphire Nitro Fury for 330 euros and KFA2 GTX 1070 for 429 euros). That conclusion makes no sense at all...
R9 Fury OC (1050 MHz) for 259 euros: https://www.grosbill.com/4-sapphire_nitro_radeon_r9_fury_4g_hbm-671664-informatique-_carte_graphique?utm_source=idealo&utm_medium=comparateurs&utm_term=cpc&utm_campaign=671664-carte%20graphique&ectrans=1#siteidealo here πŸ™‚
Nice review HH. πŸ™‚ I'm quite pleased to see DX12 looking good, and a well-optimized, smooth game at launch for both Team Red and Team Green. Since this is a series I quite enjoy, and they seem to have done an exemplary job right from launch (so unusual these days), I am very tempted to buy it at full price (instead of on sale next Xmas) just to support their good work.
Apologies if I have missed this being discussed, but I was wondering why the 780 Ti has not been included in the recent benchmarks. It's still a powerful card, and I'm just curious why it is left out. If it won't be included going forward, this site is less useful for me. πŸ™