Watch Dogs 2: PC graphics performance benchmark review


Nice to see more performance comparisons, dialogues can be a bit quirky (I'm a bit old now though compared to the general target demographic or however it's called. :P ) but the game itself isn't too bad so far. 😀 EDIT: If you meant temporal filtering with this:
We advise you to leave temporal AA disabled, as it is pretty bad.
Then, from the recent GeForce.com guide -> http://www.geforce.com/whats-new/guides/watch-dogs-2-graphics-and-performance-guide

It actually works by scaling down your resolution in return for a pretty significant performance boost. For anti-aliasing the game offers either FXAA or SMAA for post-process AA (or you can inject SMAA and/or other effects via ReShade), and you can also use MSAA if you have some serious GPU power, since it's quite demanding. Finally, for Nvidia GPUs there's also TXAA support (version 3.0) for some temporal stability; going by the guide linked above, this version is now far less blurry, which should please people who previously criticized exactly that about the technique. 🙂 (Going by the comparison image it's actually really good and almost equal to standard MSAA now in terms of sharpness.)

I'll just borrow this image from the guide itself since it explains it better and faster than me copy-pasting the info, heh: http://images.nvidia.com/geforce-com/international/images/watch-dogs-2/watch-dogs-2-temporal-filtering-performance-640px.png (There are interactive comparisons as well, but flickering and aliasing are better seen in motion via video or directly in-game to get a full feel for how the effect lowers image quality in return for a significant performance boost. In this case it makes 2560x1440 fully playable, at least on a 60 Hz screen, and 4K is kinda playable too if you can tolerate 30 FPS.)

MSAA and the volumetric fog, when active, have a significant performance impact. PCSS+, HBAO+ and the Nvidia GPU exclusive HFTS shadow filter also affect performance: HBAO+ improves ambient occlusion noticeably, whereas PCSS+ softens the shadows, which compared to default ultra can even look a bit low-res; HFTS improves resolution (and accuracy) over PCSS+ but costs a bit more to enable. (Guessing TXAA is about on par with MSAA in terms of performance cost.)

EDIT: Also, the GeForce.com guide isn't much of a GPU performance comparison as such, since they usually only test a few GPU models or just one; it's more for understanding the various visual options the developers (and Nvidia) implement into the game. 🙂 (All testing is done with a 1080 in this particular case, which I think is also the recommended GPU for ultra settings, with a 1060 for high settings and a 1070 for a mix of them, or some such?)
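For a rough sense of why dropping the internal render resolution buys so much performance, here's a minimal back-of-the-envelope sketch (Python, purely illustrative and not taken from the guide) of shaded-pixel counts at the resolutions discussed above. Since the thread doesn't pin down whether "half resolution" means half per axis or half the total pixel count, both readings are shown; real frame cost also isn't strictly proportional to pixel count, so treat this as ballpark only.

```python
# Illustrative arithmetic only: shaded-pixel counts at the resolutions discussed
# in this thread. Actual frame times don't scale linearly with pixel count
# (geometry, shadows and CPU work are mostly resolution-independent).

RESOLUTIONS = {
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),
}

for name, (w, h) in RESOLUTIONS.items():
    full = w * h                          # pixels shaded at native resolution
    half_per_axis = (w // 2) * (h // 2)   # "half resolution" per axis = 1/4 the pixels
    half_total = full // 2                # alternative reading: half the total pixels
    print(f"{name}: native {full / 1e6:.2f} MP, "
          f"half per axis {half_per_axis / 1e6:.2f} MP, "
          f"half total {half_total / 1e6:.2f} MP")
```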
Administrator
Temporal filtering is disabled and it is recommended to leave it at that. All TF does is render the game @ half resolution, then upscale it to your configured one.
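As a toy illustration of the "render at half resolution, then upscale" idea described above, here's a minimal Python/NumPy sketch using the crudest possible nearest-neighbour upscale. The game's actual temporal filter is far more sophisticated (presumably reusing data from previous frames, hence the name), so this only shows the basic shape of the trade-off, not how the game does it.

```python
import numpy as np

# Toy illustration: treat "half_res" as a half-resolution render target
# (height x width x RGB) and blow it up 2x on each axis with nearest-neighbour
# sampling. The basic trade is the same as in the game: shade fewer pixels,
# then fill the native-resolution output from them.
half_res = np.random.rand(720, 1280, 3)                 # stand-in for a 1280x720 render
native = half_res.repeat(2, axis=0).repeat(2, axis=1)   # upscaled to 2560x1440
print(half_res.shape, "->", native.shape)
```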
I'm not surprised. It's yet another piss-poor port coming from ubilol+nvidia collaboration.
I'm not surprised. It's yet another piss-poor port coming from ubilol+nvidia collaboration.
The Division and Siege both run fine and both are ubisoft/nvidia. Stop posting garbage.
Temporal filtering is disabled and it is recommended to leave it at that. All TF does is render the game @ half resolution, then upscale it to your configured one.
Yep, saw that the option for it was set to off in the settings menu now. 🙂

Also already mentioned at the end of the article, but yeah, texture resolution can also be set to Ultra, up from High, if the high-res pack is added. The difference is fairly small (it's also an extra 6 GB download), and all that's really going to accomplish is running 4 GB VRAM cards, and lower, into the ground at higher settings, particularly at "4K", which is what the pack is recommended for; going by the VRAM monitor in the settings menu it's pretty easy to exceed the 4 GB threshold already. Upcoming GPU models will probably ship with 8+ GB though, removing that bottleneck for a little while at least. (That's the only real downside of this Fury GPU, for example, so I'm looking forward to seeing what AMD and Nvidia announce in 2017; this entire system could do with an upgrade once there's something substantial enough available, heh. Here's hoping Zen can bring some competition against the Intel CPU models, should be fun.)
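Since the post above is about the HD texture pack running 4 GB cards into the ground at 4K, here's a minimal, purely hypothetical VRAM tally (Python; every figure in it is an assumption made up for illustration, not a measurement from Watch Dogs 2) showing how an HD texture pool plus full-resolution render targets can push past the 4 GB mark:

```python
# Rough, hypothetical VRAM arithmetic -- the buffer count, texture pool and misc
# figures below are assumptions for the example, not numbers from the game.

width, height = 3840, 2160
bytes_per_pixel = 4                       # e.g. an RGBA8 render target

render_targets = 6                        # assumed: back buffer, depth, a few G-buffer/post targets
rt_gb = width * height * bytes_per_pixel * render_targets / 1024**3

texture_pool_gb = 3.5                     # assumed streaming budget with an HD texture pack
misc_gb = 0.5                             # assumed geometry, shadow maps, driver overhead

total_gb = rt_gb + texture_pool_gb + misc_gb
print(f"render targets ~{rt_gb:.2f} GB + textures ~{texture_pool_gb} GB "
      f"+ misc ~{misc_gb} GB = ~{total_gb:.2f} GB (vs. a 4 GB card)")
```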
Administrator
I'm not surprised. It's yet another piss-poor port coming from ubilol+nvidia collaboration.
The game itself is not bad at all really, I certainly enjoyed playing the first half hour. And hey ultra quality is not a mandatory thing. I will also add some results with different quality modes later on today.
Another unoptimized s..t in 2016. Nice... And these graphics... GTA SA.
It looks like a good port from what has been written about it. Good amount of configuration as well. Hopefully the SLI issue is resolved though.
Why would you use the GTX 1060 for the VRAM calculations instead of the Titan X Pascal? Interesting to see the Titan X Pascal and GTX 1060 using the same amount of VRAM @ 4K.
The game itself is not bad at all really, I certainly enjoyed playing the first half hour. And hey ultra quality is not a mandatory thing. I will also add some results with different quality modes later on today.
Contrary to what some are saying, I figured this game wasn't that poorly optimized. Sure, it definitely has room for improvement even at 1080p, but in my experience you can turn down a couple of notches on something like real-time reflections, AA, or shadows. Those can chop off as much as 40% of your performance, and in many cases there isn't much of a visible difference when turning them down a little bit. I figure that with a few patches to both the game and drivers, older hardware won't struggle as much at ultra settings. Definitely interesting to me how the FX-8520 kept up so well.
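To put the "as much as 40% of your performance" figure above into concrete frame-rate and frame-time terms, here's a tiny illustrative calculation (Python; the base frame rates are made-up examples, not benchmark results from the review):

```python
# Illustrative only: what losing 40% of your frame rate to a single heavy
# setting looks like in FPS and in milliseconds per frame.

def fps_with_setting(base_fps: float, cost_fraction: float) -> float:
    """Frame rate left over if enabling a setting removes `cost_fraction` of it."""
    return base_fps * (1.0 - cost_fraction)

for base in (60.0, 45.0, 30.0):
    heavy = fps_with_setting(base, 0.40)
    print(f"{base:.0f} fps -> {heavy:.0f} fps with the setting on "
          f"({1000 / base:.1f} ms -> {1000 / heavy:.1f} ms per frame)")
```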
We need Vega and GTX 1080 Ti! Only GPU worth anything right now is the Titan X Pascal, but few of us would spend that kind of money on a GPU.
We need Vega and GTX 1080 Ti! Only GPU worth anything right now is the Titan X Pascal, but few of us would spend that kind of money on a GPU.
Vega, definitely yes. 1080 Ti, I'm not so sure. In most cases the GTX 1080 gives a healthy level of performance at 2K resolutions, but (from what I recall) struggles with 4K, and having 2x 1070s ends up being better value. In order for a 1080 Ti to be a worthwhile product, it needs to be able to play modern games at 60 FPS @ 4K. It would have to be noticeably faster than the Titans, which makes pricing complicated, and I don't see any of that happening. Personally, I think Nvidia should make a GTX 1090. They haven't done a dual-GPU card in a long time, and making one would likely put them in the 4K performance bracket. Normally I think dual-GPU cards are stupid, but this is one of the rare instances where one would actually be useful.
The Division and Siege both run fine and both are ubisoft/nvidia. Stop posting garbage.
1) The exception that proves the rule. 2) I will post whatever I like. Thank you.
1) The exception that proves the rule. 2) I will post whatever I like. Thank you.
There are like dozens of exceptions that are conveniently ignored to fit the narrative. It doesn't stop with Ubisoft, nor does it stop with Nvidia. It's almost like different games, studios, etc. have different levels of talent and develop to different degrees of optimization. Who would have guessed that's the case, when almost every industry in existence behaves the same way. That's not to mention that this game in particular, according to Steam reviews, Reddit comments and Hilbert, is well optimized. There are like a couple of settings that impact performance for nearly zero difference in fidelity - which is becoming an increasingly common trend in modern games.
LOL, I fired it up this morning and the game crashed pretty much RIGHT after the end of that vid clip, when I went to do my first hack... Back to MGS V I went...
Temporal filtering is disabled and it is recommended to leave it at that. All TF does is render the game @ half resolution, then upscale it to your configured one.
Why is that even an option :3eyes:?
Vega, definitely yes. 1080 Ti, I'm not so sure. In most cases the GTX 1080 gives a healthy level of performance at 2K resolutions, but (from what I recall) struggles with 4K, and having 2x 1070s ends up being better value. In order for a 1080 Ti to be a worthwhile product, it needs to be able to play modern games at 60 FPS @ 4K. It would have to be noticeably faster than the Titans, which makes pricing complicated, and I don't see any of that happening. Personally, I think Nvidia should make a GTX 1090. They haven't done a dual-GPU card in a long time, and making one would likely put them in the 4K performance bracket. Normally I think dual-GPU cards are stupid, but this is one of the rare instances where one would actually be useful.
I don't think they're going to release a dual card again. The 690 was actually a pretty good value and held its own for some time. The Titan line has been all about dumping it in favor of the next and the next and maximizing profit all the way.
Chill out, it's a Ubisoft game, it needs a couple of patches and a Steam sale discount.
I don't think they're going to release a dual card again. The 690 was actually a pretty good value and held its own for some time. The Titan line has been all about dumping it in favor of the next and the next and maximizing profit all the way.
You may be right, but it's a bit confusing how they have repeatedly left the #90 slot blank for so many generations. As you said, the Titans pretty much fit that slot, but that's already backfiring since people are getting confused about which Titan belongs to which generation. It just gets me thinking that Nvidia is leaving the #90s blank "just in case we make a dual-GPU card again". Nvidia no longer makes IGPs, both AMD and Intel have IGPs that would outperform any low-end Nvidia products, and Nvidia seems to be getting a little carried away with the "Ti" suffix. In terms of their numbering scheme, their product lineup is looking pretty small: for this generation, they only go from 1050 to 1080. Replace Titan with #90, spread out the numbers, drop the Ti suffix, and Nvidia would have a pretty clean-looking lineup.