Watch Dogs 2: PC graphics performance benchmark review

And DX12/Vulkan support...
+1 This :thumbup:
Making the game a showcase for DirectX 12 was the original plan back when AMD had the sponsorship (or whatever it's called) for the game, but somewhere between last year and now Nvidia took over, and it's a GameWorks showcase instead. It probably looks better now than what AMD would have wanted to add to the game, though. (It's up to the developers to make it a good port either way; whether AMD or Nvidia is backing certain effects or a specific API doesn't necessarily change that.) (Except for multi-GPU support, it would seem. I'm sure there's more to it than that, but NDAs and all that.) EDIT: It's not always about money either, as far as I see it, even if money has some influence. Say AMD had wanted the game to be DX12-exclusive: with the current slow adoption of Windows 10, and the recent builds of it required for DX12 to function properly, that would have hurt sales for Ubisoft. That's just an example, though, not fact. (Money probably does play a part in some publishers being bat-**** crazy enough to set their PC releases up as timed exclusives on the Win10 Store. Microsoft hasn't handled that store well at all, and it must be hurting sales like hell, plus fewer customers are willing to pay full price X months later anyway, even on a more popular platform like Steam.)
DudeRandom benchmarked Watch Dogs 2 with 2x MSAA. I think we get misleading results if MSAA is used for AA in a performance test. MSAA is extremely taxing, with very little or no benefit over post-process anti-aliasing methods like SMAA, which cost just a few frames per second. Using MSAA gives the wrong impression of a game's performance and is better left disabled in any graphics performance benchmark.
I don't think 2x MSAA is that demanding; the temporal filtering setting uses it by default in this game. I attribute the gap to one test being made in a closed mission area and the other in the open world. P.S. It would be crazy if you couldn't play ports with at least 2x MSAA on a 1080 at 1080p. I remember forcing 2x MSAA in lots of deferred-engine games with my 9800 GT, GTX 260 and then my 660 with no problem, at 60 fps.
MSAA is insanely demanding on modern engines, actually, and you always get much better results with temporal AA plus a sharpening filter. Those older engines didn't have anywhere near the amount of shading that modern engines do.
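(For reference, the "sharpening filter" usually paired with temporal AA is just a small convolution pass that restores edge contrast. A minimal illustrative sketch in Python/NumPy, not any particular game's implementation:)

```python
import numpy as np

# Classic 3x3 sharpening kernel: boost the centre pixel and subtract the
# four direct neighbours, restoring edge contrast lost to temporal AA blur.
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def sharpen(img: np.ndarray) -> np.ndarray:
    """Apply the kernel to a single-channel float image in [0, 1]."""
    padded = np.pad(img, 1, mode="edge")  # replicate borders
    out = np.zeros_like(img)
    h, w = img.shape
    for dy in range(3):
        for dx in range(3):
            out += SHARPEN[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return np.clip(out, 0.0, 1.0)
```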
Check out Nvidia's Watch Dogs 2 Graphics and Performance Guide: disabling 2x MSAA improves performance by about 38%. http://www.geforce.com/whats-new/guides/watch-dogs-2-graphics-and-performance-guide That said, I agree with you that this test should be done in one of the busy streets of the city. I always watch DudeRandom's videos because his tests cover many different scenarios in games.
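(To put that +38% into frame-time terms, here's a quick back-of-the-envelope calculation; the 60 fps baseline is just an assumed example, not a measured figure:)

```python
# Rough arithmetic for the quoted "+38%" gain from disabling 2x MSAA.
# The 60 fps baseline is an assumed example, not a measured number.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent per frame at a given frame rate."""
    return 1000.0 / fps

base_fps = 60.0                # hypothetical fps with 2x MSAA enabled
no_msaa_fps = base_fps * 1.38  # +38% from turning 2x MSAA off

print(f"with 2x MSAA: {base_fps:.1f} fps = {frame_time_ms(base_fps):.2f} ms/frame")
print(f"without MSAA: {no_msaa_fps:.1f} fps = {frame_time_ms(no_msaa_fps):.2f} ms/frame")
print(f"per-frame cost of 2x MSAA: "
      f"{frame_time_ms(base_fps) - frame_time_ms(no_msaa_fps):.2f} ms")
```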
Well, that MSAA performance is bad, but there still seems to be more than twice the performance gap between a closed mission area and the open world; I don't know why Guru3D tested it like this. And engine devs, Nvidia and AMD need to work together to bring proper anti-aliasing back. The Forza Horizon 3 devs have proven it's doable using those AMD DX11+ techniques (on the console side); post-process AA and sharpening filters are lame.
The game runs well for me. I use temporal filtering, and I'm not sure why people say not to: at 1440p I see no difference in quality, yet I gain 20 fps. Ubisoft has done a pretty good job for once, and the game itself is so much better than the first Watch Dogs.
Why do you guys use temporal filtering and say such nice things about it? IT SUCKS, BADLY. It's a gimmick, a ***** and fully disrespectful move towards PC gamers, and that's all there is to it. Developers were smiling when they added it in-game, and after seeing such approval from users they LAUGH! "Hey, let's halve the game's resolution and put a filter on it to blur it a bit so it doesn't hurt their eyes too badly, and we'll be golden!" It's a fake excuse for developers to "give" gamers "better" performance while LOSING SO MUCH IQ... and people say it's great... Jesus, the game industry is so lame, and it feels like there's nothing we can do except accept ****ty techniques like this one. Wake up, guys. For example you, Hughesy: you really don't see any difference in image quality?? Take a better look, unless you don't mind playing a game at 720p (which is what 1440p + TF amounts to) today. Clarity and a crisp, anti-aliased image are long gone... oh boy. But please, people, don't get so excited or satisfied with lame methods. For my taste aliasing is not acceptable, hence all this angry posting. You can't even compare double resolution + TF against native resolution with TF off. Go test it and compare performance and image quality with your own eyes to see which one is better.
As per Party Poison: http://images.nvidia.com/geforce-com/international/comparisons/tom-clancys-rainbow-six-siege/tom-clancys-rainbow-six-siege-anti-aliasing-interactive-comparison-001-off-vs-temporal-filtering.html While it definitely has an impact on AO, the rest of the image looks as good as the original, so I'm not sure why you'd be against it. Sure, if you want better image quality, turn it off. But as an option for people with midrange or lower-end cards, it does a good job of increasing performance without sacrificing that much quality. Adding a PC-only option for people with lower-end cards is definitely not a "fully disrespectful move for PC gamers"; it's the opposite.
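(For anyone curious what "temporal filtering" actually does under the hood: conceptually, part of the frame is rendered at reduced resolution and blended with reprojected data from previous frames. A toy sketch of that accumulation idea in Python/NumPy, not Ubisoft's actual implementation; real versions also use motion vectors and history clamping:)

```python
import numpy as np

BLEND = 0.1  # weight of the new frame; the history buffer keeps the rest

def temporal_filter(half_res_frame: np.ndarray, history: np.ndarray) -> np.ndarray:
    # Nearest-neighbour upscale of the half-resolution render (2x per axis).
    upscaled = half_res_frame.repeat(2, axis=0).repeat(2, axis=1)
    # An exponential moving average over time smooths aliasing and noise,
    # at the cost of some sharpness (hence the sharpening pass afterwards).
    return BLEND * upscaled + (1.0 - BLEND) * history

# Usage: feed successive frames through, carrying `history` forward.
history = np.zeros((4, 4, 3))
for _ in range(10):
    frame = np.random.rand(2, 2, 3)  # stand-in for a half-resolution render
    history = temporal_filter(frame, history)
```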
Exactly. How does an option that sacrifices image quality for performance "suck"? Is every setting that can be scaled down an insult to PC gamers? Such nonsense, and campaigning for fewer options is a new level of weirdness. I didn't like it myself when I played the RS beta, but it's a great addition to have.
Wow, you need to calm down. If it made the game look really bad then I wouldn't use it, but it makes no difference in this game at all. Go look at comparisons if you don't believe me. Christ, when did the PC community become such entitled whinge bags? Nobody forces you to turn it on; playing on PC is all about choices.
Didn't you hear? It's the new "Master Race", where nothing is optimized and you have to spend $1,000 per year on your computer!
Yeah, they're just a bit annoying, and they do nothing but perpetuate the myth that it's essential to spend that sort of money. As per the above, they're already turning on anyone with lower-spec hardware, and they'll no doubt be the new people to hate once both new consoles have been "upgraded".
The problem I see with this temporal trick is that it will make devs even lazier... there shouldn't be a need for it outside of consoles and very old cards (which are similar to consoles). You see, cards like the 1070 or 1080 should be able to run this game with supersampling, not the other way around.
I don't even understand this logic. Adding an option for people is making devs lazier? Hell, I don't even understand why people are upset about the performance at the maximum possible settings. If Crysis came out today, I feel like half of Guru3D would have a meltdown. In fact, even when it did come out, despite being graphically years ahead of its time, people still cried about not being able to run it on Ultra at 60 fps. "I want the best possible graphics, with no limitations, but I want it to somehow magically scale perfectly across a range of devices, and any setting aiding in that happening is lazy." OK.
Having PC gamers rely on this trick despite using cards several times stronger than the consoles' is the laziness; it's not the setting per se.
The only problem I see is letting people like you voice their opinions without being sure we drown them out. Go here: http://www.geforce.com/whats-new/guides/tom-clancys-rainbow-six-siege-graphics-and-performance-guide Scroll down to the AA section and look at the permutations of AA available. The number of options is insane. Image quality, and the options related to it, are not lacking. Stop whining; no one is reaching over to participate in your circlejerk on this one.
As for "I want the best possible graphics, with no limitations, but I want it to somehow magically scale perfectly across a range of devices, and any setting aiding in that happening is lazy": it's even more unattainable than that. When you sit and internalize that this guy is bitching about a great game that looks great and runs well, with a plethora of options, because it has an option he doesn't need, it's asinine. He should be completely ignored, but unfortunately we've been letting voices like this air these inane opinions for a little too long. The truth of the matter is that these people are fanatical cynics, and unfortunately their loud, useless voices dictate a little too much in the gaming industry.
Who says people with a 1070/1080 are having to "rely" on temporal AA? It is a sad state of affairs nowadays, and it's true that a game like Crysis 1 would be slaughtered today, even more than it was 10 years ago. Not to sound like a grumpy old man, but I suppose that's the new generation for you.
I'm not saying I'm against having more options so users can choose their own performance/IQ trade-off, BUT many of you are missing the point of my post. I've read everywhere (and I don't bother with Nvidia's advertising or give it much credit; they have to sell stuff, and business is all about money) that "this temporal filtering trick is amazing", "I get so much extra performance and I don't lose quality, it's the same, maybe even BETTER" (Jesus...), "hey, using this I get a stable 30 fps at 4K, I'm so happy", and I really wonder how the new generation's brain works. Yes, it's just a ****ty option that is so "good" for consoles, for developers that ARE LAZY about optimizing their software, and for developers that, due to deadlines (business and money in the middle again), have to release no matter what the performance is like. And last and worst: many gamers are happy with such tricks, which leads developers to use more "cheap" techniques at the expense of quality, since they see their customers "take the pill" because they don't care much about quality. (For example, a console like the PS4 "Pro" is another gimmick that so many ignorant customers believe is BETTER just because it can render "4K"!?!?) Bottom line (hopefully I'll be proven wrong about this, but only time will tell): as long as we PC gamers accept losing quality (thanks to a lack of the hard and obviously time-consuming optimization work on games by their developers), these techniques will be taken for granted, and a day will come when games are developed and released with temporal filtering only, and it will just be the "regular" way of gaming. It's all about the money these days: "make it fast and cheap, release it, and get the money". ASK for better optimization, DEMAND better support for your hardware, DON'T be PC-industry fashion victims. Is it so hard to just think about it?