Assassin's Creed: Syndicate - NVIDIA GameWorks Video

Oh well, I'm looking forward to it, and I think it looks stunning, as did Unity. I'm not a huge AC fan, but despite that I did like Unity and its setting, and this looks even better.
It looks like they disabled all ambient occlusion just to show off the goodness of their HBAO+. XD
Well, I for one loved AC: Black Flag, and having watched parts of a playthrough, this one seems good in my opinion. As for all the complaining: if you don't like what you see, play something else or wait for some performance reviews. :P
Quote: "Well, I for one loved AC: Black Flag [...] wait for some performance reviews."
I still have to finish Black Flag, but I'm playing too much other stuff at the moment. 🙁 I hope I can run PCSS, though probably not at 60 fps; in Unity I got around 40-50 fps with it on.
Quote: "I still have to finish Black Flag [...] in Unity I got around 40-50 fps with it on."
Yes, that's the problem here too: too many games to choose from. I have about 30 installed, but I've yet to finish around 10 of those. 🙂
NVIDIA should release a new ambient occlusion technique called LMAO. I'm sick of hearing that TXAA is superior; I always stick with MSAA or SMAA, since TXAA slightly blurs the textures. In AC4, Ultra shadows looked better than PCSS. NVIDIA's God Rays in FC4 turn off volumetric fog, so you have to choose between volumetric fog and crappy God Rays. I also notice that every game with GameWorks is a crappy port; even AMD's GameWorks equivalent is better than this. HBAO+ is the only feature that's acceptable to me. The simulated fur in FC4 was really buggy.
Quote: "Wait a minute, is that really in the code of WD?!"
Yes.
And where is the tessellation that was introduced with Unity and never saw the light of day? Now they don't even show it off, probably because adding it to the broken game would turn it into a screenshot fest, so let's just keep it barely playable without it.
C'mon, ATI owners, let us on team green have a few games too. Last year was all about ATI games. And it's a well-known fact that Ubisoft games are crap; that's not caused by GameWorks but by Ubisoft themselves, as they simply don't care much about PC. A pity it's becoming a trend with supposedly AAA titles... bloody consoles and greedy studios unwilling to spend money on PC porting.
There aren't many AMD-specific games, and excluding BF3 they all ran pretty well on both AMD and nVidia hardware anyway, with far fewer issues. GameWorks is a different kettle of fish: it runs like crap on anything but the current highest-end flagships.
Quote: "Not surprised either, this forum is becoming a GameWorks moan-fest."
Of course people will; they paid good money for a game that's nothing more than a lag-fest even when far exceeding the recommended specs. Recommended specs are just that, but when they slap GameWorks in there, and nVidia's 700 series and older, along with AMD owners, get crippled performance (heck, they don't even label these as nVidia-specific features anymore), then yes, people have the right to be pissed off, because Ubisoft took money from nVidia to promote its newest hardware and gave their own customers a big FU slap. They have done nothing but screw their customers over for a quick cash injection from nVidia. Remember, nVidia won't let AMD see the source code or how it runs, making it difficult for AMD to optimise their paths for it; heck, even the developers can't properly optimise it, because that's part of the licence agreement. So yes, it will be a moan-fest. I have had a couple of games with nVidia ****doesn'tworks, and I can say I will NEVER buy another game with that on it. NEVER! Every game with GameWorks has been a complete disaster, bugged to hell and unoptimised to hell for both nVidia and AMD owners; it normally takes a few patches before they can even be considered remotely playable.
Quote: "There aren't many AMD-specific games [...] it normally takes a few patches before they can even be considered remotely playable."
The whole codepath thing is a red herring. The bottom line is that nearly every GameWorks effect uses tessellation as its primary mechanism, and AMD's best card (Fury X) has the tessellation performance of a GTX 780. Even if AMD had full access to the source code, Nvidia's engineers, and everything else, its performance would still suffer.

You can make the argument that maybe Nvidia shouldn't be implementing these effects with tessellation. But honestly, after reading a lot about how they implement God Rays and HairWorks, their reasoning makes sense. HairWorks has a performance impact, yeah, but so did TressFX. Yet HairWorks is enabled across a range of characters and creatures, while TressFX was enabled on only one. Also, due to the nature of its vertex implementation, TressFX makes the initial setup for a character extremely difficult: artists need to spend far more time per character than they do with HairWorks, which just requires tweaking a bunch of parameters.

God Rays is the same story. Go download CryEngine 2/3 or read about the volumetric light shafts in those engines. In CE2 it required a ton of scene-setup time and could only really be used in specific lighting circumstances. In CE3 it uses the dynamic GI, but the performance hit is extremely high. Nvidia's God Rays renders a grid in the sky, dynamically tessellates the grid based on the shadow map, and extrudes the vertices. It's extremely easy to integrate into games/scenes, and the performance scales dynamically with the complexity of the geometry.

Anyway, instead of whining, AMD should just improve its geometry performance. Similarly, Nvidia should improve its shader performance. Battlefront, for example, an AMD-sponsored title, runs way better on AMD hardware than on Nvidia's, because the game is heavily shader-based, which is where Nvidia's architecture is a little slower.
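The shadow-map-driven tessellation idea described above is simple enough to sketch. Below is a minimal, hypothetical CPU-side illustration (not NVIDIA's actual code; the tile-contrast heuristic and all names are assumptions) of how a per-tile tessellation factor could be driven by the shadow map, so triangle count concentrates where the light-shaft edges are:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Sketch of the idea: tessellate a sky grid finely where the shadow map has
// high contrast (the edges of the light shafts) and coarsely where it is
// uniform. Illustrative only; this is not NVIDIA's implementation.
float tileTessFactor(const std::vector<float>& shadowMap, int mapSize,
                     int tileX, int tileY, int tileSize, float maxFactor) {
    float lo = 1.0f, hi = 0.0f;
    for (int y = tileY * tileSize; y < (tileY + 1) * tileSize; ++y)
        for (int x = tileX * tileSize; x < (tileX + 1) * tileSize; ++x) {
            float d = shadowMap[y * mapSize + x];  // 0 = shadowed, 1 = lit
            lo = std::min(lo, d);
            hi = std::max(hi, d);
        }
    // Uniform tile -> factor near 1 (cheap); high-contrast tile -> maxFactor.
    return 1.0f + (hi - lo) * (maxFactor - 1.0f);
}

int main() {
    // 4x4 shadow map: left half lit, right half shadowed.
    std::vector<float> shadow = {1, 1, 0, 0,  1, 1, 0, 0,
                                 1, 1, 0, 0,  1, 1, 0, 0};
    std::printf("uniform tile:       %.1f\n", tileTessFactor(shadow, 4, 0, 0, 2, 64));
    std::printf("edge-crossing tile: %.1f\n", tileTessFactor(shadow, 4, 0, 0, 4, 64));
}
```

This is also why the cost "scales dynamically": a clear sky tessellates almost nothing, while a scene full of occluders drives the factors (and the vertex work) up.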
+1 to this ^... AMD fanboys keep on bashing GameWorks. Why not bash AMD? For the past decade, all you got from them was a Pantene simulator.
Quote: "Anyway, instead of whining, AMD should just improve its geometry performance [...] which is where Nvidia's architecture is a little slower."
Thankfully, AMD users can set a maximum tessellation level in the driver, which helps.
Quote: "[...] Anyway, instead of whining, AMD should just improve its geometry performance [...]"
Way better? Not so sure I'd call it way better.
Quote: "NVIDIA should release a new ambient occlusion technique called LMAO [...] The simulated fur in FC4 was really buggy."
TXAA looks by far the best when the scene is in motion; MSAA and SMAA are better for static shots.
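That motion-vs-stills trade-off falls out of how temporal AA generally works. TXAA's exact filter is proprietary, but temporal techniques resolve each pixel by blending the current frame with accumulated history, which calms crawling edges in motion at the cost of some blur. A minimal illustrative sketch of that accumulation:

```cpp
#include <cstdio>

// Generic temporal-AA accumulation (TXAA's actual filter is proprietary):
// each frame blends the new sample into a running history buffer.
// Smaller alpha = smoother edges in motion, but more blur -- exactly the
// trade-off described above.
float temporalResolve(float history, float current, float alpha) {
    return alpha * current + (1.0f - alpha) * history;
}

int main() {
    float history = 0.0f;  // pretend this pixel just flipped from 0 to 1
    for (int frame = 0; frame < 5; ++frame) {
        history = temporalResolve(history, 1.0f, 0.2f);
        std::printf("frame %d: %.3f\n", frame, history);  // converges to 1
    }
}
```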
Quote: "Way better? Not so sure I'd call it way better."
Maybe not at the Ti level, but at my resolution (QHD) an R9 290 ($230) performs the same as my 980 ($450).
Quote: "There aren't many AMD-specific games [...] it normally takes a few patches before they can even be considered remotely playable."
I never understood the moaning about a feature like GameWorks that can be turned off. AMD cards can use it, but Nvidia is better at tessellation than AMD, and there's a slider in the AMD driver that can cap tessellation for better performance.
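Conceptually, that driver slider amounts to clamping whatever tessellation factor the game requests; a tiny illustrative sketch (not AMD's driver code, all names invented):

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative only: the effect of a driver-level tessellation override is
// roughly a clamp on the factor the application asks for.
float effectiveTessFactor(float requested, float userCap) {
    return std::min(requested, userCap);
}

int main() {
    // A GameWorks effect requests 64x; the user caps tessellation at 16x.
    std::printf("%.0fx\n", effectiveTessFactor(64.0f, 16.0f));  // prints 16x
}
```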
Quote: "I never understood the moaning about a feature like GameWorks that can be turned off [...]"
Because you own a pair of Nvidia cards? Try a couple of AMD GPUs and disable GameWorks; maybe you will discover GameWorks is a little more than a couple of on/off features. The problem with GameWorks and AMD GPUs is not enabling or disabling some features. The problem is that game devs accept by contract that the added Nvidia feature code is closed to them (and to anybody else), and in many cases this piece of software taxes AMD performance EVEN with the GameWorks features turned off, because it's integrated into the game code and affects more than just the on/off features. E.g., the Witcher 3 case: Revisiting GameWorks: AMD, Nvidia tangle over optimizations to The Witcher 3: Wild Hunt
The GameWorks libraries are sometimes referred to as "middleware" because they act as a bridge between different aspects of the game engine. Instead of building a custom physics engine from scratch, a game developer can use a solution like PhysX or the Intel-owned Havok physics solution, thus saving both time and money compared with doing their own from-scratch implementation. There are dozens of middleware plugins and libraries; The Witcher 3, for example, uses the Umbra 3 and SpeedTree plugins as well as GameWorks. What sets GameWorks apart is that it's developed by Nvidia explicitly for Nvidia GPUs, as opposed to being a third-party program written by a software vendor. This has implications for how that source code can be used or shared, which we explore in detail in the next section.

Whether GameWorks impairs performance on competing hardware depends entirely on how you parse the meanings behind the statement. Nvidia would argue that incorporating a GameWorks library doesn't intrinsically impair competing hardware because HairWorks is an optional, user-controlled setting. Integrating a GameWorks library doesn't impact performance in the title as a whole; if HairWorks doesn't run properly on an AMD chip, turning it off fixes the problem (from Nvidia's perspective, at least).

AMD would argue that "impair" has a wider meaning than Nvidia has assigned it. As we discussed last year, one of the differences between the two firms is that AMD tends to rely more on source-code optimization than Nvidia does, which means closed-source libraries tend to hurt Team Red more than Team Green. Claiming that AMD could write its own libraries to accomplish essentially the same tasks ignores the fact that game developers adopt libraries to simplify their lives, and relatively few of them are willing to adopt two different solutions for doing the same thing. Practically speaking, AMD doesn't have the financial resources to attack Nvidia head-on in this fashion in any case. Thus, from AMD's perspective, GameWorks absolutely impairs its own ability to optimize code.
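To make the "closed library" point concrete, here is a hypothetical sketch of what integrating such middleware looks like from the developer's side. Every name below is invented for illustration; this is not a real GameWorks API. The developer (and any competing GPU vendor) sees only the header and a prebuilt binary, so tuning is limited to whatever parameters the vendor chose to expose:

```cpp
#include <cstdio>

// --- Hypothetical middleware header (all names invented; not a real API) ---
struct FxHairParams {
    int   strandsPerFollicle;  // the density knob the vendor exposes
    float tessellationLevel;   // the only tessellation control offered
};

// In a real integration this function lives inside a closed, prebuilt
// library; the stub below only stands in so this sketch compiles.
void fxRenderHair(const FxHairParams& params) {
    std::printf("rendering hair: %d strands, tess %.0fx (internals opaque)\n",
                params.strandsPerFollicle, params.tessellationLevel);
}

// --- Game-side code: tweak the exposed knobs, then call the black box ------
int main() {
    FxHairParams params{4, 32.0f};
    fxRenderHair(params);  // everything past this call is invisible to the
                           // developer and to the other GPU vendor alike
}
```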
Quote: "Maybe not at the Ti level, but at my resolution (QHD) an R9 290 ($230) performs the same as my 980 ($450)."
Ah, that's what you meant; yeah, I understand what you're talking about now. I already find it highly questionable that the 980 hardly appears in benchmarks anymore, especially anything testing the compute schedulers (AoS), but I haven't bothered to read up on it. Yeah, the Asus 980s haven't been the best purchase I've made so far. 😀
Quote: "Because you own a pair of Nvidia cards? Try a couple of AMD GPUs and disable GameWorks [...]"
If true, that sucks; at least the large majority of gamers don't have to worry about such issues.
Quote: "If true, that sucks; at least the large majority of gamers don't have to worry about such issues."
You are absolutely right about that! Nvidia has 82% of the dedicated GPU market.