Strange Brigade: PC graphics performance benchmark review


I have not heard of the game either, but it looks like something I would like to play. Thanks for the review.
Great optimization there! How's the game btw?
Maybe it's just me, but I thought DX12 and, to an extent, Vulkan were both able to leverage more cores and weren't just about those GHz. Less CPU overhead, or at least more evenly distributed CPU overhead, something like that. It would be more interesting to see these APIs tested on, say, a 1060, 1080, 570 and Vega 64 with an FX 8300, a Ryzen 5 and something like an i7 2600 versus your standard test bench.
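The lower-overhead claim is mostly about CPU-side command submission: DX11-style drivers serialize most draw-call recording on one thread, while DX12/Vulkan let the engine record command lists on several threads with cheaper per-call validation. Here's a crude toy model of why that helps a slower many-core CPU (plain Python, no graphics API involved, and every number below is made up for illustration):

```python
# Toy model: CPU milliseconds spent recording one frame's draw calls,
# single-threaded vs. spread across worker threads. Idealized: it
# ignores synchronization overhead and assumes perfect scaling.

def frame_cpu_ms(draw_calls, cost_us_per_call, threads=1):
    """CPU wall time (ms) to record one frame's commands."""
    total_us = draw_calls * cost_us_per_call
    return (total_us / threads) / 1000.0

# Hypothetical scene with 8,000 draw calls per frame:
dx11_like = frame_cpu_ms(8000, 4.0, threads=1)  # one render thread, fat driver
dx12_like = frame_cpu_ms(8000, 2.5, threads=6)  # thinner API, 6 record threads

print(f"DX11-style: {dx11_like:.1f} ms of CPU per frame")
print(f"DX12/Vulkan-style: {dx12_like:.1f} ms of CPU per frame")
```

Under those made-up numbers the single-threaded path needs 32 ms of CPU per frame (a hard 31 FPS cap even on an infinitely fast GPU), while the multi-threaded path needs about 3.3 ms, which is exactly the scenario where an FX 8300 or Ryzen 5 would benefit more than a high-clock quad core.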
IMO you are not praising Vulkan highly enough here. It is an open and portable standard, unlike D3D12, which is proprietary, locked-down crap. The fact that the developers implemented it alongside D3D12 and that it performs on par is awesome for those of us who don't use Windows. While this game has no Linux port as far as I know, it would still run awesomely on Linux using compatibility layers like Proton or Wine. With Vulkan, there is pretty much zero extra overhead, since the graphics API doesn't have to be translated by the compatibility layer. A shame it has Denuvo though, as its rootkit-like functionality often causes trouble for compatibility layers. Here's hoping they remove that crap ASAP. Doom 2016 is a great example of how awesome compatibility layers can be when using Vulkan: [youtube=adQNMESdgO4]
fatboyslimerr:

Maybe it's just me, but I thought DX12 and, to an extent, Vulkan were both able to leverage more cores and weren't just about those GHz. Less CPU overhead, or at least more evenly distributed CPU overhead, something like that. It would be more interesting to see these APIs tested on, say, a 1060, 1080, 570 and Vega 64 with an FX 8300, a Ryzen 5 and something like an i7 2600 versus your standard test bench.
You do not really need that much CPU power for games; these are not that complicated tasks (unlike, say, calculating proteins). You just need something to feed the graphics card's driver according to the performance it is capable of, and then some for the AI, sound and engine.
HOLY S***, look at the Fury X go at 4K. Even the 1080p and 1440p results are really good.
The frametime graphs read microseconds...
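(Frametime plots are normally labeled in milliseconds; if the axis really said microseconds, the values would be off by a factor of 1,000. A quick illustrative sketch, using made-up capture numbers rather than anything from the review, of converting a microsecond log and summarizing it:)

```python
# Convert raw frametimes (assumed logged in microseconds) to milliseconds,
# then derive average FPS and a simple 99th-percentile frametime.

def summarize(frametimes_us):
    """Return (average ms, average FPS, 99th-percentile ms)."""
    ms = sorted(t / 1000.0 for t in frametimes_us)
    avg_ms = sum(ms) / len(ms)
    # Nearest-rank style percentile pick; fine for a rough summary.
    p99_ms = ms[min(len(ms) - 1, int(round(0.99 * (len(ms) - 1))))]
    return avg_ms, 1000.0 / avg_ms, p99_ms

# Hypothetical capture: mostly ~16.7 ms frames with one 33.4 ms spike.
sample = [16700, 16600, 16800, 16700, 33400]
avg, fps, p99 = summarize(sample)
print(f"avg {avg:.2f} ms ({fps:.0f} FPS), 99th percentile {p99:.1f} ms")
```

The spike dominates the percentile figure, which is exactly why reviews plot frametimes instead of just average FPS.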
Agonist:

HOLY S***, look at the Fury X go at 4K. Even the 1080p and 1440p results are really good.
Good proof that 4 GiB is enough, despite seeing higher memory usage on cards with more than 4 GiB.
Raider0001:

You do not really need that much CPU power for games; these are not that complicated tasks (unlike, say, calculating proteins). You just need something to feed the graphics card's driver according to the performance it is capable of, and then some for the AI, sound and engine.
Deferred render contexts, for one thing. https://docs.nvidia.com/gameworks/content/gameworkslibrary/graphicssamples/d3d_samples/d3d11deferredcontextssample.htm For AMD I'm not too sure how that works at the moment; it doesn't seem to be immediately doable, but then again they do use multi-threaded rendering as well, so I have a lot more to learn on the subject. 🙂 https://github.com/GPUOpen-LibrariesAndSDKs/AGS_SDK/issues/20

I do know a lot of games create render threads, though: Final Fantasy XV ideally wants a 6-core processor, Assassin's Creed Origins creates a whopping 8 of them, and Monster Hunter World has a bug that makes it create 32 unless restrained, though that's not strictly just render threads from how I understand it. Multi-threaded rendering is also improved in, or even built into, DirectX 12 and Vulkan without extras or add-on bits. Used well, it should improve overall CPU usage, but the work is spread across more cores, and games like AC:O don't scale down very well, so it's 8 threads or nothing: quad cores see a bit of stuttering, while hexa cores do alright in most situations even though CPU usage is pretty high. (The game also spends a lot of resources on the WMI thread for some undetermined reason.)

I suppose technically you would still be correct: it's less about raw power (clock speed) and more about fully using the additional cores, logical or physical, on modern processors, though it doesn't hurt to have both, especially with modern PC ports being a bit wonky, and that's being generous for how some of them turned out, ha ha. (Although a few are also genuinely demanding and push hardware to its limits, adding various enhancements in the process.) Efficiency doesn't hurt either: depending on extensions and the type of instruction, modern processors can be quite a bit faster. Overall, a GPU limit is still the most common, but CPU and RAM certainly help too, sometimes surprisingly so. 😀
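The "8 threads or nothing" point can be sketched with a back-of-the-envelope scheduling model (plain Python; the per-thread workload and the AC:O-style thread count are illustrative assumptions, not measured values):

```python
import math

def ideal_frame_ms(work_ms_per_thread, threads, cores):
    """Idealized wall time when `threads` equal chunks of CPU work are
    scheduled onto `cores` cores (no sync, cache, or SMT effects)."""
    rounds = math.ceil(threads / cores)  # scheduling passes needed
    return rounds * work_ms_per_thread

# Hypothetical engine that always spawns 8 render threads of ~2 ms each:
print(ideal_frame_ms(2.0, 8, 4))  # quad core: two passes per frame
print(ideal_frame_ms(2.0, 8, 8))  # octa core: one pass per frame
```

Even in this idealized model, a fixed 8-thread workload needs two scheduling passes on a quad core but only one on an octa core, so the quad core pays double the CPU time per frame; in practice the extra context switching on the smaller chip is where the stutter comes from.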
Good to see it released. Runs great too, but the price is a bit steep atm; I'll wait a bit unless I get a good deal. And I see they jumped on that BS DLC season pass too 🙁 About threading: their engine was never really optimized for multicore, although things have changed since AvP 2010; Nazi Zombie Army was a lot better. From what I saw it's the same vibe now, nothing bad, but hiring the same voice actor for the story is a bit meh... it always reminds me of the NZA series.
Administrator
Guys, the frame time and pacing plots have been updated. I made a mistake and used the wrong data set (totally stupid), which propagated into the plots. This has now been fixed!
I have a feeling this is going to be the new Ashes of Benchularity! Great performance review, HH!
This game is awesome, I'm having a lot of fun playing it!
Agonist:

HOLY S***, look at the Fury X go at 4K. Even the 1080p and 1440p results are really good.
Well optimized game, and the Fury X works really, really well with async. Older cards like the R9 290X, and even, to a lesser degree, the R9 280X, should also post surprisingly good results and punch way above what would be expected, leaving their old competition behind.
I know it is an AMD-supported title, but look at how the AMD cards destroy their NV competitors: RX 580 vs GTX 1060, Vega 56 vs GTX 1070. The Vega 56 even tops the 1080.
Yeah, I believe AMD worked with Rebellion on their Asura graphics engine, and it seems to run really smoothly.
BReal85:

I know it is an AMD supported title, but look at how AMD cards destroy the NV competitors: RX 580-GTX 1060, Vega 56-GTX 1070. Vega 56 even tops the 1080.
It's pretty funny how much it matters. But naturally Nvidia still claims the absolute crown with their top cards. I reckon the RTX cards will be quite strong here as well.
Does Nvidia have async compute?
Lighting and shadows look so much better than in the overhyped RT Battlefield V...