Review: Total War WARHAMMER DirectX 12 PC GFX & CPU scaling performance

I don't even know what to think about these DX12 benchmark tests. For a start they all seem to favor AMD gfx cards, which means god only knows what Nvidia cards are really capable of. We have to remember that HH only uses default clocks on reference gfx cards, and I doubt anyone here even runs a reference card. What I do know is that my 980 G1 will outscore a reference card by as much as 20% in some tests/benchmark results; when HH does a test like this I usually get about an extra 10 fps on top of his reference numbers. He's getting 72 fps on a reference 980 @ 1080p, which for all we know is crippled by the AMD-leaning game engine. I'd get 80+ fps on my G1 980, which for me is more than enough to play an AMD DX12 game from 2016, so I'm actually very happy with that and hope to see some actual Nvidia-favored DX12 games in the near future.
I feel like a broken record. AMD does not have an inherent DX12 advantage. AMD cards tend to perform far better in DX12 compared to DX11 because of CPU overhead. In Ashes of the Singularity (and very probably this game as well) raw compute throughput is the major determining factor in game performance. It's no surprise a stock 390 outperforms a stock 980.
Stock vs stock, 390 vs 980: 5376 GFLOPS vs 5038 GFLOPS. This is the exact same situation as AotS.
390 OC vs 980 OC, 1200 MHz (I'm being nice today) vs 1500 MHz: 6144 vs 6144.
The AMD cards in this G3D benchmark test should actually be performing better:
Fury X vs 390X: the Fury X has 45% more shaders @ the same clock, yet it is only 25% faster at 4K.
Titan X vs 980: the TX has 50% more shaders @ a 4% lower clock (reference), and it performs 35.7% faster at 4K.
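For anyone who wants to sanity-check those figures, the theoretical FP32 throughput they come from is just shader count × 2 ops per clock (one FMA) × clock speed. A minimal sketch, assuming the reference shader counts and the clock speeds implied by the numbers quoted above (the 390 figure implies ~1050 MHz rather than the official 1000 MHz):

```cpp
#include <cstdio>

// Theoretical FP32 throughput: shaders * 2 ops/clock (FMA) * clock in GHz.
double gflops(int shaders, double clock_ghz) {
    return shaders * 2.0 * clock_ghz;
}

int main() {
    // Reference shader counts; clocks are the ones implied by the post above.
    std::printf("R9 390 stock  : %.0f GFLOPS\n", gflops(2560, 1.05)); // ~5376
    std::printf("GTX 980 stock : %.0f GFLOPS\n", gflops(2048, 1.23)); // ~5038
    std::printf("R9 390 @ 1200 : %.0f GFLOPS\n", gflops(2560, 1.20)); // 6144
    std::printf("GTX 980 @ 1500: %.0f GFLOPS\n", gflops(2048, 1.50)); // 6144
}
```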
Almost all the DX12 titles so far have been AMD-sponsored, but I'm not even sure how much that matters anymore. The two titles where NVIDIA does better are DX11 engines with DX12 patched on top.
Almost all the DX12 titles so far have been AMD-sponsored, but I'm not even sure how much that matters anymore. The two titles where NVIDIA does better are DX11 engines with DX12 patched on top.
I hope you don't mean Ashes, because NVIDIA hardware is outperforming AMD in it.
I hope you don't mean Ashes, because NVIDIA hardware is outperforming AMD in it.
No, it's not. Unless you compare 16nm NVIDIA to 28nm AMD, it does not. The 390X is practically as fast as the 980 Ti, and the 380X as fast as the 970. You say that AMD does not have an inherent DX12 advantage, which I agree with. The fact is, though, that NVIDIA is pricing their hardware according to what is basically perceived DX11 performance, not according to what the cards can do at their maximum (which is what happens in most DX12 titles), so AMD prices accordingly. The 380X reaching 970 performance is one example of pricing like that. If you take that into account, then yeah, AMD does have an advantage in DX12 in terms of performance per dollar. http://www.guru3d.com/index.php?ct=articles&action=file&id=22301
Almost all the DX12 titles so far have been AMD-sponsored, but I'm not even sure how much that matters anymore. The two titles where NVIDIA does better are DX11 engines with DX12 patched on top.
Can we get a source for that? I mean, every time AMD does better in something, someone claims that AMD paid for it. At least with NVIDIA's GameWorks it is pretty clear when they had a hand in it.
I feel like a broken record. AMD does not have an inherent DX12 advantage. AMD cards tend to perform far better in DX12 compared to DX11 because of CPU overhead. In Ashes of the Singularity (and very probably this game as well) raw compute throughput is the major determining factor in game performance.
You and me both. Although disabling async compute reduces performance by about 10%, most of the increase comes from removing the driver overhead, like both of us have already said. It doesn't matter which game it is, we always seem to come across claims that it's been sabotaged against NVIDIA, but I think people are over-hyping async compute way too much. It's more that NVIDIA cards have been running at ~99% efficiency while AMD cards have been running at ~60%, and DX12 has allowed them to reach ~99% because the driver is no longer fighting for resources on the one CPU core that the application is also running its code on. I think that's also the reason why frame pacing is typically worse on AMD cards compared to NVIDIA.
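For context on what "async compute" actually is at the API level, here is a minimal D3D12 sketch (a generic illustration, not this game's implementation): a second command queue of type COMPUTE whose dispatches the GPU may service concurrently with the graphics queue, with a fence synchronising the two. Whether that overlap is worth ~10% or nothing depends on the hardware scheduler and the workload.

```cpp
// Minimal sketch, assuming a Windows + D3D12 toolchain (link against d3d12.lib).
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Graphics (direct) queue: all rendering work is submitted here.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Compute queue: dispatches submitted here may overlap with graphics
    // work if the GPU's scheduler supports concurrent execution.
    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> cmpQueue;
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&cmpQueue));

    // A fence synchronises the queues where one depends on the other.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    cmpQueue->Signal(fence.Get(), 1); // compute side: "pass 1 results are ready"
    gfxQueue->Wait(fence.Get(), 1);   // graphics side: wait before consuming them
}
```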
Can we get a source for that? I mean, every time AMD does better in something, someone claims that AMD paid for it. At least with NVIDIA's GameWorks it is pretty clear when they had a hand in it.
I won't even get into that, there's no point. The truth of the matter, though, is that in all engines developed with DX12 in mind, AMD has been getting much higher than their usually expected performance. Yes, those were games sponsored by AMD, but on the other hand there haven't been any closed libraries like GameWorks involved, which means NVIDIA has had access. They didn't complain about lack of access either. The two games NVIDIA "win" are Rise of the Tomb Raider (a game shipped three months before it was ready, with DX12 patched on) and Gears of War Ultimate Edition (an atrocious original release with DX12/UWP bolted on top of Unreal Engine 3).
Total War has always performed iffy and needed raw clock speed to mitigate it. I bet AMD offered a DX12 implementation to bring performance gains across the board. I doubt they just dropped a suitcase of money and said "optimize for our cards".
Total War has always performed iffy and needed raw clock speed to mitigate it. I bet AMD offered a DX12 implementation to bring performance gains across the board. I doubt they just dropped a suitcase of money and said "optimize for our cards".
Well, since AMD is in the red financially, it would have been a pretty light suitcase 😀 lol j/k
Well, since AMD is in the red financially, it would have been a pretty light suitcase 😀 lol j/k
Monopoly money. Or Fury dies as a souvenir. :3eyes:
No, it's not. Unless you compare 16nm NVIDIA to 28nm AMD, it does not. The 390X is practically as fast as the 980 Ti, and the 380X as fast as the 970. You say that AMD does not have an inherent DX12 advantage, which I agree with. The fact is, though, that NVIDIA is pricing their hardware according to what is basically perceived DX11 performance, not according to what the cards can do at their maximum (which is what happens in most DX12 titles), so AMD prices accordingly. The 380X reaching 970 performance is one example of pricing like that. If you take that into account, then yeah, AMD does have an advantage in DX12 in terms of performance per dollar. http://www.guru3d.com/index.php?ct=articles&action=file&id=22301
Yeah, you're right, and that's perfectly valid, but you know my qualms with it. AotS is compute heavy, so what we're seeing in the results you just posted is AMD's cards (which offer higher compute at each tier) outperforming their competitors. That isn't AMD performing better in AotS though, that's the individual cards doing better than their NVIDIA counterparts because of the higher compute. What matters to me (and maybe nobody else) is how they perform, in this case, given equal shader throughput, and NVIDIA takes the lead there. Overclock a 390X to 1200 MHz and compare it to a reference 980 Ti, or overclock a 980 Ti to 1526 MHz and compare it to a Fury X.
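To make the "equal shader throughput" comparison concrete: the 390X and the 980 Ti both carry 2816 shaders, so matching clocks means matching theoretical FLOPS, and a 980 Ti pushed to ~1526 MHz lands on the Fury X's stock throughput. A rough sketch using public shader counts; treating ~1.2 GHz as a typical real-world boost for a reference 980 Ti is my assumption, not a figure from the article:

```cpp
#include <cstdio>

// Theoretical FP32 throughput in TFLOPS: shaders * 2 ops/clock * clock (GHz) / 1000.
double tflops(int shaders, double clock_ghz) {
    return shaders * 2.0 * clock_ghz / 1000.0;
}

int main() {
    // 390X and 980 Ti both have 2816 shaders: equal clocks -> equal theoretical FLOPS.
    std::printf("R9 390X @ 1200 MHz   : %.2f TFLOPS\n", tflops(2816, 1.200));
    std::printf("980 Ti  @ ~1200 MHz  : %.2f TFLOPS\n", tflops(2816, 1.200)); // assumed typical boost
    // A 980 Ti at ~1526 MHz matches the Fury X's stock throughput.
    std::printf("980 Ti  @ 1526 MHz   : %.2f TFLOPS\n", tflops(2816, 1.526)); // ~8.59
    std::printf("Fury X  @ 1050 stock : %.2f TFLOPS\n", tflops(4096, 1.050)); // ~8.60
}
```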
Warhammer is supposed to include async compute (the AMD way), yet I see no indication of it.
You guys getting hyped up about a particular GPU's performance really shouldn't... I haven't read the article, but y'all need to know that these Total War games have always been almost 100% CPU dependent; the GPU never really mattered. Their "Warscape" engine has been a piece of **** for a long time. All it's ever done in the past is stress out the first core to 100% and not give a flying fck about what GPUs you have (I got zero performance increase going from a 660 Ti to a 290X, and just a few extra frames going from single card to CrossFire). With that said, given the terrible history the series has with performance problems, I don't care that they've upgraded to 64-bit, I don't trust them... I also couldn't give a **** about Warhammer, I'm a history nerd, not a fantasy one.
You guys getting hyped up about a particular GPU's performance really shouldn't... I haven't read the article, but y'all need to know that these Total War games have always been almost 100% CPU dependent; the GPU never really mattered. Their "Warscape" engine has been a piece of **** for a long time. All it's ever done in the past is stress out the first core to 100% and not give a flying fck about what GPUs you have (I got zero performance increase going from a 660 Ti to a 290X, and just a few extra frames going from single card to CrossFire). With that said, given the terrible history the series has with performance problems, I don't care that they've upgraded to 64-bit, I don't trust them... I also couldn't give a **** about Warhammer, I'm a history nerd, not a fantasy one.
Cool story but the game performs fine lol
Unrelated but just realized a VR platform for playing tabletop RPGs online would be God damn amazing
Unrelated but just realized a VR platform for playing tabletop RPGs online would be God damn amazing
Yeah, the potential is unlimited. I foresee mass weight gain.
Warhammer is supposed to include async compute (the AMD way), yet I see no indication of it.
As far as I'm aware there is no async compute involved in this title, or very little; the patch is really about balancing the workload and using more CPU cores. So it's performance based. There was an article I read last week that showed both AMD and NVIDIA cards getting huge boosts; of course AMD's boost was a lot higher, but again that is down to their high driver overhead being removed. The point remains that it's all about getting the most out of the CPU that's being used... as with all strategy games, which are far more CPU intensive than graphics intensive. Hence why HH includes CPU scaling. DX12 allows up to 6 cores to be utilised, and the likes of Vulkan/Mantle 8+. NVIDIA has had a workaround for the driver overhead for years now and offloads work to idle parts of the CPU, while AMD's driver does not. Still better than Tomb Raider's "CPU" balancing, which ends up being negative for both AMD and NVIDIA and didn't seem to have much impact on CPU performance at all.
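On the "more CPU cores" point: the mechanism that lets a DX12 engine spread its load is multi-threaded command-list recording, where each worker thread fills its own command list and the queue receives everything in one submission. A minimal sketch of that pattern, assuming a Windows + D3D12 build (this is a generic illustration, not Creative Assembly's code):

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // A single direct queue receives the work, however many threads recorded it.
    D3D12_COMMAND_QUEUE_DESC qDesc = {};
    qDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qDesc, IID_PPV_ARGS(&queue));

    // Each worker gets its own allocator + command list, so recording needs
    // no locks and can run on as many cores as the engine decides to use.
    const unsigned workers = std::thread::hardware_concurrency();
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    for (unsigned i = 0; i < workers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Record in parallel; a real engine would issue its draw calls here.
    std::vector<std::thread> threads;
    for (unsigned i = 0; i < workers; ++i) {
        threads.emplace_back([&lists, i] {
            // ... SetPipelineState / DrawInstanced calls for this thread's batch ...
            lists[i]->Close();
        });
    }
    for (auto& t : threads) t.join();

    // Submit everything at once; unlike DX11 there is no single driver thread
    // serializing all of this behind the application's back.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```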
That isn't AMD performing better in AotS though, that's the individual cards doing better than their NVIDIA counterparts because of the higher compute. What matters to me (and maybe nobody else) is how they perform, in this case, given equal shader throughput, and NVIDIA takes the lead there. Overclock a 390X to 1200 MHz and compare it to a reference 980 Ti, or overclock a 980 Ti to 1526 MHz and compare it to a Fury X.
So they actually are performing better in AotS, because AotS is a compute-heavy game and AMD cards are better at compute (concurrent compute at that). NVIDIA shaders are more "expensive": they occupy more die area and they are more efficient. The total TFLOPS might be similar, but NVIDIA cards make use of theirs more of the time because of that. The fact is that all DX12 games whose engines were built for DX12 lean on compute by default. The reason I exclude Tomb Raider and Gears is not that they are NVIDIA titles, it's that DX12 was hastily bolted onto old engines in both cases. I don't believe it was luck that NVIDIA chose those titles to sponsor either. I can't see that pattern changing; it would be suicidal for cross-platform games.

The whole point is that NVIDIA is basically a "weird" company surrounded by AMD products of similar usage on other platforms. They have 70% of a very specific market (PC gamers), which is not small, but it is not a target platform for any AAA releases that actually need the GPUs NVIDIA makes. They can't really go into consoles because of the CPU designs required and the lack of an x86 license, and even if they had those things, the ship has sailed already. These "new" consoles are looking more and more like "platforms", like the Apple App Store etc. That requires backwards compatibility, which means no new contracts with NVIDIA for the foreseeable future. The only reason they will survive is CUDA and the huge mindshare-capturing products sold at enormous profit margins (look at the 1080: it's as cheap to build as it can possibly get, my 7970 has a more complicated PCB and a larger die than that, don't tell me it costs more than $250 to make and ship anywhere).

That's why all of their strategy revolves around soft or hard lock-ins. G-Sync monitors, PhysX, GameWorks: they are all made to make sure that if you have invested a bit more in the NVIDIA environment, you stay there. These are NOT accusations, I'm just noticing some patterns. The GPU world would lose immensely if NVIDIA ever went away, and I'm actually happy they are doing so well. But don't ever think that they will keep being the target platform for anything. Anything optimized for NVIDIA will be that way either because it caters to the PC market only, or because of a special per-case deal. AMD already has even Ubisoft in the fold, along with EA, Square Enix and id. That's like 50% of the big publishers, if not more. Everyone else doing AAA has to build their engine to cater to the particularities of GCN, first GCN 1.0/1.1 and now Polaris.
You guys getting hyped up about a particular GPU's performance really shouldn't... I haven't read the article, but y'all need to know that these Total War games have always been almost 100% CPU dependent; the GPU never really mattered. Their "Warscape" engine has been a piece of **** for a long time. All it's ever done in the past is stress out the first core to 100% and not give a flying fck about what GPUs you have (I got zero performance increase going from a 660 Ti to a 290X, and just a few extra frames going from single card to CrossFire). With that said, given the terrible history the series has with performance problems, I don't care that they've upgraded to 64-bit, I don't trust them... I also couldn't give a **** about Warhammer, I'm a history nerd, not a fantasy one.
Well dude, read the article then. Both AMD and NVIDIA get really big performance boosts with DX12. A Total War game is actually a very good choice for a low level API, just because of the reasons you mentioned. PCGamer tested with DX11 too. Even NVIDIA cards get 2x the FPS by switching to DX12.
Unrelated but just realized a VR platform for playing tabletop RPGs online would be God damn amazing
Hot damn, I was actually thinking the same about a week ago. Even better, AR. Imagine D&D on an actual tabletop with animated figures that react depending on what you do and what you roll :banana:
I think most people who invest $500+ in a GPU will be enthusiasts and therefore overclock. I have overclocked every single GPU I've owned since my 3dfx Voodoo. I think most do.
Honestly, my Fury X is the first GPU I haven't overclocked. It's also the first GPU I've spent over $300 on. I don't even feel the need to OC it; in every game I'm always above the FPS I'm happy with. My HD 7950s were the GPUs I OC'd the piss out of the most: 1300 core, 1500 memory, when stock was 860/1200. Averaging 120 fps in BF4 on ultra @ 2560x1080 is awesome; even on a 75 Hz ultrawide it's very nice.
You guys getting hyped up about a particular GPU's performance really shouldn't... I haven't read the article, but y'all need to know that these Total War games have always been almost 100% CPU dependent; the GPU never really mattered. Their "Warscape" engine has been a piece of **** for a long time. All it's ever done in the past is stress out the first core to 100% and not give a flying fck about what GPUs you have (I got zero performance increase going from a 660 Ti to a 290X, and just a few extra frames going from single card to CrossFire). With that said, given the terrible history the series has with performance problems, I don't care that they've upgraded to 64-bit, I don't trust them... I also couldn't give a **** about Warhammer, I'm a history nerd, not a fantasy one.
Jesus Christ man, again? Stop rambling on with your hate and crap. You've been told TW: Warhammer is performing and running great, yet you still can't keep your mouth closed and ramble on about how you feel :3eyes: We understand... you had a bad experience with the devs in the past, but as has been said, they really improved the engine this time.