Review: Total War WARHAMMER DirectX 12 PC GFX & CPU scaling performance
Aelders
PrMinisterGR
Almost all of the titles have been AMD-sponsored, but I'm not even sure how much that matters any more. The two titles where NVIDIA does better are DX11 engines with DX12 patched on.
Aelders
PrMinisterGR
http://www.guru3d.com/index.php?ct=articles&action=file&id=22301
No, it's not. Unless you compare 16nm NVIDIA to 28nm AMD, it does not. The 390X is practically as fast as the 980 Ti, and the 380X as the 970. You say that AMD does not have an inherent DX12 advantage, which I agree with. The fact is, though, that NVIDIA prices its hardware according to what is basically perceived DX11 performance, not what the cards can do at their best (as happens in most DX12 titles), so AMD prices accordingly. The 380X reaching 970 performance is one example of pricing like that. If you take that into account, then yeah, AMD does have an advantage in DX12 in terms of performance per dollar.
vazup
Dazz
PrMinisterGR
GeniusPr0
Total War has always performed iffily and required raw clock speed to compensate. I bet they were offered a DX12 implementation to bring performance gains across the board. I doubt AMD just dropped off a suitcase of money and said "optimize for our cards."
Dazz
GeniusPr0
Aelders
GeniusPr0
Warhammer is supposed to include async compute (the AMD way), yet I see no indication of it.
Illyrian
You guys getting hyped up about a particular GPU's performance really shouldn't...
I haven't read the article, but y'all need to know that these Total War games have always been almost 100% CPU-dependent; GPUs never really mattered...
The "Warscape" engine they use has been a piece of **** for a long time.
All it's ever done in the past is stress the first core to 100% and not give a flying fck about what GPUs you have (I got zero performance increase going from a 660 Ti to a 290X, and just a few extra frames going from a single card to CrossFire).
Still, with that said, given the series' terrible history of performance problems, I don't care that they've upgraded to 64-bit; I don't trust them... Also, I couldn't give a **** about Warhammer; I'm a history nerd, not a fantasy one.
GeniusPr0
Aelders
Unrelated, but I just realized that a VR platform for playing tabletop RPGs online would be goddamn amazing.
GeniusPr0
Dazz
PrMinisterGR
PCGamer tested with DX11 too. Even NVIDIA cards get 2x the FPS by switching to DX12.
Hot damn, I was actually thinking the same thing about a week ago. Even better: AR. Imagine D&D on an actual tabletop, with animated figures that react depending on what you do and what you roll :banana:
So, they actually are performing better in AotS, because AotS is a compute-heavy game and AMD cards are better at compute (concurrent compute, at that). NVIDIA's shaders are more "expensive": they occupy more die area, but they are more efficient. The total TFLOPS might be similar, but NVIDIA cards keep theirs busy more of the time because of that. The fact is that all DX12 games whose engines were built for DX12 lean on concurrent compute by default. The reason I exclude Tomb Raider and Gears is not that they are NVIDIA titles; it is that in both cases DX12 was hastily bolted onto an old engine. I don't believe it was luck that NVIDIA chose those titles to sponsor, either. I can't see that pattern changing; breaking it would be suicidal for cross-platform games.
The whole point is that NVIDIA is basically a "weird" company, surrounded on every other platform by AMD products serving similar uses. They have 70% of a very specific market (PC gamers), which is not small, but which is not the target platform for any AAA release that actually needs the GPUs NVIDIA makes. They can't really go into consoles because of the CPU designs required and their lack of an x86 license, and even if they had those things, that ship has already sailed. These "new" consoles are looking more and more like "platforms" in the sense of the Apple App Store. That requires backwards compatibility, which means no new contracts with NVIDIA for the foreseeable future. The only reasons they will survive are CUDA and the big mindshare-capturing products sold at enormous profit margins (look at the 1080: it's built as cheaply as it can possibly be; my 7970 has a more complicated PCB and a larger die; don't tell me it costs more than $250 to make and ship anywhere).
That's why their whole strategy revolves around soft or hard lock-ins. G-Sync monitors, PhysX, GameWorks: they are all there to make sure that once you have invested a bit more in the NVIDIA ecosystem, you stay there. These are NOT accusations; I'm just noticing some patterns. The GPU world would lose immensely if NVIDIA ever went away, and I'm actually happy they are doing so well. But don't ever think they will remain the target platform for anything. Anything optimized for NVIDIA will be so either because it caters to the PC market only or because of a special per-case deal. AMD already has even Ubisoft in the fold, along with EA, Square Enix, and id. That's like 50% of the big publishers, if not more. Everyone else making AAA games has to build their engine around the particularities of GCN, first GCN 1.0/1.1 and now Polaris.
Well dude, read the article then. Both AMD and NVIDIA get really big performance boosts with DX12. A Total War game is actually a very good candidate for a low-level API, precisely for the reasons you mentioned.
Agonist
evilkiller650