Total War WARHAMMER DX12: PC graphics performance benchmark review


The main point I took away from the CPU scaling was that IPC is still more important than the number of cores once the 8370 was included. It was interesting to see that there's a point of diminishing returns between 1080p and 1440p once the performance floor is surpassed. It doesn't seem like Bulldozer was even on the ground floor. Good thing Zen is coming.
I would hope Vega murders the 1080. The 1080 is GP104. That's like saying the Fury X should murder the 980.
Are you saying that AMD's high end should not be compared with Nvidia's? In any case, the regular Fury smashes the 980 in this benchmark too.
-pets his 390-
I was just about to say that 😉 I own a 780, but that R9 390 has been a cracking card 🙂 Also nice to see AMD starting to get their DX12 frametimes as good and as smooth as they are on DX11; they were struggling a tiny bit in Hilbert's other DX12 tests. I'm really looking forward to Polaris & Vega. AMD seem to support their customers very well, whereas Nvidia's support for newer game optimizations on their older products has been very poor.
Are you saying that AMD's high end should not be compared with Nvidia's?
No, he isn't. He just got confused. There are two Vega GPUs, four SKUs: "big Vega" will go against the Titan/Ti, "little Vega" will go against the 1080/1070.
In any case, the regular Fury smashes the 980 in this benchmark
The Fury is faster by more than 20%! Interesting number, that 20%... 20% is also an average OC on a 980. I wonder how a 980 @ 1500 MHz would do in this test.
For now, yes. If you take a look at how the Fury X performs, I have a feeling Vega will murder the 1080.
It performs pretty much identically to the much cheaper 1070, at least that's what I'm seeing, and it only just outperforms the cheaper 980 Ti. Then there's the MLAA thing favoring AMD. Nvidia looks fine in this chart; I can't see any advantage for AMD at all. Nvidia performs the same and is cheaper (both the 1070 and the 980 Ti). Not to mention overclocking: factor that in and it's a clear win for Nvidia. On lower-end cards, AMD do seem to have an advantage, though.
No, he isn't. He just got confused. There are two Vega GPUs, four SKUs: "big Vega" will go against the Titan/Ti, "little Vega" will go against the 1080/1070. The Fury is faster by more than 20%! Interesting number, that 20%... 20% is also an average OC on a 980. I wonder how a 980 @ 1500 MHz would do in this test.
But like everything else they can be overclocked too; I think it makes sense to have a set baseline of default clocks, and if you overclock, well, you get more performance. The biggest performance killer is depth of field. I'm not sure what the unlimited video memory option is used for, but I have an R9 290 so I'm limited to 4 GB of VRAM. I wonder if unlimited video memory is for caching or something. Maybe the 8 GB cards will pull away with that enabled?
No, he isn't. He just got confused. Interesting number, that 20%... 20% is also an average OC on a 980. I wonder how a 980 @ 1500 MHz would do in this test.
I didn't get confused. I know there will be two Vega GPUs, but when you use "Vega" like that and don't specify, I think it's pretty clear that you're either referring to the larger of the two or don't know two exist in the first place.
Are you saying that AMD's high end should not be compared with Nvidia's? In any case, the regular Fury smashes the 980 in this benchmark too.
I'm not saying AMD's high end shouldn't be compared to Nvidia's, but Nvidia's high end isn't GP104, it's GP102, which will probably be out around the same time Vega is. Just because AMD isn't currently competing against GP104 doesn't magically make GP104 the best Nvidia has to offer with Pascal.
"Hitman is a cache what you can continuously type of title" Not sure what happened there but I think it might need to be fixed? Great article otherwise, very interesting AMD CPU results.
I didn't get confused. I know there will be two Vega GPUs, but when you use "Vega" like that and don't specify, I think it's pretty clear that you're either referring to the larger of the two or don't know two exist in the first place. I'm not saying AMD's high end shouldn't be compared to Nvidia's, but Nvidia's high end isn't GP104, it's GP102, which will probably be out around the same time Vega is. Just because AMD isn't currently competing against GP104 doesn't magically make GP104 the best Nvidia has to offer with Pascal.
I wouldn't call GP102 high-end simply because then you have to call GP104 midrange. Let's call GP102 enthusiast :P
But like everything else they can be overclocked too; I think it makes sense to have a set baseline of default clocks, and if you overclock, well, you get more performance.
I'm not sure what you mean there. My point is that a stock 980 runs at around 1220 MHz (reference) and can comfortably hit 1500+. A stock Fury runs at 1.05 GHz, and you're lucky if you're stable at 1150. That's roughly 10% vs 25% of overclocking headroom.
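For what it's worth, those two percentages check out. Here is a minimal Python sketch of the headroom arithmetic, using only the clock figures quoted in the post above (the OC clocks are the poster's claims, not measured values):

```python
# Overclocking headroom as a percentage gain over the stock clock.
# Clock figures are the ones quoted in the post above, not measurements.

def headroom_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage clock gain of an overclock over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"GTX 980: {headroom_percent(1220, 1500):.0f}%")  # ~23%; the quoted 25% assumes just over 1500 MHz
print(f"Fury:    {headroom_percent(1050, 1150):.0f}%")  # ~10%
```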
A stock Fury runs at 1.05 GHz, and you're lucky if you're stable at 1150. That's roughly 10% vs 25% of overclocking headroom.
What I mean is, I think HH was a little stretched for time to play about with tweaking every single graphics card, and although yours may hit 1500 MHz, that doesn't mean they all will, so leaving them at a baseline is the best thing to do.
What happens when your baseline is more expensive and worse performing? lol I feel like the review process needs to change for the 1080, or we just punish Nvidia for coming up with it in the first place.
What I mean is, I think HH was a little stretched for time to play about with tweaking every single graphics card, and although yours may hit 1500 MHz, that doesn't mean they all will, so leaving them at a baseline is the best thing to do.
I never suggested HH do that, and I don't have a 980. Most REFERENCE 980s in reviews hit approximately 1500, so it seems like a good OC baseline. The fact is, when you see reviews citing 980 performance, they're talking reference, and you can squeeze another 20-25% out of them. I'm not saying Hilbert should test every card OC'd, just that readers should be aware of this.
What happens when your baseline is more expensive and worse performing? lol I feel like the review process needs to change for the 1080, or we just punish Nvidia for coming up with it in the first place.
Well, in fairness, they want to make their new product look good; they don't want the 980 Ti @ 1.4 GHz catching up to their shiny new 1080, making people think, well, why the hell would I buy that?
I never suggested HH do that, and I don't have a 980. Most REFERENCE 980s in reviews hit approximately 1500, so it seems like a good OC baseline. The fact is, when you see reviews citing 980 performance, they're talking reference, and you can squeeze another 20-25% out of them. I'm not saying Hilbert should test every card OC'd, just that readers should be aware of this.
But again, not everyone overclocks; that's why I said if you know what you can get at least, and you get more when you overclock, then fair enough. A lot of people check benchmarks for games, and I bet a good number of them have never even overclocked a video card. If you overclock the test cards and they don't, they'll think, why is my card so slow? No one will complain if theirs turns out faster; see my point? So anyway, assuming an ideal world where performance increases by exactly the percentage of the overclock, the 980 would go from 51 to about 64 fps while the Fury would go from 63 to about 69 fps at 1440p.
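Taking that ideal-world assumption at face value, the scaling is a single multiply. A minimal Python sketch, using the 1440p baseline figures quoted above and the overclock percentages from the earlier posts (real games rarely scale perfectly with clock speed, so treat these as upper bounds):

```python
# Best-case estimate: assume FPS scales linearly with the core overclock.
# Baseline 1440p FPS values are the ones quoted in the post above.

def oc_fps(base_fps: float, oc_percent: float) -> float:
    """FPS after an overclock, assuming perfectly linear scaling."""
    return base_fps * (1 + oc_percent / 100)

print(f"GTX 980 (+25%): {oc_fps(51, 25):.0f} fps")  # 51 -> ~64 fps
print(f"Fury    (+10%): {oc_fps(63, 10):.0f} fps")  # 63 -> ~69 fps
```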
Hmmmm... The AMD Fury and DX12 seem to be a "glamorous" pairing, and the forthcoming AMD GPUs look like they'll be cost-effective and elegant... Waiting... for...
I think most people who invest $500+ in a GPU will be enthusiasts and will therefore overclock. I have overclocked every single GPU I've owned since the 3dfx Voodoo days. I think most do.
I don't even know what to think about these DX12 benchmark tests. For a start, they all seem to favor AMD cards, which means god only knows what Nvidia cards are really capable of. We have to remember that HH only uses default clocks on reference cards, and I doubt anyone here even runs a reference card. What I do know is that my 980 G1 will outscore a reference card by as much as 20% in some tests/benchmark results. When H does a test like this, I usually get about an extra 10 fps on top of his reference results. He's getting 72 fps on a reference 980 @ 1080p, which for all we know is crippled by the AMD-sponsored game engine. I'd get 80+ fps on my G1 980, which for me is more than enough to play an AMD DX12 game from 2016, so I'm actually very happy with that, and I hope to see some actual Nvidia-favored DX12 games in the near future.
omg! Just look at my R9 390 beating the GTX 980... Finally we see DX12 bring out the full power of GCN, and it's amazing.
I don't even know what to think about these DX12 benchmark tests. For a start, they all seem to favor AMD cards, which means god only knows what Nvidia cards are really capable of. We have to remember that HH only uses default clocks on reference cards, and I doubt anyone here even runs a reference card. What I do know is that my 980 G1 will outscore a reference card by as much as 20% in some tests/benchmark results. When H does a test like this, I usually get about an extra 10 fps on top of his reference results. He's getting 72 fps on a reference 980 @ 1080p, which for all we know is crippled by the AMD-sponsored game engine. I'd get 80+ fps on my G1 980, which for me is more than enough to play an AMD DX12 game from 2016, so I'm actually very happy with that, and I hope to see some actual Nvidia-favored DX12 games in the near future.
See, I'm not so sure about that, as they only partnered up with AMD less than two months before release, which to me is too late in development for that. The main reason AMD performs so well in DX12 titles is the massive driver overhead that has held AMD back in DX11 on an otherwise sound architecture. The fact is Nvidia has been competing against crippled products, because they had the advantage of well-optimised drivers to get the most out of their own architecture.
I think most people who invest $500+ in a GPU will be enthusiasts and will therefore overclock. I have overclocked every single GPU I've owned since the 3dfx Voodoo days. I think most do.
Amen, brother, lol. Ordinary Joes pay $250, maybe $300 on a good day.
Matt at techtested (YouTube video) did this already with DX12 and showed 8 cores was the sweet spot for DX12, value-for-money-wise... Sorry, I can't post links, I need 5 or more posts on Guru3D.