Hitman 2: PC graphics DX12 (v2.20) performance update

I know the release notes say Kepler isn't supported, but I tried it anyway, and it hard-locks just as the mission finishes loading and gameplay is about to start. Does that have to do with the game requiring feature level 11_1, or would the game simply refuse to start if that were the case? It's a shame, because DX12 was already a huge benefit in Hitman 1 thanks to the better CPU performance.
G4600 2c/4t @ 3.6 GHz, Marrakesh location, all Ultra: DX11 27 fps avg, DX12 38 fps avg. Also fewer stutters in DX12. Game changer for me.
Strange Times:

G4600 2c/4t @ 3.6 GHz, Marrakesh location, all Ultra: DX11 27 fps avg, DX12 38 fps avg. Also fewer stutters in DX12. Game changer for me.
That seems like a great result. Now if only your card supported FreeSync, you would be golden.
DX12 helps a lot in CPU-limited scenes in HITMAN 2; I got +20-40 fps. It's a totally different game for me now. IOI rushed the DX11 implementation for Turing, which is why DX12 shines so much. I recorded a benchmark: [youtube=twxe7dNWeKE]

Specs: Windows 10 Professional (x64) 1809, Intel Core i7 9700K @ 4.9 GHz, MSI GeForce RTX 2080 Gaming X Trio, 16GB (2x8GB) DDR4 dual channel 2666 MHz (Intel XMP on), Nvidia driver 419.67, 2560x1440 (DSR), G-SYNC, 144Hz.

Pascal GPUs benefit from these gains too: https://www.reddit.com/r/HiTMAN/comments/b5p79r/some_hitman_2_directx12_benchmarks_massive_fps/

The DX12 implementation is not perfect, since the results in 4K are better with DX11, and SLI is not working at the moment. Gains with Turing are really massive at 1440p/1080p in all CPU-limited areas, between 20-50 fps. Awesome.

Edit: For clarification, the official benchmark score is not the best way to see the difference, which is why Guru3D's test does not represent reality; the difference between DX11 and DX12 in HITMAN 2 on a good computer is extraordinary, except in 4K. To really notice the gains, play the game rather than the internal benchmark (especially the Miami one).
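The pattern described in that post (big gains at 1080p/1440p, none at 4K) is consistent with a simple bound model: frame time is roughly the longer of the CPU time and the GPU time per frame, so cutting API/driver overhead only raises fps where the CPU is the limit. A toy sketch of that reasoning, with made-up numbers rather than measurements:

```python
# Toy model: frame rate is limited by whichever of CPU or GPU
# takes longer per frame. All millisecond values below are
# illustrative assumptions, not benchmark data.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# 1440p: the GPU is fast, so the CPU/driver is the bottleneck.
dx11_1440p = fps(cpu_ms=14.0, gpu_ms=9.0)  # CPU-bound: ~71 fps
dx12_1440p = fps(cpu_ms=9.0, gpu_ms=9.0)   # less API overhead: ~111 fps

# 4K: the GPU dominates, so reducing CPU overhead changes nothing.
dx11_4k = fps(cpu_ms=14.0, gpu_ms=22.0)    # GPU-bound: ~45 fps
dx12_4k = fps(cpu_ms=9.0, gpu_ms=22.0)     # still GPU-bound: ~45 fps
```

This also matches the poster's note that the internal benchmark understates the difference: gameplay scenes with lots of NPCs are far more CPU-bound than a scripted flythrough.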
_Hardware_:

DX12 helps a lot in CPU-limited scenes in HITMAN 2; I got +20-40 fps. It's a totally different game for me now. IOI rushed the DX11 implementation for Turing, which is why DX12 shines so much. I recorded a benchmark: [youtube=twxe7dNWeKE]

Specs: Windows 10 Professional (x64) 1809, Intel Core i7 9700K @ 4.9 GHz, MSI GeForce RTX 2080 Gaming X Trio, 16GB (2x8GB) DDR4 dual channel 2666 MHz (Intel XMP on), Nvidia driver 419.67, 2560x1440 (DSR), G-SYNC, 144Hz.

Pascal GPUs benefit from these gains too: https://www.reddit.com/r/HiTMAN/comments/b5p79r/some_hitman_2_directx12_benchmarks_massive_fps/

The DX12 implementation is not perfect, since the results in 4K are better with DX11, and SLI is not working at the moment. Gains with Turing are really massive at 1440p/1080p in all CPU-limited areas, between 20-50 fps. Awesome.

Edit: For clarification, the official benchmark score is not the best way to see the difference, which is why Guru3D's test does not represent reality; the difference between DX11 and DX12 in HITMAN 2 on a good computer is extraordinary, except in 4K. To really notice the gains, play the game rather than the internal benchmark (especially the Miami one).
Absolutely incredible performance improvement with DX12 in this patch, on both Pascal and Turing. Some people say Pascal is not good at DX12, but this proves it's just a matter of optimization, not some problem with the GPU architecture.
Maybe DX12 improves frame times or min fps rather than max fps? Dunno...
HardwareCaps:

But but Dx12 is the future (laughs my ass off)
DX12 is DX10 all over again.
HardwareCaps:

But but Dx12 is the future (laughs my ass off)
You can make a Ferrari run like a turd if you don't build it correctly.
Is DX12 mGPU supported with this update? The first game had quite a solid implementation; my CPU was severely bottlenecking my GPUs in DX11, which was not the case in DX12. Very handy for 1440p 144Hz monitors.
Aura89:

You can make a Ferrari run like a turd if you don't build it correctly.
The questions "how many companies can build a Ferrari?" and "how viable is it for mass production?" work pretty well here.
Berke53:

Is DX12 mGPU supported with this update? The first game had quite a solid implementation; my CPU was severely bottlenecking my GPUs in DX11, which was not the case in DX12. Very handy for 1440p 144Hz monitors.
Not yet. Only DX11 supports multi-GPU for the moment.
HardwareCaps:

The questions "how many companies can build a Ferrari?" and "how viable is it for mass production?" work pretty well here.
There's nothing unviable about DX12, and any number of companies can easily make games with it. That being said, going back into a game that was built on DX11 and implementing DX12 after the fact is typically why there are issues. Even many games that support DX12 from the beginning did not start being built on DX12 at the start of development. Like with most things, if you cater to one type of development and add another halfway in, rather than fully supporting a single one, you'll have worse performance across the board. This is one of the principal reasons consoles can generally do more with their hardware: developers only have one, two, or three hardware and software configurations to build for.
Undying:

It's strange that Polaris does not benefit from DX12 in this game, when most other DX12 games do. Bad implementation, I would say; performance is all over the place. We can only dream that developers will do a better job with DX12, as with Tomb Raider and The Division 2.
While the DX12 implementation is better in The Division 2 than it was in the original game, it's still broken. DX12 results in flickering textures, crashes, stutters and more stutters... and a lower framerate.
Aura89:

There's nothing unviable about DX12, and any number of companies can easily make games with it. That being said, going back into a game that was built on DX11 and implementing DX12 after the fact is typically why there are issues. Even many games that support DX12 from the beginning did not start being built on DX12 at the start of development. Like with most things, if you cater to one type of development and add another halfway in, rather than fully supporting a single one, you'll have worse performance across the board. This is one of the principal reasons consoles can generally do more with their hardware: developers only have one, two, or three hardware and software configurations to build for.
Most developers even today go for DX11 over DX12. DX12 removes a lot of the abstraction that exists in DX11 (API overhead, driver work, etc.). That's great, because it gives developers the ability to fine-tune and control more things; the issue begins with studios that don't have experience with low-level APIs and with the things AMD/Nvidia do for them in DX11 (frame pacing, instruction and memory control, etc.). Small studios that can't hire or spend money on those kinds of skills will simply use DX11 because it's easier. Eventually developers will have to use DX12 (DXR, variable rate shading, and I'm sure more features will follow), but DX11 is going to stay strong for at least the next three years.
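To make "frame pacing" concrete: it's the decision of how long to wait after finishing a frame so that frames come out at an even cadence, which the driver largely handles in DX11 but the engine must handle itself under DX12. This is a language-agnostic sketch of that decision in Python, not D3D12 code; the 60 Hz target and the work times are illustrative assumptions:

```python
TARGET_MS = 1000.0 / 60.0  # aim for an even 60 Hz presentation cadence

def pace_delay_ms(frame_work_ms: float, target_ms: float = TARGET_MS) -> float:
    """How long to wait after finishing a frame so frames stay evenly
    spaced. If the frame already took longer than the target, don't wait."""
    return max(0.0, target_ms - frame_work_ms)

def run_frames(work_times_ms):
    """Simulated render loop: returns the presented-frame intervals."""
    intervals = []
    for work in work_times_ms:
        delay = pace_delay_ms(work)
        intervals.append(work + delay)  # uneven work, even presentation
    return intervals

# Frames that take 5, 12, and 20 ms of work present at roughly
# 16.7, 16.7, and 20 ms intervals: only the over-budget frame stutters.
print(run_frames([5.0, 12.0, 20.0]))
```

Get any part of this wrong (or skip it) and you see exactly the symptoms described above: uneven frame delivery reads as stutter even when the average framerate looks fine.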
Irenicus:

While the DX12 implementation is better in The Division 2 than it was in the original game, it's still broken. DX12 results in flickering textures, crashes, stutters and more stutters... and a lower framerate.
It makes sense, because that's what DX12 is: developers have to control frame pacing and handle GPU and CPU instructions and memory. Basically, more places for developers to mess up, and more resources required to produce a product.
HardwareCaps:

Most developers even today go for DX11 over DX12.
Yes, this is the issue. Sounds like you finally understand. The rest of your post implies you don't, but I'm hoping that first sentence will break through.
Aura89:

Yes, this is the issue. Sounds like you finally understand. The rest of your post implies you don't, but I'm hoping that first sentence will break through.
The rest of his post is basically in agreement with the bottom of yours. DX12 hands the developer a ton of responsibility once handled automagically by Microsoft/Nvidia/AMD. On consoles, with one standard hardware implementation, it's easier to debug that responsibility, find the issue or performance degradation, and fix it. On PC, the same issue might present itself differently across a variety of configurations, requiring either a separate codepath for each configuration or a "catch-all" fix that may not be as clean as a fix for a single configuration. That makes the issue not only harder to find (you need to test all the different configs) but also harder to fix once you've found it. On top of what you said (that developers are integrating DX12 after they've already built the game for DX11), this definitely adds to the complexity and the issues. DX11 was always supposed to coexist with DX12, and I don't see that changing. I think most of the larger devs have been getting progressively better at DX12, though; Battlefield and The Division have both come a long way since their first attempts.
Denial:

The rest of his post is basically in agreement with the bottom of yours. DX12 hands the developer a ton of responsibility once handled automagically by Microsoft/Nvidia/AMD. On consoles, with one standard hardware implementation, it's easier to debug that responsibility, find the issue or performance degradation, and fix it. On PC, the same issue might present itself differently across a variety of configurations, requiring either a separate codepath for each configuration or a "catch-all" fix that may not be as clean as a fix for a single configuration. That makes the issue not only harder to find (you need to test all the different configs) but also harder to fix once you've found it. On top of what you said (that developers are integrating DX12 after they've already built the game for DX11), this definitely adds to the complexity and the issues. DX11 was always supposed to coexist with DX12, and I don't see that changing. I think most of the larger devs have been getting progressively better at DX12, though; Battlefield and The Division have both come a long way since their first attempts.
Consoles have their own software specifically designed for low-level access, and the device is exactly the same across all users: same core counts, same GPU performance, same memory architecture. Big studios with in-house engines and skilled developers can handle DX12; medium-sized to indie studios can't.