Rise of the Tomb Raider Gets Multi-GPU DirectX 12 Patch

Wondering and implying it's a conspiracy aren't the same thing. As you said yourself, please fill me in on why they would have delayed such a patch on purpose until Pascal was out, if the game would have greatly benefited from the feature earlier. Maybe then I'll understand the logic behind your argument.
It's beneficial only for cards that can actually utilize async compute, meaning AMD GCN and NVIDIA Pascal (even if the latter isn't as flexible at it as GCN is), and possibly some Intel generations, though I'm not really sure about them, and they're not gaming GPUs for AAA games to begin with, so I guess they can be left out.

Maxwell can in theory support "async compute", but it can't actually benefit from it. You can allocate a certain portion of a Maxwell GPU to graphics and the rest to compute, and when the graphics or compute task finishes before the other, the resources allocated to the one that finished quicker sit idle until the next tasks come in. Changing the allocation needs an expensive context switch, slowing things down even further.

Now, as we know, RoTR's PC version is sponsored by NVIDIA; it has GameWorks bells and whistles and the TWIMTBP stamp. It released as a DX11 game and got a DX12 patch a little later. That DX12 patch was a sad joke: it missed a major performance-boosting feature like async compute (which was obviously utilized on consoles from the beginning) and performed worse than DX11 on both AMD and NVIDIA without even giving any extra eye candy for the performance loss. Now that NVIDIA has cards that can utilize async compute too, they suddenly release a new DX12 patch with async support. There are coincidences and there are "coincidences"; this one is just a bit too clear to be the former.

(Also, Nixxes really should address why on earth they left 1st gen GCN out of the async party; there's no technical reason for it.)
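For anyone wondering what "async compute" actually looks like on the programming side, here's a bare-bones D3D12 sketch (my own illustration with a made-up helper name, not Nixxes' code): the game just creates a compute queue next to the normal graphics queue and submits work to both. Whether the two streams actually overlap is decided by the GPU/driver, which is exactly why GCN and Pascal can gain from it while Maxwell doesn't.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Minimal sketch: at the API level, "async compute" just means submitting work on a
// separate compute queue alongside the direct (graphics) queue. How much the two
// queues overlap in practice is up to the hardware/driver, not this code.
bool CreateGraphicsAndComputeQueues(ID3D12Device* device,
                                    ComPtr<ID3D12CommandQueue>& graphicsQueue,
                                    ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics + compute + copy work
    if (FAILED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue))))
        return false;

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;     // compute-only queue ("async compute")
    return SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue)));
}
```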
Users in the Tomb Raider thread in the AMD section report major performance gains even for single GPU setups in DX12. Even a user with a Kepler GPU reports much better performance in DX12.
What I like about this patch is that it hints Nixxes may provide DX12 multi-GPU with Deus Ex.
Err, RoTR has had DX12 support for a long time already; this patch doesn't add it. The game's DX12 implementation was a joke, though, at least before this patch. It reduced performance compared to DX11 no matter if you had AMD or NVIDIA hardware.
That's not entirely true. While the DX12 patch reduced average fps a little bit, it did boost lower/lowest fps for me, especially at Geothermal Valley.
I've only had the game a few weeks, so only on the newer patches, but I get higher fps in DX12 mode than in DX11. Running on an i5 3570K and a Titan X card.
Source? Because until now RoTR has been the worst title running under D3D12. It is not too old. It is supported by AMD and it was sold up to and including the R9 300 series, with several re-brands. That library will be released "soonish" on GitHub. It will be an easy multi-GPU wrapper for both AMD and NVIDIA hardware (linked-adapter mode, AFR only, in the first releases). I bet RoTR's issues lie elsewhere (what about over-committed scenarios? If not handled properly they will perform worse under DX12 instead of better than DX11). But I do not have the game in my library to profile it.
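(For anyone unfamiliar with the term, "linked-adapter mode" is D3D12's explicit multi-adapter setup where a CrossFire/SLI pair shows up as a single device with several nodes. Here's a rough sketch of the kind of setup an AFR wrapper has to do, purely as illustration and not code from the library mentioned above; CreatePerNodeQueues is a made-up name.)

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Illustration only: in linked-adapter mode the CrossFire/SLI pair is exposed as one
// ID3D12Device with N "nodes". An AFR wrapper creates per-node queues (and resources)
// via NodeMask and then alternates frames across them.
std::vector<ComPtr<ID3D12CommandQueue>> CreatePerNodeQueues(ID3D12Device* device)
{
    std::vector<ComPtr<ID3D12CommandQueue>> queues;
    const UINT nodeCount = device->GetNodeCount();   // 1 = single GPU, 2 = linked pair, ...
    for (UINT node = 0; node < nodeCount; ++node)
    {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.NodeMask = 1u << node;                  // target this queue at one physical GPU
        ComPtr<ID3D12CommandQueue> queue;
        if (SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue))))
            queues.push_back(queue);
    }
    return queues;
}
```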
Technically, GCN 1.0 was sold as a rebrand in the 300 series, yes. GCN 1.1 is available on the 7790 + 290/260 series, GCN 1.2 is the R9 285 + Fury/X, and GCN 1.3 is Polaris. So yeah, it's a bit of a downer that all the 7900 series cards don't seem to be compatible any more... they sold boatloads of those.
Technically, GCN 1.0 was sold as a rebrand in the 300 series, yes. GCN 1.1 is available on the 7790 + 290/260 series, GCN 1.2 is the R9 285 + Fury/X, and GCN 1.3 is Polaris. So yeah, it's a bit of a downer that all the 7900 series cards don't seem to be compatible any more... they sold boatloads of those.
Just to nitpick, AMD finally settled on calling them 1st/2nd/3rd/4th gen GCN. Anyway, while those are correct, it gets fuzzier with the 3rd/4th gens. Polaris, while it is indeed 4th gen GCN with many updates/upgrades to the internals, is still using "gfx ip level 8.x", just like 3rd gen GCN; its ISA is unchanged from Tonga/Fiji. According to earlier leaks/rumors, Greenland (to be Vega 10 or 11) is the first with "gfx ip level 9.x". (That leak originated from LinkedIn and was at first dismissed pretty much everywhere, but with the Polaris 10 launch we now know that the other LinkedIn leak from about the same time was correct about the Polaris die size, which increases the credibility of the Greenland leak somewhat.)
Err, RoTR has had DX12 support for a long time already; this patch doesn't add it. The game's DX12 implementation was a joke, though, at least before this patch. It reduced performance compared to DX11 no matter if you had AMD or NVIDIA hardware. Funny how async compute support, included in console versions but useless for NVIDIA hardware before Pascal, gets added in now that Pascal is out. Those GameWorks stickers obviously have nothing to do with it, right... right?
Please explain to me then how an Athlon II X4 @ 3.5 GHz was stuttering in RoTR @ 2560x1600 with a 290X, GPU usage all over the place, averaging 35 fps in DX11, but when switched to DX12 it was nearly butter smooth, averaging 55 fps, and stutter free. Quite a fun little test that was. For me, every rig I own that I tested with ran far better in DX12. Even a GTX 650 Ti 1GB showed nice min fps improvements. Sucks that my R9 280 (7950, GCN 1.0) does not support anything above DX11_1 and only DX12 API wrapper features. All RX 480s are sold out on Newegg and I'm waiting for the Devil 480 AIB.
I actually find them leaving out Kepler and Maxwell cards even more outrageous than leaving out GCN 1.0, mainly due to the units sold, not that the GCN 1.0 AMD base is exactly small. It also more or less confirms that the pipeline itself matters as much as the "feature levels" a lot of fanboys have been touting. Maxwell and Pascal both have 12_1, yet it's Pascal's per-pixel preemption pipeline that gets the benefit. GCN 1.0 was/is also probably fine for that (hey, async works on them in everything else), but I guess they went with feature level 12_0 as a minimum. Meh, lazy.
I actually find them leaving out Kepler and Maxwell cards even more outrageous than leaving out GCN 1.0, mainly due to the units sold, not that the GCN 1.0 AMD base is exactly small. It also more or less confirms that the pipeline itself matters as much as the "feature levels" a lot of fanboys have been touting. Maxwell and Pascal both have 12_1, yet it's Pascal's per-pixel preemption pipeline that gets the benefit. GCN 1.0 was/is also probably fine for that (hey, async works on them in everything else), but I guess they went with feature level 12_0 as a minimum. Meh, lazy.
Kepler can't do async at all, I think, and Maxwell can't benefit from async (explained earlier in this thread), so there's no point in including them.
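On the feature-level point above: a feature level is just a capability query and says nothing about how the GPU schedules or preempts work. Here's a quick generic D3D12 sketch of what it actually reports (nothing game-specific; ReportFeatureLevel is a made-up name):

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

// Feature levels are just a capability query. Note what is missing: nothing here says
// anything about preemption granularity or how well graphics and compute queues overlap,
// which is the part that decides whether "async compute" actually helps.
void ReportFeatureLevel(ID3D12Device* device)
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = static_cast<UINT>(sizeof(requested) / sizeof(requested[0]));
    levels.pFeatureLevelsRequested = requested;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &levels, sizeof(levels))))
        std::printf("MaxSupportedFeatureLevel: 0x%X\n",
                    static_cast<unsigned>(levels.MaxSupportedFeatureLevel));
}
```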
Moderator
Would be nice if someone could bench the async on/off
You can change it in the registry under HKEY_CURRENT_USER\SOFTWARE\Crystal Dynamics\Rise of the Tomb Raider\Graphics (1 = on, 0 = off).
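The post above doesn't name the actual registry value, so rather than guess it, here's a small Win32 sketch that lists the value names under that key (so you can spot the async toggle yourself) and writes 0 or 1 to whichever value name you pass on the command line. Only the key path and the 1 = on / 0 = off convention come from the post; everything else is generic registry code.

```cpp
#include <windows.h>
#include <cstdio>

// Usage: rotr_async               -> list value names under the Graphics key
//        rotr_async <Name> <0|1>  -> write 0 or 1 to that DWORD value
// The async toggle's value name is deliberately NOT hardcoded here; check the listing
// (or regedit) to see what the game actually calls it.
int main(int argc, char** argv)
{
    const char* subkey = "SOFTWARE\\Crystal Dynamics\\Rise of the Tomb Raider\\Graphics";
    HKEY key = nullptr;
    if (RegOpenKeyExA(HKEY_CURRENT_USER, subkey, 0, KEY_READ | KEY_SET_VALUE, &key) != ERROR_SUCCESS)
    {
        std::printf("Could not open HKCU\\%s\n", subkey);
        return 1;
    }

    if (argc >= 3)   // write mode: set the named value to 0 or 1
    {
        DWORD data = (argv[2][0] == '0') ? 0u : 1u;
        LSTATUS s = RegSetValueExA(key, argv[1], 0, REG_DWORD,
                                   reinterpret_cast<const BYTE*>(&data), sizeof(data));
        std::printf("%s = %lu (%s)\n", argv[1], static_cast<unsigned long>(data),
                    s == ERROR_SUCCESS ? "written" : "write failed");
    }
    else             // list mode: enumerate value names so you can find the right one
    {
        char name[256];
        for (DWORD i = 0;; ++i)
        {
            DWORD nameLen = sizeof(name);
            if (RegEnumValueA(key, i, name, &nameLen, nullptr, nullptr, nullptr, nullptr)
                    != ERROR_SUCCESS)
                break;
            std::printf("%s\n", name);
        }
    }
    RegCloseKey(key);
    return 0;
}
```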
A brief test shows performance on my rig still bottoms out in DX12; no luck here.
Interesting. I do not have this game, but with this new patch it suddenly becomes a good option for me. Now, with the DX12 patches and my downgrade from a 980 Ti to a 290X, it makes sense; with all the free boost AMD video cards are getting from DX12 and Vulkan 😀 this 290X has aged very well in my hands. If the price is right I will buy it.
Interesting. I do not have this game, but with this new patch it suddenly becomes a good option for me. Now, with the DX12 patches and my downgrade from a 980 Ti to a 290X, it makes sense; with all the free boost AMD video cards are getting from DX12 and Vulkan 😀 this 290X has aged very well in my hands. If the price is right I will buy it.
It's unbelievable how well the 290X has aged. I may just pick one up for $120 locally with a reference cooler. I have a Kraken G10 and an AIO, and even the VRM kit for it. I'm tired of waiting for the RX 480 to come back into stock on Newegg. Plus it won't be $199; it will be $199 + tax + shipping where I live. It's almost $250 USD for a 4GB reference card alone.
Tested out multi-GPU with a 295X2. The game runs nicely, although there are occasional micro-stutters; it's not perfect, but it certainly runs a lot better.
Why are you even posting this kind of nonsense?
:stewpid: If I post it under the PROGRAMMING section... it would be like :puke2:
It's unbelievable how well the 290X has aged. I may just pick one up for $120 locally with a reference cooler. I have a Kraken G10 and an AIO, and even the VRM kit for it. I'm tired of waiting for the RX 480 to come back into stock on Newegg. Plus it won't be $199; it will be $199 + tax + shipping where I live. It's almost $250 USD for a 4GB reference card alone.
If you can find one at $120 US, do it; put that AIO on there, OC it to around 1150/1500 MHz, and you will be in business :thumbup: