Rise of the Tomb Raider: PC graphics performance benchmark review

Just to be clear, this game uses Denuvo DRM, which has notable performance overhead, right?
Yes, it uses Denuvo, but there is zero performance impact from it.
Is there any difference with PureHair on Nvidia cards versus AMD cards? I've seen videos where the hair moves really naturally and others where it's a bit more toned down (a lower setting, maybe?). Also, I've heard that the snow in the hair is missing on Nvidia cards. Can anyone confirm/deny/elaborate?
The test isn't optimal for both platforms: Nvidia had a Rise of the Tomb Raider-ready driver and AMD did not. Also, there's no 390 in the lineup...
http://www.guru3d.com/index.php?ct=articles&action=file&id=20121
The 1080p results I totally get, but Tomb Raider is a whole other story: the 980 easily keeps up at 4K here, while in Anno it does not. This game is a total mess on AMD cards. It's like the 390 is doing what it should and competes with the 970, and then the rest of the AMD cards just... nope.
First of all: it's so crappy that I can't believe Crystal did this to TR fans! The game uses 1 thread on Intel CPUs and at most 2 threads on FX CPUs (1 module, to be exact). In 2016, in a next-gen game? :3eyes: One asks why? Some GPUs ;-) are better at single-threaded DX11 workloads!! Compare that to the X1, where TR uses 6 threads plus asynchronous compute -> a hardware-level feature absent on Nvidia hardware and fully implemented on GCN GPUs! Why was it removed for PC? Why doesn't TR use so-called DX11.3? (The X1 does.) Then you look at the Tomb Raider guide on the nMilk site: "PureHair is Crystal Dynamics and Square Enix's hair rendering technology" -> so now it's not AMD TressFX 3.0, it's Crystal Dynamics and Square Enix's tech? PureHair(TM) and PureMaterial(TM) are part of AMD's TressFX technology. Watch the Dawn Engine tech demo, skip to 1:51 and read until 2:03. https://en.wikipedia.org/wiki/TressFX OMG, they don't even acknowledge that it's essentially AMD's TressFX tech! More lies to boost obsolete DX11 GPU sales :grab:
Hi everyone, I always check every review here, and I always ask myself the same thing... I'm an SLI GTX 770 owner with 4 GB per card. This is one of the few recent cards that was offered with either 2 or 4 GB, but the reviews don't specify how much memory the tested card has. When I saw 25 fps I was sad, wondering whether that was the 2 GB or the 4 GB card. Looking at the 970, which gets double the fps, I'm asking myself whether the 4 GB would help here. (I know the main difference here is probably the ROP units.)
Oh, boy... Can't wait to play this sequel! I really loved the first one (even though it was too short)! 😀 Hilbert, nice review! And thank you in advance for all future updates to this review. 🙂 By the way... There's a spelling mistake on page 2, first line of third paragraph --> "per se".
I've tested the game on a 65" 4K TV with an Nvidia Shield. It's really good. The game is beautiful; for the first time, I'd say... it's that good. I have an SLI of 980 Ti Kingpin @ 1400 + i7 4770K at 4.9. Everything works fine with all options maxed.
IF SLI works (it seems to be a mixed bag atm) you should be fine. Remember the settings used in the review are maxed out, so go from Very High detail to High, lower the PureHair setting, and just ignore HBAO+ (it never works as well as Nvidia advertises) in favor of SSAO. That way you should run it at 45-50 fps. If SLI doesn't work, you should still be able to lock the game at 30 fps and play that way.
I'd just like to note that Very High is not the highest possible detail setting; two or three individual settings have one option higher still. That's what I am running at.
Very good review. Would be nice to see some GPU usage graphs from both vendors; I bet AMD's high-end cards look quite bad in terms of GPU usage at 1080p. At least PureHair seems to be nice tech and doesn't really take much of a performance hit. Here's a link to Gregster's video review: https://www.youtube.com/watch?v=97a095M58Yc What is that, a 1 fps hit?
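Usage graphs like the ones requested are easy to collect yourself on Nvidia hardware: `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits -l 1` prints one utilization percentage per second. A minimal parsing sketch (the sample numbers below are made up for illustration):

```python
# Sketch: turn nvidia-smi's per-second CSV utilization output into a
# list of integers, ready to plot as a GPU usage graph.
def parse_utilization(csv_output: str) -> list[int]:
    """Parse nvidia-smi CSV output: one utilization % per line."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

# Illustrative sample of what the command might print over five seconds:
sample = "97\n96\n42\n98\n95\n"
print(parse_utilization(sample))  # -> [97, 96, 42, 98, 95]
```

A dip like the 42 in the sample is exactly the kind of under-utilization the comment suspects on AMD's high-end cards at 1080p.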
PureHair is basically TressFX 3.0 (modified), and no one gives credit to AMD. I think that late in this PC port's development, Nvidia forced them to disable asynchronous compute because their GPUs aren't capable of / good at it...? Just my thinking. This type of aggressive intervention does nothing but harm the PC gaming industry.
Totally agree with the post above. :)
Is asynchronous compute even a part of this game under DX11? Also, is there any GW in there? I'm not sure, haven't looked into it.
It shouldn't be part of the game under DX11, really. The German site said something about async not being there currently.
I somehow don't think asynchronous compute is a feature of the DX11 engine; that's what I was wondering. Nobody talked about it until the AoS pre-beta benchmark.
"Those with 4GB GPUs are recommended to use High as VRAM stuttering can be observed on Very High, especially when swapping between gameplay, cutscenes and cinematics, and between gameplay zone transitions. For a smooth experience with those max-quality, 4K x 4K textures, a 6GB GPU is instead recommended. And to crank things up to 4K (3840x2160), with Very High textures and max settings, we'd recommended GeForce GTX TITAN X GPUs with 12GB of VRAM, as usage can near 10GB over prolonged sessions." -from nvidia's rise of the tomb raider guide- http://www.geforce.com/whats-new/guides/rise-of-the-tomb-raider-graphics-and-performance-guide
purehair is basically tressfx3.0 (modified). And no one gives credit to AMD.
Well, this is exactly how I think most GPUOpen effects will work in the future: you get the source for the effects and build your own working versions from them. And here we have TressFX shaped into PureHair. I hope most developers adopt these new open-source features.
ComputerBase found that GeForces don't render snow accumulating on the TressFX/PureHair-hair properly or at all, can you confirm this?
"Those with 4GB GPUs are recommended to use High as VRAM stuttering can be observed on Very High..." -from nvidia's rise of the tomb raider guide-
That explains it... sadly. I'm having horrible performance issues with this game on my GTX 970: massive stuttering and picture freezes like I have never experienced before. Even Skyrim with an entire world of 8K textures and maxed VRAM doesn't stutter this badly!!!
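Stutter like this shows up in frame-time spikes rather than in average fps, which is why a chart average can look fine while the game feels broken. A small sketch with made-up frame times (in ms) showing how a single long freeze barely moves the average but dominates the worst-percentile number:

```python
# Sketch: average fps vs. 99th-percentile frame time from a capture log.
def frametime_stats(frametimes_ms: list[float]) -> dict[str, float]:
    """Average fps plus the frame time at the worst 1% of frames."""
    fts = sorted(frametimes_ms)
    avg_fps = 1000.0 * len(fts) / sum(fts)
    p99_ms = fts[int(len(fts) * 0.99)]  # 99th-percentile frame time
    return {"avg_fps": round(avg_fps, 1), "p99_ms": p99_ms}

# Hypothetical capture: 99 smooth ~60 fps frames plus one 200 ms freeze.
times = [16.7] * 99 + [200.0]
print(frametime_stats(times))  # -> {'avg_fps': 54.0, 'p99_ms': 200.0}
```

An average of 54 fps looks playable, but a 200 ms spike is a visible hitch, which matches the 970's VRAM-swap stutter described above.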
It uses HBAO+ and supposedly VXAO, which is new, although I haven't seen anyone talk about it. The game also makes use of tessellation, but it doesn't seem to change anything on the AMD front whether it's on or off. As for async, the game on Xbox One probably doesn't run in DX12; it probably uses whatever API they had before that, which is like a mixture of DX11, Mantle, and DX12. I imagine that couldn't be ported to PC, so they just stuck with DX11.