Rise of the Tomb Raider - PC patch

The size of the maps is not the problem with RotTR. It is bad LOD scaling and dynamic shader pop-in/pop-out. Does anyone even remember jumping here? (10 years ago): [youtube]SkumkXsKbIA[/youtube] People simply no longer accept an effect which appears only once you are closer than 5m to it. And for some reason they broke the general LOD rule: if it pops in at a range of 20m, it must not go away any closer than 30m. But for some reason RotTR has pop-in at 5m, and you take a step back and it is gone again, or you just turn the camera a bit so the viewport moves a few cm away. Side note: the original Rise of the Triad had an executable named rottr.exe
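A minimal sketch of that hysteresis rule, with hypothetical thresholds and names (not the game's actual streaming code): an effect that pops in at 20m should only pop out again beyond 30m, so small camera movements near the boundary cannot make it flicker.
[code]
// Sketch of the LOD hysteresis rule described above (hypothetical
// thresholds and struct names, not RotTR's actual streaming code).
#include <iostream>

struct DetailEffect {
    float popInDistance  = 20.0f; // becomes visible when closer than this
    float popOutDistance = 30.0f; // hidden again only when farther than this
    bool  visible        = false;

    void update(float cameraDistance) {
        if (!visible && cameraDistance < popInDistance)
            visible = true;
        else if (visible && cameraDistance > popOutDistance)
            visible = false;
        // Between 20m and 30m the current state is kept; that band is the
        // hysteresis that prevents pop-in/pop-out flicker when you step back.
    }
};

int main() {
    DetailEffect grass;
    for (float d : {35.0f, 25.0f, 18.0f, 25.0f, 29.0f, 31.0f}) {
        grass.update(d);
        std::cout << "distance " << d << "m -> visible: " << grass.visible << "\n";
    }
    return 0;
}
[/code]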
GRAW 1 and 2 were, and still are, damn fun. Gretzky's weapon mod FTW. Shame about the daft AI though.
Compare the first Tomb Raider, which was an AMD title, and how good it was performance-wise, with this Nvidia title and how horrible it is; it really makes me want to stop feeding that creep company with money.
OK, let's try your SLI in Far Cry 4, all settings maxed. Say hello to shadow bugs, screen flickering and other minor issues. OK, let's try your SLI in Batman: Arkham Knight, all settings maxed. Say hello to 50% FPS drops. I sold my second 980 Ti because SLI is bugged in 50% of games. The two cards are working at 99% but I don't get double the FPS. I need one brutal card, not two bugged ones. No more SLI! 4K? Two 980 Tis are very poor at 4K because games are not optimized for that resolution and VRAM. I'll wait 2-3 years for a good card for 4K.
Good evening to you too. No need to try those, they suck hard. You're right, SLI won't work properly in some / most games; for me so far it's OK because I have no time to hösselhöff around with every game. In what I'm playing (BO3, Battlefield 4, SW Battlefront) I have quite a nice experience with SLI + G-Sync (it eliminates micro-stutter etc. too and keeps games ultra-smooth). It never doubles the frames, but if you want to get 1440p / 144fps there is no other option. We can wait for those supercards next decade, but I'm quite sure you'll always need SLI to get the maximum out of these. Heat and noise too 🙂
Compare the first Tomb Raider, which was an AMD title, and how good it was performance-wise, with this Nvidia title and how horrible it is; it really makes me want to stop feeding that creep company with money.
Well spotted. Nearly all GW games have issues. Nvidia adding unnecessary gfx code.
Well spotted. Nearly all GW games have issues. Nvidia adding unnecessary gfx code.
I find this amusing because the first Tomb Raider had massive performance problems with Nvidia akin to what AMD users are experiencing with this game. http://www.ign.com/articles/2013/03/11/tomb-raider-pc-patch-fixes-nvidia-issues It took a patch and a driver update for Nvidia to fix it. Now Rise of the Tomb Raider is released and AMD's non-Fiji cards were running fine; only the Fiji cards had issues. http://www.pcper.com/reviews/Graphics-Cards/Rise-Tomb-Raider-AMD-and-NVIDIA-Performance-Results/Feb-5th-Patch-and-Multi-G So how is this any different? How come R6: Siege has no performance issues, yet it's a GameWorks game? How come Witcher 3, which AMD claimed they needed source code to fix, got a driver update that fixed performance without the source code? How come AMD lied about working with Project Cars before the game released, and about Nvidia sabotaging it? Like obviously Nvidia isn't a saint either. They definitely target GameWorks optimization towards their own architecture and they clearly have deceptive marketing nonsense and whatnot.. But the narrative out of AMD users is getting ridiculously one-sided. The problem that AMD had with Rise was clearly an AMD problem and not a GameWorks one.
I love it when people moan they can't get 60fps with max settings. One second they want graphics that push the hardware, and then when they do people moan... It's a demanding game, but a beautiful one, so I have no problem not being able to get 60fps when maxing the settings. Whenever someone can't get the magic 60 they scream unoptimised regardless of how good it looks...
Truth. This is why I get so annoyed with people when they talk about how some games have visually gotten a downgrade and it's so crappy because it won't push their hardware... meanwhile those games are released and people can't max them out with their top-end machines even after said downgrade. I keep trying to tell people that if these games weren't toned down a bit from their "early demos" (think Witcher 3, The Division), they might look amazing on release but run at 20FPS on a 980 Ti. I think the higher-end graphics options should remain in the game so that we could max them out if we so chose; however, you'd get nothing but bitching that the games are unoptimized to the moon. Devs know this, which is part of the reason why we see this amazing-looking pre-release media and then the actual games release looking "downgraded".
The difference is that when Nvidia users get issues, that is proof that Nvidia also messes up, but when AMD users get issues, then it's also Nvidia's fault. AMD essentially have their customers well trained to deflect the blame no matter whether it's their fault or not. The much duller explanation for the difference between 2013's Tomb Raider and the new game is that the 2013 game was a last-gen console port and this is a current-gen console port. The reality is that a lot of games these days have issues of some sort.
The difference is that when Nvidia users get issues, that is proof that Nvidia also messes up, but when AMD users get issues, then it's also Nvidia's fault. AMD essentially have their customers well trained to deflect the blame no matter whether it's their fault or not. The much duller explanation for the difference between 2013's Tomb Raider and the new game is that the 2013 game was a last-gen console port and this is a current-gen console port. The reality is that a lot of games these days have issues of some sort.
Well, I don't know what the case is with TR; we see a problem with CPU core scaling (1 CPU core performs better than 2, 4 or 6 cores enabled). I hope they release a patch and a driver update before I buy the game. (The problem doesn't seem to be only on AMD GPUs; how come this game doesn't scale in performance on Nvidia GPUs between 1 and 8 cores? For a 2016 game that is damn strange. Something is messing up on every kind of hardware in this respect. No difference between 4 and 8, or even 2 and 4 cores, OK... but no difference between 1 and 4 cores? That's a first... Looks like the game will run just fine on a single-core Athlon. http://www.guru3d.com/articles_pages/rise_of_the_tomb_raider_pc_graphics_performance_benchmark_review,9.html I was planning to get it when it released, but I have absolutely no time to play these last and next weeks. That said, in response to you: there's no need for a game to run badly on your competitor's GPUs forever; in general, reviews and performance tests are only done when the game launches. 🙂 (Just saying that for the smile, nothing serious.) Anyway, the new driver seems to make a difference and not all reviews show the same behaviour: http://www.overclock3d.net/reviews/gpu_displays/rise_of_the_tomb_raider_pc_performance_retested_with_new_amd_drivers/5 http://www.overclock3d.net/gfx/articles/2016/02/02081826358l.jpg http://www.overclock3d.net/gfx/articles/2016/02/02081900217l.jpg
I find this amusing because the first Tomb Raider had massive performance problems with Nvidia akin to what AMD users are experiencing with this game. http://www.ign.com/articles/2013/03/11/tomb-raider-pc-patch-fixes-nvidia-issues It took a patch and a driver update for Nvidia to fix it. Now Rise of the Tomb Raider is released and AMD's non-Fiji cards were running fine; only the Fiji cards had issues. http://www.pcper.com/reviews/Graphics-Cards/Rise-Tomb-Raider-AMD-and-NVIDIA-Performance-Results/Feb-5th-Patch-and-Multi-G So how is this any different? How come R6: Siege has no performance issues, yet it's a GameWorks game? How come Witcher 3, which AMD claimed they needed source code to fix, got a driver update that fixed performance without the source code? How come AMD lied about working with Project Cars before the game released, and about Nvidia sabotaging it? Like obviously Nvidia isn't a saint either. They definitely target GameWorks optimization towards their own architecture and they clearly have deceptive marketing nonsense and whatnot.. But the narrative out of AMD users is getting ridiculously one-sided. The problem that AMD had with Rise was clearly an AMD problem and not a GameWorks one.
The two games you mentioned are shining lights and maybe the only two with hardly any issues. Yes, TressFX was a killer for NV in TR 2013, but after AMD opened it up NV got a nice boost. It's not a conspiracy theory, but GW titles are known to cause problems for NV and, mostly, AMD users.
I love it when people moan they can't get 60fps with max settings. One second they want graphics that push the hardware, and then when they do people moan... It's a demanding game, but a beautiful one, so I have no problem not being able to get 60fps when maxing the settings. Whenever someone can't get the magic 60 they scream unoptimised regardless of how good it looks...
Some people are under the delusion that cards like the 980 Ti are so powerful that they are supposed to run all games, now and in the future, even at 60fps at 1080p. Which isn't going to happen. The only way that would be possible is if we froze all graphical development in games at the current level for years to come. Bottom line is, we haven't come anywhere near maxing out the detail levels we can render at 1080p, so it's even sillier to expect crazy high framerates at higher resolutions. Take any Blu-ray 1080p rendered movie; that's where the limits of 1080p lie. Yep, we're not even close to that quality. Glad we can also lower details if our hardware can't run them properly.
With this new patch, Titan X 3-way SLI is working flawlessly 😀 Acer X34 G-Sync 100Hz 3440x1440, all settings on max except SSAA at 2x. 90-100fps and 99% GPU load on all GPUs. 6.4GB VRAM usage. 5960X @ 4500MHz and TXs @ 1450MHz.
With this new patch, Titan X 3-way SLI is working flawlessly 😀 Acer X34 G-Sync 100Hz 3440x1440, all settings on max except SSAA at 2x. 90-100fps and 99% GPU load on all GPUs. 6.4GB VRAM usage. 5960X @ 4500MHz and TXs @ 1450MHz.
And people talk about how they're not interested unless a monitor is 144-200Hz now, 100 is weaksauce... meanwhile they never talk about how they plan on achieving said 144+ fps... Glad to hear that the patch has made SLI work better for those with multiple cards.
With this new patch, Titan X 3-way SLI is working flawlessly 😀 Acer X34 G-Sync 100Hz 3440x1440, all settings on max except SSAA at 2x. 90-100fps and 99% GPU load on all GPUs. 6.4GB VRAM usage. 5960X @ 4500MHz and TXs @ 1450MHz.
Haha, maybe 5 games out of 100 support the 3 cards. XDDDD And the scaling? 1 Titan X = 50FPS, 2 Titan X = 100FPS? 3 Titan X = 150FPS? No... 3 Titan X is only 100FPS. Nice scaling for 3x the power consumption. XD GG man. :puke2: PS: Why not buy a 4th card? You'd only need two Corsair 1500i's. XDDD
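For what it's worth, a rough scaling calculation using the figures quoted above (the 50/100 FPS numbers are the hypothetical ones from the post, not measured data):
[code]
// Rough multi-GPU scaling efficiency for the hypothetical numbers above:
// perfect N-way scaling would give N times the single-GPU frame rate.
#include <iostream>

int main() {
    const double singleGpuFps = 50.0;   // assumed single Titan X result
    const double measuredFps  = 100.0;  // reported 3-way SLI result
    const int    gpuCount     = 3;

    const double idealFps   = singleGpuFps * gpuCount;        // 150 FPS
    const double efficiency = 100.0 * measuredFps / idealFps; // ~66.7%

    std::cout << "Ideal: " << idealFps << " FPS, measured: " << measuredFps
              << " FPS, scaling efficiency: " << efficiency << "%\n";
    return 0;
}
[/code]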
And people talk about how they're not interested unless a monitor is 144-200Hz now, 100 is weaksauce... meanwhile they never talk about how they plan on achieving said 144+ fps...
Because... On a 60Hz display with vsync on, the longest time an already-rendered frame will be held back before being pushed to the display is 16.6ms. It's only 6.9ms on a 144Hz display. Vsync on or off; 60, 129 or 144fps; it doesn't matter -> a 144Hz display will ALWAYS give superior gameplay compared to a 60Hz display.
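That worst-case hold-back time is simply one full refresh interval, i.e. 1000 ms divided by the refresh rate; a quick check of the figures above (nothing game-specific assumed):
[code]
// Worst-case time an already-rendered frame waits for the next refresh
// is one full refresh interval: 1000 ms / refresh rate.
#include <iostream>

int main() {
    for (double hz : {60.0, 100.0, 144.0}) {
        std::cout << hz << " Hz -> up to " << 1000.0 / hz
                  << " ms before a finished frame reaches the screen\n";
    }
    return 0;
}
[/code]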
Well spotted. Nearly all GW games have issues. Nvidia adding unnecessary gfx code.
Any proof to back this claim up? Because the latest title with very questionable performance is XCOM 2, and it doesn't have any GameWorks in it.
Removed BS warez talk.... please let this be a warning also
Any proof to back this claim up? Because the latest title with very questionable performance is XCOM 2, and it doesn't have any GameWorks in it.
Do I even have to? XCOM2 is just badly optimized.