Shadow of the Tomb Raider RTX GPUs hotfix

But no ray-tracing patch... I won't delete the game until I've seen the new tech 😀
I wonder if Nvidia is holding this back to get more sales in. People won't be happy with 1080p with a Ti and sales will tank. I want the tech to work out but I get the feeling it is too soon.
No, quite the opposite - Nvidia will be very concerned that even at this late stage (five weeks after release), the developer (and Nvidia themselves) cannot further pare back RTX functionality to get acceptable frame rates. Any sane individual monitoring this will most certainly not be making a (very expensive) purchase. Message to reviewers (based purely on the insane pricing structure and the fact that we'll need the 2080 Ti model as a minimum): please cover how extensively these RTX features are actually implemented in each game/release. The real danger is misleading marketing where the feature is barely used, for fps reasons.
Dynarush_333:

I wonder if Nvidia is holding this back to get more sales in. People won't be happy with 1080p with a Ti and sales will tank.
Nvidia doesn't seem to want people to buy RTX cards. They're more interested in selling all that leftover Pascal stock.
All conspiracy theories can go both ways. I mean, maybe Nvidia does not want to show yet how awesome RTX is and how well it works (frame rate and resolution wise), so that people decide to buy the old 1080 Ti, 1080 and 1070 cards still on the market, which now seem like a better proposition. The pricing structure right now makes them look much better than the RTX series; then, once the stock of 10xx cards is gone, the releases for all the RTX games happen and they are awesome - amazing graphics, 4K 60 fps with RTX and DLSS and all that - and everyone who just bought a 10xx card spends money again on the new RTX series. Brilliant move by Nvidia. I hope people understand I am just making up a conspiracy theory; it's not my belief. My point is that theories can go any way we want.
AlbertX:

All conspiracy theories can go both ways. I mean, maybe Nvidia does not want to show yet how awesome RTX is and how well it works (frame rate and resolution wise), so that people decide to buy the old 1080 Ti, 1080 and 1070 cards still on the market, which now seem like a better proposition. The pricing structure right now makes them look much better than the RTX series; then, once the stock of 10xx cards is gone, the releases for all the RTX games happen and they are awesome - amazing graphics, 4K 60 fps with RTX and DLSS and all that - and everyone who just bought a 10xx card spends money again on the new RTX series. Brilliant move by Nvidia. I hope people understand I am just making up a conspiracy theory; it's not my belief. My point is that theories can go any way we want.
Has it not been shown that RTX performance sucks on the 20xx series? Maybe a couple of generations of RTX later we will see the performance Nvidia touts. There is a nice review over at HardOCP showing the 2080 isn't that much faster than a 1080 Ti. And people knock AMD. Can we just stop adding new features and optimize what we already have? If they want to sell more cards, optimize games for CF or SLI. Strange Brigade has shown it works. https://www.hardocp.com/article/2018/10/22/rtx_2070_vs_2080_gtx_1080_ti_1070/
ht_addict:

Can we just stop adding new features and optimize what we already have? If they want to sell more cards, optimize games for CF or SLI. Strange Brigade has shown it works.
You can optimize and add new features concurrently; both AMD and Nvidia have been doing that for decades. SLI/CF support is up to the developer, and most developers choose performance optimizations that only work on single cards because the vast majority of users have single cards.
The RTX generation is dumb; they should not have included RTX and instead used the die area for more CUDA cores and higher performance. On the 7nm node they could have included RTX and DLSS, once at least three major games were released supporting it. Think about Cyberpunk 2077: no way it will run maxed out at 4K on a single 2080 Ti at a constant 60 fps, let alone with any RTX if it gets added.
Denial:

You can optimize and add new features concurrently; both AMD and Nvidia have been doing that for decades. SLI/CF support is up to the developer, and most developers choose performance optimizations that only work on single cards because the vast majority of users have single cards.
Again I go back to Strange Brigade. The difference is huge. If, as you say, most users have only a single card, then sales of the game would suck. It's been shown to work in other games too (Far Cry 5). So why not ACO? Why offer high-res ultra settings if you can't run them on a single card? AMD and Nvidia need to put pressure on the developers.
moab600:

The RTX generation is dumb; they should not have included RTX and instead used the die area for more CUDA cores and higher performance. On the 7nm node they could have included RTX and DLSS, once at least three major games were released supporting it. Think about Cyberpunk 2077: no way it will run maxed out at 4K on a single 2080 Ti at a constant 60 fps, let alone with any RTX if it gets added.
How do you suppose they do that when the card is pulling ~280w in a gaming loop?
ht_addict:

Again I go back to Strange Brigade. The difference is huge. If, as you say, most users have only a single card, then sales of the game would suck.
Most users have a single card... that's not even debatable. Also, the game sold fewer than 75k copies, which, while not bad for the game it is, definitely doesn't make it a great example for your point. As for FC5, best case you get 50% scaling at 4K and 25% at QHD. Only 1.32% of Steam users have 4K monitors, only a subset of those have SLI/CF, and those users are only getting 50% additional performance for 100% of the cost. If I were a developer, I wouldn't look at that and go "oh yeah, that's something we really need to focus on." There is a reason CF/SLI is dying: the vast majority of people aren't buying a second card, and it's insanely hard to get good scaling out of it. Now, with the advent of optimizations that use previous/adjacent pixel data to improve performance and quality, it makes even less sense.
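The value math in that post can be made concrete with a quick sketch. The scaling and cost figures below are the illustrative ones quoted above (50% scaling at 4K, 25% at QHD, double the price for a second card), not measurements:

```python
# Rough perf-per-dollar comparison: single card vs. an SLI/CF pair.
# Figures are the illustrative ones from the post, not benchmarks.
def perf_per_dollar(relative_perf: float, relative_cost: float) -> float:
    """Relative performance divided by relative cost (single card = 1.0 / 1.0)."""
    return relative_perf / relative_cost

single  = perf_per_dollar(1.0, 1.0)   # baseline: one card
sli_4k  = perf_per_dollar(1.5, 2.0)   # +50% scaling at 4K, 2x the cost
sli_qhd = perf_per_dollar(1.25, 2.0)  # +25% scaling at QHD, 2x the cost

print(single, sli_4k, sli_qhd)  # → 1.0 0.75 0.625
```

Even in the best quoted case, the second card returns 75 cents of performance per dollar versus the single card's full dollar, which is the developer-incentive argument the post is making.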
moab600:

The RTX generation is dumb; they should not have included RTX and instead used the die area for more CUDA cores and higher performance. On the 7nm node they could have included RTX and DLSS, once at least three major games were released supporting it. Think about Cyberpunk 2077: no way it will run maxed out at 4K on a single 2080 Ti at a constant 60 fps, let alone with any RTX if it gets added.
I'm actually glad they released the RTX series in its current form. The poor value for money, features with too little game support (RT, DLSS), and the dismal performance jump over previous cards all basically ensure I must wait for the 7nm follow-up. Hopefully the poor reception to RTX will knock some sense into Nvidia and get them back on track with a better-value 7nm successor. By then there should be enough games supporting the new features out of the box, and it would certainly offer massive perf gains for GPU owners who skipped RTX.
alanm:

I'm actually glad they released the RTX series in its current form. The poor value for money, features with too little game support (RT, DLSS), and the dismal performance jump over previous cards all basically ensure I must wait for the 7nm follow-up. Hopefully the poor reception to RTX will knock some sense into Nvidia and get them back on track with a better-value 7nm successor. By then there should be enough games supporting the new features out of the box, and it would certainly offer massive perf gains for GPU owners who skipped RTX.
Yep, same. My 980 Ti is overclocked to 1460 MHz on the core and 8000 MHz on the memory; it's a blast for QHD. I might get a 1080 Ti if I find a cheap one, but I'm waiting for Cyberpunk 2077 to see how the new GPUs perform - and I mean the 7nm AMD and Nvidia ones.
AlbertX:

All conspiracy theories can go both ways. I mean, maybe Nvidia does not want to show yet how awesome RTX is and how well it works (frame rate and resolution wise), so that people decide to buy the old 1080 Ti, 1080 and 1070 cards still on the market, which now seem like a better proposition. The pricing structure right now makes them look much better than the RTX series; then, once the stock of 10xx cards is gone, the releases for all the RTX games happen and they are awesome - amazing graphics, 4K 60 fps with RTX and DLSS and all that - and everyone who just bought a 10xx card spends money again on the new RTX series. Brilliant move by Nvidia. I hope people understand I am just making up a conspiracy theory; it's not my belief. My point is that theories can go any way we want.
You forgot that most gamers out there in the world can't even afford a GTX 1070. They are already weighing whether the money put towards a GTX 1060 or RX 570 shouldn't go to their kids instead.
2x MSI 2080 Ti Tri-X in NVLink, 7980XE @ 4600 MHz, G.Skill @ 4000 C16 tweaked (120 GB/s read). The hotfix fixed DX12 for me 🙂
Reminder: the features demonstrated at the RTX launch were based on the slow-path feature set provided by the Titan V and were not optimised for the actual hardware. Adding, optimising and polishing a graphical feature not originally in the engine is not an easy task. Unless you're Ubisoft and don't care, coz bling.
alanm:

I'm actually glad they released the RTX series in its current form. The poor value for money, features with too little game support (RT, DLSS), and the dismal performance jump over previous cards all basically ensure I must wait for the 7nm follow-up. Hopefully the poor reception to RTX will knock some sense into Nvidia and get them back on track with a better-value 7nm successor. By then there should be enough games supporting the new features out of the box, and it would certainly offer massive perf gains for GPU owners who skipped RTX.
I agree. Plus, I'm not taxing either the GTX 1070 in my desktop PC or the GTX 1060 in my gaming laptop enough to warrant an upgrade. The same goes for my newly assembled emulation rig with a GTX 1050.
Conspiracy theories... "the release for all RTX games happens and they are awesome, amazing graphics, 4K 60fps with RTX and DLSS and stuff, and all those who just bought a 10xx card will spend money again for the new RTX series." This is what some say to justify spending way too much on a card that has nothing new to really offer other than a higher price. RTX is Nvidia telling gamers: you NEED THIS! There are no "experts" out there saying GO BUY ONE! For me, watching when NV first came out and told the world, I could see something was different - something they were not telling or showing people. Then blaming the price on, haha, supply and demand. I would have been a fool to buy one. People blame NV for not having drivers for OS X Mojave. Here we are, with some games that will be based on ray tracing, and in all this time no one was ready? Gotta let that 30 days pass. That's how I see it.
MaxBlade:

Conspiracy theories... "the release for all RTX games happens and they are awesome, amazing graphics, 4K 60fps with RTX and DLSS and stuff, and all those who just bought a 10xx card will spend money again for the new RTX series." This is what some say to justify spending way too much on a card that has nothing new to really offer other than a higher price. RTX is Nvidia telling gamers: you NEED THIS! There are no "experts" out there saying GO BUY ONE! For me, watching when NV first came out and told the world, I could see something was different - something they were not telling or showing people. Then blaming the price on, haha, supply and demand. I would have been a fool to buy one. People blame NV for not having drivers for OS X Mojave. Here we are, with some games that will be based on ray tracing, and in all this time no one was ready? Gotta let that 30 days pass. That's how I see it.
Not me. I am not purchasing an RTX card because they are too expensive and I don't have the cash to upgrade - and even if I did, I'm not sure my 4790K would be able to handle the card. Also, even if the game devs were ready, ray tracing probably wouldn't work properly at first. I think the game devs and Nvidia pulled a Sega, back when Sega released the Saturn: Sega released it too early and didn't give the devs a chance to understand the Saturn's hardware and make games for it. Nvidia released ray-tracing-capable cards without talking to the game devs, or the game devs said they were ready for the cards but in reality they were not. That's what I think, anyway.