Review: Red Dead Redemption 2: PC graphics benchmark analysis

asturur:

Yes, I was expecting better from the 1080 Ti. How can an Xbox One X run this at native 4K? Is the first post correct?
Do you even know what settings the Xbox One X is using? It could be running low to medium, which gives almost double the fps compared to higher settings. The Division 2 also runs at native 4K on Xbox One X, but according to Digital Foundry, terrain quality there runs lower than the lowest setting on PC [youtube=A2A-rhCQCuY] So it's no surprise that the game runs at native 4K when some settings are even lower than low, and it only runs at 30fps.
mackintosh:

Wouldn't be the first time Nvidia pulled something like this; they have quite the track record, after all. I don't trust them one bit.
They've never pulled anything like this. Stop spreading misinformation.
Denial:

They haven't done anything similar before. There can be a hundred different reasons why Turing performs better than Pascal in this particular game that aren't related to Nvidia artificially gimping performance.
The 10 series seems to just be down all around here, even vs. AMD's offerings...what's this game doing differently than every other game out there?
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
Efficiency, perhaps. I don't know the exact improvements in Turing compared to Pascal, but if they relate to D3D12 or Vulkan, that might be part of it. Comparably, the Vega GPUs and Radeon VII from AMD should be stronger than Navi, but they aren't due to shortcomings and limitations, though it's a bit different with Navi starting a new architecture after GCN. Add driver optimizations and improvements that might allow newer hardware to scale even better, and that might explain it, although I'm certainly no expert on the specifics. EDIT: Ah, but I see the VII is actually holding its own here, edging out just ahead of Navi. Not bad, though the gap between the two is pretty close.
https://forums.guru3d.com/data/avatars/m/165/165018.jpg
Hats off to team red for this one. By next gen I'll probably have a good card to properly replace my 1070.
AndroidVageta:

The 10 series seems to just be down all around here, even vs. AMD's offerings...what's this game doing differently than every other game out there?
Utilizes compute more? FP16 support in various shaders? Better use of shader intrinsics on AMD? Optimized to fill GCN's code path properly and avoid stalls (something that has been done in previous games - see "Strange Brigade" for example)? A hundred other reasons, etc. The argument that Nvidia is gimping their own cards in this specific game, but no other game that's come out recently, just doesn't make any sense. There is no proof of it. There's been no proof in the past that they've done anything like that. I'm tired of reading it.
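To illustrate the FP16 point above: nothing here is taken from Rockstar's code, just a minimal Vulkan sketch of how an engine might check at startup whether a GPU exposes FP16 in shaders (the shaderFloat16 feature from VK_KHR_shader_float16_int8) and pick a half-precision shader variant there, while falling back to FP32 on cards like consumer Pascal that run FP16 slowly.

[code]
// Hypothetical feature probe - a real engine would also verify the
// VK_KHR_shader_float16_int8 extension is present before chaining this struct.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;          // needed for vkGetPhysicalDeviceFeatures2
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        // Chain the FP16/INT8 feature struct into the main feature query.
        VkPhysicalDeviceShaderFloat16Int8Features f16{
            VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SHADER_FLOAT16_INT8_FEATURES};
        VkPhysicalDeviceFeatures2 feats{VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2};
        feats.pNext = &f16;
        vkGetPhysicalDeviceFeatures2(gpu, &feats);

        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s: shaderFloat16 = %s\n", props.deviceName,
                    f16.shaderFloat16 ? "yes (could use FP16 shader variants)"
                                      : "no (stick to FP32)");
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
[/code]

If a game ships such half-precision shader paths, a Turing card (full-rate or better FP16) can pull ahead of a Pascal card of otherwise similar FP32 throughput without any driver trickery.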
SpajdrEX:

A little off-topic, but the RTX 2060 SUPER is also on par with the GTX 1080 Ti in Call of Duty: Modern Warfare https://www.techpowerup.com/review/call-of-duty-modern-warfare-benchmark-test-performance-analysis/5.html
This seems a bit suspicious as well, though I don't think another example of this is conclusive of anything...at 1080p the 1080 Ti has a slight advantage while the 2080 is considerably higher. Weird, really. I know that pure architectural changes can certainly bring their own new way of doing things, but this game along with CoD really are outliers against everything else, while the games themselves don't appear to be doing anything differently.
https://forums.guru3d.com/data/avatars/m/267/267153.jpg
Requirements look extreme, but we are talking ultra here. To the folks comparing console performance to PC... have you seen the comparison screens of console vs PC? On the majority of the screens the difference reminded me of Crysis on low vs ultra. So that's why. And yes, the elephant in the room: the RX 5700 XT. My next card. Enjoy your $1 lower electricity bills... or run some RTX demo 🙂 Honestly... what Nvidia did with the 10xx series here is evident. I will much enjoy the exact opposite with AMD, with my card getting better over time. This is a story that has repeated itself for a decade, and I will not ignore that fact when I decide on the upgrade. The 5800 XT will end up beating the 2080 or better in a while.
https://forums.guru3d.com/data/avatars/m/274/274789.jpg
pascal gang where are you lmao
Denial:

Utilizes compute more? FP16 support in various shaders? Better use of shader intrinsics on AMD? Optimized to fill GCN's code path properly and avoid stalls (something that has been done in previous games - see "Strange Brigade" for example)? A hundred other reasons, etc. The argument that Nvidia is gimping their own cards in this specific game, but no other game that's come out recently, just doesn't make any sense. There is no proof of it. There's been no proof in the past that they've done anything like that. I'm tired of reading it.
There is no proof, but who cares about the technical reasons? It's the results the end user should be concerned about. The 10xx series is DEAD in this game. And surely this will not be the only game. It's because Nvidia is like a car highly optimized for ONE specific road. Change the road and you are done. AMD is a bulldozer with crazy raw power; that's why it shines in many applications besides games, its raw power is huge. The 5700 XT is more powerful than the 2080 Ti in many specific cases. So here the road has changed, and the 10xx gen is over, RIP. Who cares why... definitely not the 5700 (XT) users...
HybOj:

There is no proof, but who cares about the technical reasons? It's the results the end user should be concerned about. The 10xx series is DEAD in this game. And surely this will not be the only game. It's because Nvidia is like a car highly optimized for ONE specific road. Change the road and you are done. AMD is a bulldozer with crazy raw power; that's why it shines in many applications besides games, its raw power is huge. The 5700 XT is more powerful than the 2080 Ti in many specific cases. So here the road has changed, and the 10xx gen is over, RIP. Who cares why... definitely not the 5700 (XT) users...
Hopefully Nvidia can provide some driver optimisations for their Pascal 10 series cards, or maybe, like Denial said earlier, this game is just using parts of the Turing cards that excel in comparison to Pascal. It will be interesting to see how this pans out and whether there will be any further Nvidia Pascal driver optimisations; I would bet it's not the drivers though.
Robbo9999:

Hopefully Nvidia can provide some driver optimisations for their Pascal 10 series cards, or maybe, like Denial said earlier, this game is just using parts of the Turing cards that excel in comparison to Pascal. It will be interesting to see how this pans out and whether there will be any further Nvidia Pascal driver optimisations; I would bet it's not the drivers though.
Nvidia ain't gonna waste a single breath on Pascal - they want you to buy Turing. Anyone thinking this is not intentional is delusional about how corporations operate...
HybOj:

There is no proof, but who cares about the technical reasons?
Ah, here I was thinking I was on Guru3D and not Gamespot.
Dragam1337:

Nvidia ain't gonna waste a single breath on Pascal - they want you to buy Turing. Anyone thinking this is not intentional is delusional about how corporations operate...
How do you explain NULL, Image Sharpening, Image Upscaling and DXR then? Seems weird that they brought all these things back to Pascal while apparently "not wasting a single breath" on it.
karma777police:

Nvidia is slowing down the 1080 Ti big time. There is no f. way it performs like a 2060, that's f. bullshit. That's why I do not run any drivers newer than 39x.xx. My 1080 Ti performs between a 2080 Super and a 2080 Ti in all games without ray tracing.
Lol, this makes it sound like updating to anything newer than 39x.xx slows your 1080 Ti down to a 2060 in all games? So if Hilbert rolled back the 1080 Ti driver for this test, it would suddenly perform better in RDR2?
data/avatar/default/avatar03.webp
GTX 1080 SLI owner here. Apparently this game supports multi-GPU using the Vulkan API out of the box. Both cards are pegged at 99% and performance is increased by an average of 60%. However... in some areas there seem to be some flickering and performance issues with an mGPU config. Also... the Afterburner OSD is flickering constantly. It's really interesting to see this, as this is the first Vulkan game utilizing both cards. Apparently Rockstar wanted to support mGPU rendering in this title but only came halfway. I really hope they keep improving this, as it is not a luxury to have more than one GPU for this incredibly GPU-hungry game.
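For context on what "multi-GPU via the Vulkan API out of the box" means: this is not Rockstar's actual code, just a minimal sketch of the Vulkan 1.1 device-group path an engine has to opt into explicitly before it can drive a linked SLI pair like two GTX 1080s.

[code]
// Illustrative sketch: enumerate Vulkan device groups (linked adapters).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;      // device groups are core in Vulkan 1.1
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (const auto& g : groups) {
        std::printf("device group with %u GPU(s)\n", g.physicalDeviceCount);
        // A group with 2+ GPUs (e.g. GTX 1080 SLI) can be wrapped into one
        // logical device via VkDeviceGroupDeviceCreateInfo; after that the
        // engine decides how to split work (alternate-frame, split-frame, etc.),
        // which is where scaling, flickering and OSD glitches come from.
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
[/code]

In other words, the ~60% scaling and the artifacts both come from the engine's own work-splitting over the device group, not from a driver SLI profile, which would explain why support feels "half way" done.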
karma777police:

I installed the 39x.xx drivers. My 1080 Ti runs faster than a 2080 in this game, ~2080 Super performance. Yeah, f. you Nvidia. The 39x.xx drivers have no problems running this game; after all, it is the same f. engine as GTA V.
@Hilbert Hagedoorn - Can you confirm this?
Denial:

Ah, here I was thinking I was on Guru3D and not Gamespot. How do you explain NULL, Image Sharpening, Image Upscaling and DXR then? Seems weird that they brought all these things back to Pascal while apparently "not wasting a single breath" on it.
They just applied those things universally to the Nvidia driver - not specifically for Pascal. And in the case of DXR, they only made it available to Pascal in order to make people go "oh, my performance with it is shit, I gotta upgrade!"... But you are suggesting that they were do-gooders by not doing their usual artificial segregation of making features available only on the latest gen cards, even though there has never been any reason for them to do that in the past aside from trying to force people to upgrade...?
Dragam1337:

They just applied those things universally to the Nvidia driver - not specifically for Pascal. And in the case of DXR, they only made it available to Pascal in order to make people go "oh, my performance with it is crap, I gotta upgrade!"...
For the image upscaling they specifically said they use a different system on Turing that yields better image quality - so right there they had to do something different for Pascal. They still have to test that it works on Pascal. If they simply wanted to get people to buy Turing, wouldn't they just not bring any of these features to Pascal at all - wouldn't that make Turing more alluring?
karma777police:

Sorry. I tried Red Dead Redemption 2 with 39x.xx on an Nvidia 1080 Ti and it runs better than a 2080, ~2080 Super performance...so yeah, how about you stop spreading misinformation?
And Game Ready drivers aren't just about performance optimizations. They can also be about fixing various rendering bugs. And that's also why some games sometimes show better performance on older drivers, not because of gimping, but because something might be rendering incorrectly or not at all. It does happen that older GPUs see regression in newer drivers, but saying Nvidia is paying its engineers to deliberately sit and write code to bring down the performance on their older series is just insane.
karma777police:

Sorry. I tried Red Dead Redemption 2 with 39x.xx on an Nvidia 1080 Ti and it runs better than a 2080, ~2080 Super performance...so yeah, how about you stop spreading misinformation?
Can you post the fps difference between the 399.x drivers and the latest ones?