Destiny 2: PC graphics analysis benchmark review


Thank you Hilbert for the benchmarks! I wonder why only the 1440p performance improved for AMD cards (the Vega 56 just got beaten at FHD by the 1070, but ultimately destroys it at 1440p, just like the 580 with the 1060). Vega 64 is actually within 10% of the 1080 Ti at 4K. And it is an NV-supported title. Absolutely terrific results for AMD (just like in Shadow of War).
After updating to the newest AMD drivers, I can't seem to get 75 fps in the open areas; I can see my GPU usage maxing out, and that's when I notice frame drops. I don't think the previous driver did that; it was butter smooth all the way.
Impressive results for AMD Vega at 1440p and 4K. I am surprised by how these results compare to 1080p.
That caught my attention too: the jump from 1080p to 1440p in AMD performance when you compare it to Nvidia's.
Glad your copy didn't get banned 😀 On topic: the game runs great. I personally run the Highest settings too, but turned DoF down to Low for a nice FPS boost.
This seems to be more on par with what I'm getting. Note to self: ignore Reddit users' claims.
For how the game looks, these results are somewhat surprising. The graphics are quite ugly, yet the GTX 1070 only gets 125 FPS at Full HD, maxed out. That is not great performance in my book, only average.
@Hilbert Hagedoorn, wrong numbers for the 480, or could there be other issues behind it?
Administrator
I noted that anomaly in the review. Not sure; I ran it three times for the 480 and even reinstalled drivers, but got the same odd results over and over again. The results are what they are; I'm not going to sugarcoat them by leaving them out of the chart.
I hate being the conspiracy-theory guy here, but is there some sort of planned obsolescence built into Nvidia cards, or do they just not age well? These kinds of results, with the 1070 Ti on the cusp, are highly suspect to me. I admit it's been quite a while since I owned Nvidia cards. As soon as a new Nvidia card comes out, the older cards' relative performance begins to tank. Edit: Also, is this the air-cooled Vega 64? If so, that means the liquid-cooled card is even closer to the 1080 Ti.
Administrator
Ehm, no, because I already tested the card you mention with this title and it's spot on where it needs to be. Secondly, if Nvidia did such a thing, it would show in all games. AMD found something clever that works for them; it's as simple as that. And yes, this was tested with the air-cooled Vegas; AMD never shipped the liquid-cooled edition for review. BTW, it is a good question. I'll make some changes in future charts to denote clearly that we use the air-cooled versions.
Looks like some very suspicious results for AMD. The 580 is twice as fast as the 480 with just a 5% overclock? Not buying that in the least. The jumps from 1080p to 1440p are too wide. Something is going on at the driver level that is excluding 4xx cards from optimizations, or AMD is disabling something via driver on the newer cards.
Hilbert, are you able to test 1440p and 4K at Highest settings but with DoF dropped to High? Right now on Pascal at 1080p, the difference between Highest and High is 5-7%, yet at higher resolutions, i.e. 1440p and 4K, it's up to a 33% performance difference, which is very significant. I suspect the same thing is happening to the RX 480.
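(For anyone checking those percentages, here is a minimal sketch of the calculation; the FPS figures are hypothetical placeholders chosen to land in the quoted 5-7% and higher-resolution bands, not measured data from the review.)

```python
# Minimal sketch of the Highest-vs-High DoF cost calculation.
# The FPS figures below are hypothetical placeholders, not measured results.

def settings_cost_pct(fps_lower_setting: float, fps_higher_setting: float) -> float:
    """Percentage of performance lost moving from the lower to the higher setting."""
    return (fps_lower_setting - fps_higher_setting) / fps_lower_setting * 100

# Illustrative 1080p numbers: a small gap between High and Highest DoF.
print(f"1080p: {settings_cost_pct(150, 141):.1f}% lost")  # 6.0%, in the 5-7% band

# Illustrative 4K numbers: the gap widens sharply at higher resolutions.
print(f"4K:    {settings_cost_pct(60, 45):.1f}% lost")    # 25.0%
```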
I'd like to see comparisons in IQ between AMD and Nvidia. Call me old, but there was a time 15 years ago when Nvidia's FX series was a disaster in DX9 and they did everything possible to "tweak, optimize, fix, help, improve" the frames in those games by doing some rather shady stuff. AMD then got caught doing the same thing two or three gens later. Ever since then, regardless of company, when I see two pieces of hardware that are usually close to one another and a big difference like this appears, I tend to assume it's cheap optimizations. Yes, that makes me sound negative, but if you did any reading into these companies, their drivers, and the things they've done even in the last decade, you'd understand my assumptions. It just seems too good to be true, and my grandpa taught me that if it sounds too good to be true, it usually is.

I assumed these would be common results for AMD vs Nvidia after AMD won the rights to BOTH big consoles; I assumed porting over through DX12 to AMD would be easier and probably make for more real optimizations. But this has not been the case.
Console DX12 is different from PC DX12, according to several major dev studios and high-profile coders, so I would not automatically assume that the Xbox and PS4 using GCN helps AMD. It comes down to how the studio ports to PC, which can be anything from basic to poor to outright disastrous, or occasionally even good or almost excellent.

As for image quality and optimization, I remember both AMD and Nvidia being caught "cheating" with very aggressive tweaks for some games early on. I doubt some minor, less visible effect could yield whatever uncorked the performance here, though (why not for ALL Polaris GPUs then? That's a bit weird), and it was probably some other issue blocking things. But who knows; a more detailed analysis of the game code and drivers could help, though without being able to attach much to the game's process due to anti-cheat restrictions, that might be difficult to discern outside of Blizzard directly.

5-10% of performance can come from something like a tweak or bug fix; more than that, as here, is usually something important, and outside of outright killing effects I doubt they could do anything that major just by "optimizing" certain things, though again it's hard to tell. (It could be things outside the player's view like culling, or edits to shaders, or just making use of whatever hardware improvements the Polaris and Vega cards have, but again, that should have carried over to the 400 series too if it were that.)

EDIT: I can't find it now, but an ex-Nvidia driver engineer made some interesting points about what goes into the display driver: workarounds for games breaking basic API usage and best-practice recommendations, full shader replacements, and just how much of the driver is compatibility fixes and such. They're pretty complex, to put it mildly. (And going by people such as Durante with DSFix and other works, or Kaldaien with SpecialK, some games are a bit problematic, to put it very mildly; all sorts of issues get discovered by poking around a bit.)

But that's a different issue. Destiny 2 does seem to be running pretty well on PC without being super demanding, too. (And newer drivers might improve this even further with some more work, ideally without murdering image quality in the process, though I doubt that's anything that needs to be worried about.)
JonasBeckman:

Console DX12 is different from PC DX12, according to several major dev studios and high-profile coders, so I would not automatically assume that the Xbox and PS4 using GCN helps AMD. [...]
Thanks for the post. I was not aware console and PC DX12 were different. Kind of defeats the purpose of it, doesn't it? As for the cheats: I'm not flat-out saying AMD is cheating. The problem is, both have cheated in the past, so it's logical to me to be hesitant when something out of the ordinary pops up like this. But by no means am I saying it's certainly a cheat. Maybe it's legit, clean, just plain better work by AMD's driver team. Maybe they had a better relationship with Bungie working on the PC version. I ask a lot of questions; it's in my nature and I'm an annoyance in that way (many more as well, I'm sure). So I just wonder, is all. Maybe in future reviews we could get a page for IQ? I swear Hilbert used to do that?
Not sure why a large part of the general populace on Reddit is fanboying over how well the game is optimized. To me this game looks like it relies on heavy amounts of post-processing to look good. If that is indeed the case, the performance results shouldn't be surprising at all (aside from those few GPUs showing anomalies). I'm not saying the game is a sh!t port or that it looks bad. It looks pretty, and it's what a port should be. Perhaps people have grown too used to crap ports.
I'm wondering what the massive difference is in this game between the RX 400 and the RX 500 series. Like the RX 480 scoring 33 FPS at 1440p but the RX 570 scoring 50; that's a MASSIVE difference between those two.
AlmondMan:

I'm wondering what the massive difference is in this game between the RX 400 and the RX 500 series. [...]
A 60% increase from the 480 to the 580. I feel sorry for AMD owners on this. Looks like a first-gen Polaris downgrade is in progress. http://www.guru3d.com/index.php?ct=articles&action=file&id=35831
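(For what it's worth, a quick sanity check of that gap using the 33 and 50 FPS figures quoted above; the linear scaling with a ~5% overclock is an illustrative assumption, not a measured result.)

```python
# Arithmetic check on the quoted 1440p chart figures: RX 480 at 33 FPS,
# RX 570 at 50 FPS, versus what a small clock bump alone would predict.

rx480_fps = 33.0   # figure quoted in this thread
rx570_fps = 50.0   # figure quoted in this thread

uplift = (rx570_fps - rx480_fps) / rx480_fps   # ~0.52, i.e. a ~52% gap

# Assumption: performance scales at most linearly with a ~5% overclock.
oc_only_prediction = rx480_fps * 1.05          # ~34.7 FPS, nowhere near 50

print(f"Measured gap: {uplift:.0%}")                        # Measured gap: 52%
print(f"OC-only prediction: {oc_only_prediction:.1f} FPS")  # 34.7 FPS
```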
:O If only I had Destiny 2; I'd love to test it with an RX 580 BIOS vs. a 480 one at the same clock speeds. It would be kinda shit if AMD screwed this up.