Review: Assassin's Creed: Valhalla graphics performance benchmarks and analysis

willgart:

I love the guys claiming that GDDR6X or huge bandwidth is required. The drop in performance for the 3090, 3080 and 3070 is the same between 1440p and 4K - specifically, the 3080 drops 31% and the 3070 drops 33%. Same quantity of RAM, not the same number of CUs, GDDR6X vs GDDR6... a 2% difference... so clearly the speed of the RAM has no impact, which is expected. AMD made the right move going with GDDR6 instead of GDDR6X, so we can expect to see the 6900XT at the 3090's performance level, as expected.
You are talking about relative performance rather than actual performance. A higher-performing GPU needs more bandwidth to avoid being bottlenecked by it, and more so with increasing resolution. So the less GPU power you have, the less bandwidth is needed, as the chip can't saturate it... and vice versa. Since the 3070 has a comparable GPU-power-to-bandwidth ratio to the 3080, they will scale roughly the same; the 3080 just performs a lot higher overall. For the same reason, AMD used to scale much better at high resolutions with its 512-bit bus GPUs versus Nvidia's 384-bit bus GPUs, whereas now the tables have turned and the 256-bit bus AMD GPUs scale poorly compared to Nvidia's at higher resolutions.
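A quick back-of-the-envelope sketch of that power-to-bandwidth ratio argument (Python): the TFLOPS and bandwidth figures are the public board specs, while the 1440p-to-4K drops are simply the percentages quoted above, with the 3090's assumed to match the 3080's.

```python
# Rough bandwidth-per-TFLOP comparison for the three Ampere cards discussed above.
# Board specs are the public reference figures; the drop percentages come from the
# thread, and the 3090's drop is an assumption.

cards = {
    # name        (FP32 TFLOPS, bandwidth GB/s, quoted 1440p->4K drop)
    "RTX 3070": (20.3, 448, 0.33),
    "RTX 3080": (29.8, 760, 0.31),
    "RTX 3090": (35.6, 936, 0.31),  # drop assumed
}

for name, (tflops, bandwidth, drop) in cards.items():
    ratio = bandwidth / tflops  # GB/s of memory bandwidth per TFLOP of shader power
    print(f"{name}: {ratio:4.1f} GB/s per TFLOP, 1440p->4K drop {drop:.0%}")

# Broadly similar GB/s-per-TFLOP ratios lead to similar relative scaling with
# resolution, so near-identical drop percentages say little about whether GDDR6X
# itself is "needed" on the bigger chips.
```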
Dragam1337:

Well, the only option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I personally would have expected at least the 6900XT to get a 512-bit bus, given its $1,000 price point and use of much cheaper GDDR6 memory. A 6900XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind it at 4K. AMD haven't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't. HBM2 usually fares worse than GDDR6 in games due to higher latency - latency is king in games, which is also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.
Ah yeah, the memory configuration too - I forgot about that. But yes, there's a good margin on the 3090 competitor for AMD to have implemented both and made the 6900XT a true enthusiast showcase, while possibly keeping it at 1000-1200 USD/EUR against the 1500 of the 3090. Remove as much of the "AMD GPUs are memory bottlenecked" issue as possible while they have the chance, even if they kept the narrower bus for the actual flagship 6800 series. GPU pricing has increased anyway, and NVIDIA priced the 3090 relatively high, so AMD would have had room to still pack in features while pricing the 6900XT lower and competing with it head-on in performance, whether at high refresh rates at lower resolutions or at 3840x2160 and maybe higher.

Reviews might still show it performing favourably, but perhaps not as big a step up from the 6800XT, competing with the 3090 rather than placing itself ahead of it and holding the performance lead at least until a 3080 Ti - which I'm sure AMD could have used to their advantage as well.

EDIT: Well, we shall see in about a week. Maybe we'll also get the first major update for this game by then, and then there's Legion next, and whether that changes things performance-wise without going for the highest-end available hardware, ha ha. (Or balancing the ultra and very high options a bit - there's probably barely any difference for volumetric clouds again, for a quick suggestion, if it's similar to Odyssey here.)

EDIT: Although it also sounds like AMD was surprised the 3090 wasn't faster, so if they had known earlier that it wouldn't be as big a step up from the 3080, maybe they would have taken the chance to really grab the performance lead here, even if briefly. Hard to say - it's speculative after all.
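On the 16 GB vs. bus width point above, here is a small sketch of how GDDR6 chip count, density and bus width relate; the per-chip figures are the standard GDDR6 options, and the configurations are illustrative rather than leaked 6000-series specs (except the 8 x 2 GB case noted in the comments).

```python
# Relation between GDDR6 chip count, bus width, capacity and bandwidth.
# Each GDDR6 chip has a 32-bit interface; common densities are 1 GB and 2 GB.
# The listed configurations are illustrative, not official specs.

def gddr6_config(chips, density_gb, gbps=16):
    bus_width = chips * 32            # bits
    capacity = chips * density_gb     # GB
    bandwidth = bus_width * gbps / 8  # GB/s
    return bus_width, capacity, bandwidth

for chips, density in [(8, 2), (16, 1), (16, 2)]:
    bus, cap, bw = gddr6_config(chips, density)
    print(f"{chips} x {density} GB chips: {bus}-bit bus, {cap} GB, {bw:.0f} GB/s")

# 16 GB is reachable with 8 x 2 GB chips on a 256-bit bus (what the 6800/6900 cards
# actually use) or with 16 x 1 GB chips on a 512-bit bus - the trade-off debated above.
```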
AlexM:

If this game uses 5.7 GB of VRAM @ 1440p ultra (at which I game), then I am sure my recent upgrade to a 3080 will be enough for the next 4 years until I upgrade again.
Most other games will use significantly more. https://i.imgur.com/vN8f59Y.jpg https://i.imgur.com/tAIqjsf.jpg And these games are a couple of years old now. Within a couple of years, VRAM usage in AAA titles with maxed settings will definitely be above 10 GB.
5700XT faster than the RTX 3070 and almost as fast as the 3080... AMD doesn't even need to release new cards; they beat current-gen Nvidia at half the price with one-generation-older tech.
HybOj:

5700XT faster than the RTX 3070 and almost as fast as the 3080... AMD doesn't even need to release new cards; they beat current-gen Nvidia at half the price with one-generation-older tech.
At 1080p, yes... at 4K it's like 30% slower than a 3070. No one is going to take these kinds of conclusions seriously. Clearly there is some kind of bottleneck at lower resolutions in this game that makes the 5700XT competitive.
MonstroMart:

Did Nvidia pull a Pascal again? I had a 1070, and when the RTX 2000 series was released its performance "went down the toilet". Before, it was on par with a Vega 56; after the 2000 series it was often like 10% behind in newer titles. I mean, the 5700XT is almost on par with the 2080 Super at 2K...
https://tpucdn.com/review/nvidia-geforce-rtx-2080-super-founders-edition/images/battlefield-5-2560-1440.png There were examples of this at the 2080 Super's launch, so I'm not sure why it's surprising, or why it's "pulling a Pascal" again - whatever that means.
MonstroMart:

Did Nvidia pull a Pascal again? I had a 1070, and when the RTX 2000 series was released its performance "went down the toilet". Before, it was on par with a Vega 56; after the 2000 series it was often like 10% behind in newer titles. I mean, the 5700XT is almost on par with the 2080 Super at 2K...
Well, the 5700XT is beating both the 2080 Ti and the 3070 at 1080p, which should never happen. I don't think this is a case of Nvidia gimping anything; rather, it seems like Ubisoft developed the game around AMD and didn't consider Nvidia, which is rare and, in my opinion, a rather amusing and unexpected turnaround. Ubi really should have optimised better for Nvidia; it looks like they didn't even bother.
willgart:

The drop in performance for the 3090, 3080 and 3070 is the same between 1440p and 4K - specifically, the 3080 drops 31% and the 3070 drops 33%. Same quantity of RAM, not the same number of CUs, GDDR6X vs GDDR6... a 2% difference... so clearly the speed of the RAM has no impact, which is expected.
Good job proving that both cards are similarly balanced, with both being equally bottlenecked by their respective bandwidth. There, fixed that complete garbage nonsense πŸ™‚
BReal85:

That doesn't mean anything. Some AMD-sponsored titles run better on NV and some NV-supported titles run better on AMD.
While this is absolutely true, it's pretty obvious something is "wrong" with Nvidia cards here, as the 5700XT is fairly handily matching same-generation cards in the $800+ range.
Wow, my 2070S is really lagging behind now lol. 2070S = 5600XT. Even the Vega 64 is almost as fast, wow.
This game runs great on my 1080 Ti at 1440p with everything maxed and G-SYNC enabled. It isn't a locked 60 fps by any means, but it plays silky smooth from the SSD installation. I've turned off framerate monitoring and found that I've been enjoying games far more without being distracted by the fact that a game isn't running at 60+ fps. It really is transformative for me! πŸ˜€ The game does look great, but perhaps isn't as impressive as the framerate hit suggests, especially when you consider how much better Horizon Zero Dawn looks and runs on this same PC.
AlexM:

The pics you posted are @ 4K, so yes, the 3080 will be just fine @ 1440p for a few years, but we will have to wait and see πŸ˜€
True, but the point still holds - memory consumption only goes one way. And then there is the case of lightweight games, where you have plenty of GPU power to downsample, but it requires a lot more VRAM. Like here in COD Modern Warfare 2 Remastered, where I'm downsampling from 8K: https://i.imgur.com/OEzvxKM.jpg Not gonna be doing that with 10 GB of VRAM.
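For a rough idea of why downsampling from 8K eats VRAM so quickly, a small sketch follows; the bytes-per-pixel and buffer counts are illustrative assumptions, not numbers taken from any particular game.

```python
# Rough estimate of raw render-target memory at different render resolutions.
# bytes_per_pixel and buffers are illustrative guesses; real engines allocate many
# targets in varying formats, so treat the output as order-of-magnitude only.

def render_target_mb(width, height, bytes_per_pixel=8, buffers=6):
    """Approximate VRAM (in MB) taken by the main render targets alone."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

resolutions = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}

for label, (w, h) in resolutions.items():
    print(f"{label}: ~{render_target_mb(w, h):,.0f} MB for render targets alone")

# 8K has 4x the pixels of 4K and 9x those of 1440p, so the buffer cost scales the
# same way - before textures, geometry, or anything else is counted.
```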
Resisting any and all benchmarks till I get the 6800XT I have promised myself. My once unassailable 2080 Ti looks so bad in this now and is losing £££s by the day on eBay. Guess my wife's 2070 will be getting upgraded - she always gets my cast-offs lol.
Denial:

RX 5700XT faster than 3070... at 1080p, yes... at 4K the RX 5700XT is like 30% slower than a 3070. No one is going to take these kinds of conclusions seriously. Clearly there is some kind of bottleneck at lower resolutions in this game that makes the 5700XT competitive.
Ok then, "enjoy" those sweet 40 fps on a 3070 @ 4K and tell yourself how well the card works compared to AMD πŸ™‚ Your logic is ridiculous. Everyone wants to play at 60+ fps. It doesn't matter if the 3070 is faster at 4K or 8K or 16K if it can't produce enough frames for the game to actually be playable. The 5700XT smoked it where it counts - in the realm of "over 60 fps".
HybOj:

Ok then, "enjoy" those sweet 40 fps on a 3070 @ 4K and tell yourself how well the card works compared to AMD πŸ™‚ Your logic is ridiculous. Everyone wants to play at 60+ fps. It doesn't matter if the 3070 is faster at 4K or 8K or 16K if it can't produce enough frames for the game to actually be playable. The 5700XT smoked it where it counts - in the realm of "over 60 fps".
This isn't a particularly difficult concept to understand. Running the game at higher resolutions puts added load onto the graphics card. In this game, at QHD and 4K, the 3070 clearly pulls ahead of AMD. At 1080p it doesn't - which would indicate that the game is being bottlenecked by something else, or that there is some other kind of issue preventing full utilization. Which makes your original point - "AMD doesn't even need to release new cards; they beat current-gen Nvidia at half the price with one-generation-older tech" - a really poor one. Now that you've jumped from "AMD's entire previous generation is faster than Nvidia at half the price, based on one game, at one resolution, that clearly has some funky utilization going on" to "well, only 60 fps matters", I'm still going to say you're wrong. I think most people, at higher resolutions, would rather turn down a few settings to reach 60 fps at their preferred resolution than drop the resolution to hit 60 fps.
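One way to see the "bottlenecked by something else" point is to look at how much the fps improves when the resolution is lowered; a toy sketch with made-up numbers (not figures from the review):

```python
# Toy bottleneck check: if a card's fps barely improves when the resolution is
# lowered, the GPU is no longer the limiting factor at the lower resolution.
# The fps values are hypothetical placeholders, not review results.

results = {
    # card     (1080p, 1440p, 4K) fps - hypothetical
    "Card A": (108, 104, 70),
    "Card B": (135, 100, 62),
}

for card, (fps_1080, fps_1440, fps_4k) in results.items():
    gain = fps_1080 / fps_1440 - 1  # relative gain from dropping 1440p -> 1080p
    verdict = "GPU-bound at 1080p" if gain > 0.20 else "likely limited elsewhere at 1080p"
    print(f"{card}: {gain:+.0%} going from 1440p to 1080p -> {verdict}")
```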
None of the reviews seem to want to mention that the last few AC games have been running on an engine from 2014, made for the PS4 and Xbox One... and when I say "made" I really mean an updated version of Ubisoft's in-house engine created in 2007! Hardware Unboxed actually mentioned it casually yesterday in their graphics guide for the game, but didn't make it the huge deal that it really is, because it's responsible for all the issues AC games have had in terms of performance. So we're basically seeing a 10+ year old engine in 2020 on the powerful PCs and consoles we have right now.

The fact that the AC series has always run poorly has NOTHING to do with it pushing any sort of boundary in visual quality; it's just a badly outdated engine that's punching way above its weight class in this new era of hardware. Also, if you think AC games look amazing up close, you need your eyes checked... the textures and pop-in alone give it away as a last-gen title not worthy of this new era of PCs and consoles - I mean, no ray tracing is a dead giveaway. Compare Valhalla to The Witcher 3, even on lower settings, and you'll see what a proper engine can do for a game. ALSO, The Witcher 3 is getting a ray tracing update for ALL PLATFORMS!
Sylwester ZarΔ™bski:

Hilbert, could you check if turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In previous AC games (Odyssey and Origins) it could work miracles - going from 35 fps to 45 (one step down) to 60 fps (two or three steps down) on my old 290X at FHD - and it is most visible on weaker cards.
[youtube=TiK257YS-3w]
Badjoras:

Hi, first of all, I know there is no direct comparison between different benchmark scenarios and builds used by different websites/sources, but there is no consistency in this case. Normally you either get better overall values because the test was performed in a less demanding area, or the build simply produces better results, but here we can clearly see that the AMD cards all get higher average fps and the Nvidia cards all got lower numbers across the board. Something like Vega 64 (55.2 vs 60), RTX 3070 (86.2 vs 76). As a Vega user I cannot complain πŸ™‚ but this made me a bit confused about the actual performance I might get when I start the game.
[attached: two benchmark screenshots]
Is there a simple fact that I'm missing about these results?
Just follow Guru3D reviews πŸ™‚
Badjoras:

Hi, first of all, I know there is no direct comparison between different benchmark scenarios and builds used by different websites/sources, but there is no consistency in this case. Normally you either get better overall values because the test was performed in a less demanding area, or the build simply produces better results, but here we can clearly see that the AMD cards all get higher average fps and the Nvidia cards all got lower numbers across the board. Something like Vega 64 (55.2 vs 60), RTX 3070 (86.2 vs 76). As a Vega user I cannot complain πŸ™‚ but this made me a bit confused about the actual performance I might get when I start the game.
[attached: two benchmark screenshots]
Is there a simple fact that I'm missing about these results?
lol, TPU is so biased it's sad. In their review the 10700K is still faster than the 5900X. That should say something.
HybOj:

Ok then, "enjoy" those sweet 40 fps on a 3070 @ 4K and tell yourself how well the card works compared to AMD πŸ™‚ Your logic is ridiculous. Everyone wants to play at 60+ fps. It doesn't matter if the 3070 is faster at 4K or 8K or 16K if it can't produce enough frames for the game to actually be playable...
You would be surprised how well cards can run at 4K with adjusted settings, and this can be done with minimal visual loss. People who are too lazy to experiment with the settings and just use blanket ultra on everything will be the ones with sub-par experiences.
Hilbert Hagedoorn:

:) Not saying a thing here, but sometimes I am so proud of the Guru3D community.
And what if I add 15% overclocking performance? ;)