Assassin's Creed: Valhalla graphics performance benchmark review


Forum comments on the Assassin's Creed: Valhalla graphics performance benchmark review
The 3070 seems to perform as well as the 2080ti at 4K despite having lower memory bandwidth. There is a bit more to GPU architecture than bandwidth tbh.
Did Nvidia pull a Pascal again? I had a 1070, and when the RTX 2000 series was released, performance "went down the toilet". Before, it was on par with a Vega 56; after the 2000 series launched it was often around 10% behind in newer titles. I mean, the 5700XT is almost on par with a 2080 Super at 2K...
Netherwind:

When I bought the 2080Ti I thought it would be 4K capable, which it really wasn't, but I had high hopes for the 3080 being a true 4K card. Apparently it's not, at least not with Ubi games. W_D Legion would run at 4K but with reduced settings, and I could never get it locked at 60 fps. I was sure that Valhalla would run like Odyssey, which after a few patches ran beautifully at 4K/60 with close to max settings on a 2080Ti. I checked out another review which said that clouds have very little impact in this iteration. There are one or two settings which are much heavier on the GPU.
Wrote it before: there is always going to be some company that targets 1440p and maximizes visuals there. But nothing prevents you from turning a few options down to lower settings.
willgart:

I love the guys claiming that GDDR6X or huge bandwidth is required. The drop in performance for the 3090, 3080 and 3070 is about the same between 1440p and 4K; specifically, the 3080 drops 31% and the 3070 drops 33%. Same quantity of RAM, not the same number of CUs, GDDR6X vs GDDR6... a 2% difference... so clearly the speed of the RAM has no impact, which is expected. AMD made the right move going with GDDR6 instead of GDDR6X, so we can expect to see the 6900XT at the 3090's performance level, as expected.
You are talking about relative performance rather than actual performance. A higher-performing GPU needs more bandwidth to avoid being bottlenecked by it, and more so with increasing resolution. So the less GPU power you have, the less bandwidth is needed, as the GPU can't saturate it... and vice versa. Since the 3070 has a comparable GPU-power-to-bandwidth ratio to the 3080, they scale roughly the same; the 3080 simply performs a lot higher overall. For the same reason AMD used to scale much better at high resolutions with its 512-bit bus GPUs versus Nvidia's 384-bit bus GPUs, whereas now the tables have turned and the 256-bit bus AMD GPUs scale poorly compared to Nvidia's at higher resolutions.
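For reference, the raw figures being argued over follow directly from published memory specs: peak bandwidth in GB/s = data rate in Gbps x bus width in bits / 8. A minimal sketch using the public specs of the cards in question, deliberately ignoring the 6900 XT's Infinity Cache, memory compression and anything else that shapes effective bandwidth:

```python
# Theoretical peak memory bandwidth from published specs:
# GB/s = (effective data rate in Gbps * bus width in bits) / 8
def peak_bandwidth(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RTX 3070    (GDDR6,  14.0 Gbps, 256-bit)": (14.0, 256),  # 448 GB/s
    "RX 6900 XT  (GDDR6,  16.0 Gbps, 256-bit)": (16.0, 256),  # 512 GB/s
    "RTX 2080 Ti (GDDR6,  14.0 Gbps, 352-bit)": (14.0, 352),  # 616 GB/s
    "RTX 3080    (GDDR6X, 19.0 Gbps, 320-bit)": (19.0, 320),  # 760 GB/s
    "RTX 3090    (GDDR6X, 19.5 Gbps, 384-bit)": (19.5, 384),  # 936 GB/s
}

for name, (rate, width) in cards.items():
    print(f"{name}: {peak_bandwidth(rate, width):.0f} GB/s")
```

Raw peak bandwidth is only one input, though; the 6900 XT in particular relies on its 128 MB Infinity Cache to make up for the narrower bus, so these figures alone don't settle the scaling argument either way.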
If this game uses 5.7 GB of VRAM at 1440p ultra (which is where I game), then I am sure my recent upgrade to a 3080 will be enough for the next 4 years until I upgrade again.
Dragam1337:

Well, the only option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I would personally have expected at least the 6900XT to get a 512-bit bus, given its 1k USD price point and its use of much cheaper GDDR6 memory. The 6900XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind it at 4K. AMD haven't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't. HBM2 usually fares worse than GDDR6 in games due to higher latency - latency is king in games, which is also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.
Ah yeah, the memory configuration too - I forgot about that. But yep, there's a good margin for the 3090 competitor: room to implement both for a true enthusiast GPU showcase with the 6900XT, while possibly keeping it at 1000-1200 USD/EUR versus the 1500 for the 3090. Remove as much of the "AMD GPUs are memory bottlenecked" issue as possible while they have the chance, even if they kept that limitation for the actual flagship 6800 series. GPU pricing has increased anyway, and NVIDIA priced the 3090 relatively high, so AMD would have had a bit of room to still pack in features while pricing the 6900XT lower and competing head-on with it, either at high refresh rates at lower resolutions or at 3840x2160 and maybe higher. Reviews might still show it performing favourably, though perhaps not as big a step up from the 6800XT - competing with the 3090 rather than placing itself ahead of it, at least until the 3080 Ti arrives - which I'm sure AMD could have used to their advantage as well. EDIT: Well, we shall see in about a week; maybe we also get the first major update for this game by then, and then whatever's next for Legion, and whether that changes things performance-wise without going for the highest-end available hardware, ha ha. (Or balancing the ultra and very high options a bit - there's probably barely any difference in the volumetric clouds again, for a quick suggestion, if it's similar to Odyssey here.) EDIT: Although it also sounds like AMD was surprised the 3090 wasn't faster, so if they had known earlier that it wouldn't be as big a step up from the 3080, maybe they would have chanced it and really taken the performance lead here, even if briefly. Hard to say - it's speculative after all.
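Extending the same bandwidth arithmetic to the 512-bit what-if discussed above (a configuration AMD did not actually ship, so this is purely the thread's speculation), a 16 Gbps GDDR6 setup on a 512-bit bus would sit above the 3090's GDDR6X on paper:

```python
# Hypothetical 512-bit GDDR6 bus on the 6900 XT (speculation from the thread),
# compared with the real 256-bit configuration and the 3090's GDDR6X setup.
def peak_bandwidth(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print("6900 XT, actual 256-bit:      ", peak_bandwidth(16.0, 256), "GB/s")  #  512 GB/s
print("6900 XT, hypothetical 512-bit:", peak_bandwidth(16.0, 512), "GB/s")  # 1024 GB/s
print("RTX 3090, actual 384-bit:     ", peak_bandwidth(19.5, 384), "GB/s")  #  936 GB/s
```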
AlexM:

If this game uses 5.7 GB of VRAM at 1440p ultra (which is where I game), then I am sure my recent upgrade to a 3080 will be enough for the next 4 years until I upgrade again.
Most other games will use significantly more. https://i.imgur.com/vN8f59Y.jpg https://i.imgur.com/tAIqjsf.jpg And these games are a couple of years old now. Within a couple of years, VRAM usage in AAA titles with settings maxed will definitely be above 10 GB.
5700XT faster than the RTX 3070 and almost as fast as a 3080... AMD doesn't even need to release new cards; they beat current-gen Nvidia at half the price with one-generation-older tech.
Kool64:

great review HH. It's interesting to see what "financially backing" a game engine can do for graphics performance.
That doesn't mean anything. Some AMD-sponsored titles run better on NV and some NV-sponsored titles run better on AMD.
Dragam1337:

Most other games will use significantly more. And these games are a couple of years old now. Within a couple of years, VRAM usage in AAA titles with settings maxed will definitely be above 10 GB.
The pics you posted are @ 4K, so yes, the 3080 will be just fine @ 1440p for a few years, but we will have to wait and see πŸ˜€
AlexM:

The pics you posted are @ 4K, so yes, the 3080 will be just fine @ 1440p for a few years, but we will have to wait and see πŸ˜€
RTX 3080 is not a 1440p card. It's for 4K.
HybOj:

5700XT faster than the RTX 3070 and almost as fast as a 3080... AMD doesn't even need to release new cards; they beat current-gen Nvidia at half the price with one-generation-older tech.
That's because AMD cards do really well in this title, PLUS the RTX 3000 cards perform better at higher resolutions, especially 4K, and relatively poorly at FHD - just check every review of the 3080/3090. In the TPU review, the gap between a 2080 Ti and a 3080 shifts by about 15 percentage points going from FHD to 4K. By contrast, the 2080's gain over the 1080 Ti is identical at FHD and 4K - a difference of ZERO.
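To make the "percentage point" framing concrete: the sketch below uses made-up fps figures, not numbers from the TPU review or anywhere else, just to show how the gap between two cards is computed at each resolution and then compared across resolutions.

```python
# Illustration only: fps values are hypothetical placeholders, not review data.
def relative_gain(fps_fast: float, fps_slow: float) -> float:
    """Advantage of the faster card over the slower one, as a percentage."""
    return (fps_fast / fps_slow - 1) * 100

gain_fhd = relative_gain(150, 130)  # ~15% gap at 1080p
gain_4k  = relative_gain(75, 55)    # ~36% gap at 4K

print(f"Gap at FHD: {gain_fhd:.0f}%")
print(f"Gap at 4K:  {gain_4k:.0f}%")
print(f"Shift:      {gain_4k - gain_fhd:.0f} percentage points")
```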
HybOj:

5700XT faster than the RTX 3070 and almost as fast as a 3080... AMD doesn't even need to release new cards; they beat current-gen Nvidia at half the price with one-generation-older tech.
At 1080p... at 4K it's like 30% slower than a 3070. No one is going to take these kinds of conclusions seriously. Clearly there is some kind of bottleneck at lower resolutions in this game that makes the 5700XT look competitive.
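One way to see why a slower card can look competitive at 1080p: assume, as a deliberately crude model, that each frame is limited by whichever of the CPU or GPU takes longer. The frame times below are illustrative assumptions, not measurements from this or any game.

```python
# Crude bottleneck model: a frame can't finish faster than the slower of CPU and GPU.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 12.0  # per-frame CPU cost, roughly resolution-independent (assumed)

# (resolution, fast-GPU frame time, ~30%-slower-GPU frame time) - all hypothetical
for res, fast_ms, slow_ms in [("1080p", 6.0, 8.5), ("4K", 16.0, 23.0)]:
    print(f"{res}: fast GPU {fps(CPU_MS, fast_ms):.0f} fps, slow GPU {fps(CPU_MS, slow_ms):.0f} fps")
# At 1080p both cards land on the same ~83 fps (CPU-bound); at 4K the real gap shows up.
```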
MonstroMart:

Did Nvidia pull a Pascal again? I had a 1070, and when the RTX 2000 series was released, performance "went down the toilet". Before, it was on par with a Vega 56; after the 2000 series launched it was often around 10% behind in newer titles. I mean, the 5700XT is almost on par with a 2080 Super at 2K...
https://tpucdn.com/review/nvidia-geforce-rtx-2080-super-founders-edition/images/battlefield-5-2560-1440.png There were examples of this at the 2080 Super's launch, so I'm not sure why it's surprising, or why it's "pulling a Pascal" again - whatever that means.
MonstroMart:

Did Nvidia pull a Pascal again? I had a 1070, and when the RTX 2000 series was released, performance "went down the toilet". Before, it was on par with a Vega 56; after the 2000 series launched it was often around 10% behind in newer titles. I mean, the 5700XT is almost on par with a 2080 Super at 2K...
Well, the 5700XT is beating both the 2080 Ti and the 3070 at 1080p, which should never happen. I don't think this is a case of Nvidia gimping anything; rather, it seems like Ubisoft developed the game around AMD and didn't consider Nvidia, which is rare and, in my opinion, a rather amusing and unexpected turnaround. Ubi really should have optimised better for Nvidia - it looks like they didn't even bother.
BReal85:

RTX 3080 is not a 1440p card. It's for 4K.
The one who decides what a card is good for is the user. For me, the 3080 delivers good fps at 1440p, which makes it a good 1440p card for me. The argument that it's not a 1440p card, when it can run that resolution at good fps, is just false.
willgart:

The drop in performance for the 3090, 3080 and 3070 is about the same between 1440p and 4K; specifically, the 3080 drops 31% and the 3070 drops 33%. Same quantity of RAM, not the same number of CUs, GDDR6X vs GDDR6... a 2% difference... so clearly the speed of the RAM has no impact, which is expected.
Good job proving that both cards are similarly balanced, with both being equally bottlenecked by their respective bandwidth. Complete garbage nonsense, fixed πŸ™‚
BReal85:

That doesn't mean anything. Some AMD-sponsored titles run better on NV and some NV-sponsored titles run better on AMD.
While this is absolutely true, it's pretty obvious something is "wrong" with Nvidia cards here, as the 5700XT is fairly handily matching same-generation cards in the $800+ range.
Wow, my 2070S is really lagging behind now lol, 2070S = 5600XT. Even the Vega 64 is almost as fast, wow.
This game runs great on my 1080 Ti at 1440p with everything maxed and G-SYNC enabled. It isn't a locked 60 fps by any means, but it plays silky smooth from the SSD installation. I've turned off framerate monitoring and found that I have been enjoying games far more without being distracted by the fact that a game isn't running at 60+ fps. It really is transformative for me! πŸ˜€ The game does look great, but perhaps isn't as impressive as the framerate hit suggests, especially when you consider how much better Horizon Zero Dawn looks and runs on this same PC.
AlexM:

The pics you posted are @ 4K, so yes, the 3080 will be just fine @ 1440p for a few years, but we will have to wait and see πŸ˜€
True, but it still holds - memory consumption only goes one way. And then there is the case of lightweight games, where you have plenty of GPU power to downsample, but that requires a lot more VRAM. Like here in COD Modern Warfare 2 Remastered, where I'm downsampling from 8K: https://i.imgur.com/OEzvxKM.jpg Not gonna be doing that with 10 GB of VRAM.
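Rough arithmetic on why downsampling from 8K is so VRAM-hungry: a single RGBA8 render target scales with pixel count, and a real frame keeps many such buffers alive (G-buffer layers, depth, HDR and post-process targets). The ten-buffer multiplier below is only an illustrative assumption, not an engine-accurate count.

```python
# Memory for one 4-bytes-per-pixel (RGBA8) render target at common resolutions.
def target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    single = target_mb(w, h)
    # "10 buffers" is an illustrative assumption for a full frame's worth of targets
    print(f"{name}: {single:.0f} MB per target, ~{single * 10 / 1024:.2f} GB for 10 such buffers")
```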