Guru3D.com


Assassins Creed: Valhalla graphics perf benchmark review 5

Posted by: Hilbert Hagedoorn on: 11/12/2020 02:32 PM [ 109 comment(s) ]

Assassin's Creed: Valhalla is a bit of a feast for the eyes, and we'll examine it from a PC graphics performance and PC gamer point of view. We test the game on the PC platform relative to graphics card performance with the latest AMD Radeon and NVIDIA GeForce graphics cards. A wide range of graphics cards is tested and benchmarked.


Dragam1337
Senior Member



Posts: 5545
Posted on: 11/12/2020 04:04 PM
Wonder what the Infinity Fabric combined with a wider bus could do. Sure, there's full 512-bit, but then it all gets complicated due to pricing and whatnot. Combining the two would have been interesting to see, although a 384-bit or maybe 448-bit bus would probably have been the ceiling if AMD were going that far for the enthusiast model, as I don't believe AMD has gone 512-bit since the ring-bus attempts with the 290 GPU model.

Add an HBM2E-type memory and, while we're dreaming about something that won't happen, the best case is that AMD maybe uses this for the professional GPU lineup, which I think is the rumored CDNA architecture replacing GCN, eventually.
A bit under a week until reviews of these, though, and some good, more demanding titles for testing them too. :D

Well, the only option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I would personally have expected at least the 6900 XT to get a 512-bit bus, given its 1k USD price point and its use of much cheaper GDDR6 memory. A 6900 XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind the 3090 at 4K.
AMD hasn't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't.

HBM2 usually fares worse than GDDR6 in games due to higher latency - latency is king in games, which is also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.
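As a sanity check on the bus-width discussion above: peak memory bandwidth is just the per-pin data rate times the bus width, divided by eight bits per byte. The data rates below (16 Gbps GDDR6, 19.5 Gbps GDDR6X on the 3090) are the commonly published figures; this is only a sketch of the arithmetic, not a claim about any specific board.

```python
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) times bus width (bits), over 8 bits/byte."""
    return data_rate_gbps * bus_width_bits / 8

# 16 Gbps GDDR6 on the 6900 XT's actual 256-bit bus
print(peak_bandwidth_gbs(16, 256))    # 512.0 GB/s
# the hypothetical 512-bit configuration discussed above
print(peak_bandwidth_gbs(16, 512))    # 1024.0 GB/s
# 19.5 Gbps GDDR6X on the 3090's 384-bit bus, for comparison
print(peak_bandwidth_gbs(19.5, 384))  # 936.0 GB/s
```

So a 512-bit GDDR6 bus would indeed land the raw bandwidth slightly above the 3090's, which is the crux of the argument.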

BoobZ
Member



Posts: 72
Posted on: 11/12/2020 04:09 PM
Holy, my 5600XT just gained some extra value :D

Sylwester Zarębski
Member



Posts: 36
Posted on: 11/12/2020 04:24 PM
Hilbert, could you check whether turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In previous AC games (Odyssey and Origins) it can work miracles - going from 35 to 45 fps (one step down) to 60 fps (two or three steps down) on my old 290X at FHD - it is most visible on weaker cards.

Netherwind
Senior Member



Posts: 8538
Posted on: 11/12/2020 04:33 PM
When I bought the 2080 Ti I thought it would be 4K capable, which it really wasn't, but I had high hopes for the 3080 being a true 4K card. Apparently it's not, at least not in Ubi games. W_D Legion would run at 4K but with reduced settings, and I could never get it locked at 60 fps.

I was sure that Valhalla would run like Odyssey, which after a few patches ran beautifully at 4K/60 with close to max settings on a 2080 Ti.

Hilbert, could you check whether turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In previous AC games (Odyssey and Origins) it can work miracles - going from 35 to 45 fps (one step down) to 60 fps (two or three steps down) on my old 290X at FHD - it is most visible on weaker cards.

I checked out another review where they said that clouds have very little impact in this iteration. There is one or two settings which are much heavier on the GPU.

willgart
Junior Member



Posts: 10
Posted on: 11/12/2020 04:41 PM
I love the guys claiming that GDDR6X or huge bandwidth is required.
The drop in performance between 1440p and 4K is about the same for the 3090, 3080 and 3070;
notably, the 3080 drops 31% and the 3070 drops 33%.
Same quantity of RAM, not the same number of CUs, GDDR6X vs GDDR6... a 2% difference... clearly the speed of the RAM has little impact, which is expected.

AMD made the right move going with GDDR6 instead of GDDR6X,

so we can expect to see the 6900 XT at the 3090's performance level, as expected.
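The scaling comparison in the post above boils down to simple arithmetic: express the 4K frame rate as a percentage loss relative to the 1440p frame rate, then compare cards. The frame rates below are purely illustrative placeholders, not measured values from the review.

```python
def percent_drop(fps_1440p: float, fps_4k: float) -> float:
    """Performance lost going from 1440p to 4K, as a percentage of the 1440p frame rate."""
    return (fps_1440p - fps_4k) / fps_1440p * 100

# illustrative numbers only: a card falling from 100 to 69 fps loses 31%
print(round(percent_drop(100, 69), 1))   # 31.0
# and one falling from 90 to 60.3 fps loses 33%
print(round(percent_drop(90, 60.3), 1))  # 33.0
```

If a GDDR6X card and a GDDR6 card show drops this close, the post's inference is that memory type is not the bottleneck at 4K in this title.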




Guru3D.com © 2023