Assassin's Creed: Valhalla graphics perf benchmark review
It's a bit of a feast for the eyes, and we'll check it out from a PC graphics performance and PC gamer perspective. We test the game on the PC platform relative to graphics card performance with the latest AMD Radeon and NVIDIA GeForce graphics cards, benchmarking a wide range of GPUs.
BoobZ
Member
Posts: 72
Posted on: 11/12/2020 04:09 PM
Holy, my 5600XT just gained some extra value

Sylwester Zarębski
Member
Posts: 36
Posted on: 11/12/2020 04:24 PM
Hilbert, could you check whether turning the Volumetric Clouds setting down a notch or two gives a massive performance improvement? In the previous AC games (Odyssey and Origins) it could work miracles - going from 35 to 45 fps (one step down) to 60 fps (two or three steps down) on my old 290X at FHD - and it is most visible on weaker cards.
Netherwind
Senior Member
Posts: 8538
Posted on: 11/12/2020 04:33 PM
When I bought the 2080 Ti I thought it would be 4K capable, which it really wasn't, but I had high hopes for the 3080 being a true 4K card. Apparently it's not, at least not in Ubisoft games. Watch Dogs: Legion would run at 4K but with reduced settings, and I could never get it locked at 60 fps.
I was sure that Valhalla would run like Odyssey, which after a few patches ran beautifully at 4K/60 with close to max settings on a 2080 Ti.
I checked out another review where they said that clouds have very little impact in this iteration. There are one or two settings that are much heavier on the GPU.
willgart
Junior Member
Posts: 10
Posted on: 11/12/2020 04:41 PM
I love the guys claiming that GDDR6X or huge bandwidth is required.
The drop in performance between 1440p and 4K is roughly the same for the 3090, 3080 and 3070.
Specifically, the 3080 drops 31% and the 3070 drops 33%.
Same quantity of RAM, not the same number of CUs, GDDR6X vs GDDR6... a 2% difference... clearly the speed of the RAM has no real impact, which is expected.
AMD made the right move going with GDDR6 and not GDDR6X,
so we can expect to see the 6900 XT at the 3090's performance level, as expected.
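For what it's worth, the resolution-scaling comparison made above is easy to redo from any review's fps charts. A minimal sketch in Python - the fps numbers below are placeholders for illustration, not figures from this review:

# Compare how much each card loses going from 1440p to 4K.
# A similar percentage drop across cards suggests none of them
# is disproportionately bandwidth-limited at 4K.

# Placeholder fps values - substitute the numbers from the
# benchmark charts you want to analyze.
results = {
    "RTX 3090 (GDDR6X)": {"1440p": 92, "4K": 63},
    "RTX 3080 (GDDR6X)": {"1440p": 87, "4K": 60},
    "RTX 3070 (GDDR6)":  {"1440p": 70, "4K": 47},
}

for card, fps in results.items():
    drop = (1 - fps["4K"] / fps["1440p"]) * 100
    print(f"{card}: {fps['1440p']} -> {fps['4K']} fps ({drop:.0f}% drop)")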
Senior Member
Posts: 5545
I wonder what Infinity Fabric plus a 300-bit or wider bus could do. There's full 512-bit, but that gets complicated due to pricing and whatnot; still, combining the two would have been interesting to see. A 384-bit or 448-bit bus might have been the ceiling if AMD were going that far for the enthusiast model, as I don't believe AMD has gone 512-bit since the ring-bus attempts and the 290 GPU.
Add HBM2E-type memory if we're wishing for something that won't happen. Best case, AMD uses that for the professional GPU lineup, which I think is the rumored CDNA architecture replacing GCN, eventually.
A bit under a week until reviews of these, though, and some good, more demanding titles for testing them too.
Well, the only option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I would personally have expected at least the 6900 XT to get a 512-bit bus, given its 1K USD price point and its use of much cheaper GDDR6 memory. A 6900 XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind it.
AMD hasn't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't.
HBM2 usually fares worse than GDDR6 in games due to higher latency - latency is king in games, as also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.
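As a back-of-the-envelope check on the bus-width discussion above: peak memory bandwidth is simply the bus width in bytes times the effective per-pin data rate. A quick sketch - the first two configurations are the cards' published specs, while the 512-bit entry is the hypothetical being argued for:

# Peak bandwidth (GB/s) = (bus width in bits / 8) * effective Gbps per pin
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

configs = {
    "RX 6900 XT (256-bit GDDR6 @ 16 Gbps)":  (256, 16.0),
    "RTX 3090 (384-bit GDDR6X @ 19.5 Gbps)": (384, 19.5),
    "Hypothetical 512-bit GDDR6 @ 16 Gbps":  (512, 16.0),
}

for name, (width, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.0f} GB/s")

Running this gives 512 GB/s for the 6900 XT, 936 GB/s for the 3090, and 1024 GB/s for the hypothetical 512-bit card, which is the gap the post above is weighing against cost.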