Review: Assassins Creed: Valhalla graphics performance benchmarks and analysis
It's a bit of a feast for the eyes, and we'll check it out in a PC graphics performance and PC gamer way. We test the game on the PC platform relative to graphics card performance with the latest AMD Radeon and NVIDIA GeForce graphics cards; close to 30 graphics cards are benchmarked.
Read the review here.
BoobZ
Member
Posts: 72
Joined: 2013-09-06
#5853827 Posted on: 11/12/2020 04:09 PM
Holy, my 5600XT just gained some extra value

Sylwester Zarębski
Member
Posts: 23
Joined: 2020-03-23
#5853831 Posted on: 11/12/2020 04:24 PM
Hilbert, could you check whether turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In previous AC games (Odyssey and Origins) it can work miracles - going from 35 to 45 fps (one step down) up to 60 fps (two or three steps down) on my old 290X at FHD - it is most visible on weaker cards.
Netherwind
Senior Member
Posts: 8010
Joined: 2009-11-13
#5853833 Posted on: 11/12/2020 04:33 PM
When I bought the 2080 Ti I thought it would be 4K capable, which it really wasn't, but I had high hopes for the 3080 being a true 4K card. Apparently it's not, at least not with Ubi games. W_D Legion would run at 4K, but only with reduced settings, and I could never get it locked at 60 fps.
I was sure that Valhalla would run like Odyssey, which after a few patches ran beautifully at 4K/60 with close to max settings on a 2080 Ti.
I checked out another review which said that clouds have very little impact in this iteration. There are one or two other settings that are much heavier on the GPU.
willgart
Junior Member
Posts: 10
Joined: 2019-10-02
#5853834 Posted on: 11/12/2020 04:41 PM
I love the guys claiming that GDDR6X or huge bandwidth is required.
The drop in performance between 1440p and 4K is about the same for the 3090, 3080 and 3070:
the 3080 drops 31% and the 3070 drops 33%.
Same quantity of RAM, not the same number of compute units, GDDR6X vs GDDR6... a 2% difference. For sure the speed of the RAM has next to no impact, which is expected.
AMD made the right move going with GDDR6 instead of GDDR6X,
so we can expect to see the 6900 XT at the 3090's performance level, as expected.
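The comparison in the post above boils down to simple arithmetic on the relative 1440p-to-4K drop. A minimal sketch: the fps figures below are illustrative placeholders (not measured values); only the ~31%/~33% drop figures come from the post.

```python
# Sketch of the scaling comparison above. The fps values are
# illustrative placeholders; only the idea of comparing relative
# 1440p -> 4K drops across cards comes from the post.

def pct_drop(fps_high_res: float, fps_low_res: float) -> float:
    """Percentage of performance lost moving to the higher resolution."""
    return (1 - fps_high_res / fps_low_res) * 100

# Hypothetical averages (fps) at (1440p, 4K):
cards = {
    "RTX 3080 (GDDR6X)": (100.0, 69.0),  # ~31% drop, per the post
    "RTX 3070 (GDDR6)":  (80.0, 53.6),   # ~33% drop, per the post
}

for name, (fps_1440p, fps_4k) in cards.items():
    print(f"{name}: {pct_drop(fps_4k, fps_1440p):.0f}% drop at 4K")
```

If the memory type mattered much at 4K, the GDDR6X card would be expected to show a noticeably smaller relative drop; near-identical percentages are what the poster reads as "RAM speed has next to no impact".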
Senior Member
Posts: 3970
Joined: 2017-11-23
Wonder what the infinity fabric plus a 300-bit or wider bus could do. There's full 512-bit, sure, but then it all gets complicated with pricing and whatnot. Combining the two would have been interesting to see, although 384-bit or 448-bit might have been the ceiling if AMD was going that far for the enthusiast model - I don't believe AMD has gone 512-bit since the ring-bus attempt with the 290 GPU.
Add HBM2E-type memory while we're dreaming about things that won't happen; best case, AMD uses that for the professional GPU lineup, which I think is the rumored CDNA architecture eventually replacing GCN.
A bit under a week until reviews of these, though, and some good, more demanding titles for testing them too.
Well, the only option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I would personally have expected at least the 6900 XT to get a 512-bit bus, given its 1k USD price point and use of much cheaper GDDR6 memory. A 6900 XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind it.
AMD hasn't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't.
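The bandwidth trade-off being debated here is straightforward: peak bandwidth is bus width (in bytes) times the per-pin data rate. A quick sketch using the commonly cited rates (16 Gbps GDDR6; 19.5 Gbps GDDR6X on the 3090):

```python
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 6900 XT as shipped: 256-bit GDDR6 @ 16 Gbps
print(bandwidth_gbps(256, 16.0))   # 512 GB/s
# Hypothetical 512-bit GDDR6 card, as proposed above
print(bandwidth_gbps(512, 16.0))   # 1024 GB/s
# RTX 3090: 384-bit GDDR6X @ 19.5 Gbps
print(bandwidth_gbps(384, 19.5))   # 936 GB/s
```

So a 512-bit GDDR6 design would indeed out-bandwidth the 3090's GDDR6X setup on paper, which is the basis of the argument - ignoring the Infinity Cache, board cost, and power implications.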
HBM2 usually fares worse than GDDR6 in games due to higher latency - latency is king in games, as also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.