Final words and conclusion
The year 2020 is proving to be an exciting time in the graphics card industry. New technologies like raytracing are becoming the norm for gaming. With Big Navi, aka the 6800 and 6900 series, AMD is back at the table with a deck of cards that supports the full DirectX 12 Ultimate feature set. Not just that, they've made some big bets with the architecture. The Infinity Cache definitely brings them where they need to be performance-wise. In that respect, AMD is back in the front row with a top-three performing graphics card.
Ultimately, it all comes down to price, performance, and, of course, rendering quality. The Radeon RX 6800 XT ticks the mandatory boxes, though we feel the 6800 offers more value for money. This card can run games at 4K; it will serve extremely well at WQHD, even with brutally GPU-bound games. At Full HD you'll quite often be bottlenecked and CPU-limited, but compared to NVIDIA's RTX 3080 and even the 3090, AMD's L3 cache brings exceptionally fast performance. Competition-wise, you're looking at RTX 3080 levels of performance from the 6800 XT overall. We can safely state that this is a true Quad HD graphics card that is very capable at Ultra HD in current games. AMD's biggest deficit, however, is that they do not have a solution at hand that matches NVIDIA's DLSS. The dedicated Tensor cores in the RTX series' hardware will always work out better for NVIDIA. Even with DirectML support pending, such upscaling still needs to run over the compute engine. So no matter how we look at it, it will cost performance, whereas NVIDIA can offload its ML supersampling to the Tensor cores; that is the sole reason those DL/AI cores were implemented in the first place.
What about Smart Access Memory?
You will have noticed that AMD introduced SAM, Smart Access Memory. It's a feature to keep an eye on, as the results have shown that IF it kicks in, it can boost your framerates significantly. Of the three games tested, it only kicked in for one, but extremely well; Assassin's Creed: Valhalla absolutely loves this feature. As explained, it does come with compromises: SAM requires that CSM support is turned off in the BIOS in order to enable Above 4G Decoding, which in turn allows Resizable BAR support (SAM) to be enabled. The problem is that if your Windows installation was set up in non-UEFI (legacy) mode, Windows will be unable to boot from your currently installed SSD/HDD once CSM is disabled. Many PCs are configured like that, and the only solution then is to disable CSM and reinstall Windows 10 to get this feature set supported.
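Before flipping that CSM switch, it's worth checking how your OS was actually booted. A minimal sketch in Python, assuming a Linux test bench (on a UEFI boot, the Linux kernel exposes the /sys/firmware/efi directory; under legacy/CSM boot it is absent):

```python
import os

def booted_via_uefi(efi_dir="/sys/firmware/efi"):
    """Return True when the OS was booted in UEFI mode.

    On Linux, the kernel creates /sys/firmware/efi only when the
    firmware handed over control via UEFI; under a legacy/CSM boot
    the directory does not exist.
    """
    return os.path.isdir(efi_dir)

if __name__ == "__main__":
    mode = "UEFI" if booted_via_uefi() else "legacy (CSM)"
    print(f"This system booted in {mode} mode")
```

On Windows, the equivalent information is shown by msinfo32 under "BIOS Mode" (UEFI vs. Legacy).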
Looking at raytracing, we have to admit that AMD performs reasonably at best, sometimes close to and in line with NVIDIA, but often at RTX 2080 Ti / RTX 3070 performance levels. We had hoped for a bit more; RT-wise you'll be playing at WQHD at best, and that is where ML supersampling could have benefited AMD. What helps them is the Infinity Cache, so overall AMD offers a fun first experience in hybrid raytracing, but we expected more. If we look at full path tracing, AMD lags behind significantly, as the competition is showing numbers nearly twice as fast.
Generic compute performance in applications like V-Ray shows a good boost, while OpenGL performance is lagging behind. AMD shines in D3D12 and Vulkan, less so in professional workloads.
Cooling & noise levels
In extremely stressed conditions, we hit close to 40 dBA, though it took a while for the card to get there (it warms up slowly); that is considered a normal acoustic level. Depending on the airflow inside your chassis, expect the card to sit in the 70 to 75 degrees C range under hefty load conditions. As FLIR imaging shows, the card's top and bottom sides exhibit only minor heat bleeding. Overall, we're very comfortable with what we observe.
In the previous paragraph, I already touched on this: heat output and energy consumption are always closely related, as for (graphics) processors power in and heat out can be perceived as a 1:1 relationship; 100 watts in (consumption) roughly equals 100 watts of heat output. This is the basis of TDP. AMD lists the flagship products at 250 to 300 watts, which is okay for a graphics card at this performance level in the year 2020. We measured numbers close to the advertised values for the XT; in fact, it was spot on 300 watts.
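That 1:1 relationship makes back-of-the-envelope cooling math easy. A small illustrative Python snippet (the 3.412 BTU/hr-per-watt conversion factor is the standard one; the 250 and 300 W figures are the board power values quoted above):

```python
BTU_PER_HR_PER_WATT = 3.412  # 1 W dissipated ~= 3.412 BTU/hr of heat

def heat_output_btu_hr(board_power_watts: float) -> float:
    """Nearly all electrical power a GPU draws ends up as heat,
    so heat output (BTU/hr) is simply watts times the conversion factor."""
    return board_power_watts * BTU_PER_HR_PER_WATT

if __name__ == "__main__":
    for watts in (250, 300):
        print(f"{watts} W board power ~ {heat_output_btu_hr(watts):.0f} BTU/hr")
```

A 300 W card therefore dumps roughly 1024 BTU/hr into your chassis, which is what your case airflow (and, indirectly, your room) has to deal with.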
The reference cards do exhibit coil squeal. Is it annoying? It's at a level you can hear. In a closed chassis, the noise would fade into the background, but with an open chassis you can clearly hear the coil whine/squeal. All graphics cards produce this in some form, and it is especially perceptible at high framerates.
Limited availability and nauseating price levels for graphics cards these days are becoming bothersome. Even for us true hardcore PC gamers, it's getting more and more difficult to explain why people should put down this much money to be able to play computer games on a PC. The spread is $999 for the 6900 XT, $649 for the 6800 XT, and $579 for the 6800, the cheapest model. Sure, you can game at Ultra HD and get 16 GB of GDDR6 memory, but with the new consoles from Microsoft and Sony in mind, sitting at the $500 marker for GPU, CPU, storage, and, hey, the console as a package, we grow more and more uncomfortable with the price levels of graphics cards. We do expect AIB cards to be more expensive, as that has been the trend as of late. We'll have to wait and see how that pans out, though, as everything is dependent on the actual volume availability of these cards.
Tweaking Big Navi GPUs has been a bit of a challenge; sometimes confusing, other times easy. AMD again enforces a cap on the memory; we don't like that, as we feel we could have gone a notch further. Tweaks to the clock frequency and memory run fine, but the resulting performance was often lower than at defaults.
For the RX 6800 series, we'd expect you to reach a ~17 Gbps effective memory data rate. Of course, increase the power limiter to the max so your GPU gets (even) more energy budget. Why this huge differential, you might wonder? Well, you can clock the card even at 3 GHz, but when the power limiter kicks in, it will always bring the frequency down to match your power budget. Results will vary per board, brand, and even card due to cooling (GDDR6/GPU/VRM). Frequency matters less these days, as your power limiter is the decisive and dominant factor, lowering the clock frequency to meet its assigned power budget. That said, we reached 2450 MHz, and expect some boards to hit 2500 MHz. But all that tweaking and extra energy consumption will bring you a maximum of ~5% extra performance at best.
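To put that memory figure in perspective: peak GDDR6 bandwidth is simply the per-pin data rate times the bus width. A quick sketch (the 256-bit bus is the RX 6800 series' memory interface width; 16 Gbps is the stock GDDR6 data rate and 17 Gbps the tweaked rate mentioned above):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) times
    bus width (bits), divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

if __name__ == "__main__":
    stock   = memory_bandwidth_gbs(16.0, 256)  # RX 6800 stock: 16 Gbps GDDR6
    tweaked = memory_bandwidth_gbs(17.0, 256)  # after a memory overclock
    print(f"stock:   {stock:.0f} GB/s")    # 512 GB/s
    print(f"tweaked: {tweaked:.0f} GB/s")  # 544 GB/s
```

So a memory tweak from 16 to 17 Gbps buys roughly 32 GB/s of extra raw bandwidth, which is in line with the modest ~5% real-world gain observed.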
These are interesting times if you're on the lookout for a high-performance graphics card. It is safe to say that many did not expect the new Radeon 6800 series to be this fast with the good old rasterizer shading engine. Big Navi, and thus RDNA2 as an architecture, is impressive on many levels. It now offers raytracing support as well, a feature with which AMD can compete with the RTX 30 series, albeit at reasonable performance at best. And yes, enabling raytracing in Battlefield V for the first time with an AMD card was a bit of a special feeling, alright. If you're into raytrace-assisted games, however, the loss for AMD is the lack of some sort of hardware-accelerated AI supersampling. AMD might add ML supersampling at some point, but it will always run over the compute engine, and that will lower performance, as that engine is already in use and the workload is shared. NVIDIA has added dedicated hardware in the form of Tensor cores to offload the render engine, and that's a win, any time. As time passes, the discussion on raytracing and technologies like DLSS/MLSS is getting more extreme. When comparing the latest games, shaded or raytraced, most people find they can hardly see any distinction. See, for the past decade or two, rasterization and shading have become extremely good and efficient at what they do, and therein lies a dilemma. So if you don't care that much about RT/MLSS/DLSS, then AMD has a properly good proposition at hand. Big Navi delivers, that's a sure thing. Games show fantastic framerates on your monitor all the way up to Ultra HD. However, as I have stated many times now, we feel that graphics cards overall are too expensive. The cheapest 6800 costs $579; for less money you can purchase the new premium version of the Xbox. This is a good product with a proper amount of graphics memory compared to what NVIDIA is offering with the RTX 3070 and 3080.
With future games in mind, this will work to your advantage, as you can quite easily play Ultra HD games without running into VRAM limitations anytime soon. Smart Access Memory helps out here as well, but as stated, that feature requires CSM to be disabled (UEFI mode) in your BIOS. So yeah, $579 for the 6800 and $649 for the 6800 XT. We expect AIBs (board partners) to offer products slightly higher in price, with more premium features in cooling and RGB. Overall, the product performs better than we initially expected at the lower resolutions and offers expected performance (in between the RTX 3080/3090) at Ultra HD when playing good old-fashioned rasterizer engine games. If you are less interested in what raytracing and DLSS bring to the table, then the 6800 XT with its 16 GB of graphics memory, at $649, is an easily recommended choice over the $699 GeForce RTX 3080. But if that is reversed for you and you want maximum hybrid raytracing performance, then AMD's miss is the lack of Tensor-core-like hardware and a DLSS equivalent. Choices, choices; they are never easy to make.
- Hilbert, LOAD"*",8,1.