Review average of 17 websites shows 6800 XT to be 7.4% Slower than GeForce RTX 3080

Do these tests show the poor DXR performance as well? I have the impression that my OC'd 2080 Ti will be good enough for the next three years.
Michal Turlik 21:

Do these tests show the poor DXR performance as well? I have the impression that my OC'd 2080 Ti will be good enough for the next three years.
They kind of pit the DXR performance of RDNA2 at native resolution against Turing, which used DLSS and thus actually rendered at lower than native resolution. So you would have to go through the individual tests and see where sites used DLSS and where they had the sense to compare actual brute force. For me, neither DLSS nor AMD's upscaling is the way to go.
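For context on the native-vs-DLSS point above, here is a minimal sketch of the internal render resolutions DLSS typically works from. The per-axis scale factors are the commonly cited ones for DLSS 2.x modes and are an assumption here; exact values can vary per game.

```python
# Commonly cited per-axis scale factors for DLSS 2.x modes (assumed values;
# individual titles can differ).
DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(output_w, output_h, mode):
    # DLSS renders at a reduced internal resolution, then reconstructs the output.
    s = DLSS_MODES[mode]
    return round(output_w * s), round(output_h * s)

for mode in DLSS_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K output, {mode}: rendered internally at about {w}x{h}")
# Quality mode at a 4K output renders roughly 1440p's worth of pixels, which is
# why putting DLSS numbers next to native-resolution DXR results is not a
# like-for-like comparison.
```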
I look forward to buying either an Nvidia or AMD card when they go on sale to the general public in 7/2021. Can't wait to buy one then.
And most of those games are probably Nvidia titles too. I'm not talking about GameWorks and all that fuzz, just that a lot of games are not optimized for AMD GPUs as much as for Nvidia, due to market share, fanboyism, and moronic developer issues. As someone who is a 1440p UW user, 4K is useless and trash anyway. 3440x1440 > 3840x2160 any day.
Agonist:

And most of those games are probably Nvidia titles too. I'm not talking about GameWorks and all that fuzz, just that a lot of games are not optimized for AMD GPUs as much as for Nvidia, due to market share, fanboyism, and moronic developer issues. As someone who is a 1440p UW user, 4K is useless and trash anyway. 3440x1440 > 3840x2160 any day.
So the lower resolution is superior... great logic...
Dragam1337:

So the lower resolution is superior... great logic...
It is, in the sense that it can run more fluidly. People can have 144 Hz+ 1440p screens and get an appropriate frame rate; at 4K, not so much. But I guess you have a 144 Hz 4K screen and all games run at 100 fps+ on your 3090, right? Well, no. Better luck next time. But before you try next time, maybe look at the history of so-called 4K graphics cards, their ability to deliver and their longevity. 4K is a dream which costs much, much more than the 1440p reality.
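As a rough illustration of the fluidity argument, here is the raw pixel-count math between the resolutions being argued about. The frame-rate implication assumes a purely GPU-bound, pixel-limited workload, which is only an approximation.

```python
# Pixel counts of the resolutions under discussion.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "3440x1440 UW": (3440, 1440),
    "4K": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
base = pixels["1440p"]
for name, p in pixels.items():
    print(f"{name}: {p / 1e6:.2f} MP, {p / base:.2f}x the pixels of 1440p")
# 4K pushes ~2.25x the pixels of 1440p, so a card delivering ~140 fps at 1440p
# lands around ~60 fps at 4K if the game is purely pixel/shader limited.
```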
Yeah, and having just one boob to deal with is better as well, you only need to use one hand lmao. Yeah right.
I feel like people are jumping the gun on RT performance. Why not wait for more games? Sure, Nvidia had a head start, but look at Dirt 5; something tells me most games coming off the Xbox/Microsoft pipeline will perform closer to that. Sure, Codemasters had AMD watching over their shoulder, but it's not like they completely disregarded Nvidia's RT; no dev in their right mind would do that (discard 90% of users). If anything, this is a huge opportunity for AMD. Nvidia has the vast majority of the GPU market, but the RT-capable GPU market is just getting started. If AMD manages to get a nice chunk of it, devs will have to optimize for it just as much, unlike what happens with their current 15% market share on normal GPUs that hardly any dev cares about.
Fox2232:

It is, in the sense that it can run more fluidly. People can have 144 Hz+ 1440p screens and get an appropriate frame rate; at 4K, not so much. But I guess you have a 144 Hz 4K screen and all games run at 100 fps+ on your 3090, right? Well, no. Better luck next time. But before you try next time, maybe look at the history of so-called 4K graphics cards, their ability to deliver and their longevity. 4K is a dream which costs much, much more than the 1440p reality.
Depends what your priority is. If your priority is fluidity, then a 1080p 360 Hz screen is the best. However, if your priority is image quality, then 4K is the way to go - and it just so happens that is the case for me. No, I have possibly the best 60 Hz screen made - the XB321HK. I run it with G-Sync and a frame cap at 58 fps - it runs super smooth and gives me superb image quality. My 3090 delivers excellently, as did my 2080 Ti. 4K only costs too much if you can't afford it.
Kaarme:

From what I've seen, Nvidia usually has pretty good and optimised drivers right from the beginning, apart from the occasional initial bug-like problem, such as the one that caused the whole capacitor spectacle with the 3000 series. AMD, however, seems to require months to figure out how its own hardware works and to optimise the drivers. I wonder what would have happened if AMD had invested in a wider memory bus. The 128 MB miracle cache was supposed to help, but it's precisely at 4K where the 6800 (XT) seems to be lagging. Does the cache fail to deliver at the higher res? I'd like to imagine AMD tested the whole thing somehow during development. Or maybe the reason is elsewhere.
Excellent points. Indeed, I think the bandwidth issue may be a perfect target for the refresh, which may also come sooner than anyone thinks. Apple is really working out the 5 nm bugs with the M1 at TSMC.
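On the memory bus point, a quick back-of-the-envelope comparison of raw bandwidth on the reference cards, using the published specs (256-bit GDDR6 at 16 Gbps for the 6800 XT, 320-bit GDDR6X at 19 Gbps for the 3080); the comment about cache behaviour at 4K is speculation consistent with the thread, not a measured figure.

```python
# Raw memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RX 6800 XT (256-bit GDDR6 @ 16 Gbps)": (256, 16),
    "RTX 3080 (320-bit GDDR6X @ 19 Gbps)": (320, 19),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# 512 GB/s vs 760 GB/s raw. The 128 MB Infinity Cache is meant to bridge that
# gap, but if its hit rate drops as the working set grows at 4K, that would fit
# the pattern of the 6800 XT falling off more at 2160p.
```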
Kosmoz:

Except Guru3D, I don't care about or trust any of those reviews/reviewers. And above Guru3D, the ones I trust most are Hardware Unboxed and Gamers Nexus, in that order. Both show how AMD beats Nvidia at 1080p and 1440p in most games (some with a huge difference in favor of AMD), which is what the majority of people still use and play at, not 4K.
Lmao, for sure people buy these cards to play at 1080p. Here's another untrusted review for you: at 1440p the 6800 XT surpasses the 3080 in 7 out of 23 games tested, and with RT on it's at 3070 level and below. https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/7.html
The AMD card also offers insane overclocks compared to the non-existent Ampere OC capabilities, which about evens the field. Reference 6800 XT cards hit 2600+ MHz clocks, while Ampere basically had no headroom.
Kosmoz:

Except Guru3D, I don't care about or trust any of those reviews/reviewers. And above Guru3D, the ones I trust most are Hardware Unboxed and Gamers Nexus, in that order. Both show how AMD beats Nvidia at 1080p and 1440p in most games (some with a huge difference in favor of AMD), which is what the majority of people still use and play at, not 4K.
Pretty much the only 3 websites/channels I check too. Personally I have a 3K monitor now, but nobody puts that resolution in reviews 😉
Fox2232:

They kind of pit the DXR performance of RDNA2 at native resolution against Turing, which used DLSS
No, the DXR tests are done with DLSS off to keep things fair.
Fox2232:

Then someone can come and say: the 6800 XT is 7.2% cheaper and eats 12% less energy on the reference design. One had better not look at AIB cards, then. Other differences do not even need to be mentioned, like the performance balance at 1080p vs 1440p vs 4K. The most relevant are the 1440p results. 4K is almost irrelevant, and 1080p too, as very few people will pair a 6800 (XT) with a 1080p screen.
4K may be irrelevant to you because you don't have a 4K monitor, but I do, and for me all the other resolutions are irrelevant. And frankly, if you're searching for 1440p performance, then looking any higher than the 6800 or the RTX 3070 is a waste of money. So please don't make broad generalizations that so far seem to apply only to you.
The performance difference is unfortunate; it puts the 6800 XT out of the running for me, not that you can get one anyway. But the lack of availability put the 3080 out as well. So I bought a 3090 today (without even going hunting, they are just in stock in lots of places) and I feel so glad I don't have to worry about anything for 4 years. Phew 🙂
StevieSleep:

4K may be irrelevant to you because you don't have a 4K monitor, but I do, and for me all the other resolutions are irrelevant. And frankly, if you're searching for 1440p performance, then looking any higher than the 6800 or the RTX 3070 is a waste of money. So please don't make broad generalizations that so far seem to apply only to you.
Lovely bedtime story... with a GTX 1080 running 4K. I bet those new games really run well at 4K. Higher resolution supplements AA, but the cost of running reduced details (missing effects) and losing actual per-pixel precision is not worth it, especially since the other option is a crappy framerate. It is simple: you have a GTX 1080. When did you buy it? Then we can look at how long that card delivered 60 fps+ at 4K. But today, your dreamy bubble of 4K has burst into 20~40 fps on average in most recent AAA games. The sacrifices you have to make to keep 60 fps, versus the sacrifices someone makes at 1440p (with the same HW), result in much worse image quality on your side. You could just as well play some of those heavier games at 1080p with 2:1 pixel scaling; they would look better that way and run better too. And when you even remotely start thinking about the future of DXR... Top cards may have trouble keeping 60 fps+ at 1440p a year after they are out. 4K people can only pray that fake pixels will come to the rescue.
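On the 2:1 pixel scaling aside, a minimal sketch of why 1080p maps cleanly onto a 4K panel: each rendered pixel covers an exact 2x2 block of panel pixels, so no interpolation is needed (assuming the display or GPU driver supports integer scaling).

```python
# 1080p -> 4K integer scaling: each source pixel becomes an exact 2x2 block.
src_w, src_h = 1920, 1080
dst_w, dst_h = 3840, 2160

scale_x, scale_y = dst_w / src_w, dst_h / src_h
print(f"scale factors: {scale_x} x {scale_y}")                     # 2.0 x 2.0
print(f"exact fit: {dst_w % src_w == 0 and dst_h % src_h == 0}")   # True
# Both factors are whole numbers, so upscaling is pure pixel duplication
# (nearest neighbour) with no blur, unlike 1440p on a 4K panel (factor 1.5),
# which forces interpolation.
```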
The 6800 XT is 7.4% slower than the 3080, on Windows 😉 I'm most likely going to get a 6700; the 6000 series is the only obvious choice for Linux users right now. The Linux drivers are so much better optimized (especially for compute tasks, where the 6800 can often outperform the 3080). These GPUs suck at raytracing, but I don't know if I'm going to be able to take advantage of that any time soon anyway, so it's not much of a loss to me. Even if I were a Windows gamer, I wouldn't be dumping any money into raytracing anyway, since the technology is going to need another couple of years of maturity. As I've said before, it reminds me a lot of the early days of tessellation.
4K for 1% of gamers... sure, valid. For that 1%.
Fox2232:

Lovely bedtime story... with a GTX 1080 running 4K. I bet those new games really run well at 4K. Higher resolution supplements AA, but the cost of running reduced details (missing effects) and losing actual per-pixel precision is not worth it, especially since the other option is a crappy framerate. It is simple: you have a GTX 1080. When did you buy it? Then we can look at how long that card delivered 60 fps+ at 4K. But today, your dreamy bubble of 4K has burst into 20~40 fps on average in most recent AAA games. The sacrifices you have to make to keep 60 fps, versus the sacrifices someone makes at 1440p (with the same HW), result in much worse image quality on your side. You could just as well play some of those heavier games at 1080p with 2:1 pixel scaling; they would look better that way and run better too. And when you even remotely start thinking about the future of DXR... Top cards may have trouble keeping 60 fps+ at 1440p a year after they are out. 4K people can only pray that fake pixels will come to the rescue.
Plenty of games out there that even potatoes like a 1080/2070/5700 can play at 4K. Even better if they have DLSS.