Benchmark Results: Radeon RX 6800 XT Shows Good RT Scores and Excellent Time Spy GPU Score

Dragam1337:

No, no and no. Resolution and Hz are pushing the need for more GPU power, not raytracing. I don't know anyone who actually uses raytracing, as it's just a waste of GPU power. But I know a lot of people who need faster GPUs to push 240 Hz or 4K.
This. I'm waiting for the 6900 XT, or to get a 3080. I know I'm swapping my 3900X for a 5900X, and I've already ordered a 240 Hz 1440p screen; I want 200+ fps at 1440p. I don't even care about max detail, as I play mostly FPS games. RT performance isn't a game changer for me; I'll play single-player games with RT, sure, but otherwise it'll be turned off.
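As a rough sense of scale for those targets, here is a back-of-envelope sketch; raw pixel rate is only a crude proxy for GPU load (per-pixel shading cost varies wildly), and the figures are simple arithmetic, not benchmarks:

```python
# Raw pixel throughput for the display targets mentioned in the thread.
targets = {
    "1440p @ 240 Hz": (2560, 1440, 240),
    "4K @ 60 Hz":     (3840, 2160, 60),
    "1080p @ 360 Hz": (1920, 1080, 360),
}
for name, (w, h, hz) in targets.items():
    print(f"{name}: {w * h * hz / 1e6:,.0f} Mpixels/s")
# 1440p @ 240 Hz: 885 Mpixels/s
# 4K @ 60 Hz:     498 Mpixels/s
# 1080p @ 360 Hz: 746 Mpixels/s
```

High-refresh 1440p actually demands more raw pixel throughput than 4K at 60 Hz, which is why the "resolution and Hz" camp keeps asking for faster GPUs.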
Dragam1337:

No, no and no. Resolution and Hz are pushing the need for more GPU power, not raytracing. I don't know anyone who actually uses raytracing, as it's just a waste of GPU power. But I know a lot of people who need faster GPUs to push 240 Hz or 4K.
Raytracing is the future of graphics. It's always been what graphics were striving toward. We are not there yet, but we're at the point where it's sort of the same as when you had cards with or without hardware T&L.
AlmondMan:

Raytracing is the future of graphics. It's always been what graphics were striving toward. We are not there yet, but we're at the point where it's sort of the same as when you had cards with or without hardware T&L.
It bears every resemblance to a gimmick - it barely does anything for visuals... slightly fancier reflections for a 50% performance hit. Yeah, no thanks. GPUs can barely keep up with traditional rasterization demands of Hz and resolution, let alone have enough power to waste on slightly prettier reflections. For GPUs to run fully raytraced games in real time (properly raytraced like movies, not just this reflection-gimmick crap), they would have to be 50 times more powerful than what we currently have. But then resolution and Hz demands will continue to increase, as will overall demands from games, so I don't think that true raytracing will become viable... ever.
0blivious:

Just to chime in, I also couldn't care less about raytracing abilities (or DLSS, for that matter). All things being equal, sure, I'll take it. When buying, RT is so far down the list of things I might care about on a GPU that it's completely irrelevant to my purchase decision. It keeps getting brought up and I keep not caring. It would affect maybe 5% of my playtime, if even that much. Nothing I've seen (so far) makes its functionality a deal breaker for me. Nvidia wants me to care but I just don't. Maybe I'll care in 2025, but not now.
Yeah, same here. RT is a feature I would use in a game to see how it looks, then promptly turn off to get the frame rate back. DLSS is another tech where I keep seeing people say it's a deal breaker that AMD doesn't have it, but how many games actually use it? How many upcoming titles will include it? Watch Dogs Legion and Cyberpunk, whenever that gets released, are the two newer standout titles; anything else? The technology is fine, but it's only really useful for players if it gets included in more games.
Dragam1337:

It bears every resemblance to a gimmick - it barely does anything for visuals... slightly fancier reflections for a 50% performance hit. Yeah, no thanks. GPUs can barely keep up with traditional rasterization demands of Hz and resolution, let alone have enough power to waste on slightly prettier reflections. For GPUs to run fully raytraced games in real time (properly raytraced like movies, not just this reflection-gimmick crap), they would have to be 50 times more powerful than what we currently have. But then resolution and Hz demands will continue to increase, as will overall demands from games, so I don't think that true raytracing will become viable... ever.
That's a really... I don't know what to say here, uninformed, I guess, take on what RT is. I don't think you have taken even a quick look at what it does for graphical fidelity. It's so far beyond "slightly fancier reflections" that even saying something like that is borderline absurd.

GPUs can easily keep up with traditional requirements, though. The 1080p segment, previously something that was really high-end, is now almost laughably easy to run games at. The most popular games in the world especially are easily run at this resolution by almost any computer. Hell, it's so easy that your mobile phone can run those games at 60 fps. I don't know the stats for higher-than-60 Hz displays, but I'm pretty sure it's still somewhat uncommon to have one. 60 Hz is still the gold standard; even if it seems old, there's no real standard beyond it. At one point it looked like 120 might be the next thing, but panel makers just don't make anything that's a mass-market standard above 60 Hz. So maxing out the mainstream display resolution and frequency is easily attained by modern hardware.

A fully raytraced game will not require rasterization - the traditional GPU will be useless in such a title. You'll have a point in time where it's like "this requires an RT-only GPU", just as when Quake 3 came out and required a graphics accelerator. And to get to that point, the hybrid solutions of today are a necessary step.
Dragam1337:

No, no and no. Resolution and Hz are pushing the need for more GPU power, not raytracing. I don't know anyone who actually uses raytracing, as it's just a waste of GPU power. But I know a lot of people who need faster GPUs to push 240 Hz or 4K.
Actually, yes, somewhat. Resolution and Hz are absolutely two of the major contributors to needing more GPU power, but (especially for AMD) RT is too. I say "especially for AMD" because, IIRC, they don't have dedicated hardware for it, so you basically have to trade rendering performance for RT calculations. Just because you don't know anyone who wants/needs such things doesn't make it invalid. I don't know a single person who wants or has used fentanyl, even though my state has one of the highest rates of use. Like it or not, RT is here to stay, and it is only going to get more common.
Dragam1337:

It bears every resemblance to a gimmick - it barely does anything for visuals... slightly fancier reflections for a 50% performance hit. Yeah, no thanks.
You're either immensely exaggerating or don't actually know enough about it. The use of RT for things like reflections or shiny surfaces is very much overkill and overly taxing for an otherwise relatively simple effect, and annoyingly, that's mostly all we've been seeing. But if you think that's all RT is good for, you must not know much about computer graphics, because real-time RT is basically the holy grail of realistic rendering. RT is the reason CGI in blockbuster movies takes so damn long to render, and it is also the reason that CGI can often look so realistic. RT is objectively a big deal; it just doesn't seem that way when the most common examples of it involve things you don't need RT for at all (puddles, shadows, shiny surfaces, etc). The toy sketch after this post shows why the full recursive version gets so expensive.
I don't think that true raytracing will become viable... ever.
We as a community should hold you to that.
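For a concrete sense of why "properly raytraced like movies" is so expensive, here is a minimal toy sketch of Whitted-style recursive ray tracing over a hypothetical one-sphere scene (illustrative only, not how any shipping engine is structured): every pixel fires a ray, and every mirror hit fires another, so cost scales with resolution, refresh rate, and bounce depth.

```python
import numpy as np

WIDTH, HEIGHT, MAX_BOUNCES = 160, 120, 3
SPHERE_C, SPHERE_R = np.array([0.0, 0.0, -3.0]), 1.0  # toy mirror sphere

def hit_sphere(origin, direction, center, radius):
    """Distance to the nearest sphere hit along the ray, or None."""
    oc = origin - center
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None  # small epsilon avoids self-intersection

def trace(origin, direction, depth=0):
    """Shade one ray; every mirror hit recursively spawns another ray."""
    if depth > MAX_BOUNCES:
        return np.zeros(3)
    t = hit_sphere(origin, direction, SPHERE_C, SPHERE_R)
    if t is not None:
        hit = origin + t * direction
        normal = (hit - SPHERE_C) / SPHERE_R
        # Mirror reflection R = D - 2(D.N)N -- the same vector an env map
        # would look up in a baked texture instead of tracing for real.
        reflected = direction - 2.0 * np.dot(direction, normal) * normal
        return 0.8 * trace(hit, reflected, depth + 1)
    sky = 0.5 * (direction[1] + 1.0)  # simple gradient on a miss
    return (1.0 - sky) * np.ones(3) + sky * np.array([0.5, 0.7, 1.0])

image = np.zeros((HEIGHT, WIDTH, 3))
for y in range(HEIGHT):
    for x in range(WIDTH):
        d = np.array([(x / WIDTH - 0.5) * 1.6, (0.5 - y / HEIGHT) * 1.2, -1.0])
        image[y, x] = trace(np.zeros(3), d / np.linalg.norm(d))
# Even this 160x120 toy fires ~19k primary rays per frame; at 4K that is
# ~8.3 million, before any extra bounces, shadows, or samples per pixel.
```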
Dragam1337:

It bears every resemblance to a gimmick - it barely does anything for visuals... slightly fancier reflections for a 50% performance hit. Yeah, no thanks. GPUs can barely keep up with traditional rasterization demands of Hz and resolution, let alone have enough power to waste on slightly prettier reflections. For GPUs to run fully raytraced games in real time (properly raytraced like movies, not just this reflection-gimmick crap), they would have to be 50 times more powerful than what we currently have. But then resolution and Hz demands will continue to increase, as will overall demands from games, so I don't think that true raytracing will become viable... ever.
It isn't a gimmick. Reflections have always been in games; they were emulated first with environment textures, then in shaders, because we couldn't do proper RT. The whole shading world advanced because we were aiming at better lighting in general. Now we are switching to a different implementation, and while I disagree that shading won't be useful anymore in the future, there will be some generations of cards with a foot in both worlds, like these. The sketch below contrasts the two approaches.
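To make the contrast concrete, here is a small sketch of the shader-era trick described above, with illustrative names rather than any real engine API: both approaches compute the same reflection vector, but an environment map resolves it with one baked-texture lookup instead of tracing a new ray into the scene.

```python
import numpy as np

def reflect(direction, normal):
    """Mirror reflection R = D - 2(D.N)N -- identical in both approaches."""
    return direction - 2.0 * np.dot(direction, normal) * normal

def sample_env_map(env_map, direction):
    """Shader-era reflection: index a prebaked equirectangular texture by
    direction. One cheap lookup, but blind to nearby or moving objects."""
    h, w, _ = env_map.shape
    d = direction / np.linalg.norm(direction)
    u = 0.5 + np.arctan2(d[0], -d[2]) / (2.0 * np.pi)      # longitude -> [0, 1]
    v = 0.5 - np.arcsin(np.clip(d[1], -1.0, 1.0)) / np.pi  # latitude  -> [0, 1]
    return env_map[int(v * (h - 1)), int(u * (w - 1))]

env = np.random.rand(64, 128, 3)   # stand-in for a baked environment texture
view = np.array([0.0, -0.3, -1.0]); view /= np.linalg.norm(view)
up = np.array([0.0, 1.0, 0.0])     # surface normal of a flat floor
colour = sample_env_map(env, reflect(view, up))
# The RT version would instead feed reflect(...) back into a trace() call
# like the sketch above -- correct for any scene, but at one ray per pixel.
```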
AlmondMan:

Raytracing is the future of graphics. It's always been what graphics were striving toward. We are not there yet, but we're at the point where it's sort of the same as when you had cards with or without hardware T&L.
Wake me up when there is a GPU that can do pure raytracing at 720p and deliver semi-photorealistic gameplay at 60+ fps. Hell, ring a bell when it can do 540p 30 fps... DVD-quality images.
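For a sense of the bar that sets, some back-of-envelope arithmetic; the samples-per-pixel and bounce counts are assumed, illustrative figures (film renderers often use far more), not measurements:

```python
def rays_per_second(width, height, fps, samples_per_pixel, bounces):
    # One camera ray per sample plus roughly one more ray per bounce.
    return width * height * fps * samples_per_pixel * (1 + bounces)

for label, (w, h, fps) in {"720p @ 60 fps": (1280, 720, 60),
                           "540p @ 30 fps": (960, 540, 30)}.items():
    print(f"{label}: {rays_per_second(w, h, fps, 16, 4):,} rays/s")
# 720p @ 60 fps: 4,423,680,000 rays/s
# 540p @ 30 fps: 1,244,160,000 rays/s
```

Even the 540p 30 fps target lands above a billion rays per second at a modest 16 samples per pixel, which is why "pure" ray tracing remains a tall order.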
Jumbik:

Now we just need to know when the "super resolution" tech will arrive and how it will work; supposedly it offers results similar to DLSS, hopefully.
That would be insane for the consoles, even more so if the same implementation is used when games are ported to PC.
AlmondMan:

That's a really... I don't know what to say here, uninformed, I guess, take on what RT is. I don't think you have taken even a quick look at what it does for graphical fidelity. It's so far beyond "slightly fancier reflections" that even saying something like that is borderline absurd.
There is a big difference between what it CAN do and what it actually does in current games - show me a game where it does anything other than slightly improving reflections or shadows, and at a massive performance penalty. Sure, raytracing can create completely lifelike visuals, as seen in the fully raytraced scenes used for CGI in movies. But it's so absurdly performance-heavy that it won't ever be able to run in real time.
GPUs can easily keep up with traditional requirements, though. The 1080p segment, previously something that was really high-end, is now almost laughably easy to run games at. The most popular games in the world especially are easily run at this resolution by almost any computer. Hell, it's so easy that your mobile phone can run those games at 60 fps. I don't know the stats for higher-than-60 Hz displays, but I'm pretty sure it's still somewhat uncommon to have one. 60 Hz is still the gold standard; even if it seems old, there's no real standard beyond it. At one point it looked like 120 might be the next thing, but panel makers just don't make anything that's a mass-market standard above 60 Hz. So maxing out the mainstream display resolution and frequency is easily attained by modern hardware.
There is absolutely no one who buys top-end GPUs to run 1080p at 60 Hz. If they are running 1080p, it's at 240 or 360 Hz. More likely it's 1440p at 144 Hz or higher - or, as in my case, 4K. And no, GPUs can't easily keep up there; even the 3090 will struggle at 4K in certain scenarios. And I can guarantee you that raytracing will be the first thing to be turned off. It has been with my 2080 Ti, and it will be with my 3090.
A fully raytraced game will not require rasterization - the traditional GPU will be useless in such a title. You'll have a point in time where it's like "this requires an RT-only GPU", just as when Quake 3 came out and required a graphics accelerator. And to get to that point, the hybrid solutions of today are a necessary step.
It was clear-cut to everyone that shaders were a better trade-off between performance and visuals than the older solutions, so the whole industry moved towards shaders. The same can't be said of raytracing. Can it potentially look better? Yes, but at a huge performance penalty. Visuals already look so good with rasterization, and run so well, that there is very little incentive for the mainstream to move to raytracing.
Dragam1337:

Who actually cares about facking raytracing, though? The vast majority of popular games don't support raytracing, and good riddance...
I agree. Until it's a normal thing in games, yawn. And only in racing games do I truly care for it.
Dragam1337:

There is a big difference between what it CAN do and what it actually does in current games - show me a game where it does anything other than slightly improving reflections or shadows, and at a massive performance penalty. Sure, raytracing can create completely lifelike visuals, as seen in the fully raytraced scenes used for CGI in movies. But it's so absurdly performance-heavy that it won't ever be able to run in real time.
Yes - so is that a reason to be completely and utterly dismissive, in an enthusiast forum at that? 😛 The improvement in reflections isn't "slight"; it is massive. The improvement in shadows is also massive. The improvement in making things appear as if they are actually part of the scene, not floating in the world, is also massive. And that's just the rudimentary version we have now. There are a lot of things it does right now that simply make graphics look more believable. I don't see how taking a performance penalty to get better graphics is somehow unexpected or wrong. Remember "can it run Crysis?", or any other title in the past that massively changed the graphical fidelity possible? I don't really know why you would even say "never" when talking about the development of computational power. GPUs might not be able to do it now at a level comparable to a high-end movie from 2020 (movies that also use a hybrid approach, blending 3D graphics and actual footage), but then again, neither could the renderers that did graphics for movies 30 years ago.
Dragam1337:

There is absolutely no one who buys top-end GPUs to run 1080p at 60 Hz. If they are running 1080p, it's at 240 or 360 Hz. More likely it's 1440p at 144 Hz or higher - or, as in my case, 4K. And no, GPUs can't easily keep up there; even the 3090 will struggle at 4K in certain scenarios. And I can guarantee you that raytracing will be the first thing to be turned off. It has been with my 2080 Ti, and it will be with my 3090.
Probably as few as there are the other way around. From all the benchmarks, it would appear titles easily run at 60 fps and up at 4K. But even if they didn't, it has always been the case that high resolutions can require sacrificing detail if you want the highest possible frame rate in new titles. Like I said, it's not common. It's also not common for anyone to have a 3090 or 2080 Ti or whatever; it's uncommon. Most people have at most a 1080p 60 Hz display, and they can run Fortnite, LoL, and so on at 60 Hz no problem on an 8-year-old system that cost $200 on eBay. There's no problem for traditional rendering to match what the normal population has. Nor is there for the more enthusiast crowd running 1440p in standard or ultrawide; a midrange GPU will do you perfectly fine there.
Dragam1337:

It was clear-cut to everyone that shaders were a better trade-off between performance and visuals than the older solutions, so the whole industry moved towards shaders. The same can't be said of raytracing. Can it potentially look better? Yes, but at a huge performance penalty. Visuals already look so good with rasterization, and run so well, that there is very little incentive for the mainstream to move to raytracing.
Yes, and as time moved on, technology improved and new opportunities arose - like the various path-tracing and similar ray-tracing-based techniques already in use in many titles, made possible by the advancement of computational power in graphics cards. In the future, a fully raytraced real-time image will be possible. It's just a matter of time and money.
Fox2232:

Wake me up when there is a GPU that can do pure raytracing at 720p and deliver semi-photorealistic gameplay at 60+ fps. Hell, ring a bell when it can do 540p 30 fps... DVD-quality images.
Any particular reason why you don't want to experience anything in between?
Dragam1337:

Who actually cares about facking raytracing, though? The vast majority of popular games don't support raytracing, and good riddance...
I also don't care for RT right now, but some people do, and we can't criticize them for that. Before RT, choosing the best card was about pure performance or the performance/cost ratio; now we also have to choose between pure rasterization performance and good rasterization plus RT performance.
kapu:

Early benchmarks show that AMD is indeed faster in RT than Nvidia without DLSS.
If this is true, then Nvidia screwed up big time.
AlmondMan:

Any particular reason why you don't want to experience anything in between?
Because the performance hit versus the image quality improvement is a bad trade. I have seen better reflections than DXR offers, at a lower performance cost. Sure, they're harder to pull off. But games sell 100 million copies, and they blow 30% of their budget on marketing. I see no reason why, in such a landscape, people should pay a huge premium for hardware that makes it easier to produce similar visuals. Do you think CoD with DXR costs the player less while being cheaper to make? Raytracing makes nice GI, but GI can be done without it and will look just as good while costing less performance. The benefit of DXR is that the developer has an easier time making a good-looking game. So once more: when I pay an upfront extra price for hardware that lets a game developer save man-hours, why are games not cheaper? They do not deliver more content for the same price either. When I play a DXR-enabled game, maybe I'll enable the effects, maybe not. But those effects are not going to dictate the hardware I purchase, nor the games I play. I do not buy games for visuals; I buy them to have fun.

And then you add the glitches in existing stitched-together games (missing geometry in reflections, light sources visible through objects/walls, ...). Look at DXR on/off comparisons; some games are just stupid. They put reflective surfaces in places they should not be, and add extra puddles while the street with DXR off has fewer of them. This often creates images that look less realistic than with DXR off. Sure, DXR can be more accurate - if only the environment did not become much less accurate for the purpose of selling you a visual effect.
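For what it's worth, the "GI without RT" route usually means baked lightmaps, sketched below with illustrative stand-ins (the lightmap array and UVs are hypothetical, not a real engine API): the global illumination is precomputed offline, often itself path-traced, so the runtime cost is a single texture fetch. The trade-off is that the lighting is frozen, which is the gap real-time RT GI aims to close.

```python
import numpy as np

lightmap = np.random.rand(256, 256, 3)   # stand-in for baked indirect lighting

def shade(albedo, lightmap_uv):
    """Runtime cost of baked GI: a single texture fetch, no rays at all.
    Move a wall or the sun, though, and the bake is simply wrong."""
    h, w, _ = lightmap.shape
    u, v = lightmap_uv
    indirect = lightmap[int(v * (h - 1)), int(u * (w - 1))]
    return albedo * indirect

print(shade(np.array([0.8, 0.2, 0.2]), (0.25, 0.75)))
```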
Borys:

The worst blindness is not wanting to see... People keep saying DLSS here and there... but DLSS compresses the image, so you LOSE QUALITY! Whoever buys a card like this doesn't want to lose quality; they want both! The truth is that AMD has outdone itself with the 6XXX series, and now the fanboys must find some bullshit to try to deny reality... AMD is at the top of the GPU world, my dear! A custom liquid-cooled 6800 XT will beat even the 3090 in every game... Nvidia must take care or it will go the same way as Intel!
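For context on what DLSS-style techniques actually do: the game renders fewer pixels internally and upscales, rather than compressing a finished image. Below is a minimal sketch of that render-low-then-upscale idea, with a plain bilinear filter as a stand-in for the trained network (DLSS proper feeds motion vectors into a neural network, which is why its quality loss is far smaller than this naive version); the resolutions are illustrative.

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Naive bilinear resize of an H x W x 3 float image."""
    h, w, _ = img.shape
    nh, nw = int(h * scale), int(w * scale)
    ys = np.linspace(0, h - 1, nh)
    xs = np.linspace(0, w - 1, nw)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    fy = (ys - y0)[:, None, None]; fx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - fx) + img[y0][:, x1] * fx
    bot = img[y1][:, x0] * (1 - fx) + img[y1][:, x1] * fx
    return top * (1 - fy) + bot * fy

# Rendering 1440p and upscaling to 4K shades ~44% of the pixels of native 4K:
low = np.random.rand(1440, 2560, 3)          # stand-in for a rendered frame
out = bilinear_upscale(low, 3840 / 2560)     # -> 2160 x 3840 x 3
print(low.shape[0] * low.shape[1] / (out.shape[0] * out.shape[1]))  # ~0.44
```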
Borys:

The worst blindness is not wanting to see... People keep saying DLSS here and there... but DLSS compresses the image, so you LOSE QUALITY! Whoever buys a card like this doesn't want to lose quality; they want both! The truth is that AMD has outdone itself with the 6XXX series, and now the fanboys must find some bullshit to try to deny reality... AMD is at the top of the GPU world, my dear! A custom liquid-cooled 6800 XT will beat even the 3090 in every game... Nvidia must take care or it will go the same way as Intel!
I would not be so sure about wins. We have not seen Hilbert's reviews, nor can we expect full performance with non-Zen 3 CPUs. Those cards will dance around each other for most users.
RT reflections and shadows seem to be exactly how devs will use RT for next-gen console games. I originally thought it would take five years for RT to go mainstream; in reality it's closer to two and a half, with games like Spider-Man on PS5 already having RT in place. Now I'm quite certain that RT, with help from the PS5 and XSX, will become the new normal, and ports to PC will have these features included. We're still in the very early stages of this tech, but, given the limited resources available to consoles, we can already see what the baseline implementation of RT is likely going to be. If devs only use a couple of RT features, then I don't think it'll actually be that taxing on PC. I'm guessing that consoles will offer roughly RTX 2060 levels of RT performance, which will mean more PC gamers will be able to enjoy those effects in-game. On the flip side, though, there's also an expectation that PC RT will be higher quality and more fleshed out, with more fine-grained options.
Dragam1337:

Who actually cares about facking raytracing, though? The vast majority of popular games don't support raytracing, and good riddance...
Agreed. I would argue that it is currently more of a gimmick than a true experience enhancement. Maybe it is more important in VR, who knows. I'm just glad AMD can produce good performance on their cards with these features enabled, without having to resort to other proprietary means.
moo100times:

Agreed. I would argue that it is currently more of a gimmick than a true experience enhancement. Maybe it is more important in VR, who knows. I'm just glad AMD can produce good performance on their cards with these features enabled, without having to resort to other proprietary means.
Exactly; it ticks all the boxes of a gimmick. It's much like what PhysX was, only now it's slightly fancier reflections rather than fancier particles... at a huge performance penalty.