Radeon RX 6800 XT performs half as good compared to RTX 3080 in Vulkan Ray Tracing tests

I like RT Cyberpunk. For me it's not blurry. I don't know if it's because you're running it at too low an option. I'm on Quality and it's fantastic. Sure it's not DLSS that makes it blurry? Because it's rendered at a lower resolution and upscaled. But that's the whole point: DLSS is supposed to give more FPS, not a better image.
As I've already said, even an RTX 3090 doesn't do proper RT... We're still in the early days of this kind of rendering for the average consumer. So I would take this info with some added points:
- Quake RT has only been possible with Nvidia funding.
- Vulkan is (sadly) the next dead standard. M$ has eaten too much market share 🙁 .
- Nvidia's RT uses its own tech, which is more effective for now... But the standard will be DXR, and it will end up like Nvidia's PhysX (again, M$ has eaten too much market share 🙁 ).
Once I saw Quake II, I figured it's an nVidia RTX-sponsored/involved game, so I'm not surprised that it'd suck on AMD cards... NOT saying that AMD would be good at RT in the first place.
Undying:

Maybe I'm wrong. I thought I played Quake 2 as a child. It didn't have the shiny walls, though. 😉
I played Q2 RTX. It had real-time reflections, dynamic ray-traced shadows, and ray-traced GI, and most textures and materials were reworked. Pretty nice, tbh. I never saw dynamic shadows work that well in any FPS I've played. Stunning. The way all those modern games make shadows hard and crisp but barely moving is a travesty to gaming. The 2070S was sluggish at 1440p though, but I figure if a 3080 is doing 80, then a 3070 will be good for 60.
kapu:

Yes, exactly. We also compared some shots inside buildings and decided RT off looks better; some scenes are too "bright" with RT on, but it's a matter of opinion. The problem is RT makes the FPS drop to 40-50 (on a 3060 Ti, which is not a trash GPU, close to an overclocked 3070), and DLSS makes the game blurry, so you're left without options.
Not sure where you're getting blurry performance from DLSS... Digital Foundry even compliments the implementation because of how it actually makes games look better. And yeah, real light from the sun is bright... The game looks seriously better with the shtuff turned on, opinions aside. And really, who thought full-on ray tracing wouldn't be taxing on a system?!?
I don't think many believed AMD would beat Nvidia with their first implementation of RT. BUT, the only comments I ever see are to the effect of "we ran this game, and AMD was worse." As someone who is a software and data engineer and has solved a lot of business performance problems, I can't just accept "I ran it and this was the result." I want to hear, read, and know more about the hardware. I've tried to do a bit of research, and the most I found was that AMD was considerably faster than Nvidia at calculating triangle intersections for RT, at 1 triangle per CU per clock, and 4 times that for box intersections. That said, I gave up looking for what Nvidia was capable of. It also appears that across INT32, FP32, and FP64, the AMD cards have some extra capability too. It would just be interesting to understand more about the hardware capabilities from both sides.
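Those per-CU rates make for an easy back-of-the-envelope peak-throughput calculation. A quick sketch (the 72 CU count and ~2.25 GHz boost clock for the RX 6800 XT are assumed figures here, and this is theoretical peak, not sustained):

```python
# Peak theoretical ray-intersection throughput from the quoted RDNA2
# rates: 1 ray/triangle test and 4 ray/box tests per CU per clock.
CUS = 72            # assumed CU count (RX 6800 XT)
CLOCK_HZ = 2.25e9   # assumed ~2.25 GHz boost clock

tri_per_sec = CUS * CLOCK_HZ * 1   # triangle intersections per second
box_per_sec = CUS * CLOCK_HZ * 4   # box intersections per second

print(f"{tri_per_sec / 1e9:.0f} G triangle tests/s, "
      f"{box_per_sec / 1e9:.0f} G box tests/s")
```

Peak numbers like these say little about real frame rates, of course; on RDNA2 the BVH traversal itself runs on the shader cores rather than on dedicated units, which is one commonly cited reason the gap shows up in practice.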
The anti-raytracing brigade appear to be a bit below the flat earthers. I accept this level of ignorance is brought about by sites such as HU, but such sites are on notice and will have far deeper NDAs to agree to. If you can't afford RT, fair enough, but when you keep putting it down due to envy.... 🙄
That title... It's half as well, not half as good. Superman does good.
So Nvidia FANBOYS can rejoice in a 2nd-gen RT card beating the other's first-gen RT card, and it still needs DLSS to get good FPS. God, this is hilarious. Even ReShade RT tanks FPS.
This is in line with my expectations for AMD's first attempt at RT. It's pretty much in line with Nvidia's first attempt.
I do not care about this gimmick today. The problem is, as someone who's going to buy a $1000+ video card and watercool it, I'm not buying a card for 2020 but for 2021-2023. For that kind of money I'm not gambling on the fact that I won't need RT later, and for the same reason I'm not buying a 10 GB card. If you go check various YouTube RivaTuner/Afterburner videos you'll see several 4K games reach 9 GB in real life, and knowing "optimization" is not really getting better (cough, 250 GB Call of Duty, cough), that's another gamble I'm not taking. In a weird and sad way we're all winning with the lack of availability; we'll probably get the next, improved versions.
Kaarme:

The RX 6000 series' lack of RT performance is a bit of a mystery. It really seems like AMD was aiming at barely beating/matching the first generation Nvidia RT, despite the huge criticism back in the day about Turing RT being next to useless in practice. AMD specifically didn't include ray tracing in the 5000 series because they didn't feel like they were ready for it. Didn't AMD seriously anticipate Nvidia overhauling the ray tracing performance in the Ampere architecture? It's puzzling.
Nvidia didn't overhaul anything. The 3000 series are faster cards in general, so RT is faster. Watch Hardware Unboxed's reviews and you'll see RT performance scales linearly; the "new RT cores" are BS from the leather jacket man.
kapu:

RT is not a thing for this generation. My friend has a 3060 Ti and he disabled RT in Cyberpunk because he said it looks worse enabled. Also, DLSS makes the game blurry, so there's a big quality loss there.
Don't you love paying more for silicon real estate that you don't use? That's Nvidia and RT for you. PS: Not trying to hurt you, just commenting on how your friend has to pay for something he doesn't use.
Silva:

Nvidia didn't overhaul anything. The 3000 series are faster cards in general, so RT is faster. Watch Hardware Unboxed's reviews and you'll see RT performance scales linearly; the "new RT cores" are BS from the leather jacket man. Don't you love paying more for silicon real estate that you don't use? That's Nvidia and RT for you. PS: Not trying to hurt you, just commenting on how your friend has to pay for something he doesn't use.
No, the new RT cores aren't "BS" just because RT performance always was and always will be tied to compute performance. https://ithardware.pl/admin/ckeditor/filemanager/userfiles/tomcug/2019/04/rt/metro_exodus_dxr.jpg RT cores amount to about 1/10th of the frame time. By doubling the performance of an RT core you're only cutting the time required for the green part in half, so about 5% altogether. Wasn't the main point of the new RT and tensor cores to cut their number on the GPU while retaining the same performance? That's what I understood. A 3070 has a lot fewer RT and tensor cores, yet more CUDA cores, than a 2080 Ti. They had to leverage it somehow in the SM. No one "has to" pay for a GPU. Also, the alternative is to get an AMD GPU and pay for RT-accelerated hardware that is neither stable nor usable, with no DLSS to fall back on.
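The back-of-the-envelope math above is just Amdahl's law. A minimal sketch, assuming the ~1/10th frame-time share for RT-core work shown in the Metro Exodus chart:

```python
def overall_speedup(fraction, local_speedup):
    """Amdahl's law: whole-frame speedup when only one slice
    of the frame time gets faster."""
    return 1 / ((1 - fraction) + fraction / local_speedup)

# RT-core work is ~10% of the frame; doubling RT-core throughput
# shrinks only that slice, so the whole frame gains only ~5%.
gain_pct = (overall_speedup(0.10, 2.0) - 1) * 100
print(f"{gain_pct:.1f}% faster overall")  # ~5.3%
```

Which is why per-RT-core improvements barely move whole-frame benchmarks as long as the RT slice of the frame stays small.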
Agonist:

So Nvidia FANBOYS can rejoice in a 2nd gen RT card beating the others first gen RT card .
And Turing beating RDNA2.
I really have a hard time telling ray tracing on vs. off, especially when playing. If I concentrate on it, I can. Barely. On top of that, titles with ray tracing are coming, but literally I play no titles with ray tracing right now, unless they updated them without me knowing. Point is, yes, I think ray tracing will become part of the video game experience, but as it stands it has no bearing on my purchase. 5 to 10 years down the road, when the tech catches up and they can smooth it out without serious hits to performance, I'll be all in. Right now? No thanks.
Silva:

Nvidia didn't overhaul anything. The 3000 series are faster cards in general, so RT is faster. Watch Hardware Unboxed's reviews and you'll see RT performance scales linearly; the "new RT cores" are BS from the leather jacket man. Don't you love paying more for silicon real estate that you don't use? That's Nvidia and RT for you. PS: Not trying to hurt you, just commenting on how your friend has to pay for something he doesn't use.
Actually, I've been a no-AA user for over 2 decades. I also don't use v-sync, again, another major feature I never use. I'm not trying to justify anything, but, paying for something I don't use has been every generation so far and isn't going to stop happening anytime soon.

cucaulay malkin:

https://i.imgur.com/fHt4GP1.jpg
Those RT numbers for the RX6000 series are highly suspect, especially when they tell you in the readme that RT isn't working properly yet.

Stormyandcold:

Actually, I've been a no-AA user for over 2 decades. I also don't use v-sync, again, another major feature I never use. I'm not trying to justify anything, but, paying for something I don't use has been every generation so far and isn't going to stop happening anytime soon.
The Xbox Series S has a GPU with hardware-accelerated RT. Talk about wasteful, with just 4 TFLOPS of FP32.
cucaulay malkin:

The Xbox Series S has a GPU with hardware-accelerated RT. Talk about wasteful, with just 4 TFLOPS of FP32.
What's interesting is that it looks like the Series S actually runs Cyberpunk 2077 the best out of all the Xboxes atm.
and I still don't regret buying a 2070S earlier in the year.
Silva:

Nvidia didn't overhaul anything. The 3000 series are faster cards in general, so RT is faster. Watch Hardware Unboxed's reviews and you'll see RT performance scales linearly; the "new RT cores" are BS from the leather jacket man.
Regardless, the new AMD 6000 series matches the Nvidia 3000 series in traditional rasterisation work, yet lags badly behind in RT. No matter how that disparity is generated, it's there. If what you say is true, it makes things even worse for AMD.
cucaulay malkin:

No, it sucks on one. 40 fps and 78 fps are not equally bad.
Just because one has nearly double the performance, that doesn't mean they don't both suck. It is actually amazing that we can do real-time raytracing at all, but, we're still another couple years away from it not being such a major performance hog. Spend enough money on Nvidia and you'll get playable framerates, but that's either at 1080p or with DLSS. That's still a sub-par experience. I'd rather just not enable raytracing.