Radeon Series RX 6000 Raytracing numbers posted by AMD, match RTX 3070

Turanis:

Sorry to intervene, but benches from whom? If it's directly from Microsoft with DXR support, then OK. https://old.reddit.com/r/nvidia/comments/9lcs4u/microsoft_dxr_demos_compiled_for_windows_10/ But if it's from 3DMark Port Royal (made by Nvidia), then no. Future console games for PS5/Xbox will support everything because of RDNA2 and Ryzen 3. The PC gaming will have m 🙂 Heck, even current RT games don't work well on every Nvidia RTX GPU, because of the tech's infancy, and the PS4/Xbone consoles with their games (ported to PC) don't support it.
Can we read the article before we intervene?
Turanis:

Future console games will support everything because of RDNA2 and Ryzen 3. 🙂
Doesn't make sense. What about PS5? It's not DX12 compliant.
How's that? RDNA 2 (PS5/Xbox) doesn't support DX12 Ultimate? I guess it does, including DXR 1.1.
Turanis:

How's that? RDNA 2 (PS5/Xbox) doesn't support DX12 Ultimate? I guess it does, including DXR 1.1.
PS5 is not RDNA2 ... no VRS, Sampler Feedback, etc. RT is believed to be different (not sure) ... I believe they have something scheduled shortly with more detail.
Sony developed their own APIs, GNM and GNMX, for the PS4; likewise, the PS3 used its own APIs, and presumably they'll do the same for the PS5. Whether one uses DX12 instead of Vulkan or another proprietary API matters little in this case. Long term, I wouldn't be surprised to see RDNA2 GPUs age better than Turing and Ampere, like what happened with GCN vs Kepler. Mark Cerny said the PS5 is RDNA2; Microsoft now claims theirs is the only full RDNA2. Putting their words against each other is as reliable as AMD, Nvidia, and Intel claiming whose GPUs are better.
At this point I'm intrigued, but I'll still wait for the official release of the cards, more reviews, and benchmarks in real games (CP2077) comparing all of the RDNA2 cards to Ampere. Then I will have facts to decide on. But if they can match 3080 performance in raytracing with the 6900XT, or whatever it will be called, I'd be pretty impressed. Still slower, but not bad for a start. Then pricing will be a factor: being slower in RT, they should be cheaper than Nvidia's cards, I feel.
As expected; if the RT performance was great, they would be the first ones to shout about it. Personally, I'm not too concerned about RT performance, so no problem. What bothers me most are the prices, higher than they should be, both for Nvidia and AMD. If the 6800 XT were around 500€, I would probably get one, but at around 700€, I think I'll pass...
pharma:

PS5 is not RDNA2 ... no VRS, Sampler Feedback, etc. RT is believed to be different (not sure) ... I believe they have something scheduled shortly with more detail.
Errmm, from a random site, not so big: https://www.howtogeek.com/677445/what-is-directx-12-ultimate-on-windows-10-pcs-and-xbox/ "Future RDNA2-based AMD graphics cards, as well as the Xbox Series X, will also support DX12 Ultimate." https://devblogs.microsoft.com/directx/new-in-directx-feature-level-12_2/
Maddness:

Yep, Metro Exodus springs straight to mind. That game was totally worth the performance hit enabling Ray Tracing. That's my opinion and you're welcome to disagree, but it's how I feel.
Really? Can you point out where in this video it's totally worth it? Just read the comments below it. No one seems to agree. [youtube=yPySV5C1i24]
Horus-Anhur:

Remember when the CEO of AMD stated that the PS5 uses RDNA2? https://twitter.com/lisasu/status/1260602084669390850?lang=en
It is possible I'm misinterpreting the scenario. However, the tech/dev sites I frequent are patiently waiting for Sony's announcement regarding DirectX 12 Ultimate compliance, which should come very shortly.
Sony will never use DirectX, because it's a proprietary API that belongs to MS. Sony has to create its own API that calls into the RDNA2 feature set.
It's actually better than I guessed, if true. However, at $650 I think it's not quite as good a deal unless the 6800 XT can brute-force its way past the RT deficiency.
To me, the best (and really, only) practical use of RT is for secondary lighting effects, which to my understanding (and I may be wrong) is also the least taxing on GPUs. Tertiary reflections are also really nice, but seldom ever come up. Developers have pretty much perfected shiny and semi-glossy surfaces (with or without texture) for years now, where RT really doesn't yield any noteworthy benefit at all but is hugely computationally expensive. It's kind of weird to me how people are willing to sacrifice tens of FPS for something like puddles that don't really look any better than what technology has offered before, but balk at the idea of buying a piece of hardware (whether that be CPU, GPU, RAM, storage, etc.) that, despite any other advantages, has an unnoticeable performance drop under specific workloads. Unlike a lot of people, I do legitimately think RT is absolutely critical to the future of gaming graphics, but it's pretty sad when Minecraft is the only really good example of how it should be used.
Maddness:

Yep, Metro Exodus springs straight to mind. That game was totally worth the performance hit enabling Ray Tracing. That's my opinion and you're welcome to disagree, but it's how I feel.
Of the screenshots I saw, there were a small handful of situations in Metro Exodus that really stood out as "this is a great example of RT", but in most cases it yielded no significant advantage at all.
Tyrchlis:

I call bullshit on your claim. I play Metro Exodus, Tomb Raider, and Wolfenstein Youngblood, ALL with RT on at 1440p ULTRA settings. Quit acting like you know everybody; you don't. Not even close, clearly. As of today, my RTX 3090 will be playing Watch Dogs Legion a little later, again with FULL RTX on. If a game supports it, I turn it on. RT, DLSS, all of it. A little later I will also be playing MechWarrior 5; though I hear its RT implementation doesn't look as good as others', I will be the judge of that, and it won't be a performance issue causing me to turn it off if it looks bad.
That's great and all, but all I'm asking is for someone to point out where FULL RTX makes the game look significantly better. I'd even take slightly better. There's none in that Metro video. In the AMD presentation, the World of Warcraft Shadowlands footage supposedly showing off ray-tracing effects only resulted in some walls with an ambient glow on them. It looked abysmal, and nothing you couldn't do without raytracing.
I was a bit worried when RT was not mentioned. Although right now it's not a deal breaker, with next gen supporting it we may see more and more games include it, maybe making it a deal breaker for some. If this is also the 6900XT's performance, then that's worrying. Don't get me wrong, it's great for a first gen, but I bet enabling RT will see performance drop a lot. The other thing that wasn't mentioned was DLSS or something like it, which begs the question: is that also sub-par? By pure rasterization these cards are beasts and competitive with Nvidia, but I think Nvidia has the edge with the extra RT and AI cores that can bring extra performance in games. At the moment that's just current titles, but DLSS has started to get implemented in far more games, and the FPS jumps using DLSS have been amazing.
Richard Nutman:

That's great and all, but all I'm asking is for someone to point out where FULL RTX makes the game look significantly better. I'd even take slightly better. There's none in that Metro video. In the AMD presentation, the World of Warcraft Shadowlands footage supposedly showing off ray-tracing effects only resulted in some walls with an ambient glow on them. It looked abysmal, and nothing you couldn't do without raytracing.
You touched on a very controversial point. IMO RT is great, but it's a future technology. Right now its implementation is just going to be a way to show power and, maybe, gain some performance. True RT will only come into play when all lighting is real-time. Meanwhile, it's only for the happy few who can afford a top-of-the-line GPU.
AMD is really impressive on paper; however, I am going to wait for actual reviewer benchmark results. After what AMD has pulled in the past with their benchmark results, it is very difficult to trust them.
schmidtbag:

To me, the best (and really, only) practical use of RT is for secondary lighting effects, which to my understanding (and I may be wrong) is also the least taxing on GPUs. Tertiary reflections are also really nice, but seldom ever come up. Developers have pretty much perfected shiny and semi-glossy surfaces (with or without texture) for years now, where RT really doesn't yield any noteworthy benefit at all but is hugely computationally expensive. It's kind of weird to me how people are willing to sacrifice tens of FPS for something like puddles that don't really look any better than what technology has offered before, but balk at the idea of buying a piece of hardware (whether that be CPU, GPU, RAM, storage, etc.) that, despite any other advantages, has an unnoticeable performance drop under specific workloads. Unlike a lot of people, I do legitimately think RT is absolutely critical to the future of gaming graphics, but it's pretty sad when Minecraft is the only really good example of how it should be used.
I have the same opinion. RT may be amazing and the future of graphics, but I think it's still too early to be really concerned about it. But I can understand those who like RT's early implementations and consider it an important aspect of graphics.