Microsoft Adds DirectX 12 Feature-level 12_2, and Turing, Ampere and RDNA2 Support it

Click here to post a comment for Microsoft Adds DirectX 12 Feature-level 12_2, and Turing, Ampere and RDNA2 Support it on our message forum
https://forums.guru3d.com/data/avatars/m/260/260103.jpg
Good to see everyone is getting in on it.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Ha. Guess what RDNA1 and the PS5 have in common? And what RDNA2 and the XSX have in common? RDNA1 and the PS5 use Primitive Shaders, introduced with Vega. RDNA2 and the XSX have Mesh Shaders available (introduced with Turing). Those two things work toward the same goal in a way, but their approaches are very different, and so is the resulting performance. There is no reason to claim they are the same, unless people want to go and say: "Vega has Mesh Shader capability."
data/avatar/default/avatar16.webp
Isn't the PS5 RDNA2 too, and so supporting Mesh Shaders as well (with their own API, not DX12)?
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
asturur:

Isn't the PS5 RDNA2 too, and so supporting Mesh Shaders as well (with their own API, not DX12)?
The presentation talks heavily about primitive shaders. And when Cerny talked about ray tracing, he basically said that their RT is based on the same RT as the PC will have in the form of RDNA2. Not that it "is" the same. He was pretty careful not to make statements that would later have legal implications. One interesting thing is that the PS5 had chips for dev kits much earlier than the XSX. And one of the rumors going around was that MS did not want as many custom changes, because they wanted RDNA2 as close as possible to the PC version, so they waited for it. Now, that does not mean the PS5's customizations can't bring things to the table as good as or better than RDNA2. But it does mean that PS5 chip development forked before RDNA2 was complete, while the XSX sprouted quickly from RDNA2.
data/avatar/default/avatar27.webp
Oh, OK, so the PS5 may not have RDNA2 as-is, but something else. Weird. For whoever buys purely on specs, this may be a deal breaker. I just want the new PlayStation, so for me it's not a big deal.
data/avatar/default/avatar39.webp
asturur:

Oh, OK, so the PS5 may not have RDNA2 as-is, but something else. Weird. For whoever buys purely on specs, this may be a deal breaker. I just want the new PlayStation, so for me it's not a big deal.
For those who buy purely on specs, there's no question that the XSX is the way to go: it's 12 TFLOPS vs. 10 TFLOPS, and it seems the PS5 won't have fully featured RDNA2 either, but more of an "RDNA1.5", let's say. But I don't think that many people choose their console purchase based purely on specs.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
The PS5 will be roughly equal in strength to the Xbox Series X, just with fewer features. It all depends on whether those features are going to be used by developers. Now, Turing, Ampere and RDNA2 are viable; not Vega, not RDNA 1.5 or 1. Tech is moving on, fast and unforgiving. Cheer up, gurus! 😎
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
Tier 0.9? That's an interesting little detail for sampler feedback. EDIT: https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html Well, it does something, probably nothing too important. There are already multiple tiers for several of the D3D12_2 features too, interesting. 🙂 (And it seems ray tracing also has incremental tiers like 1.1, going by Ampere's GPU capabilities. Hmm, it probably means nothing much overall.)
https://forums.guru3d.com/data/avatars/m/220/220214.jpg
Can someone forward all the DX12 documentation on to Microsoft? They seem to need it to finish developing Flight Simulator 2020... it appears they were only given the DX11 docs.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
geogan:

Can someone forward all the DX12 documentation on to Microsoft? They seem to need it to finish developing Flight Simulator 2020... it appears they were only given the DX11 docs.
Microsoft didn't develop Flight Simulator 2020; Asobo Studio did.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
K.S.:

So much for early adopters of RDNA
The RTX 2060 Super and RTX 2070 Super are like a steal compared to the 5600 XT, 5700 and 5700 XT. RDNA1 is underwhelming and looks worse by the day. It's missing so many features.
https://forums.guru3d.com/data/avatars/m/261/261821.jpg
itpro:

The RTX 2060 Super and RTX 2070 Super are like a steal compared to the 5600 XT, 5700 and 5700 XT. RDNA1 is underwhelming and looks worse by the day. It's missing so many features.
I'd say the 2060 Super is completely useless. You can get a 5700 XT for the same price with a lot more power. Ray tracing performance is horrible on the 2060 Super (even on my 2080 it's horrible), and there are still barely any games that use it. Same with DLSS 2.0: barely any games support it, and 1.0 is crap.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
riot83:

I'd say the 2060 Super is completely useless. You can get a 5700 XT for the same price with a lot more power. Ray tracing performance is horrible on the 2060 Super (even on my 2080 it's horrible), and there are still barely any games that use it. Same with DLSS 2.0: barely any games support it, and 1.0 is crap.
Techpowerup's game summary puts the 5700 XT at 5% faster than the 2060 Super across all 3 resolutions. I don't know if I'd call that "a lot more power", unless of course you're talking about the 5700 XT's 35 W higher power consumption.

Having the 2060 Super gives you Mesh Shaders, VRS, RT and DLSS. Yes, there are relatively few games that use them, but with consoles coming with support for most of these technologies, more games in the future will obviously use RT/Mesh/VRS. Further, some developers are looking at using RT for things besides graphics, like sound for example, which shouldn't tax the RT hardware as much as reflections/shadows/etc., so even if you think the 2060 Super's performance in current RT games isn't enough, it might be in those circumstances.

Then there are other advantages, like the video encoder/decoder just being generally better. RTX Voice, if that's something you're interested in. All the weird Nvidia Experience/shader bullshit that probably no one uses. Various VR improvements probably no one uses either, but maybe they do.

So I don't know, I think calling it completely useless is a bit of a stretch. Even if you ignored the massive feature advantage of the Nvidia card, it's a 5% performance difference for roughly the same price; no one is batting an eye at that. All this being said, with both companies on the verge of releasing new cards within the next couple of months, you probably shouldn't be buying either.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
It is so sad from a consumer's perspective. Early adopters of the RTX 20 series got new features and were prepared for the RTX 30 series stage. RDNA early adopters got only the badge of casual AMD fan and loyal consumer. RDNA 2 will make RDNA 1 go EOL quicker than ever. AMD did it wrong again. "Poor Vega", "poor small Navi". Let's hope big Navi is rich and successful; enough with the technology poverty.
data/avatar/default/avatar27.webp
K.S.:

So much for early adopters of RDNA
I would like to ask: what did we give AMD when they supported DX12, including async compute, all the way back from GCN, while Nvidia lied to us that Maxwell supported async compute when it did not? This includes me, because I purchased a 980 Ti and a 1080 Ti. That's right, we went ahead and still bought products from their competitor, because what matters, according to us, is what they can deliver on launch day, not future-proofing. If we reward launch-day performance over future-proofing, then that is what we will get. End of story.
https://forums.guru3d.com/data/avatars/m/282/282392.jpg
karma777police:

Ray tracing is useless until Nvidia releases the 4000 series. Around that time the industry will pick up on it in a more meaningful way. In the meantime you are wasting your money.
Enjoy CP2077 without ray tracing then, buddy!
https://forums.guru3d.com/data/avatars/m/263/263710.jpg
Everything depends on how devs "will" work with the API(s)... Remember DirectX 10 with Crysis 1... :(;)
https://forums.guru3d.com/data/avatars/m/247/247876.jpg
Why should I care about all these new tiers if Cyberpunk was developed without them? For now I am good with DX12_1.
data/avatar/default/avatar11.webp
DannyD:

Enjoy CP2077 without ray tracing then buddy!
Proper SSR will look almost as good without the performance hit, and we're not even doing fully ray-traced scenes yet. Perhaps the 3090 will manage 1440p 60 fps with RT in most games, and maybe even 4K, but I wouldn't hold my breath for anything other than the 3090 to run RT at 1440p smoothly, not without the help of DLSS anyway.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Denial:

Techpowerup's game summary puts the 5700 XT at 5% faster than the 2060 Super across all 3 resolutions. I don't know if I'd call that "a lot more power", unless of course you're talking about the 5700 XT's 35 W higher power consumption.

Having the 2060 Super gives you Mesh Shaders, VRS, RT and DLSS. Yes, there are relatively few games that use them, but with consoles coming with support for most of these technologies, more games in the future will obviously use RT/Mesh/VRS. Further, some developers are looking at using RT for things besides graphics, like sound for example, which shouldn't tax the RT hardware as much as reflections/shadows/etc., so even if you think the 2060 Super's performance in current RT games isn't enough, it might be in those circumstances.

Then there are other advantages, like the video encoder/decoder just being generally better. RTX Voice, if that's something you're interested in. All the weird Nvidia Experience/shader bullshit that probably no one uses. Various VR improvements probably no one uses either, but maybe they do.

So I don't know, I think calling it completely useless is a bit of a stretch. Even if you ignored the massive feature advantage of the Nvidia card, it's a 5% performance difference for roughly the same price; no one is batting an eye at that. All this being said, with both companies on the verge of releasing new cards within the next couple of months, you probably shouldn't be buying either.
The 1080p data on TPU says the RX 5700 XT is 2% above the RTX 2070. Not that it matters much. And all 3 resolutions? Neither card is suitable for 4K, and 1440p has no longevity on those cards. They are very good 1080p cards which will serve well in everything except DXR for quite some time.