AMD Might Cancel Implicit Primitive Shader Driver Support

Idk, it is shady, no denying that... But then again, Nvidia promoted async compute with Maxwell and you saw how that turned out...
Well, no performance loss, but there will be no performance gain either. Vega owners got ****** just as Fury owners did after years. A wannabe enthusiast GPU, "FINEWINE" edition. Many promises and useless features all over the place. AMD has officially won the hype race. Neither Intel nor Nvidia ever lied this much before. Look at the Ryzen and Vega slides, then look at the real performance numbers and satisfaction levels. AMD chose the wrong way. They are going to lose all RX and Ryzen users faster than the time it took to acquire them. RIP.
Well, the devs won't implement them, as AMD hardly has any market share now.
-Tj-:

Idk, it is shady, no denying that... But then again, Nvidia promoted async compute with Maxwell and you saw how that turned out...
Maxwell supports async compute; its implementation just wasn't as robust as AMD's specific definition, so I'm not sure how that relates.
People have the right to get mad when companies do things like this behind customers' backs, like what Apple did by slowing down older phones without letting people know. AMD had 6 months to let everyone know but failed to do so.
Well, here I was already thinking "another fanboy bashing for nothing" (sry @NvidiaFreak650 ). Then I read up on what primitive shaders even are and what they are supposed to do. Now I don't understand why they are cancelling it. Didn't it work before? In theory that's something really useful...
fantaskarsef:

Well, here I was already thinking "another fanboy bashing for nothing" (sry @NvidiaFreak650 ). Then I read up on what primitive shaders even are and what they are supposed to do. Now I don't understand why they are cancelling it. Didn't it work before? In theory that's something really useful...
They aren't canceling it - developers can still support it manually, but initially AMD's driver was supposed to automagically replace geometry shaders whenever it felt it could benefit geometry performance. That probably led to flickering/image-artifact issues they couldn't resolve in the automated process. So now the developer has to decide when to use it, which becomes a question of support: if the PS4 Pro and Xbox One X support primitive shaders, then it's possible we will see some games with it. If it's just desktop Vega, I doubt we'll see much of anything. That being said, is geometry performance really that much of a bottleneck with Vega? I genuinely don't know.
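To picture why the implicit path is hard, here is a purely hypothetical sketch of the kind of decision an automatic driver substitution would have to make. Every name and heuristic below is invented for illustration; this is not AMD driver code, just the general shape of the problem: the driver can only swap shaders when it can prove the swap is safe and profitable.

```python
# Hypothetical sketch: when could a driver silently replace a geometry
# shader with a primitive shader? All names/heuristics are invented.

def can_auto_replace(pipeline):
    """Return True only when the driver can prove the substitution is safe."""
    # A geometry shader with observable side effects (stream-out, UAV
    # writes) can't be transparently swapped without changing behavior.
    if pipeline["uses_stream_out"] or pipeline["writes_uavs"]:
        return False
    # Even when safe, substituting only pays off when geometry work is
    # actually the bottleneck for this draw.
    return pipeline["geometry_bound"]

def pick_shader(pipeline):
    return "primitive" if can_auto_replace(pipeline) else "geometry"
```

The point of the sketch is that both conditions are hard to establish automatically at runtime, which fits the theory above that the implicit path caused artifacts the driver team couldn't resolve.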
Denial:

They aren't canceling it - developers can still support it manually, but initially AMD's driver was supposed to automagically replace geometry shaders whenever it felt it could benefit geometry performance. That probably led to flickering/image-artifact issues they couldn't resolve in the automated process. So now the developer has to decide when to use it, which becomes a question of support: if the PS4 Pro and Xbox One X support primitive shaders, then it's possible we will see some games with it. If it's just desktop Vega, I doubt we'll see much of anything. That being said, is geometry performance really that much of a bottleneck with Vega? I genuinely don't know.
Again, thanks for explaining.
Denial:

Maxwell supports async compute; its implementation just wasn't as robust as AMD's specific definition, so I'm not sure how that relates.
Disabling it specifically for Maxwell, or getting negative performance when it's enabled, sounds about right. My point was that there's nothing new about promoting certain features and then ditching them for inefficiency or compatibility reasons...
Moderator
NvidiaFreak650:

People have the right to get mad when companies do things like this behind customers' backs, like what Apple did by slowing down older phones without letting people know. AMD had 6 months to let everyone know but failed to do so.
This is a little different... What Apple did wasn't right, of course, but this was something AMD promised. That was a feature people invested in Vega for, and now they aren't going to get it. AMD made their money with mining, and you can see how well Vega does at mining, so of course they aren't going to go through with this anymore. Basically people bought a product that wasn't finished - beta, early access, whatever you want to call it. Instead of releasing the product fully complete like they should have done - and for some reason market trends seem to agree with this model - they release a half-assed final product. It would be totally different if Vega competed without these shaders, but it hardly does... That, on top of the outrageous price due to mining, makes it really clear where their priorities are. Then again, they are a business, so I'm not surprised they'll do anything to turn a profit.
No API support, no party. Hopefully in DirectX 13 (pardon, DirectX 14) and Vulkan 2.0 there will be a unification of the geometry pipeline...
There are quite a few wrappers/injectors for DX games. I guess those people could set this flag and override the developer's choice, or even enable it for older games.
RealNC:

Didn't AMD also say they will enable something called "Draw Stream Binning Rasterizer" after Vega launch and improve performance? https://overclock3d.net/news/gpu_displays/amd_vega_gpu_architectural_analysis/3
DSBR is active on Vega FE but only utilized in some pro applications - I haven't really heard about its status on RX Vega cards. DSBR isn't really a catch-all performance increase/power decrease like lots of people believe. Nvidia's Tom Petersen was interviewed on PC Perspective about it and basically said they enable it on a per-profile basis - it actually decreases shader performance, so it's only useful when memory bandwidth is the bottleneck rather than shader performance. It's also known to cause graphical issues in some games/applications.
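That per-profile approach can be pictured with a small hypothetical sketch. The table contents, executable names, and function below are all invented for illustration; the only point is the design: the feature defaults to off and is switched on only for applications where the profiling team found it helps.

```python
# Hypothetical sketch of per-application feature profiles for a binning
# rasterizer. All entries and names here are invented examples.

DSBR_PROFILES = {
    # app executable -> enable binning rasterizer?
    "bandwidth_bound_game.exe": True,   # memory bandwidth is the bottleneck
    "shader_bound_game.exe": False,     # binning would cost shader throughput
}

def dsbr_enabled(app, default=False):
    # Unprofiled apps fall back to a safe default rather than a blanket "on",
    # since the feature can regress performance or cause rendering issues.
    return DSBR_PROFILES.get(app, default)
```

The conservative default is the key design choice: a feature that can hurt as often as it helps has to be opt-in per title.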
Fox2232:

There are quite a few wrappers/injectors for DX games. I guess those people can put this flag and override developer's choice. Or even enable it for older games.
Why should they? Developers will never care about such a huge difference in the shader pipeline just for one architecture (Vega) on one platform (PC). Developers care more about features that are available on consoles and that could or will be available on many architectures on PC, like the features coming with Shader Model 6.1 and 6.2.
Alessio1989:

Why should they? Developers will never care about such a huge difference in the shader pipeline just for one architecture (Vega) on one platform (PC). Developers care more about features that are available on consoles and that could or will be available on many architectures on PC, like the features coming with Shader Model 6.1 and 6.2.
I am not writing about game developers, but about the people behind software like ReShade and other wrappers/injectors, which can override anything the game's developer implements or doesn't. I have seen wrappers adding shader code to old DX7 games, making them effectively DX8.1.
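The wrapper/injector idea is just call interception: sit between the game and the API, and change a setting before forwarding the call. Real injectors hook native D3D entry points; the stand-in below only demonstrates the pattern, and every class, method, and flag name in it is invented.

```python
# Toy illustration of the wrapper/injector pattern: intercept a call and
# force a flag the game never set. All names here are invented examples;
# real tools hook native D3D functions instead.

class FakeDevice:
    def draw(self, flags):
        # Stand-in for a real API entry point; echoes what it received.
        return f"draw(flags={sorted(flags)})"

def install_wrapper(device, forced_flag):
    original_draw = device.draw
    def wrapped_draw(flags):
        # Inject the flag, then forward to the original call.
        return original_draw(flags | {forced_flag})
    device.draw = wrapped_draw  # shadow the method on this instance

dev = FakeDevice()
install_wrapper(dev, "PRIMITIVE_SHADERS")
```

After installation, every `dev.draw(...)` call goes through the wrapper, which is exactly how an injector could flip a feature on for a game whose developer never used it.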
Intel had better keep Koduri on a tight leash so that future Intel iGPUs aren't burdened with features that won't work and need to be quietly forgotten behind the scenes.
Kaarme:

Intel had better keep Koduri on a tight leash so that future Intel iGPUs aren't burdened with features that won't work and need to be quietly forgotten behind the scenes.
To be fair, these features - both DSBR and primitive shaders - require a lot of software development work, testing, etc. We have no idea what went on behind the scenes as far as RTG's budget and what got allocated to the Zen project in the last few years. Typically it's about 2-3 years from the initial design of a GPU architecture to the final shipping product, so Raja most likely planned these features expecting he would have a software team capable of delivering them, and then who knows what happened. I'm not even sure what RTG's way forward is at this point. I think AMD needs to be firing on all cylinders in its CPU division to keep Zen competitive, especially now that Intel is tripping up. On the GPU side, I don't think they have a chance at beating Nvidia. I think their Polaris strategy of targeting the masses with mid/low-tier cards at extremely competitive price points is really the only way forward, because it's extremely safe. I know people are hoping for a crazy MCM setup on the GPU side, but I don't think the engineering effort, both software and hardware, is something AMD can afford right now; if anything goes wrong and the product is a failure, it's game over. It's going to be interesting to see how it all plays out.
It ain't over till the fat lady sings. That being said, this is not entirely surprising; implementing them implicitly would be an insane amount of work, and many things can go wrong.
Hmm, I think what this means is:
- AMD is not investing any resources to get this working in the drivers
- AMD is telling developers to incorporate it in-game; if that works out, they may bring driver-side support back

So, on one hand they aren't technically dropping support for it; they are just delegating responsibility to developers to use it. The question is what incentive developers have to implement it. I would be shocked if FC5 even thinks twice about developing for it at this point.