Intel Arc and Xe iGPUs Need to Emulate Support for DirectX 9 Through a Wrapper

That's a bit of a misleading headline. Not supported *natively*, but emulated via D3D9On12 interface.
I have seen reviews where the Arc cannot run Dirt 5, which is a definite lol.
Administrator
Pinstripe:

That's a bit of a misleading headline. Not supported *natively*, but emulated via D3D9On12 interface.
Well, heck, if you need to emulate through a wrapper you're not exactly supporting it in hardware.
Hilbert Hagedoorn:

Well heck, if you need to emulate it you're not exactly supporting it in hardware.
DX9 games still run on it though, so it's somewhat misleading.
I can't see these being particularly attractive. It's just one bad thing after another! As someone who plays a lot of (heavily modded and very poorly optimised) DX9 games, these would basically be useless to me.
I just remembered Intel had that competition to win an Intel Xe GPU... I forget how long ago that was, but my question for entering the competition was actually whether Intel Xe GPUs would support DirectX 9... xD
Intel's driver development for Arc is a trainwreck.
I think this is an understandable move from Intel; DX9 is old and they already have enough on their plate just writing drivers for modern games. The question here is how good the emulation turns out to be in the end.
9on12 isn't a wrapper.
I see no problem with this. Sure, there's going to be some additional CPU overhead, but I have no doubts that even the most demanding DX9 game is going to be plenty playable on even the worst Arc GPU through D3D9On12. Most such games used no more than 2 threads. DXVK effectively accomplishes the same thing and it can actually improve performance over native DX support. So depending on how MS wrote their driver, Intel may see a performance advantage in DX9 games. EDIT: Kinda funny how DX10 wasn't even mentioned. Makes sense though - just about every game that supported it also supported another version (or OpenGL).
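If anyone benchmarking this wants to confirm which path they're actually on, the quickest sanity check I can think of is just asking the D3D9 runtime what it picked for the default adapter. A rough, untested sketch using plain d3d9.h (what Intel/9on12 actually puts in those strings is an assumption on my part, I haven't checked on Arc hardware):

```cpp
// Minimal D3D9 adapter query - prints what the D3D9 runtime reports for the
// default adapter. Build with: cl /EHsc query_d3d9.cpp d3d9.lib
// (Untested sketch; the exact strings reported on Arc/9on12 are an assumption.)
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3DCreate9 failed - no D3D9 runtime available\n");
        return 1;
    }

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        // Driver/Description tell you which user-mode driver D3D9 ended up with.
        printf("Driver:      %s\n", id.Driver);
        printf("Description: %s\n", id.Description);
        printf("VendorId:    0x%04lX\n", id.VendorId);
    }

    d3d->Release();
    return 0;
}
```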
Pinstripe:

That's a bit of a misleading headline. Not supported *natively*, but emulated via D3D9On12 interface.
I'm glad to learn my AMD and nVidia cards were supporting Glide back in the day, since they could run some Glide games using a wrapper ... Pretty much every review of Arc I've watched where they tested a bunch of old games said the support/performance for old games was awful at best. If D3D9On12 is not a wrapper then it's certainly not a native implementation of D3D9 either. I'm not knowledgeable about D3D9On12, but looking at the doc quickly it looks like a wrapper to me. Most people online call it a wrapper too.
schmidtbag:

I see no problem with this. Sure, there's going to be some additional CPU overhead, but I have no doubts that even the most demanding DX9 game is going to be plenty playable on even the worst Arc GPU through D3D9On12
I suspect Borderlands 2 will run like garbage, even with an Alder Lake CPU. No idea why people always think CPU performance with D3D9 wouldn't be an issue; those games often have super-shitty scaling with newer CPUs vs. newer APIs, or they were always terrible to begin with. Load a Gothic 2 Khorinis savegame (ok, it's DirectDraw, but still) and be surprised; this isn't too uncommon either.
MonstroMart:

If D3D9On12 is not a wrapper then it's certainly not a native implementation of D3D9 either. I'm not knowledgeable about D3D9On12, but looking at the doc quickly it looks like a wrapper to me. Most people online call it a wrapper too.
Espionage724:

What is a wrapper then? https://github.com/microsoft/D3D9On12 Are we arguing semantics here or is this really not as clear-cut as I'm thinking it is? I'm aware DX and D3D aren't the same thing. Headline says DX9 needs to be emulated through a wrapper. D3D9On12 does DX9 -> DX12. Therefore, to run DX9 things on Arc/Xe, it needs to get mapped/translated/wrapped/magic'd/whatever to DX12, instead of having driver-level support for it. What is confusing here?
To be fair, a "wrapper" is a pretty specific thing: wrappers tend to be standalone programs (see the Glide wrappers or DXVK), whereas a translation layer or "mapping layer", as Microsoft calls it, is a more generic term and can cover more intimately integrated software that effectively does the same thing (providing compatibility). However, I don't think the distinction is that important in this case, since they are close enough.

nvm: I just educated myself - it's a programmer term. I found this explanation helpful: https://training.glass.lu/courses/318762/lectures/4894960 A wrapper completely "wraps" the API, i.e. the application talks to the native driver via translated calls. Whereas what Microsoft is doing is creating a D3D9 driver (called d3d9on12) that the application talks to, and that d3d9on12 driver then talks to the native DX12 driver, if I understand correctly - which is why it is a "mapping layer" and not a wrapper. You achieve a similar result, but they are not the same thing.
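FWIW, that layering is pretty visible in the explicit opt-in Microsoft's repo exposes: an app can create an ordinary IDirect3D9 but tell the runtime to back it with a D3D12 device, and every D3D9 call then gets mapped onto D3D12 underneath. A rough, untested sketch of what that looks like (the entry point and struct come from the d3d9on12.h header as I understand Microsoft's repo; the claim that Arc/Xe systems take this route automatically for games that just call Direct3DCreate9 is my reading of the announcement, not something I've verified):

```cpp
// Rough sketch: explicitly running D3D9 on top of a D3D12 device via the
// 9on12 mapping layer. Untested; Direct3DCreate9On12 and D3D9ON12_ARGS are
// taken from the d3d9on12.h header as documented in Microsoft's repo.
#include <d3d9.h>
#include <d3d9on12.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

IDirect3D9* CreateD3D9OnTopOfD3D12()
{
    // Create the D3D12 device that the D3D9 calls should be mapped onto.
    ComPtr<ID3D12Device> device12;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device12))))
        return nullptr;

    // Ask the D3D9 runtime to use the 9on12 mapping layer with that device.
    D3D9ON12_ARGS args = {};
    args.Enable9On12  = TRUE;
    args.pD3D12Device = device12.Get();
    args.NumQueues    = 0;  // let 9on12 create its own command queue

    // The result is used exactly like a normal IDirect3D9; the game-facing
    // API doesn't change, only what sits underneath it. (In real code you'd
    // keep device12 alive for as long as the returned interface is in use;
    // this sketch lets the ComPtr go out of scope.)
    return Direct3DCreate9On12(D3D_SDK_VERSION, &args, 1);
}
```

Either way the game still only sees IDirect3D9, which is probably why "it runs DX9 games, just not through a native driver" is the most accurate way to describe Arc.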
Pinstripe:

DX9 games still run on it though, so it's somewhat misleading.
Nothing about it is misleading. It says it needs to use a wrapper, not that it won't run.
aufkrawall2:

I suspect Borderlands 2 will run like garbage, even with an Alder Lake CPU. No idea why people always think CPU performance with D3D9 wouldn't be an issue; those games often have super-shitty scaling with newer CPUs vs. newer APIs, or they were always terrible to begin with.
I don't see how an AL would struggle with a game like that. While the game wouldn't be able to take advantage of any newer instructions or more threads, the improvements in the process node, cache, clock speeds, and OS schedulers ought to make a significant difference. In other words, it won't be worse on AL than it was on a Sandy Bridge i7. The added CPU overhead of doing a compatibility/translation layer or wrapper (or whatever you want to call it) would have little to no impact on the game since it would most likely happen in a separate thread. So unless there's something I'm not understanding here, such games shouldn't run any worse compared to a native DX9 driver.
schmidtbag:

In other words, it won't be worse on AL than it was on a Sandy Bridge i7.
My statement obviously was that even fast CPUs like Alder Lake will deliver subpar performance with a crap driver like Intel's, even in ancient D3D9 games. No idea why you're explaining why faster CPUs should be faster than slower ones here; not really enlightening...
schmidtbag:

The added CPU overhead of doing a compatibility/translation layer or wrapper (or whatever you want to call it) would have little to no impact on the game since it would most likely happen in a separate thread. So unless there's something I'm not understanding here, such games shouldn't run any worse compared to a native DX9 driver.
Now you're acting like I said a translation layer itself would be an issue. *Annoying.* But apparently, something in Intel's render chain for D3D9 actually is a giant dookie.
aufkrawall2:

My statement obviously was that even fast CPUs like Alder Lake will deliver subpar performance with a crap driver like Intel's, even in ancient D3D9 games. No idea why you're explaining why faster CPUs should be faster than slower ones here; not really enlightening...
Except what really is obvious, as I enlightened, is that this wouldn't be the case. Sure, we're not talking about a major improvement in FPS, but it'll still be plenty playable. I managed to play BL2 on Linux using a Bulldozer CPU and it was totally fine. That's a worst-case scenario; an i3 or Ryzen 3 would get the job done just fine. Modern CPUs can handle DX9 games just fine.
aufkrawall2:

Now you're acting like I said a translation layer itself would be an issue. *Annoying.* But apparently, something in Intel's render chain for D3D9 actually is a giant dookie.
I am not; in fact, I've already stated in a previous post that I think Intel's approach is perfectly fine, and potentially a good idea (assuming MS did a sensible job on their part with the translation layer). However, your initial response to me seemed to imply "these games are CPU limited enough to struggle on modern CPUs so they won't run well by adding more overhead". Apparently, that's not how you felt.
schmidtbag:

Modern CPUs can handle DX9 games just fine.
No, they can't, because Windows 10 has regressed CPU bound performance in lots of D3D9 games by ~50%. I also don't think I share your opinion of what's "just fine" when you think BL2 ran fine on your BDZ, when I remember my 2500K @4.8GHz was struggling in a fair number of scenes. Not to speak of multiplayer...
aufkrawall2:

No, they can't, because Windows 10 has regressed CPU bound performance in lots of D3D9 games by ~50%. I also don't think I share your opinion of what's "just fine" when you think BL2 ran fine on your BDZ, when I remember my 2500K @4.8GHz was struggling in a fair number of scenes. Not to speak of multiplayer...
Well, I guess having not used Windows or any DX9 games in a while, I wasn't aware of that issue. Also, looking up the game on userbenchmark.com, apparently even a 2600K ran like crap, whereas an FX-6300 was totally playable: https://www.userbenchmark.com/PCGame/FPS-Estimates-Borderlands-2/3669/0.463.0.0.0 https://www.userbenchmark.com/PCGame/FPS-Estimates-Borderlands-2/3669/14719.0.0.0.0 So, I really wasn't wrong. GPUs of the time seem to have no problem either: https://www.userbenchmark.com/PCGame/FPS-Estimates-Borderlands-2/3669/14719.0.0.0.0 In any case, while I wasn't staying above 60 FPS the whole time, the game definitely was playable. Perhaps the Linux version is optimized a little better, who knows (I doubt it).
That's how Nvidia does it. The only GPUs that don't are AMD's, which is why your cable has to be tip-top with AMD cards.