DirectX 12 Adoption Big for Developers, Microsoft Shares New Info

https://forums.guru3d.com/data/avatars/m/263/263906.jpg
Really can't wait for HDR displays. I think for gamers the ultimate monitors in the next few years are gonna be 1440p IPS HDR 144Hz + FreeSync/GSync! With DP 1.3 we should hopefully be able to see (all of them IPS):
- 2560x1440 @ 168Hz + HDR + FreeSync/GSync
- 3440x1440 @ 144Hz + HDR + FreeSync/GSync
- 3840x2160 @ 60Hz (only 60Hz with DP 1.3 afaik) + HDR + FreeSync/GSync
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
For me personally, there is not much of interest besides them admitting that they broke vsync / fsync / gsync with dx12, and that they need to fix this.
https://forums.guru3d.com/data/avatars/m/247/247876.jpg
Half the list is devoted to HDR, which needs new monitors capable of temporarily blinding you with mega-bright splashes.
https://forums.guru3d.com/data/avatars/m/263/263906.jpg
For me personally, there is not much of interest besides them admitting that they broke vsync / fsync / gsync with dx12, and that they need to fix this.
On Windows Store games only, right (due to the new format)? Ashes on DX12 works with FreeSync.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
On Windows Store games only, right (due to the new format)? Ashes on DX12 works with FreeSync.
Well, for Store games certainly; not so sure about the others (DX12 games, that is, because of that unified pipeline). I haven't really been able to tell the difference, and haven't had enough time to read into it.
https://forums.guru3d.com/data/avatars/m/180/180081.jpg
So I read up a little on HDR monitors, but what I could find in the first couple of hits doesn't really yield much info beyond what appears to be marketing buzzwords. I mean, what's the difference between an HDR monitor and the wide color-gamut Dell U2711 I have? (Obviously this is an "old" monitor and newer ones provide better quality, but compare it to a newer version of the same wide-gamut type of display.)
https://forums.guru3d.com/data/avatars/m/254/254800.jpg
For me, at least so far, I haven't seen massive gains or improvements, though I guess the games I played weren't "pure" DirectX 12 games (ROTR, GOW).
https://forums.guru3d.com/data/avatars/m/227/227853.jpg
Soo we're moving to OLED monitors or..? In their current state IPS and TN monitors do not have the necessary accuracy for the HDR described here as far as I know. Interesting, but the presentation reads fishy to me.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Realistically speaking... we have seen several DX12 implementations and they really did not hit the mark as promised. So now, instead of classical HDR we get a new level of blinding light. That's a gimmick at best; at worst people will hate it. I am sure many people here have screens calibrated for use during the day, and even with proper calibration those screens feel a bit too bright at night. That's unless you have another light source in the room which sufficiently illuminates the wall/objects behind the screen to create a day-like feeling for the eye. I do not see myself being happy about a spot on the screen that suddenly becomes 2-3 times brighter than the maximum brightness the screen was calibrated to. And in total... I see a list of features that serve as a smokescreen to let everyone forget that DX12 failed at its main promise.
data/avatar/default/avatar04.webp
Soo we're moving to OLED monitors or..? In their current state IPS and TN monitors do not have the necessary accuracy for the HDR described here as far as I know. Interesting, but the presentation reads fishy to me.
You will need new monitors capable of the Rec. 2020 color space, and GPUs for it (so far only AMD has spoken about HDR support)... That said, most future UHD monitors should be 10-bit / Rec. 2020, as it is part of the standard.
http://www.businesswire.com/news/home/20160104006605/en/UHD-Alliance-Defines-Premium-Home-Entertainment-Experience
https://en.wikipedia.org/wiki/Rec._2020
The more important thing with HDR is the need and the will from game developers... It's incredible how much you can increase image quality with it...
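To make that a bit more concrete, here is a minimal, hypothetical sketch of what the engine side of HDR output roughly looks like under DX12: the game asks DXGI whether the HDR10 color space (Rec. 2020 primaries with the ST.2084 curve) is supported on the current output and then switches the swap chain to it. The function name and the trimmed error handling are illustrative, not from any shipping title.

```cpp
// Hypothetical sketch: requesting an HDR10 back buffer on a DX12/DXGI swap chain.
// Assumes swapChain was created elsewhere with DXGI_FORMAT_R10G10B10A2_UNORM buffers.
#include <dxgi1_5.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

bool EnableHdr10(ComPtr<IDXGISwapChain4> swapChain)
{
    // HDR10 = 10-bit back buffer + ST.2084 (PQ) transfer function + Rec. 2020 primaries.
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

    // Check that the swap chain / display path actually supports presenting that color space.
    UINT support = 0;
    if (FAILED(swapChain->CheckColorSpaceSupport(hdr10, &support)) ||
        !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return false; // fall back to regular SDR (sRGB) output

    return SUCCEEDED(swapChain->SetColorSpace1(hdr10));
}
```

The point of the sketch is simply that HDR is an end-to-end chain: the display, the GPU/driver, and the game all have to ask for it, which is why developer buy-in matters as much as the new monitors.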
https://forums.guru3d.com/data/avatars/m/63/63215.jpg
DX12 isn't a failure yet. What was promised as "better" performance has ended up being async compute touted as the important feature for real-world performance benefits. This isn't what DX12 was supposed to do. This one feature is also not exclusive to DX12; OpenGL and Vulkan can both use it. In effect, this negates Windows 10. We were promised orders of magnitude better performance under DX12 (especially if you look at early benchmarks). Overall, this hasn't actually happened. What it's actually done is cause developers to spend much more time tweaking code, whereas before some of this work was automated by the schedulers. DX12, then, has actually made devs' jobs harder, not simpler.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
And in total... I see a list of features that serve as a smokescreen to let everyone forget that DX12 failed at its main promise.
I don't think it failed its main promise. I just think that, once again, gamers created an unrealistic expectation for it. I don't recall Microsoft ever saying it would bring massive performance improvements across the board. I recall them saying that it would reduce CPU overhead. I recall them saying it would allow developers to have their games dig deeper into the architecture itself, that it would be consistent across multiple platforms, and that it would have a decent toolset to accompany it.

Multiple times in the last few years I've said that it wouldn't make much of a difference in GPU-heavy games. The performance improvements you see in games like Ashes are a combination of RTS games being heavily CPU throttled and Oxide making good use of Async Compute to eliminate that bottleneck by offloading more of the CPU calculations to the GPU (there's a rough sketch of the idea after this post). Again, I stated multiple times that that game, and RTS in general, is literally the perfect genre to exploit DX12. MMOs might be a close second. Most FPS/RPGs/etc. are GPU bound though. In games like Hitman, which AMD calls "the best use of Async Compute", the maximum difference between DX11 and DX12 on any card is on the 390, at ~10%. Most GPUs only see ~3% gain, some see none.

Then I see multiple posts here on Guru3D going "Where is DX12's performance? Must be immature drivers." Nope, that's probably all there is. By the time game engines really start taking advantage of DX12's low-level stuff and creating some new rendering methods, the DX11/12 comparisons will be gone because there will be no DX11 variant. Without the comparison, any visual gain from the difference is essentially lost in terms of gamers being able to see it. I personally think DX12 did exactly what it promised. People just falsely expected more.
(so far only AMD has spoken about HDR support)...
Tom Petersen from Nvidia indirectly said that Pascal will support it on PCPer's podcast.
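For readers wondering what "Async Compute" means at the API level, here is a minimal, hypothetical D3D12 sketch (not Oxide's code): a second, compute-only queue is created next to the normal graphics queue, so compute workloads can overlap rendering instead of waiting behind it. Fence-based synchronization with the graphics queue is omitted for brevity, and the function name is illustrative.

```cpp
// Hypothetical sketch of "async compute" in D3D12: a dedicated compute queue that the GPU
// can service alongside the usual DIRECT (graphics) queue.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void SubmitAsyncCompute(ID3D12Device* device,
                        ID3D12CommandList* const* computeLists, UINT count)
{
    // A compute-only queue, separate from the graphics queue.
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

    ComPtr<ID3D12CommandQueue> computeQueue;
    if (FAILED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue))))
        return;

    // Work recorded into COMPUTE command lists (e.g. particle or culling dispatches)
    // can now run concurrently with graphics work on the direct queue. Real code would
    // keep this queue alive and fence against the graphics queue where results are consumed.
    computeQueue->ExecuteCommandLists(count, computeLists);
}
```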
https://forums.guru3d.com/data/avatars/m/254/254132.jpg
I don't think it failed its main promise. I just think that, once again, gamers created an unrealistic expectation for it. [...] I personally think DX12 did exactly what it promised. People just falsely expected more.
Another possibility is that PC low-level APIs will give us effectively more powerful GPUs, as developers don't need to be careful about draw calls bottlenecking the driver thread(s); that should increase game hardware requirements. VR and UHD are helping too.
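On the driver-thread point: part of the reason draw-call overhead matters less under DX12 is that command recording no longer funnels through a single driver thread. Here is a hypothetical sketch of that threading shape only (the actual draw recording is omitted and the struct/function names are illustrative):

```cpp
// Hypothetical sketch: each worker thread records its own D3D12 command list with its own
// allocator, and the main thread submits everything in one go.
#include <d3d12.h>
#include <thread>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

struct WorkerContext {
    ComPtr<ID3D12CommandAllocator>    allocator;
    ComPtr<ID3D12GraphicsCommandList> list;     // assumed to be in the recording state
};

void RecordInParallel(std::vector<WorkerContext>& workers, ID3D12CommandQueue* directQueue)
{
    std::vector<std::thread> threads;
    for (auto& w : workers)
    {
        threads.emplace_back([&w] {
            // Each thread would record its share of draws here; no global driver lock.
            w.list->Close();
        });
    }
    for (auto& t : threads) t.join();

    // One cheap submission of everything recorded in parallel.
    std::vector<ID3D12CommandList*> lists;
    for (auto& w : workers) lists.push_back(w.list.Get());
    directQueue->ExecuteCommandLists(static_cast<UINT>(lists.size()), lists.data());
}
```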
data/avatar/default/avatar03.webp
They will have to do a lot more than mentioned above to sell me on DX12, I am sorry. First you need a DX12-capable card (GPU), then Windows 10 for obvious reasons, which I do not like at all; I will stay with 7 until I have no choice. Where are the performance benefits they bragged about so much? How come V-Sync does not work with DX12? That is a MAJOR kick in the balls!
https://forums.guru3d.com/data/avatars/m/259/259842.jpg
So I need an HDR display now? And I was sooo happy when I got my G-Sync monitor at the end of 2014... I thought it was here to stay for a couple of years longer... Hmmm - maybe my old CRT will be able to do it 🙂 You could adjust the brightness to unimaginable levels with that thing 🙂
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
So I need an HDR display now? And I was sooo happy when I got my G-Sync monitor at the end of 2014... I thought it was here to stay for a couple of years longer... Hmmm - maybe my old CRT will be able to do it 🙂 You could adjust the brightness to unimaginable levels with that thing 🙂
Idk, my Samsung 8500 has HDR, although I guess the newer Samsungs announced this year are supposed to be better at it. I watched an HDR demo on it and while it was kind of cool, it wasn't something I'd want on all the time. I personally think it's overrated. I'd rather monitor companies sell me a $100 premium on a guarantee that my monitor will come out of the box with zero defects: no backlight bleed, sub-10% delta in gray uniformity, factory calibrated to sRGB, zero dead pixels. I had to go through multiple ROG Swifts and multiple XB270HUs before I found a decent one. And even my current one still has some bleed in the lower left corner. Unacceptable for an $800 monitor.
https://forums.guru3d.com/data/avatars/m/264/264961.jpg
So I need an HDR display now? And I was sooo happy when I got my G-Sync monitor at the end of 2014... I thought it was here to stay for a couple of years longer... Hmmm - maybe my old CRT will be able to do it 🙂 You could adjust the brightness to unimaginable levels with that thing 🙂
Geez, I too am glad I never threw mine out the window! Gonna try it with the Fermi DX12 drivers... oh wait.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Thing is, we have not seen that reduced CPU utilization. Have we? UE4 Infiltrator tanked in DX12 the same way it did in DX11 upon reaching the last outdoor scenes, where GPU utilization went down to nothing because the CPU could not handle it. I want to see people get the same performance with a 4.5GHz OC CPU under DX11 as they do with regular, much cheaper 3.2GHz CPUs under DX12. We have read about new concepts for graphics, we have seen "new" ways to use GPUs. But where is that reduced use of the CPU? It would deliver a great performance improvement for Intel's/AMD's SoCs, as in those the CPU and iGPU are fighting over a limited TDP. And DX12 was supposed to free up this CPU TDP budget and allow the iGPU to truly shine.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Thing is, we have not seen that reduced CPU utilization. Have we? UE4 Infiltrator tanked in DX12 the same way it did in DX11 upon reaching the last outdoor scenes, where GPU utilization went down to nothing because the CPU could not handle it. I want to see people get the same performance with a 4.5GHz OC CPU under DX11 as they do with regular, much cheaper 3.2GHz CPUs under DX12. We have read about new concepts for graphics, we have seen "new" ways to use GPUs. But where is that reduced use of the CPU? It would deliver a great performance improvement for Intel's/AMD's SoCs, as in those the CPU and iGPU are fighting over a limited TDP. And DX12 was supposed to free up this CPU TDP budget and allow the iGPU to truly shine.
I don't know. I keep waiting to see Hitman/Tomb Raider CPU scaling tests, but no one does them. I feel like DX12 UE games are irrelevant at the moment. By Epic's own admission, their engine does not support DX12 yet. On the forums they are saying it might be feature complete by 4.12, and they aren't even shipping 4.11 yet. https://trello.com/b/gHooNW9I/ue4-roadmap 4.11 apparently has a bunch of new DX12 stuff, but it's still not "officially supported".
https://www.youtube.com/watch?v=sc4dsiq-I7g
https://www.youtube.com/watch?v=D1bTr96ZHrc
Performance improvements were definitely promised early on.
Yeah, in CPU tests, and not only that, but they are very specific CPU functions. If those specific functions are not bottlenecking performance, then why would speeding them up increase the performance of a game? I mean, that's the issue I have with what people are saying. Yeah, we saw Intel render 50K unique asteroids on its processor, swap to indirect draws and get 3x the framerate, but no real game is doing that, aside from maybe Ashes. In real games you're going to instance like 90% of those asteroids and reduce the draw calls to nearly nothing. DX12 won't make a difference in that case.
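As an illustration of that last point, here is a minimal, hypothetical D3D12 sketch (not Intel's or Oxide's code) of the difference between issuing one draw call per asteroid and drawing the whole field with a single instanced call, assuming the per-asteroid transforms are fetched via SV_InstanceID in the vertex shader:

```cpp
// Hypothetical sketch of instancing: one draw call per object vs. one call for the whole field.
#include <d3d12.h>

void DrawAsteroids(ID3D12GraphicsCommandList* cmdList,
                   UINT indexCountPerAsteroid, UINT asteroidCount)
{
    // Naive version: one draw call per asteroid -- this is what piles up CPU/driver work.
    // for (UINT i = 0; i < asteroidCount; ++i)
    //     cmdList->DrawIndexedInstanced(indexCountPerAsteroid, 1, 0, 0, i);

    // Instanced version: per-asteroid transforms live in a buffer indexed by SV_InstanceID,
    // so the entire asteroid field is a single draw call regardless of object count.
    cmdList->DrawIndexedInstanced(indexCountPerAsteroid, asteroidCount, 0, 0, 0);
}
```

If the engine already batches like this, the CPU-side draw-call savings DX12 advertises have much less left to win, which is the crux of the argument above.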