DirectX 12 Adoption Big for Developers, Microsoft Shares New Info
Over at GDC (the Game Developers Conference) Microsoft talked about the latest advancements in DirectX 12 technology. Principal Development Lead Max McMullen and Program Manager Chas Boyd shared quite a lot of details, and there was plenty of information for gamers to enjoy.
Below you can read a recap of the points that were shared; info and photos are courtesy of DualShockers (see source link), and the full slides of the presentation can be found in the gallery.
- The adoption of DirectX 12 is “huge” among developers, with many more games being worked on.
- Microsoft is working on improving stability and performance of the API.
- At the moment V-sync tearing, FreeSync and G-Sync don't work correctly, but the team is actively working to solve the issue and get an update ready as soon as possible, even though a release date hasn't been shared. More details will come at the Build conference at the end of the month.
- Xbox PIX (a performance analysis tool for developers) is going to be ported to Windows development tools. More details will be shared later this year.
- The team is working on both Windows 10 and Xbox One, so improvements will work across the two platforms.
- The HLSL update (HLSL stands for High-Level Shader Language, the proprietary shader language for Direct3D) aims to expose the latest hardware features in order to enable developers to innovate. There's a focus on GPU programming, and the team will adopt the best ideas from across the industry.
- Shader Model 6 will be implemented, introducing new features.
- The team is working on a concept called procedural textures, a new hardware feature that lets developers dynamically vary the number of pixels rendered in a particular screen area, controlling performance by varying the image quality of individual parts of a frame independently of the rest of the screen.
- Innovations that require driver updates will come in the second half of the year, but the rest can be implemented at any time and will be added with a much higher frequency than driver changes.
- Some features will be deprecated because no one really uses them, while others will be implemented, including language-level features in the second half of the year.
- Developers can write their own tools based on the tech, and Microsoft provides a sample to show how to do it.
- Microsoft predicts that adoption of High Dynamic Range will be faster than that of 4K. It's more clearly noticeable to users than 4K, and several TV models have already begun shipping. It's very difficult for the average consumer to tell the difference between 1080p and 4K unless they get very close to the screen, whereas HDR is something that end users, not just professional graphics artists, can see the advantage of.
- On a fundamental level, HDR allows a display to emit a peak brightness for certain pixels that is a hundred times higher than on today's TVs. You can't run every pixel at that brightness, but having parts of the image like stars or floodlights at that kind of brightness is a "big win."
- Windows will allow content to access that additional value.
- At the moment games are locked to a brightness that some call "paper white" or "email background" white, which is comparable to the brightness of the white background of your mail client, set so that it's readable. It's basically the equivalent of a white sheet of paper. That's the "1.0 value."
- With the new features, the email background will still be set at 1.0 value, but bright spots in games, photos and movies will be able to go substantially above that.
- Developers will be able to submit really high brightness values, and they will be clamped only when it's absolutely necessary (see the sketch after this list).
- Color gamuts will work the same way. How colorful a scene can be is currently limited by the official range of colors that current Windows displays and HDTVs are set to. Going forward those limits will also be removed, and developers will be able to use more than the roughly 30% of the human visual range used today, up to 75 or 80%. This will allow games to express things visually that they can't today.
- When you see a white spot in a game, you don't really know if it's a piece of plaster, the reflection of the sun, or a glowing object, because they're all clamped to the same 1.0 value. With HDR, diffuse surfaces like plaster could be set close to 1.0, while light sources could be two or three times brighter, allowing the user to actually distinguish what the object is. It will be a new level of realism.
- As Windows will be able to support HDR on all of its devices, there won’t be a limit to what panel vendors can create.
- To take advantage of this, developers need to use physically based rendering, which is already widespread among most AAA productions, and keep their reference value of 1.0 at about 80-100 nits.
- After that, developers just need to tweak the buffer values in their game engines so that the intensity of the light sources is actually what it really would be. Most of the content of the game like textures and meshes doesn’t need any additional work.
- Post-processing effects also need to be tweaked a bit. For instance, bloom is currently used to suggest that something is really bright when it actually isn't, because the screen cannot display it. With High Dynamic Range that's no longer necessary, and using a strong bloom with High Dynamic Range actually overdoes the effect (see the sketch after this list).
- The UI also shouldn't be too bright, and black outlines might be necessary in case it's displayed in front of things like sunlight reflecting on water. Movie makers are currently running into the same issue with subtitles.
- Two formats will be supported, one with a maximum luminance of 5.2 million nits and the other with a maximum luminance of 10,000 nits (the former is roughly the ceiling of a 16-bit floating-point buffer with 1.0 mapped to 80 nits: 65,504 × 80 ≈ 5.2 million nits).
- When buying an HDR panel, people should look at the actual maximum brightness. Most retailers actually try to hide that figure and have you buy by brand.
- The support of larger color gamuts is less important, as they require more changes to the art of the game, and users can’t tell the difference as easily as with HDR. That said, it will be supported at the same time as HDR on Windows.
- Microsoft plans to deliver this feature to developers in calendar year 2016 (those in the Windows Insider program should get an update with it in the second half of the year), while it should become available to end users in calendar year 2017.
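As a rough illustration of two of the points above, the 1.0 reference white at 80-100 nits with clamping only when absolutely necessary, and the need to raise bloom thresholds under HDR, here is a minimal, hypothetical C++ sketch. None of it comes from Microsoft's presentation; the constants (an 80-nit reference white, a 1,000-nit panel, the bloom thresholds) and the function names are illustrative assumptions.

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative constants -- assumptions, not values from the presentation.
constexpr float kReferenceWhiteNits = 80.0f;   // scene value 1.0 ("paper white" / email-background white)
constexpr float kPanelPeakNits      = 1000.0f; // peak brightness of a hypothetical HDR panel

// Map a scene-referred linear value (1.0 == reference white) to output nits.
// Values above 1.0 pass through and are clamped only when the panel genuinely
// cannot reproduce them, mirroring the "clamp only when absolutely necessary"
// behaviour described in the recap.
float SceneValueToNits(float sceneValue)
{
    float nits = sceneValue * kReferenceWhiteNits;
    return std::min(nits, kPanelPeakNits);
}

// An SDR-era bloom threshold of 1.0 treats everything brighter than paper
// white as "glowing". With HDR content the threshold has to sit well above
// 1.0, otherwise the effect is overdone.
bool TriggersBloom(float sceneValue, float bloomThreshold)
{
    return sceneValue > bloomThreshold;
}

int main()
{
    // UI white, sunlit plaster, floodlight, sun glint (all illustrative).
    const float samples[] = { 1.0f, 3.0f, 12.5f, 200.0f };
    for (float v : samples)
        std::printf("scene %.1f -> %.0f nits, SDR bloom: %s, HDR bloom: %s\n",
                    v, SceneValueToNits(v),
                    TriggersBloom(v, 1.0f) ? "yes" : "no",
                    TriggersBloom(v, 8.0f) ? "yes" : "no");
    return 0;
}
```

With this mapping, the UI stays at 80 nits, a sunlit plaster wall lands a few times above reference white, and only the sun glint is clamped at the panel's peak, so the three kinds of "white spot" the presentation mentions remain distinguishable instead of all collapsing to 1.0.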
Quantum Break coming to PC - DirectX 12 only - Screenshots - Specs - 02/11/2016 07:40 PM
Today some previews have been released for the game that was originally intended for Xbox. The game received much praise for its complex storyline, gameplay and graphics. Good news: it is now also confirmed th...
Rise of the Tomb Raider might get DirectX 12 PC patch - 02/10/2016 09:33 AM
Some media are reporting that Rise of the Tomb Raider might get an upgrade to DirectX 12 through a patch. It definitely is not the first time this rumor has popped up, but history has taught us that wh...
FutureMark shows first footage 3DMark DirectX 12 (video) - 12/14/2015 10:12 AM
With DirectX 12 available on Windows 10 it's just a matter of time before everybody jumps onto it. Not just the hardware and games; the test software will adapt as well. Futuremark has been hard at wo...
Nvidia and AMD Cross Multi GPU Tested In DirectX 12 - 10/26/2015 08:14 PM
Though I'm not quite sure if I'd be using Ashes of the Singularity for testing this myself, it is an interesting read. Anandtech posted some benchmarks testing a feature in DirectX 12 c...
Unreal Engine 4 - Darth Vader DirectX 12 Tech Demo - 10/02/2015 08:43 AM
CryZENx has released a new fan DX12 tech demo for Unreal Engine 4, in which players can control Darth Vader. Next to the video you can also download this demo....
kegastaMmer
Senior Member
Posts: 326
Joined: 2015-09-17
#5249110 Posted on: 03/23/2016 02:44 PM
So I need an HDR display now? And I was sooo happy when I got my G-Sync monitor at the end of 2014... I thought it was going to stay for a couple of years longer...
Hmmm - maybe my old CRT will be able to do it
You could adjust the brightness to unimaginable levels with that thing
geez, i too am glad i never threw mine out the window! gonna try it with the fermi dx12 drivers...oh wait
Fox2232
Senior Member
Posts: 11808
Joined: 2012-07-20
#5249111 Posted on: 03/23/2016 02:45 PM
Thing is, we have not seen that reduced CPU utilization. Have we? UE4 Infiltrator tanked in DX12 the same way it did in DX11 upon reaching the last scenes outside, where GPU utilization went down to nothing because the CPU could not handle it.
I want to see people getting the same performance with a 4.5GHz OC CPU under DX11 as they do with regular, much cheaper 3.2GHz CPUs under DX12.
We have read about new concepts for graphics, we have seen "new" ways to use GPUs. But where is that reduced use of the CPU?
It would deliver a great performance improvement for Intel's/AMD's SoCs, as in those the CPU and iGPU are fighting over a limited TDP. And DX12 was supposed to free up this CPU TDP and allow the iGPU to truly shine.
Stormyandcold
Senior Member
Posts: 5844
Joined: 2003-09-15
#5249113 Posted on: 03/23/2016 02:51 PM
https://www.youtube.com/watch?v=sc4dsiq-I7g
https://www.youtube.com/watch?v=D1bTr96ZHrc
Performance improvements were definitely promised early on.
Denial
Senior Member
Posts: 14090
Joined: 2004-05-16
#5249116 Posted on: 03/23/2016 02:54 PM
Thing is, we have not seen that reduced CPU utilization. Have we? UE4 Infiltrator tanked in DX12 the same way it did in DX11 upon reaching the last scenes outside, where GPU utilization went down to nothing because the CPU could not handle it.
I want to see people getting the same performance with a 4.5GHz OC CPU under DX11 as they do with regular, much cheaper 3.2GHz CPUs under DX12.
We have read about new concepts for graphics, we have seen "new" ways to use GPUs. But where is that reduced use of the CPU?
It would deliver a great performance improvement for Intel's/AMD's SoCs, as in those the CPU and iGPU are fighting over a limited TDP. And DX12 was supposed to free up this CPU TDP and allow the iGPU to truly shine.
I don't know. I keep waiting to see Hitman/Tomb Raider CPU scaling tests but no one does them.
I feel like DX12 UE games are irrelevant at the moment. By Epic's own admission their engine does not support DX12 yet. On the forums they are saying it might be feature complete by 4.12; they aren't even shipping 4.11 yet.
https://trello.com/b/gHooNW9I/ue4-roadmap
4.11 apparently has a bunch of new DX12 stuff, but it's still not "officially supported".
https://www.youtube.com/watch?v=sc4dsiq-I7g
https://www.youtube.com/watch?v=D1bTr96ZHrc
Performance improvements were definitely promised early on.
Yeah, in CPU tests, and not only that, they are very specific CPU functions. If those specific functions are not bottlenecking performance, then why would making them faster increase the performance of a game?
I mean, that's the issue I have with what people are saying. Yeah, we saw Intel render 50K unique asteroids on its processor, swap to indirect and get 3x the framerate, but no real game is doing that, aside from maybe Ashes. In real games you're going to instance like 90% of those asteroids and reduce the draw calls to nearly nothing. DX12 won't make a difference in that case.
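To illustrate the instancing point above: the sketch below is a hypothetical, API-agnostic C++ model, not real Direct3D code and not anything from the thread, of how grouping objects that share a mesh into instanced draws collapses tens of thousands of per-object submissions (each carrying CPU-side overhead) into a handful of calls. The struct, function names and scene sizes are illustrative assumptions.

```cpp
#include <cstdio>
#include <map>
#include <string>
#include <vector>

// Hypothetical scene object: which mesh it uses and where it sits.
struct Object {
    std::string mesh;   // e.g. "asteroid_3"
    float       x, y, z;
};

// Naive submission: one draw call per object (roughly what a CPU-bound
// "50K unique asteroids" style test exercises).
size_t CountNaiveDrawCalls(const std::vector<Object>& scene)
{
    return scene.size();
}

// Instanced submission: objects sharing a mesh are grouped, and each group
// becomes a single instanced draw fed by a per-instance transform buffer.
size_t CountInstancedDrawCalls(const std::vector<Object>& scene)
{
    std::map<std::string, size_t> instancesPerMesh;
    for (const Object& obj : scene)
        ++instancesPerMesh[obj.mesh];
    return instancesPerMesh.size(); // one draw per distinct mesh
}

int main()
{
    // Build a toy scene: 50,000 asteroids but only 10 distinct meshes.
    std::vector<Object> scene;
    for (int i = 0; i < 50000; ++i)
        scene.push_back({ "asteroid_" + std::to_string(i % 10),
                          float(i), 0.0f, 0.0f });

    std::printf("naive draws:     %zu\n", CountNaiveDrawCalls(scene));
    std::printf("instanced draws: %zu\n", CountInstancedDrawCalls(scene));
    return 0;
}
```

Once most of those objects are instanced away, per-draw CPU overhead is no longer the bottleneck, so a lower-overhead API has far less headroom to show a gain, which is the crux of the argument above.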
Senior Member
Posts: 14090
Joined: 2004-05-16
So I need an HDR display now? And I was sooo happy when I got my G-Sync monitor at the end of 2014... I thought it was going to stay for a couple of years longer...
Hmmm - maybe my old CRT will be able to do it
Idk, my Samsung 8500 has HDR, although I guess the newer Samsungs announced this year are supposed to be better at it. I watched an HDR demo on it and while it was kind of cool, it wasn't something I'd want on all the time. I personally think it's overrated.
I'd rather monitor companies sell me a $100 premium on a guarantee that my monitor will come out of the box with zero defects: no backlight bleed, sub-10% delta in gray uniformity, factory calibrated to sRGB, zero dead pixels.
I had to go through multiple ROG Swifts and multiple XB270HUs before I found a decent one. And even my current one still has some bleed in the lower left corner. Unacceptable for an $800 monitor.