DirectX 12 Adoption Big for Developers - Microsoft Shares New Info
Over at GDC (the Game Developers Conference), Microsoft talked about the latest advancements in DirectX 12 technology. Principal Development Lead Max McMullen and Program Manager Chas Boyd shared quite a lot of details, including plenty of information that gamers can enjoy.
Below you can read a recap of the points that were shared; the info and photos are courtesy of DualShockers (see the source link), and the full slides of the presentation are in the gallery.
- The adoption of DirectX 12 is “huge” among developers, with many more games being worked on.
- Microsoft is working on improving the stability and performance of the API.
- At the moment V-sync tearing, FreeSync, and G-Sync don't work correctly, but the team is actively working to solve the issue and to get an update ready as soon as possible, even if a release date hasn't been shared. More details will come at the Build conference at the end of the month.
- Xbox PIX (a performance analysis tool for developers) is going to be ported to Windows' development tools. More details will be shared later this year.
- The team is working on both Windows 10 and Xbox One, so improvements will work across the two platforms.
- The HLSL update (HLSL stands for High-Level Shader Language, the proprietary shader language for Direct3D) aims to expose the latest hardware features in order to let developers innovate. The focus is on GPU programming, and the team will adopt the best ideas from across the industry.
- Shader Model 6 will be implemented, introducing new features.
- The team is working on a concept called procedural textures: a new hardware feature that lets developers dynamically vary the number of pixels rendered in a particular screen area, controlling performance by adjusting the image quality of individual parts of a frame independently of the rest of the screen (see the shading-rate sketch after this list).
- Innovations that require driver updates will come in the second half of the year, but the rest can be implemented at any time, and those will be added much more frequently than driver changes.
- Some features will be deprecated because no one really uses them, while others, including language-level features, will be implemented in the second half of the year.
- Developers can write their own tools based on the tech, and Microsoft provides a sample to show how to do it.
- Microsoft predicts that adoption of High Dynamic Range will be faster than that of 4K: it's more clearly noticeable to users, and several TV models have already begun shipping. It's very difficult for the average consumer to tell the difference between 1080p and 4K unless they get very close to the screen, whereas HDR is something that end users, not just professional graphics artists, can see the advantage of.
- On a fundamental level, HDR allows a display to emit a peak brightness for certain pixels that is a hundred times higher than on today's TVs. You can't run every pixel at that brightness, but having parts of the image like stars or floodlights at that kind of brightness is a "big win."
- Windows will allow content to access that additional value.
- At the moment games are locked to a brightness some call "paper white" or "email background" color, comparable to the brightness of the white background of your mail client, set so that it's readable. It's basically the equivalent of a white sheet of paper. That's the "1.0 value."
- With the new features, the email background will still sit at the 1.0 value, but bright spots in games, photos, and movies will be able to go substantially above it.
- Developers will be able to submit really high brightness values, and they will be clamped only when absolutely necessary.
- Color gamuts will work the same way. How colorful a scene can be is currently limited by the official range of colors that today's Windows displays and HDTVs are set to. Going forward those limits will also be removed, and developers will be able to use more than the roughly 30% of the human visual range used today, up to 75 or 80%. This will allow games to express things visually that they can't today.
- When you see a white spot in a game today, you don't really know whether it's a piece of plaster, a reflection of the sun, or a glowing object, because they're all clamped to the same 1.0 value. With HDR, diffuse surfaces like plaster can sit close to 1.0 while light sources go two or three times brighter, letting the user actually distinguish what the object is. It will be a new level of realism.
- As Windows will be able to support HDR on all of its devices, there won’t be a limit to what panel vendors can create.
- To take advantage of this, developers need to use physically based rendering, which is already widespread in AAA productions, and keep their reference values so that 1.0 sits at about 80-100 nits (see the paper-white sketch after this list).
- After that, developers just need to tweak the buffer values in their game engines so that the intensity of light sources matches what it really would be. Most of a game's content, like textures and meshes, doesn't need any additional work.
- Post-production effects also need a bit of tweaking. For instance, bloom is currently used to suggest that something is really bright when it isn't, because the screen can't display it. With High Dynamic Range that's no longer necessary, and a strong bloom on top of HDR actually overdoes the effect.
- The UI also needs to not be too bright, and black outlines might be necessary when it's displayed in front of things like sunlight reflecting on water. Movie makers are currently running into the same issue with subtitles.
- Two formats will be supported, one with a maximum luminance of 5.2 million nits, and the other with a maximum luminance of 10,000 nits.
- When buying an HDR panel, people should look at the actual maximum brightness. Most retailers actually try to hide it and have you buy according to brand.
- Support for larger color gamuts is less important, as they require more changes to a game's art, and users can't tell the difference as easily as with HDR. That said, it will be supported at the same time as HDR on Windows.
- Microsoft plans to deliver this feature to developers in calendar year 2016 (those in the Windows Insider program should get an update with it in the second half of the year), while it should become available to end users in calendar year 2017.
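No public API for the "procedural textures" feature was shown at the talk, so the following is only a conceptual C++ sketch of the decision logic it describes: a per-tile importance score picks how coarsely each screen region is shaded, trading local image quality for performance. Every name and threshold here is invented for illustration.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical illustration of the per-region idea described above:
// each screen tile gets a shading-rate divisor (1 = full rate,
// 2 = one shaded sample per 2x2 pixels, 4 = one per 4x4), chosen from
// an importance score the engine computes (e.g. motion, focus, periphery).
struct Tile { float importance; uint8_t rateDivisor; };

void ChooseShadingRates(std::vector<Tile>& tiles) {
    for (Tile& t : tiles) {
        if (t.importance > 0.75f)      t.rateDivisor = 1; // full quality
        else if (t.importance > 0.25f) t.rateDivisor = 2; // quarter of the work
        else                           t.rateDivisor = 4; // periphery, cheapest
    }
}
```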
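The brightness discussion above reduces to simple arithmetic once 1.0 is pinned to paper white. The C++ sketch below assumes an 80-nit paper-white reference (the talk says 80-100 nits); the function names and example luminances are illustrative, not part of any Windows API.

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative only: maps a physical luminance (in nits) to a scene-referred
// frame-buffer value, assuming 1.0 == "paper white" at 80 nits, as the
// presentation describes. Values above 1.0 are legitimate HDR highlights.
constexpr float kPaperWhiteNits = 80.0f; // assumed reference level

float NitsToBufferValue(float nits) {
    return nits / kPaperWhiteNits;
}

// Clamp only when the target display genuinely cannot reach the value,
// mirroring the "clamped only when absolutely necessary" point above.
float ClampToDisplay(float value, float displayPeakNits) {
    return std::min(value, displayPeakNits / kPaperWhiteNits);
}

int main() {
    // A plaster wall sits near paper white; a floodlight is ~3x brighter.
    float wall  = NitsToBufferValue(90.0f);   // ~1.1
    float light = NitsToBufferValue(240.0f);  // 3.0: now distinguishable
    // On a 1,000-nit HDR panel neither value needs clamping; on a ~100-nit
    // SDR panel both collapse toward 1.0, which is today's situation.
    std::printf("wall=%.2f light=%.2f clamped(SDR)=%.2f\n",
                wall, light, ClampToDisplay(light, 100.0f));
    return 0;
}
```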
Denial
Senior Member
Posts: 14092
Joined: 2004-05-16
#5249079 Posted on: 03/23/2016 01:34 PM
Quote: "And in total... I see a list of features which serve as a smoke screen to make everyone forget that DX12 failed at its main promise."

I don't think it failed its main promise. I just think that, once again, gamers created an unrealistic expectation for it.

I don't recall Microsoft ever saying it would bring massive performance improvements across the board. I recall them saying that it would reduce CPU overhead, that it would let developers drive their games deeper into the architecture itself, that it would be consistent across multiple platforms, and that it would have a decent toolset to accompany it.

Multiple times in the last few years I've said that it wouldn't make much of a difference in GPU-heavy games. The performance improvements you see in games like Ashes are a combination of RTS games being heavily CPU-throttled and Oxide making good use of Async Compute to eliminate that bottleneck by offloading more of the CPU calculations to the GPU. Again, I've stated multiple times that that game, and RTS in general, is literally the perfect genre to exploit DX12. MMOs might be a close second. Most FPS/RPGs/etc. are GPU-bound, though.

In Hitman, which AMD calls "the best use of Async Compute," the maximum DX11-to-DX12 difference on any card is on the 390, at ~10%. Most GPUs only see a ~3% gain, and some see none. Then I see multiple posts here on Guru3D going "Where is DX12's performance? Must be immature drivers." Nope, that's probably actually it. By the time game engines really start taking advantage of DX12's low-level features and creating new rendering methods, the DX11/DX12 comparisons will be gone because there will be no DX11 variant. Without the comparison, any visual gain is essentially lost in terms of gamers being able to see it.

I personally think DX12 did exactly what it promised. People just falsely expected more.

Quote: "( so far only AMD have spoken about HDR support )..."

Tom Petersen from Nvidia indirectly said on PCPer's podcast that Pascal will support it.
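Denial's Async Compute point is concrete in D3D12 terms: the API exposes a separate compute queue type, so compute work can overlap rendering instead of serializing behind it. Below is a minimal C++ sketch of creating such a queue; device creation, error handling, and fence synchronization are assumed to live elsewhere.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Minimal sketch: a dedicated compute queue alongside the usual direct
// (graphics) queue. Work submitted here can run asynchronously with
// rendering, which is how engines like Oxide's shift work onto the GPU.
ComPtr<ID3D12CommandQueue> CreateComputeQueue(ID3D12Device* device) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute-only queue
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;

    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue; // synchronize with the graphics queue via an ID3D12Fence
}
```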
theoneofgod
Senior Member
Posts: 4672
Joined: 2014-01-17
#5249081 Posted on: 03/23/2016 01:38 PM
Quote (Denial): "I personally think DX12 did exactly what it promised. People just falsely expected more."
Another possibility is that PC low-level APIs will effectively give us more powerful GPUs, since developers no longer need to worry about draw calls bottlenecking the driver thread(s); that should raise game hardware requirements. VR and UHD are helping too.
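That draw-call argument is the core of the D3D12 threading model: command lists are recorded on worker threads and handed to the queue in one batch, rather than every draw funneling through a single driver thread as under D3D11. A minimal sketch, assuming the device, per-thread command allocators, and command lists are created elsewhere:

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>

// Sketch of parallel command recording under D3D12. Each worker thread
// records its own ID3D12GraphicsCommandList (backed by its own command
// allocator), so issuing draw calls scales with CPU cores instead of
// bottlenecking a single driver thread.
void RecordAndSubmit(ID3D12CommandQueue* queue,
                     std::vector<ID3D12GraphicsCommandList*>& lists) {
    std::vector<std::thread> workers;
    for (ID3D12GraphicsCommandList* list : lists) {
        workers.emplace_back([list] {
            // ... record this thread's slice of the scene's draw calls ...
            list->Close(); // finish recording; cheap CPU-side work
        });
    }
    for (std::thread& t : workers) t.join();

    // Submit everything at once (implicit upcast to ID3D12CommandList*).
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```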
nz3777
Senior Member
Posts: 2502
Joined: 2014-01-21
#5249088 Posted on: 03/23/2016 01:56 PM
They will have to do a lot more than what's mentioned above to sell me on DX12, I am sorry. First you need a DX12-capable card (GPU), then Windows 10 for obvious reasons, which I do not like at all; I will stay with 7 until I have no choice. Where are the performance benefits they bragged about so much?

How come V-sync does not work with DX12? That is a MAJOR kick in the balls!
Juliuszek
Senior Member
Posts: 106
Joined: 2014-10-08
#5249100 Posted on: 03/23/2016 02:22 PM
So I need an HDR display now? And I was sooo happy when I got my G-Sync monitor at the end of 2014... I thought it would stay current for a couple more years...

Hmm, maybe my old CRT will be able to do it. You could adjust the brightness to unimaginable levels with that thing.
Senior Member
Posts: 5845
Joined: 2003-09-15
DX12 isn't a failure yet. What was promised as "better performance" has ended up being Async Compute touted as the important feature for real-world performance benefits. This isn't what DX12 was supposed to do. This one feature is also not exclusive to DX12; OpenGL and Vulkan can both use it, which in effect negates Windows 10 as a requirement.

We were promised orders-of-magnitude better performance under DX12 (especially if you look at early benchmarks). Overall, that hasn't actually happened. What it has actually done is make developers spend much more time tweaking code, whereas before some of this work was automated by the schedulers. DX12, then, has actually made devs' jobs harder, not simpler.