DirectX 12 Adoption Big for Developers, Microsoft Shares New Info


Give me my OLED... then I can buy into that HDR thing.
I don't know. I keep waiting to see Hitman/Tomb Raider CPU scaling tests, but no one does them. I feel like DX12 UE games are irrelevant at the moment. By Epic's own admission their engine does not support DX12 yet. On the forums they are saying it might be feature-complete by 4.12, and they aren't even shipping 4.11 yet. https://trello.com/b/gHooNW9I/ue4-roadmap 4.11 apparently has a bunch of new DX12 stuff, but it's still not "officially supported".
Yes, I know UE4 was not much of a DX12 engine at the time Infiltrator came out, but I believe the people working on it are doing their best to keep it the most popular engine. I keep an eye on UE4 through the ARK: Survival Evolved planned-patch news, and they keep pushing DX12 further and further away. It matters little if Hitman runs 5~10% better on a high-end GPU if it does not deliver greatly reduced CPU requirements. That same Hitman with a very low CPU requirement would run on notebooks that have only an iGPU, since the APU's TDP would go to graphics rendering instead of managing rendering paths. Reduced CPU overhead may deliver maybe 15~20% better desktop performance, but on mobile we may see twice the fps we had before. (That's for Atoms. Where it may lead APUs, I do not know.)
Is there a benchmark of Hitman running on an APU? The old Star Swarm benchmarks show APUs running double under DX12. I haven't seen a Hitman APU benchmark, though. I don't know if we would see double in Hitman, but I imagine we would see a much larger increase than we see on dGPUs.
I'd rather monitor companies sell me a $100 premium on a guarantee that my monitor will come out of the box with zero defects: no backlight bleed, sub-10% delta in gray uniformity, factory calibrated to sRGB, zero dead pixels.
QFT.
Thing is, we have not seen that reduced CPU utilization. Have we? UE4 Infiltrator tanked in DX12 the same way it did in DX11 upon reaching the last outdoor scenes, where GPU utilization dropped to nothing because the CPU could not keep up. I want to see people getting the same performance with a regular, much cheaper 3.2GHz CPU under DX12 as they do with a 4.5GHz OC CPU under DX11. We have read about new concepts for graphics, we have seen "new" ways to use GPUs. But where is that reduced use of the CPU? It would deliver a great performance improvement for Intel's/AMD's SoCs, where the CPU and iGPU are fighting over a limited TDP. DX12 was supposed to free up that CPU TDP use and allow the iGPU to truly shine.
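A back-of-envelope way to look at the gains being discussed here (purely illustrative, with made-up numbers; it assumes CPU submission work and GPU rendering overlap, so frame time is roughly whichever of the two is slower):

```python
# Toy frame-time model: frame time ~ max(cpu_ms, gpu_ms) when CPU and GPU
# work overlap. All numbers are illustrative assumptions, not measurements.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Desktop with a fast dGPU: mildly CPU-bound under DX11.
print("desktop DX11:", round(fps(cpu_ms=12.0, gpu_ms=10.0)))  # ~83 fps
print("desktop DX12:", round(fps(cpu_ms=6.0,  gpu_ms=10.0)))  # ~100 fps (+20%)

# TDP-limited APU/Atom: driver overhead keeps it CPU-bound under DX11, and
# every watt the CPU burns on submission is a watt the iGPU cannot use
# (modelled here by also letting gpu_ms drop slightly under DX12).
print("mobile DX11:", round(fps(cpu_ms=40.0, gpu_ms=25.0)))   # ~25 fps
print("mobile DX12:", round(fps(cpu_ms=15.0, gpu_ms=22.0)))   # ~45 fps (~1.8x)
```

Under those assumptions, cutting CPU cost does little once the GPU is the bottleneck, which is exactly the high-end desktop case, but it can be transformative on a TDP-limited chip.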
What I have been thinking (and had almost forgotten) since the first DX12 demos/benches/games launched, but remembered lately: if you don't run into any bottlenecks with a high-performance gaming rig under DX11 (as in, a top DX11 gaming rig), I wouldn't expect huge gains under DX12. Also, don't forget, how would Intel sell those expensive CPUs to gamers if you don't need them anymore? 😀
Tom Peterson from Nvidia indirectly said on the PCPer podcast that Pascal will support it.
I have no doubt about Pascal supporting it, or at least being able to output it, because it is the next leap in display image quality.
Is there a benchmark of Hitman running on an APU? The old Star Swarm benchmarks show APUs running double under DX12. I haven't seen a Hitman APU benchmark, though. I don't know if we would see double in Hitman, but I imagine we would see a much larger increase than we see on dGPUs.
That entire DX12 business is being promoted the wrong way, and it makes me feel the industry is changing direction and heading somewhere we do not need it to go. As for tests, not even notebookcheck makes a distinction between DX11 and DX12 game tests for Hitman 2016 and RotTR.
Hilbert, would it be possible to get an APU test for Hitman, DX11 vs DX12? I'd be interested in seeing it. I'm kind of blown away by the fact that I can't find a single APU test for Ashes, Tomb Raider, Gears, or Hitman. The benefits for APUs were one of the largest advertising points going into DX12's announcement, and now I can't find anything. Reporting my own post to get his attention lol
Hmm, new monitors? G-Sync and FreeSync did not have much success, so now they (the industry) want to introduce new monitors (like 3D TVs some years ago), and then it's more money wasted on new monitors. Why don't they put that new HDR thing in the GPU, or in post-processing, or in the drivers, so that current monitors can display it? More money, more problems.
Uh, because the monitor has to be able to display a higher brightness? It's like asking why they don't put 4K in graphics cards instead of requiring a new monitor. It's not physically possible.
DX12 is fine. People KEEP confusing it with the push for Windows Store games that Microsoft is going for. The whole promise of DX12 is that you get almost console-like, low-level utilization of the computer. One of the side effects is that AMD's GCN architecture can finally be "fed" more effectively, but that's not really the reason for it. The reason is to lower latencies, drop power consumption, and/or create CPU-based effects that can finally use all the CPU power that modern computers have. As for the rest of the news, the MOST important announcement was Shader Model 6. Will it require new hardware? Will current hardware support it correctly?
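To illustrate the CPU-side part of that promise, here is a toy timing simulation (not real Direct3D code; the per-list cost and counts are made-up assumptions) of the difference between recording all rendering commands on one thread, DX11-style, and spreading command-list recording across worker threads, which is what DX12 allows:

```python
# Toy simulation of single-threaded vs. multithreaded command-list recording.
# time.sleep stands in for the CPU work of recording one command list.
import time
from concurrent.futures import ThreadPoolExecutor

RECORD_MS = 2.0   # assumed CPU cost per command list
NUM_LISTS = 8     # assumed command lists per frame
WORKERS   = 4     # assumed recording threads

def record_command_list(_):
    time.sleep(RECORD_MS / 1000.0)

t0 = time.perf_counter()
for i in range(NUM_LISTS):
    record_command_list(i)
serial_ms = (time.perf_counter() - t0) * 1000

t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    list(pool.map(record_command_list, range(NUM_LISTS)))
parallel_ms = (time.perf_counter() - t0) * 1000

print(f"one submission thread: ~{serial_ms:.0f} ms of CPU time per frame")
print(f"{WORKERS} recording threads:  ~{parallel_ms:.0f} ms per frame")
```

Same total work, but the wall-clock cost per frame drops when it can be spread across cores, which is where the "use all the CPU power" argument comes from.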
Uh, because the monitor has to be able to display a higher brightness? It's like asking why they don't put 4K in graphics cards instead of requiring a new monitor. It's not physically possible.
I understand, but I think this would be another over-hyped product on the market, and after maybe 2-3 years they will cancel it.
As for the rest of the news, the MOST important announcement was Shader Model 6. Will it require new hardware? Will current hardware support it correctly?
Probably not; it was like that back then with SM 2.0 and 3.0 (2.0 wouldn't start BioShock without a workaround/patch, for instance; I did this myself back then with an SM 2.0 card, a Sapphire 4850 Toxic I think). Thinking further, will Polaris and Pascal be SM 6.0 compliant, or will we have to wait for 2017 GPUs to support the new SM fully?
People still don't understand how important DX12 is, and I'm not the one who will make it clear; there's just too much info on the internet already. HDR will not be a gimmick: it will be like going from black and white to color back in the day. VR will not be a gimmick either, although it won't be feasible for the majority of content and people for now, as it alienates people from reality. The major problem with HDR is that you will need a special room with little to no light to really get the experience it is capable of delivering. OLEDs should be watched in full darkness, and the other variants will tolerate a little more light. The curved displays will help get rid of reflections, I guess, but they will still be very prominent.
Probably not; it was like that back then with SM 2.0 and 3.0 (2.0 wouldn't start BioShock without a workaround/patch, for instance; I did this myself back then with an SM 2.0 card, a Sapphire 4850 Toxic I think). Thinking further, will Polaris and Pascal be SM 6.0 compliant, or will we have to wait for 2017 GPUs to support the new SM fully?
I suspect that we might not actually need new hardware. From the information I found in the TechPowerUp article about it, it seems that it's just accessing the same hardware in a new way, more consistent with the new API, and it's using tiled resources. If that's the case, it doesn't seem to have extra hardware requirements. It was interesting that I saw a presentation about HDR from the Radeon Group recently; they seem to keep cooking up DX12 features closely with Microsoft.
Uh, because the monitor has to be able to display a higher brightness? It's like asking why they don't put 4K in graphics cards instead of requiring a new monitor. It's not physically possible.
Why isn't it? It definitely is; some monitors have enough contrast to produce something close. It's just about the standards and the hardware acceleration needed to make the process faster, produce less heat, and consume fewer resources. LOL
What are you talking about? All it needs is a compatible decode engine as far as hardware goes. Most monitors cap out at 350-400 nits. The Samsung HDR displays and future HDR displays are boosting up to 1000 nits and have a lower black level. Can you artificially boost contrast? I guess, if you enjoy a washed-out garbage image. That's like saying DSR 4K is the same as a 4K monitor. Sorry, but no. And no amount of random "LOL"s at the end of your posts will change that.
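For what it's worth, the "more nits plus a lower black level" point is just ratio arithmetic (the luminance figures below are rough assumptions, not measured values); boosting contrast in software only remaps the range the panel already has, it can't widen it:

```python
# Rough contrast-ratio arithmetic. Luminance values are illustrative
# assumptions for a typical SDR panel vs. an HDR-class panel.
def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

sdr = contrast_ratio(peak_nits=350.0,  black_nits=0.35)
hdr = contrast_ratio(peak_nits=1000.0, black_nits=0.05)

print(f"SDR-class panel: {sdr:,.0f}:1")   # ~1,000:1
print(f"HDR-class panel: {hdr:,.0f}:1")   # ~20,000:1
```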
It's not that "cutting-edge" until it can burn your retina. Then, we'll have real retina displays 😀
On Windows Store games only, right (due to the new format)? Ashes on DX12 works with FreeSync.
It's not even Windows Store games. Vsync is perfectly fine in DX12/Windows Store games: borderless fullscreen = vsync on. The problem is that you can't turn it off. That will be fixed once exclusive fullscreen gets implemented.
You can. NVIDIA has implemented it, and AMD did too in their latest driver. From that point on it's in the hands of the developers.