FutureMark shows first footage of 3DMark DirectX 12 (video)


Does it even matter? There are two benchmarks out that use async compute in real-world games: Ashes of the Singularity and Fable Legends. In AoS, Nvidia's 980 Ti ties the Fury X at 4K in Heavy Compute (which literally no real game will ever do), and in Fable Legends the 980 Ti finishes the compute portion of the test at nearly half the latency of the Fury X. Nvidia doesn't need to pay millions to get async support dropped, because the Ti is just as good, if not better than, the Fury X at normal levels of compute. The only time the Fury X is going to benefit is at extreme compute levels, levels that even AoS, literally rendering thousands of units on screen and computing light for every unit, couldn't hit.

Epic released a few of the original showcase demos like the Elemental demo in DX12, and there was no noticeable difference in performance. That being said, Epic hasn't fully implemented DX12 into the engine yet. UE 4.11 will be interesting because the entire build is geared towards renderer optimizations; it should bring a bunch of performance improvements to UE titles. Epic recently announced that UT4 will be seeing a DX12 build soon, along with Fortnite. Should be interesting to see what Epic can do with it in their own games.
Right with you, but at 1080p the Fury finishes the compute shader at 0.47 ms and the 980 Ti at 0.86 ms. Also, the results were a bit unpredictable, especially at 4K, where strangely the Fury's compute shader suddenly becomes four times slower for unknown reasons (from 0.47 to 2.27 ms, while the Nvidia results stay consistent, going from 0.86 to 1.7 ms). The latest driver seems to improve things (well, we'll need to see the benchmark redone with it). That said, I don't even remember whether Fable uses async compute, and as the developers said (as with AoS), the game and its performance will be quite different from the benchmark.
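For anyone who wants to reproduce that kind of per-pass timing themselves, the usual approach under DX12 is a pair of GPU timestamp queries around the compute work. Below is a minimal sketch of the pattern, assuming a device, command list, command queue and a READBACK-heap buffer already exist elsewhere; the names (device, cmdList, queue, readback) are placeholders, not anything taken from the Fable or AoS benchmarks.

[code]
// Sketch: measuring a compute pass with D3D12 timestamp queries.
// 'device', 'cmdList', 'queue' and 'readback' (a 2*sizeof(UINT64) buffer in a
// D3D12_HEAP_TYPE_READBACK heap) are assumed to be created elsewhere.
#include <d3d12.h>
#include <cstdint>
#include <cstdio>

ID3D12QueryHeap* CreateTimestampHeap(ID3D12Device* device)
{
    D3D12_QUERY_HEAP_DESC desc = {};
    desc.Type  = D3D12_QUERY_HEAP_TYPE_TIMESTAMP;
    desc.Count = 2;                                   // begin + end
    ID3D12QueryHeap* heap = nullptr;
    device->CreateQueryHeap(&desc, __uuidof(ID3D12QueryHeap),
                            reinterpret_cast<void**>(&heap));
    return heap;
}

void RecordTimedDispatch(ID3D12GraphicsCommandList* cmdList,
                         ID3D12QueryHeap* heap, ID3D12Resource* readback)
{
    cmdList->EndQuery(heap, D3D12_QUERY_TYPE_TIMESTAMP, 0);
    // ... the compute work being measured, e.g. cmdList->Dispatch(x, y, z); ...
    cmdList->EndQuery(heap, D3D12_QUERY_TYPE_TIMESTAMP, 1);
    // Copy both timestamps into the CPU-readable buffer.
    cmdList->ResolveQueryData(heap, D3D12_QUERY_TYPE_TIMESTAMP, 0, 2, readback, 0);
}

// Call after the command list has been executed and a fence confirms the GPU is done.
double ReadTimingMs(ID3D12CommandQueue* queue, ID3D12Resource* readback)
{
    UINT64 freq = 0;
    queue->GetTimestampFrequency(&freq);              // ticks per second
    UINT64* ts = nullptr;
    D3D12_RANGE range = { 0, 2 * sizeof(UINT64) };
    readback->Map(0, &range, reinterpret_cast<void**>(&ts));
    double ms = double(ts[1] - ts[0]) * 1000.0 / double(freq);
    readback->Unmap(0, nullptr);
    return ms;
}
[/code]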
[quote]
Right with you, but at 1080p the Fury finishes the compute shader at 0.47 ms and the 980 Ti at 0.86 ms. Also, the results were a bit unpredictable, especially at 4K, where strangely the Fury's compute shader suddenly becomes four times slower for unknown reasons (from 0.47 to 2.27 ms, while the Nvidia results stay consistent, going from 0.86 to 1.7 ms). The latest driver seems to improve things (well, we'll need to see the benchmark redone with it). That said, I don't even remember whether Fable uses async compute, and as the developers said (as with AoS), the game and its performance will be quite different from the benchmark.
[/quote]
Hah, you're right. Sorry, I compared the dark green to blue; it looks the same on my ****ty work monitor. Overall framerates between both cards are similar though. http://*************/asynchronous-compute-investigated-in-fable-legends-dx12-benchmark/ That article explains the async compute in Fable: it's enabled, but it's only being used for some parts, namely dynamic lighting and instanced foliage.
Async is a DX12 feature in the sense that I don't think you can do it in DX11. Nothing in these APIs is really an exclusive feature; they just "allow" certain techniques to be used.
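For context, the mechanism DX12 adds is simply the ability to create more than one hardware queue and submit work to them independently. A rough sketch of what that looks like, assuming the ID3D12Device has already been created (the names below are placeholders):

[code]
// Sketch: creating a dedicated compute queue next to the usual graphics queue.
// Assumes 'device' (ID3D12Device*) was created elsewhere; error handling omitted.
#include <d3d12.h>

void CreateQueues(ID3D12Device* device,
                  ID3D12CommandQueue** graphicsQueue,
                  ID3D12CommandQueue** computeQueue)
{
    // The "normal" queue that accepts graphics work (and everything else).
    D3D12_COMMAND_QUEUE_DESC gfx = {};
    gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfx, __uuidof(ID3D12CommandQueue),
                               reinterpret_cast<void**>(graphicsQueue));

    // A second queue that only accepts compute/copy work. Whether work submitted
    // here actually overlaps with the graphics queue is up to the driver and the
    // hardware; the API only makes the concurrency possible, it doesn't promise it.
    D3D12_COMMAND_QUEUE_DESC cmp = {};
    cmp.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&cmp, __uuidof(ID3D12CommandQueue),
                               reinterpret_cast<void**>(computeQueue));

    // Command lists created with D3D12_COMMAND_LIST_TYPE_COMPUTE are executed on
    // 'computeQueue', and ID3D12Fence objects synchronize the two queues wherever
    // one consumes the other's results.
}
[/code]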
Alright then 🙂
Wow, a single little Nvidia logo and the conspiracy theories start up. Did nobody notice the "Galax" writing there? It was even bigger. Probably they will make the benchmark run better on Galax cards (by reading out the BIOS vendor ID). :wanker:

I wonder whether Futuremark or any other DX12 benchmark program will really do the topic justice... Since DX12 offers more possibilities, it shifts those possibilities onto the devs, and they will most likely not be as thorough in using all of a modern GPU's capabilities in every game. If many games end up nowhere near as optimized as this Futuremark test, will it even make sense to choose GPUs/systems based on such benchmarks in the DX12 era? Performance can and will differ noticeably from in-game performance.
KFA2/Galax makes how many AMD cards? It just further proves it. As someone on wcc said, it might even have GameWorks in it :P
While I see some unnecessary fighting here, I hope that at least this benchmark shows the benefits of DX12, because until now we really have not seen anything impressive. DX12 in AoS helps, but considering what's actually on screen, the game performs poorly. DX12 in UE4 shows no practical benefit over DX11; that's the reason ARK: SE still sits on DX11.
Not exactly. They haven't released the DX12 version of ARK: Survival Evolved for the exact same reason why not a single game on the market is running DX12 yet: the API was only finalized a few months back, and it takes time to make such large changes to a game. Early behind-closed-doors testing of the DX12 ARK is showing a 20% performance increase. Right now they're fixing big performance issues, like adding seek-free packages, which will greatly improve streaming and load times and reduce hitching.
[quote]
KFA2/Galax makes how many AMD cards? It just further proves it. As someone on wcc said, it might even have GameWorks in it :P
[/quote]
Bah... Vantage was clearly sponsored by Sapphire yet nobody said that it was biased towards AMD.
[quote]
KFA2/Galax makes how many AMD cards? It just further proves it. As someone on wcc said, it might even have GameWorks in it :P
[/quote]
:bang:
Which games show that DX12 with UE4 has no benefits? I thought ARK doesn't have support yet due to technical issues.
There were no real technical issues. In reality it is not economical to release a renderer which brings no performance improvement and looks practically the same as DX11, because then you run the risk that someone will have problems running it, a risk with no possible gain in return. UE4 has a fully working DX12 path, but the way it is implemented does not bring benefits. Soonish we may see improvement; I wonder what the performance difference will be between the "Infiltrator" demo on the older and the new version.
Considering I see Nvidia's logo on top of the shots, I reckon it features the absolute minimum amount of async compute. Nvidia probably paid them millions to code their tests like that, to make it worth it. So, until the next generation of Nvidia GPUs, which should have the flaw fixed, I reckon it will test only some of the features.
Yep, it will be a DX12 demo without proper DX12 utilisation :bang: We need to wait for the new NFS and patches for SW BF3 / Mirror's Edge / Hitman, not these good-looking but worthless benchies. nV said to the devs: please use DX12, but without DX12 ;-) Then: it's a feature, like 3 GB with 1 GB of DDR being GDDR5, lol. And then again nV begged MS to implement their hardware in the DX12 spec, so we have DX12_2 (nV only) and DX12_0 + 12_3 AMD only 🙂 Because of their obsolete hardware (and the money-making with Fermi, no matter what they call it these days, Maxwell etc.). The industry is crippled by nV $. And don't forget even Maxwell doesn't have DX 11.1! Look at BF4/Rivals YT videos where the D3D level is visible: always DX11, not 11.1 like on GCN. Here is proof (and see the Maxwell videos on YT with the API shown, lol) [Spoiler] http://oi61.tinypic.com/35m16ya.jpg [/Spoiler]
So it's only proper DX12 when it's AMD sponsored?? What a bunch of hypocrites. You all need to stop hijacking every thread with your paranoia.
[quote]
There were no real technical issues. In reality it is not economical to release a renderer which brings no performance improvement and looks practically the same as DX11, because then you run the risk that someone will have problems running it, a risk with no possible gain in return. UE4 has a fully working DX12 path, but the way it is implemented does not bring benefits. Soonish we may see improvement; I wonder what the performance difference will be between the "Infiltrator" demo on the older and the new version.
[/quote]
Ah, OK. That's what they put on Steam.
I bet the fps will drop a lot when the camera passes by that lit foliage/ruins aquarium, or by those light shafts :P
[quote]
Ah, OK. That's what they put on Steam.
[/quote]
I know what they put on Steam about technical issues with both nV/AMD. But it is UE4, and you can create your own project and measure the difference yourself. Back then the only difference was that Afterburner did not detect/show its OSD on a DX12 context.
[quote]
Yep, it will be a DX12 demo without proper DX12 utilisation :bang: We need to wait for the new NFS and patches for SW BF3 / Mirror's Edge / Hitman, not these good-looking but worthless benchies. nV said to the devs: please use DX12, but without DX12 ;-) Then: it's a feature, like 3 GB with 1 GB of DDR being GDDR5, lol. And then again nV begged MS to implement their hardware in the DX12 spec, so we have DX12_2 (nV only) and DX12_0 + 12_3 AMD only 🙂 Because of their obsolete hardware (and the money-making with Fermi, no matter what they call it these days, Maxwell etc.). The industry is crippled by nV $. And don't forget even Maxwell doesn't have DX 11.1! Look at BF4/Rivals YT videos where the D3D level is visible: always DX11, not 11.1 like on GCN. Here is proof (and see the Maxwell videos on YT with the API shown, lol) [Spoiler] http://oi61.tinypic.com/35m16ya.jpg [/Spoiler]
[/quote]
Maxwell has DX 11.1, take a deep breath son... "Proof": http://s26.postimg.org/lqq5rczjt/dai_dx11d1.png
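A screenshot of a game's overlay isn't conclusive either way, by the way: a game is free to create its device at feature level 11_0 even on hardware that exposes more. If you want to see what the hardware and driver actually report, you can ask the D3D11 runtime directly. A minimal sketch (plain desktop C++, error handling kept minimal):

[code]
// Sketch: asking the D3D11 runtime for the highest feature level the default
// adapter supports. Link against d3d11.lib.
#include <d3d11.h>
#include <cstdio>

int main()
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;

    // Passing the array makes the runtime create the device at the highest level
    // the adapter/driver combination can handle. Note: on an old runtime that
    // doesn't know 11_1 at all, this call returns E_INVALIDARG and should be
    // retried without the 11_1 entry.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        nullptr,           // we only want the level, not the device itself
        &got,
        nullptr);

    if (SUCCEEDED(hr))
        std::printf("Highest supported feature level: 0x%X\n",
                    static_cast<unsigned>(got));
    return 0;
}
[/code]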
[quote]
Yep, it will be a DX12 demo without proper DX12 utilisation :bang: We need to wait for the new NFS and patches for SW BF3 / Mirror's Edge / Hitman, not these good-looking but worthless benchies. nV said to the devs: please use DX12, but without DX12 ;-) Then: it's a feature, like 3 GB with 1 GB of DDR being GDDR5, lol. And then again nV begged MS to implement their hardware in the DX12 spec, so we have DX12_2 (nV only) and DX12_0 + 12_3 AMD only 🙂 Because of their obsolete hardware (and the money-making with Fermi, no matter what they call it these days, Maxwell etc.). The industry is crippled by nV $. And don't forget even Maxwell doesn't have DX 11.1! Look at BF4/Rivals YT videos where the D3D level is visible: always DX11, not 11.1 like on GCN. Here is proof (and see the Maxwell videos on YT with the API shown, lol) [Spoiler] http://oi61.tinypic.com/35m16ya.jpg [/Spoiler]
[/quote]
:bonk:
[quote]
So it's only proper DX12 when it's AMD sponsored?? What a bunch of hypocrites. You all need to stop hijacking every thread with your paranoia.
[/quote]
Let them be. It's the only thing they can do right now, one could say.
[quote]
Yep, it will be a DX12 demo without proper DX12 utilisation :bang: We need to wait for the new NFS and patches for SW BF3 / Mirror's Edge / Hitman, not these good-looking but worthless benchies. nV said to the devs: please use DX12, but without DX12 ;-) Then: it's a feature, like 3 GB with 1 GB of DDR being GDDR5, lol. And then again nV begged MS to implement their hardware in the DX12 spec, so we have DX12_2 (nV only) and DX12_0 + 12_3 AMD only 🙂 Because of their obsolete hardware (and the money-making with Fermi, no matter what they call it these days, Maxwell etc.). The industry is crippled by nV $. And don't forget even Maxwell doesn't have DX 11.1! Look at BF4/Rivals YT videos where the D3D level is visible: always DX11, not 11.1 like on GCN. Here is proof (and see the Maxwell videos on YT with the API shown, lol) [Spoiler] http://oi61.tinypic.com/35m16ya.jpg [/Spoiler]
[/quote]
Posts like these are the number 1 reason why I hardly post anymore.