FutureMark shows first footage of 3DMark DirectX 12 (video)


https://forums.guru3d.com/data/avatars/m/235/235224.jpg
About time 😀 Hope it uses all features that DX12 brings so far.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Hope it uses all features that DX12 brings so far.
Considering I see Nvidia's logo on top of the shots, I reckon it features the absolute minimum amounts of async compute. Nvidia probably paid them millions to code their tests like that, to make it worth it. So, until the next generation of Nvidia GPUs that should have the flaw fixed, I reckon it tests only some of the features.
https://forums.guru3d.com/data/avatars/m/239/239932.jpg
Finally something to play ... soon.
data/avatar/default/avatar05.webp
Will this be a whole new 3DMark? Or will this test be added to the current 3DMark?
https://forums.guru3d.com/data/avatars/m/235/235224.jpg
Considering I see Nvidia's logo on top of the shots, I reckon it features the absolute minimum amounts of async compute. Nvidia probably paid them millions to code their tests like that, to make it worth it. So, until the next generation of Nvidia GPUs that should have the flaw fixed, I reckon it tests only some of the features.
Async Compute is only one aspect; there's more to DX12, like Conservative Rasterization, Rasterizer Ordered Views, Volume Tiled Resources and so forth. Also, the async story is old; the latest drivers now put Maxwell on par in Ashes of the Singularity. http://www.tweaktown.com/news/48021/nvidia-beats-amd-ashes-singularity-dx12-test-new-driver/index.html
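For anyone wondering how a benchmark or game would even detect those optional features, the usual route is ID3D12Device::CheckFeatureSupport. Below is a minimal sketch of that query; device creation is omitted, and it only illustrates the public D3D12 API, not anything Futuremark has published:

#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Query the optional DX12 feature tiers on an already created device.
void PrintOptionalFeatures(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts))))
    {
        // 0 means not supported; higher tiers expose more functionality.
        printf("Conservative rasterization tier: %d\n", (int)opts.ConservativeRasterizationTier);
        printf("Rasterizer ordered views:        %s\n", opts.ROVsSupported ? "yes" : "no");
        // Tier 3 is the level that adds volume (3D) tiled resources.
        printf("Tiled resources tier:            %d\n", (int)opts.TiledResourcesTier);
    }
}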
https://forums.guru3d.com/data/avatars/m/254/254132.jpg
Async compute isn't a DX12 feature.
https://forums.guru3d.com/data/avatars/m/163/163068.jpg
Considering I see Nvidia's logo on top of the shots, I reckon it features the absolute minimum amounts of async compute. Nvidia probably paid them millions to code their tests like that, to make it worth it. So, until the next generation of Nvidia GPUs that should have the flaw fixed, I reckon it tests only some of the features.
Probably.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
Considering I see Nvidia's logo on top of the shots, I reckon it features the absolute minimum amounts of async compute. Nvidia probably paid them millions to code their tests like that, to make it worth it. So, until the next generation of Nvidia GPUs that should have the flaw fixed, I reckon it tests only some of the features.
That's a GOC event sponsored by Nvidia; the logo isn't part of the projected footage, it's embedded on the casing - it has nothing to do with 3DMark.
https://forums.guru3d.com/data/avatars/m/54/54823.jpg
3DMark 11 test 3/4 in a box. Cute. 🙂
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Considering I see Nvidia's logo on top of the shots, I reckon it features the absolute minimum amounts of async compute. Nvidia probably paid them millions to code their tests like that, to make it worth it. So, until the next generation of Nvidia GPUs that should have the flaw fixed, I reckon it tests only some of the features.
Probably.
That's a GOC event sponsored by Nvidia; the logo isn't part of the projected footage, it's embedded on the casing - it has nothing to do with 3DMark.
Wow, a single little Nvidia logo and the conspiracy theories come up. Nobody mentioned the 'Galax' writing there? It was even bigger. Probably they will make the benchmark run better on Galax cards (by reading out the BIOS vendor ID). :wanker: I wonder whether running FutureMark or any other DX12 benchmark program will really be representative... as DX12 offers more possibilities, it also loads those possibilities onto the devs... and they will most likely not be as thorough in using all the modern GPU's capabilities in all games. As we might see many games that aren't even remotely as optimized as this FutureMark test, will it make sense to choose GPUs / systems based on such benchmarks in the DX12 era? Performance can and will differ noticeably from in-game performance.
https://forums.guru3d.com/data/avatars/m/105/105985.jpg
Ha, it took me over a year to get on Vista for 3DMark Vantage; I highly doubt I will be running to W10 when this comes out.
https://forums.guru3d.com/data/avatars/m/254/254338.jpg
Ha, it took me over a year to get on Vista for 3DMark Vantage; I highly doubt I will be running to W10 when this comes out.
If you're running Vista and interested in gaming, why wouldn't you upgrade to Windows 10 while it's free? DX12 won't ever be back-ported, so in the medium to long term you'll find more and more games you can't play.
https://forums.guru3d.com/data/avatars/m/228/228458.jpg
If you're running Vista and interested in gaming, why wouldn't you upgrade to Windows 10 while it's free? DX12 won't ever be back-ported, so in the medium to long term you'll find more and more games you can't play.
You misunderstood what he said. He's saying he didn't upgrade to Vista until a year after its release.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
While I see some unnecessary fighting here, I hope that at least this benchmark shows the benefits of DX12, because until now we really have not seen anything impressive. DX12 in AoS helps, but considering what's actually on screen, the game performs poorly. DX12 in UE4 shows no practical benefit over DX11. That's the reason why ARK: SE still sits on DX11.
https://forums.guru3d.com/data/avatars/m/250/250066.jpg
"somewhere in 2016" i'm certain it will be here when arctic islands and pascall comes out .
https://forums.guru3d.com/data/avatars/m/254/254132.jpg
DX12 in UE4 shows no practical benefit over DX11. That's the reason why ARK: SE still sits on DX11.
Which games show that DX12 with UE4 has no benefits? I thought ARK doesn't have support yet due to technical issues.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Async compute isn't a DX12 feature.
If it's a programming technique that will only really appear in DX12 games, then it's all the same to call it a part of DX12, even if it's not specifically mentioned in the standard. I don't know if it has been such a big topic only because Nvidia failed with it or if it's really worth the hype, though. I sure hope it is, as extra compute could make games more interesting.
That's a GOC event sponsored by Nvidia; the logo isn't part of the projected footage, it's embedded on the casing - it has nothing to do with 3DMark.
That's good to know. I was quite dispirited when I saw what I interpreted as a popular benchmark tool being sponsored by a single GPU manufacturer. I stand corrected then; I shouldn't have been so eager to judge them. Especially since every other GPU of mine is from Nvidia (or AMD, however it should be viewed).
https://forums.guru3d.com/data/avatars/m/254/254132.jpg
If it's a programming technique that will only really appear in DX12 games, then it's all the same to call it a part of DX12, even if it's not specifically mentioned in the standard. I don't know if it has been such a big topic only because Nvidia failed with it or if it's really worth the hype, though. I sure hope it is, as extra compute could make games more interesting.
But it's not a DX12 feature. It appears in PS4 games, Mantle games, and there's no doubt Vulkan will be able to take advantage of it.
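To make it concrete: at the API level, "async compute" in D3D12 just means creating a second, compute-only command queue next to the graphics queue and letting the GPU overlap work from the two. A minimal sketch is below; device creation and command list recording are omitted, and the same idea exists in Mantle, Vulkan and on the PS4 as noted above, so this is only one API's way of exposing it:

#include <windows.h>
#include <d3d12.h>

// Create a graphics (direct) queue plus a compute-only queue.
// Command lists submitted to the compute queue may run concurrently
// with graphics work if the hardware and driver schedule them that way.
HRESULT CreateQueues(ID3D12Device* device,
                     ID3D12CommandQueue** graphicsQueue,
                     ID3D12CommandQueue** computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};

    // Standard graphics queue: accepts draw, compute and copy commands.
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    HRESULT hr = device->CreateCommandQueue(&desc, IID_PPV_ARGS(graphicsQueue));
    if (FAILED(hr)) return hr;

    // Compute-only queue: submitting work here alongside the direct queue
    // is what people mean by "async compute".
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    return device->CreateCommandQueue(&desc, IID_PPV_ARGS(computeQueue));
}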
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Does it even matter? There are two benchmarks out that utilize async compute in real-world games: Ashes of the Singularity and Fable Legends. In AoS Nvidia's 980Ti ties the Fury X @ 4K in Heavy Compute (which literally no real game will ever do), and in Fable Legends the 980Ti finishes the compute portion of the test at nearly half the latency of the Fury X. Nvidia doesn't need to pay millions to drop async support because the Ti is just as good, if not better than, the Fury X at normal levels of compute. The only time the Fury X is going to benefit is during extreme compute levels, levels that AoS, literally rendering thousands of units on screen and computing light for every unit, couldn't hit.
Which games show that DX12 with UE4 has no benefits? I thought ARK doesn't have support yet due to technical issues.
Epic released a few of the original showcase demos, like the Elemental demo, in DX12; there was no noticeable difference in performance. That being said, Epic hasn't fully implemented DX12 into the engine. UE4.11 will be interesting because the entire build is geared towards renderer optimizations. It should bring a bunch of improvements to performance in UE titles. Epic recently announced that UT4 will be seeing a DX12 build soon, along with Fortnite. Should be interesting to see what Epic can do with it in their own games.
data/avatar/default/avatar22.webp
Considering I see Nvidia's logo on top of the shots, I reckon it features the absolute minimum amounts of async compute. Nvidia probably paid them millions to code their tests like that, to make it worth it. So, until the next generation of Nvidia GPUs that should have the flaw fixed, I reckon it tests only some of the features.
As written above by HH (I had missed his post), those logos were displayed on the screen by the projector and are not part of the movie/benchmark. The left one is the Galax GOC 2015 event logo; the second is there because Nvidia is a partner of the GOC event. http://i503.photobucket.com/albums/e435/lanek66/12/3DMark-Time-Spy-DirectX-12-Benchmark_3_zpsospxi9lg.jpg http://i503.photobucket.com/albums/e435/lanek66/12/3DMark-Time-Spy-DirectX-12-Benchmark_4_zps1mszqqs8.jpg http://i503.photobucket.com/albums/e435/lanek66/12/3DMark-Time-Spy-DirectX-12-Benchmark_5_zpsthpm8hn9.jpg