Unreal Engine 4 - Infiltrator Tech Demo

Can I shoot in this demo? No, so I don't like it.
www(dot)file-upload.net/download-10916825/INFILTRATOR-DX12.rar.html
> www(dot)file-upload.net/download-10916825/INFILTRATOR-DX12.rar.html
Thank you kind sir
> can i shoot in this demo so no i dont like it
Clearly it's not for users like you, sir. It's a demo, meaning demonstration; try to remember how long ago a playable demo with graphics like this was last released. UE4 / Epic Games demos (and their launcher) are not for everyone, of course.
*sigh* They should call it the Lens Flare and Chromatic Aberration demo instead. Edit: at least this time they didn't exaggerate the motion blur, though the depth of field almost crosses the line imo.
Actually I'll have to check it via the launcher. On reddit people wrote that they had to build it via the UE editor. That would be an awesome move from the authors, because it would let a lot of people learn from it and create better-looking graphics.
www(dot)file-upload.net/download-10916825/INFILTRATOR-DX12.rar.html
This file is based on the nVidia UE build, as it includes their libraries (just saying). In the console (opened with `) you can use stat fps and stat unitgraph to get a basic overview of fps and frame times.
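For reference, the stat commands mentioned above look like this in the UE4 console (a minimal list; exact availability depends on how the build was packaged):

```
stat fps        -  current and average frame rate
stat unit       -  frame, game, draw and GPU thread times in ms
stat unitgraph  -  the same numbers plotted as a rolling graph
```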
> Actually I'll have to check it via launcher. On reddit people wrote that they had to build it via UEditor. […]
The Kite demo and that stuff was all released, I'm sure Infiltrator will be too eventually.
Visually it is pretty impressive, at least what I could see via RDP. I used StartFPSChart and StopFPSChart, which dump a csv file with frame times. While GPU utilization via RDP was only between 40~80%, I got an average of 59.13 fps. Will rerun once I get home. I did notice frame-time spikes on scene changes (likely due to running it from an HDD instead of an SSD), and some frame-time anomalies, though I would say RDP is the culprit. Frame times in ms of adjacent frames: 10.23, 20.16, 10.85, 19.45, 10.41, 19.58, 10.27, 19.87, ... Plus, at the splash screen while it waits for spacebar, fps is locked, so I wonder how much it is locked afterwards.
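A quick way to sanity-check that alternating pattern from the dumped numbers (a hypothetical sketch using the frame times posted above; the real FPSChart csv has its own layout you would need to parse):

```python
# Frame times (ms) as posted above; in practice you would read them
# from the csv that StopFPSChart writes.
frame_times_ms = [10.23, 20.16, 10.85, 19.45, 10.41, 19.58, 10.27, 19.87]

# Average fps is total frames over total time, not the mean of per-frame fps.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Detect the odd/even alternation that points at RDP (or vsync) interference:
# compare the mean of even-indexed frame times with the mean of odd-indexed ones.
even = frame_times_ms[0::2]
odd = frame_times_ms[1::2]
alternating = abs(sum(even) / len(even) - sum(odd) / len(odd)) > 5.0  # ms

print(round(avg_fps, 2), alternating)  # -> 66.21 True
```

The 5 ms threshold is an arbitrary pick for illustration; the point is that every other frame in the sample is roughly twice as long as its neighbour, which no GPU workload naturally produces.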
> Visually it is pretty impressive. At least what I could see via RDP. […]
I don't think UE4 uses async shaders or even does much GPU compute stuff, but I'd like to see a DX11/DX12 comparison on the Fury X if you get time / get home. (RDP is probably going to affect some of the performance numbers.) A guy on reddit/nvidia did it on a 970 and saw about a 9% increase in performance, which I guess isn't bad for a free performance upgrade.
I find it funny that by the time you actually get to use a tech after its release, it's already worn out. Even still, it's pretty awesome to finally see it for ourselves on our own PCs.
> I don't think UE4 uses ASync shaders or even does much GPU compute stuff […]
You can write your own modules for compute. But this demo uses APEX, likely for the particle effects, and that runs on the CPU.
Why is it never possible to switch YouTube videos on guru3d to fullscreen? Is this some kind of advertisement thing, so you always have to see the rest of the page content?
> Why is it always not possible to switch YouTube videos on guru3d to fullscreen? […]
Whatever the site, when a YouTube video is embedded, just click on the YouTube logo to open it on YouTube itself.
> Whatever is the site, when a youtube vid is embedded, just click on the youtube logo, go in Youtube.
But there is a fullscreen button; it's just greyed out. I don't see the point of having to reopen the video on the YouTube page.
> I don't think UE4 uses ASync shaders or even does much GPU compute stuff […]
Damn, bad. DX11 average fps 70.85; DX12 average fps 69.37. GPU utilization in both... I do not even know, because at 31% fan speed the GPU gets to 41°C and stays there. In KF2 maxed, which looks worse, I get to 50~54°C. The weirdest thing is that in some parts DX11 does better and in others DX12 does better. Then I checked CPU utilization too: for both DX11 and DX12 it is practically the same, all 4 cores between 70~90%. Maybe more cores would help, but I really do not know what this DX12 implementation is supposed to achieve. http://i59.tinypic.com/27y04r4.png Btw, graph shows fps 🙂
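For what it's worth, the gap between those two averages works out as follows (simple arithmetic on the numbers quoted above, nothing more):

```python
dx11_avg_fps = 70.85
dx12_avg_fps = 69.37

# Relative deficit of DX12 versus DX11, in percent.
dx12_deficit_pct = 100.0 * (dx11_avg_fps - dx12_avg_fps) / dx11_avg_fps

# The same gap expressed as extra frame time per frame, in milliseconds.
dx11_ms = 1000.0 / dx11_avg_fps
dx12_ms = 1000.0 / dx12_avg_fps
extra_ms = dx12_ms - dx11_ms

print(round(dx12_deficit_pct, 2), round(extra_ms, 3))  # -> 2.09 0.301
```

So DX12 costs roughly 2% here, about 0.3 ms per frame, which is small enough to be noise but clearly not the uplift the 970 result suggested.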
When is guru3d gonna host the tech demo?
> Damn bad. DX11 average fps 70.85 ; DX12 average fps 69.37. […]
Seems like the 980Ti gets lower performance in DX12 too. Lame. I guess we have to wait till Fable, but if this is anything to go by, the results will probably be similar. Epic better step their **** up.
> Damn bad. DX11 average fps 70.85 ; DX12 average fps 69.37. […]
Wow, you get slightly worse performance than my single 970. Maybe it needs optimization (next driver?).
> Wow,you get a bit worse perf. than my single 970. Maybe it needs opt. (next driver ? ).
No, there is some weird sh*t going on. I went through the console and changed over 80 graphical settings, using slomo 0.01 to see the actual impact on the same scene. While I managed to bring render time practically down to game time, GPU time never changed. It is as if something else is hogging the GPU with an empty workload (I guess so from the low temperature I am getting even at 1440p). I even disabled AO, reflections, refractions, APEX, postprocess, AA, post AA, bloom, ... No change in render time at all. The only thing that actually affected it was changing the aniso filter level. (Apparently killing the gbuffer boosts fps, but most stuff is rendered there, so that is like disabling nearly everything.) What makes me think something is wrong with this tech demo, or with the way it was compiled, is the fps you get before and after the run: while it renders one line of text in 3D and then a simple logo, fps should go into the thousands. How can one line of text utilize the GPU at 100% while doing only 70-80 fps at 1080p fullscreen? I get over 120 fps in games based on UE4 with a lot of stuff on screen.
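For anyone who wants to repeat that experiment, these are the kinds of stock UE4 console variables involved (a partial list from standard UE4; this particular build may expose different ones):

```
slomo 0.01                  (slow the scene down to compare like for like)
stat unit                   (game / draw / GPU thread times)
r.AmbientOcclusionLevels 0  (disable SSAO)
r.SSR.Quality 0             (disable screen-space reflections)
r.PostProcessAAQuality 0    (disable post-process AA)
r.BloomQuality 0            (disable bloom)
r.MaxAnisotropy 0           (aniso filter level)
```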
> No, it is some weird sh*t there. […]
I'd guess it's just because this is a tech demo release -- not really meant to be used as a performance benchmark, just to show off the effects. A lot of hand-tuned optimizations are done in real game environments that probably weren't done for this demo.