Quick comparison: GeForce 378.66 versus 378.78 DirectX 12 performance

These results are very different from mine; I saw much larger improvements. My FPS nearly doubled in some parts of the Ashes of the Singularity benchmark. Still testing other games. What CPU was being used for these tests?
What CPU was being used for these tests?
A fast-clocked 8-core / 16-thread processor (Core i7 5960X @ 4.3 GHz).
So Guru3D is comparing it to a different driver than the one Nvidia used, only to turn around and say the gains are not there. Right... I mean, why not do both, a recent driver and the old one? That way we could check Nvidia's claim, but also compare the gains against a newer release.
http://cdn.mos.cms.futurecdn.net/PagR5uepvmCDmhWMgXdU8o-650-80.png
I don't agree with Nvidia's testing methodology of measuring gains across titles against the GTX 1080 launch back in May last year. Fact is, some of these games have seen numerous updates from the developers' side too. The executable they used to evaluate performance initially is not the same executable being used to compare against this driver. That's a huge variable. And you'd have to be living under a rock to own a GTX 1080, have stuck with the launch drivers and never updated since.

Rise of the Tomb Raider didn't get a DirectX 12 patch until July, and it took quite a few patches after that initial DirectX 12 patch to get things working. In May last year, when the GTX 1080 launched, there was no DirectX 12 codepath for Rise of the Tomb Raider, at least not publicly. Tom Clancy's The Division got its DirectX 12 support last December, so how Nvidia got those DirectX 12 numbers back in May with the GTX 1080 launch drivers is beyond me. Same goes for Gears of War 4, which launched on October 11, 2016. I highly doubt the Ashes of the Singularity executable from May 2016 has gone without performance-improving updates since then.

Even if you install the GTX 1080 launch driver, which probably isn't a good idea on a Windows 10 Anniversary Update installation, some of the games Nvidia claims to have improved won't have a working "Game Ready" profile in the driver itself and might crash or glitch.
A fast-clocked 8-core / 16-thread processor (Core i7 5960X @ 4.3 GHz).
Well, that explains everything. I have to say this is not what I would expect from a professional site like Guru3D. Obviously the results are going to be better on more CPU-limited setups.
Well, that explains everything. I have to say this is not what I would expect from a professional site like Guru3D. Obviously the results are going to be better on more CPU-limited setups.
http://i.imgur.com/7FcGZsZ.png Come again? It's professional for a site to bench GPUs in a CPU-bottlenecked scenario, right? Hilbert should have opted for a Core i3 2100 instead, judging by your opinion.
Forza Horizon also runs the same.
Come again? It's professional for a site to bench GPUs in a CPU-bottlenecked scenario, right? Hilbert should have opted for a Core i3 2100 instead, judging by your opinion.
Exactly. The more CPU-bottlenecked, the better. That is the whole point of DX12 in the first place.
The Nvidia comparison is very misleading. You could argue that it's there in the small print, but I have to question the morality of making such headline claims. As for DX12 comparisons on a bottlenecked CPU, I can see why some people want this and the results would be interesting, although to be fair to Hilbert, most posters on the threads I read do not seem to fall into that category.
Any possibility of seeing a comparison on a previous-gen card, e.g. a 970 or 980? Wondering if driver releases still improve performance on our "old" cards 😛
Only Hitman has a big performance gain, also on Maxwell. I think the gains show up at higher resolutions plus extreme SSAA, like with ROTR.
Exactly. The more CPU bottlenecked, the better. That is the whole point of dx12 in the first place.
No, that is not the point of DirectX 12 at all. DirectX 12 doesn't magically reduce CPU bottlenecks; what it does is let the CPU be utilized better, by allowing it to feed the GPU more efficiently than DirectX 11 can. That is an oversimplification and there are variables involved. The image in my post above is from Nvidia's site, where they themselves used a Core i7 5930K and obtained those numbers at 4K. From the looks of Nvidia's claims, the roughly 16% gains across numerous titles are found at 4K, compared to the initial launch-driver performance.

I have a GTX 770 4GB and I didn't see any noticeable improvements in Deus Ex: Mankind Divided or Far Cry Primal. I don't have Rise of the Tomb Raider installed right now. I guess the boost probably doesn't apply to me anyway, since my card is an older architecture, but that's a different tale.

Frankly speaking, the claim of a 16% increase across these titles is BS. They improved Hitman, Rise of the Tomb Raider and The Division. Anything else "compared to the GTX 1080 launch driver" is a big lie; as I pointed out in a previous post, some of the games they list didn't even have a DirectX 12 executable at the time of the GTX 1080 launch.
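To make the "feed the GPU more efficiently" point concrete, here is a minimal sketch of my own (not anything from the article, the driver, or any particular engine): in D3D12, each worker thread can record its own command list against its own allocator, instead of funnelling every draw call through one immediate context the way D3D11 does. The worker count and the build line are just placeholder assumptions.

```cpp
// Build on Windows with something like: cl /EHsc /std:c++17 mt_cmdlists.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const unsigned kWorkers = 4;  // hypothetical worker-thread count
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(kWorkers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kWorkers);
    std::vector<std::thread> threads;

    // Each thread records into its own command list, in parallel with the others.
    for (unsigned i = 0; i < kWorkers; ++i) {
        threads.emplace_back([&, i] {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocators[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocators[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));
            // ... record draw/dispatch commands here ...
            lists[i]->Close();
        });
    }
    for (auto& t : threads) t.join();

    // The main thread submits all recorded lists in a single call.
    // (A real renderer would wait on a fence before releasing these objects.)
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());

    std::printf("Recorded and submitted %u command lists from %u threads\n",
                kWorkers, kWorkers);
    return 0;
}
```

Whether spreading the recording work like this actually helps still depends on the game and the driver, which is exactly why the gains vary so much per title.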
The Nvidia comparison is very misleading, you could argue that it's there in the smallprint but I have to question the morality of making such headline claims. As for DX12 comparisons on a bottlenecked cpu, I can see the reasoning for some people wanting this and the results would be interesting. Although to be fair to Hilbert, most posters on the threads I read do not seem fall into that category.
Just looking at the posts in this thread, a lot of people have Sandy Bridge, which is six years old at this point. For myself, I also have Phenom II X4 and Core 2 Quad systems, for which a DX12 path could be a game-changer. Even a highly overclocked i5 2500K is showing its age today.
No, that is not the point of DirectX 12 at all. DirectX 12 doesn't magically reduce CPU bottlenecks; what it does is let the CPU be utilized better, by allowing it to feed the GPU more efficiently than DirectX 11 can.
Yeah. In other words, it magically reduces CPU bottlenecks. :-P This benefits slower CPUs much more.
Well, that explains everything. I have to say this is not what I would expect from a professional site like Guru3D. Obviously the results are going to be better on more CPU-limited setups.
I don't understand what you mean. Using a 1000 USD CPU is unprofessional? Get a life.
Probably the same as AMD's claims of performance increases: all from the games being patched, very little from the drivers. I figure there'll be no real difference between the first driver and this one when run on fully patched games. It would be interesting to limit the CPU to 4 cores only, as the i5 is likely the most common baseline CPU.
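For anyone who wants to try that four-core scenario themselves, here is a rough sketch of one way to do it (my own, not anything Hilbert used): pin the benchmark process to a subset of logical processors with the Win32 affinity API. The tool name, PID handling and mask values below are purely illustrative.

```cpp
// Build with: cl /EHsc set_affinity.cpp
// Usage (hypothetical): set_affinity.exe <pid> <hex mask>
// e.g. a mask of FF pins the target to logical processors 0-7
// (four cores plus Hyper-Threading on a typical Intel layout).
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc != 3) {
        std::printf("usage: %s <pid> <hex affinity mask>\n", argv[0]);
        return 1;
    }
    DWORD pid = static_cast<DWORD>(std::strtoul(argv[1], nullptr, 10));
    DWORD_PTR mask = static_cast<DWORD_PTR>(std::strtoull(argv[2], nullptr, 16));

    // PROCESS_SET_INFORMATION is required to change another process's affinity.
    HANDLE proc = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                              FALSE, pid);
    if (!proc) {
        std::printf("OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }
    if (!SetProcessAffinityMask(proc, mask)) {
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        CloseHandle(proc);
        return 1;
    }
    std::printf("pid %lu restricted to affinity mask 0x%llx\n", pid,
                static_cast<unsigned long long>(mask));
    CloseHandle(proc);
    return 0;
}
```

The same thing can be done without compiling anything via `start /affinity FF benchmark.exe` from a command prompt; how logical processors map to physical cores depends on the CPU's topology, so the mask has to be picked accordingly.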
Well, IMO it's just another crappy PR stunt. I wish they made another R337-style driver boost; that one was quite good for me across various games.
Yeah. In other words, it magically reduces CPU bottlenecks. :-P This benefits slower CPUs much more.
It's not the same thing. A bottleneck is when your GPU is stalling because it is waiting on the CPU to process queues. A CPU that keeps the GPU above 99% utilization most of the time can still see gains going from DirectX 11 to DirectX 12, simply because it can send commands to the GPU in parallel and get more work done in the same time, at the same GPU usage. Games like Gears of War 4 see performance scaling beyond 4 cores; it's probably the only game I have seen in a long time that also makes use of Hyper-Threading. The minimum framerate on a 6500K at 4.5 GHz can't even touch a locked 6700 SKU, thanks to the CPU and its threads being utilized properly. But this is very title-specific.
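A toy illustration of that "more work done in the same time" point (my own sketch, nothing to do with any particular engine or benchmark): simulate a frame's worth of command-preparation work and split it across worker threads. The CPU side of the frame shrinks as threads are added even though the simulated workload per frame is identical; the draw-call count and per-call cost below are made-up numbers.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the CPU cost of preparing one draw call.
static void prepare_draw() {
    volatile double x = 0.0;
    for (int i = 0; i < 2000; ++i) x += i * 0.5;
}

// Time one simulated frame's submission work split across `threads` workers.
static double frame_time_ms(unsigned threads, unsigned draws_per_frame) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t) {
        pool.emplace_back([=] {
            // Each worker prepares its share of the frame's draw calls.
            for (unsigned d = t; d < draws_per_frame; d += threads) prepare_draw();
        });
    }
    for (auto& th : pool) th.join();
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}

int main() {
    const unsigned draws = 20000;  // arbitrary workload size
    for (unsigned threads : {1u, 2u, 4u, 8u}) {
        std::printf("%u thread(s): %.2f ms of CPU submission work per frame\n",
                    threads, frame_time_ms(threads, draws));
    }
    return 0;
}
```

On a real title the scaling stops as soon as the CPU is no longer the limit, which is exactly why the gains are so title- and settings-specific.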
Did they remove the mandatory login and telemetry spying?
So Hitman's DX12 has been optimised in this driver, but everything else is within the margin of error? Can't say I'm surprised, as DX12 has been one disappointment after another for me since it launched, but it's still disappointing that this new(-ish) API offers no real performance boost or graphical features over DX11 in the majority of games. It's like it exists just to sell Windows 10 (via exclusive DX12 Store games) and to benefit weak console hardware like the Xbox One...
So Hitman's DX12 has been optimised in this driver, but everything else is within the margin of error? Can't say I'm surprised, as DX12 has been one disappointment after another for me since it launched, but it's still disappointing that this new(-ish) API offers no real performance boost or graphical features over DX11 in the majority of games. It's like it exists just to sell Windows 10 (via exclusive DX12 Store games) and to benefit weak console hardware like the Xbox One...
DX11 launch games were the same. A lot of them only had a DX11 API path just for having AA with deferred lighting. After about a year of DX11 games, they really started to shine. Sadly it will take time, but I think DX12 will be around a long time; we are 7 years into DX11 titles now. I didn't expect anything mind-blowing graphics-wise from these games, and honestly didn't expect them to run better either. DX11 didn't really run better than DX10 or DX9 games. In fact, most of the time DX9 ran better because fewer graphics features were being used.