Download: GeForce 441.12 WHQL drivers

Is this driver missing profiles, or is something messed up on my end? I can't find profiles for recent releases in this driver, like RDR2, The Outer Worlds, or Borderlands 3. ETA: OK, got it sorted. Windows was misbehaving and did its own forced "update" in the background to the 430 driver. Will DDU and reinstall.
As for driver performance, from my benchmark and game FPS numbers, the three best drivers so far for my Pascal GTX 1060 are: 1. 398.36, 2. 411.70, 3. 399.24, ranked from best to worst of the best. Drivers from R416 to R430 are also good, but a bit slower than the best three. From R431 on, performance is much worse across benchmarks and games.
I see a lot of complaints about non-Turing GPU performance drops. Do you have any serious comparison or something that would confirm this? I own a 1080 Ti and I didn't see any performance drop, but I'm not as careful as I once was, lol. Are they doing the Kepler downgrade all over again? I remember that story.
Yopfraise:

I see a lot of complaints about non-Turing GPU performance drops. Do you have any serious comparison or something that would confirm this? I own a 1080 Ti and I didn't see any performance drop, but I'm not as careful as I once was, lol. Are they doing the Kepler downgrade all over again? I remember that story.
It's a conspiracy theory based on no facts: people who paid too much for Volta cards believe NVIDIA is taking performance away, when in fact that card was never meant for gaming.
jwb1:

It's a conspiracy theory based on no facts: people who paid too much for Volta cards believe NVIDIA is taking performance away, when in fact that card was never meant for gaming.
I thought that at first too, but then I saw a lot of the same complaints coming from GTX 1000 users as well, which really gives me a tingle (not the good ASMR kind).
I just pre-ordered RDR 2 and am waiting for the release. I have a 1070 Ti; should I update my card with the new driver?
Yiğit:

I just pre-ordered RDR 2 and am waiting for the release. I have a 1070 Ti; should I update my card with the new driver?
"This latest GeForce Game Ready driver delivers day-one support for Red Dead Redemption 2..."
densou:

fix'd [ ain't TROLLING, that's a fact #same-ol'-nvidia ]
I've got performance data for games and benchmarks going back to when I purchased my 1080 Ti at launch. It's been in the same system at the same clocks with the same processor the entire time. Every single metric I have tested is either the same or slightly higher, and that's across OS and driver updates through the period. The claims of a performance regression via the drivers are trolling, and you should really stop the trolling, thanks. The only thing significantly affecting performance was the Spectre and Meltdown mitigations, which I have disabled. Have you and your fellow conspiracy theorists ruled that out? Noticed they released a Studio Driver, same version, slightly larger. Anyone got the deets on the difference?
So enlighten me: why can I observe, for example, an almost 450-point difference in Fire Strike on my laptop? GTX 1060, 88 W TDP (core OC +240, mem +860 MHz), 32 GB RAM 3200 MHz CL17, 7700HQ OC'd to 3502 MHz on all cores, using TGC liquid metal on CPU and GPU. Always the same clocks and temps for tests, always the same OS settings, etc. Example: driver 441.08, Fire Strike graphics max score 13868; driver 411.70, Fire Strike graphics 14272. I can give you more if you want. It's not a conspiracy but fact, and it may vary from setup to setup, man.
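For scale, the gap cited above works out to roughly a 2.8% drop between the two drivers. A quick sketch of the arithmetic, using the scores from the post:

```python
# Quick arithmetic on the Fire Strike graphics scores quoted above.
old_score = 14272  # driver 411.70
new_score = 13868  # driver 441.08
drop_pct = (old_score - new_score) / old_score * 100
print(f"{drop_pct:.1f}% regression")  # prints "2.8% regression"
```

Whether ~3% is noise or a real trend is exactly what the thread is arguing about; a single pair of runs can't settle it either way.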
HeavyHemi:

I've got performance data for games and benchmarks going back to when I purchased my 1080 Ti at launch. It's been in the same system at the same clocks with the same processor the entire time. Every single metric I have tested is either the same or slightly higher, and that's across OS and driver updates through the period. The claims of a performance regression via the drivers are trolling, and you should really stop the trolling, thanks. The only thing significantly affecting performance was the Spectre and Meltdown mitigations, which I have disabled. Have you and your fellow conspiracy theorists ruled that out? Noticed they released a Studio Driver, same version, slightly larger. Anyone got the deets on the difference?
Yes, I do consider that. And knowing facts does not make one a conspiracy theorist. Second, there is no need to be constantly rude to others or to undermine their opinions. I have contributed generous how-tos on this forum and was also responsible for 3- and 4-way SLI on Pascal, posted on this forum first. Contributing to a forum is just as important as what you take away.

I have only seen a decrease in performance on Volta since the 399.xx series drivers, and I have posted observations and findings in most driver releases since then. Titan V is in the gaming driver release and not in the Quadro driver release, so this all seems like normal discussion for a forum. Lastly, I have a powerful and complicated build; my understanding of how the hardware and software work might be different from yours.

1st - I don't enable mitigations in the OS or BIOS. From Microsoft: on Windows Server and Enterprise LTSC, under HKLM in Session Manager\Memory Management, add the two DWORD values that disable ALL mitigations. I think it's FeatureSettingsOverrideMask and FeatureSettingsOverride, both set to "3" - I'll verify when I get home from work.
2nd - The NVIDIA driver is out of my control for mitigations.
3rd - All the mitigations address lab research into speculative execution, Meltdown, etc. Not a real scenario to me, and I don't feel like having my build hobbled, especially since array/RAID performance is crippled by mitigation measures. I have everything backed up and I'm not concerned.
4th - If you are concerned, then run the app with SGX on in the BIOS, or in a VM. End of story. YOU don't have to follow my example; the choice is mine.

However, the issue is decreasing performance on Volta and Titan Xp (for me), and with the passage of time I expect drivers to increase performance, not decrease it. Second, stability should increase, not decrease, as in the case of the nvcpl issue many, many users have reported once NVIDIA got into 430.xx and above on the STANDARD driver, due to NVIDIA using the Microsoft Store for telemetry and RAS.
Lastly, on ALL telemetry, I've said this before: polite is an OPT IN; rude is having to OPT OUT; disrespectful is no OPT IN or OPT OUT. Cheers.

Side note: if you don't want Microsoft telemetry, you'll need Win 10 Enterprise LTSC or 2019 Server LTSC. There are a handful of ways to turn it off without registry hacks; it just requires some know-how, knowing the choices, and being able to use local policy, the .msc snap-ins and gpedit, DCOM, etc. Turn off the update feature and use a tool such as WSUS to pick what and when to apply updates. Those settings do not work in regular Windows 10. I would hate turning on my PC and not feeling like I was in control of something I built and own.

Build:
Asus C621E Sage
2x Xeon 8180M
768 GB of LRDIMM RAM
4x Titan V, including Titan V CEO edition 32 GB
960 Pro NVMe (OS)
10x SSD RAID (4 TB each, Samsung 860 Pro, 40 TB total)
1600 W PSU (digital, noiseless) and 600 W PSU (digital) assist
MS 2019 Data Center, Ubuntu (no mitigations)
P1 mini-ITX case - small, dense/compact, runs very cool and very, very quiet
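For reference, the registry tweak mentioned in the post above is documented by Microsoft (KB4073119); the exact value names are FeatureSettingsOverride and FeatureSettingsOverrideMask, both set to 3 to disable the Spectre/Meltdown mitigations. As a `.reg` file (apply at your own risk; a reboot is required):

```
Windows Registry Editor Version 5.00

; Disables Spectre/Meltdown mitigations per Microsoft's KB4073119 guidance.
; Both values must be 3; delete them to restore the default (enabled) state.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"FeatureSettingsOverride"=dword:00000003
"FeatureSettingsOverrideMask"=dword:00000003
```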
Is it normal now, for the past two drivers, for NVIDIA V-SYNC to cap my 144 Hz G-SYNC monitor to 138.3?
Yopfraise:

Are they doing the Kepler downgrade all over again?
Nothing has ever been downgraded. If Volta ever performed better, it was because it was failing to render parts of games due to a bug or a badly implemented performance optimization. Volta is not meant for gaming; the chip's raster pipeline is actually supposed to be slower core for core than Pascal at the same clocks, and it only pulls ahead through brute force. As far as I remember, the pipeline also has a penalty on context switches. That's not to mention that if you need FP16 rendering tasks performed, the Tensor-based Volta design cannot run them in parallel with FP32.
mahanddeem:

Is it normal now, for the past two drivers, for NVIDIA V-SYNC to cap my 144 Hz G-SYNC monitor to 138.3?
Yes
Just over a month till RDR2 gets on Steam. Can't wait.
mahanddeem:

Is it normal now, for the past two drivers, for NVIDIA V-SYNC to cap my 144 Hz G-SYNC monitor to 138.3?
When G-SYNC + NVCP V-SYNC is combined with Low Latency Mode "Ultra," yes. As a side note, G-SYNC + Fast Sync has been exhibiting this auto-capping behavior for years (the Ultra combo appears to do something similar without the recurring triple-buffer microstutter). Whether it is a direct substitute for a dedicated FPS limiter (for the purposes of avoiding V-SYNC input lag while staying within the G-SYNC range) remains to be seen, so I can't 100% recommend it for that purpose currently.
People should stop whining and crying about how badly NVIDIA is treating them. Just take action: upgrade your old GPU, or sell all your NVIDIA cards and buy the competition if there is a problem with your purchase and you are not satisfied with it. Some of you might even have warranty and could exchange your current GPU at low or zero cost. For me, NVIDIA has never been better: new tech to try and be happy with. If you like it, keep it; if you don't, then you know what to do - just buy a non-NVIDIA GPU. If you paid a lot for your GPU and are not getting what you were looking for, then it's time to abandon the NVIDIA ship.
venturi:

Yes, I do consider that. And knowing facts does not make one a conspiracy theorist. Second, there is no need to be constantly rude to others or to undermine their opinions. I have contributed generous how-tos on this forum and was also responsible for 3- and 4-way SLI on Pascal, posted on this forum first. Contributing to a forum is just as important as what you take away.

I have only seen a decrease in performance on Volta since the 399.xx series drivers, and I have posted observations and findings in most driver releases since then. Titan V is in the gaming driver release and not in the Quadro driver release, so this all seems like normal discussion for a forum. Lastly, I have a powerful and complicated build; my understanding of how the hardware and software work might be different from yours.

1st - I don't enable mitigations in the OS or BIOS. From Microsoft: on Windows Server and Enterprise LTSC, under HKLM in Session Manager\Memory Management, add the two DWORD values that disable ALL mitigations. I think it's FeatureSettingsOverrideMask and FeatureSettingsOverride, both set to "3" - I'll verify when I get home from work.
2nd - The NVIDIA driver is out of my control for mitigations.
3rd - All the mitigations address lab research into speculative execution, Meltdown, etc. Not a real scenario to me, and I don't feel like having my build hobbled, especially since array/RAID performance is crippled by mitigation measures. I have everything backed up and I'm not concerned.
4th - If you are concerned, then run the app with SGX on in the BIOS, or in a VM. End of story. YOU don't have to follow my example; the choice is mine.

However, the issue is decreasing performance on Volta and Titan Xp (for me), and with the passage of time I expect drivers to increase performance, not decrease it. Second, stability should increase, not decrease, as in the case of the nvcpl issue many, many users have reported once NVIDIA got into 430.xx and above on the STANDARD driver, due to NVIDIA using the Microsoft Store for telemetry and RAS.
Lastly, on ALL telemetry, I've said this before: polite is an OPT IN; rude is having to OPT OUT; disrespectful is no OPT IN or OPT OUT. Cheers.

Side note: if you don't want Microsoft telemetry, you'll need Win 10 Enterprise LTSC or 2019 Server LTSC. There are a handful of ways to turn it off without registry hacks; it just requires some know-how, knowing the choices, and being able to use local policy, the .msc snap-ins and gpedit, DCOM, etc. Turn off the update feature and use a tool such as WSUS to pick what and when to apply updates. Those settings do not work in regular Windows 10. I would hate turning on my PC and not feeling like I was in control of something I built and own.

Build:
Asus C621E Sage
2x Xeon 8180M
768 GB of LRDIMM RAM
4x Titan V, including Titan V CEO edition 32 GB
960 Pro NVMe (OS)
10x SSD RAID (4 TB each, Samsung 860 Pro, 40 TB total)
1600 W PSU (digital, noiseless) and 600 W PSU (digital) assist
MS 2019 Data Center, Ubuntu (no mitigations)
P1 mini-ITX case - small, dense/compact, runs very cool and very, very quiet
And yet, where is your data? I told you precisely what my parameters were: exact same system, same clocks. The variables are the drivers and the OS. All my metrics are within the margin of error over a period of nearly three years; they are either flat or slightly higher. Your claimed contributions have zero to do with this thread or your claims. Your deflection to personalities and your feelings, versus actually addressing FACTS, which you claim to hold dear, is telling. And seriously, I don't need your simplistic suggestions on what OS to use or how to configure it. Oh well.
Krzyslaw:

So enlighten me: why can I observe, for example, an almost 450-point difference in Fire Strike on my laptop? GTX 1060, 88 W TDP (core OC +240, mem +860 MHz), 32 GB RAM 3200 MHz CL17, 7700HQ OC'd to 3502 MHz on all cores, using TGC liquid metal on CPU and GPU. Always the same clocks and temps for tests, always the same OS settings, etc. Example: driver 441.08, Fire Strike graphics max score 13868; driver 411.70, Fire Strike graphics 14272. I can give you more if you want. It's not a conspiracy but fact, and it may vary from setup to setup, man.
What may vary from setup to setup? The argument is not that one driver or another may show a ~1% variance like you're citing, but a LONG-TERM, SIGNIFICANT DECLINE IN PERFORMANCE OVER TIME; i.e., GIMPING ON PURPOSE TO SELL NEWER GPUs. Either you buy into that or you do not. State your position. Thanks.
It sucks for Volta owners if the drivers have regressed, but it's not surprising, since those cards were never sold as gaming cards, and NVIDIA might do only minimal testing of their gaming performance. I wouldn't be surprised either if tests showed first-generation Maxwell GPUs have regressed. Kepler, Maxwell 2, and Pascal are a lot more widespread, so it makes sense that NVIDIA gets most of its reports about those cards and prioritizes internal tests for them.
jwb1:

It's a conspiracy theory based on no facts: people who paid too much for Volta cards believe NVIDIA is taking performance away, when in fact that card was never meant for gaming.
Conspiracy maybe for you, but not for me. I use GPUs for rendering, and newer drivers are just slower at rendering on Pascal GPUs; 388.13 or 399.24 have been best for me in rendering. I can't compare in other situations like gaming, because at that time I didn't play a lot of games. Hope this helps. Thanks, Jura
jorimt:

When G-SYNC + NVCP V-SYNC is combined with Low Latency Mode "Ultra," yes. Whether it is a direct substitute for a dedicated FPS limiter (for the purposes of avoiding V-SYNC input lag while staying within the G-SYNC range) remains to be seen, so I can't 100% recommend it for that purpose currently.
I've noticed this too, and it's awesome that I can finally stop running RivaTuner if this is a permanent addition. It just seems odd that it's 138 instead of 142.
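As a rough illustration of why the cap might land near 138 rather than 142: the reported number is consistent with the driver padding each refresh interval by a small fixed margin so frames stay inside the G-SYNC range. The ~0.3 ms figure below is an assumption for illustration, not a documented NVIDIA constant:

```python
# Hypothetical sketch: a fixed per-frame margin added to the refresh interval
# would produce an auto-cap just below the display's refresh rate.
def auto_cap(refresh_hz: float, margin_ms: float = 0.3) -> float:
    """Frame rate after adding `margin_ms` to each refresh interval (assumed model)."""
    frame_time_ms = 1000.0 / refresh_hz  # ~6.94 ms at 144 Hz
    return 1000.0 / (frame_time_ms + margin_ms)

print(round(auto_cap(144), 1))  # lands near 138, in line with the reported cap
```

A fixed percentage offset (e.g. capping a few percent below refresh) would fit the observation about as well; only NVIDIA knows the actual rule.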