Download: NVIDIA Studio Driver 431.70 WHQL

Adds support for 30-bit color on GeForce and TITAN GPUs 😱
Astyanax:

Adds support for 30-bit color on GeForce and TITAN GPUs 😱
That's insane! Are there even any monitors/TVs that are capable of 30-bit color?
RavenMaster:

That's insane! Are there even any monitors/TVs that are capable of 30-bit color?
It's 10 bits per channel with no alpha. 32-bit is 24 bits of color plus 8 bits of alpha.
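To spell out the arithmetic, here is a rough sketch in plain C (nothing driver-specific, just the bit counts being discussed):

#include <stdio.h>

int main(void) {
    const int channels = 3;           /* R, G, B */
    int color24 = 8 * channels;       /* 8 bits per channel  = 24-bit color */
    int color30 = 10 * channels;      /* 10 bits per channel = 30-bit color */

    printf("24-bit color + 8-bit alpha  = %d-bit pixel\n", color24 + 8);  /* 32 */
    printf("30-bit color + 2 spare bits = %d-bit pixel\n", color30 + 2);  /* 32 */
    return 0;
}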
What does this mean in regard to the 10-bit color we are used to in monitors or TVs (or 8-bit for that matter), and is it even feasible to use on desktop monitors over DisplayPort 1.2?
It's 10-bit support for OpenGL workstation applications.
Moderator
Yep, driver OCD kicking in, INSTALLING! :P
In layman's terms, what exactly does this bring to the average gaming, browsing, odd-bit-of-messing-around-in-Photoshop Joe?
I would think nada, zilch, as most if not all users have 8-bit panels, certainly for gaming, but it might provide better image quality in creative applications once supported by your (professional) monitor. Until now, this was reserved for the expensive Quadro cards.
Moderator
The color depth setting in NVCP is bugged with this driver. It's greyed out at 8-bit, so I am unable to choose anything.
Possibly because that's the only depth supported on your current config?
WhiteLightning:

The color depth setting in NVCP is bugged with this driver. It's greyed out at 8-bit, so I am unable to choose anything.
10-bit mode is triggered when you run an application that requests a 10-bit OpenGL mode, in the specific cases this driver enables it for. I don't think it would do so anyway, since your G24 is 8-bit only.
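For context, this is roughly what "an application that requests a 10-bit OpenGL mode" looks like on Windows: it asks for 10 bits per channel through the WGL_ARB_pixel_format extension. A minimal sketch, assuming a dummy context already exists and wglChoosePixelFormatARB has been loaded from it (the helper name choose_30bit_format is just for illustration):

#include <windows.h>

/* Tokens from the WGL_ARB_pixel_format extension spec. */
#define WGL_DRAW_TO_WINDOW_ARB 0x2001
#define WGL_SUPPORT_OPENGL_ARB 0x2010
#define WGL_DOUBLE_BUFFER_ARB  0x2011
#define WGL_PIXEL_TYPE_ARB     0x2013
#define WGL_TYPE_RGBA_ARB      0x202B
#define WGL_RED_BITS_ARB       0x2015
#define WGL_GREEN_BITS_ARB     0x2017
#define WGL_BLUE_BITS_ARB      0x2019
#define WGL_ALPHA_BITS_ARB     0x201B

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)(
    HDC, const int *, const FLOAT *, UINT, int *, UINT *);

int choose_30bit_format(HDC hdc, PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, TRUE,
        WGL_SUPPORT_OPENGL_ARB, TRUE,
        WGL_DOUBLE_BUFFER_ARB,  TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,   10,  /* 10 bits per color channel...          */
        WGL_GREEN_BITS_ARB, 10,
        WGL_BLUE_BITS_ARB,  10,
        WGL_ALPHA_BITS_ARB,  2,  /* ...leaving 2 bits of alpha in 32 bits */
        0                        /* terminator */
    };
    int format = 0;
    UINT count = 0;

    /* Before this driver, GeForce/TITAN would only hand back 8-bit
       formats here; Quadro cards honored the 10-bit request. */
    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count) || count == 0)
        return 0;    /* no 10-bit format available */
    return format;   /* pass to SetPixelFormat() as usual */
}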
Moderator
Astyanax:

10-bit mode is triggered when you run an application that requests a 10-bit OpenGL mode, in the specific cases this driver enables it for. I don't think it would do so anyway, since your G24 is 8-bit only.
+
er557:

Possibly because that's the only depth supported on your current config?
I've installed it on my living room PC, which is connected to a 10-bit 4K HDR TV. It is a problem for Windows though, but usually I can set it to 8-bit or 12-bit.
Sounds like a bug, or at least a limitation in their 10-bit Studio support.
As a professional photographer, I’ve been waiting a long time for this. It wasn’t worth buying a workstation card as I also like to game. This makes it easier to see smooth transitions in the tones. I feel like they were certainly holding this feature back, but AMD started providing this a while back. It’s almost like I got a free “upgrade”, so I’ll take it.
RavenMaster:

That's insane! Are there even any monitors/TVs that are capable of 30-bit color?
They've been around for years. 10-bit displays are called that because they support 10 bits per color channel. There are three color channels in total (red, green, and blue, aka RGB). Three times 10 equals 30. That's what "30-bit" means in this driver. So in a sense, 10-bit color and 30-bit color are the same thing, just spelled differently. I suppose the professional photography and video crowd have their own nomenclature. We call it 10-bit, they call it 30-bit.
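The shade math behind that, as a trivial C sketch (just the arithmetic implied above, not from any driver):

#include <stdio.h>

int main(void) {
    long shades8  = 1L << 8;    /* 256 shades per channel   */
    long shades10 = 1L << 10;   /* 1,024 shades per channel */

    /* ~16.7 million vs ~1.07 billion total colors */
    printf("24-bit: %ld colors\n", shades8 * shades8 * shades8);
    printf("30-bit: %ld colors\n", shades10 * shades10 * shades10);
    return 0;
}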
I have a question. As someone who both plays the latest games and uses 3D modeling software on a daily basis, should I install both this and the gaming driver? Or does either of them include the optimizations of the other?
@Astyanax I don't need to start anything for the setting to appear or to be able to use it. The panel offers 8-bit after driver install; as long as I'm on 60Hz, I can switch to 10-bit...
fry178:

@Astyanax I don't need to start anything for the setting to appear or to be able to use it. The panel offers 8-bit after driver install; as long as I'm on 60Hz, I can switch to 10-bit...
You're confused about what's happening here. This has nothing to do with your ability to set your screen to 10-bit. It has to do with OpenGL workstation applications previously having no 10-bit window support in the OpenGL client driver on GeForce and TITAN parts; they were artificially restricted to 8-bit.
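One way an application can check what it actually got, sketched with the legacy GL queries (fine for a sanity check once a context is current):

#include <stdio.h>
#include <GL/gl.h>

/* Call with an OpenGL context current. Reports the real framebuffer
   depth: 8/8/8 when the driver fell back to 8-bit, 10/10/10 when the
   30-bit path is actually in effect. */
void print_framebuffer_depth(void)
{
    GLint r = 0, g = 0, b = 0;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    printf("framebuffer: R%d G%d B%d (%d-bit color)\n", r, g, b, r + g + b);
}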
Just in case: the Studio driver also works with Maxwell GPUs (GTX 970, 980, the mobile versions, etc.), and they are supposed to support 10-bit as well! You just need to mod the nv_dispsi.inf file (in the Display.Driver folder after extraction) and disable driver signature enforcement. Note: for now I can't try the 10-bit workflow in Premiere Pro CC, as I don't have a 10-bit monitor with DisplayPort at home... just a 4K HDR TV with an HDMI port and a laptop screen. But if someone wants a modded INF to try that, let me know : )
RealNC:

They've been around for years. 10-bit displays are called that because they support 10 bits per color channel. There are three color channels in total (red, green, and blue, aka RGB). Three times 10 equals 30. That's what "30-bit" means in this driver. So in a sense, 10-bit color and 30-bit color are the same thing, just spelled differently. I suppose the professional photography and video crowd have their own nomenclature. We call it 10-bit, they call it 30-bit.
Thanks for the lesson mate 😉