NVIDIA: V-Sync OFF Not Recommended as a Global Setting Starting with Driver 461.09
This communiqué slipped through the cracks, but apparently NVIDIA no longer recommends V-Sync OFF as a global setting, starting with driver version 461.09.
One of the problems is that several Windows apps can then produce screen tearing (doh). NVIDIA has issued the following statement:
Beginning with NVIDIA driver version 461.09, setting V-Sync OFF as a global setting from the NVIDIA Control Panel will likely result in tearing even when viewing content in windowed mode.
When V-Sync is enabled, the application's frame rate is synchronized with the display refresh rate in order to eliminate tearing. Typically, tearing would occur when viewing applications in full-screen mode, so V-Sync is useful in eliminating tearing in this case.
Prior to driver version 461.09, only a single plane was available to the OS. The Desktop Window Manager (DWM) would composite the contents of all the windows and then present everything at the refresh rate cadence. This means that tearing was avoided by DWM design even without V-Sync.
However, driver versions 461.09 and later support multiplane overlay (MPO). (MPO provides benefits, such as reduced latencies, for gaming in windowed mode.) With multiplane overlay, the application is allowed to present its contents independently and with its own V-Sync setting. The OS, outside of driver control, decides which apps get promoted to their own MPO plane. If V-Sync is OFF, there will be tearing (even in windowed mode) because, as in full-screen mode, there is no longer a desktop compositor controlling the presentation. Setting V-Sync OFF now has the same effect for windowed apps as it does for full-screen apps.
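For context, this is roughly what that per-application V-Sync setting looks like from the application side. Below is a minimal DXGI sketch, assuming a flip-model swap chain already created with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING (the helper names are illustrative, not from NVIDIA's statement); the per-present sync interval is the application's own V-Sync choice, which a global driver override would force:

```cpp
// Minimal sketch of per-application V-Sync control through DXGI.
// Assumes a flip-model swap chain created with
// DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING; helper names are illustrative.
#include <dxgi1_5.h>

bool TearingSupported(IDXGIFactory5* factory)
{
    BOOL allowTearing = FALSE;
    // Reports whether the OS/driver can present without waiting for
    // vblank, a prerequisite for tearing presents in windowed mode.
    if (FAILED(factory->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                            &allowTearing,
                                            sizeof(allowTearing))))
        return false;
    return allowTearing == TRUE;
}

void PresentFrame(IDXGISwapChain4* swapChain, bool vsyncOn, bool tearingOk)
{
    if (vsyncOn)
    {
        // SyncInterval = 1: wait for one vertical blank; no tearing.
        swapChain->Present(1, 0);
    }
    else
    {
        // SyncInterval = 0: present immediately. On an MPO plane this
        // can tear even in a window, exactly as described above.
        swapChain->Present(0, tearingOk ? DXGI_PRESENT_ALLOW_TEARING : 0);
    }
}
```

With the Control Panel left at the application default, the interval passed here is honored; a global OFF override effectively disregards it, which is what produces the unexpected windowed tearing.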
To avoid unexpected tearing in windowed mode, NVIDIA recommends leaving the V-Sync global setting at the default "Use the 3D application setting".
Senior Member
Posts: 4698
Joined: 2008-09-07
I always prefer 'adaptive' for global.
Member
Posts: 81
Joined: 2016-05-29
I highly disagree. Everyone knows V-Sync introduces input lag, and many do not want that and do not care about tearing. On a high-refresh monitor the tearing is very subtle, and the benefit of improved input lag is major. So the best option is to default as always, and then let the user decide what is best for their own use, because each user likely knows best what they want and need, not someone else.
Senior Member
Posts: 3898
Joined: 2017-11-23
It never has been; there are applications that rely on being able to specify their own tearing and swap-mode methods, and that otherwise break.
Application default has always been the recommended setting; you then control the V-Sync OFF state either from the application side or via a driver profile.
Nvidia needs to remove the OFF state from the global setting and only expose it under profiles.
That is the most retarded thing I've heard in a long time... why the **** should they remove the option... it's entirely up to people themselves whether they want to make use of the option or not, and some do.
Senior Member
Posts: 1198
Joined: 2010-05-12
Input lag has always been the wrong term, imho.
What V-Sync introduces is visual lag / reaction lag.
You press a button at time X; the frame goes through rendering and lands in the back buffer at X + N, then moves to the front buffer at X + 2N.
The game engine could send that input to the network before the frames are done.
So you see it happening later, but the input itself is not suffering from lag. It is a visual lag issue, imho.
People think their inputs are not considered because the computer sits doing nothing while waiting for the next vsync, and that seems silly to me as a software developer. If it is true in some games, it is those games' fault.
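That argument is easy to see in a typical frame loop. A hypothetical sketch (all names here are illustrative stubs, not a real engine API): input is sampled, applied to the simulation, and can even be sent over the network before the present call, so a blocking V-Sync present only delays when you see the result, not when the input is processed:

```cpp
// Hypothetical frame loop illustrating the point above. With V-Sync ON,
// Present() blocks until the next vertical blank, but by then the input
// has already been sampled, applied, and sent. All functions other than
// Present() are illustrative stubs, not a real engine API.
#include <dxgi.h>

struct InputState { bool firePressed; };

InputState PollInput();                     // read devices at time X
void ApplyToSimulation(const InputState&);  // game state reacts immediately
void SendToNetwork(const InputState&);      // input can leave before display
void RenderFrame();                         // frame ready at roughly X + N

void FrameLoop(IDXGISwapChain* swapChain, bool running)
{
    while (running)
    {
        InputState in = PollInput();
        ApplyToSimulation(in);      // the button press is processed here...
        SendToNetwork(in);          // ...and can reach the server here,
        RenderFrame();              // before the frame is even finished.
        swapChain->Present(1, 0);   // V-Sync ON: blocks until vblank, so
                                    // you only *see* the result at X + 2N.
    }
}
```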
Senior Member
Posts: 13244
Joined: 2018-03-21
It never has been; there are applications that rely on being able to specify their own tearing and swap-mode methods, and that otherwise break.
Application default has always been the recommended setting; you then control the V-Sync OFF state either from the application side or via a driver profile.
Nvidia needs to remove the OFF state from the global setting and only expose it under profiles.