NVIDIA: V-Sync OFF Not Recommended as a Global Setting Starting

@NiColaoS Any 144 Hz monitor can also be set to 120 Hz, 100 Hz, and sometimes 90 Hz, besides 75 Hz. So even without V-Sync you can still gear the refresh rate to your machine. I have an old i7 990X that couldn't hold 144 fps in modern WoW, but it does 100 most of the time.
Noisiv:

Yes, acknowledgment by the server is what matters, not just what is drawn on screen. So even if something is late on screen, it can still be registered earlier by the server. But you send your input command toward the server only as a reaction to what you're seeing on screen, and if the on-screen image is late due to V-Sync, the end result will be late regardless of the server registering it in a timely manner.
Yes, but my main point is that people call it `input lag` while it's the whole experience that is lagging. And what happens is that you then browse forums full of people arguing: `The VsYnC MakKESS my keYboArd go slower`.
Well, they kind of forgot to describe the interaction with adaptive sync here, because a regular screen nowadays has G-Sync/FreeSync, and there people quite often disable V-Sync. Then there is that option in Windows settings. Spoiler: how does it interact with this setting now?

[attached screenshot of the Windows graphics settings option]
Will the above apply to those windowed applications too? Will there be some kind of composition deadline in terms of milliseconds? (Like: wait 1 ms after the main window finishes its redraw until the others have to be ready, or the image is presented to the driver/screen?)
asturur:

Yes, but my main point is that people call it `input lag` while it's the whole experience that is lagging. And what happens is that you then browse forums full of people arguing: `The VsYnC MakKESS my keYboArd go slower`.
I think the proper name is "motion-to-photon delay", and you are quite right in your depiction of it.
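For anyone who wants rough numbers on that delay, here is a minimal back-of-the-envelope sketch; every figure in it is an assumption chosen for illustration (60 Hz display, double-buffered V-Sync), not a measurement:

```python
# Rough motion-to-photon estimate. All numbers are illustrative assumptions,
# not measurements: a 1000 Hz mouse, a 60 Hz display, double-buffered V-Sync.
REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ          # ~16.7 ms per refresh

stages_ms = {
    "input sampling (1000 Hz mouse, avg)": 0.5,
    "game/CPU simulation of the frame":    4.0,
    "GPU render into the back buffer":     8.0,
    "wait for next vblank (avg, V-Sync)":  FRAME_MS / 2,
    "scanout to mid-screen (avg)":         FRAME_MS / 2,
    "panel pixel response":                4.0,
}

total = sum(stages_ms.values())
for name, ms in stages_ms.items():
    print(f"{name:40s} {ms:5.1f} ms")
print(f"{'estimated motion-to-photon delay':40s} {total:5.1f} ms")
```

Turning V-Sync off mainly removes the wait-for-vblank term and lets a fresher frame reach scanout, which is why the win is a handful of milliseconds rather than the whole pipeline.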
asturur:

NVIDIA Reflex, I think, defaults to Fast Sync, which is a sort of fake V-Sync: unlimited on the CPU/GPU side, then synced in another way on the screen. So you render all the frames as with V-Sync off, without being limited by the triple buffer / double back buffer, and you are effectively using V-Sync off with no tearing, which for most people is a solution.
Not 100% sure about that. If that were the case, a per-game implementation of NVIDIA Reflex wouldn't be needed; it could be turned on via the driver anyway. It can't be only triple buffering, as I have always noticed that triple buffering, even with a framerate lock, is kind of inconsistent, and I don't get the same feeling with NVIDIA Reflex + V-Sync.
asturur:

"Input lag" has always been the wrong term, imho. What V-Sync introduces is visual lag / reaction lag. You hit a button at time X, the frame goes through rendering and is in the back buffer at X + N, then goes to the front buffer at X + 2N. The game engine could send that input over the network before the frames are done. So you see it happening later, but the input itself is not suffering from lag; it's a visual lag issue, imho. People think their inputs are not registered because the whole computer does nothing while waiting for the next vsync, and that seems silly to me as a software developer. If it is true in some games, it is those games' fault.
You are right, it's the display of the image and not the input itself that is delayed. However, eye-hand coordination suffers when the motion you make takes more time to show up on screen, which is why it is referred to as input lag. It does not help that the PC knows where your crosshair and your opponent currently are when you don't. The "input lag" that is created makes it much harder to track a moving target for two reasons: 1) the enemy is in reality no longer where you see him, since your monitor is already showing an old image that is not up to date; 2) when you flick the mouse you will not see the motion until a few frames later. How are you then supposed to hit the fire button exactly when the crosshair meets the player? By the time you see them match, both the player and your crosshair have already moved on for a couple of frames. This is for gaming, but a snappy, accurate update of the mouse on screen is a benefit in almost every task; the punishment for high delay is just not as severe in most Windows tasks.
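To put an illustrative number on point 1, a tiny calculation with assumed figures (the target speed and refresh rate are not from the post above):

```python
# How far "behind reality" the on-screen target is, given some frames of
# display delay. Speed and delay values are assumptions for illustration only.
REFRESH_HZ = 60
FRAME_S = 1.0 / REFRESH_HZ

target_speed_px_per_s = 800          # assumed on-screen speed of the enemy
for frames_of_delay in (1, 2, 3):
    delay_s = frames_of_delay * FRAME_S
    stale_px = target_speed_px_per_s * delay_s
    print(f"{frames_of_delay} frame(s) of delay = {delay_s*1000:4.1f} ms "
          f"-> target drawn ~{stale_px:.0f} px behind where it really is")
```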
Camaxide:

You are right, it's the display of the image and not the input itself that is delayed. However, eye-hand coordination suffers when the motion you make takes more time to show up on screen, which is why it is referred to as input lag. It does not help that the PC knows where your crosshair and your opponent currently are when you don't. The "input lag" that is created makes it much harder to track a moving target for two reasons: 1) the enemy is in reality no longer where you see him, since your monitor is already showing an old image that is not up to date; 2) when you flick the mouse you will not see the motion until a few frames later. How are you then supposed to hit the fire button exactly when the crosshair meets the player? By the time you see them match, both the player and your crosshair have already moved on for a couple of frames. This is for gaming, but a snappy, accurate update of the mouse on screen is a benefit in almost every task; the punishment for high delay is just not as severe in most Windows tasks.
I still cannot explain how one frame can make that difference. With V-Sync off, all you gain is not waiting for the next refresh, possibly drawing halves (or more pieces) of different frames on the screen. Or is there some waiting with V-Sync that I do not understand?
asturur:

I still cannot explain how one frame can make that difference. With V-Sync off, all you gain is not waiting for the next refresh, possibly drawing halves (or more pieces) of different frames on the screen. Or is there some waiting with V-Sync that I do not understand?
Depending on the pressure from the swap chain it can feel like two frames.
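A minimal sketch of what that swap-chain pressure means, with assumed numbers and a deliberately simplified queueing model (not how any specific driver schedules presents): once the queue is full, every buffered frame adds a whole refresh of age to what finally reaches the screen.

```python
# Illustrative queueing model, not a driver implementation: frames that are
# finished early sit in the swap chain until a vblank frees a slot.
FRAME_MS = 1000 / 60                 # 60 Hz display
render_ms = 5.0                      # assumed GPU render time per frame
max_queued = 2                       # e.g. one back buffer + one pre-rendered frame

# With V-Sync on and the queue full, a new frame waits behind every frame ahead of it.
latency_vsync_on = render_ms + max_queued * FRAME_MS
# With V-Sync off the freshest frame reaches scanout (with tearing) as soon as it is ready.
latency_vsync_off = render_ms

print(f"V-Sync on, queue of {max_queued}: ~{latency_vsync_on:.1f} ms from finished input to photons")
print(f"V-Sync off:            ~{latency_vsync_off:.1f} ms (plus tearing)")
```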
ObscureangelPT:

Not 100% sure about that. If that were the case, a per-game implementation of NVIDIA Reflex wouldn't be needed; it could be turned on via the driver anyway. It can't be only triple buffering, as I have always noticed that triple buffering, even with a framerate lock, is kind of inconsistent, and I don't get the same feeling with NVIDIA Reflex + V-Sync.
I think Fast Sync has nothing to do with triple buffering; it just keeps updating the back framebuffer unthrottled, and then at the right moment it puts the newest one in the front buffer. NVIDIA describes it as acting like V-Sync off internally.
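A small simulation of that idea as a concept sketch (not NVIDIA's implementation): the renderer runs unthrottled, and at each vblank only the most recently completed frame is flipped to the front buffer, so the displayed image stays young without tearing.

```python
# Conceptual Fast Sync simulation, not NVIDIA's actual implementation.
# The GPU renders unthrottled; at each vblank the newest finished frame wins.
REFRESH_HZ = 60
VBLANK_S = 1.0 / REFRESH_HZ
RENDER_S = 1.0 / 300.0               # assume the GPU can do ~300 fps

newest_finished = None               # (frame_id, finish_time) of latest completed frame
frame_id = 0
next_render_done = RENDER_S
ages = []

for vblank in range(1, 11):          # simulate ten refreshes
    vblank_t = vblank * VBLANK_S
    # complete every frame that finishes before this vblank
    while next_render_done <= vblank_t:
        frame_id += 1
        newest_finished = (frame_id, next_render_done)
        next_render_done += RENDER_S
    # flip: only the newest completed frame reaches the front buffer
    fid, done_t = newest_finished
    ages.append((vblank_t - done_t) * 1000)

print(f"average age of the displayed frame: {sum(ages)/len(ages):.2f} ms "
      f"(vs ~{1000/REFRESH_HZ:.1f} ms budget per refresh)")
```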
@asturur If that were the case, GPU usage would stay at 99%, giving everything it's got, but that's not what happens. GPU usage drops to 60/70%, since that's all the usage it needs to reach the framerate threshold. Although Warzone also features two types of Reflex: ON (the one that I use), and a higher mode which claims to do what you describe.
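That observation matches simple arithmetic; with purely illustrative numbers (not Warzone measurements), a capped GPU only works for part of each frame interval:

```python
# Why a framerate cap lowers GPU usage: illustrative numbers, not measurements.
gpu_render_ms = 4.5                  # assumed time the GPU needs per frame
fps_cap = 138                        # assumed Reflex/driver framerate cap
frame_budget_ms = 1000 / fps_cap     # ~7.2 ms between capped frames

usage = gpu_render_ms / frame_budget_ms
print(f"approximate GPU usage at the cap: {usage:.0%}")   # ~62%
```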
Astyanax:

It never has been; there are applications that rely on being able to specify their own tearing and swap-mode methods and that otherwise break. "Application default" has always been the recommended setting, with the V-Sync off state then controlled either from the application side or a driver profile. NVIDIA needs to remove the off state from the global setting and only expose it under profiles.
They could maybe use the global setting the way AMD does and have it apply only to OpenGL here, while keeping the override itself as an advanced, per-profile option for users who want the state disabled regardless of compatibility concerns. That way it can't be left off globally if some software is entirely incompatible with immediate presentation mode, yet the option remains for those who want it per profile. Internal profiles would work too, I suppose; with a bit more work they could outright deny changes for some software if the breakage is bad enough to require it, but a recommendation is a lot less work and testing while serving the same idea, just reminding users to be mindful about setting it off globally. 🙂
asturur:

I still cannot explain how one frame can make that difference. With V-Sync off, all you gain is not waiting for the next refresh, possibly drawing halves (or more pieces) of different frames on the screen. Or is there some waiting with V-Sync that I do not understand?
It makes a difference, and the difference is obvious on less powerful GPUs. I was never able to get used to the delay in CS:GO when V-Sync is on.
JonasBeckman:

They could maybe use the global setting the way AMD does and have it apply only to OpenGL here, while keeping the override itself as an advanced, per-profile option for users who want the state disabled regardless of compatibility concerns. That way it can't be left off globally if some software is entirely incompatible with immediate presentation mode, yet the option remains for those who want it per profile. Internal profiles would work too, I suppose; with a bit more work they could outright deny changes for some software if the breakage is bad enough to require it, but a recommendation is a lot less work and testing while serving the same idea, just reminding users to be mindful about setting it off globally. 🙂
That'd work; there's plenty of settings that don't do shit globally but do from the profile.
For us G-sync users - is V-sync on in NVCPL and V-sync off in games still the best option?
Netherwind:

For us G-sync users - is V-sync on in NVCPL and V-sync off in games still the best option?
Yes. The issue is specifically about applications that expect their V-Sync-on state to be respected.
Astyanax:

It never has been; there are applications that rely on being able to specify their own tearing and swap-mode methods and that otherwise break. "Application default" has always been the recommended setting, with the V-Sync off state then controlled either from the application side or a driver profile. NVIDIA needs to remove the off state from the global setting and only expose it under profiles.
Do you know of some example games like that which "break" if you force the control panel's V-Sync rather than using their in-game options? The primary reason I typically used the control panel V-Sync rather than in-game (before I had G-Sync) is that you then know precisely what "type" of V-Sync you're using (traditional double buffered / linear triple buffering / Fast Sync / Adaptive V-Sync). It absolutely makes me irate when games don't even tell you what form of V-Sync they use when there's only one single option present. They also never communicate whether enabling the option did more than enable V-Sync, such as automatically introducing a framerate limit in-game (I've read some games do this, but I haven't personally found any that do, barring games which just limit their framerate internally to 60 for physics reasons).

Due to this I always forced my preferred method via the control panel, then set FPS limits manually with RTSS or with in-engine limiters, in the case of Overwatch for example, since its limiter has low input delay. In general I really wish game settings were more verbose: explaining exactly what type of V-Sync the in-game setting uses (and any other changes/optimizations it "may" make). More games should also do what Gears 5 does, where they explain how much impact various settings have on CPU/GPU (in Gears 5 they'll say "X setting has a low CPU impact and high GPU impact", for example).
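Since external limiters keep coming up: a deliberately simplified sketch of what a sleep-based frame cap does conceptually (this is not RTSS's actual technique; real limiters use much more precise timing and can also pace frame submission to reduce latency):

```python
import time

TARGET_FPS = 141                       # e.g. a cap a few fps below a 144 Hz refresh
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_capped(render_one_frame, frames=300):
    """Call render_one_frame() at most TARGET_FPS times per second."""
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_one_frame()
        next_deadline += FRAME_BUDGET
        # sleep away whatever is left of this frame's budget
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)

# toy workload standing in for a real game's frame
run_capped(lambda: sum(i * i for i in range(20_000)))
```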
BlindBison:

Do you know of some example games like that which "break" if you force the control panel's V-Sync rather than using their in-game options?
V-Sync off issues range from input not working properly to visual glitches and engine misbehaviour (though those are mostly the framerate exceeding engine tolerances, and have long been resolved with third-party frame limiters). The primary issue NVIDIA is addressing here is global V-Sync being off in applications that have accelerated interfaces but draw them separately rather than updating them as one whole surface. This is worse now that applications draw into their own layers without relying on DWM at all (at the discretion of the OS), so you can get flickering in application X while scrolling in application Y.

Now, off the top of my head, games that have been (or previously were) broken with V-Sync off:
- Physics goes bananas: Divinity Original Sin (since fixed)
- Crash: Homeworld (eventually fixed by frame limiters)
- At 1000 fps, mouse input breaks: WarFrame (since fixed)
- Crash

I'm sure that given enough games to test and the time to test them I could find more.
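The "physics goes bananas" failure mode usually comes from a game stepping its simulation once per rendered frame with per-frame constants instead of scaling by elapsed time. A toy sketch (not code from any of the games above) of why an uncapped framerate then changes the outcome:

```python
# Toy illustration (not code from any of the games mentioned): applying a
# per-frame damping factor instead of scaling by elapsed time makes the
# physics result depend on framerate.
def speed_after_one_second(fps, per_frame_damping=0.99, initial_speed=10.0):
    speed = initial_speed
    for _ in range(fps):                 # one step per rendered frame
        speed *= per_frame_damping       # bug: ignores how long the frame took
    return speed

for fps in (30, 60, 144, 1000):
    print(f"{fps:4d} fps -> speed after 1 s: {speed_after_one_second(fps):.4f}")
```

Capping the framerate simply puts the per-frame math back into the range the developers tuned for, which is why third-party limiters "fixed" several of these.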
Interesting... my 5700 XT / BenQ EW-3270 60 Hz setup shows no visible tearing in ~99%+ of my games (I've left the global setting in my Adrenalin drivers at V-Sync off), in both full screen and borderless windowed. No problems. I have maybe one game that looks and runs a bit better with V-Sync on, Grim Dawn and its expansions, and that game doesn't page-tear; it just stutters a bit unless I enable in-game V-Sync, for some reason. As 60 Hz monitors are far more common than higher-refresh monitors, turning V-Sync off is the only way to get above 60 fps. I have some older games and benchmarks that run at hundreds of frames per second with no visible tearing. I well remember what tearing looks like from years ago, but with the last two 4K monitors I've owned, V-Sync off has not been a page-tearing issue; I chalked it up to the anti-flicker circuitry in both monitors. I'm now running an advanced beta version of Win10 and that hasn't changed. This is also interesting because back in the days when I owned a TNT1 and a TNT2, V2 SLI, a V5 5500, and a V3, many years ago, the nVidia TNTs both had major problems with the V-Sync off condition while the 3dfx GPUs did not, so the difference was easy to see.
Thinking about it now, I don't think I've ever forced anything globally in the CP. If there is ever anything that needs a change, I just make/select a profile for it in the CP or use Nvidia Profile Inspector. Simple.
Camaxide:

I highly disagree. Everyone knows V-Sync introduces input lag, and many people do not want that and do not care about tearing. On a high-refresh monitor the tearing is very subtle, and the benefit of lower input lag is major. So the best option is to keep the default as it always has been, and then let the user decide what is best for their own use, because each user likely knows best what they want and need, not someone else.
Not quite sure what you're even trying to state. With Off removed you can still globally go On, Fast, or let the 3D application decide; you can just pick the last option and control it either from profiles or in games, so there is no reason to have an option that forces it off for everything. Besides, you also have options that counter the input lag introduced by V-Sync, and they work quite well with most properly coded engines.
Noisiv:

Yes, acknowledgment by the server is what matters, not just what is drawn on screen. So even if something is late on screen, it can still be registered earlier by the server. But you send your input command toward the server only as a reaction to what you're seeing on screen, and if the on-screen image is late due to V-Sync, the end result will be late regardless of the server registering it in a timely manner.
Competitive games do not use a TCP ACK for every action; the latency caused by that many acknowledgments would make them unplayable. Somehow I think you know that, though.
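For what it's worth, the usual pattern in fast-paced games is fire-and-forget input over UDP, with no per-packet acknowledgment; the server's next snapshot implicitly tells the client what arrived. A minimal sketch of the sending side only, using a placeholder address and a made-up packet layout, purely as a generic illustration (no real game's protocol):

```python
import socket, struct, time

# Generic illustration of fire-and-forget input packets over UDP; no real
# game's protocol, and the address below is a placeholder.
SERVER = ("127.0.0.1", 27015)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

sequence = 0
def send_input(buttons: int, mouse_dx: int, mouse_dy: int):
    """Send one input sample; no per-packet ACK is expected or waited for."""
    global sequence
    sequence += 1
    payload = struct.pack("!IHhh", sequence, buttons, mouse_dx, mouse_dy)
    sock.sendto(payload, SERVER)     # returns immediately; loss is tolerated

# e.g. sample and send input at a fixed client tick rate
for _ in range(3):
    send_input(buttons=0b0001, mouse_dx=5, mouse_dy=-2)
    time.sleep(1 / 64)               # 64 Hz tick, purely illustrative
```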