NVIDIA: G-SYNC Validation Runs Into 94% Failure Rate
Michal Turlik 21
https://www.geforce.com/hardware/technology/adaptive-vsync/technology
Just as Denial said, it is not exactly the same. If I am not mistaken, VESA Adaptive-Sync can be described as a method for synchronizing vertical refresh dynamically, while also allowing further refresh-rate synchronization features to be built on top of it. The VESA standard can be seen as a set of primitives for a vendor to use in implementing its own, more advanced refresh-rate synchronization. That is where AMD's VRR (variable refresh rate) implementation comes into play, which is part of AMD's proprietary FreeSync standard.
Technically, a monitor that is VESA Adaptive-Sync capable is not automatically AMD FreeSync capable, due to the lack of the proprietary "extension".
Take what I have written with a grain of salt; I am not 100% sure it is correct.
The good news, however, is that NVIDIA does support the VESA standard, though I have never tested it myself 🙂
gerardfraser
Nvidia is fooling people and doing a good job of it: top-class propaganda. Soon there will be no FreeSync, only G-Sync, LOL. Anyway, I am glad Nvidia finally accepted adaptive sync; good job for doing so. Trying to discredit FreeSync monitors when they can actually work well with Nvidia cards is typical Nvidia, but it doesn't matter to me, because I made an informed decision based on experience with both FreeSync and G-Sync monitors.
For me, a FreeSync monitor on an Nvidia card is a better experience than a G-Sync monitor on an Nvidia card. As simple as that.
schmidtbag
That is a shockingly high number of failures, but I think it's good that Nvidia is so picky about certification. At this rate, the only reason to pay the high premium for G-Sync is knowing you're getting the best possible experience.
However, I think it would be worth it for Nvidia to offer three separate certification tiers. So 94% of displays might fail for a gold rating, but a 600-nit display should still qualify for a bronze.
ETAxDOA
Who cares... I forked over $$$ for the Nvidia G-Sync/RTX experience, and nearly three months after buying the card I finally got a replacement for a faulty one, only to find the Metro Exodus game code was no longer valid. Nvidia gave less than zero fcuks that they'd enticed me in with faulty cards and offers they won't honour... I can accept the practicality of the codes being expired/invalid, but the total disregard for the consumer experience has pushed me across the red/green line towards a big FU, Nvidia.
Michal Turlik 21
Luc
The G-Sync module does only one thing, and it costs consumers a premium equal to a small computer: APU €50, motherboard €50, 2x4 GB DDR4 RAM €50, 250 GB NVMe M.2 disk €50...
People should have learned something about Nvidia's marketing machine after the GTX 970 fiasco, or the GPP nonsense...
RealNC
gerardfraser
Here is a list of over 700 FreeSync monitors that will work with Nvidia's G-Sync Compatible drivers.
https://www.amd.com/en/products/freesync-monitors
ruthan
It would be nice if Nvidia could release some testing utility. I know that not everything can be tested in software alone, but it would be a good start. Otherwise, I'm glad that someone is really pushing display quality.
sinnedone
What everyone is forgetting is that Intel is supporting adaptive sync as well.
Nvidia is simply trying to stain the technology to paint their ecosystem in a better light.
Nice try, but I'll definitely be voting with my wallet.
To those saying adaptive sync is just a ploy to get you to buy a new monitor: you really need to experience it. I would rank it up there with a GPU upgrade.
I'm going to be honest: if you have a 144 Hz+ monitor and tune your settings so that your lowest frame rates stay in the triple digits, it won't be as noticeable.
BUT if the games you play dip toward the bottom of the FreeSync range, you will get very smooth gameplay. Like vsync on, but without the input lag.
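To make the "smooth within the FreeSync range" point concrete, here is a minimal sketch (a simplified double-buffered model with illustrative timings of my own choosing, not anything from the article) comparing when frames reach the screen with fixed-refresh vsync versus a variable refresh rate:

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between scanouts

def vsync_present_times(frame_ms, n):
    """Fixed-refresh vsync (simplified double-buffered model):
    each finished frame waits for the next vblank, so presentation
    snaps to multiples of the refresh interval."""
    times, t = [], 0.0
    for _ in range(n):
        t += frame_ms                             # GPU finishes the frame
        t = math.ceil(t / VBLANK_MS) * VBLANK_MS  # wait for the next vblank
        times.append(t)
    return times

def vrr_present_times(frame_ms, n):
    """VRR/adaptive sync: the display waits for the frame instead,
    so each frame is shown the moment it is finished."""
    return [frame_ms * (i + 1) for i in range(n)]

# A 22 ms frame (~45 fps) on a 60 Hz panel:
vs = vsync_present_times(22.0, 5)
print([round(b - a, 2) for a, b in zip(vs, vs[1:])])  # steady 33.33 ms steps: quantized down to 30 fps
print(vrr_present_times(22.0, 5))                     # steady 22 ms steps: smooth 45 fps
```

Real pipelines (triple buffering, tearing, LFC) behave differently in detail, but the core point survives this toy model: below the refresh rate, fixed vsync quantizes frame pacing to the panel, while VRR lets the panel track the game.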
Denial
schmidtbag
Denial
Reddoguk
I think 144 Hz/144 fps is still very difficult to reach with today's graphics cards in most recent AAA games.
That's why I went for a 75 Hz monitor: I can do 75 Hz/75 fps in most games, and somehow it's so much better than 60 Hz, but reaching double that would need a beast of a system.
For those of you with, say, a 2070/2080: can you reach 144 fps in Rage 2?
Dribble
schmidtbag
Denial
MonstroMart
Still not working on my MSI Optix MAG27CQ
schmidtbag
jwb1