NVIDIA G-Sync explained (article)

Good article. I hope this becomes available for everything some year; otherwise it won't get mainstream attention, just like the 3D Vision stuff.
Nice article, Hilbert 🙂
Nice article, but I think this is not really true:
Not a lot really but sure, Low FPS could be a drag as say 20 FPS and thus Hz on a LCD panel will look like crap, you'd literally see the screen refresh. Meaning low FPS moments could potentially be horrible with refreshes that you could see live on your screen. So in an optimal situation you will need a graphics card that can stay above 30 FPS as minimum.
20 fps doesn't mean that your panel will flicker at 20 Hz. LCDs do not flicker 🙂. Their backlight does, but not like CRTs, which have a physical refresh rate. And the backlight is not related to screen updates at all. Even if your video card delivers 3 frames per second, it will be a slideshow, but a perfect one. When a new frame arrives, it will be drawn in 5 ms (or 2 ms, or 1 ms), according to the monitor's specs.
Nice article, but I think this is not really true: 20 fps doesn't mean that your panel will flicker at 20 Hz. LCDs do not flicker 🙂. Their backlight does, but not like CRTs, which have a physical refresh rate. And the backlight is not related to screen updates at all. Even if your video card delivers 3 frames per second, it will be a slideshow, but a perfect one. When a new frame arrives, it will be drawn in 5 ms (or 2 ms, or 1 ms), according to the monitor's specs.
Agreed, and the whole sentence doesn't make that much sense either. Is it missing a few commas or what? 😀
This will be great when it's on IPS panels; otherwise it can go f*ck off like the rest of the proprietary Nvidia crap, no offence.
Nice article, but I think this is not really true: 20 fps doesn't mean that your panel will flicker at 20 Hz. LCDs do not flicker 🙂. Their backlight does, but not like CRTs, which have a physical refresh rate. And the backlight is not related to screen updates at all. Even if your video card delivers 3 frames per second, it will be a slideshow, but a perfect one. When a new frame arrives, it will be drawn in 5 ms (or 2 ms, or 1 ms), according to the monitor's specs.
They also stated that the minimum variable refresh rate is 30 Hz. Anything below that, they will duplicate frames. And yep, obviously it will not flicker.
This will be great when it's on IPS panels; otherwise it can go f*ck off like the rest of the proprietary Nvidia crap, no offence.
You'll have your Mantle teat to suck on soon.
Moderator
They also stated that the minimum variable refresh rate is 30 Hz. Anything below that, they will duplicate frames. And yep, obviously it will not flicker.
Where was this shown?
This will be great when it's on IPS panels; otherwise it can go f*ck off like the rest of the proprietary Nvidia crap, no offence.
Damn...you just broke my bullsh!t-o-meter.
They are saying this can be retrofitted to certain monitors. They give a diagram and everything to hard-wire it to monitors. Mmmmm, I could probably do it no bother, but do I wish to force open my monitor to solder this module to it? My last monitor's OSD button broke/stuck, and the monitor was a total nightmare to get into. Front bezels are usually locked in with really weak, small pieces of plastic. I broke a few of them on the last monitor.
Where was this shown?
Anandtech.
There's a lower bound of 30Hz as well, since anything below that and you'll begin to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame.
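For anyone curious what that frame duplication amounts to, here's a minimal sketch of the idea. This is purely illustrative, not NVIDIA's actual firmware logic; the 30 Hz floor comes from the Anandtech quote above, while the 144 Hz ceiling and the function name are assumptions for the example:

```python
import math

MIN_REFRESH_HZ = 30   # lower bound per the Anandtech quote
MAX_REFRESH_HZ = 144  # assumed panel maximum, e.g. a 144 Hz panel

def presentation_plan(fps):
    """Return (effective_refresh_hz, repeats_per_frame) for a render rate.

    At or above the floor, the panel refreshes once per rendered frame.
    Below it, each rendered frame is scanned out multiple times so the
    panel never refreshes slower than MIN_REFRESH_HZ (no visible flicker).
    """
    if fps >= MIN_REFRESH_HZ:
        return min(fps, MAX_REFRESH_HZ), 1
    # e.g. 20 fps -> each frame shown twice, panel runs at 40 Hz
    repeats = math.ceil(MIN_REFRESH_HZ / fps)
    return fps * repeats, repeats
```

So at 20 fps the panel would refresh at 40 Hz, showing every frame twice, rather than dropping to a 20 Hz scan-out.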
I said this in one other thread, but monitors should already just work this way. We shouldn't need to buy extra stuff.
I said this in one other thread, but monitors should already just work this way. We shouldn't need to buy extra stuff.
Well, they don't. 😉
This looks so awesome. It should improve the gaming experience by a lot for everyone whose rig can't keep a v-synced monitor fed at 60 fps with no drops. And even then, the input lag could be nasty. It should also make gaming at 4K much more comfortable. Now, when will we be able to get a 4K monitor with fallback to Full HD @ 120 Hz, and G-Sync, at <$1.5k?
It does look awesome, but I just got a VG248QE a while back. Now do I need to turn around and sell it if I want yet another proprietary Nvidia feature, or the feature in general? I take it you won't be able to add this piece of hardware to the panel; you'll have to buy a new one. Which is odd, since IIRC I saw them showcasing it with the VG248QE.
You'll have your Mantle teat to suck on soon.
What makes you think I care about Mantle? I just stated a fact and what needs to be there so I can buy it and make use of a great thing. And I hope this is a little more open than what I'm used to seeing Nvidia do in the past with their technology. At the end of the day, this has nothing to do with Mantle. Sometimes fantards really do make me facepalm.
This will be great when it's on IPS panels; otherwise it can go f*ck off like the rest of the proprietary Nvidia crap, no offence.
Spoken like a typical AMD owner. Much like with PhysX: AMD owners wanted it, then found out it's proprietary, and immediately they all seemed to have received talking points saying it does nothing, is a gimmick, and nV is greedy and should give it away for free. lol.
Spoken like a typical AMD owner. Much like with PhysX: AMD owners wanted it, then found out it's proprietary, and immediately they all seemed to have received talking points saying it does nothing, is a gimmick, and nV is greedy and should give it away for free. lol.
I recall Nvidia being in the news: they did offer PhysX to ATI/AMD in a licensing deal, but they point-blank refused. Who knows, if they had agreed it would probably have been available in the PS4/Xbone and implemented in lots of games by now. Them refusing all those years ago basically killed it from being widely adopted. What goes around comes around. Do I really think Nvidia will license Mantle? Nope, I can see a similar fate for it. I see the outcome for G-Sync being totally different, simply because it's something everyone wants... it will be a must-have feature.
As far as I remember, that talk about AMD being "offered" PhysX has been brought up a few times; basically there were some terms that AMD didn't like, so they declined, or so it's rumored at least. I don't think we'll ever get an official explanation. 🙂 Similarly, I can fully see AMD wanting some terms for letting Nvidia use Mantle; they aren't going to just give it away to the competition. EDIT: That said, it would of course be awesome for us end-users if AMD and Nvidia could somehow get along and implement CUDA, PhysX, Mantle, G-Sync and whatever else (3D viewing?) together, but I doubt that will happen anytime soon. (I guess it's similar between AMD CPUs and Intel with their x86/x64 stuff and extensions, but I don't have much insight into these things, so what would I know.)
I hope they will be able to release something for all monitors. Perhaps an active module that goes between the monitor and the cable. Having to buy a new monitor ain't ideal.