NVIDIA G-Sync - Is synchronizing monitor and graphics card a game changer?
On Friday NVIDIA announced G-Sync, and considering the few details available out there I wanted to write a quick follow-up on this new technology, as it really is a big announcement - a really big thing, actually. In recent years we have all been driven by the knowledge that on a 60 Hz monitor you want 60 FPS rendered, and for good reason: you want the two as close to each other as possible, as that offers you not only the best gaming experience but also the best visual experience. This is why framerate limiters are so popular; you sync each rendered frame in line with your monitor refresh rate. Obviously, 9 out of 10 times that is not happening. This results in two anomalies that everybody knows and experiences: stutter and tearing.
So what is happening?
Very simply put, the graphics card is always firing off frames as fast as it possibly can, and that FPS is dynamic: it can bounce from say 30 to 80 FPS in a matter of split seconds. On the viewing side of things you have the monitor, and it is a fixed device as it refreshes at, for example, 60 Hz. Fixed and dynamic are two different things and collide with each other. So on one end we have the graphics card rendering at a varying framerate while the monitor refreshes at 60 images per second. That causes a problem, as with an FPS slower or faster than 60 you'll get multiple images displayed on the screen per refresh of the monitor. Graphics cards simply don't render at fixed speeds; in fact, their frame rates will vary dramatically even within a single scene of a single game, based on the instantaneous load that the GPU sees.
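To make that collision concrete, here is a tiny toy model (my own sketch, nothing from NVIDIA - the per-frame render times are hypothetical) that counts how many freshly rendered frames land inside each fixed 16.7 ms refresh window of a 60 Hz monitor:

```python
REFRESH = 1.0 / 60          # fixed monitor refresh interval in seconds (~16.7 ms)

# hypothetical per-frame render times: the GPU bounces between ~30 and ~80 FPS
frame_times = [1/80, 1/75, 1/35, 1/30, 1/80, 1/75, 1/40]

# moment each frame finishes rendering
t, completions = 0.0, []
for ft in frame_times:
    t += ft
    completions.append(t)

# count completed frames per refresh window
windows = int(completions[-1] / REFRESH) + 1
per_refresh = [0] * windows
for c in completions:
    per_refresh[int(c / REFRESH)] += 1

print(per_refresh)  # → [1, 1, 0, 1, 0, 1, 2, 0, 1]
```

Note window 7 in the output: two frames finished inside a single refresh (a tear with VSync off), while several windows received no new frame at all (a repeated frame, perceived as stutter).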
In the past we solved problems like VSync stutter and tearing in basically two ways. The first way is to simply ignore the refresh rate of the monitor altogether and update the image being scanned to the display mid-cycle. This you guys all know as ‘VSync Off Mode’, and it is the default way most gamers play.
The downside is that when a single refresh cycle shows two images, a very obvious “tear line” is evident at the break - yup, we all refer to this as screen tearing. You can solve tearing, though.
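Where that tear line lands follows directly from the scanout: the monitor draws from top to bottom over one refresh interval, so a buffer swap part-way through the scan splits the image at the matching row. A little sketch of my own (panel height and timings are just example values):

```python
REFRESH_MS = 1000 / 60   # one top-to-bottom scanout takes ~16.7 ms
HEIGHT = 1080            # panel rows (1080p example)

def tear_row(flip_time_ms):
    """Row at which a tear line appears if the buffer flips mid-scanout."""
    phase = (flip_time_ms % REFRESH_MS) / REFRESH_MS   # how far down the scan is
    return int(phase * HEIGHT)

print(tear_row(8.0))   # flip roughly half-way through the scan → row 518
```

Everything above that row comes from the new frame, everything below from the old one - hence the visible break.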
The solution to bypass tearing is to turn VSync on; this forces the GPU to delay screen updates until the monitor cycles to the start of a new refresh cycle. That delay causes stutter whenever the GPU frame rate drops below the display refresh rate. It also increases latency, which directly results in input lag: the visible delay between a button being pressed and the result appearing on-screen.
Enabling VSync helps a lot, but with the video card firing off all these images per refresh you can typically see some pulsing (I don't wanna call it VSync stuttering) when the framerate varies and you pan from left to right in a 3D scene. So that is not perfect either.
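That pulsing falls out of the math: with VSync on, a finished frame must wait for the next refresh boundary before it can be shown. A quick toy model (again my own sketch with made-up render times, not NVIDIA code) shows how a GPU that varies between ~50 and ~66 FPS gets snapped to an uneven cadence on a 60 Hz panel:

```python
import math

REFRESH = 1000 / 60   # refresh interval in milliseconds (~16.7 ms)

def vsync_display_times(render_ms):
    """With VSync on, a finished frame waits for the next refresh boundary."""
    shown, t = [], 0.0
    for ms in render_ms:
        t += ms                                   # frame finishes rendering
        boundary = math.ceil(t / REFRESH) * REFRESH
        shown.append(round(boundary, 1))          # frame actually appears here
        t = boundary                              # GPU stalls until the flip
    return shown

# hypothetical GPU alternating between 20 ms (50 FPS) and 15 ms (66 FPS) frames
print(vsync_display_times([20, 15, 20, 15]))  # → [33.3, 50.0, 83.3, 100.0]
```

The gaps between displayed frames come out as 16.7, 33.3 and 16.7 ms: the same content sometimes stays up for one refresh and sometimes for two, which your eye reads as stutter even though the GPU never dropped below 50 FPS.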
Alternatively, most people disable VSync - but that runs into a problem as well: multiple images per refresh will result in the phenomenon that is screen tearing, which we all hate.
Basically this is why we all want extremely fast graphics cards, as most of you want to enable VSync and have a graphics card that runs faster than 60 FPS.
What is the solution?
NVIDIA's answer is G-Sync. As I explained, the graphics card runs at a dynamic rate while the monitor runs at a static Hz; the two don't really match. G-Sync is both a software and a hardware solution that will solve screen tearing and stuttering. A daughter board (it actually looks a little like a mobile MXM module) is placed into a G-Sync enabled monitor, and it does something very interesting: with G-Sync the monitor becomes a slave to your graphics card, as its refresh rate in Hz becomes dynamic. Yes, it is no longer static. Each time your graphics card has rendered a frame, that frame is aligned with the monitor's refresh. With the graphics card and monitor dynamically in sync with each other, you have eliminated stutter and screen tearing completely.
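In the same toy-model spirit as above (my own sketch, hypothetical numbers): with a dynamic refresh, the panel refreshes the moment a frame arrives, clamped only by how fast the panel can physically refresh again. The uneven 50-66 FPS workload that a fixed 60 Hz refresh would snap to an irregular cadence now appears exactly when it is rendered:

```python
REFRESH_MIN_MS = 1000 / 144   # hypothetical panel ceiling (144 Hz)

def gsync_display_times(render_ms):
    """Dynamic refresh: the panel refreshes when the frame arrives, waiting
    only if frames come faster than the panel can physically refresh."""
    shown, t, last = [], 0.0, -1e9
    for ms in render_ms:
        t += ms                                   # frame finishes rendering
        flip = max(t, last + REFRESH_MIN_MS)      # clamp to the panel's ceiling
        shown.append(round(flip, 1))
        last = flip
    return shown

# GPU alternating between 20 ms (50 FPS) and 15 ms (66 FPS) frames
print(gsync_display_times([20, 15, 20, 15]))  # → [20.0, 35.0, 55.0, 70.0]
```

Display intervals now equal render intervals, so nothing is torn and nothing is held an extra refresh - which is the whole point of making the monitor follow the GPU.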
It gets even better: without stutter and screen tearing, on a nice IPS LCD panel even at 30+ Hz you'd be having an incredibly good gaming experience (visually). By the way, monitors up to 177 Hz will be supported with G-Sync, as well as 4K monitors.
Summed up: NVIDIA G-SYNC is a solution that pretty much eliminates screen tearing, VSync input lag, and stutter. It requires a G-SYNC module inside the monitor, allowing G-SYNC to synchronize the monitor to the output of the GPU, instead of the GPU to the monitor, resulting in a tear-free, faster, smoother experience.
The Nvidia G-Sync module
So Hilbert, are there any foreseeable problems?
Not a lot, really. But sure, low FPS could get nasty, as say 10 FPS on an LCD panel would look weird. Now 10 FPS doesn't mean that your panel will flicker at 10 Hz, as LCDs do not flicker - unlike CRTs, which have a physical refresh. Even if your video card delivers 3 frames per second it will be a slideshow, but it should be a pretty nice one: when a new frame arrives, it will be drawn in 5 ms (or 2 ms, or 1 ms), according to the monitor's specs. But sure, in an optimal situation you will need a graphics card that can stay above 30 FPS as a minimum. Secondly, dynamically altering the refresh rate of your monitor has to put some load on the monitor hardware; it MIGHT have an effect on your monitor's lifespan. Last but not least, it is NVIDIA proprietary technology and thus works with selected NVIDIA GeForce graphics cards only.
When is it available ?
You can see the first monitors and upgrade kits later this year; realistically, we expect good availability in Q1 2014. One current ASUS model (the VG248QE) can actually be upgraded, where you insert the G-Sync hardware yourself. G-Sync is going to be included in monitors from ASUS, BenQ, Philips and ViewSonic.
What will prices be like ?
That is not yet disclosed, but we think you can expect a 75 EUR/USD price premium per monitor for this solution. But after such an upgrade, even a GeForce GTX 760 running at 30+ Hz/FPS would result in a very nice visual gaming experience. We learned that ASUS will release the VG248QE (used in the demo) in a G-Sync enhanced version for 399 US dollars.
Will this be an NVIDIA-only feature?
For now, yes. Currently these graphics cards will be G-Sync compatible: GTX TITAN, GTX 780, GTX 770, GTX 760, GTX 690, GTX 680, GTX 670, GTX 660 Ti, GTX 660, GTX 650 Ti Boost. You need to be running Windows 7 or higher as your operating system.
YouTube is not a good way to demonstrate this at 30 FPS, but please try to get an overview of the tech in this video recording we made.
In the end we feel NVIDIA G-Sync has the potential to be a game changer in the PC gaming industry, as even with a more mainstream graphics card you'll be enhancing your graphics experience greatly. Think of it: no more VSync stutter or screen tearing. That means silky smooth, input-lag-free gaming at say 40 FPS. As such, G-Sync has huge potential for you guys, the gamers, and for the hardware industry.
On the following page we have some photos we took at the technology day in relation to G-Sync.