Nvidia announces G-Sync - eliminates stutter and screen tearing with a daughter module
If you have been reading our blog today, you will have noticed that NVIDIA made several technology announcements over the past two days. The first major one is G-Sync, a new technology built from new monitor hardware and supporting software. G-Sync is a proprietary Nvidia daughter board that will be included in monitors from ASUS, BenQ, Philips and ViewSonic. Read more after the break.
The G-Sync module drives the LCD display even with frame timings fluctuating all over the place. As soon as the framebuffer is complete, it updates the screen, eliminating tearing and stutter. After seeing some demos, I can't be anything other than impressed: very smooth, stutter-free and tearing-free framerates on the monitor.
I have no time right now to explain in depth what the technology does, but basically the video card syncs up with the module and drives the LCD 100% synchronized. Whenever your graphics card has a frame ready, that frame is synced with the monitor. This results in an incredibly smooth experience without stuttering or screen tearing. Seeing is believing, though. A set release date has not been announced, but we'll review a unit soon enough, and we expect availability as early as the Christmas season.
Nvidia:
Several years in the making, G-SYNC™ technology synchronises the monitor’s refresh rate to the GPU’s render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.
“Our commitment to create a pure gaming experience led us to G-SYNC,” said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. “This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you’ll never want to go back.”
Since their earliest days, displays have had fixed refresh rates – typically 60 times a second (hertz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. As the GPU seeks to synchronise with the monitor, persistent tearing occurs. Turning on V-SYNC (or Vertical-SYNC) eliminates tearing but causes increased lag and stutter as the GPU and monitor refresh at different rates.
System Requirements for NVIDIA G-SYNC DIY Kit Modification for ASUS VG248QE monitor
GPU:
- G-SYNC features require an NVIDIA GeForce GTX 650 Ti BOOST GPU or higher.
- GTX TITAN
- GTX 780
- GTX 770
- GTX 760
- GTX 690
- GTX 680
- GTX 670
- GTX 660 Ti
- GTX 660
- GTX 650 Ti Boost
Display:
- G-SYNC DIY modification kit requires an ASUS VG248QE monitor.
Driver:
- R331.58 or higher
Operating System:
- Windows 8.1
- Windows 8
- Windows 7
System Requirements for NVIDIA G-SYNC enabled monitors
GPU:
- G-SYNC features require an NVIDIA GeForce GTX 650 Ti BOOST GPU or higher.
- GTX TITAN
- GTX 780
- GTX 770
- GTX 760
- GTX 690
- GTX 680
- GTX 670
- GTX 660 Ti
- GTX 660
- GTX 650 Ti Boost
Driver:
- R331.58 or higher
Operating System:
- Windows 8.1
- Windows 8
- Windows 7
NVIDIA® G-SYNC™ is a groundbreaking new innovation that casts aside decades-old thinking to create the smoothest, most responsive computer displays ever seen. A monitor module you can install yourself, or buy pre-installed in gamer-focused monitors, NVIDIA G-SYNC waves goodbye to the days of screen tearing, input lag, and eyestrain-inducing stuttering caused by decades-old tech lazily carried over from analog TVs to modern-day monitors.
The Problem: Old Tech
When TVs were first developed they relied on CRTs, which work by scanning a beam of electrons across the surface of a phosphor-coated tube. This beam causes a pixel on the tube to glow, and when enough pixels are activated quickly enough the CRT can give the impression of full motion video. Believe it or not, these early TVs had 60Hz refresh rates primarily because the United States power grid is based on 60Hz AC power. Matching TV refresh rates to that of the power grid made early electronics easier to build, and reduced power interference on the screen.
By the time PCs came to market in the early 1980s, CRT TV technology was well established and was the easiest and most cost-effective technology to utilize for the creation of dedicated computer monitors. 60Hz and fixed refresh rates became standard, and system builders learned how to make the most of a less-than-perfect situation. Over the past three decades, even as display technology has evolved from CRTs to LCDs and LEDs, no major company has challenged this thinking, and so syncing GPUs to monitor refresh rates remains the standard practice across the industry to this day.
Problematically, graphics cards don’t render at fixed speeds. In fact, their frame rates will vary dramatically even within a single scene of a single game, based on the instantaneous load that the GPU sees. So with a fixed refresh rate, how do you get the GPU’s images to the screen? The first way is to simply ignore the refresh rate of the monitor altogether, and update the image being scanned to the display in mid cycle. This we call ‘VSync Off Mode’, and it is the default way most gamers play. The downside is that when a single refresh cycle shows two images, a very obvious “tear line” is evident at the break, commonly referred to as screen tearing. The established solution to screen tearing is to turn VSync on, forcing the GPU to delay screen updates until the monitor cycles to the start of a new refresh cycle. This causes stutter whenever the GPU frame rate is below the display refresh rate. It also increases latency, which introduces input lag, the visible delay between a button being pressed and the result appearing on-screen.
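To make the trade-off concrete, here is a minimal toy model of double-buffered VSync on a 60 Hz panel (my own sketch for illustration, not NVIDIA code): a finished frame must wait for the next fixed refresh boundary, so frame times hovering around the 16.7 ms budget produce alternating presentation gaps - the stutter described above.

```python
# Toy model of double-buffered VSync (illustrative sketch only): the GPU
# starts the next frame after the previous one is shown, and a finished
# frame waits for the next fixed refresh boundary before appearing.

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh cycle, ~16.7 ms

def presentation_times(frame_times_ms):
    """Times (ms) at which each frame reaches the screen with VSync on."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        done = t + ft                        # GPU finishes rendering here
        cycles = -(-done // REFRESH_MS)      # ceiling: next refresh boundary
        t = cycles * REFRESH_MS              # frame appears at this boundary
        shown.append(t)
    return shown

# Frame times hovering around the 16.7 ms budget:
shown = presentation_times([12.0, 20.0, 12.0, 20.0, 12.0, 20.0])
gaps = [round(b - a, 1) for a, b in zip([0.0] + shown, shown)]
print(gaps)  # [16.7, 33.3, 16.7, 33.3, 16.7, 33.3] -- alternating gaps = stutter
```

Even though the GPU averages well above 30 fps here, every slow frame misses a refresh and waits a full extra cycle, which is exactly why VSync-on stutter feels so uneven.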
Worse still, many players suffer eyestrain when exposed to persistent VSync stuttering, and others develop headaches and migraines, which drove us to develop Adaptive VSync, an effective, critically-acclaimed solution. Despite this development, VSync’s input lag issues persist to this day, something that’s unacceptable for many enthusiasts, and an absolute no-go for eSports pro-gamers who custom-pick their GPUs, monitors, keyboards, and mice to minimize the life-and-death delay between action and reaction.
The Solution: NVIDIA G-SYNC
Enter NVIDIA G-SYNC, which eliminates screen tearing, VSync input lag, and stutter. To achieve this revolutionary feat, we build a G-SYNC module into monitors, allowing G-SYNC to synchronize the monitor to the output of the GPU, instead of the GPU to the monitor, resulting in a tear-free, faster, smoother experience that redefines gaming.
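As a thought experiment, the same kind of toy model can be adapted to the variable-refresh idea behind G-SYNC (again my own sketch of the concept, not NVIDIA's implementation, and the 144 Hz minimum refresh interval is an assumed figure): if the panel refreshes the moment a frame is ready, presentation gaps simply track GPU render times.

```python
# Toy model of a variable-refresh display (illustrative sketch only):
# the panel updates as soon as a frame is ready, limited only by an
# assumed minimum refresh interval of a fast panel.

MIN_INTERVAL_MS = 1000.0 / 144.0   # assumed fastest refresh, ~6.9 ms

def vrr_presentation_times(frame_times_ms):
    """Times (ms) at which each frame reaches a variable-refresh screen."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        done = t + ft                         # GPU finishes rendering here
        t = max(done, t + MIN_INTERVAL_MS)    # show it as soon as allowed
        shown.append(t)
    return shown

shown = vrr_presentation_times([12.0, 20.0, 12.0, 20.0])
gaps = [round(b - a, 1) for a, b in zip([0.0] + shown, shown)]
print(gaps)  # [12.0, 20.0, 12.0, 20.0] -- gaps match render times, no stutter
```

With the same fluctuating frame times as before, no frame ever waits for a refresh boundary, so there is nothing to tear and no extra cycle of latency - the screen is simply driven by the GPU.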
Industry luminaries John Carmack, Tim Sweeney, Johan Andersson and Mark Rein have been bowled over by NVIDIA G-SYNC’s game-enhancing technology. Pro eSports players and pro-gaming leagues are lining up to use NVIDIA G-SYNC, which will expose a player’s true skill, demanding even greater reflexes thanks to the unnoticeable delay between on-screen actions and keyboard commands. And in-house, our diehard gamers have been dominating lunchtime LAN matches, surreptitiously using G-SYNC monitors to gain the upper hand.
Online, if you have an NVIDIA G-SYNC monitor you’ll have a clear advantage over others, assuming you also have a low ping.
How To Upgrade To G-SYNC
If you’re as excited by NVIDIA G-SYNC as we are, and want to get your own G-SYNC monitor, here’s how. Later this year, our first G-SYNC modules will be winging their way to professional modders and System Builders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available.
The specifications of an ASUS VG248QE, before and after an upgrade to NVIDIA G-SYNC.
Alternatively, if you’re a dab hand with a Phillips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents of your own doing. A complete installation instruction manual is available to view online, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our gurus believe installation should take approximately 30 minutes.
If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.
A Groundbreaking Revolution Has Arrived
In this time of technological marvels, there are few advances one can truly call “innovative”, or “revolutionary”. NVIDIA G-SYNC, however, is one of the few, revolutionizing outmoded monitor technology with a truly innovative, groundbreaking advancement that has never before been attempted.
G-SYNC’s elimination of input lag, tearing, and stutter delivers a stunning visual experience on any G-SYNC-enhanced monitor; one so stunning that you’ll never want to use a ‘normal’ monitor ever again. In addition to cutting-edge changes to the viewing experience, multiplayer gamers will receive a significant competitive advantage when G-SYNC is paired with a fast GeForce GTX GPU and low-lag input devices, something that’ll surely pique the interest of shooter aficionados. For eSports players, NVIDIA G-SYNC is an essential upgrade. With G-SYNC’s removal of input lag, successes and failures are squarely in the hands of players, differentiating the pros from the amateurs.
If, like eSports pros, you want the clearest, fastest, smoothest, most responsive gaming experience possible, NVIDIA G-SYNC monitors are a game-changer, the likes of which cannot be found anywhere else. A true innovation in an era of iteration, NVIDIA G-SYNC will redefine the way you view games.
Senior Member
Posts: 3461
Joined: 2011-05-10
The descriptions of it make it sound pretty amazing, fluid gameplay even with fps drops

Would like to see it in action.
Senior Member
Posts: 4469
Joined: 2008-03-03
Aha, I see a GTX 780 Ti in that pic....
Interesting......
Senior Member
Posts: 5866
Joined: 2008-01-06
Same thought came to me. "If it does what it says it does", I'm buying it.
Senior Member
Posts: 6589
Joined: 2004-09-30
I have such an awesome monitor, the S23A700D, but if this does what it says, I might sell it and buy something like this.
Posts: 31492
Joined: 2005-01-08
Interesting and if it actually works then it could be a big selling point to all of us who are sensitive to things like that.