
Nvidia announced G-Sync - eliminates stutter and screen tearing with a daughter module

by Hilbert Hagedoorn on: 10/19/2013 01:38 PM | source: | 142 comment(s)

If you have been reading our news today, you will have noticed that NVIDIA made several technology announcements over the past two days. The first major one is G-SYNC, a new technology that combines a monitor hardware module with software. G-SYNC is a proprietary NVIDIA daughter board that will be built into monitors from ASUS, BenQ, Philips and ViewSonic. Read more after the break.


The G-Sync module drives LCD displays even when frame timings fluctuate all over the place. As soon as the framebuffer is complete, it updates the screen, eliminating tearing and stutter. After seeing some demos I can't be anything other than impressed: very smooth, stutter- and tearing-free framerates on the monitor.

As you can tell, I have no time to explain the technology in depth, but basically the video card syncs up with the module and drives the LCD 100% synchronized. Whenever your graphics card has a frame ready, it is synced with the monitor. This results in an incredibly smooth experience on your monitor, without stuttering or screen tearing. Seeing is believing, though. A set release date has not been announced, but we'll review a unit soon enough, and we expect availability as early as the Christmas season.

Nvidia:

Several years in the making, G-SYNC™ technology synchronises the monitor’s refresh rate to the GPU’s render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.

“Our commitment to create a pure gaming experience led us to G-SYNC,” said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. “This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you’ll never want to go back.”

Since their earliest days, displays have had fixed refresh rates – typically 60 times a second (hertz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. As the GPU seeks to synchronise with the monitor, persistent tearing occurs. Turning on V-SYNC (or Vertical-SYNC) eliminates tearing but causes increased lag and stutter as the GPU and monitor refresh at different rates.
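The stutter described above falls out of simple frame-time arithmetic. Here is a minimal sketch (an illustration, not NVIDIA code), assuming a 60 Hz panel and a hypothetical GPU rendering at a steady 50 fps: with V-SYNC on, every finished frame waits for the next refresh boundary, so one frame in every five is held on screen for two refresh intervals instead of one.

```python
import math

REFRESH_HZ = 60          # fixed panel refresh rate
RENDER_FPS = 50          # hypothetical steady GPU render rate

refresh_interval = 1000 / REFRESH_HZ   # ~16.67 ms per scan-out
render_interval = 1000 / RENDER_FPS    # 20 ms per rendered frame

def vsync_display_times(n_frames):
    """Timestamps (ms) at which frames hit the screen with V-SYNC on:
    each frame is held back until the next refresh boundary."""
    times = []
    for i in range(n_frames):
        ready = i * render_interval                     # rendering finishes here
        boundary = math.ceil(ready / refresh_interval)  # next refresh tick
        times.append(boundary * refresh_interval)
    return times

times = vsync_display_times(6)
deltas = [round(b - a, 2) for a, b in zip(times, times[1:])]
print(deltas)   # one 33 ms hitch among 16.67 ms steps -> visible stutter
```

Even though the GPU delivers frames at a perfectly even 20 ms cadence, the on-screen frame times are uneven, which is exactly the stutter a fixed refresh rate imposes.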

System Requirements for NVIDIA G-SYNC DIY Kit Modification for ASUS VG248QE monitor

GPU:

    • G-SYNC features require an NVIDIA GeForce GTX650Ti BOOST GPU or higher
  • GTX TITAN
  • GTX 780
  • GTX 770
  • GTX 760
  • GTX 690
  • GTX 680
  • GTX 670
  • GTX 660 Ti
  • GTX 660
  • GTX 650 Ti Boost

Display:

  • G-SYNC DIY modification kit requires an ASUS VG248QE monitor.

Driver:

  • R331.58 or higher

Operating System:

  • Windows 8.1
  • Windows 8
  • Windows 7

 

System Requirements for NVIDIA G-SYNC enabled monitors

GPU:

    • G-SYNC features require an NVIDIA GeForce GTX650Ti BOOST GPU or higher.
  • GTX TITAN
  • GTX 780
  • GTX 770
  • GTX 760
  • GTX 690
  • GTX 680
  • GTX 670
  • GTX 660 Ti
  • GTX 660
  • GTX 650 Ti Boost

Driver:

  • R331.58 or higher

Operating System:

  • Windows 8.1
  • Windows 8
  • Windows 7

NVIDIA® G-SYNC™ is a groundbreaking new innovation that casts aside decades-old thinking to create the smoothest, most responsive computer displays ever seen. A monitor module you can install yourself, or buy pre-installed in gamer-focused monitors, NVIDIA G-SYNC waves goodbye to the days of screen tearing, input lag, and eyestrain-inducing stuttering caused by decades-old tech lazily carried over from analog TVs to modern-day monitors.

The Problem: Old Tech

When TVs were first developed they relied on CRTs, which work by scanning a beam of electrons across the surface of a phosphor-coated tube. This beam causes a pixel on the tube to glow, and when enough pixels are activated quickly enough the CRT can give the impression of full-motion video. Believe it or not, these early TVs had 60Hz refresh rates primarily because the United States power grid is based on 60Hz AC power. Matching TV refresh rates to that of the power grid made early electronics easier to build, and reduced power interference on the screen.

By the time PCs came to market in the early 1980s, CRT TV technology was well established and was the easiest and most cost-effective technology to utilize for the creation of dedicated computer monitors. 60Hz and fixed refresh rates became standard, and system builders learned how to make the most of a less-than-perfect situation. Over the past three decades, even as display technology has evolved from CRTs to LCDs and LEDs, no major company has challenged this thinking, and so syncing GPUs to monitor refresh rates remains the standard practice across the industry to this day.

Problematically, graphics cards don’t render at fixed speeds. In fact, their frame rates will vary dramatically even within a single scene of a single game, based on the instantaneous load that the GPU sees. So with a fixed refresh rate, how do you get the GPU images to the screen? The first way is to simply ignore the refresh rate of the monitor altogether, and update the image being scanned to the display in mid cycle. This we call ‘VSync Off Mode’, and it is the default way most gamers play. The downside is that when a single refresh cycle shows two images, a very obvious “tear line” is evident at the break, commonly referred to as screen tearing. The established solution to screen tearing is to turn VSync on, forcing the GPU to delay screen updates until the monitor cycles to the start of a new refresh cycle. This causes stutter whenever the GPU frame rate is below the display refresh rate. And it also increases latency, which introduces input lag, the visible delay between a button being pressed and the result occurring on-screen.
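The extra input lag V-SYNC introduces can be estimated with a rough simulation. Under assumed conditions (a 60 Hz panel and randomly fluctuating frame times; this is not NVIDIA's code), each finished frame sits buffered until the next scan-out begins, adding on average about half a refresh interval of latency:

```python
import math
import random

REFRESH_MS = 1000 / 60   # 60 Hz panel, ~16.67 ms per refresh

def vsync_waits(frame_times_ms):
    """Extra time (ms) each finished frame spends buffered, waiting for
    the next refresh boundary before it can be scanned out (V-SYNC on)."""
    waits, t = [], 0.0
    for ft in frame_times_ms:
        t += ft                                          # frame done rendering at t
        next_scan = math.ceil(t / REFRESH_MS) * REFRESH_MS
        waits.append(next_scan - t)                      # buffered until scan-out
    return waits

random.seed(1)
frames = [random.uniform(14.0, 30.0) for _ in range(1000)]  # fluctuating GPU load
waits = vsync_waits(frames)
print(f"average added latency: {sum(waits) / len(waits):.1f} ms")
```

That wait comes on top of the render time and the game's own input pipeline, which is why competitive players so often run with VSync off and accept the tearing.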

Worse still, many players suffer eyestrain when exposed to persistent VSync stuttering, and others develop headaches and migraines, which drove us to develop Adaptive VSync, an effective, critically-acclaimed solution. Despite this development, VSync’s input lag issues persist to this day, something that’s unacceptable for many enthusiasts, and an absolute no-go for eSports pro-gamers who custom-pick their GPUs, monitors, keyboards, and mice to minimize the life-and-death delay between action and reaction.

The Solution: NVIDIA G-SYNC

                                        

Enter NVIDIA G-SYNC, which eliminates screen tearing, VSync input lag, and stutter. To achieve this revolutionary feat, we build a G-SYNC module into monitors, allowing G-SYNC to synchronize the monitor to the output of the GPU, instead of the GPU to the monitor, resulting in a tear-free, faster, smoother experience that redefines gaming.
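The core idea, refreshing when the frame is ready rather than on a fixed clock, can be sketched in a few lines. This is an illustration under assumed limits (a hypothetical 144 Hz panel cap), not NVIDIA's actual implementation:

```python
def gsync_display_times(ready_ms, max_hz=144):
    """Variable refresh: the panel begins a scan-out the moment a frame is
    ready, limited only by the panel's maximum refresh rate
    (a hypothetical 144 Hz cap for this sketch)."""
    min_interval = 1000 / max_hz
    times, last = [], float("-inf")
    for ready in ready_ms:
        shown = max(ready, last + min_interval)  # can't scan faster than panel max
        times.append(shown)
        last = shown
    return times

# Frames finishing at irregular times are shown at exactly those times,
# so there is nothing to tear and no boundary to wait for:
print(gsync_display_times([0.0, 20.0, 38.0, 61.0]))  # [0.0, 20.0, 38.0, 61.0]
```

Compared with the fixed-refresh case, the display times track the GPU's output directly, which is why both the tear line (VSync off) and the boundary wait (VSync on) disappear.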

Industry luminaries John Carmack, Tim Sweeney, Johan Andersson and Mark Rein have been bowled over by NVIDIA G-SYNC’s game-enhancing technology. Pro eSports players and pro-gaming leagues are lining up to use NVIDIA G-SYNC, which will expose a player’s true skill, demanding even greater reflexes thanks to the unnoticeable delay between on-screen actions and keyboard commands. And in-house, our diehard gamers have been dominating lunchtime LAN matches, surreptitiously using G-SYNC monitors to gain the upper hand.

Online, if you have a NVIDIA G-SYNC monitor you’ll have a clear advantage over others, assuming you also have a low ping.

How To Upgrade To G-SYNC

If you’re as excited by NVIDIA G-SYNC as we are, and want to get your own G-SYNC monitor, here’s how. Later this year, our first G-SYNC modules will be winging their way to professional modders and System Builders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available.

Product Details - NVIDIA G-SYNC DIY Kit Modification for ASUS VG248QE Monitor

The specifications of an ASUS VG248QE, before and after an upgrade to NVIDIA G-SYNC.

Alternatively, if you’re a dab hand with a Phillips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents that are a result of your own doing. A complete installation instruction manual is available to view online, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our gurus believe installation should take approximately 30 minutes.

If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.

A Groundbreaking Revolution Has Arrived

In this time of technological marvels, there are few advances one can truly call “innovative”, or “revolutionary”. NVIDIA G-SYNC, however, is one of the few, revolutionizing outmoded monitor technology with a truly innovative, groundbreaking advancement that has never before been attempted.

G-SYNC’s elimination of input lag, tearing, and stutter delivers a stunning visual experience on any G-SYNC-enhanced monitor; one so stunning that you’ll never want to use a ‘normal’ monitor ever again. In addition to cutting-edge changes to the viewing experience, multiplayer gamers will receive a significant competitive advantage when G-SYNC is paired with a fast GeForce GTX GPU and low-lag input devices, something that’ll surely pique the interest of shooter aficionados. For eSports players, NVIDIA G-SYNC is an essential upgrade. With G-SYNC’s removal of input lag, successes and failures are squarely in the hands of players, differentiating the pros from the amateurs.

If, like eSports pros, you want the clearest, fastest, smoothest, most responsive gaming experience possible, NVIDIA G-SYNC monitors are a game-changer, the likes of which cannot be found anywhere else. A true innovation in an era of iteration, NVIDIA G-SYNC will redefine the way you view games.












S3nt3nc3
Senior Member



Posts: 309
Joined: 2010-12-08

#4677970 Posted on: 10/18/2013 08:23 PM
I see one potential downside of it.
Eye strain.
It takes me quite a while to get used to a monitor with a different refresh rate. My eyes hurt after 10-15 minutes even when I change the refresh rate from 60 to 65 Hz and vice versa.
Can't even imagine how bad it might be when it will fluctuate between 40~120Hz.

MM10X
Senior Member



Posts: 4240
Joined: 2008-08-21

#4677985 Posted on: 10/18/2013 08:58 PM
@S3nt3nc3, I believe only a small number of people have this issue. Personally, I can swap around refresh rates, and the higher it gets the more comfortable my eyes feel. As it refreshes as soon as the frame becomes available ... I don't see the low refresh rate being a problem.

I do see your point though.

wasteomind
Senior Member



Posts: 398
Joined: 2007-10-08

#4678006 Posted on: 10/18/2013 09:24 PM
Thank **** this issue is finally getting more attention. As long as this doesn't have any significant downsides, I will be upgrading to have this.

I ALWAYS use vsync as tearing and stuttering are the bane of my enjoyment. Even with vsync I noticed this issue and had given up hope there might ever be some solution. My goal the past generation has been 60fps @ 1080p and seeing drops into the 40s and 50s which should have been ok, have been agonizing. I thought I was going crazy, but this just confirms the issue with the technology at present.

This is a step in the right direction and I am sold. I just really hope this isn't a software limited technology like the crap from Lucid. Good ideas ruined by crappy software and no development support.

Clouseau
Senior Member



Posts: 2809
Joined: 2011-05-17

#4678018 Posted on: 10/18/2013 09:44 PM
Sounds like this eliminates the necessity of driver or programming optimization. If the gameplay is fluid enough fps has been completely removed from any argument.

Xendance
Senior Member



Posts: 5555
Joined: 2005-07-19

#4678020 Posted on: 10/18/2013 09:47 PM
I see one potential downside of it.
Eye strain.
It takes me quite a while to get used to a monitor with a different refresh rate. My eyes hurt after 10-15 minutes even when I change the refresh rate from 60 to 65 Hz and vice versa.
Can't even imagine how bad it might be when it will fluctuate between 40~120Hz.

That isn't an issue with LCDs, since the pixels are always on. They just change the colour on each screen refresh.

Sounds like this eliminates the necessity of driver or programming optimization. If the gameplay is fluid enough fps has been completely removed from any argument.

This has nothing to do with game optimization...
There might not be any tearing with g-sync, but you'd still notice it in the input. Can you imagine moving your mouse around if the screen updates at 10 fps? Getting the visual feedback from the movement would take 100 milliseconds.





Guru3D.com © 2023