
Nvidia announced G-Sync - eliminates stutter and screen tearing with a daughter module

by Hilbert Hagedoorn on: 10/19/2013 01:38 PM | source: | 142 comment(s)

If you have been reading our blog today, you will have noticed that NVIDIA made several technology announcements over the past two days. The first major one is G-Sync, a new technology that combines a new monitor hardware module with software support. G-Sync is a proprietary Nvidia daughter board that will be included in monitors from ASUS, BenQ, Philips and ViewSonic. Read more after the break.


The G-Sync module can drive an LCD display even when frame timings fluctuate all over the place. As soon as a frame in the framebuffer is complete, the module updates the screen, eliminating tearing and stutter. After seeing some demos I can't be anything other than impressed: very smooth, stutter- and tearing-free framerates on the monitor.

As you can read, I have no time to explain in depth what the technology does, but basically the video card syncs up with the module and drives the LCD fully synchronized: whenever your graphics card has a frame ready, that frame is synced with the monitor. This results in an incredibly smooth experience on your monitor without stuttering and screen tearing. Seeing is believing though, that is the truth. A release date has not been announced, but we'll review a unit soon enough and we expect availability in the Christmas season already.

Nvidia:

Several years in the making, G-SYNC™ technology synchronises the monitor’s refresh rate to the GPU’s render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.

“Our commitment to create a pure gaming experience led us to G-SYNC,” said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. “This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you’ll never want to go back.”

Since their earliest days, displays have had fixed refresh rates – typically 60 times a second (hertz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. As the GPU seeks to synchronise with the monitor, persistent tearing occurs. Turning on V-SYNC (or Vertical-SYNC) eliminates tearing but causes increased lag and stutter as the GPU and monitor refresh at different rates.

System Requirements for NVIDIA G-SYNC DIY Kit Modification for ASUS VG248QE monitor

GPU:

  • G-SYNC features require an NVIDIA GeForce GTX 650 Ti BOOST GPU or higher:
  • GTX TITAN
  • GTX 780
  • GTX 770
  • GTX 760
  • GTX 690
  • GTX 680
  • GTX 670
  • GTX 660 Ti
  • GTX 660
  • GTX 650 Ti Boost

Display:

  • G-SYNC DIY modification kit requires an ASUS VG248QE monitor.

Driver:

  • R331.58 or higher

Operating System:

  • Windows 8.1
  • Windows 8
  • Windows 7

 

System Requirements for NVIDIA G-SYNC enabled monitors

GPU:

  • G-SYNC features require an NVIDIA GeForce GTX 650 Ti BOOST GPU or higher:
  • GTX TITAN
  • GTX 780
  • GTX 770
  • GTX 760
  • GTX 690
  • GTX 680
  • GTX 670
  • GTX 660 Ti
  • GTX 660
  • GTX 650 Ti Boost

Driver:

  • R331.58 or higher

Operating System:

  • Windows 8.1
  • Windows 8
  • Windows 7

NVIDIA® G-SYNC™ is a groundbreaking new innovation that casts aside decades-old thinking to create the smoothest, most responsive computer displays ever seen. A monitor module you can install yourself, or buy pre-installed in gamer-focused monitors, NVIDIA G-SYNC waves goodbye to the days of screen tearing, input lag, and eyestrain-inducing stuttering caused by decades-old tech lazily carried over from analog TVs to modern-day monitors.

The Problem: Old Tech

When TVs were first developed they relied on CRTs, which work by scanning a beam of electrons across the surface of a phosphor-coated tube. This beam causes a pixel on the tube to glow, and when enough pixels are activated quickly enough the CRT can give the impression of full motion video. Believe it or not, these early TVs had 60Hz refresh rates primarily because the United States power grid is based on 60Hz AC power. Matching TV refresh rates to that of the power grid made early electronics easier to build, and reduced power interference on the screen.

By the time PCs came to market in the early 1980s, CRT TV technology was well established and was the easiest and most cost-effective technology to utilize for the creation of dedicated computer monitors. 60Hz and fixed refresh rates became standard, and system builders learned how to make the most of a less than perfect situation. Over the past three decades, even as display technology has evolved from CRTs to LCDs and LEDs, no major company has challenged this thinking, and so syncing GPUs to monitor refresh rates remains the standard practice across the industry to this day.

Problematically, graphics cards don’t render at fixed speeds. In fact, their frame rates will vary dramatically even within a single scene of a single game, based on the instantaneous load that the GPU sees. So with a fixed refresh rate, how do you get the GPU images to the screen? The first way is to simply ignore the refresh rate of the monitor altogether, and update the image being scanned to the display mid-cycle. This we call ‘VSync Off Mode’ and it is the default way most gamers play. The downside is that when a single refresh cycle shows two images, a very obvious “tear line” is evident at the break, commonly referred to as screen tearing. The established solution to screen tearing is to turn VSync on, to force the GPU to delay screen updates until the monitor cycles to the start of a new refresh cycle. This causes stutter whenever the GPU frame rate is below the display refresh rate. And it also increases latency, which introduces input lag, the visible delay between a button being pressed and the result occurring on-screen.
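To make that tradeoff concrete, here is a minimal, purely illustrative simulation of the two modes described above. The 60 Hz interval, the frame times and the function names are assumptions chosen for the example, not NVIDIA code.

```python
import math

# Illustrative sketch, not NVIDIA code: how variable GPU frame times interact
# with a fixed 60 Hz scanout under "VSync Off" and "VSync On".
REFRESH = 1.0 / 60                                   # fixed monitor refresh interval (s)
FRAME_TIMES = [0.012, 0.020, 0.015, 0.030, 0.014]    # hypothetical GPU render times (s)

def vsync_off(frame_times):
    """Flip the moment a frame finishes; a flip that lands mid-scanout means
    the panel shows parts of two frames at once -> a visible tear line."""
    t, tears = 0.0, 0
    for ft in frame_times:
        t += ft
        if t % REFRESH > 1e-9:        # flip not aligned with a refresh boundary
            tears += 1
    return tears

def vsync_on(frame_times):
    """Hold each flip until the next refresh boundary (ample buffering assumed).
    Frames slower than 1/60 s miss a boundary, so the previous frame is shown
    again: stutter, plus extra latency between 'frame done' and 'frame visible'."""
    t, shown_at = 0.0, []
    for ft in frame_times:
        t += ft
        shown_at.append(math.ceil(t / REFRESH) * REFRESH)
    return shown_at

print("tears with VSync off:", vsync_off(FRAME_TIMES))
print("on-screen times with VSync on:", [round(x, 4) for x in vsync_on(FRAME_TIMES)])
```

In this toy model the 0.030 s frame misses its 60 Hz slot and is pushed to the next one, which is exactly the stutter and added latency the paragraph above describes.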

Worse still, many players suffer eyestrain when exposed to persistent VSync stuttering, and others develop headaches and migraines, which drove us to develop Adaptive VSync, an effective, critically-acclaimed solution. Despite this development, VSync’s input lag issues persist to this day, something that’s unacceptable for many enthusiasts, and an absolute no-go for eSports pro-gamers who custom-pick their GPUs, monitors, keyboards, and mice to minimize the life-and-death delay between action and reaction.

The Solution: NVIDIA G-SYNC


Enter NVIDIA G-SYNC, which eliminates screen tearing, VSync input lag, and stutter. To achieve this revolutionary feat, we build a G-SYNC module into monitors, allowing G-SYNC to synchronize the monitor to the output of the GPU, instead of the GPU to the monitor, resulting in a tear-free, faster, smoother experience that redefines gaming.
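For contrast with the fixed-refresh loop sketched earlier, here is a similarly minimal, purely illustrative sketch of the variable-refresh idea: scanout follows frame completion instead of a fixed clock. The 144 Hz ceiling and the function name are assumptions for the example, not the G-SYNC module's actual firmware.

```python
# Illustrative sketch (an assumption, not the G-SYNC module's firmware):
# scanout is driven by frame completion, limited only by how fast the panel
# can physically refresh.
MIN_INTERVAL = 1.0 / 144                             # hypothetical 144 Hz panel ceiling
FRAME_TIMES = [0.012, 0.020, 0.015, 0.030, 0.014]    # same hypothetical GPU frame times (s)

def gsync_present(frame_times):
    """Scan each frame out as soon as it is complete (never mid-frame, so no
    tearing) and never wait for an arbitrary fixed boundary (so no fixed-rate
    stutter); only the panel's own speed limit applies."""
    t, last_scanout, shown_at = 0.0, float("-inf"), []
    for ft in frame_times:
        t += ft
        scanout = max(t, last_scanout + MIN_INTERVAL)   # respect the panel's max refresh rate
        shown_at.append(round(scanout, 4))
        last_scanout = scanout
    return shown_at

print("on-screen times with G-SYNC-style sync:", gsync_present(FRAME_TIMES))
```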

Industry luminaries John Carmack, Tim Sweeney, Johan Andersson and Mark Rein have been bowled over by NVIDIA G-SYNC’s game-enhancing technology. Pro eSports players and pro-gaming leagues are lining up to use NVIDIA G-SYNC, which will expose a player’s true skill, demanding even greater reflexes thanks to the unnoticeable delay between on-screen actions and keyboard commands. And in-house, our diehard gamers have been dominating lunchtime LAN matches, surreptitiously using G-SYNC monitors to gain the upper hand.

Online, if you have a NVIDIA G-SYNC monitor you’ll have a clear advantage over others, assuming you also have a low ping.

How To Upgrade To G-SYNC

If you’re as excited by NVIDIA G-SYNC as we are, and want to get your own G-SYNC monitor, here’s how. Later this year, our first G-SYNC modules will be winging their way to professional modders and System Builders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available.

Product Details - NVIDIA G-SYNC DIY Kit Modification for ASUS VG248QE Monitor

The specifications of an ASUS VG248QE, before and after an upgrade to NVIDIA G-SYNC.

Alternatively, if you’re a dab hand with a Phillips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents that are a result of your own doing. A complete installation instruction manual is available to view online, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our gurus believe installation should take approximately 30 minutes.

If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.

A Groundbreaking Revolution Has Arrived

In this time of technological marvels, there are few advances one can truly call “innovative”, or “revolutionary”. NVIDIA G-SYNC, however, is one of the few, revolutionizing outmoded monitor technology with a truly innovative, groundbreaking advancement that has never before been attempted.

G-SYNC’s elimination of input lag, tearing, and stutter delivers a stunning visual experience on any G-SYNC-enhanced monitor; one so stunning that you’ll never want to use a ‘normal’ monitor ever again. In addition to cutting-edge changes to the viewing experience, multiplayer gamers will receive a significant competitive advantage when G-SYNC is paired with a fast GeForce GTX GPU and low-lag input devices, something that’ll surely pique the interest of shooter aficionados. For eSports players, NVIDIA G-SYNC is an essential upgrade. With G-SYNC’s removal of input lag, successes and failures are squarely in the hands of players, differentiating the pros from the amateurs.

If, like eSports pros, you want the clearest, fastest, smoothest, most responsive gaming experience possible, NVIDIA G-SYNC monitors are a game-changer, the likes of which cannot be found anywhere else. A true innovation in an era of iteration, NVIDIA G-SYNC will redefine the way you view games.








RPGgamesplayer
Senior Member



Posts: 1475
Joined: 2011-07-03

#4680955 Posted on: 10/22/2013 07:31 PM
No worries dude Lmao happens to us all

k1net1cs
Senior Member



Posts: 3783
Joined: 2010-11-14

#4680966 Posted on: 10/22/2013 07:50 PM
NVIDIA G-Sync Overview and Explanation with Tom Petersen
http://www.youtube.com/watch?v=KhLYYYvFp9A&feature=player_embedded#t=0
Still watching the vid, but from minute 40 onward there's an explanation of how GSync handles low fps, and what kind of specs to expect from the first few GSync monitors.

The min-max refresh rate you can get from GSync depends entirely on the panel's spec, so if your fps is dropping below the min refresh rate of the monitor, GSync will just repeat the last rendered frame; this is probably what the RAM chips are supposedly for.
Like rflair suggested, those chips are probably going to act as a buffer, but from the vid White posted, it seems that they are mainly used for low fps situations, where the fps is lower than the minimum refresh rate the panel supports.
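A minimal sketch of that frame-repeat behaviour, purely as an illustration of the idea described in the video (the 30 Hz panel minimum and the function name are assumptions, not the module's actual logic): when the next frame is overdue, the stored last frame is rescanned from the module's memory.

```python
# Illustrative sketch of the low-fps behaviour described above (an assumption
# based on the video, not the module's actual logic): if the next GPU frame has
# not arrived within the panel's maximum allowed refresh interval, rescan the
# last completed frame from the module's local memory.
MAX_INTERVAL = 1.0 / 30          # hypothetical panel minimum refresh of 30 Hz

def schedule_scanouts(frame_arrivals):
    """frame_arrivals: times (s) at which new frames land in the module's buffer.
    Returns (time, source) pairs, where source is 'new' or 'repeat'."""
    scanouts, last_scan = [], 0.0
    for arrival in frame_arrivals:
        # Keep rescanning the stored frame while the new one is overdue.
        while arrival - last_scan > MAX_INTERVAL:
            last_scan += MAX_INTERVAL
            scanouts.append((round(last_scan, 4), "repeat"))
        last_scan = arrival
        scanouts.append((round(last_scan, 4), "new"))
    return scanouts

# A roughly 10 fps stretch forces repeated scanouts between the new frames.
print(schedule_scanouts([0.03, 0.12, 0.24]))
```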

The 1st gen monitors are likely to use converted 3D Vision monitor panels that operate at 144Hz, so that's quite a high ceiling for the GPU.
Maybe that explains why 1st gen GSync monitors are around $399 (the 144Hz max refresh rate), but dunno.



From 1h03m onward there's an explanation of how compatible the GSync module is with different panels.

The basic idea is that the module is panel-type agnostic, but the panels have to use LVDS, which is quite a common panel interface on LCDs.
And since the module is programmable, it's up to vendors to make the module support various resolutions and panel types.
