Product Guide

NVIDIA Announces G-Sync Technology

It is Day Two of NVIDIA’s “The Way It’s Meant To Be Played” event, and one of the biggest announcements made relates to lag, stutter and tearing - three perennial problems that have long plagued gamers.

Briefly, these problems arise from a mismatch between the GPU’s draw rate and the monitor’s refresh rate. GPU draw rates typically vary with the complexity of the scene, whereas monitor refresh rates (with V-Sync on, at least) are fixed. This mismatch causes lag and stutter. Gamers work around it by disabling V-Sync, but that causes the phenomenon called tearing, where the monitor displays parts of two or more frames in a single screen draw.

This is why you often see game developers aiming for the magic average of 60 fps when designing games. Only by doing so can they ensure that there is no lag, stutter or tearing. However, as we all know, keeping a game at 60 fps or above is a difficult task, and it also means sacrificing graphics quality and visual effects.

G-Sync solves these problems by integrating a special chip into the monitor that, very simply, synchronizes the monitor’s refresh rate to the GPU’s draw rate. The end result is that monitors now have a variable refresh rate, ranging from as low as 30 Hz to as high as 144 Hz. This is different from NVIDIA's existing Adaptive VSync technology, which simply turns V-Sync on and off depending on the GPU's draw rate.
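The timing difference can be sketched in a few lines of Python. This is a simplified model with made-up frame times, not NVIDIA's actual implementation: with a fixed 60 Hz refresh and V-Sync, a frame that misses a refresh boundary is held back a whole extra refresh, so on-screen intervals alternate unevenly; with a G-Sync-style variable refresh, each frame simply appears the moment it is drawn.

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval (~16.7 ms)

def fixed_refresh(render_times_ms):
    """Simplified double-buffered V-Sync: a finished frame waits for the
    next refresh boundary, and the GPU starts its next frame at that swap."""
    shown, t = [], 0.0
    for render in render_times_ms:
        finish = t + render
        t = math.ceil(finish / REFRESH_MS) * REFRESH_MS  # wait for vblank
        shown.append(t)
    return shown

def variable_refresh(render_times_ms):
    """G-Sync-style panel: the monitor refreshes as soon as a frame is ready."""
    shown, t = [], 0.0
    for render in render_times_ms:
        t += render
        shown.append(t)
    return shown

# Hypothetical GPU alternating between fast (10 ms) and slow (18 ms) frames.
renders = [10, 18, 10, 18, 10]
for name, times in (("v-sync", fixed_refresh(renders)),
                    ("g-sync", variable_refresh(renders))):
    gaps = [round(b - a, 1) for a, b in zip(times, times[1:])]
    print(name, gaps)
# v-sync [33.3, 16.7, 33.3, 16.7]  <- intervals double, perceived as stutter
# g-sync [18, 10, 18, 10]          <- intervals track actual draw times
```

In the fixed-refresh case the 18 ms frames just miss a 16.7 ms boundary and stay on screen for two full refreshes, which is the stutter gamers see; the variable-refresh case shows each frame for exactly as long as it took to draw.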

The implications are far-reaching, but most significantly, game developers now have greater wiggle room with regard to the scenes they can design. Gaming industry luminaries such as Tim Sweeney (founder and CEO of Epic Games), Johan Andersson (Technical Director at DICE) and John Carmack (CTO of Oculus VR and co-founder of id Software) all concurred that G-Sync can have a great impact on game design.

The effects of G-Sync are hard to capture on video, but based on our first impressions, we can say that the technology is truly effective. Below 60 fps, for example, we could clearly see stuttering and lag on the monitor without G-Sync; on the monitor with the G-Sync module, the same scene remained buttery smooth.

As of now, ASUS, BenQ, Philips and ViewSonic have signed on and will release monitors that support G-Sync. These monitors will be made available in Q1 of next year. G-Sync will also require a GeForce GTX 650 Ti Boost or better graphics card to work.