With G-Sync, there doesn’t have to be a trade-off between screen tearing and input lag. Because the dedicated G-Sync module replaces the monitor’s scaler, NVIDIA has control at both ends of the graphics pipeline, from the moment a frame is rendered by the GPU until it is displayed on your screen.
Do note that this is nothing like Adaptive V-Sync, which was arguably a stopgap measure on NVIDIA’s part to fix the issues with V-Sync. All Adaptive V-Sync does is turn V-Sync off when frame rates drop below the V-Sync cap in order to minimize stuttering; when frame rates exceed the cap, V-Sync is turned back on.
Instead of having to deal with a fixed refresh rate, G-Sync varies the panel’s refresh rate in accordance with the frame rate output, so the display is able to accept a frame as soon as the GPU has rendered it. This means that neither the display nor the GPU has to wait for the other, which translates into smoother gameplay, lower input lag, and improved responsiveness. The variable refresh rate also means that the panel will not have to redraw the same frame just because the graphics card cannot keep up, thus minimizing potential stuttering issues.
At the same time, tearing is prevented because the GPU polls the display to determine whether it is in the middle of a vertical blanking interval (the time between frames, in layman's terms), so it will not output a frame in the middle of a scan interval.
On top of that, G-Sync also has a way of handling situations in which frame rates fall below the minimum refresh rate of the monitor. NVIDIA has revealed few details about exactly how this works, but that hasn’t stopped various news sites from speculating.
The prevailing theory is that NVIDIA uses some sort of frame repeating in order to continue to take advantage of G-Sync, essentially repeating frames alongside the variable refresh rate instead of reverting to the minimum refresh rate. Let’s say we have a monitor with a refresh rate between 30Hz and 144Hz. If frame rates fall to 29fps for instance, the display will end up refreshing at 58Hz, with each frame drawn twice and timed to coincide with the monitor refresh interval to maintain a stutter- and tear-free experience.
And should your card prove capable of pushing out frame rates above the maximum refresh rate, the monitor will simply operate as if V-Sync were enabled, with the frame rate capped at whatever the maximum refresh rate is.
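The behavior described above can be illustrated with a short sketch. This is not NVIDIA's actual algorithm (which, as noted, has not been disclosed); it is a minimal model of the prevailing theory, using the example 30Hz–144Hz panel from the text: within range, the panel refreshes at the frame rate; below the minimum, frames are repeated to bring the effective refresh rate back into range; above the maximum, the frame rate is simply capped.

```python
def effective_refresh(fps, min_hz=30, max_hz=144):
    """Illustrative model of how a variable-refresh scheme might handle
    frame rates outside a panel's range (hypothetical, not NVIDIA's code)."""
    if fps >= max_hz:
        # Above the panel's ceiling, behave as if V-Sync were enabled:
        # the frame rate is capped at the maximum refresh rate.
        return max_hz, 1
    if fps >= min_hz:
        # Within range, the panel simply refreshes at the frame rate.
        return fps, 1
    # Below the minimum, draw each frame multiple times, timed so the
    # resulting refresh rate lands back inside the panel's range.
    repeats = 1
    while fps * (repeats + 1) <= max_hz and fps * repeats < min_hz:
        repeats += 1
    return fps * repeats, repeats

# The 29fps example from the text: each frame is drawn twice,
# so the panel refreshes at 58Hz.
print(effective_refresh(29))
```

For 29fps input, this returns a 58Hz refresh with each frame drawn twice, matching the example above.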
Short of running out to get a G-Sync capable notebook, you’ll need to make sure you have the right card, monitor, drivers, and even cable to get G-Sync working properly.
The monitor part is an absolute no-brainer. G-Sync capable monitors are clearly labeled and come with a dedicated G-Sync module in place of the regular scaler. To date, a variety of brands manufacture G-Sync monitors, including the likes of Acer, AOC, ASUS, and BenQ.
And unlike AMD’s recently introduced Low Framerate Compensation (LFC) technology, which requires a monitor with a maximum refresh rate at least 2.5 times its minimum, any G-Sync monitor will allow you to enjoy the full benefit of the technology. Essentially, you’ll only have to worry about picking a G-Sync monitor that fits your budget and preferences (resolution, size, design etc.).
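AMD's eligibility rule mentioned above is a simple ratio check, sketched here for clarity (the 2.5x figure comes from the text; the function name is my own):

```python
def lfc_supported(min_hz, max_hz):
    """Hypothetical check for AMD's LFC eligibility rule: the panel's
    maximum refresh rate must be at least 2.5x its minimum."""
    return max_hz >= 2.5 * min_hz

# A 30-144Hz panel qualifies; a 48-100Hz panel does not (100 < 120).
print(lfc_supported(30, 144), lfc_supported(48, 100))
```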
Next, you’ll need a compatible graphics card and updated drivers. As mentioned earlier, NVIDIA requires a GeForce GTX 650 Ti Boost (not the same as the GeForce GTX 650 Ti) at the minimum and driver version 340.52 or higher, and AMD cards will not work at all. For more details on AMD’s own variable refresh rate technology, you can check out our coverage on FreeSync here and here.
Finally, you need a DisplayPort 1.2 cable. While newer G-Sync monitors like the ASUS ROG Swift PG279Q feature an updated module that accepts input over HDMI 1.4 as well, G-Sync itself still works only over DisplayPort, unlike AMD’s FreeSync, which recently became available over HDMI.