NVIDIA G-Sync 101: Everything you need to know
The problem with V-Sync
Solving the perennial problem of stutter and lag
Gaming monitors used to be characterized by two simple features – fast pixel response times and high refresh rates. That changed in 2013 when NVIDIA first introduced us to G-Sync, a variable refresh rate technology that it said was the ultimate solution to the lag, stutter and screen tearing that have continually dogged gamers.
By synchronizing the monitor’s refresh rate with the frame render rate of a compatible NVIDIA card, G-Sync practically guaranteed a smooth and tear-free gaming experience for the first time. Indeed, in our first impressions of the technology over two years ago, we immediately noticed how G-Sync kept gameplay butter-smooth, whereas a monitor without G-Sync faltered and exhibited noticeable stutters under identical conditions. Once you experience the difference, you’ll probably have a hard time going back to a regular display.
G-Sync has come a long way since then, at least in terms of adoption. While initially confined to a limited number of high-end monitors, G-Sync has since made its way onto notebook displays and a growing number of gaming monitors (even curved ones!). Now, if you want to head out and buy a gaming monitor, you’ll have to decide if you want one that supports G-Sync as well.
Unfortunately, G-Sync monitors don’t come cheap, especially when compared to monitors that support FreeSync, AMD’s rival variable refresh rate technology. Because of the need to integrate a dedicated G-Sync module, which amounts to added costs on the part of manufacturers, G-Sync monitors often come with a significant price premium.
But even if you’re willing to swallow the upfront cost of a G-Sync capable monitor, you’ll still have to make sure you have a compatible graphics card. It goes without saying that AMD cards are a no-go; you’ll need at least an NVIDIA GeForce GTX 650 Ti Boost or newer, running driver version 340.52 or higher.
With that said, G-Sync may still not be right for everyone, and there are even certain scenarios where you may not want to have G-Sync enabled. Read on to find out how G-Sync works, and how you can get the most out of it.
Before G-Sync, there was V-Sync
In the absence of G-Sync, gamers typically run into one of two issues – stuttering or tearing. The first generally occurs when frame rates are low, and is often conflated with lag, while the second tends to present itself when frame rates are too high and exceed the maximum refresh rate of the monitor.
The link between stuttering and low frame rates is clear. The graphics card needs to render a certain number of frames per second in order to deliver a playable experience. This figure is generally in the range of 20 to 30fps for single-player games, although players of fast-paced, reaction-based shooters will demand a far higher figure – usually 100fps at the very least.
But when it comes to screen tearing, that’s when a powerful graphics card can become more of an annoyance than a blessing. For instance, it would be a problem if your card was pushing 120fps on a 60Hz monitor, because the monitor can only handle 60 frames a second. The monitor essentially cannot keep up with your graphics card, and when a new frame is rendered in the middle of a scan interval, sections of two different frames are displayed on your screen, creating a disjointed image with horizontal tears.
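The mismatch above is easy to put in numbers. As a rough illustration (not from the article), the timings below show why a 120fps output on a 60Hz panel means the frame buffer can be swapped mid-scan:

```python
# Illustrative arithmetic: GPU frame time vs. monitor scan interval.
refresh_hz = 60
fps = 120

scan_interval_ms = 1000 / refresh_hz  # time the monitor spends drawing one frame (~16.7 ms)
frame_time_ms = 1000 / fps            # time the GPU takes to render one frame (~8.3 ms)

# How many frames the GPU completes while the monitor draws a single frame.
frames_per_scan = scan_interval_ms / frame_time_ms

print(f"Scan interval: {scan_interval_ms:.1f} ms")
print(f"Frame time:    {frame_time_ms:.1f} ms")
print(f"Frames rendered per scan: {frames_per_scan:.0f}")
# With 2 frames finished per scan, a fresh frame lands in the buffer while the
# screen is only partway through drawing the old one - the visible result is a
# horizontal tear line where the two frames meet.
```

The exact figures (120fps, 60Hz) are just the example from the text; any time the render rate outpaces the refresh rate without synchronization, the same mid-scan swap can occur.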
The traditional solution was V-Sync (or vertical synchronization), which is a fairly crude method. V-Sync eliminates screen tearing by forcing the graphics card to wait until the monitor is ready to display the next frame, effectively capping your card’s frame rate output at the monitor’s maximum refresh rate. That’s all well and good if your card is powerful enough to never drop below the maximum refresh rate – but the moment it does, stuttering occurs.
V-Sync responds to frame rate dips by dropping the cap even further, to 30 or 45fps for example. If frame rates recover, the cap is automatically raised back to 60fps. These swings can occur quite frequently in graphically intensive games, resulting in noticeable judder that puts a real dampener on the entire experience. Consistent frame rates play a big part in determining whether or not you get fluid gameplay, which is why a rapidly alternating frame rate cap is such a problem, even if 30fps may seem like more than enough to some folks.
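The way the cap snaps between values can be sketched in a few lines. This is a simplified model assuming classic double-buffered V-Sync on a 60Hz monitor (triple buffering or adaptive schemes behave differently, which is where figures like 45fps come from): a frame that misses a refresh tick must wait for the next one, so the effective rate locks to an integer divisor of the refresh rate.

```python
import math

def effective_fps(render_ms, refresh_hz=60):
    """Effective frame rate under double-buffered V-Sync (simplified model).

    A finished frame is only shown at the next refresh tick, so the GPU
    effectively waits a whole number of refresh intervals per frame.
    """
    interval_ms = 1000 / refresh_hz           # ~16.7 ms per refresh at 60Hz
    intervals_waited = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals_waited

print(effective_fps(15))  # 60.0 -- fits within one 16.7 ms interval
print(effective_fps(20))  # 30.0 -- misses one tick, waits for the next
print(effective_fps(40))  # 20.0 -- misses two ticks
```

Note the cliff: a frame that takes 20ms instead of 15ms doesn’t cost you a few fps – it halves the output to 30fps, which is exactly the sudden cap-dropping the paragraph above describes.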
Furthermore, when the graphics card cannot keep up with the monitor’s fixed refresh rate, the panel ends up displaying the same frame over more than one scan interval while waiting for the next one to be rendered, which contributes to the choppy experience as well.
And even if none of these issues occur, V-Sync can introduce other unwanted side effects like markedly increased input lag, which undermines the overall feeling of responsiveness. One of our most unpleasant experiences was in Portal 2, where turning V-Sync on added so much delay to our mouse movements that we preferred to turn it off and put up with the huge amounts of screen tearing while Wheatley was trying to bring the whole facility down.