Hello, V-Sync. Yes, thank you for meeting me here today. I invited you out because I felt the need to share some very important news: no one actually likes you. We just put up with you because, well, there's really no better alternative. In truth, you're inconsistent, awkward, difficult to be around, you cause obnoxious stuttering, and IT'S YOUR SURPRISE BIRTHDAY PAAAARRTY wheee everyone leap out now! OK, not really. But I figured those couple of seconds of revelatory glee might help offset this falling pain piano of existential misery: you're being replaced. By something younger, faster, and more practical. Or at least, that's how it'll be if Nvidia has its way. G-Sync claims to eliminate hassles like stuttering, screen tearing, and the like by synchronizing monitor refresh to the GPU's render rate, rather than forcing the GPU to sync to the monitor, which is what V-Sync does. The result, apparently, is worlds better.
Monitors, you see, typically refresh at a fixed 60Hz, but modern GPUs render frames at variable (and often much faster) rates. So, as it stands, you either enable V-Sync to keep the GPU in clumsy lockstep with the monitor (which leads to response lag, stuttering, etc.), or you disable V-Sync for better response times but risk screen tearing when the two fall out of sync. Both methods are far from optimal. Nvidia, however, claims that it's finally found a best-of-both-worlds solution. It explained in a blog post:
"With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU renders with variable time, the refresh of the monitor now has no fixed rate."
"This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU. So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC."
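If you want to see why that matters, here's a toy timing model in Python. It has nothing to do with Nvidia's actual implementation — the per-frame render times are made up, and the two functions just illustrate the difference between waiting for a fixed 60Hz tick (V-Sync) and refreshing the moment a frame is done (variable refresh):

```python
# Toy model: when does each rendered frame actually appear on screen?
# All numbers are hypothetical; this is only a sketch of the concept.

REFRESH_MS = 1000 / 60  # fixed 60Hz scanout interval, ~16.7 ms


def present_vsync(render_times_ms):
    """With V-Sync, a finished frame waits for the NEXT fixed refresh tick."""
    shown = []
    t = 0.0
    for r in render_times_ms:
        t += r                        # frame finishes rendering at time t
        ticks = -(-t // REFRESH_MS)   # ceiling division: next 60Hz tick
        shown.append(ticks * REFRESH_MS)
    return shown


def present_variable_refresh(render_times_ms):
    """With G-Sync-style variable refresh, the monitor updates as soon as
    the frame is ready -- the GPU drives the timing."""
    shown = []
    t = 0.0
    for r in render_times_ms:
        t += r
        shown.append(t)               # no waiting, no tick to miss
    return shown


renders = [12.0, 20.0, 14.0]          # made-up per-frame render times (ms)
print(present_vsync(renders))             # frames land on 16.7 ms boundaries
print(present_variable_refresh(renders))  # frames land exactly when rendered
```

The 20ms frame misses its 16.7ms window under V-Sync, so it slips to the tick after — that gap between when a frame is ready and when it's shown is the lag and judder Nvidia is talking about, and the variable-refresh version simply doesn't have it.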
G-Sync-enabled displays will work with Nvidia's Kepler-series GPUs and be available early next year from the likes of Asus, BenQ, Philips, and ViewSonic. They seem rather miraculous, so we'll have to wait and see how well they work in practice. On paper, though, this solution sounds pretty watertight. Here's hoping it a) holds up once we're able to put it through its paces and b) isn't too expensive. It is, however, entirely proprietary at the moment, which is basically a deal-breaker for those running non-Nvidia hardware. Ah, format wars. Aren't they grand?