We’ve done IPS panel tech. We’ve done high refresh. So let’s wrap up the holy trinity of gaming-relevant monitor technologies of late. It’s time to talk frame syncing or adaptive sync. Probably better known via brand names like Nvidia G-Sync and AMD FreeSync, frame syncing technology is all about getting your games running smoother and without any nasty screen tearing. But here’s the twist. It does that without requiring that your games run faster or that you buy a $/£1,000 mega-GPU. And it really is rather lovely.

First, a quick note on what we’re actually doing here with these why-you-need posts. The last installment generated one or two complaints on linguistic-going-on-epistemological grounds. Who really needs a 120Hz monitor? While we’re at it, what do I actually mean by ‘need’? Who ‘needs’ anything but protection from the elements and physical sustenance, after all?
If it helps, just suffix the title with “if you’re already thinking of buying a new monitor any time soon.” We couldn’t fit that in the headline box, though. If you’re not in the market for a new screen, well, in purely functional terms, you could indeed play most games on a 10-year-old 15-inch LCD monitor or even 20-year-old CRT. Hell, the latter would actually have some advantage over most LCD monitors in terms of input lag and responsiveness.
And you know what? For a lot of games, much of the time, your enjoyment level would probably normalise once you’d adjusted to such antediluvian display technologies. There’s very, very little you truly need in terms of the latest snazzy display technologies.
On the other hand, playing games on a 27-inch, 1,440p IPS panel at 120Hz with frame syncing is bloody nice. So if you like, think of these posts in those terms. Why IPS, 120Hz or frame syncing is bloody nice. And thus not why you’ll keel over dead if you haven’t got them. Life will go on.
What’s the problem?
With that little detour negotiated, let’s get back on message with frame syncing or adaptive sync. Credit where it’s due, we have Nvidia to thank for it. Not that the company invented the notion. But it did commercialise it in the context of the PC. Nvidia put frame syncing on the map and into the consciousness of gamers. In typical Nvidia style, it did so via a proprietary solution involving end-to-end Nvidia clobber of not a little expense. But we’ll come back to specifics like that once we’ve covered the generalities.
There are two interrelated issues at play here: the problem of syncing the output of your video card with your monitor and a rendering error known as screen tearing.
[Image: First Nvidia. “This is lag. That is stutter. Geddit?”]
The syncing bit is pretty straightforward to understand. Conventional displays have a fixed refresh rate, typically 60Hz or 60 times a second, though as we discussed in the last post, refresh rates up to 144Hz are now on offer.
Regardless of the monitor’s refresh rate, if it’s a standard LCD panel lacking frame-syncing capability, that refresh rate is fixed; it doesn’t vary. And that creates a problem, because games usually run at variable frame rates, often dramatically so as you roam around a game world, perhaps moving from a small indoor space to a huge, open vista, or encountering an army of game characters that suddenly have to be rendered graphically as well as have their individual AI calculated. Frame rates jumping up and down by almost an order of magnitude are routine.
There are a few exceptions. In some games, for instance, CPU bottlenecking can lead to pretty consistent frame rates some of the time.
But actually, that doesn’t matter, because that frame rate certainly won’t be in perfect sync with the refresh rate of the monitor. The bottom line is that you’re going to have a mismatch between the game engine frames being generated by your PC and your monitor’s refresh rate. The game is not going to simply run at a perfect 60Hz (we’ll base assumptions around a 60Hz setup from here on unless otherwise stated).
The net result is frames spewing out at inconvenient moments. OK, it doesn’t literally take the form of frames being fired at a monitor that can’t cope: the output of your video card is 60Hz regardless of your gaming frame rate, and the mismatch happens before the display signal is resolved and sent to your monitor. But it’s the fact that there’s a mismatch, not where it happens in the display signal chain, that matters.
And the solution?
Anyway, there are two ways your display subsystem can solve the problem. First is to effectively stick whatever is ready onto the display. This is the default approach and the problem is that the monitor will frequently refresh before a frame is finished rendering on your graphics card, resulting in two different frames being combined on the screen with an obvious ‘tear’ line bisecting them. Ugly.
[Image: Now AMD. “Whatever you can do, I can do better…”]
The other option is to force the game engine to sync with the monitor, an approach known as V-sync or vertical sync. The thing about V-sync is that it only works perfectly if every single frame is rendered in less than 1/60th of a second. Any slower and you’ll have problems. Think of it this way: if a frame takes more than 1/60th of a second to render, it won’t be ready for the next screen refresh and the video card will have to repeat the previous frame. The result is visible lag and stutter.
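To make that concrete, here’s a minimal sketch in Python, using a deliberately simplified double-buffering model and made-up frame times. This isn’t any real graphics API, just the timing arithmetic:

```python
import math

def vsync_ticks(render_times, hz=60):
    """Return the refresh tick on which each frame appears under a
    simplified double-buffered V-sync model: a frame starts rendering
    when the previous one is displayed, and is shown on the first
    refresh after it finishes rendering."""
    tick = 1 / hz
    ticks, start = [], 0.0
    for rt in render_times:
        finish = start + rt
        tick_no = math.floor(finish / tick) + 1  # first refresh after finish
        ticks.append(tick_no)
        start = tick_no * tick                   # buffer swap on that refresh
    return ticks

# Three fast frames (10 ms each), then one slow frame (20 ms) at 60Hz:
print(vsync_ticks([0.010, 0.010, 0.010, 0.020]))  # -> [1, 2, 3, 5]
# The slow frame misses refresh number 4 entirely, so frame three is
# displayed twice in a row: that's the stutter you see.
```

The point is that one frame only a few milliseconds over budget costs a whole extra refresh cycle, not just those few milliseconds.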
What you really want, then, is a screen that refreshes on your games’ terms. A screen that’s ready and waiting to refresh every time a new frame is ready. Not before and not after. Such a screen would not only be smoother. It also wouldn’t suffer from tearing. Say hello to adaptive sync.
What’s more, the beauty of this kind of frame syncing is that it doesn’t put additional load on your graphics subsystem. Quite the opposite. It makes the absolute most of whatever rendering power you have on offer. If your video card can pump out 40 frames per second in a given game, that’s exactly how many frames you’ll see and without any tearing. Your monitor will run at 40Hz.
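A back-of-envelope illustration of why that matters, assuming a steady 40 fps game on a 60Hz panel and a simplified double-buffered V-sync model (illustrative numbers, not a real display API):

```python
import math

TICK = 1 / 60    # 60Hz refresh interval, ~16.7 ms
FRAME = 1 / 40   # 25 ms to render each frame at a steady 40 fps

# Under V-sync, each frame starts rendering when the previous one is
# displayed and then waits for the next refresh. A 25 ms frame always
# misses the first refresh after the swap, so frames land two ticks apart.
start, shown = 0.0, []
for _ in range(40):                          # one second's worth of frames
    finish = start + FRAME
    tick_no = math.floor(finish / TICK) + 1  # first refresh after finish
    shown.append(tick_no)
    start = tick_no * TICK                   # swap happens on that refresh
vsync_fps = len(shown) / (shown[-1] * TICK)  # frames displayed per second

print(round(vsync_fps))   # -> 30: the 40 fps game displays at an effective 30 fps
print(round(1 / FRAME))   # -> 40: adaptive sync shows every frame, at 40Hz
```

In other words, on this model a 60Hz V-synced panel quietly throws away a quarter of the frames your card renders, while adaptive sync just runs the panel at 40Hz and shows the lot.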
If that all sounds thoroughly rosy, some of you might be wondering how all this fits in with high refresh. If 120Hz-plus is so wonderful, what on earth is the big deal with, say, 40Hz? That, I’m afraid, is an annoyingly good question.
The answer is that an adaptive-synced 40Hz looks much nicer than you’d think given the modest refresh rate. The critical difference is that you don’t get any stalls and stutters to spoil the sense of fluid motion. Running at a higher refresh but with frequent stalls can subjectively look a lot less smooth.
On the other hand, really high refresh rates can solve that problem to a degree. If you’re running at 120Hz and you miss a refresh cycle, then a temporary drop down to an effective 60Hz may not be entirely obvious. And yet higher Hz look at their best when frame-synced. Even when running above 60Hz, you can have the odd laggardly frame here and there that stalls the rendering process. Without adaptive sync, that can mean missing a few refresh cycles. With it, you see the frame as soon as it’s ready, reducing any lag or stuttering to the absolute minimum. And indeed, the vast majority of monitors with adaptive sync support also support 120Hz-plus refresh rates.
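The arithmetic behind that 120Hz point, as a quick sketch (illustrative numbers only):

```python
# If a frame misses one refresh at 120Hz, it lands on the next-but-one
# tick. Illustrative arithmetic only, no real display API involved.
hz = 120
tick_ms = 1000 / hz   # one refresh interval: ~8.3 ms
missed = 2 * tick_ms  # the frame waits two ticks instead of one
print(round(missed, 1), "ms gap -> momentary effective", round(1000 / missed), "Hz")
# -> 16.7 ms gap -> momentary effective 60 Hz
```

So a single missed cycle at 120Hz costs you about 8.3 ms, briefly dropping you to the equivalent of 60Hz, whereas the same miss at 60Hz costs 16.7 ms and drops you to an effective 30Hz. That’s why high refresh softens the problem even without adaptive sync.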
[Image: It’s this glowing thing that might make Nvidia’s G-Sync the superior solution, but certainly makes it more expensive…]
The other obvious snag is that frame-syncing capability isn’t a standard feature of most displays. Right now, I believe the only way to get it is to buy a monitor equipped with Nvidia’s G-Sync technology, which incorporates a proprietary display controller board and needs to be fed with an Nvidia graphics card.
I say ‘believe’ because there is an alternative open standard that’s currently making its way to market. How to refer to it is a little complicated since it’s a much more open standard. I could just call it AMD FreeSync. But you could also say it’s just a feature of the DisplayPort interface with a fancy name.
Either way, the good news is that it doesn’t require a fancy and expensive new controller board in your monitor. It simply makes use of the adaptive sync feature in DisplayPort 1.2a. Of course, your video card and monitor will both need to be DisplayPort 1.2a compliant. Where things get a little complicated is that FreeSync is an AMD technology despite essentially being an expression of a DisplayPort feature. That means it only works with AMD video cards.
Nvidia could, of course, choose to support adaptive sync within DisplayPort. But then it wouldn’t be able to lock us into its video cards and flog its G-Sync controller boards to monitor makers. So for now, that seems unlikely and rather makes a mess of the adaptive sync market. You’ll need to pick and choose carefully to make sure you have a system with the right components. And that’s a pity. I’d very much like to see adaptive sync monitors that will work with basically any modern GPU. Oh well.
In any case, as I write these words I don’t think you can actually buy a monitor with FreeSync support. A number of ‘FreeSync’ monitors have been announced and popped up at various online retailers for pre-order, but haven’t actually arrived with gamers yet.
G-Sync vs FreeSync
As for the pros and cons of G-Sync vs FreeSync, the former has a theoretical technical edge, though I doubt it will add up to much in practice. Then again, I haven’t seen FreeSync running, so can’t really comment. The latter, meanwhile, will be cheaper. Currently, screens with G-Sync controllers are very pricey. It’s also just possible that existing monitors could be updated with new firmware to add support for FreeSync, but I’m not aware of any monitor maker announcing such a move.
UK site Overclockers.co.uk has a handy little landing page that holds all the G-Sync and FreeSync monitors they currently offer, which will give you a quick idea regarding pricing. There’s also a list of FreeSync monitors here and one for G-Sync here.
Finally, as we conclude this holy trinity of supposedly must-have display technologies, I sense we’re flirting with what’s known as the ‘Paradox of Choice’ in the monitor market. That’s the observation that, when it comes to consumer decisions, too much choice can actually make us unhappy. If you’re not routinely exposed to these technologies, it’s certainly not easy to prioritise this stuff. One day, IPS, high refresh and adaptive sync may all be pretty much standard. Until then, how are you supposed to choose between them, given that few displays combine all three?
There is no easy answer. For whatever it’s worth, my own personal order of priority would be IPS, high refresh and then frame syncing. But you may not agree. And that’s why if you can possibly find a way of trying before buying, either through friends or at your local shops and stores, then make it happen.