Why You Need A Monitor With Adaptive Sync

We’ve done IPS panel tech. We’ve done high refresh. So let’s wrap up the holy trinity of gaming-relevant monitor technologies of late. It’s time to talk frame syncing or adaptive sync. Probably better known via brand names like Nvidia G-Sync and AMD FreeSync, frame syncing technology is all about getting your games running smoother and without any nasty screen tearing. But here’s the twist. It does that without requiring that your games run faster or that you buy a $/£1,000 mega-GPU. And it really is rather lovely.

First, a quick note on what we’re actually doing here with these why-you-need posts. The last installment generated one or two complaints on linguistic-going-on-epistemological grounds. Who really needs a 120Hz monitor? While we’re at it, what do I actually mean by ‘need’? Who ‘needs’ anything but protection from the elements and physical sustenance, after all?

If it helps, just suffix the title with “if you’re already thinking of buying a new monitor any time soon.” We couldn’t fit that in the headline box, though. If you’re not in the market for a new screen, well, in purely functional terms, you could indeed play most games on a 10-year-old 15-inch LCD monitor or even 20-year-old CRT. Hell, the latter would actually have some advantage over most LCD monitors in terms of input lag and responsiveness.

And you know what? For a lot of games, much of the time, your enjoyment level would probably normalise once you’d adjusted to such antediluvian display technologies. There’s very, very little you truly need in terms of the latest snazzy display technologies.

On the other hand, playing games on a 27-inch, 1,440p IPS panel at 120Hz with frame syncing is bloody nice. So if you like, think of these posts in those terms. Why IPS, 120Hz or frame syncing is bloody nice. And thus not why you’ll keel over dead if you haven’t got them. Life will go on.

What’s the problem?

With that little detour negotiated, let’s get back on message with frame syncing or adaptive sync. Credit where it’s due, we have Nvidia to thank for it. Not that the company invented the notion. But it did commercialise it in the context of the PC. Nvidia put frame syncing on the map and into the consciousness of gamers. In typical Nvidia style, it did so via a proprietary solution involving end-to-end Nvidia clobber of not a little expense. But we’ll come back to specifics like that once we’ve covered the generalities.

There are two interrelated issues at play here: the problem of syncing the output of your video card with your monitor and a rendering error known as screen tearing.

First Nvidia: This is lag. That is stutter. Geddit?

The syncing bit is pretty straightforward to understand. Conventional displays have a fixed refresh rate, typically 60Hz or 60 times a second, though as we discussed in the last post, refresh rates up to 144Hz are now on offer.

Regardless of the monitor’s refresh rate, if it’s a standard LCD panel lacking frame-syncing capability, that refresh rate is fixed; it doesn’t vary. And that creates a problem, because games usually run at variable frame rates, and often dramatically so, as you roam around a game world, perhaps moving from a small indoor space to a huge, open vista, or encounter an army of game characters that suddenly have to be rendered graphically as well as have their individual AI calculated. Frame rates jumping up and down by almost an order of magnitude are routine.

There are a few exceptions. In some games, for instance, CPU bottlenecking can lead to pretty consistent frame rates some of the time.

But actually, that doesn’t matter because that frame rate certainly won’t be in perfect sync with the refresh rate of the monitor. The bottom line is that you’re going to have a mismatch between the game engine frames being generated by your PC and your monitor’s refresh rate. The game is not going to simply run at a perfect 60Hz (we’ll base assumptions around a 60Hz setup from here on unless otherwise stated).

The net result is frames spewing out at inconvenient moments. OK, it doesn’t literally take the form of frames being fired at a monitor that can’t cope: the output of your video card is 60Hz regardless of your gaming frame rate, and the mismatch happens before the display signal is resolved and sent to your monitor. But it’s the fact that there’s a mismatch, not where it happens in the display signal chain, that matters.

And the solution?

Anyway, there are two ways your display subsystem can handle the problem. The first is to effectively stick whatever is ready onto the display. This is the default approach, and the problem is that the monitor will frequently refresh before a frame is finished rendering on your graphics card, resulting in two different frames being combined on the screen with an obvious ‘tear’ line bisecting them. Ugly.
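To make the tearing mechanics a little more concrete, here’s a rough back-of-the-envelope sketch. It’s my own illustration with assumed numbers (a 60Hz, 1080-line panel), not anything measured: the vertical position of the tear line simply depends on how far the scanout had progressed when the buffer swap interrupted it.

```python
# Hypothetical sketch with assumed numbers: a 60 Hz, 1080-line display.
REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ   # ~16.7 ms to scan out one full refresh
LINES = 1080                     # vertical resolution

def tear_line(swap_offset_ms):
    """Row at which the tear appears if the frame buffer is swapped
    swap_offset_ms into the current scanout."""
    progress = (swap_offset_ms % SCANOUT_MS) / SCANOUT_MS
    return int(progress * LINES)

# A swap 8 ms into the ~16.7 ms scanout tears just below mid-screen.
print(tear_line(8.0))  # 518
```

Swap exactly on the refresh boundary and the tear lands at row zero, i.e. no visible tear, which is exactly what the syncing approaches below try to guarantee.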

Now AMD: Whatever you can do, I can do better…

The other option is to force the game engine to sync with the monitor, an approach known as V-sync or vertical sync. The thing about V-sync is that it only works perfectly if every single frame is rendered in less than 1/60th of a second. Any slower and you’ll have problems. Think of it this way: if a frame takes more than 1/60th of a second to render, it won’t be ready for the next screen refresh and the video card will have to repeat the previous frame. The result is visible lag and stutter.
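That deadline arithmetic is easy to sketch. The following is a hypothetical illustration of my own (a 60Hz panel and the frame times shown are assumptions): under V-sync, a frame’s time on screen gets rounded up to a whole number of refresh intervals.

```python
import math

REFRESH_MS = 1000 / 60  # one 60 Hz refresh interval, ~16.7 ms

def vsync_display_ms(render_ms):
    """How long a frame is shown under V-sync: its render time
    rounded up to the next whole refresh interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# 16 ms fits into one refresh; 18 ms spills into a second one,
# so that frame sits on screen twice as long -- visible stutter.
print(round(vsync_display_ms(16.0), 1))  # 16.7
print(round(vsync_display_ms(18.0), 1))  # 33.3
```

Note the cliff edge: missing the deadline by a couple of milliseconds doesn’t cost you a couple of milliseconds, it costs you an entire extra refresh.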

What you really want, then, is a screen that refreshes on your games’ terms. A screen that’s ready and waiting to refresh every time a new frame is ready. Not before and not after. Such a screen would not only be smoother. It also wouldn’t suffer from tearing. Say hello to adaptive sync.

What’s more, the beauty of this kind of frame syncing is that it doesn’t put additional load on your graphics subsystem. Quite the opposite. It makes the absolute most of whatever rendering power you have on offer. If your video card can pump out 40 frames per second in a given game, that’s exactly how many frames you’ll see, and without any tearing. Your monitor will run at 40Hz.
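The 40 frames per second example can be worked through in a few lines. This is a simplified model of my own (steady 25ms frame times are assumed; real games are burstier): adaptive sync shows each frame as it lands, while V-sync on a fixed 60Hz panel rounds every frame up to two refresh intervals.

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval, ~16.7 ms

def effective_hz(render_ms, adaptive):
    """Effective refresh rate for a steady stream of render_ms frames."""
    if adaptive:
        shown_ms = render_ms  # the screen waits for the frame: no rounding
    else:
        # V-sync: hold each frame until the next fixed refresh tick
        shown_ms = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
    return 1000 / shown_ms

# A steady 25 ms (40 fps) game: adaptive sync shows all 40 frames at
# 40 Hz; V-sync quantises the very same output down to 30 Hz.
print(round(effective_hz(25.0, adaptive=True)))   # 40
print(round(effective_hz(25.0, adaptive=False)))  # 30
```

Same GPU, same frames, yet one route throws a quarter of them away. That’s the “making the absolute most of whatever rendering power you have” point in miniature.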

If that all sounds thoroughly rosy, some of you might be wondering how all this fits in with high refresh. If 120Hz-plus is so wonderful, what on earth is the big deal with, say, 40Hz? That, I’m afraid, is an annoyingly good question.

The answer is that an adaptive-synced 40Hz looks much nicer than you’d think given the modest refresh rate. The critical difference is that you don’t get any stalls and stutters to spoil the sense of fluid motion. Running at a higher refresh but with frequent stalls can subjectively look a lot less smooth.

On the other hand, really high refresh rates can solve that problem to a degree. If you’re running at 120Hz and you miss a refresh cycle, a temporary drop to an effective 60Hz may not be entirely obvious. And yet higher refresh rates still look their best when frame-synced. Even when running above 60Hz, you can have the odd laggardly frame here and there that stalls the rendering process. Without adaptive sync, that can mean missing a few refresh cycles. With it, you see the frame as soon as it’s ready, reducing any lag or stuttering to the absolute minimum. And indeed, the vast majority of monitors with adaptive sync support also support 120Hz-plus refresh rates.
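The same rounding-up model shows why the penalty for a just-missed refresh shrinks at higher refresh rates. Again, a hypothetical sketch with assumed numbers, nothing measured: the cost of missing a deadline is one refresh tick, and at 120Hz a tick is only ~8.3ms.

```python
import math

def fallback_hz(refresh_hz, render_ms):
    """Effective rate, without adaptive sync, for frames that take
    render_ms: display time rounds up to whole refresh ticks."""
    tick_ms = 1000 / refresh_hz
    return 1000 / (math.ceil(render_ms / tick_ms) * tick_ms)

# A 17 ms frame just misses the 60 Hz deadline and halves the rate
# to 30 fps; at 120 Hz the same frame only drops to 40 fps.
print(round(fallback_hz(60, 17.0)))   # 30
print(round(fallback_hz(120, 17.0)))  # 40
```

So high refresh softens the blow of a missed cycle, while adaptive sync removes the rounding altogether, which is why the two technologies pair so naturally.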

It’s this glowing thing that might make Nvidia’s G-Sync the superior solution but certainly makes it more expensive…

The other obvious snag is that frame-syncing capability isn’t a standard feature of most displays. Right now, I believe the only way to get it is to buy a monitor equipped with Nvidia’s G-Sync technology, which incorporates a proprietary display controller board and needs to be fed with an Nvidia graphics card.

I say ‘believe’ because there is an alternative open standard that’s currently making its way to market. How to refer to it is a little complicated since it’s a much more open standard. I could just call it AMD FreeSync. But you could also say it’s just a feature of the DisplayPort interface with a fancy name.

Either way, the good news is that it doesn’t require a fancy and expensive new controller board in your monitor. It simply makes use of the adaptive sync command in DisplayPort 1.2a. Of course, your video card and monitor will both need to be DisplayPort 1.2a compliant. Where things get a little complicated is that FreeSync is an AMD technology despite essentially being an expression of a DisplayPort feature. That means it only works with AMD video cards.

Nvidia could, of course, choose to support adaptive sync within DisplayPort. But then it wouldn’t be able to lock us into its video cards and flog its G-Sync controller boards to monitor makers. So for now, that seems unlikely and rather makes a mess of the adaptive sync market. You’ll need to pick and choose carefully to make sure you have a system with the right components. And that’s a pity. I’d very much like to see adaptive sync monitors that will work with basically any modern GPU. Oh well.

In any case, as I write these words I don’t think you can actually buy a monitor with FreeSync support. A number of ‘FreeSync’ monitors have been announced and popped up on the various online retailers for pre-order, but haven’t actually arrived with gamers yet.

G-Sync vs FreeSync

As for the pros and cons of G-Sync vs FreeSync, the former has a theoretical technical superiority that I doubt will add up to much in practice (though I haven’t seen FreeSync running, so can’t really comment), while the latter will be cheaper. Currently, screens with G-Sync controllers are very pricey. It’s also just possible that existing monitors could be updated with new firmware to add support for FreeSync, but I’m not aware of any monitor maker announcing such a move.

UK site Overclockers.co.uk has a handy little landing page that holds all the G-Sync and FreeSync monitors they currently offer, which will give you a quick idea regarding pricing. There’s also a list of FreeSync monitors here and one for G-Sync here.

Finally, as we conclude this holy trinity of supposedly must-have display technologies, I sense we’re flirting with what’s known as the ‘Paradox of Choice’ in the monitor market. That’s the observation, in the context of consumer choice, that having too many options can actually make us unhappy. If you’re not routinely exposed to these technologies, it’s certainly not easy to prioritise this stuff. One day, IPS, high refresh and adaptive sync may all be pretty much standard. Until then, how are you supposed to choose between them given that few displays combine all three?

There is no easy answer. For whatever it’s worth, my own personal order of priority would be IPS, high refresh and then frame syncing. But you may not agree. And that’s why, if you can possibly find a way of trying before buying, either through friends or at your local shops, you should make it happen.


  1. Jason Lefkowitz says:

    I appreciate these in-depth hardware posts, but I kind of wish they would go the Wirecutter route at the end and just say “so — if you want a monitor that has all these features, buy this one.” With a link to where I can buy it at Amazon or Newegg or wherever.

    It would help save a ton of time digging through feature lists to make sure my new monitor has all the RPS-endorsed features, some of which can be buried pretty deep in the spec list; I trust RPS to not recommend hardware that sucks, so I have no problem just jumping to buying the monitor you tell me to buy; and it could earn RPS a little extra revenue from sales routed through vendors that have affiliate programs (*cough* Amazon *cough*)

    If there’s no one clear winner, or you’re worried about appearing to push people towards one manufacturer too often, just list two or three models with brief statements of the pros/cons of each. Then revisit the post every six months or so and update the recommendations as needed.

    I’ve found a lot of solid products via the Wirecutter; I’d love to find more via RPS.

    • Ieolus says:

      That would be impossible, at least for this article. No IPS, high refresh, adaptive sync (choose your flavor) monitors exist on the market yet.

      • Jinoru says:

        Since they considered IPS to be the priority, giving a link to at least the best ones in that category and/or of the others might have been nice actually.

        • Ieolus says:

          But there are hundreds of decent IPS monitors… it’s the other factors that are limiting.

      • tetracycloide says:

        But one is set to come out next month… Interesting timing that.

    • trjp says:

      You realise RPS is read outside the USA – indeed BASED outside the USA – yes?

      • derbefrier says:

        and why is that relevant?

        • Voice of Majority says:

          If the shipping is not there, then such links would be counter productive.

          • TacticalNuclearPenguin says:

            I actually don’t see the point either, if you have a newegg USA link in dollars you still know two things:

            1) ballpark pricing
            2) The actual name of the thing you’re supposed to buy, which you can then try to find in your UK store.

            If there’s a problem in all of this it’s the price of such G-synced 1440p IPS thingies: for 700 pounds you might actually just add an extra 100-200 and get a professional-level Eizo that looks 10 times better, has perfect hardware calibration in various color gamuts ( useful even for gaming, especially the dully colored ones ), makes the best Apple monitor look like a cheap toy with crappy performance and has a five year warranty.

            Input lag is fine, ghosting is fine, gaming performance is perfect for anyone who isn’t addicted to high refreshes ( which i can understand, if that’s your thing ) and at least you’re sure about two things:

            1) you’re looking at the best LCD image output money can buy
            2) you own a finely crafted and tested piece of equipment that was built to perfection rather than market hype.

          • john_silence says:

            I’m right with you there, TacticalNuclearPenguin. I wish Eizo would include more of their professional tech inside the Foris line – an adaptive-sync Eizo with semi-pro specs, now that would be something.

          • Continuity says:

            “I actually don’t see the point either, if you have a newegg USA link in dollars you still know two things:

            1) ballpark pricing
            2) The actual name of the thing you’re supposed to buy, which you can then try to find in your UK store.”

            Given it’s a UK site with mostly UK based writers, I think it would be more appropriate to have a link to a UK store, then you USA peasants can “try and find” the equivalent in your US stores.

          • Asurmen says:

            So you’re saying spend more money to buy a monitor that does none of the things you’re buying the cheaper monitor for? How is that useful advice?

          • TacticalNuclearPenguin says:

            It’s a personal preference, i’m just saying that we’re talking about extreme price points and that for a little more you’re buying state of the art stuff and not something random from Acer.

            Sure, you’re right, i’m talking about different qualities, but then i too thought i wanted a 1440p G-sync IPS, but then i stopped right there as i realized that i’d still be plagued by random color accuracy, a non uniform panel across all angles, approximate gamma correction and so on, and since i care about those things i couldn’t justify the price tag.

            I’m not saying such type of monitor shouldn’t focus on these things, after all it’s a long awaited thing that will please a lot of people; what i’m merely giving here is a different perspective. You might be unsure about what you REALLY want in a monitor, as i was before, and i’d like to raise awareness about certain products that i found myself in love with despite them scaring me away before due to their atrocious price.

          • Asurmen says:

            That’s fine, and thanks for explaining, it’s just your previous post made it seem like they were flat out better.

          • TacticalNuclearPenguin says:

            Well, for anything that is not pure gaming they are, and by a good margin. My recommendation for someone that is willing to drop so much cash is to at least give a good try at different options, because you might as well end up like me and find that after all it wasn’t just gaming features that you were after.

            But sure, if you genuinely need such gaming features paired with a high resolution IPS there’s indeed little room for discussion for the time being.

    • 9of9 says:

      Keep an eye on these guys: link to hexus.net Should be out in March – IPS, FreeSync and G-Sync in one package, 1440p and 144Hz. Will probably cost an arm and a leg, but you seem to get the whole package in there.

      • drussard says:

        Two different monitors there and unfortunately, the freesync monitor will be a TN panel.. so no dice for me. The Gsync IPS is looking really good.. only I don’t run NVidia so more waiting to come.

      • Continuity says:

        As nice as the Gsync IPS sounds, it is nearly £700 so kinda hard to justify the cost for most people I think.

  2. Wisq says:

    There’s two things keeping me from getting a G-sync monitor: One, lack of IPS; and two, lack of inputs other than DisplayPort. You’ve covered the former, but the latter is frustrating, and I’m not sure if it’s a permanent future issue or just a temporary trend due to the rush to get these monitors to market.

    It seems like the LCD makers consider multiple inputs to be purely a backwards-compatibility thing (“get them using our monitor no matter what kind of connector they have”), while I’ve been using them as a sort of video-switcher between multiple systems.

    Personally, I’ve got my gaming system on the DisplayPort (being the most modern and capable interface), my laptop on HDMI (don’t care about “only” 30fps at 1440p), my Playstation 3 on DVI, and my old home Linux server on the VGA DSUB. It’s great to switch between them without having to replug stuff or buy an expensive and hard-to-find KVM / switchbox.

    My frustration at being limited to a single DisplayPort is exactly why I bought my current monitor (back in December) to replace my old Apple Cinema Display. I had been holding out for G-sync, but the deciding factor was when I saw how hard/impossible it would be to get 1440p, IPS, and G-sync all in the same package, and adding multiple inputs to that made it obviously impossible, so I gave up and got myself what amounted to just a better 1440p IPS 60Hz because whatever.

    • Wisq says:

      Oh, and before people try to inform me: Yes, I know that G-sync specifically requires DisplayPort. What I want is not G-sync on everything, just G-sync on the DisplayPort and normal rendering on the other ports.

      (Of course, if they did that, I’m sure many people would fail to read the manual, plug into other ports, not get G-sync, and flip out when they discover it’s not working. So, I dunno. Monitors for advanced users, please?)

      • Marley says:

        So just to clarify here, you already have a monitor which all your other stuff is hooked up to? Why not just keep using that and have a G-sync monitor hooked up to your PC by itself? Also my G-sync monitor has an HDMI input (obviously not G-sync compatible) on top of the DisplayPort, so I’m assuming it would be fine to hook other things up to it.

        • TacticalNuclearPenguin says:

          That might work, but maybe he/she wants to upgrade from a worse monitor that not only happens to have a slower refresh and no g-sync capability, but also a lower image quality.

          In that case wanting just one better monitor to pair with both your PC and consoles would be a good idea, and since i’m a good sport i’ll leave hypothetical real estate issues out of the conversation.

        • jrodman says:

          That seems better if you have a big desk.

    • phuzz says:

      There is a BenQ monitor (the XL2420G) with G-Sync and multiple inputs, but of course that one is TN not IPS.
      Personally I’m glad. I don’t have the cash to buy a new monitor now, but I hopefully will in six months to a year. Hopefully by then a monitor with all the right specs will have come along and I’ll have a clear idea of what to get.
      In the meantime the answer is clear, don’t upgrade your monitor yet.

  3. Tacroy says:

    How do Free / G-sync work with games in windowed mode? Do they at all? I spend most of my time in a window, so shelling out for something that’ll almost never be on sounds like a losing proposition.

    • PoulWrist says:

      Very fair point! I have not seen this discussed at any point in such an article and it seems quite important as the borderless window option is one of the best features in any game.

    • FlipMooMonkey says:

      From what I remember G-Sync does only work in full screen. Presumably this would be the same for Free-sync too.

      • Tacroy says:

        Yeah, then in that case the *Syncs are not relevant to my interests.

    • Mungrul says:

      I can confirm that windowed games can’t take advantage of G-Sync.
      I recently bought an ASUS ROG PG278Q. My preferred method of running games on my old Dell is full-screen windowed if the game offers it, as alt-tabbing is seamless.
      G-Sync requires full-screen in order to take effect. If you run windowed full-screen, you might as well go back to your old monitor.

      As it happens, I returned the ASUS four days after buying it, as while the size, resolution and G-Sync were absolutely lovely, it was a TN panel, and I didn’t realise just how bad those things are. Colour accuracy was terrible, viewing angles were worse, and to top it all off my eyes were sore after using it.
      Seriously, don’t discount the viewing angle thing. You may think “Oh, that won’t affect me as I sit close and in a central position”, but it really will be noticeable.

      My seven year old Dell 2407WFP looks a LOT better picture-quality wise than the ASUS, and the fact that it’s so old, as well as only costing me £400 as opposed to the ROG’s £630 price, resulted in the swift return of the ASUS.

      Avoid the ASUS ROG PG278Q at all costs. It’s over-priced junk that is bad for your eyes.

    • OmNomNom says:

      G sync will not work windowed

    • Caerphoto says:

      The replies here do make me wonder why Windows itself doesn’t run in free/g-sync mode, since it’s hardware accelerated anyway.

  4. edwardh says:

    Personally, I’ll stick with my cPVA panel. Because while I don’t play many action-driven games where things like screen tearing are a really big pain in the ass, I do appreciate a contrast ratio of ~1:3000 and a black point of 0.02 cd/m². Mmmm…. black. Almost, at least.
    Until OLED, I doubt that I’ll ever use anything else.

    • TacticalNuclearPenguin says:

      Yep, strong supporter of super high native contrast ratios here, and of course only the native kind, not the stupid kind that plagues HDTVs.

      I eventually moved away from VA due to their other problems, but i’ll give you that there’s nothing comparable for that rather low price. While the interwebz seem to be fixated with IPS, i really don’t think any random one can be considered an upgrade.

      But then, some even delude themselves that IPS glow doesn’t exist anymore and that polarizers are no longer needed.

  5. Veles says:

    I think nVidia are missing a bit of a trick. Sell their G-sync hardware to monitor manufacturers for dead cheap to get it in as many products as possible thereby dominating the market. If G-sync monitors were only fractionally more expensive than a normal one, I think that might sway a few more people away from getting an AMD card.

    • remon says:

      Freesync is already based on the free displayport standard. Why would any monitor maker buy the proprietary one? And besides, getting locked in a vendor is only good for Nvidia in this case, not for the monitor makers.

      • TacticalNuclearPenguin says:

        AMD in this case is locked as well, and before any decent comparisons it would be rather a stretch to call both technologies equivalent.

        I’ll side with whichever brand fits me better, but i have no delusions that there might be differences between both approaches. In this particular scenario Nvidia seems the most hardware based solution, so my personal preferences without further evidence from the competition goes there.

        • Asurmen says:

          How are AMD locked in?

          • TacticalNuclearPenguin says:

            For now, whatever the reason, the only certain thing is that you have to choose one; you can’t get both.

          • remon says:

            That is Nvidia’s problem though, there is a VESA standard out.

          • TacticalNuclearPenguin says:

            Yeah, but let’s not rush too quickly to conclusions, the VESA standard is open but AMD’s implementation is still their own.

            Maybe Nvidia might license it for their uses too but they spent some R&D and money on their G-sync well before DisplayPort was mature enough for other solutions, so i’m not sure i can blame them too much.

            Much in the same way AMD claimed that Nvidia could use Mantle as well if they wanted, but at this point it seems to me that they’re just riding the wave of popular discontent for Nvidia’s “lockdowns”, and that doesn’t really help with perspective.

        • remon says:

          As Asurmen said, how is AMD locked in?

          The monitors don’t have to support Freesync, they have to support Adaptive Sync, which is supported by AMD in the form of FreeSync. If Nvidia adapted their G-sync to Adaptive Sync and called it for example A-sync there would be no need for extra boards and any monitor that supported Adaptive Sync would support Nvidias and AMDs version of it.

          • TacticalNuclearPenguin says:

            AMD still has their own implementation because that VESA standard is absolutely not born with gaming needs in mind.

            Nvidia spent some decent money before any open standard was available; the only way i see them adapting to something else ( and they might ) would be for FreeSync to gain huge traction and G-sync to flop.

            Either way, with the partial knowledge i have on the matter, AMD’s implementation is partly software based, which means that before some proper comparisons are made i’m not particularly inclined to believe both technologies are going to be equal.

            And i’m no fanboy either, let’s make this clear. Actually, Nvidia this time around might force my hand to swap to AMD for the next generations of cards if all the rumors are to be believed, and i hope they aren’t.

          • Asurmen says:

            They do have their own implementation, but of an open standard. AMD don’t lock you in, Nvidia do.

      • Veles says:

        Monitor makers are already buying into the g-sync standard – because people want g-sync. By offering g-sync cheaper, monitor manufacturers can offer it in lower end products.

        History has shown that expensive hardware such as physics processors always flop because no-one is willing to pay the money – therefore manufacturers/developers are not willing to utilise it due to a low userbase.

        This is a great loss leader opportunity for nVidia – free sync isn’t really as mature yet so this is their chance to break the competition before the race has properly started.

        • Asurmen says:

          They’ve got a handful of months to do that. It isn’t going to happen. The only thing that will keep them ahead now is Freesync not being as good as G-sync.

  6. Kyrsa says:

    Doesn’t enabling vsync with triple buffering solve this issue without needing hardware? I guess you do get slight input lag and you lose a small amount of video memory because you need another frame buffer… but is it really noticeable? Can’t say I’ve ever noticed it.

    • Ieolus says:

      Isn’t v-sync stuck at 60Hz? So anyone with high-refresh monitors plus GPUs that can maintain high FPS would gain nothing, no?

      • Kyrsa says:

        VSync occurs at your monitor’s refresh rate, so if you have a 120Hz monitor you get 120 syncs per second… So, if you have a fast enough GPU, still 120 frames / sec.

    • OmNomNom says:

      Unfortunately, depending on the implementation triple buffering input lag can be very noticeable. Also triple buffering still tends to induce stutter. This mostly or completely disappears with gsync.

      Of course gsync isn’t as valuable if your games already run consistently at your monitor refresh anyway

  7. tehfish says:

    *sigh* Nvidia trying to lock people into proprietary tech again.

    This is the core reason i no longer buy Nvidia products, even if they might be better.
    They’ve already set PC physics back years with their deliberate hobbling of CPU physx, now they’re doing this…

    • phuzz says:

      At least with PhysX it wasn’t really noticeable, unless you had played on a machine with nVidia and moved elsewhere. GSync means you have to spend extra on a monitor, and then given that a monitor will generally last for several GPU upgrades, you’re locked in for a while.
      Of course, if a manufacturer sold a monitor with GSync that was also compatible with Freesync (and I can’t see any technical reasons why not) that might be a different proposition. I’d be prepared to spend a bit more on a monitor if I knew it would be compatible with whichever type of graphics card i bought in the future.

  8. fish99 says:

    Since I got a 120Hz screen I’ve noticed I can have v-sync on and not get a framerate hit. In theory the average wait when there’s no backbuffer available to render to should be reduced from 8ms to 4ms, so the framerate hit should be smaller but still measurable, so I don’t entirely understand what’s going on there. The games can’t all have triple buffering, and I’ve noticed the same thing across all games.

    Either way though the bottom line is I don’t need g-sync/free sync/whatever.

  9. vorador says:

    Indeed, when FreeSync hits the market we will talk. G-Sync is pricey due to the manufacturer having to purchase controller boards from nVidia, but it is proven to work. FreeSync supposedly can do the same, is royalty-free and is going to be integrated into VESA standards for DisplayPort, but the proof is in the pudding, and the dessert is coming soon but not yet on the table.

    The reason why FreeSync isn’t supported by nVidia is because they don’t want to. So depending on your VGA choices, you will have to choose one or the other.

    The question is: can a monitor have both? I think so, because since FreeSync is riding on a function of the VESA standard, it shouldn’t be impossible. It all depends on nVidia not being a dick and blocking it from working on a G-Sync controller.

  10. DanMan says:

    A bit superficial on the tech part, but I guess it’ll have to do for RPS.

  11. aircool says:

    I’m due an update of both video card and monitor sometime within the next six months, and then there’s that oculus thing. It’s a good thing that I get lots of use from my PC…

  12. Rise / Run says:

    So I am gaming on a 15 year old 24″ LCD (1920×1200, $150 from Dell at the time, thank you very much), and while it’s not as vivid as some more modern screens, it still outperforms most everything I see available in shops.

    Oddly, I’ve never ever witnessed ‘screen tearing’. I understand conceptually what you are describing, but haven’t seen it. Odd, that.

    • JamesTheNumberless says:

      Crikey, I thought my 30” dell was getting on a bit, at the ripe old age of 8. You had a 24” LCD screen 3 years before most people even had a flat-screen at all and you paid about a 10th of what I paid for mine! :|

    • iyokus says:

      A quick Google says a 15″ LCD monitor cost about $500 in 2001. So, what was your secret?

      • Mr Coot says:

        There is perhaps a ‘0’ missing from the end of the $150 pricetag. o.O

    • tehfish says:

      15years old? Really? What model?

      Anyhow, I also game on a 24″ 1920×1200 Dell monitor (Dell WFP2408*). I can say screen tearing is something you’ll encounter daily if you do not enable v-sync – unless you’re somehow running games that never deviate far from 60fps…

      *the 08 on the end stands for the release year, there was also a 2407 model

      • JamesTheNumberless says:

        Cool. Mine is a 3007 – I never actually caught on to the fact that it has its year in its name! It’s still a nice monitor even though it’s beginning to fade a bit. While other sizes (even 27”) seem to have come down a lot in price, 30” monitors are still expensive so I’m very glad mine’s lasted this long.

        The only real issue with it is the lack of HDMI ports, meaning I can’t plug in a console. It does get extremely hot though and in the night it makes cracking noises as it cools down :)

        When it dies it’ll probably be more economical to go for dual 24” screens but I will really miss having one giant screen when playing games.

      • drewski says:

        I game on a notebook so I must be getting screentear out the wazoo but I never really notice it. Much rather have a more responsive framerate with tearing.

        But then I’ve probably not gamed on a genuinely capable PC for about 5 years, so maybe I’ve just forgotten what high framerate low tear looks like…

    • Arren says:

      ::long sound of hot air escaping::

  13. Super Rostropovich 64 says:

    God damn it, RPS! Stop telling me why I need beautiful and expensive things!

    • frightlever says:

      I’ve read the three articles and I literally have no idea what I want now. Well, obviously I want a multi-sync, IPS 120Hz monitor but they don’t exist so I can just put off the decision for another long while.

      • Gryz says:

        Acer XB270HU. 27″ IPS 2560×1440 with G-Sync and ULMB.

        TFT Central (https://twitter.com/tftcentral) got their review unit yesterday. Review is expected in 1-2 weeks. Available in March (let’s hope early March). Expected price: 700 euros.

        • TacticalNuclearPenguin says:

          I was led to believe that 700 would be in Pounds, rather than Euro. Are you sure about that?

          If that’s true Asus will really have a lot of tears to cry.

          • TacticalNuclearPenguin says:

            Also I’m not sure I like this idea of only offering DisplayPort. Sure, you need that for G-Sync, but it leaves you with poor options for other stuff, like consoles.

            Then again, they’d probably need a lot of extra electronics (and price) to have it work like a “regular” monitor and support fixed refreshes.

  14. Person of Interest says:

    “What’s more, the beauty of this kind of frame syncing is that it doesn’t put additional load on your graphics subsystem. Quite the opposite. It makes the absolute most of whatever rendering power you have on offer. If your video card can pump out 40 frames per second in a given game, that’s exactly how many frames you’ll see and without any tearing. Your monitor will run at 40Hz.”

    I think this is poorly worded. It absolutely puts additional load on your entire system when you switch from a V-sync 60Hz screen to a frame-sync screen, if you are playing a game that your system is only capable of running at 40 frames per second. That’s because the game would only render at 30 frames per second on the V-sync screen, so the frame-sync screen running at 40 frames per second will likely cause your graphics subsystem to increase from 75% utilization to 100% utilization.

    Unless you run games with V-sync off, you should expect higher power consumption, heat production, and fan noise from your system when you switch from a standard-sync monitor to frame-sync monitor.
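    The arithmetic above can be sketched under the assumption of classic double-buffered v-sync, where a displayed frame occupies a whole number of refresh intervals – a simplified model, not any real driver’s behaviour:

```python
import math

# Simplified double-buffered v-sync model: a 25ms render (a 40fps-capable
# system) snaps up to 2 refresh intervals (33.3ms) on a 60Hz panel,
# i.e. 30fps with the GPU idle a quarter of the time. With adaptive sync
# the same system would run 40fps at 100% utilisation.
def vsync_stats(render_ms, refresh_hz):
    interval = 1000.0 / refresh_hz
    slots = math.ceil(render_ms / interval)       # refresh intervals per frame
    fps = 1000.0 / (slots * interval)             # effective frame rate
    utilisation = render_ms / (slots * interval)  # fraction of time GPU is busy
    return fps, utilisation

fps, util = vsync_stats(25.0, 60.0)
print(fps, util)  # ~30 fps at ~75% utilisation
```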

    • tehfish says:

      “That’s because the game would only render at 30 frames per second on the V-sync screen, so the frame-sync screen running at 40 frames per second will likely cause your graphics subsystem to increase from 75% utilization to 100% utilization.”

      That’s true, it’d cause more utilisation, but unless you were on a severely overloaded PC thermally, that wouldn’t be a bad thing. And even if you were, it’d be a cooling issue, not a game issue…

      • tehfish says:

        I see complaints about games ‘overheating’ computers quite often…

        Regardless of how badly a game is coded, a PC should be able to handle both 100% CPU and GPU load for extended periods. If it cannot, that is a hardware issue, not software.

        That a game might load a CPU/GPU to 100% when it doesn’t need to is a software issue, but that doesn’t remove the requirement that the hardware be able to handle it…

        • jrodman says:

          Well, it *should*, but this is consumer hardware. It doesn’t always do so.

    • SuicideKing says:

      What? The original article is fine, for all intents and purposes. What you’ve written will confuse people.

      • Person of Interest says:

        I admit what I wrote may be confusing. I blame the disabled Edit Comment feature, as I usually need to revise my posts for clarity (this one will be no different). But I’ve tried, and have been unable to think of how “mak[ing] the absolute most of whatever rendering power you have on offer”, “doesn’t put additional load on your graphics subsystem. Quite the opposite”. I don’t think Mr. Laird meant it that way, but that’s how I interpreted it.

    • OmNomNom says:

      Except your PC still tries to generate additional frames but they never end up being used

      • Person of Interest says:

        That’s simply not true. With V-sync enabled, the graphics card does not start drawing frame 2 until frame 1 is being sent to the monitor. It then waits for the next sync to send frame 2 and start drawing frame 3.

        With V-sync disabled, frame 2 interrupts partway through frame 1’s delivery, the rest of frame 1 is discarded, and work on frame 3 starts immediately.

        In no situation should the PC be drawing frames that never get used.
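        The behaviour described above can be mocked up as a toy timeline (an idealised model, not any real driver’s scheduling): rendering of frame N+1 starts only once frame N is handed off, so every rendered frame eventually reaches the screen:

```python
import math

# Toy timeline of double-buffered v-sync: the GPU draws a frame, waits for
# the next refresh boundary to scan it out, and only then starts the next
# frame – so no frame is ever rendered and then thrown away.
def vsync_display_times(render_ms, refresh_hz, n_frames):
    interval = 1000.0 / refresh_hz
    t, shown = 0.0, []
    for _ in range(n_frames):
        t += render_ms                          # GPU draws the frame
        t = math.ceil(t / interval) * interval  # wait for the next refresh
        shown.append(t)                         # frame scanned out here
    return shown

times = vsync_display_times(25.0, 60.0, 4)
# successive deltas are ~33.3ms: a 40fps-capable GPU displays at 30fps
```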

        • TacticalNuclearPenguin says:

          One thing is for sure: V-sync absolutely helps during summer, especially if you have a lot of horsepower paired with a decent TDP, which lets your card work at half its maximum or less.

          Well, not if you’re playing AC:U that is.

        • OmNomNom says:

          I stand corrected

        • njursten says:

          Now, I’m not 100% sure, but don’t you normally start rendering the next frame to a secondary buffer? Why would you wait? With some luck the next frame won’t be delayed by a full refresh, provided you can render at 45 FPS at least.
          I guess one point of waiting would be to decrease input lag. But input lag is often quite horrible with v-sync, isn’t it?

  15. john_silence says:

    I have my eye on the upcoming LG 29UM67, one of those ultra-wide, 2560×1080 29-inchers. It ticks all of the boxes (ish): IPS, FreeSync, 75Hz. It’s supposed to retail for 319 euros, which sounds pretty aggressive. If it comes in white, as the track record for its 29UM65 predecessor suggests it may, it will be hard to resist.
    Sure, ultra-wide can look odd. I’ve never seen one in real life. Sometimes it seems great, and many enthuse about it; other times it looks ridiculously elongated and plain weird.
    Sure, 75 Hz isn’t huge – but with the added pixel count due to an increased width, not many games will sail far past the mark on a single GPU at max settings anyway.
    Really the main problem is that, for the time being, you are effectively tying your adaptive-sync capability to either Nvidia or AMD, with no way out.
    For instance, I have an oversized R9 290 housed inside a Mini-ITX case. Even with additional fans, I have to open the side panel when I launch Battlefield 4 so the CPU doesn’t reach 80°… The smart thing to do would be to replace it, come the next generation of GPUs, with a successor to the current GTX 970 “pocket rockets” that are literally half the size of my R9 290 (!). But the existence of such a card among the crop of next-generation Nvidia GPUs is pure speculation. And AMD may well choose to reduce die size, noise and temperatures for the 20nm 390 cards – there are insistent rumours of factory-standard liquid cooling and such. Having to side with either company is an unappealing prospect…
    FreeSync does seem like a better bet, since it is an open standard and less expensive (and could therefore catch up with G-Sync and gain traction more quickly), and from the off it will come in all of the fashionable flavours, albeit arguably never in the right combination (24″ to 34″, FHD to 4K, TN as well as IPS, 60Hz to 144Hz…).
    Nvidia would help everyone by accepting the inevitable and supporting an open standard, but it is a given that they will wait a while before they do – there’s perverse corporate pride for you :D

    • Love Albatross says:

      I used a 29-inch 21:9 display for a while, and my advice would be this: wait until there’s a 34-inch version. I love the 21:9 format, it’s really nice for games, but 29-inch displays feel weirdly restrictive in the vertical. 34-inch is much nicer.

      • john_silence says:

        Thanks for the input, Love Albatross. Pretty much confirms my impression. LG’s 34UM95 is supposed to be great, actually – better than the 65’s. Maybe they’ll release a 97 version down the line.

  16. dsch says:

    ‘Epistemological’ is not a fancy way of saying ‘philosophical’.

  17. KDR_11k says:

    Great, just after I decided to forego g-sync and ordered a monitor…

    • Mr Coot says:

      Don’t stress. G-Sync/FreeSync will probably be the Betamax/VHS issue of our decade. Personally I am waiting till DisplayPort 1.3 (which was released in September 2014) becomes prevalent, and will buy a monitor with sync technology based on that.

  18. SuicideKing says:

    Nvidia’s recently announced “mobile G-Sync” which is likely just using adaptive sync over eDP, like AMD’s original FreeSync demo. May be a sign of things to come.

    My order of preference would be IPS, frame sync and then high refresh, simply because high refresh is fairly useless without a beefy GPU (relative to screen resolution), which is exactly what frame sync aims to provide a solution for.

  19. Caerphoto says:

    I have to wonder, why were LCD monitors even designed with a fixed refresh in the first place? It’s not like a CRT that needs to constantly draw pixels or they disappear – seems to me like the default behaviour of LCDs should have been FreeSync-style right from the start.

  20. drewski says:

    Well I’ve gone from being thoroughly confused to still thoroughly confused to evermore thoroughly confused.

    Apart from the knowledge that I apparently need to spend the cost of a cheap car on my next monitor, I have no idea what’s good, what isn’t, what to prioritise, what looks better, what plays better, or, well, anything really.

    And I thought motherboards were pains in the ass to figure out when buying.

  21. one2fwee says:

    What we also need are 16:10 4K monitors: 3840×2400.

    You may ask why – well, for a start, they would allow native integer scaling of lots of 4:3 resolutions.
    Both 640×480 and 800×600 would go into it natively with simple pixel multiplying. This would mean you wouldn’t get horrible scaling fuzz and mess on things like SD console games or simply older PC games.

    Of course, there’s still the problem of 1024×768 and 1280×960, but most games that offer those resolutions I imagine also support 1600×1200, which would also be fine.
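    The integer-scaling claim is easy to check with a hypothetical helper that computes the largest whole-number multiple of each classic 4:3 mode fitting inside a 3840×2400 panel (any remainder becomes black borders):

```python
# Largest integer scale factor for each classic 4:3 mode on a 3840x2400 panel.
# 640x480 and 800x600 both land on 3200x2400 (full height, pillarboxed);
# 1024x768 and 1280x960 leave borders on both axes, as noted above.
def max_integer_scale(src_w, src_h, dst_w=3840, dst_h=2400):
    return min(dst_w // src_w, dst_h // src_h)

for w, h in [(640, 480), (800, 600), (1024, 768), (1280, 960), (1600, 1200)]:
    s = max_integer_scale(w, h)
    print(f"{w}x{h}: {s}x -> {w * s}x{h * s}")
```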

    Also, a cheap DIY video transcoder and scaler project for old consoles would be nice – something you could feed RGB SCART, component etc. and have it convert to something you can actually use, with minimal lag. Unfortunately even VGA is being removed from monitors now, so you can’t just get a simple transcoder. Supersucks!
    And all the commercial solutions are super expensive or downright awful.
    All I want is simple pixel doubling damnits – no fancy scaling at all, except linear scaling horizontally for pixel aspect correction. Maybe some scanlines depending on preference.
    But none of those things are that hard, surely?!