G-Sync vs. FreeSync: Which Dynamic Refresh Is Best?

The best things in life aren't free

It feels like whole months since there was a good old-fashioned fisticuffs between AMD and Nvidia. They do so love a PR punch-up. But this one’s a bit different. Nvidia’s G-Sync technology versus AMD’s FreeSync isn’t the usual trench warfare over fractions of a frame per second. It’s much more interesting than that. It’s all about something called dynamic or adaptive refresh and how it can make games run much more smoothly, even at modest frame rates, without necessarily upgrading your video card. G-Sync has been available for a while, but now that the first FreeSync panels are out, battle can commence…

I’ve covered the basics of adaptive refresh before, so I won’t retread that ground other than to point out that, at the very least, you’ll need a new monitor. Instead, I’ll send you over here for a refresher and dive straight into my first impressions.

For context, the systems I played with were running pretty similar panels. The G-Sync screen was an Asus ROG monitor with a 27-inch 2,560 by 1,440 TN panel; the FreeSync effort, made by Acer, matched all those metrics. Obviously, we’re talking AMD graphics in one rig, Nvidia in the other. For now, FreeSync is AMD-only and G-Sync is restricted to Nvidia graphics cards. It was ever thus.

Both setups were running games and tech demos supplied by Nvidia and AMD. They’ve cooked up demos to help pinpoint the benefits of dynamic refresh and, handily, you can run each of them on both platforms, which helps exclude any 3D dodginess.

Anyway, I’ll dispense with the suspense right away by saying that, as things stand, G-Sync is very clearly the better technology by pretty much every metric save for cost. Firstly, it’s just smoother, especially at lower frame rates.

This isn’t a night-and-day difference. Arguably, the delta is so small that you wouldn’t notice it unless you were consciously looking out for it. But FreeSync just isn’t as flawlessly smooth at higher frame rates as G-Sync, nor is it as consistent at lower frame rates. It’s worth remembering that as the frame rate drops, there’s a limit to the smoothness that can be achieved with either of these syncing techs. A perfectly synced 20 frames per second is not going to be buttery smooth. But G-Sync still makes a better fist of it, subjectively at least.

Nor is FreeSync as robust. Unfortunately, dynamic refresh is one of those technologies that can have you staring at the screen, scratching your head and wondering whether it’s even working. Is the stuttering you’re seeing because the feature isn’t enabled or because it’s just a bit crap, for instance?

However, when you see screen tearing, you know that it can’t actually be functioning. In some scenarios, that’s exactly what I saw with FreeSync. It didn’t always work in-game when it was switched on. G-Sync always did, as far as I could tell.

Oh god, it’s ghosting…

The final black mark next to FreeSync’s name involves ghosting. I’m actually a little reluctant to call it that, since I’m not sure the problem is the same as conventional LCD monitor ghosting. But it looks pretty similar and gets the idea across.

Essentially, with FreeSync enabled, a shadowy ‘ghost’ version of a moving object can be seen trailing just behind it. Much depends on the speed of movement and the colours of both the object and the background. But as it happens, it’s particularly apparent with AMD’s FreeSync demo involving a 3D-rendered wind turbine. The ghosting that appears behind the blades with FreeSync enabled is as obvious as it is ugly.

Critically, it’s not there with FreeSync off, regardless of any other settings including refresh rates or V-sync. So it’s not an inherent panel problem. You can also run the same demo on the Asus G-Sync panel with dynamic refresh enabled and clearly see that Nvidia’s solution doesn’t have the same problem.

Moreover, this seemingly response-related problem means the FreeSync panel tends to look that little bit blurrier with FreeSync enabled. Not good. The ghosting issue has been widely reported (there’s a video here showing it quite clearly on a BenQ FreeSync screen), to the extent that I’m fairly confident it’s a general FreeSync issue and not specific to the Acer monitor I saw.

So, what’s going on? Certainly, it’s perilous to take the word of either protagonist on this kind of subject. Agendas are legion.

But based on what I’ve seen and what’s been said, this is my take. Pretty much any modern monitor has image processing electronics. Monitors don’t merely take the signal from your graphics card and chuck it onto the LCD panel untouched.

Instead, they process the signal in a variety of different ways. One of those is aimed at reducing blur and persistence. In other words, improving pixel response. A typical strategy is something known as pixel overdrive, which, among other things, involves jolting pixels with increased voltage in order to ramp up response rates.
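
To make the overdrive idea concrete, here’s a minimal sketch of the principle. The gain value and clamping are invented for illustration; real scalers use per-panel lookup tables tuned at the factory, which is exactly the sort of tuning that appears to go missing here:

```python
def overdriven_level(current, target, gain=0.35):
    """Toy model of pixel overdrive: briefly drive the pixel past its target
    value so the liquid crystal settles faster. Real monitors use per-panel
    lookup tables rather than a single gain constant like this."""
    drive = target + gain * (target - current)  # overshoot in the direction of change
    return max(0, min(255, round(drive)))       # clamp to the 8-bit range

# A big dark-to-bright jump gets driven hard; a small step barely at all:
print(overdriven_level(40, 200))   # 255 (clamped from 256)
print(overdriven_level(180, 200))  # 207
```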

Anyway, the precise details aren’t all that critical. What matters is that you’re going to notice a negative impact on response, and possibly other areas of image quality, if the image processing is either not working properly or disabled entirely. Modern monitors look better not just because the panels themselves have improved but also thanks to image processing. Lose the latter and you’ve got a problem.

Nvidia’s G-Sync monitor chipset makes for a more expensive adaptive refresh solution. But for now it also makes for the best…

I suspect this is what’s happening currently with AMD’s FreeSync tech. It may well be that technologies like overdrive are only properly supported when the monitor is running within the range of fixed refresh rates supported by its firmware. Turn on dynamic refresh and the image processing is compromised or perhaps bypassed entirely.

Nvidia’s G-Sync tech, by contrast, is essentially a wholesale replacement for the monitor’s image processing chipset and firmware and allows technologies like overdrive to work properly when G-Sync is enabled. This is what Nvidia means when it talks about G-Sync being ‘tuned’ for each panel.

Again, the above is my take on the causes of the ghosting problem. It makes sense, but I’m not absolutely certain it’s precisely what’s happening. If it is, it’s possible to imagine future monitors with fuller firmware support for FreeSync that allows image processing to function correctly with dynamic refresh enabled.

Either way, I’m not entirely clear on the prospects for fixing the ghosting problem on these early FreeSync panels. As they currently stand, they’re impossible to recommend. At best, I’d characterise FreeSync in its existing state as a fun little extra I’d be happy to have for free. But I wouldn’t want to pay a premium or have it dictate my choice of monitor.

G-Sync, on the other hand, is a trickier question. Having seen the two technologies side by side, I’ve a clearer idea of why it’s undoubtedly the better choice right now. Whether I think it’s worth the price premium is another matter.

On balance, I’d probably say not. I’d be happy merely with an affordable high-refresh panel. Then again, high-refresh panels that lack dynamic refresh support are themselves usually still quite pricey and certainly limit your options. The choice between 120Hz and IPS, for instance, isn’t one I want to make. I want both.

One day, most monitors will probably be 120Hz-plus and support dynamic refresh and it’ll all work without compromising image quality in other ways. But that’s a few years away yet, at best.

54 Comments

  1. Siimon says:

    Thanks for this article! :)

  2. vorador says:

    Weird.

    Other more technical sites have run comparisons between FreeSync and G-Sync and found barely a difference. Guess it’s in the eye of the beholder, or in this case it’s possible the monitor is a lemon. Who knows.

    • DanMan says:

      Seems reasonable. My take is that since Nvidia chooses the hardware they need to have G-Sync working, you can expect it to work well. In the case of FreeSync it’s up to the monitor’s manufacturer which hardware they build into the thing, so there can be quality differences.

      • freneticponies says:

        Pretty much. FreeSync is qualitatively and technically better in every way, if you get the right monitor. If you’re the early-adopter type looking at this stuff, I don’t see why you wouldn’t do that homework anyway.

        • Gryz says:

          FreeSync is better in every way?
          Are you an expert on monitors?

          Check out this YouTube video by PCPer.
          link to youtube.com
          They *measured* stuff. They saw that FreeSync is less smooth at lower framerates than G-Sync. I think very few people understand the details on what’s going on. And even fewer understand how to measure things.

          One thing you should realise: each frame does not automagically appear from the GPU on the monitor. The frame goes over a cable. With limited bandwidth. To be more precise: with DP1.2a, a 2560×1440 frame takes about 5.5 milliseconds to transmit to the monitor.

          Now imagine what happens when the gap between two rendered frames is longer than 33ms (or 25ms on a 40Hz-xHz monitor). Someone has to decide that the last frame has to be shown on the monitor a second time. With FreeSync, the GPU has to make that decision. With G-Sync, the monitor makes that decision. The G-Sync monitor can delay that decision until the current frame has been displayed for 33ms. With FreeSync, the GPU has to make that decision 5.5ms earlier. Once the FreeSync GPU has started sending the same frame again, it can’t react if the next frame is suddenly ready. With FreeSync, at low framerates, the displayed frames will be more out of sync with when they were actually rendered (see the rough numbers below).

          If I, a simple amateur, can come up with an algorithm that makes use of that memory on the G-Sync module to make a smoother experience, don’t you think the professionals at Nvidia can come up with even better stuff?
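
          For what it’s worth, that 5.5ms figure is roughly what a back-of-the-envelope calculation gives. A quick sketch, assuming 24-bit colour and DP1.2’s ~17.28 Gbit/s effective payload rate over four HBR2 lanes (blanking overhead is ignored, which is why it comes out slightly under 5.5ms):

          ```python
          # Rough scan-out time for one 2560x1440 frame over DisplayPort 1.2.
          # Four HBR2 lanes: 21.6 Gbit/s raw, ~17.28 Gbit/s after 8b/10b encoding.
          bits_per_frame = 2560 * 1440 * 24        # ~88.5 Mbit at 24 bits per pixel
          link_rate = 17.28e9                      # effective payload bits per second
          scanout_ms = bits_per_frame / link_rate * 1000
          print(f"{scanout_ms:.1f} ms per frame")  # ~5.1 ms, close to the quoted 5.5 ms

          # The consequence described above: on a panel with a 40Hz floor (25ms per
          # refresh), the GPU must commit to re-sending the previous frame roughly one
          # scan-out time before that deadline, while a G-Sync module can simply repeat
          # the frame from its own memory at the last moment.
          print(f"GPU must decide ~{scanout_ms:.1f} ms before the 25 ms deadline")
          ```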

          • TacticalNuclearPenguin says:

            That was the entire point of G-Sync at the end of the day: a purpose-built hardware solution with proper monitor/GPU interaction, whereas FreeSync is mostly an adaptation of a VESA standard that was created for power saving, not necessarily for gaming. That feature was just supposed to avoid pointless refreshes when they weren’t needed; it wasn’t made to tackle the stringent requirements of a gaming scenario in which every little refresh correction needs to happen on the spot with no latency, with the monitor holding the sceptre of command.

            AMD can surely polish the thing even further and it might get better; this is not something I’d rule out. Thing is, it may happen but it also may not, as it simply isn’t a dedicated hardware solution, which would make the problem far easier to solve, as G-Sync is demonstrating.

            Then again, I wish them the best and I hope they do, just like I hope they release their bloody 390X quickly, because I’m horribly itching for an upgrade. Should that card prove to be a monster, so be it; otherwise I will wait for its second function to happen: the release of a 980 Ti at half the price of that stupid Titan X, which sounds even tastier as rumours suggest that the shader count won’t be cut, only the RAM, which is still a whopping 6GB anyway.

    • nimbulan says:

      I’ve read quite a number of comparison articles and it’s quite clear that the ghosting issue exists on all FreeSync monitors from all manufacturers. A lot of outlets likely just didn’t notice the problem since they weren’t looking for it, or may be used to ghosting from testing lower-end monitors.

      As for the smoothness issue, there are two parts to it. Firstly, G-Sync not only has a lower refresh rate limit (30Hz vs 40/48Hz for FreeSync) but will continue to function below this limit by doubling/tripling/etc. the refresh rate (roughly sketched below); FreeSync just falls back on vsync on/off below that point. At the top end, G-Sync falls back to vsync and prevents the framerate from going any higher to maintain the smoothest experience, while FreeSync gives the option to turn vsync off for the absolute minimum input lag at the expense of smoothness and screen tearing.

      Now the issues with FreeSync could potentially be solved in the future with driver or firmware updates but at this point it’s hard to say if or when that will happen.
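
      To illustrate the low-framerate behaviour described above, here’s a minimal sketch of the frame-repetition idea. The 30-144Hz window and the multiplier search are illustrative assumptions, not Nvidia’s actual firmware logic:

      ```python
      def scanout_plan(fps, panel_min=30.0, panel_max=144.0):
          """Pick how many times to repeat each rendered frame so the panel's
          effective refresh rate stays inside its supported variable range.
          Purely an illustration of the 'double/triple the refresh' idea."""
          if fps >= panel_min:
              return min(fps, panel_max), 1      # in range: one scan-out per frame
          repeats = 1
          while fps * repeats < panel_min and fps * (repeats + 1) <= panel_max:
              repeats += 1                       # show the same frame again
          return fps * repeats, repeats

      # A 24 fps game on a 30-144Hz panel gets each frame scanned out twice:
      print(scanout_plan(24))   # (48, 2)
      print(scanout_plan(12))   # (36, 3)
      print(scanout_plan(60))   # (60, 1)
      ```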

      • remon says:

        Actually, FreeSync’s lower Hz limit is lower than G-Sync’s. It is supposed to go as low as 9Hz. The problem lies with the panels these monitors use.

        • childofthekorn says:

          ^This

          The theoretical limits of FreeSync are lower, but if the manufacturers don’t meet those specifications it’s not going to work as originally intended. That may be the prime caveat of FreeSync. Do your research before choosing a monitor, folks.

    • aliquis says:

      Yeah, it was a shitty “article”. This is much better, since it says what’s actually going on:
      link to wccftech.com

      In the case of FreeSync, if your frame rate goes below the lowest refresh rate the monitor is capable of, the monitor obviously can’t drop any lower. So it reverts to using that refresh rate, with or without v-sync.

      If you decide to do it without v-sync then obviously you’ll get tearing.

      Nvidia was smarter (?) there and originally went with the highest refresh rate the monitor was capable of, plus V-sync. So it wasn’t synced to the low frame rate, but your actual frame rate was so low anyway… hence you don’t get any tearing.

      What Nvidia has done lately, though, is improve it even further by simply doubling the picture rate: say your card renders at 24 FPS and your monitor can’t go lower than 40Hz, it will just draw the same picture twice for a 48Hz signal, which the monitor will handle just fine. With no tearing.

      So that could be why one got tearing. And I suppose the behaviour of FreeSync could be changed to something similar over time. I think something else seemed clever about FreeSync before (if nothing else, it supports a wider range of refresh rates).

      As for ghosting, I didn’t read this text block all that closely since it seemed to be a subjective comparison of two different monitors anyway and didn’t go into details. But it could of course simply be that the panel used in the Acer monitor isn’t as good as the one in the Asus and has more ghosting, for instance, regardless of the technology used.

      • Cederic says:

        Of course, had you read the text block you would have found that it addresses and negates your suggested explanation for ghosting.

        Spending more for a FreeSync monitor then getting tearing anyway sounds like FreeSync isn’t working. Your explanation is that it’s intentionally disabled at certain refresh rates. My interpretation is that it’s thus broken by design and not worth the investment.

  3. DanMan says:

    /engage smart-ass mode

    It’s actually called Adaptive Sync

    /disengage smart-ass mode

    • Gryz says:

      Adaptive Sync is a feature in a protocol specification. Namely in DisplayPort 1.2a.
      FreeSync is a trademark owned by AMD.
      DPAS is a technology/protocols/feature used to implement FreeSync.

  4. phelix says:

    The article talks about an Asus ROG and Acer equivalent, but the ghosting image shows a Samsung? Did the Acer produce the same artifacts?

    • childofthekorn says:

      That is part of the issue. Anandtech could see no perceivable difference between G-Sync and FreeSync @ 24Hz, so it makes it difficult to take this article seriously. I smelled bias from the first paragraph. Check out WCCFtech, Anandtech and Guru3D for better coverage of G-Sync vs FreeSync.

      • Gryz says:

        Can you link us the article/page where Anandtech tested at 24fps and came to that conclusion? I don’t see it in Anandtech’s most recent “The AMD FreeSync Review” article.

      • TacticalNuclearPenguin says:

        If you smelled bias, you haven’t been reading him for very long, as most of his past articles are all about recommending AMD. Not the CPUs, of course.

        As far as my memory goes, he simply always tried to recommend the best possible bang-for-buck, and that matches AMD in the GPU department, with the exception of some outstanding products that, no matter the asking price, are exceptional in their own way, like the latest “cheap” six-core from Intel.

        If now he simply can’t recommend the cheapest option there, I’d say something is going on. Even if his feeling is indeed just that, something subjective, you’re still not safe, as you might feel it exactly the same with the monitor in your hands.

        As he says at the end, the market is still not stable enough around this stuff for now, especially not if you want more panel/price options compatible with these new features. Even G-Sync, which already works great today, has just a single option on the IPS front (if that’s your thing), fine for an early adopter with lots of money but not really for anyone else.

      • thedosbox says:

        Anandtech’s latest FreeSync review notes that other sites have experienced ghosting on FreeSync, along with this quote: “The FreeSync displays so far appear to not have the same level of anti-ghosting as the currently available G-SYNC panels”.

  5. Wisq says:

    Yeah. When nVidia announced G-sync, amidst the positives, I also saw a lot of responses along the lines of “oh god, more proprietary vendor-splitting”. When AMD announced FreeSync, it was, in some parts, seen as finally doing the Right Thing, using an open and specifications-abiding extension to provide the same thing.

    The automatic assumption was that nVidia’s solution was proprietary because they’re greedy, and AMD’s wasn’t because they’re the Good Guys in this particular scenario. But if there really is a quality difference between the two, that suggests an alternate narrative: That nVidia tried the specs-using method and it wasn’t satisfactory, so they made their own; and that AMD, blindsided and playing catch-up, just does the simple/lazy thing that nVidia rejected.

    I’m not saying either one of those scenarios is correct; honestly, as two big corporations, I don’t trust either of them, and I’m certainly not going to fanboy-align myself with them. But my current impression of the industry is that AMD has bet the farm on merging CPUs and GPUs and getting themselves in modern gaming consoles, leaving the actual innovation to nVidia and Intel and playing catch-up where they can.

    I always try to approach every PC build with an open mind and a lot of research, but I also build for performance and versatility (not necessarily price) — and every time I do the research, it seems like Intel is soundly beating AMD on performance, and nVidia is beating them on features (and driver stability). Granted, my last build was three years ago — but that’s because it still performs great today.

    • remon says:

      Your ending assumption, that AMD has left innovation to Nvidia and Intel, is wrong. HBM memory is AMD’s and Hynix’s design, which Nvidia is also adopting, and Mantle paved the way for Vulkan and DX12. Both of these are much bigger deals than anything Nvidia has come up with lately. And there is other stuff of course, like LiquidVR.

    • monkeybars says:

      nVidia could have made G-sync open to AMD and not sacrificed the quality of their product. They didn’t.

      • phuzz says:

        And AMD (and everyone) knew that, whatever they said about FreeSync being an open spec, there was no way nVidia was going to support it while they have G-Sync.
        We just have to hope that someone puts out a monitor that supports both. I don’t want my choice of GPU dictated by the monitor I buy, when I upgrade my GPU every few years but expect a monitor to last a decade or so.

  6. Mungrul says:

    Hey Jeremy, are you going to be reviewing the Acer XB270HU?
    Apparently it’ll be available next week, and it sounds like the holy grail of monitors: G-Sync with IPS.
    Having tried the Asus you used for this piece, I’m keen to see if IPS is a massive improvement. I found the TN panel of the Asus unbearable and actually physically painful to use, and ended up sending it back.
    It wasn’t all bad mind you. G-Sync really did impress me.

    • Gryz says:

      The Acer XB270HU has been available for 2-3 weeks now. At the moment they are sold out (world-wide it seems). The next batch will arrive in stores on April 20th (most likely). You can find the ultimate review at TFTCentral:
      link to tftcentral.co.uk

      I had one on my desk for 3 days. I sent it back because it had backlight bleeding and/or IPS glow in the lower-right corner. This becomes more apparent when playing dark games in a dark room (which I do a lot). More people have complained about this issue, but many find it acceptable. Besides the blb/glow, the monitor was FANTASTIC. :) Just my subjective opinion. For an objective opinion, read the TFTCentral review.

      • TacticalNuclearPenguin says:

        That kind of problem is extremely common in various panels nowadays, mostly because LEDs offer an excuse to play the cheap card and use edge-lighting instead of back-lighting, which of course is hardly going to end well.

        IPS glow on the other hand is still a thing and it’s impossible to solve unless you use a polarizer, and that mostly happens on the very expensive ones only.

        Wild color inaccuracies, incorrect gamma ramps and color temperatures, on the other hand, plague every monitor that is not absolutely professional grade and made perfect directly from the factory, and I’m talking about only the holy grail of them. Even Dell’s wide-gamut “professional” ones are really not very accurate and rather lousy, especially on sRGB emulation as you can see on TFTCentral, despite coming with factory reports that claim otherwise. Bleeding and IPS glow are rampant as well.

        If you want to kill IPS glow and bleeding, and you want close-to-perfect color accuracy and an absolutely perfect 2.2 gamma curve, with panel uniformity correction thrown into the mix, simply go for this.

        I did. It was a monetary sacrifice for me, but it was the one that made me the most proud and I don’t regret it in the slightest. You clearly don’t mind spending a lot on monitors, so that’s my suggestion for you, even if you’re throwing high refresh rates out of the window, along with G-Sync. Input lag is quite low as well; there’s a review on Prad.de. Try it if you can send it back, even though I’m pretty sure you’ll keep it. I wasn’t sure I could really justify such a price “just” for a perfect image, but that’s exactly what happened.

        • TacticalNuclearPenguin says:

          P.S. Know that most monitors that show some brighter areas in a not-so-severe fashion can be “fixed” by gently tapping that part and/or brushing a little (use microfibre) in varying patterns, depending on what seems to work best in your particular case. Generally, if it’s on a corner you want to proceed diagonally, starting a bit further away.

          This is sadly a byproduct of increasingly small and delicate cells coupled with all the crap that inevitably happens during manufacturing, including not-so-careful assembly, with lousy industry standards as the culprit.

          Anyway, this “fix” only applies to small defects and can’t solve the big stuff. Either way, be gentle.

  7. buxcador says:

    Ghosting is an unrecoverable flaw.

    Not acceptable.

    There is no negotiation possible.

  8. mitcHELLspawn says:

    Cool article! Great time for me to read it too, since I literally just bought a G-Sync monitor last week! I bought the Acer 4K G-Sync 1ms monitor and I am so glad I went with it! The thing about 4K gaming is that even with my two 980s in SLI it’s hard to hold a constant 60 fps with the settings maxed on the newest games… but now with this monitor the drops into the mid 40s are entirely acceptable and it literally still feels like I’m gaming at 60 fps. I absolutely love it. Best investment I’ve made in a while.

  9. tehfish says:

    For me it’s a very simple decision: FreeSync is the open VESA industry standard, G-Sync is proprietary tech that locks you to Nvidia only. Result: automatic AMD victory, regardless of performance.

  10. Tacroy says:

    The whole idea behind adaptive framerates is fundamentally flawed because it doesn’t have buy-in from the operating system, which means that if you want to G/Free-Sync your games, you have to play them fullscreened. Not in a window. Not borderless. Full screen, losing complete control of your entire display surface, just for this sort of thing.

    I can’t imagine it being worthwhile until Windows 11 or SteamOS or whatever natively support adaptive framerates at the OS level.

    • grimdanfango says:

      I can’t quite see what you expect from the OS here. What happens if you have two different 3D apps open in different windows? How do you tell a monitor to refresh at two different rates simultaneously?

      Adaptive refresh only makes sense when locked to a single fullscreen application.

      The biggest problem I’ve found lately is that Unity updated their engine at some point to basically force “borderless window” as the default-and-only fullscreen mode, which means that any game based on a recent build of Unity no longer functions whatsoever with G-Sync. It’s really starting to annoy me as more and more games update to the latest version of the engine. I just really don’t get the point of borderless window. There are 15-year-old games that work perfectly with G-Sync, but a whole slew of current releases that break it!

      • jrodman says:

        It seems like full-screen borderless could be made to work with a certain amount of techno-wizardry, though it may well not be worth it.

      • Low Life says:

        I love borderless windowed but this problem (and a few others) indicates that it’s not something that should be considered the only way to play in a full screen mode. It’s a PC game, just give the user the option to choose what they want.

      • cyberdrizzt says:

        You could try using Exclusive Fullscreen Mode in Unity; that should set it to a ‘true’ fullscreen. It only works with DirectX 9, though.

      • PoulWrist says:

        Borderless window is the single most perfect feature ever. Fullscreen makes a lot of games perform worse, for some reason. No one wants to look at a window border, but we all want easy access to the chat clients, browser windows and so on that sit in the background, or on our second/third monitors, without the mess of alt-tabbing out of a fullscreen app.
        On top of that, fullscreen often messes with what’s going on on multiple monitors, because the resolution can flicker between stages of the game starting/loading/alt-tabbing out of it, making windows lose their position or making the windows on the second monitor useless during gameplay.

  11. Xan says:

    Having just recently built a rig with G-Sync, I can say it’s indeed a very remarkable experience.
    As one of the G-Sync reviews put it, it’s like Hobbit vs “normal” movies: weird at first.
    I’m using BenQ XL2420G (mostly because I needed 24” over 27” due to some space constraints).

  12. Alfius says:

    With apologies if this has been addressed in comments already:

    What’s the required outlay to set something like this up? I’m running a year-and-a-bit-old GTX 770 with a close-to-five-year-old Samsung monitor. What is my upgrade path to a G-Sync-capable set-up? Do I require a new GPU and monitor, or does every modern Nvidia card have the capability to interface with a G-Sync monitor built in? For that matter, is my card a ‘modern’ Nvidia card at all?

    • Menthalion says:

      @Alfius: Since the GTX 770 supports G-Sync (source: link to geforce.com), you should be good to go with just a new G-Sync monitor.

    • Xan says:

      Yes, 650Ti and up can support it.

    • grimdanfango says:

      The other requirement is to be able to maintain at least 35fps at all times, as G-Sync will disable below this.

      I really hope we end up with monitors that can sustain the image far longer without a refresh, because I think G-Sync would come into its own in the 20-30 fps realm… it could make a juddery mess of a game actually feel reasonably playable.

  13. Rymdkejsaren says:

    ‘NSYNC!

  14. zat0ichi says:

    Want Nvidia – can’t afford it – get AMD (and be prepared to roll your sleeves up a bit).

    By the time I can afford 1440p or 4K, G-Sync should be a cheap add-on.

    Early adopters get burnt every time.

  15. PenguinJim says:

    “The choice between 120Hz and IPS, for instance, isn’t one I want to make. I want to have both.”

    And 16:10 as well, please. I didn’t build a gaming PC only to have the screen optimised for 16:9 TV & movies instead of being better for gaming, office & productivity, video editing, 4:3 TV and playing Internet one-upmanship.

    • Cederic says:

      My 16:10 vs 16:9 distress disappeared when I went to 2560×1440. It’s 16:9 but it has that crucial extra vertical space.

  16. Lanessar says:

    Well, while this is entirely anecdotal – I wanted to throw my experience as an early adopter out there.

    I recently purchased a new computer, and went with an Intel/nVidia setup instead of AMD/ATI (which I’ve been using for the last 4 years or so), and specifically set out to have the “G-sync experience”. This also included the purchase of a ROG Swift monitor, and using a 970 GTX from EVGA.

    I’ve been gaming with my old rig at usually 45-55 FPS on most of the games I’ve been playing. Some stutters started to become noticeable, especially dropping from hyperspace to station, and the… I wouldn’t use the word “stutter”, but… let’s say SLIGHTLY “hitchy” feeling when moving the camera in free look was getting to me. It was sort of a lesser version of the slide show effect when frames drop to sub-30 FPS. Very minute. But it started bothering me on games like Pillars of Eternity and, of course, Elite.

    Of course, switching from my old rig to the new one handled much of that. G-sync really shines on “Elite: Dangerous”. I was stably getting about 80 FPS in a space station with all the bells and whistles, and I went to Freelook mode.

    There was literally no motion blur with rapid movements. It was clear; text in the station was readable while the camera was in motion. However, in open space, and in most combat scenarios, I’m pulling 120-140 FPS. The transition from space > station (140 to 80) is pretty seamless; it doesn’t seem slow or feel unresponsive. Even dropping to around 50 FPS, G-Sync handles it pretty gracefully – although it does appear a little “sluggish”, it certainly isn’t the jarring “stutters” that I used to see.

    The effect is also visible in Heaven 4.0 – you can turn off G-Sync in that demo and start seeing the difference in frame delivery when transitioning into scenes with tessellation while moving (like… scene 6, I think). Even a 970 drops from 60-80 FPS down to 30-40 on high settings, and it becomes visible that G-Sync, while not delivering “more FPS”, does a better job of handling the drop, with zero “ghosting”.

    Playing games like “South Park: Stick of Truth” or “Darkest Dungeon”, which is locked at 30 FPS, there’s zero benefit. Some games like Pillars of Eternity don’t really seem to gain any benefits over 60 FPS as far as fidelity, so obviously, if titles like these are your bread and butter, you needn’t worry about it.

    I personally think G-sync is the way to go if you are buying right now. The solution is more robust than Freesync, has had a little longer to work out the kinks, and relying on hardware rather than software to handle the matter most likely improves performance.

    I do feel that Freesync will mature and may equal G-sync sometime in the near future; but right now, it doesn’t appear that way.

    • Lanessar says:

      Sorry, due to formatting and lack of sufficient coffee, one of the paragraphs up there didn’t get inserted, so it sounds like “more FPS = g-sync” which isn’t the case.

      Insert a paragraph after “of course, switching from my old rig” and I think it communicates more clearly the difference between a better performing card and what g-sync does with that card when FPS degrades, in my personal experience.

    • grimdanfango says:

      Are you actually getting G-Sync in Pillars of Eternity (What mode does the monitor menu report when ingame)? My experience lately is that the majority of Unity engine games force borderless-window as fullscreen, and don’t function at all with G-Sync as a result. Annoyingly, Pillars *did* work during the beta, but they must have upgraded Unity at some point. Same thing happened with Kerbal Space Program. It worked until a recent update.

      While it might not seem like a useful thing, I think G-Sync comes into its own in any game where you pan the camera around a lot, so I reckon Pillars and games like it could stand to benefit a lot.

      • Lanessar says:

        I had to test this, actually. No, G-Sync isn’t activating.

        It’s not really needed though. There are zero frame drops – I have a solid 135 FPS with every setting maxed out.

  17. grimdanfango says:

    If anyone wants a great G-Sync test… I’d recommend going back and playing Mirror’s Edge. I recently played it through start to finish in one sitting, with my GTX 980 and ROG Swift.

    It’s an absolutely beautiful test case… I was able to play with all settings maxed out, at 2560×1440/8xS AA, and it kept up frame rates of between 100 and 120fps… which normally would mean V-Sync back to 60, but with G-Sync, it’s absolutely buttery-smooth the entire time, for a buttery-smooth flowing game. It was an absolute joy to play :-)

    And it’s still one of the best looking games I’ve seen.

  18. Astatine says:

    Another early adopter here…

    I have the Benq XL2730Z (driven with a Radeon 290X). Upgraded from a 60Hz IPS display and I’m really happy with it. Yes, the colour’s not as “good”[1], but it’s not like I’m doing professional photo work: I tuned the colour balance and I look at the screen on-axis and it looks fine to me.

    FreeSync up to 144Hz is an absolute revelation compared to the old 60Hz max, and it helps that this display’s got no perceptible input lag, whilst the old one had just enough to be noticeable and annoying. For gaming, it’s brilliant. Yes, there’s the 40Hz glitch where FreeSync drops out, and it’s annoying, but at that framerate the old display was already a worse experience. And yes, there’s a ghosting artifact, but I don’t notice it unless I’m deliberately looking for it. All screens have some artifact or other…

    I haven’t tried a GSync display. It makes sense that they are better in some ways at this stage since Nvidia can require certain quality controls as part of their certification process; but don’t go believing that FreeSync is a bad experience — it might not be perfect but it’s absolutely better than not having it.

    I’m a bit worried that there’s a growing opinion in PC enthusiast circles that Nvidia products must always be better than AMD (regardless of objective comparison), and these articles are affirming that view, I’m afraid. Nvidia might have more expensive, shiny “halo” products than AMD do, but given a budget and spec, they don’t always win over the competition. (Look at the price premium attached to GSync screens, for a start.)

    [1] My old Dell U2711 had a persistent yellow tinge that I could never quite get rid of unless I set the colour temperature to crazy-hot. So much for IPS perfection, huh? ;)

  19. mrsuperguy says:

    I think if I had the luxury of choosing (i.e. the money to buy a new monitor when I have a perfectly good 1080p 60Hz Asus one now), I’d go for one of the G-Sync monitors listed in one of the links in the article linked at the beginning of this one (this particular one… I think is about as expensive as my current one was from Amazon: link to bit.ly) for a couple of reasons:
    1- I don’t think it’s any more expensive, or at least not much more expensive, than my current monitor
    2- Even if FreeSync actually is just as good as G-Sync, if my memory serves me correctly G-Sync isn’t unreasonably expensive anyway
    3- From what I’ve heard, Nvidia cards are just better and worth the expense, so that sort of decides which adaptive sync tech I pick as well (is that stupid?)

    Anyway, if and when I actually do have the luxury to be able to make this choice for real, hopefully it’ll be a bit clearer.

  20. Unclepauly says:

    The panel processing thing just isn’t true. A lot of us have bought Korean 1440p panels that have no image processing and they look beautiful, with almost zero input lag to boot.

    • Unclepauly says:

      I was referring to the article stating that without image processing we would have ugly screens.