FreeSync vs G-Sync revisited: FreeSync 2 is coming

This didn’t go too well for AMD’s FreeSync technology last time around. But lo, a shiny new version of FreeSync is inbound. Give it up for FreeSync 2: This Time It Actually Works. OK, that’s a little unfair. But hold onto your mechanical keyboards, folks, because FreeSync 2 is as much about streamlining the PC for HDR support (and indeed making AMD your weapon of choice for HDR gaming) as it is syncing your graphics card and your monitor nicely. Confused? You aren’t the only one…

Since my last missive on FreeSync, it seems the worst flaws have been polished out of the existing implementation with the latest monitors. I was recently twiddling with the new LG 38UC99. It’s a 38-inch 21:9 monster with a 3,840 by 1,600 native res. But that’s by the by. It also supports FreeSync and it didn’t have any of that ghastly ghosting with FreeSync enabled.

My understanding, as I’ve mentioned previously, is that with some early panels FreeSync didn’t play nicely with the built-in image processing. More specifically, some monitors had to disable response-boosting pixel overdrive in order to enable FreeSync. Not good.

Actually, it’s a bit of a deal breaker. You can argue the toss whether rendering smoothness is more important than pixel response. But not being able to have both at the same time makes that kind of flawed FreeSync implementation pretty pointless.

Anywho, I’ve had my finger off the FreeSync pulse for a while, so all this isn’t exactly news. But it’s nevertheless welcome to know that FreeSync’s most obvious flaw was more of an early-adoption stumble than an insurmountable hurdle.

All that said, purely in terms of the experience I’d still lean toward Nvidia’s alternative G-Sync. Subjectively, it has always seemed more polished, reliable and effective. The downside, of course, is the cost. G-Sync involves dedicated hardware inside the monitor and it’s pricey.

Ah, the cool green glow of proprietary profits!

It also locks you into Nvidia GPUs, which is thoroughly off-putting. Not for any specifically anti-Nvidia reasons, but because monitors tend to be the longest-serving of all components for a lot of PC enthusiasts and gamers, and the prospect of being boxed in on your GPU choice when the time comes to upgrade isn’t remotely appealing.

The same problem has applied to FreeSync, of course, and AMD hardware. It’s just that FreeSync doesn’t noticeably increase the price of a monitor and often wasn’t terribly well implemented, making the lock-in somewhat academic.

At least that has been the case. Enter FreeSync 2, a second-generation effort that looks set to be technically much more wide ranging. The most significant new feature is a bespoke transport solution for getting an HDR signal from your PC to the monitor (HDR or high dynamic range in the context of displays being something I’ve covered here).

HDR is a complicated technology in many ways and that extends to the fact that it’s far from obvious to the average punter what’s required to achieve it on the PC. FreeSync 2 is having a stab at becoming the default HDR hardware standard for the PC, the idea being if you have a FreeSync 2 monitor and a FreeSync 2 video card, then you are HDR ready. Well, on the hardware side, at least.

Transport stream plus display tone mapping = input lag.

The details involve a bespoke display pipeline to compensate for Windows’ currently patchy support for HDR (it’s been reported that a fix for that particular problem will arrive with the upcoming Windows 10 Creators Update), plus – and this could be the really critical bit – measures to reduce lag on the monitor side of the equation. Latency is apparently an issue for the image processors in HDR displays when performing internal HDR tone mapping. This isn’t something I’ve experienced (the lag, that is). But then when I tried an HDR TV for PC gaming recently, it wasn’t actually running in HDR mode. Whatever, FreeSync 2’s big trick is to streamline the whole process and remove the need for internal HDR tone mapping in the display, neatly sidestepping that lag problem.
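
To make that a bit more concrete, below is a back-of-the-envelope sketch of the idea. It is emphatically not AMD’s actual API (the struct, the numbers and the simple compression curve are all invented for illustration), but it shows where the work ends up: the panel reports its real capabilities to the GPU, the game tone-maps straight to them, and the monitor’s own processor has nothing left to remap.

    // Illustrative only: a stand-in for the capabilities a FreeSync 2
    // display would report back to the graphics driver.
    #include <algorithm>
    #include <cstdio>
    #include <initializer_list>

    struct DisplayCaps {
        float peakNits;  // panel's peak luminance, e.g. 600 nits
        float minNits;   // panel's black level
    };

    // Simple Reinhard-style curve compressing scene luminance (in nits)
    // into the panel's displayable range. Real pipelines work per channel
    // and are gamut-aware; the point here is only that the mapping happens
    // on the GPU, targeting the panel's reported limits, not in the monitor.
    float toneMapToDisplay(float sceneNits, const DisplayCaps& caps) {
        float x = sceneNits / caps.peakNits;
        float mapped = (x / (1.0f + x)) * caps.peakNits;  // compress highlights
        return std::max(caps.minNits, std::min(mapped, caps.peakNits));
    }

    int main() {
        DisplayCaps panel{600.0f, 0.1f};  // hypothetical HDR monitor
        for (float nits : {100.0f, 1000.0f, 4000.0f}) {
            std::printf("scene %6.0f nits -> display %5.1f nits\n",
                        nits, toneMapToDisplay(nits, panel));
        }
        return 0;
    }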

Overall, it strikes me as a fairly smart move, what with HDR now dominating the high-end HDTV market and also driving into the console gaming market. If AMD can wangle its way to being seen as the default choice for achieving painless HDR, it could be something of a marketing coup.

What I’m less wild about is the prospect of an HDR war between Nvidia and AMD. AMD says that all existing FreeSync 1 compatible cards will also support FreeSync 2, which is good news. What I don’t detect is any indication that FreeSync 2 will be anything other than exclusive to AMD graphics cards. In fact, quite the opposite: there’s even some talk of AMD charging royalties, which is moving things in completely the wrong direction if true.

Far better for us gamers would be for AMD and Nvidia to get together and agree on a joint HDR standard for the PC. In the end, beating each other over the head with proprietary HDR tech is likely to be a zero-sum endeavour. They’ll both have it. Hardly anybody will really understand who has the better solution. It won’t be a driving factor for purchases. So they may as well do the right thing for the consumer and adopt a cross-compatible solution. But that’s not going to happen, is it?

A UHD HDR TV as gaming monitor? That’s a story for another day…

Whatever does unfold, bundling all this HDR stuff under the FreeSync brand seems a bit odd. But AMD is also tightening up requirements on the actual frame syncing side of things. Currently, monitors can be sold as FreeSync capable without actually fully supporting FreeSync. Cheaper panels with narrow refresh rate ranges don’t support the Low Framerate Compensation feature, for instance.

FreeSync 2 will make that a requirement. It’ll also make low latency a requirement for any panel, regardless of what mode it’s running in, though it’s not yet clear just what will qualify as low latency for FreeSync 2’s purposes.
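
For anyone unfamiliar, Low Framerate Compensation is conceptually simple: when the game’s frame rate drops below the bottom of the panel’s variable refresh range, the driver shows each frame two or more times so the panel keeps operating inside the range it actually supports, which is also why it needs a reasonably wide range (roughly 2.5 times between minimum and maximum) to work at all. A very rough sketch of the idea follows; the names and the cap on the multiplier are mine, not AMD’s actual driver logic.

    // Illustrative sketch of the Low Framerate Compensation idea: repeat
    // each frame enough times to keep the panel inside its refresh range.
    #include <cstdio>

    struct RefreshRange {
        float minHz;
        float maxHz;
    };

    // How many times each frame should be scanned out, or 0 if the panel's
    // range is too narrow for compensation to help at this frame rate.
    int lfcMultiplier(float frameRate, RefreshRange r) {
        if (frameRate >= r.minHz) {
            return 1;  // already inside the supported range
        }
        for (int m = 2; m <= 10; ++m) {  // arbitrary cap for the example
            if (frameRate * m >= r.minHz && frameRate * m <= r.maxHz) {
                return m;
            }
        }
        return 0;  // range too narrow: compensation impossible
    }

    int main() {
        RefreshRange wide{48.0f, 144.0f};   // 3x ratio: LFC can work
        RefreshRange narrow{40.0f, 60.0f};  // 1.5x ratio: LFC cannot
        std::printf("35fps on 48-144Hz panel: x%d\n", lfcMultiplier(35.0f, wide));    // x2 -> 70Hz
        std::printf("35fps on 40-60Hz panel:  x%d\n", lfcMultiplier(35.0f, narrow));  // 0, no help
        return 0;
    }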

Generally, while it all sounds fairly promising, I’m slightly uncomfortable about how things are going to shake out in terms of standards and compatibility. HDR feels like a recipe for finding yourself stuck down a hardware dead end having spent too much money to reverse out. I’d so much rather see open standards than proprietary solutions, given that all displays (and probably games, too) are likely to be HDR in the not too distant future.

Still, FreeSync 2 is expected to roll out later this year. So there isn’t long to wait to find out what it’s really like.

30 Comments

  1. FriendlyFire says:

    FWIW, Nvidia’s G-Sync HDR is basically the same thing as this, so they’re on more or less even terms (well, aside from the hefty extra G-Sync carries inherently).

    I wonder if this will compel Nvidia to support Adaptive Sync eventually.

  2. Wut The Melon says:

    Although I’ve not seen a lot of HDR screens in action, I’ve heard a lot of people get really excited about it as the next big thing in screens and to be honest it does sound like that! Good thing it’s finally coming to PC.

    Though when you say all screens are going to be HDR in the future, I don’t think that is the very near future – as far as I understand, you really need the higher contrast ratios of VA and OLED panels to get impressive results from HDR.

  3. Odin86 says:

    I find it a bit odd that on the one hand you say you recommend Nvidia’s G-Sync and then on the other you chastise AMD for going down the proprietary route.

    It’s worth noting that Adaptive Sync, which is part of DisplayPort 1.2a, is available to everybody that wants to make use of it. Intel are already on board as well and obviously AMD is. The only one sitting on the sidelines with their proprietary module is Nvidia.

    AMD also has Freesync running on HDMI as well as DisplayPort. And speaking of HDMI, the HDMI Forum recently announced that VRR, or Variable Refresh Rate, will be a part of HDMI 2.1, so future consoles should have the same technology as well. In fact they might be able to do something with current generation consoles via a patch; we will have to wait and see.

    Everybody is on board with Adaptive Sync, including the cable makers. Supporting Adaptive Sync without any proprietary standards is a possibility now, unless, you know, you want to use those proprietary standards for some reason…

    • thedosbox says:

      He recommends it because G-Sync generally “just works”. From the article:

      some early panels FreeSync didn’t play nicely with the built-in image processing. More specifically, some monitors had to disable response-boosting pixel overdrive in order to enable FreeSync. Not good.

      Actually, it’s a bit of a deal breaker. You can argue the toss whether rendering smoothness is more important than pixel response. But not being able to have both at the same time makes that kind of flawed FreeSync implementation pretty pointless.

      • GenialityOfEvil says:

        Also because Freesync is already de-facto proprietary, he just chastised the rumoured move to royalties.

  4. frenchy2k1 says:

    HDR, both in the living room and on computers, is a mess.
    HDR itself is just a catch-all marketing term.
    Behind it hide greater luminosity and contrasts (that no equipment reaches yet), expanded color space (rec2020) and improved color accuracy (from 6-8 bits to 10-12 bits).
    Problem being that only Apple operating systems (macOS and iOS) support multiple color spaces natively so far, so plugging an HDR display into your computer will result in a mess.
    A good starting point:
    link to pcgamer.com

    • Vedharta says:

      Even just support for 10-bit colour is a problem (it’s still 32 bits per pixel, but with only 2 bits of alpha; a very awkward pixel format really, with components not being byte-sized anymore).

      Only very few applications support it, and the vendors have artificial limitations (I’m sure they will come up with a ‘reason’) that lock it to fullscreen only, or to ‘workstation cards’ only.

      And that’s not counting the amount of software that just hopelessly breaks on 10-bit colour; keep in mind, it’s still 32 bits. I’ve seen some software that assumes 32 bits means 8:8:8:8, and then things go visibly, horribly wrong :-D
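
      To illustrate (just a toy example, the function names are made up), this is roughly what goes wrong when code written for 8:8:8:8 reads a 10:10:10:2 buffer:

          // Toy example: the same 32-bit pixel read correctly as 10:10:10:2
          // and incorrectly as 8:8:8:8.
          #include <cstdint>
          #include <cstdio>

          // Correct unpack: three 10-bit colour channels plus a 2-bit alpha.
          void unpack_10_10_10_2(uint32_t px, unsigned out[4]) {
              out[0] =  px        & 0x3FFu;  // R
              out[1] = (px >> 10) & 0x3FFu;  // G
              out[2] = (px >> 20) & 0x3FFu;  // B
              out[3] = (px >> 30) & 0x3u;    // A
          }

          // What legacy code tends to assume: four byte-sized channels.
          void unpack_8_8_8_8(uint32_t px, unsigned out[4]) {
              out[0] =  px        & 0xFFu;
              out[1] = (px >>  8) & 0xFFu;
              out[2] = (px >> 16) & 0xFFu;
              out[3] = (px >> 24) & 0xFFu;
          }

          int main() {
              // Mid-grey in 10 bits per channel (512, 512, 512) with full alpha.
              uint32_t px = (3u << 30) | (512u << 20) | (512u << 10) | 512u;
              unsigned c[4];
              unpack_10_10_10_2(px, c);
              std::printf("as 10:10:10:2 -> %u %u %u a=%u\n", c[0], c[1], c[2], c[3]);
              unpack_8_8_8_8(px, c);
              std::printf("as 8:8:8:8    -> %u %u %u a=%u\n", c[0], c[1], c[2], c[3]);
              // Mid-grey comes out as a near-black mess: exactly the kind of
              // "visibly horribly wrong" you see in practice.
              return 0;
          }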

      On my GNU/Linux machine, 10-bit mode doesn’t even manage to boot to desktop due to interesting…bugs.

      A mess, yes!

      • particlese says:

        Ah, thanks for the Linux comment – I was wondering about that. :(

        Linux and its ecosystem do seem to respect color correction profiles better than Windows, in my experience, so here’s hoping HDR and friends gain better support in the near future, too.

    • brucethemoose says:

      Rec2020 is a fantasy; DCI-P3 is the best we can hope for without some disruptive new display technology (QLEDs maybe?).

    • The Bitcher III says:

      Freesync started off behind Gsync in subjective performance and usability but has long since caught up.

      Any form of adaptive sync is just about the single most significant quality of experience upgrade available to PC gamers.

      Anyone not prioritising it when making related purchases is missing out hugely. Factor in the price difference between FreeSync and G-Sync monitors and it can make the choice of GPU pretty obvious.

      Personally, I have an LG 34 UC88b curved ultrawide. There isn’t an equivalent Gsync monitor – the Acer/Asus equivalents use the previous generation of LG ultrawide panel. I had one of these. Briefly. It was hideous. The picture was deformed at the corners, and the BLB and IPS glare were hideous. There was an orange glow visible in broad daylight.

      I returned it, and had to pay postage and packing because it was deemed ‘within normal performance’.

      The Acers have other ‘known’ issues (scanlines) – Acer’s ultimate response was along the lines of ‘such issues are part and parcel of “enthusiast” grade technology’. I.e.: go fish, thanks for the £800.

      I ended up with a GTX 1080 to partner the screen. Getting a steady 60fps is every bit as frustrating as it ever has been.
      I’d swap it for a freesync (AMD) card – even with 30% less grunt – in an instant.

      • neems says:

        I have the exact same setup as you, but I can’t say that I have significant problems with performance. 3440×1440 actually seems to be the sweet spot for the GTX 1080; it’s too powerful for 2560×1440 and not enough for 4K.

        • Ghostwise says:

          Ditto, the Asus PG348 I have works fine.

          Of course, it took nearly 2 years in savings, so you bet I’d have it replaced as many times as necessary to get a good one. Which Amazon confirmed would be fine by them, given the French consumer protection laws.

          But the first one was good.

      • Ericusson says:

        Whatever you say.

        I have an Acer x34 Predator with minimal glow and an awesome picture. The first model I received was an excellent monitor. And in developed countries there are enough guarantees to get your money back if you randomly get a bad product.
        Combined with a 1080 GPU, it simply works.

        G-Sync is very nice and robust. This monitor is going to stay put for the foreseeable future, as is the 1080.

        I have been through enough generations of hardware since the first 3dfx cards not to care about the marketing-trend BS that is, and always has been, pushed around, as HDR is these days.

        For the consumer, technological trends are inconsequential, except for the desire stirred up by the sector’s news business.

  5. Sakkura says:

    The original version of Freesync was already clearly superior to G-Sync, so I don’t know why you’re claiming the opposite.

    • Photonboy says:

      Sakkura,
      Not sure where you got the idea that Freesync was superior to GSync, but that’s just wrong.

      Original Freesync didn’t have driver support for LFC, so even if a monitor had the supported 2.5x max/min range, there was no support at the low end.

      Then there’s the severe OVERDRIVE issue the article comments on. The point of NVidia’s GSYNC is a hardware module to help avoid these issues. Not so with Freesync as many manufacturers did the minimum to get the Freesync or just Adaptive Sync stamp.

      The ONLY way Freesync could be mistaken for “better” was the PAPER SPEC of “9Hz to 244Hz” for AMD vs “30Hz to 144Hz” for NVidia prior to Freesync physically launching, and then we discovered the lack of low-end support and blurring issues.

      What we got, for example, were panels with a 40Hz to 60Hz range where it was ONLY smooth between 40fps and 60fps. AMD tried to confuse people with the paper spec, and misinformed about the LATENCY of Freesync vs GSync as well.

      • PseudoKnight says:

        You’re mostly criticizing Freesync for the failures of individual monitors. We shop around for every other aspect of a monitor, so why not the implementation of Freesync too? In any case, Freesync 2 requires some things that were optional in Freesync, so the “stamp” means more. I recommend people not shop by stamp though.

    • lglethal says:

      If your experience with FreeSync was good and you’re happy with it, that’s great news. But AMD had problems with FreeSync, a lot of which came down to the monitor manufacturers and their implementations of FreeSync, but there WERE problems – ghosting and latency in particular.

      Nvidia maintained greater control over the implementation of G-Sync, and so when you bought a G-Sync monitor you KNEW it would work as advertised. This led to both the perception and, for a lot of people, the experience that G-Sync was better than FreeSync. If AMD had maintained greater control over monitor manufacturers using Freesync, maybe this debate would be very different. It sounds like they are going to try and do that with Freesync 2, but we’ll see how much control they will actually be able to have.

      So if your implementation of FreeSync worked exactly as it should have and you’re happy with it, feel free to feel a bit smug and condescending towards other AMD FreeSync users, but just because it worked for you doesn’t mean you should discount the opinions of other FreeSync users who suffered problems with the implementation…

      • Sakkura says:

        Those are not problems with Freesync; those are problems with individual monitors. It’s an indisputable fact that Freesync is superior to G-Sync, because you can get exactly the same functionality for a fraction of the price, and there is a bigger selection. You also get a high likelihood of future support for Intel’s adaptive sync solution.

        • Ericusson says:

          If there are systemic problems with freesync monitors, it’s a systemic problem that can be linked to freesync specifications and implementation guidelines, however much you want to defend the thing.

          • Jeremy Laird says:

            The views you get from Sakkura et al on this kind of thing are the fairly standard derived-from-reading-stuff-on-the-internet variety rather than hands-on experience.

            In practice, when you’ve seen 15 different FreeSync monitors and all of them had ghosting, while you’ve seen a similar number of G-Sync screens and none of them had any G-Sync related issues, the reasons for that are somewhat academic. FreeSync has been unambiguously patchy in implementation. Theoretical superiority doesn’t get you very far in that context.

            As I mentioned above, my finger hadn’t been on the FreeSync pulse for a while and I made it clear I wasn’t breaking any news by observing that you can now get FreeSync screens that don’t ghost. But it’s not the only issue you need to think about regarding FreeSync implementation when buying a monitor, and the fact is, G-Sync is a much, much more straightforward purchase proposition. It’s more tightly controlled and far more consistently implemented.

            But G-Sync is expensive and involves lock-in, which makes it, too, very unappealing. Eventually, no doubt, we’ll get to the point where adaptive sync just works with any card and screen combo – as indicated above, as it becomes part of various interface standards, hopefully all the proprietary nonsense will fall away.

  6. SteelPriest says:

    When I switched to NVIDIA thanks to a 144Hz G-Sync monitor purchase, it was largely because every Freesync monitor worked only on a very small frequency range. Is that fixed yet?

    • Sakkura says:

      Yes. You just need to check what the range is on the monitor you’re buying.

  7. noodlecake says:

    HDR gaming? Games have had HDR for a while now. Isn’t it just the effect that simulates the way that the eye takes a little bit of time adjusting to changes in light levels? I remember thinking it was awesome 7 or 8 years ago. I think it’s even an option in Fallout 3.

    • noodlecake says:

      No idea why I felt the need to write that before reading any of the article. It’s a little confusing having HDR in photography and HDR in video games mean the exact opposite thing. And then HDR with graphics cards and monitors means a third thing that I don’t understand.

    • GenialityOfEvil says:

      No, HDR is a group of advancements in display technology that increase the range of possible colours that can be displayed, the brightness of the display and a smattering of other things.
      link to en.wikipedia.org

  8. Wisq says:

    From a theoretical and moral point of view — yes, it’s good that AMD went the “write a spec” route, and bad that nVidia went the proprietary route.

    But if you look purely at the practical side of things — nVidia doesn’t support FreeSync any more than AMD supports G-sync. And those two companies are the only real candidates for serious gaming.

    So really, you’ve just got “nVidia’s sync tech” versus “AMD’s sync tech”. Until that changes, going FreeSync will lock you in to AMD just as much as going G-sync will lock you in to nVidia.

    • Sakkura says:

      Intel has pledged to support DisplayPort Adaptive-Sync, so going with AMD also means you should have support once Intel launches their version.

    • MooseMuffin says:

      The HDMI 2.1 spec also includes variable refresh rates, which seems like it would force nvidia to support an open standard eventually.

      • daktaklakpak says:

        Not necessarily. It’s a pretty trivial matter to put a handshake protocol into the G-Sync module, and have the card refuse to turn on adaptive sync over HDMI 2.1 unless the Nvidia module is found. Who knows what they will actually do, but this would be in line with their current and past approaches.