Nvidia Slays The Beast, V-Sync, With G-Sync

By Nathan Grayson on October 19th, 2013 at 12:00 pm.

I wish all Nvidia hardware actually glowed like that.

Hello, V-Sync. Yes, thank you for meeting me here today. I invited you out because I felt the need to share some very important news: no one actually likes you. We just put up with you because, well, there’s really not a better alternative. In truth, you’re inconsistent, awkward, difficult to be around, cause obnoxious stuttering, and IT’S YOUR SURPRISE BIRTHDAY PAAAARRTY wheee everyone leap out now! OK, not really. But I figured those couple seconds of revelatory glee might help offset this falling pain piano of existential misery: you’re being replaced. By something younger, faster, and more practical. Or at least, that’s how it’ll be if Nvidia has its way. G-Sync claims to eliminate hassles like stuttering, screen tearing, and the like by synchronizing monitor refresh to the GPU render rate instead of vice versa, which is what V-Sync does. The result, apparently, is worlds better.

Monitors, you see, are fixed at 60Hz refresh rates, but modern GPUs can output so much more. So, as things stand, you either enable V-Sync to keep the GPU in clumsy lockstep with the monitor (which leads to response lag, stuttering, etc.), or disable V-Sync to get better response times but risk screen tearing when the two fall out of sync. Both methods are far from optimal. Nvidia, however, claims that it’s finally found a best-of-both-worlds solution. It explained in a blog post:

“With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU renders with variable time, the refresh of the monitor now has no fixed rate.”

“This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU. So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC.”
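To make the difference concrete, here is a minimal Python sketch of the timing involved. It is a toy model of my own, not anything from Nvidia: a GPU that takes a steady 20ms per frame (50fps) either feeds a conventional 60Hz panel through V-Sync, or drives a variable-refresh panel directly.

```python
# Toy model: when do rendered frames actually appear on screen?
# Fixed 60 Hz refresh with V-Sync vs. a G-Sync-style variable refresh.
import math

GPU_FRAME_TIME = 0.020        # the GPU takes a steady 20 ms per frame (50 fps)
REFRESH_INTERVAL = 1 / 60     # a conventional 60 Hz panel
FRAMES = 6

def vsync_times():
    """A finished frame has to wait for the next fixed 60 Hz tick."""
    shown, ready = [], 0.0
    for _ in range(FRAMES):
        ready += GPU_FRAME_TIME
        shown.append(math.ceil(ready / REFRESH_INTERVAL) * REFRESH_INTERVAL)
    return shown

def gsync_times():
    """The panel starts a refresh as soon as each frame is finished."""
    return [(i + 1) * GPU_FRAME_TIME for i in range(FRAMES)]

for name, times in (("V-Sync", vsync_times()), ("G-Sync", gsync_times())):
    gaps = [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]
    print(f"{name} frame-to-frame gaps: {gaps} ms")
# The V-Sync gaps come out as an uneven mix of ~16.7 ms and ~33.3 ms even though
# the GPU is perfectly steady; the G-Sync gaps are a constant 20 ms.
```

That uneven cadence on the fixed-refresh side is the stutter Nvidia is talking about, and the time frames spend waiting for the next tick is where V-Sync’s input lag comes from.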

G-Sync-enabled displays will work with Nvidia’s Kepler series and be available early next year from the likes of Asus, BenQ, Philips, and ViewSonic. They seem rather miraculous, so we’ll have to wait and see how well they work in practice. On paper, though, this solution sounds pretty water-tight. Here’s hoping it a) holds up once we’re able to put it through its paces and b) isn’t too expensive. It is, however, pretty proprietary at the moment, which is basically a deal-breaker for those running non-Nvidia hardware. Ah, format wars. Aren’t they grand?


127 Comments »

  1. Text_Fish says:

    Another tick for Steambox over the other consoles I guess?

    • Smashbox says:

      Nvidia based steam boxes plugged into future desktop displays I suppose. Doubt we’ll see TVs with this.

  2. BTAxis says:

    So now we can look forward to 20Hz refresh during Planetside 2. Neat.

    • sandineyes says:

      Actually, no. G-sync won’t allow a monitor to go below 30Hz. Apparently once you go below that, monitors will flicker. So, below 30fps, frames will be duplicated.

    • LionsPhil says:

      You shouldn’t really be able to tell, though, since flatscreens don’t “refresh” like CRTs did, with a series of scanlines that fade away, AFAIK. It shouldn’t be all flickery.

      I’m guessing this is “just” a pair of framebuffers in the display, and the GPU can poke the display to tell it when to flip which one it’s holding as shown, vs which one is being written to, i.e. hardware double buffering. Which is actually pretty neat. If this works and isn’t horrendously Evil (driver hate, proprietary dickery, premium pricing, the usual), it’s a pretty nice solution.
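LionsPhil’s guess above translates fairly directly into code. The following is a purely speculative toy model of that “double buffer in the display, flip when the GPU says so” idea, not a description of Nvidia’s actual hardware:

```python
# Speculative toy model of "hardware double buffering in the display":
# the panel keeps scanning out its front buffer and only flips to the back
# buffer when the GPU signals that a complete frame has been written there.

FRAME_BYTES = 1920 * 1080 * 3              # one 1080p frame at 24-bit colour

class ToyVariableRefreshDisplay:
    def __init__(self):
        self.buffers = [bytearray(FRAME_BYTES), bytearray(FRAME_BYTES)]
        self.front = 0                     # buffer currently being scanned out

    def receive_frame(self, pixels: bytes):
        """Called by the 'GPU' once a frame has been completely rendered."""
        back = 1 - self.front
        self.buffers[back][:] = pixels     # fill the hidden buffer
        self.front = back                  # flip: hidden buffer becomes visible
        self.start_refresh()               # the refresh is driven by the flip,
                                           # not by a fixed 60 Hz clock

    def start_refresh(self):
        print(f"scanning out buffer {self.front}")

display = ToyVariableRefreshDisplay()
display.receive_frame(bytes(FRAME_BYTES))  # one dummy all-black frame
```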

  3. lociash says:

    >Monitors, you see, are fixed at 60Hz refresh rates

    That’s awkward, I must have got ripped off when I bought these two 120hz monitors on my desk.

    • golem09 says:

      So you have slightly more fixed numbers to choose from, oh great. They still rely on Vsync.

      I’m excited for this, but the problem is, I don’t use monitors. So I’ll have to wait ever longer until this comes around to 32″ HDTVs.

      • MarcP says:

        I’d wager most people interested in 120hz displays are going to turn off VSync anyway.

        • golem09 says:

          Well, then they can still enjoy their screen tearing if the fps drop even once.

          • asura kinkaid says:

            No visible tearing regardless of the framerate with V-Sync off (all games) on my 120Hz Asus monitor. If there is, I’m not seeing it… plenty visible on my other (60Hz) monitors, ofc.

          • Grey Poupon says:

            Or they could just enable double/triple buffering. And you don’t have to buy a new monitor for that. Or a new GPU. Or a new game. Or pancakes.

            I think I’ll go buy some pancakes.

          • Zyrusticae says:

            The whole point of G-Sync is that you don’t need to use buffering (which increases input latency) or deal with any tearing whatsoever, however minor it may be at 120hz.

            Of course, tearing is indeed quite a bit less noticeable at 120+ refreshes per second than 60 refreshes or less, but there’s still tearing. Some people will be very sensitive to that, some people may not be. This kind of product is aimed at the former.

      • nrvsNRG says:

        32″ HDTV?….eeeuuughh!

  4. iARDAs says:

    Man, the gap between console gaming and PC gaming will be even greater soon.

    • trjp says:

      I’m not sure why that would matter to you, and as we’re about to enter a new gen of consoles, I’m not sure it’s going to be in the direction you’re thinking of.

      Simple fact is that the new consoles will result in less attention being paid to PC gaming for a while – settle down, clear your backlog…

      • iARDAs says:

        Well, since the new generation is basically PS3.5, I am sure PC gaming will excel even more compared to them.

        • liquidsoap89 says:

          And why is it that we must be compared to them exactly?

          • stupid_mcgee says:

            Like it or not, the simple fact is that the console market beats the ever-loving shit out of PC game sales.

            Yeah, certain genres rake in megabucks, but the average is weighted heavily towards the console market. Though PC sales do reliably tend to have a much better long tail than the console market. After 6 months, you might as well stop selling new titles and just let the used market take over. You’re especially not going to sell any new console games on titles 1 to 1.5 years out.

  5. spectone says:

    Hopefully this becomes an open standard and migrates to TVs as well.

    • Sonblade says:

      It’s Nvidia, so it’ll be open as long as anyone who wants to use it (AMD, Intel) has a nice, fat, open wallet.

  6. MattM says:

    I usually don’t use v-sync; instead I just cap the framerate at 120. In so many games v-sync seems to cause mouse lag. Although I am pretty picky about visual quality, I don’t notice the tearing unless I stop to look. Still, if this G-Sync really works I am gonna be psyched.

    • bglamb says:

      Tearing is one of the most jarring things for me. It’s weird how different things can seem so obvious/invisible to different people. Personally, when I need performance, AA is the first thing to go. I don’t notice the jagged lines at all.

      • Rizlar says:

        It is indeed funny; I always have some level of AA on since jagged edges can really get to me, while I don’t turn on V-Sync at all unless the tearing is particularly bad.

        Also should this new technology not be called V-Sync-Sync? Presumably Nvidia planned to call it N-Sync, but had to change at the last minute.

      • iniudan says:

        AA is also one of the first things to go, with one exception: games with long-range views, where I at least want it at a level where faraway objects don’t look deformed.

    • timethor says:

      In one of his keynotes John Carmack remarked that some people might care a lot about the latest extremely expensive shader technique that may make some pixels a tiny bit lighter, while ignoring the giant tear line going across the screen…

      I’ll always prefer a smooth, tear free 60 fps (or 120 fps if I could afford it) over whatever minor graphical improvement comes from the “ultra” settings compared to “very high”. And I wish all developers were the same, prioritising performance over eye-candy.

      • asura kinkaid says:

        Amen to that. THIS is one of the core reasons COD is STILL outselling everything in the genre, despite being a very stale franchise in practically every other sense.

  7. phelix says:

    Interesting! So how does this affect monitor lifetime?

  8. 00000 says:

    This needs to be in the Oculus Rift. Too bad it’s nVidia who came up with this and not a 3rd party.

    • SuicideKing says:

      That doesn’t stop it from making it to the Rift; Carmack seemed to love it.

  9. Trespasser in the Stereo Field says:

    I have a dirty little secret: in my, oh, 20 years of PC gaming, I have never once checked or unchecked the v-sync option. I have no idea what it’s supposed to do or help with and I don’t seem to have ever needed it. Thank you. I feel better now that’s off my chest.

    • BooleanBob says:

      I once returned a PC to a small-business dealership because I didn’t understand what was causing screen tear. I thought there was something wrong with the graphics card, or the CPU. The business in question soon went under, and the replacement PC I bought from a large megachain cost more and had a lower spec.

      G-sync seems pretty exciting, though it seems so obvious even to me that I wonder how the industry has gone so long without thinking of it.

      • C0llic says:

        Well, usually V-Sync is on by default in many games. You would have noticed the screen tearing with it off, I’m sure. The only reason to turn it off is for competitive FPS gaming, or if a particular game has stuttering or response issues with it on.

        Whether V-Sync is an issue or not varies wildly and is usually a crapshoot from computer to computer.

        • BooleanBob says:

          What I meant was that I had literally no understanding of what screen tear was, or that V-sync being disabled was the reason for it. Had I known I would have been able to make peace with it either way, but I didn’t, so I saw the effects of screen tear and assumed there was a hardware defect. It was my own ignorance that was the problem.

          • C0llic says:

          Yep, I actually meant to reply to the top-level comment, not yours. I got what you meant :)

      • TechnicalBen says:

        Constant screen tearing (in Explorer etc.) suggests the drivers were never installed properly in the first place. You did well to return it. Tearing in games only would probably just be a setting somewhere; they should have pointed you in the right direction, and even demonstrated the fix to you in the store.

        But both of the above require work, and a good employee who can engage customers (as we all know we can be problem customers at times, so need some help understanding things).

    • golem09 says:

      This is screen tearing:
      http://www.youtube.com/watch?v=K9z3MQeogcY
      V-Sync gets around that by making the frames slower/laggier instead of letting them break up in the middle.

    • Perjoss says:

      Search Google Images for ‘screen tearing’ and you will notice that the images look like they are cut and offset horizontally. Some games suffer from this much worse than others. The short version is that checking the v-sync option will stop this from happening, but will also cause your game to perform slightly worse in terms of frames per second.

      the long version: I’m not even nearly smart enough to explain this version properly.

    • sophof says:

      You often see complaints of laggy controls with new games. I’m convinced this is always V-Sync switched on and people not knowing what it does, so don’t feel too bad ;)

    • Baines says:

      V-sync is supposed to stop screen tearing, where the bottom half of the screen draws a different frame than the top half.

      What it actually tends to do is introduce control lag. I found the second level of Bit.Trip.Runner unplayable until I disabled V-sync in the game, and the Steam forum for the game showed others who had the same issue. KOF XIII forced V-sync on in fullscreen, which is most likely connected to why fullscreen had around twice the lag of playing windowed.

      Screen tearing is a nuisance, but control lag can outright ruin a game. A sadly funny thing is that V-sync doesn’t always even eliminate screen tearing; it still happens occasionally. I’ve played a couple of games where having v-sync on actually made screen tearing worse.

  10. daphne says:

    Having experienced it first-hand, Anand Lal Shimpi of Anandtech described this as “awesome” and a “game-changer” in his impressions article here: http://anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness

    I’m inclined to believe him, because he’s Anand Lal Shimpi.

    • Velko says:

      I have no idea who that is, but with a name like that I too am inclined to believe whatever he/she says.

      • Fenix says:

        He is the founder of AnandTech, one of the big names in the tech site world.

        • Don Reba says:

          So, the site could have been called ShimpiTech? That’s just grand.

    • FriendlyFire says:

      Yeah, the fact he’s so excited about this is perhaps the biggest endorsement I can get. Having Carmack, Sweeney and Andersson on stage during the announcement is also a good endorsement, but trickier since you just don’t know how much they’re holding off due to being paid to be there.

  11. C0llic says:

    This sounds great in theory. My worry is that it’s a proprietary solution. Does this mean that we will have an ATI solution as well? Will we be locking ourselves in to a graphics card company’s products when we buy our monitor? Or even worse, will this thing, even if it’s as great as it sounds, simply not take off because it requires new hardware to function? Will the various monitor companies stop producing these models if they fail to sell?

    I suppose it will depend on whether the new tech will necessitate a price hike by the monitor companies. If it does, this potentially great idea could just fail regardless; at least until someone comes up with an open solution and it’s a safe enough bet to include it in gaming monitors as a matter of course. Monitors, while generally pretty cheap these days, are a piece of hardware gamers don’t count on replacing until they absolutely have to. It might be a tough sell unless it really is that great of a leap forward (and is widely adopted).

    • trjp says:

      I cannot imagine monitor companies buying into a ‘one platform’ solution to this.

      Either they will enable a software-driven ‘frame control’ or they will not. It needs to be a standard (as with all the other stuff which monitors send back up the cable) or it will be mostly pointless.

        Which is what about 80% of nVidia’s ideas become – sadly – they’re not team players by any means…

      • whoCares says:

        What if I told you:

        Nvidia IS a team.

      • Don Reba says:

        It could become similar to the OpenCL/CUDA situation, with Intel and ATI agreeing on a standard, and NVIDIA continuing to advance its own proprietary technology and winning.

    • SuicideKing says:

      AnandTech’s speculating that the circuitry required for the tech (it replaces the “scaler” in a standard LCD) will cost at least $100 initially, but may be available as a standalone kit so that people can mod their own monitors.

    • marach says:

      It’s $170 for a bit of memory, a variable-rate clock, and the ability to read DisplayPort’s end-of-frame packet. And I’m not kidding here. That’s all this board does (note it requires DP).

      It basically overrides the display board so that a frame is redrawn every time it receives the EOF packet. I’m actually scratching my head trying to work out what else they’ve done to warrant $170…

      • FriendlyFire says:

        A bit of memory? It’s got at least 768MB (potentially even 1.5GB) of DDR3L and a controller. Possibly more; there aren’t a whole lot of details regarding the board.

        Keep in mind that low-volume silicon is quite expensive, since circuit boards have really large fixed costs initially. If the tech gains traction, prices should fall fairly quickly.

    • BlacKHeaDSg1 says:

      Sadly, it works only with Nvidia graphics cards (for now), according to their system requirements: http://www.geforce.com/hardware/technology/g-sync/system-requirements. I can only hope that G-Sync will also work with ATI graphics cards.

  12. Tei says:

    If this delivers what’s described, it could be really interesting. As it is now, games with v-sync on must deliver 30 or 60 fps, numbers like that. If you can’t deliver 60, you must do 30. If this works in a way that, when a game can’t deliver 60 but can deliver 50, we get smoother gameplay at the best possible framerate, great.

    SSDs and monitors operate by emulating much older technologies. An SSD works by emulating a hard disc, with sectors and platters. And a monitor emulates a CRT TV. It’s time to stop emulating things (with all the expensive foobars and sacrifices) and let an SSD or an LCD be the real thing it can be.

    • MattM says:

      If you wanted to update all the pixels in an LCD at once, instead of scanning through and updating them one at a time, you would need a separate signal path from each pixel in the buffer to each pixel in the monitor. That would be about 2 million channels for a 1080p image. It’s possible to send multiple (~160) signals over a single fibre cable, but that would still be a pretty thick monitor cable.
      Pretty much every time a computer works with an image, it does so pixel by pixel, and I don’t know how we could get away from that.
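A quick back-of-the-envelope calculation (my own numbers, assuming 24-bit colour at 60Hz) puts figures on MattM’s point: parallel per-pixel wiring would need millions of channels, while serial scan-out of the same data is only a few gigabits per second.

```python
# Back-of-the-envelope numbers for a 1080p panel, assuming 24-bit colour at 60 Hz.
width, height, bits_per_pixel, refresh_hz = 1920, 1080, 24, 60

pixels_per_frame = width * height
raw_bits_per_second = pixels_per_frame * bits_per_pixel * refresh_hz

print(f"pixels per frame: {pixels_per_frame:,}")                      # ~2.07 million
print(f"raw pixel data:   {raw_bits_per_second / 1e9:.2f} Gbit/s")    # ~2.99 Gbit/s

# Driving every pixel at once would mean ~2 million parallel channels into the
# panel; sending them one after another over a single link needs only ~3 Gbit/s
# of pixel data, which existing DVI/HDMI/DisplayPort links are built to carry.
```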

  13. DatonKallandor says:

    Well that sounds neat and all but doesn’t solve the big problem:
    If you’ve got a really powerful machine you don’t use V-Sync because of screen tearing or anything like that. You use it because it stops your machine from overheating and drawing too much power, because in the last console cycle developers got so incredibly lazy that they don’t stop the game from rendering as many frames as possible even when there’s nothing going on, such as in menus.

    G-Sync wouldn’t work like V-Sync in the respect that it stops a generic old main menu from trying to draw 300 frames at max GPU and CPU load.

    It’s a better solution in the intended category, but destroys the more important “protect your system from incompetent developers” use.

    • GamesInquirer says:

      I used to use vsync for that reason, and hated any game without a vsync option, the fact that vsync often doesn’t work in windowed mode, that driver-level vsync options often didn’t function for all games, and so on. Then I finally discovered the RivaTuner program embedded in MSI Afterburner (I don’t have an MSI card anymore, but it works fine for all cards), which allows frame limiting and has also replaced FRAPS for my FPS overlay and screenshot/video capture. Afterburner also works as a fine temperature/load monitor; it’s primarily for such GPU diagnostics and overclocking/fan-speed options (which I don’t use at the moment). There are other similar solutions too, but this is the one I recommend. Great free utility.

    • LionsPhil says:

      DisplayPort/HDMI/DVI only has finite bandwidth, so it can only ever send so many frames to the display per second. That’s one hard cap. Another will be the panel’s own capabilities, given most of them max out at 60, and even 120 shouldn’t be as bad as uncapped rendering making your graphics card emit that delightful high-pitched whine of pain.

      • MattM says:

        With v-sync off the card renders as fast as it can and renders complete frames even if it only finishes sending part of each frame to the monitor before switching to the next one. This increases power draw/heat/noise. A properly ventilated case and card should be able to handle this high load (if it can’t, then you need more airflow), but some people get nervous when their card gets hotter than normal.
        As others have said, there are now easy ways to limit framerate without v-sync if that’s your preference. I like to use Nvidia Inspector to set the limiter built into the Nvidia driver.

        • LionsPhil says:

          We’re not talking about the case where there is no limiting.
          We’re talking about the case where the limiting is G-Sync instead of V-Sync.

    • MattM says:

      I think it’s implicit that the render rate would be capped at the monitor’s maximum refresh rate. If it wasn’t, then you would either be overdriving the monitor’s refresh rate, building up a large queue of frames, or never letting the monitor finish displaying a full image, in which case the bottom lines of the screen would never be updated.

  14. GamesInquirer says:

    So is this why we still don’t have a driver-level triple-buffered vsync option (basically vsync without the performance penalty; I’m not a competitive FPS player and any input lag is imperceptible for me too) that works for both D3D and OGL, and instead have to hope the game we want to play has it as an option? Because they were busy developing more hardware to sell, and triple buffering reduces its value?

    Anyway, until it’s integrated in monitors and doesn’t cost any more than a monitor without it, and until the relatively new monitor I’m using now dies, I won’t really care to investigate unless I see it in person and it’s the revelation it’s hyped to be. Right now I personally don’t notice stutters unless my PC struggles to run a game, and I don’t see how better monitor syncing can fix that struggling, or somehow make low 30-40fps frame rates look smoother, or magically make text in motion appear more readable (obviously outside the occasion of a struggle or tear happening right then and at that part of the screen), or whatever else it’s currently hyped to do. Maybe I’m being dumb in this matter.

    How come tearing is worse in some games than others anyway? It kinda makes me think that a software solution without any real drawbacks should be possible in the engine’s rendering.

    • FriendlyFire says:

      You do realize the game has no control over screen tearing, right? The GPU doesn’t even know there’s tearing. It’s the monitor that’s reading the framebuffer just as it’s being updated by the GPU.

      V-sync was implemented specifically to counter this issue. It’s the “software solution” you’re talking about. You can’t have any better without monitors getting more intelligent, which is what G-sync does.

      • GamesInquirer says:

        Why yes, I do realize all that (though again, I don’t know about vsync being as good as it can be – you’d probably have said the same before triple buffering was introduced – and you’ve failed to provide reasons for it, or even an answer to my question about why some games/engines are more affected by tearing than others, with vsync off in all cases, obviously), and nothing I said implied otherwise, so I don’t know why you replied to me just to patronize.

  15. CookPassBabtridge says:

    I like screen tearing. For some reason it feels like water smooshing over my eyeballs. V-Sync always off for me, as I prefer the extra performance.

  16. lowprices says:

    I can’t help but feel they’ve missed a trick by not calling this N-Sync.

  17. Major Seventy Six says:

    Am I just getting old and my eyes don’t catch it?

    I haven’t experienced tearing or stuttering in years while playing PC games; now that I think of it, the last time was on my Asus GeForce EN7950GT. I have since owned an Asus Radeon EAH4870 and now a Gigabyte Radeon 7870OC. Could it be possible that AMD GPUs already do a better job at preventing tearing and stuttering?

    • GamesInquirer says:

      No, tearing is there on my 7970 OC. The severity varies per game and vsync does fix it regardless of GPU (while triple buffering doesn’t have performance issues), obviously.

    • Glottis1 says:

      I am no expert, but someone said in another comment that v-sync is enabled by default in most games. As for stuttering, I think you start to notice it only after you know how much better it could be.

  18. Glottis1 says:

    Does this have an effect on competitive multiplayer games?

    • C0llic says:

      Well, yes, if it does what it says. You could have this turned on and not have to put up with screen tearing. The majority of people who game competitively won’t use V-Sync, for the reasons listed in the article.

  19. Reginald XVII Archduke of Butts says:

    I hope Dell get on board, too. If it’s not an Ultrasharp, I don’t want it on my desk.

    • Sheng-ji says:

      Speaking as someone with both a U2713 and a PB278 on my desk, I can say that ASUS make monitors that easily equal Dell, though when I need wide gamut, I use a U3014 (which makes my games look all kinds of oversaturated awful!) and wouldn’t use any other brand.

      • Reginald XVII Archduke of Butts says:

        That’s a fair point. I’ve seen some pretty decent panels out of ASUS lately, I guess I just need to put in the effort and figure out which ones are IPS.

  20. sabasNL says:

    Never understood V-Sync. Never had problems when I turned it off, and never noticed anything when I turned it on.
    Could somebody explain it to me please?

    • Monkeh says:

      Mostly prevents screen tearing. If you don’t know what that is, it looks something like this: http://static.giantbomb.com/uploads/original/6/62225/1156042-ggdsg_19.jpg

      I hardly notice screen-tearing during games either, so I always have V-Sync turned off, seeing as for most games it locks your FPS at 30 with it on.

      • sabasNL says:

        Oh, so that’s what screen-tearing is haha. No, never had that either.
        I’ll turn off vsync then, never noticed it capped the FPS.

        Cheers!

    • CookPassBabtridge says:

      As I understand it, it has to do with the mismatch between when the GPU spits out a new frame and when the monitor looks for a new frame. If the two are out of time with one another, you get half an old frame plus half a new one. The result is a line between the two halves that looks like a “tear”. V-Sync makes the two wait for each other until they are both ready for a new ‘page’, causing a delay and hogging resources. I’m sure someone will come along with a better description, however.
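That description maps onto a tiny toy model (mine, purely illustrative): if the buffer being scanned out is swapped partway down the screen, what gets displayed is the top of one frame stitched to the bottom of the next, and the seam is the tear.

```python
# Toy illustration of a tear: the buffer swap lands halfway through scan-out,
# so the image on screen mixes two different frames.

ROWS = 8                                   # a comically small "screen"
old_frame = [f"old frame, row {i}" for i in range(ROWS)]
new_frame = [f"new frame, row {i}" for i in range(ROWS)]

def scan_out(old, new, swap_at_row):
    """What actually reaches the screen if the buffer changes mid-scan."""
    return [old[i] if i < swap_at_row else new[i] for i in range(ROWS)]

for row in scan_out(old_frame, new_frame, swap_at_row=4):
    print(row)
# Rows 0-3 come from the old frame and rows 4-7 from the new one; the boundary
# between them is the visible tear line that V-Sync (and G-Sync) avoids.
```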

    • Low Life says:

      This is a very good post explaining V-sync, why it’s used and how G-sync is different: http://www.neogaf.com/forum/showpost.php?p=86542030&postcount=236

  21. suibhne says:

    Oh, just grand. Now nVidia and AMD both have “game-changer” technologies in their next gen of cards, which will end up being incompletely shared across the industry if at all.

    • rulez says:

      What is AMD’s new thing?

      • FriendlyFire says:

        Mantle, their new low-level API.

        I honestly prefer Nvidia’s new thing; at least it doesn’t split up the entire rendering pipeline into three.

        • suibhne says:

          I guess there’s also the fact that 1) nVidia’s tech will arguably benefit all games (as long as you have the right monitor…), and 2) AMD’s tech will arguably be unnecessary for high-powered PCs anyway, given the typical performance delta between PCs and consoles (tho it could be very helpful for mid-range or low-end PCs…).

          • GamesInquirer says:

            2) is kind of silly; even the high-powered PC could simply get additional room for even more AA, enhancements a la iCEnhancer and SweetFX, user mods, supersampling (if that ever becomes easy on AMD) and much more. Not to mention ~4K and/or 3D gaming severely cripples just about any current GPU, and there are plenty of games way too demanding for their own good (justified like Crysis 3 and Arma 3, or not like GTA IV) that bring high-powered PCs to their knees when maxed, even at reasonable resolutions. The point of buying the latest and greatest is to have the best performance and quality, so why would an additional X per cent of performance be meaningless? Of course, the majority of even enthusiast gamers don’t have twin-GPU beasts or even a single top-end GPU, so they will welcome additional performance gains most of all.

  22. Borsook says:

    Actually, what the article claims is not entirely true; monitors AREN’T fixed at 60Hz, LCD monitors are. CRT monitors even go as high as 200Hz.

    • Don Reba says:

      It should also be noted that a hand-crank projector is only fixed at whichever rate the operator spins the handle.

    • IshtarGate says:

      I think the stress is on ‘fixed’. CRTs go really high (and I love them for it), but you still run on a fixed refresh rate. This is the most ideal solution yet devised: your monitor is now literally 1:1 with what the graphics card is outputting.

  23. rulez says:

    I heard tearing is much worse when using SLI. Is this the same type of tearing G-Sync will eliminate?

    • souroldlemon says:

      Good point. Stuttering in SLI or Crossfire is basically the same problem, made worse by GPUs not being coordinated right, so I would imagine that it would help there as well.

  24. fish99 says:

    Nice technology for the people who need it, but with a 120Hz screen I can just leave v-sync on and get neither tearing nor a framerate drop. Haven’t seen tearing since I got my current LCD.

    Of course most games implement triple buffering these days anyway, which means you can have no tearing and very little framerate drop from having v-sync on. That does introduce some lag, but no more than everyone using an HDTV is getting (they average around 30-35ms input lag).

  25. umaxtu says:

    So AMD has Mantle and Nvidia has G-sync. The competition is getting interesting.

    • Moraven says:

      And Mantle is supposed to be open for anyone to use while G-Sync is Nvidia only hardware.

      • MattM says:

        Mantle is open for developers to use in their games, but it’s an AMD-specific API, isn’t it?

        • GamesInquirer says:

          Yes, it’s made to take advantage of AMD’s modern architecture. Even if it were open (which it isn’t), it would be pointless for others to attempt to utilize it rather than simply make their own API. The whole point is to customize development to this one architecture, the way console development is customized to specific models, in order to yield better performance than you get with a generic standard that any architecture then interprets, like DirectX.

  26. HisDivineOrder says:

    Not that nVidia is exactly the best about not doing proprietary things, but AMD IS trying to basically shut every GPU maker except themselves out of high-performance ports of games, so…

    I have less sympathy for AMD now about such things. Mantle is such a horrible idea that all these “smaller” proprietary things nVidia does seem far less important than if Mantle were to take off.

  27. Etto says:

    Really excited for what this could potentially become! I’ve been utilizing V-sync for many years to mitigate screen tearing, but always hated the noticeable input lag as well.

    To get the optimal feeling from games I currently limit the framerate to 59 with DXtory and force V-sync through Nvidia control panel. This gets rid of both screen tearing and input lag, but some games absolutely hate DXtory injecting itself into them or V-sync in general.

    • nimbulan says:

      FYI you can do framerate limiting through nVidia’s drivers using nVidia Inspector now. There shouldn’t be any need for DXtory.

  28. Bugste81 says:

    If I could buy something that plugged into my Dell 2711 monitor and did G-Sync, I’d be first in line for this type of kit.

  29. Low Life says:

    A nice panel with misters Carmack, Sweeney and Andersson at the Nvidia event: http://www.youtube.com/watch?v=MH2hjhcfWic

    They cover everything from G-Sync to VR and even AMD’s Mantle.

  30. sharkh20 says:

    We shall see. The first thing I do when starting up a new game is to go to options and turn off v-sync.

  31. nimbulan says:

    I hope this is as good as it sounds. It all depends on the pricing and display quality though, and I have a suspicion this feature will only be available on 120/144Hz TN monitors. I’ve been thinking about getting a new monitor recently since my newer one is 7 years old now. Now I think I’ll wait and see what’s available with G-sync support before I make a decision.

  32. Wisq says:

    One, I’m pretty sure that triple buffering solved all the problems with V-sync, so I dunno why we really need this G-sync.

    Two, to say that G-sync is going to eliminate stuttering is silly. It’s just that now your monitor will stutter in sync with your framerate, instead of happily continuing along at 60 Hz and repeating some frames.

    It’s true that the whole fixed refresh rate thing is pretty archaic and it’s time to get rid of it. And having no tearing and immediate refresh is nifty. But while it’s true that this deserves to be the future of computer monitors, I do think the benefits are almost certainly overstated. (Although I bet there’ll be a lot of competitive gamers jumping on this ASAP, because they just can’t stand that horrible 8ms maximum response time on their 120 Hz monitors.)

    • SuicideKing says:

      No, you’re not correct here.

      Triple buffering mostly solves input lag, but doesn’t solve V-Sync’s major problem: the moment the FPS drops below the refresh rate, you end up drawing at the next divisor of the refresh rate below it.

      So 55 fps will appear as 30 fps on a 60 Hz monitor with vsync enabled. This is why Nvidia introduced adaptive vsync, though that re-introduces the tearing issue. In my experience triple buffering doesn’t solve tearing; I’ll have to recheck.

      G-Sync DOES eliminate VISIBLE stuttering, which is the whole point. Since the monitor is merely drawing what the GPU is putting out, at the same rate, there aren’t going to be any half-rendered or dropped frames. But if the stuttering isn’t originating from the GPU side of things (but from the CPU, for example), then I guess it’ll still be visible.

      I think Nvidia kept repeating that you’d still want a minimum of 30 fps for it to work well, so I think it does need the fps to be bearable in the first place. I’d guess that the number would be closer to 24 fps for a lot of games.
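The 55-to-30 drop described above follows from simple arithmetic in the plain double-buffered case: a finished frame has to wait for the next whole refresh tick, so the displayed rate snaps to the refresh rate divided by the number of ticks each frame spans. A small sketch of that rule (my own, illustrative only):

```python
# Effective framerate under plain double-buffered V-Sync: every frame is held
# until the next refresh tick, so the displayed rate snaps to refresh_hz / n.
import math

def vsync_effective_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    frame_time = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    ticks_per_frame = math.ceil(frame_time / refresh_interval)
    return refresh_hz / ticks_per_frame

for fps in (60, 55, 45, 31, 29):
    print(f"GPU renders at {fps} fps -> V-Sync displays {vsync_effective_fps(fps):.0f} fps")
# 60 -> 60, 55 -> 30, 45 -> 30, 31 -> 30, 29 -> 20: anything just below the
# refresh rate collapses to the next divisor, which is the judder G-Sync avoids.
```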

  33. Megakoresh says:

    I suppose this means all those monitors will have to be 120Hz then, since otherwise, for any fps above 60 that your card is able to pull, you won’t be seeing any improvement in experience. 120fps, though, is more than enough to satisfy anyone sensible.

    Well, if it works, then I know what I’m going to be buying as the second monitor I was planning to buy soon™

    • LionsPhil says:

      You won’t see an “improvement in experience” for framerates above 60 anyway, with or without G-sync?

      Never mind human perception (I’m sure you’re Superman); if you render frames more often than the monitor displays them, you are guaranteed to just throw away frames or parts of frames (the latter introducing tearing).

      • KillahMate says:

        The difference between a 60Hz and a 120Hz monitor is actually quite plainly visible, even in Windows, but especially in an FPS game (provided the GPU is keeping up with the framerate).

      • Sheng-ji says:

      Eyes and our brains don’t work in frames per second; each individual rod and cone may have a chemical refresh rate of 10 times per second, but clearly we can perceive faster motion than that. In theory, the brain can interpret an fps rate equal to the total number of rods and cones you have, if it needed to, as it can desync their firing, and this may explain the “slowdown” effect people experience during moments of tremendous stress. The cost would be colour, 3D vision and, frankly, your vision would be of “movement” only, with literally no detail – your brain would fill those in later, when you access memories of the event!

      Realistically, any human can easily tell the difference between 60 and 120 fps, but smoothness aside, it allows your eyes to double their desyncing and gather twice the amount of information, which leads to fewer headaches, less nausea and better reaction times.

        • LionsPhil says:

          I’m just going to reply to that with an article from a neuroscience PhD student, because having this armchair argument over and over is boring.

          Note that one of the peer-reviewed papers is a study on framerates in FPS games. Their game of choice was Quake III, not some sloppy auto-aiming 360-pad QTE-o-rama, and their participants included a chunk of 20-something gamers. The difference in perceived quality and game performance is already falling off sharply between 30 and 60 FPS.

          • KillahMate says:

            “So then what’s the minimum frame rate at which a video game should be rendered to ensure that it doesn’t suffer from jitter or choppiness? There’s no maximal rendering framerate above which aliasing effects are guaranteed to be eliminated. […] And the relevant question is really whether the choppiness bothers you or not, rather than whether it’s visible. Claypool, Claypool and Damaa (2006) report that performance saturates in a first-person shooter game at about 30 fps (above; notice that confidence intervals overlap between 30 and 60 fps). So, say it loud, say it proud: 30 fps is good enough for me!”

            What this means is that the minimum average tolerable framerate is 30fps, with some people requiring much more to avoid intolerable choppiness. Increasing the framerate will increase the apparent temporal smoothness, with no upper bound (“There’s no maximal rendering framerate above which aliasing effects are guaranteed to be eliminated.”). Therefore, once you’ve passed your minimum tolerable framerate, likely around 30fps, it’s a matter of how smooth you need/want/like your display to be. It’s fine to be satisfied with 60fps, but it’s disingenuous to imply there’s no perceptible difference between 60fps and 120fps. After all, 120fps monitors aren’t exactly relying on the placebo effect for sales.

          • fish99 says:

            Have you used a 120Hz monitor LionsPhil, because otherwise you’re not speaking from experience. I own one and there’s a big jump in smoothness of movement. Just moving the mouse quickly on the desktop, or dragging a window about, you can instantly see the difference, and when a game gets up to 120 fps to match the refresh rate, there’s a liquidity to the movement which isn’t there at 60 Hz.

            The actual capabilities of various parts of the eye aren’t really that relevant, because it’s about how many frames the eye has to sample from during that period. The more frames, the smoother things look, and the less likely the eye is to notice that it’s looking at a collection of still frames, rather than fluid movement.

      • Megakoresh says:

        Of course I won’t. If the monitor’s refresh rate is below the given frame rate, the only perceived difference can be in the controls. And since I am not really an e-sports gamer, I won’t feel that difference. As far as visual difference is concerned, the monitor’s refresh rate will have to be greater than or equal to the fps pulled by the GPU.

        I am not sure what the “Superhuman” thing is about there. It’s common knowledge that 120Hz monitors and framerates feel much smoother and less tiring to any human eye, with the strength of the effect varying from person to person.

  34. fish99 says:

    “Monitors, you see, are fixed at 60Hz refresh rates”

    Even the 120Hz ones? :p

  35. Sic says:

    “G-Sync-enabled displays will work with Nvidia’s Kepler series and be available early next year from the likes of Asus, BenQ, Philips, and ViewSonic.”

    This is why something like this is not interesting whatsoever. Just like monitors with a 120Hz vertical refresh rate. If companies that couldn’t produce a proper display if their life depended on it are the only ones making displays with this tech, what’s the point?

    • C0llic says:

      BenQ make very good monitors. I can’t speak for the others, but that simple fact leads me to assume your comment is as uninformed as it sounds.

      • Sic says:

        Not only do I have a degree related to this, but my work has included advising the TV industry about what equipment they should buy (including things like monitors and displays).

        Now, I don’t know if you work for BenQ, or you’re just generally ignorant about display tech, but BenQ isn’t generally regarded as a producer of “very good monitors”. They, like any other producer, have access to good panels, so it’s not like they exclusively make unusable displays, but on the whole they’re considered to be at the opposite end of the scale from NEC and Eizo when it comes to which producers are worth their salt.

  37. bad guy says:

    Ain’t nothin but a G Sync

  39. GamesInquirer says:

    What about playing in windowed mode?

  40. Josh W says:

    OK, this could be pretty interesting, as you could get high-frequency temporal patterns out of a monitor that have never actually happened before, thanks to prime divisors of frequency etc. It also suggests that bullet hell shooters could get even more ridiculous, and that computer stutters could start causing epileptic fits, swerving their way through the frequency spectrum and hitting the frequencies that set people off.