Why You Need A High-Refresh / 120Hz-plus Monitor

Did somebody say something about IPS and high refresh?

Last week we rolled out the first in a new series of why-you-need-stuff posts. The idea being, assumptions about what is good and why come a little too easily with the ongoing churn of PC hardware news and product launches. So, let’s go back to basics with these assumed goodnesses. I kicked off with IPS monitor technology and while healthy discussion of the pros and cons of IPS ensued, so did some wailing and gnashing of teeth that a gaming website had appeared to be dismissive of high refresh rates and glossed over 120/144Hz.

This was because high refresh rates are a separate issue from panel type. Something worthy of a post of its own. This post, in fact. Is faster really better when it comes to screens?

Begin at the beginning

First, what exactly is a high-refresh monitor? In the context of modern LCD panels, it means a monitor capable of 120Hz or better at native resolution. That, in turn, means the screen can redraw every pixel 120 times per second.

Put another way, it means a monitor capable of truly rendering 120 individual frames per second in a game rather than simply discarding additional frames above 60 frames per second, as per a conventional flat panel screen.
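
For the number-curious, the per-frame time budget falls straight out of the refresh rate. A trivial sketch (Python, purely illustrative; the figures are the point, not the language):

```python
# Per-frame time budget at common refresh rates.
for hz in (60, 85, 100, 120, 144):
    print(f"{hz:>3}Hz -> a new frame every {1000 / hz:.2f}ms")

# 60Hz gives you 16.67ms per frame; 120Hz halves that to 8.33ms,
# and 144Hz trims it further to 6.94ms.
```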

120Hz is the starting point for high refresh probably because it’s a simple doubling of the 60Hz refresh that has hitherto been the standard for flat panel monitors. Faster panels rated at 144Hz are also now available along with what I would call pseudo 240Hz (more on that later). No doubt even quicker monitors are forthcoming. Bigger numbers have to be better, right?

If that’s the basic definition of a high-refresh panel, do these things actually work? Are such lofty refresh rates even visible to the laggardly human eye? After all, very nearly all the TV and movies you’ve ever seen will have been rendered at somewhere between 24 and 30 frames per second. And fluid motion is perfectly possible in those formats, isn’t it?

Likewise, doesn’t anything over a solid 30fps or so feel smooth in-game? What’s more, the much heralded and supposedly silky smooth new HFR or high frame rate format for movies (as per the recent Hobbit money-machine trilogy) is a mere 48 frames per second and thus not even on a par with a standard PC monitor. What, then, to make of hugely higher 120Hz and beyond?

The eyes have it

There are two ways of looking at this. The first involves the science of what the human eye and brain can see and process. The second way of looking at high refresh is, well, to literally look at high refresh. Then decide for yourself whether you can tell the difference.

Until recently, really quick 144Hz panels like the Asus ROG have been exclusively TN panels…

The former is intriguing but the latter is what actually matters. If somebody has to tell you that you are looking at something new and supposedly better because otherwise you wouldn’t have noticed, I’m not convinced the benefits are sufficient.

That said, for the record there are various scientific theories of the human eye and image processing in the brain that are relevant. I don’t think any single element provides the full story. But one interesting component I wasn’t aware of until recently involves what you might call wobble. By that I mean your eyes wobble constantly, but just a little bit. It’s a process known as microtremor and in evolutionary terms it’s intentional, even if intention is a misnomer in the context of evolution. But you know what I mean. The purpose or, if you prefer, consequence of microtremor, it’s thought, is to increase the effective resolution of your retina, apparently doubling it.

Now it just so happens that this wobble ranges in frequency from 70Hz to just over 100Hz in humans. And that, to me, seems significant. Because in my experience mucking around with high refresh monitors, the jump from 60Hz to, say, 85Hz is subjectively substantial. Bumping things up to 100Hz makes a difference, too. But after that, well, I’d have a hard time spotting the difference between 120Hz and 144Hz in terms of fluidity, that’s for sure. I’d like to see anybody reliably distinguish between the two on those terms. For me, the law of diminishing returns begins to kick in above 100Hz.

Exactly when the benefits of ever higher refresh begin to fall away will vary from person to person. But the key point is this. Refresh rates above 60Hz do make a difference. The problem is that it’s one of those things you can’t fully grasp without seeing it with your own eyes. For those who haven’t, the best description I can offer is this: a combination of utter fluidity and responsiveness that makes a high refresh monitor so special.

Something you can’t unsee

It’s also one of those things you can’t unsee. Once you’ve experienced high refresh, you won’t want to go back. Suddenly, your old 60Hz monitor will look juddery and clapped out. If you’re on 60Hz and haven’t seen high refresh, try this. Whip your mouse around the screen in a big circle. What you’ll see is a trail of many mouse pointers as they jump across the screen in the time it takes to go from one frame to the next at 60Hz. The gap between each pointer effectively represents 1/60th of a second.
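
You can put rough numbers on those gaps. In the sketch below, the 3,000 pixels-per-second sweep speed is just a plausible guess for a brisk flick, not a measured figure:

```python
# How far the pointer jumps between refreshes during a fast sweep.
# The 3000 px/s sweep speed is an illustrative assumption, not a measurement.
sweep_px_per_s = 3000
for hz in (60, 120, 144):
    print(f"{hz:>3}Hz: gaps of roughly {sweep_px_per_s / hz:.0f}px between pointers")

# 60Hz: ~50px between pointers; 120Hz: ~25px; 144Hz: ~21px.
```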

In reality, those jumps are still there at 120Hz, only smaller. But what’s important is that it doesn’t seem like it. At 120Hz, things seem to move like real objects in space and time.

The same applies in-game. You have a sense of things actually moving, rather than images being rendered and animated frame by frame. In short, it peels away another layer separating you from a fuller suspension of disbelief in what you are looking at. It also makes responses to control inputs snappier. At least twice as snappy, as it happens, with a 120Hz panel, and more so still with faster screens. Indeed, snappier response, rather than even greater visual fluidity, is the real benefit of 144Hz panels.

The way to understand this point is to imagine a screen that only refreshed once a second. If you move your mouse just after the screen has updated, you’ll be waiting a full second to see any reaction. The faster the refresh, the quicker a screen can respond to inputs. Simple.
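
As a sum, refresh’s slice of input lag looks like this. It’s a simplification that ignores pixel response and everything upstream of the monitor:

```python
# Refresh's contribution to input lag: an input arriving at a random moment
# waits half a refresh interval on average, a full interval at worst.
for hz in (1, 60, 120):
    interval_ms = 1000 / hz
    print(f"{hz:>3}Hz: average {interval_ms / 2:.1f}ms, worst case {interval_ms:.1f}ms")

# The imaginary 1Hz screen: half a second on average, a whole second at worst.
# At 120Hz the worst case is 8.3ms, half the 16.7ms of a 60Hz panel.
```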

Pixel response and panel type

In the context of all this speed, an inevitable question follows. How fast a panel in terms of pixel response do you need to render 120Hz? There isn’t a straightforward answer. On the one hand, with the right electronics, you could drive a very slow panel at 120Hz. On the other, the pixels wouldn’t update fast enough to keep up with the refresh rate, so blurring would ensue.

Eizo’s 240Hz gaming panel isn’t quite what it seems…

The simple maths says 120Hz is roughly equivalent to 8ms. And that’s a response time well inside a modern TN panel’s specs and many newer IPS panels. Well, it is in terms of so-called grey-to-grey response. I’m not sure there’s any IPS panel capable of fully switching from white to black in 8ms.
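
Here are the sums behind that claim, for anyone who wants them. This sketch treats the quoted response time as the whole story, which grey-to-grey figures rarely are:

```python
# A pixel transition must complete within one refresh interval,
# or it smears into the next frame.
def fits_within_refresh(refresh_hz, response_ms):
    return response_ms <= 1000 / refresh_hz

print(fits_within_refresh(120, 8))   # True: 8ms squeaks inside 120Hz's 8.33ms window
print(fits_within_refresh(144, 8))   # False: 144Hz allows only 6.94ms per frame
```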

The upshot of which is that TN is better suited to high refresh than IPS (and indeed VA panels). And thus high refresh monitors kicked off with TN tech and only very recently has the industry begun to add IPS in high refresh format (the Acer panel at the top of this post is IPS and 144Hz).

The other problem with high refresh, of course, is the load it puts on your graphics subsystem. Double the refresh means double the frame rate and double the work load, which is a big ask even at 1080p, let alone higher resolutions. Even with a seriously zippy GPU, achieving a consistent 120fps at maximum detail is a huge amount of rendering. Suddenly, you may have to choose between slick 120Hz refresh and full graphics detail.

Similarly, you may have a problem simply in terms of interface bandwidth with 120Hz-plus refresh. It’s more than most legacy HDMI ports on video cards can cope with at 1080p and it’s beyond the remit of dual-link DVI at 2,560 by 1,440. Safe to assume you’ll need a DisplayPort output.
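
Rough pixel-rate arithmetic shows why. Note this ignores blanking intervals, which push the real requirement another chunk higher:

```python
# Raw pixel rate demanded by a display mode, in megapixels per second.
def mpx_per_s(width, height, hz):
    return width * height * hz / 1e6

print(f"{mpx_per_s(1920, 1080, 120):.0f} Mpx/s")   # ~249 for 1080p at 120Hz
print(f"{mpx_per_s(2560, 1440, 120):.0f} Mpx/s")   # ~442 for 1440p at 120Hz

# Dual-link DVI tops out around 330 Mpx/s (2 x 165MHz links), so 1440p at
# 120Hz is beyond it; DisplayPort has the headroom.
```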

The sordid matter of money

On the other hand, you’re never actually worse off with 120Hz, and for me that’s the clinching argument. It’s gorgeous when it’s giving you all those frames. When it isn’t, you’re no worse off than before. Well, not unless you’ve chosen TN over IPS with your 120Hz-plus panel, in which case there has been a trade-off. However, now that high refresh IPS technology is appearing, I see no reason to turn it down other than cost. Acer’s 27-inch, IPS, 144Hz effort, the Predator XB270HU (due to arrive in March), is nearly £700 (a USD price eludes me). But then better tech usually costs more money.

Finally, a quick word on those 240Hz panels that have popped up recently. As far as I am aware, what they’re doing is frame insertion or doubling. They accept a 120Hz input and then show each frame twice. Theoretically there may be some advantage to this. It’s a common ruse with HDTVs where they really go to town, ramping up a 30Hz signal to 200Hz. But in my view and for reasons I don’t particularly feel the need to quantify, it’s basically bollocks.
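
Reduced to a sketch, the whole trick looks like this; the point being that no new information is created along the way:

```python
# "Pseudo 240Hz" frame doubling: accept a 120Hz stream, show each frame twice.
def double_frames(frames):
    doubled = []
    for frame in frames:
        doubled.extend([frame, frame])   # same frame, displayed twice
    return doubled

print(double_frames(["A", "B", "C"]))    # ['A', 'A', 'B', 'B', 'C', 'C']
```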

For the record, in case you’re concerned by their omission, we will get to subjects such as screen tear or frame syncing later on in this series.

103 Comments

  1. Asurmen says:

    There are patches for drivers that unlock the limited pixel clock both AMD and Nvidia force on us, giving dual-link DVI the bandwidth necessary for high res, high refresh monitors. I have to do it for my 1440p IPS monitor. Cons of the patches mainly revolve around breaking hardware decoding and acceleration of various codecs, for which there are workarounds.

    • Sakkura says:

      Or you could just go DisplayPort.

      • Asurmen says:

        Buying a new monitor for DP just sounds utterly pointless. I would like to point out the patching method is only required if you’re buying a monitor specifically to overclock.

        • Sakkura says:

          You could have gone DisplayPort when you bought your current monitor. Though you obviously didn’t know this was going to happen.

      • Retne says:

        DisplayPort is also very annoying (at least on my W8.1, Nvidia set-up) with multiple monitors, as it keeps sensing monitors that are asleep and moving all your windows / icons.

        I just want to turn that off! But can’t. ’cause MS or Nvidia, or someone knows best. Grrrr….

  2. JiminyJickers says:

    I just got the Acer one in the top screenshot, it is so good. The fast response and G-Sync made such a massive difference in games it is incredible. I can never go back to the old monitors again.

    • Faxanadu says:

      What’s it called and what is G-sync? :/

      Also, on topic: Doesn’t going 120Hz also require more from your computer?

      • Sakkura says:

        G-sync allows the monitor to synchronize its refresh rate to the framerate that the computer is capable of at any given point in time. It means you won’t have to deal with either tearing (when the monitor refreshes while only part of the next frame has been rendered, leaving a chunk of old frame next to a chunk of new frame, with a visible tear if there’s movement) or V-sync, which can increase input lag and which forces the graphics card to drop to 30 FPS if it dips even a bit below 60 FPS, leading to a noticeable stutter.

        AMD is introducing Freesync, which does largely the same thing. G-sync requires a special module in the monitor, which is an expensive solution and is only available on a few models so far. Freesync support (technically called Adaptive Sync on the monitor side) looks like it will be significantly cheaper and more widely available.

        • TacticalNuclearPenguin says:

          Let’s just not jump to the conclusion that they’re going to perform exactly the same, until both things are properly tested it’s hard to tell.

          A different alchemy of hardware/software support might be more than enough to change things, and I’d wager that whichever solution gets the most out of the “hardware” part of the equation is going to produce the better result.

          • Asurmen says:

            Post you’re replying to hasn’t jumped to that conclusion.

        • phuzz says:

          Just don’t, whatever you do, so much as hint that the nVidia way of doing things is any better or worse than the AMD method (or vice versa), or we’ll have a hundred rabid fanboys descending on us to have a flame war before you can say “oops, I forgot my asbestos undies”.

      • fish99 says:

        120Hz doesn’t require more, but 120Hz + 120 FPS does. There are other advantages, though, that are always present whether your framerate gets over 60 or not, like the framerate drop from v-sync almost completely vanishing with a 120Hz display (irrespective of framerate). And if you want to leave v-sync off, the tears you get are a lot less noticeable with a high refresh rate.

      • JiminyJickers says:

        The one I have is the Acer XB270H. It does 144Hz and it doesn’t use more processing power; it allows the monitor to display what your computer is capable of instead of being limited to 60 fps (assuming you use V-sync; I can’t handle screen tearing, so before G-sync I always enabled V-sync).

        As explained by Sakkura, G-Sync syncs the monitor with your graphics card instead of slowing down your computer when you enable V-sync. With G-sync there is no screen tearing and no mouse lag. It has made such a massive difference for me.

        • Gryz says:

          So you own an Acer XB270H? Is that monitor different from the Acer XB270HU? (There’s a U at the end.)

          You must be the first person in the world to have that monitor. It will be released next month. No idea if that is early March or late March. I have seen zero reviews of it on the web. Because no review-site has one yet. I also don’t think the XB270HU has been shown at any demonstration by Acer yet.

          Acer has said that the XB270HU will also support ULMB. I’m curious how well that will work, as the panel has 4ms g2g, while all other ULMB monitors use a TN panel with 1ms g2g. Did you try out ULMB yet? Anything you can tell us about it? Does it work well? Does it work as well as on a TN panel?

          • JiminyJickers says:

            The front of my monitor says XB270H and the box says Model: XB270H Version: XB270H Abprz. Don’t really know the difference from the one you mention, but mine is 1ms and supports ULMB. However, that only works at 85, 100, and 120Hz; since I run it at 144Hz I don’t use it.

          • Gryz says:

            Thanks for the reply.
            I should have googled a bit. Your XB270H does indeed exist. It seems to be a new model, released only 2 months ago. People seem to use the same picture (jpg) for both the XB270H and the XB270HU. The XB270H is another monitor.

            Your monitor is a TN panel, 27″, 1920×1080, with G-Sync and ULMB.
            The upcoming XB270HU is a whole other monitor.
            It’s an IPS panel, also 27″, 2560×1440, with G-Sync and ULMB.
            Your monitor costs (here in NL) roughly 350 euros. The XB270HU is expected to be double that price.

            Congrats on your new monitor. A 1080p monitor is actually much better for gaming than a higher res one.
            Still, I am waiting for the XB270HU. I wanna see the first IPS screen with G-Sync. I might buy it.

  3. Arren says:

    …even if intention is a misnomer in the context of evolution.

    I salute your attention to detail, Mr. Laird.

  4. montorsi says:

    I’m pretty happy with my 144hz BenQ. I set it down to 100hz because the image quality suffers a bit at 120 and 144, but still it’s nice not having to worry about tearing or any of that nonsense I had to deal with on a 60hz monitor.

    • frightlever says:

      Unless I’ve misunderstood this completely, aren’t you MORE likely to get tearing on a 100Hz display? ie your PC is less likely to be able to achieve 100 frames per second than it is 60 frames per second, with tearing being caused by the phase difference resulting between the FPS of the output and the Hz of the display.

      NB, I’m probably wrong!

      Anyway, I’ve never personally experienced flicker or screen tearing in a game because I’m blessed with terrible eyesight and just don’t notice it or something.

      • Jannn says:

        Indeed he/she is more likely to experience tearing on 100 Hz versus 144 Hz. You understood that very well. However, a separate issue is worse image quality. The sweet spot for him/her is 100 Hz.

      • Cleave says:

        You tend to get bad tearing when your actual frame rate is higher than the refresh rate of the display, which is why you get such bad tearing at > 60 fps on a standard monitor with v-sync turned off. You will still get tearing on a 120Hz monitor but it won’t be so bad. G-sync eliminates tearing completely (except when your frame rate is above 144) by matching the refresh rate to the frame rate.

      • Apocalypse says:

        The higher your display’s Hz, the less tearing. Tearing is an issue that mainly occurs when your game outputs more FPS than your monitor can show, because that is when a frame update is more likely to arrive in the middle of a refresh rather than in the wait time between refreshes. Having a game at ~70 fps with a 144Hz screen will give you nearly no tearing. Naturally g-sync and adaptive sync will reduce the tearing to zero without frame-rate reduction, and v-sync can solve the issue too.

        Reducing your screen to 100Hz to increase image quality should still be fine. The reason this actually helps with image quality is, by the way, touched on in the article: some screens overshoot their colour targets at their maximum refresh rate quite a lot, or don’t consistently reach the correct values, so it’s a nice trick to reduce the refresh to ‘just’ 100Hz. For most people the sweet spot is already reached at that refresh rate, and 120 or 144Hz would be hard to notice anyway. Personally I would be fine at 85Hz already.

        Another note: if you test this stuff for yourself, keep in mind that it’s all about fast movement. Slow games without much movement on screen will not benefit much at all, while even a window being dragged around the desktop stays readable at 120Hz, unlike at 60Hz, which is not smooth enough to keep your eyes focused on the ‘jumping’ text.
        If your eyes are stressed like mine from too much work with too-small fonts, you might actually consider a gaming monitor a real upgrade for your workspace. Ironic, isn’t it?

  5. steves says:

    Well, that was pretty interesting. I already knew about saccades, but microtremors were a new one.

    Things I have learned today: “Ocular microtremor can potentially help in the difficult diagnosis of brainstem death”

    And yeah, you can never go back. Although I find it kind of amusing that us cutting-edge hardware geeks have stopped worrying too much about MHz/GHz and gone back to mere triple-digit Hz!

    • welverin says:

      The microtremors are also important because your brain ignores any static image on your retina.

    • DeadlyDad says:

      In case anyone is interested in why microtremors are so important, check out this infographic by XKCD artist, Randall Munroe. The vision process requires WAY more cerebral bandwidth than you might think.

  6. nuno.leong says:

    If my graphics card pumps out an average of 70-80 fps, is there any advantage in getting a 120Hz monitor, or won’t it make a difference?

    • Lachlan1 says:

      In my experience it did, turning off AA will help you get higher still and, imo, the fluidity is worth it.

    • DanMan says:

      You’ll still get tearing at least, which kind of defeats the purpose.

      • joa says:

        You know you can fix this with V-sync? I always have it on – and I never have tearing.

        • zentropy says:

          I’ve always had it off and never noticed tearing. :P

        • DanMan says:

          But then it’ll drop to 60Hz, also nullifying his monitor’s extra capability.

          Tearing occurs whenever the game’s framerate isn’t in sync with the display’s refresh rate. You notice it most if the fps < Hz, but it also happens the other way round.

          You see, the game renders new frames as fast as it can. When one is finished, it is written to the screen buffer line by line. The graphics card and the monitor (without Adaptive Sync) have agreed on a certain, constant refresh rate. That doesn’t mean there’s currently a complete frame available in the buffer though, unless VSync is enabled. So part of the image that’s being sent to the monitor is still the one that came before, since the buffer is not flushed in between. That’s why you see tearing.
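
          A toy model of that race, if it helps (purely illustrative Python):

          ```python
          import random

          # Toy model of a tear: scanout sends rows top to bottom; if the renderer
          # swaps the buffer mid-scan, rows already sent came from the old frame
          # and rows below the swap point come from the new one.
          LINES = 1080
          swap_row = random.randrange(LINES)   # the moment the new frame arrives
          rows = ["old" if y < swap_row else "new" for y in range(LINES)]
          print(f"tear at row {swap_row}: old frame above, new frame below")
          ```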

      • Asurmen says:

        Why would you get tearing when the fps is lower than the refresh?

        • clumsyandshy says:

          Because you have not enabled v-sync, that is why.
          If you do have v-sync enabled you will instead have to wait for the next frame to be ready and thus instead experience stuttering.

  7. Urthman says:

    I have zero confidence that a writer knows anything about this subject when they seem to think movie frame rates have any relevance to frame rates for games.

    Hint: there’s a very significant difference in how the two kinds of images are generated.

    • Mungrul says:

      He used that as a starting point to explain how different refresh rates and frame rates are for gaming. Or did you not read the whole thing?

    • Lachlan1 says:

      It was to shoosh up the “but the human eye can’t see any more than x fps” crowd. He did it subtly and cleverly without directly telling them of their folly

      • TacticalNuclearPenguin says:

        I too think he was just being sarcastic.

        Then again, the real reason why 24-30 FPS in a movie looks sort of fine is still unaddressed. It’s claimed it’s due to motion blur, and that’s true to an extent, but the most important bit is that that fixed framerate is not just an “average” like in gaming; every frame follows the other at exactly the same time interval. That’s enough for our brains, not optimal but decent enough, but games can hardly afford this luxury.

        Plus there’s the matter that a monitor can only give out a discrete number of frames, whereas when you look at the real world you see perfect continuity, something very “analog” if you like, and our eyes are better suited to that kind of thing.

        • jrodman says:

          Amiga games with bang-on 50hz always felt much smoother to me than any of the competitors before or after, so I agree with this entirely.

          Nail the framerates down and I’ll just notice the game. Make them jump around and I’ll be annoyed about the stuttering.

          However, for the games I play (not high action), I’m very doubtful this 120Hz would actually improve the experience, though I might think it “looked nice” if I saw it.

        • joa says:

          I think the motion blur is pretty important – when you see film shot with a very high shutter speed it has a juddery, low-framerate effect.

          • TacticalNuclearPenguin says:

            Yeah, that still has its own role, I wasn’t trying to deny that!

    • Buuurr says:

      Point – missed. Read the article.

    • Timbrelaine says:

      Frame rates are possibly the way in which physical cameras and virtual renderers differ least. Increasing the fps of a camera gets you more samples of your subject, doing the same with a renderer gets you more samples of the simulation.

      • Urthman says:

        But the frames themselves are different. One frame from a camera captures all the light and motion during the time the shutter is open. A game frame only displays a motionless instant.

    • Jeremy Laird says:

      You can’t have read the post in full. For if you had, you would have grasped that the point of mentioning low frame rate video formats was to draw contrast with them. To say, in short, that they ultimately are not relevant.

      But never mind.

      • CaidKean says:

        Jeremy, any chance you’ll highlight the issue of eye-tracking-based motion blur that we face on LCDs, even on 120Hz ones? It might be nice to inform people that this is also something that newer high-end displays let you do away with by implementing strobed backlights (replicating the flicker of plasmas and CRTs to remove the motion blur caused by our eyes being fed insufficient data by a monitor whose light is always on).

        G-Sync enabled monitors feature this via ULMB (Ultra Low Motion Blur), for example. This was one of the most glaring changes when I made the transition from CRTs to LCDs, the fact that even at higher refresh rates you still have motion blur. Though it is naturally reduced considerably from 60Hz to 120Hz (if the framerate keeps up), since the eye is fed data faster. You’d still need something like 1000Hz monitors to have no motion blur without relying on flicker to reset your vision though, if my understanding is correct.

  8. jarowdowsky says:

    I gave the Asus VG248QE a try last year – it was incredibly smooth, just a revelation. However, I was really disappointed with the TN panel; I’m primarily a film buff, with gaming a close second.

    The headaches and vomiting whenever I moved kicked in after two days of use, so back it went to Amazon. My head finally felt right about a week later.

    Gonna assume it was due to the blinking in the backlight but jesus, never known pain like it outside of when I shattered a tooth open…

  9. skalpadda says:

    It’s also one of those things you can’t unsee. Once you’ve experienced high refresh, you won’t want to go back.

    Perhaps you won’t want to, but humans tend to get used and readjust to things very quickly. Colour television is great and all, but it’s not like we can no longer enjoy black and white films.

    • frightlever says:

      I can’t watch black and white movies. I have to watch them through an old piece of orange cellophane from a 70s bottle of Lucozade or I get sick.

  10. Mungrul says:

    Okay, I wanted to post about my bad experience with the Asus ROG, but the comments system seems to have eaten it, and whenever I try to post it again, it tells me it’s detected a duplicate comment >:E

    • Mungrul says:

      Let’s see if it likes me today:

      I’ve been using a Dell 2407WFP for seven years now.

      I bought an Asus 27″ PG278Q ROG Swift last week. It arrived on Saturday. I returned it on Wednesday.

      I was utterly appalled by the picture quality drop when compared to my old Dell. Don’t get me wrong, the high refresh, and more importantly, GSync were utterly lovely things, and my 970 was able to drive the monitor at full res.
      But I’d never used a TN monitor before, and like many others, I poo-pooed the idea that viewing angles matter when you’re sat that close to the monitor.
      The unfortunate truth of the matter is that they really do. Colours varied from the top to the bottom of the monitor, and visibly shifted with even the slightest head movement. And I was also shocked to discover just how bad colour reproduction was on this panel. No matter how hard I tried, I couldn’t get consistent colours, and it frequently looked washed out. On top of that, the damn thing was so bright it actually hurt, and I found myself with sore eyes after using it.
      For a monitor that cost me more than £600, I was quite frankly disgusted that it was demonstrably so poor when put next to my seven year old, £400 Dell.
      I can only think that people declaring this monitor the best gaming monitor out there must never have experienced anything other than TN.

      What do I take away from this rather expensive lesson?
      Well, my next monitor will definitely be 144Hz capable and feature GSync. The one thing the Asus did was sell me on that tech.
      Also, 27″ and 16:9 2560 x 1440 is a lovely format. I’ve gotten used to 16:10 thanks to my Dell, but the Asus proved that 16:9 at that size is more than workable.
      And I’ll avoid TN monitors like the plague. Utter, utter shite.

      The new Acer Predator XB270HU that’s on the way looks like it could be the IPS alternative to the Asus, and therefore something of a gamer’s Holy Grail, but I shall definitely wait for reviews before plonking down the cash after being burned so badly.

      Oh, and can we PLEASE get these manufacturers to stop using adolescent branding graphics on the boxes of these things? It’s rather embarrassing being forty+ and lugging one of these things through London when they look like they were designed by a Nineties Sega throwback. A manufacturer logo, model number and diagram of the monitor is all that’s needed.

  11. Lachlan1 says:

    But the human eye can’t see any more than 5 fps

    (yep, sarcasm)

  12. liquidsoap89 says:

    The price is the big issue for me. I have 3 Samsung monitors that I bought for about $150 each a few years ago. They look fine, and they’re 60hz. I would love to get that IPS Acer mentioned; but holy hell, that thing would probably cost at least $1000 Canadian. That’s a huge jump from $150.

    • Buuurr says:

      “I would love to get that IPS Acer mentioned; but holy hell, that thing would probably cost at least $1000 Canadian. That’s a huge jump from $150.”

      Yep, there are some major disadvantages when living in the colonies.

  13. Iskariot says:

    I have a 120 Hz monitor. And I feel you really do not need it at all. It will be no priority at all when I buy my next monitor.
    60 Hz is fine.

  14. DanMan says:

    I’m pretty happy with reliable 60fps in my games, which the 970 I got can supply at about max details. I don’t plan to mess with that setup any time soon, so I’m not that keen on higher refresh rates.

    I probably wouldn’t go beyond real 120 Hz either. It really gets hard to tell the difference from there.

  15. philosoma says:

    I really notice it when using TrackIR. It’s nice and fluid above 100fps; 60 feels a little sluggish now, so I have to drop a few settings to get me over 100 in some games. Yeah, the monitor was costly though.

  16. geldonyetich says:

    Personally, I already had a 120Hz monitor because I wanted to use 3D shutter glasses. That’s another point besides microtremors and smooth movement to consider, because stereographics are great… on the few games that actually support them. But one caveat to bear in mind: apparently shutter glasses produce a darker picture, and to compensate for that NVIDIA has come up with what they call “lightboost”. I wonder how many IPS monitors feature lightboost, if any? I wonder if the Oculus Rift is truly poised to make this all moot?

    • MattM says:

      I got those glasses, but found I couldn’t stand the 60hz flicker they caused.

  17. tomek says:

    The biggest advantage is having a 120Hz monitor WITH Lightboost on for gaming: link to blurbusters.com

    If you play FPS you will never go back to anything without it.

  18. PopeRatzo says:

    I have a dumb question: If I double the refresh rate, does my video card have to be twice as powerful to get the same quality, since it has to draw twice as many screens? I ask because I’m getting by with a mid-level video card, but I’m due to get a new monitor soon.

    • Person of Interest says:

      Yes, in fact I think the demands on your system might be even more severe than that, because if anything else is a bottleneck (CPU, etc) then that will also have to be upgraded. Unlike with a resolution boost, which basically depends on only the graphics card.

    • All is Well says:

      If I’ve understood things correctly (it’s a definite possibility that I haven’t, so someone please correct me if I’m wrong), you don’t need a more powerful GPU. Your GPU will always draw as many screens as it can, or at least as many as you let it, if you cap it. If it can draw 60 screens in a second, it will. These will be sent to the monitor which will display as many of them as it can, which will depend on the refresh rate – if the refresh rate is 60Hz, it will display all of the screens drawn by the GPU. If your GPU draws more than 60FPS, the “extra” frames won’t be displayed, since the screen can’t physically refresh the image that many times in a second. If you double the refresh rate, it will be able to display twice as many images per second, so the “extra” screens will now be shown. But if your GPU is still only putting out 60 FPS, then it will only display those 60 frames. Nothing will have changed.

      A simple example: your screen is probably a 60Hz screen. That does not mean that your GPU always has to draw 60 screens per second, right? Same with 120Hz screens – the only difference is the upper limit.
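
      Or, reduced to a one-liner (illustrative Python, not how any actual driver works):

      ```python
      # Distinct frames you actually see per second: the lesser of what the
      # GPU renders and what the panel refreshes.
      def frames_seen(gpu_fps, refresh_hz):
          return min(gpu_fps, refresh_hz)

      print(frames_seen(60, 60))    # 60
      print(frames_seen(60, 120))   # 60 - a 120Hz panel adds no GPU load by itself
      print(frames_seen(90, 60))    # 60 - the extra 30 frames are never shown whole
      ```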

      • All is Well says:

        I should clarify: If you want to actually see a difference, then yes, you will need more powerful hardware, because a mid-range card probably won’t put out 120 FPS on any higher quality settings. But simply doubling the refresh rate won’t put any extra strain on your system by itself.

        • Person of Interest says:

          Yes, sorry, I may have missed PopeRatzo’s point entirely. A system can just as easily render 60fps to a 120Hz screen as it can to a 60Hz screen.

          • All is Well says:

            After seeing your comment, I was worried I was the one who’d gotten it wrong! Hence the clarification that he might actually need something more powerful. But this way he’ll have an answer regardless of what he meant!

        • snv says:

          Also, you do not need to have more than 60 fps from your GFX card to have a benefit from a high frequency monitor.

          Think about vsync and tearing — the higher your monitor’s frequency, the less this is an issue (and the less of a benefit you can get from g-sync / freesync).

    • OmNomNom says:

      Well, really it depends how much ‘spare’ power your PC / GPU currently has. It probably won’t need twice as much power.

      There are also other things your GPU does that won’t necessarily require double power, such as rendering the textures.

  19. teddancin says:

    For people having a hard time visualizing the benefits of high refresh, think of it like this – your video card is essentially shooting out frames on a pulse, and they’re being caught on a different pulse. If you double the refresh rate of the “catch” pulse, you’re essentially making frame placement twice as accurate.

    TacticalNuclearPenguin wisely pointed out that game-frames aren’t as consistent as movie theatre-frames – a higher refresh benefits games over movies because it allows the placement of the rendered frame to be twice as “accurate” when it comes to render time.

    • neckro23 says:

      Speaking of movies, it only recently occurred to me that 120 Hz monitors (and 144 Hz ones) can show 24 fps movies with consistent frame lengths. 60 Hz panels can’t.
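
      The cadence arithmetic, roughly (a quick Python check):

      ```python
      # Refreshes per film frame for a 24fps movie at various refresh rates.
      for hz in (60, 120, 144):
          repeats = hz / 24
          cadence = "even cadence" if repeats == int(repeats) else "uneven pulldown"
          print(f"{hz:>3}Hz: {repeats} refreshes per film frame ({cadence})")

      # 60Hz gives 2.5 (judder); 120Hz a clean 5; 144Hz a clean 6.
      ```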

      • OmNomNom says:

        Yeah because of this people have gone to extreme measures in the past in terms of ‘split frame’ vsyncing their video to get rid of the occasional microstutter.

  20. Philopoemen says:

    So by my limited understanding, a high refresh rate is really only worth it if you’re rocking a top-of-the-line gfx card.

    The majority of the games I play are turn-based, and whilst Endless Legend looks prettier on my good rig than it does on my laptop, it’s not that much of a difference.

    I suppose the question is, what sort of games are benefiting from the higher refresh to make it worth it?

    • Geebs says:

      Well, if you go 120Hz, the menus will seem the same but your mouse pointer will look awesome.

    • Heavens says:

      Exactly my thoughts.
      I’m playing a lot of “fully-pimped” Skyrim lately and I’m usually cruising at 40-45 fps, and I haven’t seen any “benchmark” of sorts that shows how a refresh-rate upgrade will affect my experience when I’m not getting 120fps.

      Tl;dr:
      Is it worth it when I’m not playing at 120/144 fps? Because I probably won’t be on 2/3rds of the games I’m currently playing.

      • Caerphoto says:

        The way I understand it, yes, you will see an improvement when playing at any framerate that isn’t an exact multiple of 30 or 60. In your case, playing at 45 fps, since a regular 60Hz monitor doesn’t actually display at that rate you’re basically getting a sort of juddery alternating odd-even rate, which is worse than a rock-solid 30fps.

        It basically allows more accurate frame timing, showing each frame more precisely when it’s intended, rather than approximately. It’s sort of a brute-force G-Sync.
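
        Here’s the cadence worked out, assuming plain double-buffered v-sync (a rough Python sketch, not any actual driver logic):

        ```python
        import math

        # 45fps on a 60Hz panel with v-sync: each frame waits for the next
        # refresh, so on-screen durations alternate between one and two
        # refresh intervals - the judder, in numbers.
        refresh_ms, frame_ms = 1000 / 60, 1000 / 45
        shown_at = [math.ceil(i * frame_ms / refresh_ms) * refresh_ms for i in range(6)]
        print([round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])])
        # [33.3, 16.7, 16.7, 33.3, 16.7]
        ```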

    • OmNomNom says:

      Yeah, if you’re not playing fast moving games or you don’t think you have the hardware to support it, there is absolutely no reason to go for a high refresh screen.

  21. Foosnark says:

    Still worthwhile if you’re wearing bifocals, can’t honestly see much (or maybe any?) difference between regular TV and HD, run dual monitors at 1920×1200 and can’t fathom why anyone would want more, and hate 3D because it causes headaches?

    • MattM says:

      It can be; it improves the image/responsiveness in a different way than all those other things. Until they released 120Hz LCDs I used 60lb CRTs that I would find at estate sales, because 60Hz just felt too unresponsive.

  22. Juan Raigada says:

    While the science you quote is correct, there’s more than that going on in the eye. Specifically, the claim that a higher refresh rate peels away a layer of suspension of disbelief is, I think, badly explained. Or at least I think it works in the opposite way.

    While there are many reasons why 48fps in movies is frowned upon, one of them is that a higher frame rate does indeed destroy the suspension of disbelief (or peel a layer of it away). But if you take away the suspension of disbelief, whatever is on the other side had better be truly real. That is, the more “imperfect” an image is, within certain parameters, the more work the brain and eye have to do to process it, and thus the more they participate in the process of making that image “real”. Or in other words, suspension of disbelief is easier (or more effective) at lower framerates, lower resolutions, etc. 2D animators consistently limit their framerates for creative effect (and budgetary reasons too, but that’s a different issue).

    It is similar to the uncanny valley effect. Your eyes do not see or process the trickery, so you are more prone to see the other tricks. Yes, with higher refresh you are going to perceive more real objects on screen. The problem is that, because you are processing those objects as real, the brain will demand more “real” characteristics of them (something that game assets/interactions consistently fail to deliver yet). And if those more real qualities are not there, since there’s less suspension of disbelief, it’s easier for the brain to dismiss the diegesis (the same way you see the artificial nature of The Hobbit’s sets, for example, and fail to engage emotionally with the movie; that and all its other failings, of course, but this is an issue). For higher refresh/framerate to truly take away the need for suspension of disbelief, and for that to actually improve the experience, other technical advances are necessary, and I don’t think we are there yet.

    However, not everybody looks at games as fictional worlds, and not all games want to be experienced that way. Simulators are an example of a genre where the more fluid, less “fictional” feeling that higher refresh and framerate give is an undisputed advantage. There are more (a strategy game at super high refresh can feel like a “real” boardgame, for example, and benefit from it). But story-based AAA games (those where the suspension of disbelief is actually important) are a different matter.

    • El_Emmental says:

      I confirm it affects the suspension of disbelief in that way: a lower framerate gets our mind into a mode where we’re rebuilding and imagining what we’re seeing, while a higher framerate makes it harder for our mind to suspend that disbelief (we’re expecting an actual reality). The same applies to the pixel density of an object (depending on its distance from the POV).

      However, I’m not sure we’re as far from “there” as you suggest. We’re starting to see visual representation getting really close to reality in some situations; I believe we’re climbing the far side of the uncanny valley. But since it’s the most subjective experience ever (how we relate to and perceive what we define as reality), it will depend a lot on individual personal experience and context.

    • Asurmen says:

      I can’t say the article covers suspension of disbelief, nor does it have any relevance to computer games. No one is trying to believe what they’re seeing is real, outside of the Rift and friends, that is.

      • Juan Raigada says:

        From the article (as one of the reasons why increased refresh rate is better):

        “The same applies in-game. You have a sense of things actually moving, rather than images being rendered and animated frame by frame. In short, it peels away another layer separating you from a fuller suspension of disbelief in what you are looking at. ”

        Which is, as I explained, wrong. It doesn’t peel away a layer separating you from a fuller suspension of disbelief. It peels away the suspension itself (makes it harder, since it makes the eyes expect an actual reality). Whether that’s a good thing or not is debatable, of course.

        • Caerphoto says:

          I think it depends on the art style of the game. One that goes for a realistic look would probably have more problems than something more deliberately cartoony like World of Warcraft – at imperceptibly high resolution and framerate, that game’s going to end up looking like a ‘real-life cartoon’, rather than a poor imitation of real life.

        • DanMan says:

          Agreed. He got that wrong there.

          Nicely explained, thanks. I’ve always felt like the props in The Hobbit stuck out much more than in other movies. Good to see my suspicion confirmed that it’s due to the higher frame rate.

          Hence I’ve come to the conclusion that higher frame rates in movies only benefit content that is as non-fictional as possible, like documentaries. Or pr0n. ;p

    • Jeremy Laird says:

      I’m aware of the uncanny valley issue as regards frame rates. I’m not convinced higher frame rates necessarily put you in the valley as opposed to higher up the slope just before the dip.

  23. jingies says:

    “imagine a screen that only refreshed once a second”

    I don’t have to imagine it, I just have to go to work.

  24. TheSplund says:

    So a 120Hz refresh rate but a poor response time – 4ms for a £700 monitor? Nah, not for me.

  25. udat says:

    Is there a reason why the article doesn’t include the model name/number of the various monitors it discusses? e.g. the IPS and Gsync capable job at the top?

    Hell, if there was an affiliate link to Amazon, I might have just bought the thing straight from the article.

    • Alec Meer says:

      It’s not something we’ve thought about tbh, but presumably people would pile on and yell about ethics if we did.

    • Jeremy Laird says:

      Just to add to Alec’s comment, I’ve mentioned the model in previous articles, but having not seen it or tried it, I’m not overly keen on promoting / recommending it.

      • MercurialJack says:

        It kinda feels like putting a picture of the thing in your article is promoting/recommending it, if only by implication. I don’t see any further harm in sticking the model number in hover text or something, just for people’s reference if they want to do further research (like I did, and had to scour the comments section for a model number from someone else who just happened to know).

  26. DizzyCriminal says:

    Can I just ask: how are people using these monitors in terms of game settings?
    Can you carry on at 40-60 fps and get the benefit of having your frames rendered 2-3 times (which smooths the motion), or are people reducing image quality and increasing their framerate?
    I’m just asking because most people here will have modest/entry-level enthusiast PCs, running at 1080p with high settings and a ~60 frame ceiling on modern games.
    It sounds like a dumb question, but I doubt PC World will have any gear like this in to see for myself (like 21:9 monitors), so the best I’m going to get is people describing it to me.

  27. Shadowcat says:

    So despite being under the impression that I’m happy with my current monitor, I actually need a 120Hz monitor because it will cost me a lot of money, requires me to use a DisplayPort thingamy that I don’t have, requires me to buy a hulking great video card which can render games at 120fps (while probably negating the need for a heater in the room, and sounding like an aeroplane taking off), and finally because it will ruin me for all lesser monitors, despite the fact that I’ll probably still use them elsewhere on regular occasion.

    I’m pretty sure we have different definitions of “need” !

  28. Gordon Shock says:

    And here I thought that the previous article covered it well. My patience for hardware is running thin; I’m almost at the point where I would just flip a coin. Some words of wisdom here:

    link to ted.com

  29. avrus96 says:

    Just wanted to hop in and mention that Eizo’s 240 Hz display works by inserting a blank (black) frame between each of the regular frames, which update at a 120 Hz frequency. What this does is virtually eliminate perceived motion blur (sort of how the strobing backlight of 3D Vision monitors with LightBoost works). Article explaining how/why LightBoost works: link to blurbusters.com

    And from Eizo’s website:

    The EIZO “Turbo 240” function decreases motion blur by doubling the frames of the input signal and blinking the backlight.
    With backlight blinking, the monitor displays the screen like an impulse-type display.

    • OmNomNom says:

      Unfortunately it does have its own ghosting issues; it is definitely not comparable to a high refresh TN panel.

      Source: I own FG2421 and PG278Q

  30. drewski says:

    While I totally appreciate the benefits of high refresh rates, ultimately I’d rather have an IPS panel and a cheaper price over a fast TN.

  31. tonicer says:

    My 144Hz monitor (Benq XL2720Z) is the best purchase I ever made for my PC… it even tops the insane speed boost my system got from my first SSD. Going from 60Hz to 144Hz changes every game; it doesn’t even matter how many FPS I get, even at <144FPS it feels vastly different.

    Only con: going back to a 60Hz display (on my Windows tablet, for example) is really terrible. I can’t comprehend how “videogameconsolepeople” deal with the low refresh rates. (Consoles don’t support more than 60Hz afaik.)

  32. mactier says:

    As long as you don’t propose that I need retina resolution or a super-big format. The whole reason for their existence is that most users don’t know about LUTs, which at over 8 bit (real, not simulated) make ordinary screens look way more detailed than common ones; it’s really the only way to make them look detailed/authentic. I have two screens, one with over 8 bit, the other 6 bit with 8 bit simulated (perhaps the most common format). The over-8-bit one looks fairly gigantic; the 6 bit one, well, looks just like a small-ish screen. They’re the same size.
    (Yes, this is a fairly random comment, and therefore is likely to be “punished” by default, but it’s true nonetheless.)

  33. FabriciusRex says:

    I’ve had the Dell U2711 since 2011. It’s a 2560×1440@60hz ips. Recently I got the gtx 980 and finally I have a gpu that can refresh all the pixels at a resonable rate. Can’t think of many games that would benifit from higher refresh on a single gpu setup at my res.