Week In Tech: Overclock Your Monitor With NVIDIA

By Jeremy Laird on March 4th, 2013 at 8:00 pm.

A high quality LCD panel. Or high refresh rates. Take your pick. Because you can’t have both. Well, not unless you think BadgerJump Monitors (or whatever they’re called) sounds like a sensible consumer electronics brand and you’re thus willing to roll the dice on a dodgy Korean panel off eBay. But wait. One of the footnotes to NVIDIA’s recent Titan graphics card launch is a new monitor overclocking feature. Yup, monitor overclocking. But will it give you 120Hz for free? Will it fry your panel? Do you need NVIDIA’s £800 Titan? Should you actually care about high refresh? I’ve got the answers…

First up, if you’re not interested in chewing over the broader subject of high refresh and you’re keen to overclock the twangers off your monitor immediately, skip to the last third and the ‘hands on’ bit. There, I explain what you need, where to download it and my experiences so far. The simple version is that if you’ve any kind of NVIDIA graphics card, you’ll probably be able to have a go.

Anywho, monitor refresh rates. They’ve always been a bit of a brain ache. Back in ye olde days of CRTs, you had to play off resolution and refresh. The higher the resolution, the lower the refresh was the general rule.

CRTs generate an image line by line, of course. So higher refresh makes for a more stable, less flickery image. Easier on the eye. Less chance of a headache. Plenty of obvious benefits.


Remember when flat CRTs were the bomb?

With LCD panels, none of that applies. In really simple terms, you can think of an LCD panel as being always on. The image isn’t generated line by line. Instead, every pixel is simply updated at a given frequency or refresh rate. Even if you reduced the refresh rate to 1Hz, there would be no flicker. You’d just have seriously crappy frame rates.

In truth, it doesn’t work quite like that. But that’s close enough to the reality for argument’s sake. Anyway, the point is that flicker isn’t an issue on LCDs. But frame rates are. It’s at this point that the science of human sight enters the equation and I have to admit my limitations. Or at least frustrations. It’s a subject about which I’ve always found the science a little unsatisfactory.

To get an idea of what I’m talking about, let’s have a think about the various video formats around today. Take movies, for instance. Until recently, the standard frame rate (or effectively refresh rate, though actual projection rate or shutter speeds vary) for a movie was 24 frames per second. That’s enough, I think you’d agree, for what looks like smooth, natural motion when you’re plugged into a pew at the cinema.

However, if you’ve suffered through the impenetrable tedium that is The Hobbit in High Frame Rate (HFR) format, you’ll know that doubling the frame rate to 48 frames per second makes an enormous difference to the look and feel of motion. One can argue the toss over the question of whether HFR looks better. But clearly the human eye and mind can distinguish 24fps from 48fps.


Bloody awful: The Hobbit in HFR. Not sure about the frame rate, either.

Now, consider that the standard refresh rate for a flat panel PC monitor is 60Hz or effectively 60 frames per second. Significantly higher than the new HFR movie format, then. And you might think high enough for completely fluid motion.

That’s pretty much what I thought until fairly recently. I used to assume the only benefit to rendering games above 60fps was that it gave you more headroom for those occasional frame rate troughs. Rendering well above 60fps on average, in other words, makes it less likely you’ll drop significantly below 60fps at any given moment.

Given that all LCD monitors were limited to 60Hz, that was literally true. But I didn’t give any credence to the idea of an upside to a monitor capable of higher refresh rates.

Then NVIDIA rolled out its 3D Vision tech, requiring monitors capable of delivering 60Hz to each eye and thus 120Hz overall. And I had an immediate epiphany. Getting a precise handle on exactly where the threshold is for the human eye and brain in terms of frame rates is tricky. No doubt it varies, anyway. We’re analogue beasts, not digital droids.

But I can tell you this for sure. Even just pushing windows around on the desktop, the difference between 60Hz and 120Hz is obvious.


3D gaming according to NVIDIA. Uh huh.

That said, I’m not sure I can tell the difference between 100Hz and 120Hz. A further complicating factor is motion blur. It’s this that allows a 24fps movie to seem smooth, which on the face of it doesn’t make much sense in the context of our ability to distinguish 60fps from 100fps.

Anyway, the most important point is that on the desktop, in games – basically, on your PC – 100+Hz is bloody lovely. I can’t stress that enough. High refresh rates make your PC feel much more responsive and look much slicker. To coin an Alan Dexter-ism (you know who you are!), high refresh makes all your games look awesome. The only snag is that you’ll need a video board that can feed all those frames.

Well, that and the fact that, currently, high refresh rate monitors are limited to TN-type panel technology. Yes, a boutique industry has popped up involving 27-inch IPS Korean monitors capable of 100+Hz. But in the real world, high refresh monitors are TN. And TN has the worst image quality by every metric save response times.

That it’s the quickest panel type makes it a natural fit with high refresh rates. But the latest IPS panels are pretty nippy, too. And high-end HDTVs now typically offer refresh rates of 100Hz and beyond without using TN panels.

The hands-on bit:

Overall, I doubt there’s any good technical reason why you can’t buy an IPS 120Hz screen. It’s just that none of the big boys have had the balls to try it so far. But can you make your own? Now, that’s an intriguing question.

When I first heard about NVIDIA’s monitor overclocking, it was supposedly limited to the new Titan board and was thus irrelevant. But no. It works with a broad range of NVIDIA graphics cards. I’ve tested a current GTX 680 and an ancient 7900 GTX.


Monitor overclocking arrives with Titan. Works with older NVIDIA boards, too.

The latter dates from 2006 and works fine with the monitor overclocking tool. So, I’m going to assume anything newer will be dandy. Software-wise, NVIDIA isn’t providing the overclocking tool directly. It comes from board partners. I’ve been using the EVGA Pixel Clock tool, which works with non-EVGA cards. You can download it here.
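
For the curious, what’s going on under the bonnet is simply that the driver is asked to apply a display mode with a higher frequency than the monitor officially reports. Here’s a rough sketch of the same test-then-apply idea using the stock Win32 display API. To be clear, this is not what EVGA’s tool does internally, and Windows will generally only accept modes the driver already advertises, which is exactly why the vendor tool exists. Treat it as an illustration rather than an overclocker.

    /* Rough sketch only: ask Windows for a slightly higher refresh rate on
     * the primary display. Not the EVGA tool's method; the driver will
     * usually only accept modes it already advertises.
     * Build with: cl refresh.c user32.lib */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm = { 0 };
        dm.dmSize = sizeof(dm);

        /* Read the current mode of the primary display. */
        if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
            fprintf(stderr, "Could not read the current display mode\n");
            return 1;
        }
        printf("Current refresh: %lu Hz\n", dm.dmDisplayFrequency);

        /* Move up in very small steps. */
        dm.dmDisplayFrequency += 2;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

        /* CDS_TEST asks the driver whether the mode is acceptable
         * without actually switching to it. */
        if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
            fprintf(stderr, "Driver rejected %lu Hz\n", dm.dmDisplayFrequency);
            return 1;
        }

        /* CDS_FULLSCREEN applies the mode temporarily, so nothing is
         * written to the registry and a reboot puts everything back. */
        if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL)
            printf("Now running at %lu Hz\n", dm.dmDisplayFrequency);
        return 0;
    }

The CDS_TEST step is the bit worth copying: ask first, apply second, and keep the change temporary so it can’t outlive a mistake.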

For the record, I tested it with an AMD graphics board and no dice. It simply throws up an error decrying a missing NVIDIA API.

The real question is monitor compatibility. I’ve tested four monitors with variable results. Most disappointing are my Dell 3007WFP and 3007WFP-HC. Neither would run even 1Hz higher than 60Hz. Bummer.

Next up is my Samsung XL30. That will run up to 72Hz, but behaves oddly at that refresh. The fastest stable refresh it supports is 70Hz.

In many ways, the most interesting test subject is my Dell U2711. That’s a 27-incher with a modern IPS panel and lots of inputs. It’s exactly the sort of monitor I’d want to be overclocking for games.


EVGA’s tool is pretty much idiot proof.

Unfortunately, I found it essentially doesn’t overclock at all. I tested up to 80Hz and it will render an image. But at any setting above 60Hz, the frame rate is jerky and stuttery. Something odd is going on with the image processing.

If that’s disappointing, what’s interesting is that I reckon I can feel the difference at 70Hz on the XL30. It’s noticeably smoother. Reading around, it looks like 85Hz or thereabouts is probably where the maximum subjective smoothness kicks in, so you don’t need to achieve 100+Hz to get a tangible benefit from monitor overclocking.
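
If you want to check a panel really is refreshing faster, rather than just reporting a bigger number, the simplest software check I know of is to time vsync-locked buffer swaps. Below is a rough sketch using GLFW and OpenGL. It assumes GLFW is installed and that the driver actually honours the swap interval, so force vsync on in the driver control panel if the figure looks suspicious.

    /* Rough sketch: time vsynced buffer swaps to estimate the real refresh
     * rate. Assumes GLFW is installed and the driver honours swap interval 1.
     * Build (Linux-ish): cc refresh_check.c -lglfw -lGL */
    #include <stdio.h>
    #include <GLFW/glfw3.h>

    int main(void)
    {
        if (!glfwInit())
            return 1;

        GLFWwindow *win = glfwCreateWindow(640, 480, "refresh check", NULL, NULL);
        if (!win) { glfwTerminate(); return 1; }

        glfwMakeContextCurrent(win);
        glfwSwapInterval(1);               /* wait for vsync on every swap */

        const int frames = 300;
        double start = glfwGetTime();
        for (int i = 0; i < frames; i++) {
            glfwSwapBuffers(win);          /* each swap waits one refresh */
            glfwPollEvents();
        }
        double elapsed = glfwGetTime() - start;

        printf("~%.1f Hz effective refresh\n", frames / elapsed);

        glfwDestroyWindow(win);
        glfwTerminate();
        return 0;
    }

Run it before and after overclocking; if the figure doesn’t budge, the panel is probably dropping or repeating frames rather than genuinely refreshing faster.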

The proviso to all this involves the unknown risk to your hardware. My understanding is that it’s actually pretty safe. But the usual small print applies. Move up in very small steps. And it’s all at your own risk.
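
As for what you’re actually pushing up: refresh rate is effectively a multiplier on the pixel clock the graphics card sends down the cable, which is why the numbers climb quickly and why the monitor’s electronics might eventually object. A back-of-envelope sketch, using rough, made-up reduced-blanking totals rather than any real monitor’s timings:

    /* Back-of-envelope sketch: approximate pixel clock for a 2560x1440 mode
     * at various refresh rates. The blanking totals are rough guesses, not
     * real timings for any particular monitor. */
    #include <stdio.h>

    int main(void)
    {
        const double h_total = 2560 + 160;    /* visible width plus assumed blanking  */
        const double v_total = 1440 + 40;     /* visible height plus assumed blanking */
        const double dual_link_dvi_mhz = 330; /* dual-link DVI limit: 2 x 165 MHz     */

        for (int hz = 60; hz <= 120; hz += 20) {
            double clock_mhz = h_total * v_total * hz / 1e6;
            printf("%3d Hz -> ~%5.0f MHz pixel clock%s\n", hz, clock_mhz,
                   clock_mhz > dual_link_dvi_mhz ? " (beyond dual-link DVI spec)" : "");
        }
        return 0;
    }

On made-up figures like those, a 2560×1440 panel pushed much past 80Hz is already being asked to run beyond the dual-link DVI spec, which is a decent intuition for why the risk isn’t zero.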

Still, something for nothing is always nice. I’ll probably be running my XL30 at 70Hz from here on in.

I’d also be very interested to hear how any of you guys get on overclocking your panels. Good luck!


107 Comments

  1. LlamaNL says:

    “But I can tell you this for sure. Even just pushing windows around on the desktop, the difference between 60Hz and 120Hz is obvious.”

    yeah… no… Windows DWM runs at a set 22 frames per second regardless of your monitor refresh rate. The only way around this is turning off Aero and that causes a huge amount of visual distortions.

    • Sparkasaurusmex says:

      Is disabling Aero simply choosing a theme that isn’t an “Aero theme?”
      I don’t notice a huge amount of visual distortions doing that.

      • LlamaNL says:

        In Windows 7 / Vista, disabling Aero turns off hardware rendering of the desktop. In Windows 8 you can’t turn it off at all.

        • HothMonster says:

          Really? I know in win7 aero runs in directx9 so you have to turn it off if you want a second display to be visible while playing directx11 games. Are you telling me that is impossible with windows8?

        • Mctittles says:

          I thought Windows 8 didn’t have any “Aero” effects on windows?

          • Malcolm says:

            It doesn’t have frame transparency or the “glass” effect (although the taskbar remains slightly transparent for some reason). It’s still hardware accelerated though.

    • TheManko says:

      What do you mean? That you wouldn’t notice a difference in smoothness while dragging windows around on the desktop? Because I just set my monitor to 24hz and it looks jerky as shit compared to 60hz. I assume you’re referring to something else.

    • iainl says:

      Really? Since I’ve not met -anyone- who runs their monitor at an exact multiple of 22Hz that must cause all sorts of horrible judder.

      Which is also why I would never clock my monitor to 70Hz; video is going to drop frames in a complete mess. Get to 72Hz and you’re the boss, obv.

    • Kohlrabi says:

      Stop spreading FUD, please.

    • Premium User Badge Llewyn says:

      Synchronization is likely the issue, in that case. Assuming* the window manager is only updating at 22Hz but the monitor is refreshing at 60Hz, the monitor won’t display the window updates at a constant frequency – there will be an almost imperceptible** jerkiness to the updates as displayed on the monitor.

      At 70Hz it will of course also be permanently desynchronized, but as the monitor frame rate increases the mean, and maximum, variance from the ‘correct’ timing will reduce. I suspect, but have already had too much wine to be able to do simple maths without writing it down and hence might be very wrong, that the 85Hz threshold is actually at 88Hz, where there will be exactly four monitor frames per DWM frame. Above this threshold the variance will of course always be lower than it will at corresponding points in the cycle below it.

      *I have no idea, so I’m happy to take your confidence at face value.
      **Or blatant, or completely imperceptible, mileage appears to vary here.

    • phuzz says:

      What complete nonsense.
      The framerate of the desktop is not capped to 22Hz.

    • jalf says:

      [citation needed]

      Or we could, you know, just look at the screen and see that it’s obviously redrawing everything at a much faster rate.

      Heck, we could even load up a game. Remember that if the game runs in windowed mode, it goes through the same graphics pipeline as all other windows. Which means that it should be updated 22 times per second, according to you. Now, I’m pretty sure I’d notice it if my games ran at 22hz.

      Perhaps *something* in Windows is only redrawn 22 times per second, but all of it? Er…. No.

    • Solidstate89 says:

      No it doesn’t, that is complete bullshit.

    • KayinAngel says:

      Then why is fraps telling me DWM caps at 60 fps when I move a window? Could it be that you don’t know what you are talking about?

    • Old Rusty Dusty says:

      This is simply incorrect — I have one of the 120Hz Korean IPS Panels, and can tell you for a fact that the difference between running Windows 7 Aero at 60Hz vs 120Hz is simply staggering. There’s more frames rendered and dragging a window around at 120 vs 60 yields a liquid-like smoothness that’s easy on the eyes.

      I hate to say it, but you really can’t have any opinion on 120Hz vs 60Hz unless you’ve actually used a true 120Hz panel itself.

      • dontnormally says:

        …I think it goes without saying that anyone who might read the comments here would be very, very interested to hear which make and model of monitor you purchased.

        It’s like you’re saying: “I’ve reached the goal outlined but not-reached-by the author of this article, however I shall keep the details of my exploits private, thank you”.

    • Rian Snuff says:

      That was painful to read. I actually thought to myself while reading this article “Great, maybe nutters will stop with this ‘THE HUMAN EYE CAN’T EVEN SEE DAT MANY FRAMES!’ nonsense”… Then here comes a new one I’ve never even heard before, lol.. Dude, anyone with a 120hz monitor doesn’t even NEED to have a frame counter up.. YOU CAN SEE THE DIFFERENCE. Just by moving your mouse you can literally count that it draws it twice as many times with your naked eye and instantly feel the difference.

      Okay, so please just try it someday before you call bullshit. Ha.

    • SirProudNoob says:

      Um, no… I have a 120 Hz monitor and EVERYTHING is smooth.

  2. Sparkasaurusmex says:

    My 120hz TV is sort of weird.
    It tries to make anything work at a higher frame rate (?) I guess that’s a standard thing for these?
    But because it’s interpolating those extra frames (or whatever) it has a second delay, so if you want to play a game on it you have to turn on “game mode” so it doesn’t try to upscale the frame rate.

    I can push some older games to 120 frames per second, but I can’t seem to get them to actually show up like that on the TV. Any insight offered is appreciated.

    • colw00t says:

      What is giving you trouble is something that my Panasonic calls a “motion smoother” which seems to be just a quick and dirty interpolation so as to force everything up to the desired refresh rate.

      I turned mine off about ten minutes after getting the TV. I can’t stand it, especially when dealing with film, which is designed and shot around a specific display rate.

      There should be an option in the settings somewhere to just turn it off outright, which I personally would do. Some sources can use the 120Hz, so let them. Don’t force stuff into speeds it’s not meant for.

    • TheManko says:

      As far as I know none of the current 120hz or higher TVs out there support 120hz input. Only 1080P 60hz max, so the “120hz” is only post processing, not actual 120hz rendering like with 3D computer monitors. It’s because of HDMI bandwidth limitations I believe.

    • Sparkasaurusmex says:

      Catalyst Control Center …er Vision Control thingy reports 70hz from the TV. It’s either because the TV is early generation 120hz and Insignia brand, or something to do with Catalyst. maybe

    • Dave3d says:

      Your tv is not true 120hz.
      It is 60hz, with emulation.
      It will basically do 1 frame, then the same frame again, for a ‘smoothing’ effect.
      Some monitors do this also.

      As far as I know, only monitors that say ‘true 120hz’, and plasma tv’s (old plasma’s actually were 600hz, and newer ones are much higher hz. This is why there was minimal to no ‘ghosting’ on plasma’s, vs lcd’s), are over 120hz or more.

      • rennex says:

        That 600Hz was / is also marketing BS. It may be doing SOMETHING at 600Hz, but refreshing the whole image is not it. Remember CRTs running at a smooth 100Hz? Now imagine a CRT running at 50 or 60Hz, and that’s what all plasmas still look like. I can walk into a store and spot all the plasma TVs at a glance, because they’re the ones that are flickering and suffer from rainbow ghosting.

        • Skydancer says:

          Indeed. Those are 600 scans per second, this means for a 60hz image refresh, it simply scans each frame ten times.

    • Rian Snuff says:

      It’s not true 120hz, what it does is double each frame, shift it over and blur it to give the effect of smoother video. Which requires processing, which requires time (lag) which of course results in the input delay. A lot of people don’t understand this. If you plug in your PC to it and go to the monitor settings you’ll in fact see that it’s locked to a true 60hz.

      Them sneaky buggers!

      Oops, I didn’t notice someone already said this.
      Glad the world is getting it now.
      So yes, it’s better to just buy a cheaper 60hz one 90% of the time if it’s an HDTV.

      About plasmas, however: the image will still be visibly smoother in my experience, but the input is still limited to 60hz most of the time on any I’ve tried. Like the Quatro Pros.. Even if you turn off vsync.. The cable you’ll likely be using, and/or the input is still limited to 60hz.. (I never did try to get a true 3D one..) I’m not really sure what this means in the end of things, but if someone could explain more about that I’d love to read it..

  3. SuperNashwanPower says:

    Will it make any difference if its a laptop monitor? I have a Medion Erazer x6813 gaming laptop with GTX460M, so it will be an MSI screen (1920×1080, TN panel). The monitor is not very good, but will it blow up and cause a black hole to appear in my living room?

    • bernieH says:

      Just tried this on my laptop monitor and went from 60Hz to 100hz stable.
      Seems pretty awesome

      • SuperNashwanPower says:

        Yeah mine goes up to about 100Hz as well, but I’ll be damned if I can see any difference …

        • Old Rusty Dusty says:

          Powerstrip will let you test your monitor to confirm if it’s actually running at 100Hz.. Also, in terms of gaming you will want to make sure that the game you’re playing is actually running at 100Hz. Some games will default back to 60Hz and require minor tweaking to get running correctly. You will definitely notice a difference if it’s actually running at 100Hz vs 60. I’d say more so in games than windows, but I can still tell the difference.

          • Jon M. Kelley says:

            My eyes still burn out trying to use the new flat LED/LCD/Lwhatever low resolution screens for 10 hours a day. Luckily I still have a 2048×1536 @85Hz CRT at work as my primary screen (& 2 at home). Yes, they are old enough that they cost over 1/2 a grand each new, and the last one I had repaired cost almost that much to get working again, but I wouldn’t trade them for anything I’ve seen on the market for the last (lost?) decade. Supposedly with the right marketing, you can sell anything to anybody, including mercury filled twisty bulbs.

  4. Kapouille says:

    Back in the day of CRTs, 85Hz was the sweet spot beyond which the (my) eye would not distinguish the flickering anymore. I’d expect the ideal framerate to be around those figures.

    • colw00t says:

      Unless you’re dealing with a recording.

    • Aninhumer says:

      I’m not sure the rate at which you stop perceiving flicker is necessarily the same as the rate you stop perceiving differences in smoothness. They’re kind of different phenomena.

    • PedroBraz says:

      +1 to that. However, 100Hz was/is my preferred rate for a CRT

    • Mctittles says:

      As a long time CRT user, I find that (like all things dealing with perception) 85hz seems the sweet spot until months of use and then it is noticeable. I imagine the upper limit for perception could be very, very high to someone acclimated to a certain hz.

    • Grey Poupon says:

      I’m probably one of the only ones still using a CRT. Would just have to throw a thousand bucks at a monitor to get a better TFT than this. But at least my PJ is a 120hz LCD.

      While the CRT is bulky as hell, weighing in at around 40kg, I do still love it. No native resolutions or refresh rates or poor viewing angles or slow response times. I really do prefer CRT as a technology. A shame I’m in an extreme minority.

      • jrodman says:

        Well, the masks on CRTs are designed to work best at certain resolutions, but thankfully they ‘degrade’ extremely gracefully.

        The sharpness tends to be slightly worse, and gets worse with time (so if your monitor is as old as I think it’s probably not so sharp). But they certainly tend to have better color and “refresh”.

        But yeah they’re going out of date. Soon it will be prohibitively expensive to buy one.

        • Grey Poupon says:

          The sharpness does take some toying around with the settings at times as they “float around” a bit, but the difference isn’t really noticeable when it’s set up right. And considering it’s always sharper at every other resolution but the LCD’s native res, I’d rather reduce the resolution than the eye candy effects. But yeah, the monitor is old as hell. Still able to push out 2048 x 1536 @ 75hz which is something the flats are rarely capable of.

          Since the bulkiness doesn’t bother me much, I mean, what would I need more space behind my monitor for, I find it hard to consider an LCD an upgrade, unless it’s a 30″ IPS. And those cost around 700 bucks from Korea. If the CRT tech would still be developed and manufactured for monitors, I’d probably be able to get one sick monitor for that kind of money. Damn you progress and other people. Everyone should just be a clone of me.

    • slight says:

      Not really.

      Realtime graphics don’t have any real motion blur, so if you imagine a ball moving right across the screen in one second, at 24fps the ball will be drawn in 24 positions, just a clean circle. A camera would capture 24 swooshes of movement which the eye blends together into one big movement because the edges touch. Now at 60fps you get 60 balls drawn in that time and their positions might overlap, meaning it looks more like that motion blur to the eye. Suppose the eye was able to capture 30 fps (it doesn’t have a framerate but humour me), and your monitor / GPU are displaying 60 fps; each frame that the eye saw would actually show the ball in two positions per ‘frame’. Now imagine it’s 600 fps and you get the ball at 20 positions per frame; it starts to look like real motion blur (and that’s how true motion blur is actually generated in movie CGI, you just render multiple frames per output frame and overlay them on top of each other).

      So for computer games, desktops etc, you could go to very high framerates indeed and still see a noticeable difference, it’s just that you get rapidly diminishing returns.

      So next time someone tells you you only need 24 or 30 fps for games…

      (Ranted from my phone, I think my thumbs are going to fall off)

      • slight says:

        Sorry I should clarify that a movie camera shooting at 24fps only normally opens its shutter for 1/48th of a second per frame, so the motion blur from frame to frame wont perfectly match up, but it’s still better than none.

    • Rian Snuff says:

      85 is my sweet spot where I’ll never get eye fatigue or headaches.

  5. ShineyBlueShoes says:

    This seems like something that’s going to greatly vary from monitor to monitor, in terms of capability, since some monitors have those fancy extra bits in them that actually further process the image before displaying it and some just throw up what the gpu spits out.

    Regardless having worked in electronics stores for many years I just do not like the look of anything over 60hz personally. It actually seems too smooth and makes the fake bits stand out as fake to my, rather poor quality, eyes. Maybe it’s just the video feed those stores ran for their displays or the way the TVs themselves handle the interpolation (which is how these displays all handled 120hz+ last I paid attention) but it’s not for me.

    • Tanksenior says:

      You really shouldn’t be comparing TV screens to screens designed for use with a computer though, there are major differences.

    • Baines says:

      With a computer monitor, everything is “fake bits”.

      You don’t get the clash of mixing realities, like live actors on a half man-made set interacting with purely CG creatures, all converted to a framerate that isn’t a direct multiple of the source material.

    • Snargelfargen says:

      Yeah, it’s surprising how much post-processing TVs do these days. Computer monitors don’t have any of that cruft; no interpolation, no upscaling, no sharpening algorithms. Without all that input lag, a fancy refresh rate does look pretty good, provided the computer is up to the job of pushing enough frames.

      On that note, 27″ and larger monitors are getting cheaper and cheaper. PC gaming on a 40″ TV used to seem like a sweet deal, but the rough image quality and lag are annoying.

  6. Inverselaw says:

    Honestly it’s really hard for me to tell when a game is running at more than 45 fps, though a friend of mine often complains that I’m insensitive to visual lag and stuttering (isn’t that a good thing?).

    For me what’s ideal is a game that maxes itself at 60 Hz and a graphics card smart enough to throttle itself when the game is light enough.

    • Wisq says:

      Just enable vsync and you should get exactly that effect, I believe.

    • phuzz says:

      I’m going to assume the maximum frame rate you can discern varies from person to person.

    • Cytrom says:

      Being insensitive to low framerate = your brain is missing information.

      For example, how could you perform a quick railgun flickshot in quake, if you cannot even see most of the frames in the move just the beginning and the end (which is pretty much the case if you were capped to 45 fps). Any experienced or pro player of fast paced games can clearly see the difference between 60 and 90 FPS, some even between 90 and 120 fps.

      The faster the movement, the easier it is to spot a low framerate. The FPS difference is not so apparent in an RTS or other games with fixed camera angles and relatively little movement on screen; a PnC adventure game could run perfectly smoothly even at 5-10 FPS, for example.

  7. Brosepholis says:

    I think possibly the subjective improvement reported here has more in common with that of the audiophile, one of whom once told me his complicated and expensive analogue filter “removed the sines from the signal because the cosines sound better”.

    • colw00t says:

      Try this one on:

      ” What is the Blackbody?

      The Blackbody is a high-tech audio accessory which greatly enhances your audio playback experience by addressing the interaction of your audio gear’s circuitry with ambient electromagnetic phenomena and modifying this interplay. The Blackbody takes advantage of the quantum nature of particle interaction, and is therefore able to permeate metal, plastic, wood, and other barriers to affect the circuitry inside your components. This altered electromagnetic influence results in profoundly improved sound quality. ”

      $959 each and they recommend you get three.

      • Sparkasaurusmex says:

        Ambient electromagnetic phenomena interplay modified by taking advantage of quantum particle interaction?
        Hell yeah I want five!

      • dontnormally says:

        Thank you for this. Oh man.

    • Aninhumer says:

      While I haven’t seen a 120Hz monitor myself, I think writing off any difference as placebo without seeing it for yourself is kind of silly. Obviously there is going to be a refresh rate at which humans can no longer perceive a difference, and no doubt there will be people who claim they can see a difference beyond it, but I don’t see any reason to think 60Hz is the maximum.

      • Baines says:

        I remember people adamantly writing off people being able to distinguish a difference between 30 and 60 fps.

        And then it was adamantly writing off people being able to take advantage of a 60 fps game speed, despite years of fighting game players doing exactly that.

      • Rian Snuff says:

        Thank you so much for being intelligent and subjective.
        Lol.

    • iucounu says:

      My dad told me a story (probably apocryphal) of an audiophile he knew back in the days of vinyl LPs, who had built, at tremendous expense, an audio set-up that he claimed removed all extraneous hiss from playback. A friend suggested that a way to test this would be to have a completely blank record pressed, to see if it would play silently. Intrigued, he had it made, listened to the resulting hiss, and then hanged himself. (It wasn’t a very cheery story, no.)

    • Old Rusty Dusty says:

      And I’ll tell you that it’s not from experience. I’ve been running 120Hz for the last few months on a Korean IPS display, and every now and then if a game has issues and runs at 60Hz prior to tweaking, it’s a very noticeable difference, and feels sluggish in comparison. I’d go so far to say that once you’ve experienced 100-120Hz gaming, that the difference is just as significant as when playing a game at 30FPS capped vs 60FPS capped.

    • 11temporal says:

      Yep, we need a double blind trial.

      My friend has a 120Hz monitor and I’ll be damned if I can see any difference, although it is “obvious” to him (it was obvious even in games which weren’t able to push more than 60Hz though…)

      • drewski says:

        I would love to make that pitch to my university for funding.

        Don’t think I’d get anywhere, mind. Maybe I can get LG to fund a PhD position.

  8. surv1vor says:

    Video drivers give the option to force higher refresh rates. It’s not something I’ve tried, but how different is it to this?

    • Sparkasaurusmex says:

      I believe that’s just for activating refresh rates that ARE supported by your monitor but for some reason aren’t reported to your drivers. I don’t know if you could set something there and actually make it work outside your monitor’s capabilities. Try it :)

      • surv1vor says:

        Well I’m looking at it now, and even though it gives me alternatives, none of them are higher than 60Hz at the moment. Plus it gives me warnings about damaging my display, so it’s probably not worth the effort if I’m not gaining anything from it. I was just hoping this would be an option for AMD users.

      • randomgamer342 says:

        It’s not, the settings in the driver is the exact same thing as the EVGA tool with the ability to turn off automatic configuration

        Nvidia has just taken that feature and marketed it as something new and easy

        120hz took a couple of days for me to get used to, but it’s FAR superior to 60hz, especially with fancy IPS monitors that can handle the speeds

    • MattM says:

      Most games these days either include a refresh rate option in the settings or just use whatever refresh rate the windows desktop is using when the game boots up, but some games and applications just use a resolution setting with a 60hz refresh rate and don’t have an option to change it. Your driver control panel has an option to override a games screen refresh rate and force it to use the refresh rate you specify. This is similar to forcing AA or AF from the driver panel. It usually works although there are some games where the game engine both uses v-sync to cap the framerate and has problems above 60 fps. This can lead to problems with the games physics system or broken animations. If some game switches your 120hz lcd to 60hz when it launches you probably want to use this option to force it to stay at 120hz.
      You can also use your driver panel to force your monitor to run at unsupported resolution and refresh rates. This often doesn’t work and can damage your monitor. You might need this option if your monitor isn’t being correctly recognized. I had a CRT that supported resolutions higher than those showing up in the driver panel so I had to create a custom resolution. I think there are also a few other reasons you might want to use this setting. There is a way to do SSAA using this setting and there was a game (Bulletstorm I think) that had performance issues unless you used resolutions that were a multiple of 8.

  9. 626 says:

    Just gave this a go and going above 65 gives me a black screen. Hazro 27″ with a Gigabyte 670

  10. ResonanceCascade says:

    Hmm, what are the odds this works with the Macintosh monitor I cannibalized from my old editing suite?

    Pretty low, I reckon.

  11. Mctittles says:

    By the way, anyone curious about the inner workings of a monitor, this guy tears it apart in an easy to understand way.

  12. Premium User Badge mR.Waffles says:

    I got an ASUS VG248QE 24″ LED LCD last week for 260 dollars. The Counter-Strike community is ranting and raving about how great 120hz is for competitive play and after experiencing it I can’t go back. The difference between 60hz and 144hz is absolutely stunning. I highly recommend checking one out.

    • MarcP says:

      I’ve been contemplating trying out a 120hz monitor for a while, but as someone who can, in most first person shooters, reliably tell in blind tests when my framerate drops from 60 to 50 (provided motion blur isn’t used), I’m worried once I get accustomed to 120 FPS anything below that will feel bad then. Even below 100, or 80, might be a problem. At best, most graphic intensive modern games tend to shoot for 60 FPS on good hardware; and when they do, the goal is often met most of the time but very rarely all the time. Not only that, but even in those situations where ~60+ FPS is maintained, framerate still tends to vary widely between 60 and above. I figure seeing as I already get irritated with those rare dips below 60, consistently having the perceptible framerate move back and forth would drive me insane.

      • Old Rusty Dusty says:

        In my experience after having switched to 120Hz, the framerate dips are not as noticeable as 60Hz ones unless you’re dropping to say 85FPS from 120. To me it seems very smooth as long as you can push 100FPS+ without any dips. But you’re definitely on point, in some more intensive games a drop to say 80FPS is noticeable, yet altogether I’d say it’s worth it simply because that dip to 80FPS still feels smoother overall than running at 60FPS capped all day.

        If you’re a big FPS player, another side effect you’ll like is that using the mouse at 120Hz will feel much smoother than at 60Hz, simply because the number of samples is increasing in relation to the number of frames displayed on the screen. It’s hard to describe, but in the most basic sense games in the past whose mouse felt a little laggy from having v-sync enabled at 60Hz will feel much more responsive when v-synced at 120Hz.

  13. Blucid says:

    Ah the good old days…

    Quake -> Quake 2 and Quake 3 @ 800×600 @ 160hz booyah.

    • nbF_Raven says:

      This. I had a CRT that could do 160hz. The difference from 120hz to 160hz was insane in CPMA. Too bad the CRT was old and broke shortly thereafter. Now I’ve got a 120hz LCD which is still pretty awesome.

      I always laugh when I see the “can’t see above x fps” argument. I can tell the difference between 50fps and 60fps, let alone 60fps and 120fps.

  14. MattM says:

    When considering monitor refresh rates, it is also important to consider pixel response times. Once the pixel has received a signal it still takes time for it to transition from one color to another. The bigger the change in color the slower the transition. When using 3d shutter glasses, ghosting is caused by pixels not finishing their transition quickly enough. In 2d mode it’s less noticeable, but slow pixels cause the image to blur more when turning or strafing and cause moving objects to appear blurry. If you ever had an old laptop, you might remember LCD panels so bad that the mouse cursor would mostly disappear when moving. Today’s LCDs are a lot better, but still not as good as CRTs were. Newer 120 hz 3d ready LCD panels tend to have very good pixel response times and this is part of the reason they look better (in terms of moving image clarity) than standard 60hz panels.

    • Rian Snuff says:

      Very good point dude and something overlooked in this article.

      Running 120hz on a monitor with something with a 10ms response time seems.. Wrong.

  15. Premium User Badge dangermouse76 says:

    Anyone any word on how it’s worked on a projector?
    My Optoma got up to around 70Hz at 1200×800; it went funny after that.

  16. hazeprophet says:

    Just decided to give this a test on my 2 AOC 20″ IPS Panels. Both clocked successfully and stably at 74Hz. Loading up Guild Wars 2 allowed the use of 75Hz Refresh Rate (confirming that it changed and was functioning). No image issues. Looks good and solid.

  17. Dubbill says:

    My crappy Dell 2407WFP-HC will go as high as 63hz. Woo!

  18. Bobtree says:

    RPS, this was painful to read.

    This is not overclocking a display, it’s just a video driver tweak. The image processing gear in modern LCD displays is not something you can just squeeze some extra performance out of. Either the hardware can display at certain refresh rates, or it can’t. Specs list supported refresh rates for a reason.

    > Rendering well above 60fps on average, in other words, makes it less likely you’ll drop significantly below 60fps at any given moment.

    Unless you also use vsync, which makes fps drops below the refresh rate HIT HARD (typically dropping to HALF the refresh rate). This is why vsync sucks.

    > Given that all LCD monitors were limited to 60Hz

    This is false. There have been LCD monitors at other refresh rates long before 120hz existed.

    > A further complicating factor is motion blur. It’s this that allows a 24fps movie to seem smooth, which on the face of it doesn’t make much sense in the context of our ability to distinguish 60fps from 100fps.

    This is nonsense. 24fps movies look smooth BECAUSE of motion blur (and sometimes produce double-images when panning, due to projectors running at 48fps, exposing each frame twice). We can distinguish between different high FPS rates because artificial single-instant images don’t have real motion blur (integration over time during each frame). LCD blurring due to slow pixel response times is something else entirely, which maybe got confused with motion blur.

    I strongly suspect Jeremy Laird has very little idea what he’s talking about here and simply never bothered to set or buy an LCD at > 60hz refresh before.

    FWIW, the recent Nvidia “adaptive sync” driver option for Vsync is a great performance/quality tradeoff (it syncs when FPS > refresh, tears otherwise). It handily beats all the alternatives (vsync, triple buffered vsync, no sync) and is frankly just THE RIGHT THING TO DO unless you need to test maximum unsynched performance. Tearing may look bad in certain games or on some displays though, so YMMV. Adaptive vsync + high refresh rate CRT = awesome.

    • Low Life says:

      > This is not overclocking a display, it’s just a video driver tweak. The image processing gear in modern LCD displays is not something you can just squeeze some extra performance out of. Either the hardware can display at certain refresh rates, or it can’t. Specs list supported refresh rates for a reason.

      Surely by using that exact same logic we can come to the conclusion that increasing a CPU’s clock frequency isn’t overclocking? When I buy a CPU it lists a frequency for a reason, I can’t just magically squeeze extra performance out of it.

      What happens here is that the clock sent to the monitor (via the display cable) can be increased beyond the rate the chip is specced to handle (i.e. what it reports to the computer). These are monitors that are specced to work at 60 Hz working at frequencies above 60 Hz. The amount it can be increased varies on a monitor-by-monitor basis, even between monitors of the exact same model, for the exact same reason it varies when a CPU is overclocked: sometimes you just get lucky during fabrication.

      There’s a reason this works best with the cheap Korean IPS displays – they don’t do any special processing (some of them can’t even handle input signal different from their native resolution) and as such their control logic is very simple, so they don’t have any extra checks on the input frequency. Then we have something like Dell’s IPS models that have such specific requirements for the input signal that they don’t accept anything above 60 Hz.

      • Old Rusty Dusty says:

        You’re right, it’s not overclocking the LCD, because it is in fact overclocking the LCD’s Power Circuit Board, not the panel itself. The Korean IPS models that can hit 120Hz are all actually 60Hz panels; however the ones that don’t have issues hitting 100Hz+ all have a specific variant of a PCB installed which is known to overclock well without issues. Most consumer monitors have a PCB that takes little if any overclocking at all, but it is still technically overclocking in the sense that the graphics driver is telling the PCB to run at a higher rate than normal, which is no different than a driver telling a GPU core to run at a higher clock rate.

    • Rian Snuff says:

      It’s sort of weird. If you look at many of the spec sheets for “60hz” monitors.. they’ll actually read 75/80hz or something like that. It just seems they’re -locked- to 60hz just because. I thought maybe what this actually was doing was allowing people to fully utilize the bit more they’re actually capable of.

      So in my mind this was more so an unlocker, like unlocking the cores on the 6850s to match 6970s.

      I was just speculating however. But I do believe the term overclocking it is good enough. Heh.

    • Jeremy Laird says:

      You’re either factually wrong or wrong in sentiment as regards most of what you said, I’m afraid.

      You are overclocking the display just as much as you are ever overclocking anything, such as a CPU.

      Until recently, the vast majority of affordable consumer LCD monitors were 60Hz. I didn’t put that qualifier in because uber money and specialist displays may as well not exist in this context and there are always a few exceptions. You are nitpicking.

      You make no discernible point re motion blur. You seem to be inferring things I never said.

      I’ve used high refresh LCDs since they first came out with 3D Vision. I’ve used probably 20 different models and as per the article I attempted overclocking on four different standard 60Hz monitors for this article. I would say I have perfectly adequate first hand experience in this area.

      The one thing I do agree with you on is that nV’s adaptive sync tech is a nice little feature.

      Less hyperbole in future, please!

      • Bobtree says:

        I concede the points on overclocking and was not aware of LCD monitors that do not sanitize their inputs. I would trust one no further than I could throw it however.

        > You are nitpicking.

        “all LCD monitors were limited to 60Hz” is plainly false. The “vast majority” do not change simple facts. Most is not all.

        > You make no discernible point re motion blur.

        My point is that your “doesn’t make much sense” comment is gibberish. I inferred a possible mixup with motion-integration blur and LCD response-time blur because it’s an obvious mistake to make.

        > I’ve used high refresh LCDs since they first came out with 3D Vision

        3D Vision is only 5 years old. Other non-60hz LCDs are not new or special. Your experience does not trump history.

        > wrong in sentiment

        There is nothing wrong with my feelings about incorrect facts. Here are some more:

        > Even if you reduced the refresh rate to 1Hz, there would be no flicker.

        LCDs still have the inherent flicker rate of the backlight. Recent CFL’s have imperceptible flicker, but varying the flicker rate is used to change screen brightness on some LED-backlit displays, and as a result there are complaints about their flickering at low brightness settings or when showing dark images.

        > CRTs generate an image line by line, of course. So higher refresh makes for a more stable, less flickery image.

        Line-by-line scanning is not the cause of low refresh flicker. CRT oscilloscopes, for example, behave similarly. As refresh rates go up, the CRT phosphors glow more continuously since they have less time to dim between refreshes, so the flickering disappears. It is the duration of phosphor emission being shorter than the refresh interval that causes the flicker, and a much longer duration would cause a ghosting effect.

        > Less hyperbole in future, please!

        I object to shoddy writing.

        • thebigJ_A says:

          Then stop filling the comments up with shoddy writing.

          You “corrected” his motion blur comment by saying precisely what he said in different words. Read the sentences again. I think maybe you missed the part where he says motion blur is *why* 24 looks smooth, and then also missed the “on the face of it” part. You do know what that expression means, right?

          Your other bits are just as bad. Read more slowly, or carefully, or at least read things over again when they make you pissy to be sure they say what you thought they said.

          • Bobtree says:

            > Then stop filling the comments up with shoddy writing.

            No. For starters, I write very well.

            > You “corrected” his motion blur comment by saying precisely what he said in different words.

            There’s quite a big difference in what we said. Laird made odd and unsupported statements about motion blur. The only factual thing he said about it is that movies have motion blur and appear smooth. So why is it a “complicating factor”? What apparently “doesn’t make much sense” about it in the context of perceiving very high frame rates? I repeated the movie point for emphasis and elaborated with a correct and relevant explanation and a likely reason for his confusion. I precisely wrote what he did not.

            For RPS sake I will elaborate on the topic. The human eye-brain vision system perceives motion blur. Films of real life record it, CGI movies usually simulate it, and some games use special effects to fake it (often annoyingly, since they give us camera control but can’t track our eyes and blurring is gaze-target relative). Per-frame motion blur exists or not independently of frame rate (though when present the degree of per-frame blurring is inversely proportional). High frame rates look smoother than low ones, both with and without blurring, because reality is not frame-based. An infinite frame rate would make motion appear perfectly smooth and produce the same blurring we see in real life, and the individual images would be unblurred so objects would appear sharp when the eye tracks them. Actual display and rendering rates are limited, so games must make performance and quality tradeoffs.

            > also missed the “on the face of it” part. You do know what that expression means, right?

            Of course: “which on the face of it doesn’t make much sense” suggests that in fact it does make sense, but then he then fails to explain it or tell us why he raised the issue. To write in this confused manner is misleading and sloppy.

  19. Tiguh says:

    Well that did the most extraordinary things to my monitor! None of them were “good”, but all of them made me laugh like a drain! Play if you want to, but not with an Iiyama Pro-Lite E2209HDS – mushrooms are cheaper!

    XXX

  20. mrmalodor says:

    I wanna overclock my mouse!

    • Low Life says:

      Well, USB poll rate tweaking used to be the shit a few years back.

  21. Lamb Chop says:

    I recently upgraded to a GTX 660 because my old 280 burnt out, and the new card creates horizontal tearing in all full screen flash video (youtube, hulu, etc.). I tried forcing program-specific vsync/adaptive vsync on chrome and it hasn’t fixed the problem, despite that being the only posted solution I could find online.

    1. Does anyone know of a fix for this?
    2. If not, do you think this tool might fix it if I can push the monitor refresh rate up over whatever the rate the flash is trying to render at?

  22. urmamasllama says:

    managed to squeeze an extra 6hz out of my 22″ AOC

  23. JarinArenos says:

    Both my monitors (viewsonic VA2323wm and Dell somethingorotherPOS) just throw some variation on “out of range” and go black. No joy. Gotta upgrade my monitors, I guess.

  24. alms says:

    My old Samsung LCD has a 75 Hz refresh rate option which is easily accessible through the Windows display properties panel without any additional software.

    It doesn’t make a lot of (any) difference at a first glance (the desktop), but I remember people discussing how changing the refresh rate had unpleasant effects in games, hence I always ran it at 60 Hz.

  25. jakh says:

    This is pretty cool. I’m one of the, as far as I have seen, -few- that is strongly affected by lower refresh rates, and purchasing my Viewsonic 120hz monitor was the best day of my gaming life. This let me breathe some life into my second monitor, a Dell E228WFP, got it pushed up to 76hz and I notice it. The Viewsonic let me push it all the way to about 200hz before it started getting REALLY weird looking, had noticeable colored lines in solid color and lots of distortion, but oddly enough the animations and motion continued to be exceptionally smooth (even more so than at 120hz). It sits around 140hz with no graphical oddities. I am happy!

    I also tested a Dell U2411, which did not let me change anything (fussy Dell hi-tech IPS crappity), and an old Planar… piece of crap… which continued to look like a piece of crap!

  26. sophof says:

    I’ve never understood why CRTs for PCs went the way of the dodo when they did. I tried to use mine as long as I could, but when it finally died there was no real affordable option left. The space and weight is not an issue for everyone. My current screen has a worse color representation, worse refresh rate and a lower resolution. It is quite a bit sharper though, but I was used to a degraded CRT.

    Right now LCD screens are certainly better than the CRT I had back then, but when they got replaced, CRTs were better in every way but weight and depth, it was weird…

    What was funny was that I usually ran my monitor at 85Hz back then, only going 120Hz for quake 3. Now THAT was smooth :D

  27. Rian Snuff says:

    Anyways, I’ve been using a true 120hz BENQ for a few years now and I won’t ever really go back to anything less. No headaches in fast games or long session, way more accurate/responsive input (My K/D actually jumped when I made the switch), even using windows is much nicer. I can easily read while the page is scrolling. Sure maybe the viewing angle is tight, but that’s why mine came with the best stand I ever seen. (Even though I bought an arm mount for more desk space..) and the LED helps make the colours pop that extra bit. With proper setup it looks beautiful to me.

    All I can say to any remaining nay sayers is.. Please give it a try.
    You might not be able to see the difference.. Somehow.. But it doesn’t mean nobody else can.
    Peace!

  28. fish99 says:

    A lot of misinformation in the comments here. I can tell you for a fact that 120Hz screens feel a lot smoother than 60Hz, but that’s not even the best thing about them.

    The best thing is you don’t have to pick between a framerate drop from having v-sync on (or the laggy feel from triple buffering), and tearing with v-sync off. With 120Hz you can leave v-sync on, get no tearing and the framerate drop is much smaller than with a 60Hz screen, because obviously the time wasted waiting for a retrace is a lot less when the screen is refreshing twice as often. Or you can turn v-sync off, have no framerate drop, but the tearing is much less visible than on a 60Hz screen, i.e. the size of the tears is dramatically reduced, because of the higher refresh rate.

    • Aninhumer says:

      Firstly, I’m pretty sensitive to tearing, and I’m sceptical that improving the framerate would mask it. The problem is seeing blatant inconsistency across a single frame. Showing that inconsistency for less time might be an improvement, but I suspect I’d still notice it.

      Secondly, if you have the hardware to produce 120fps, you’re clearly not going to have any issues turning on vsync on a 60Hz monitor.

  29. Jediben says:

    I’ve just pushed my 24″ HP LP2475W from 60Hz to 75Hz! :D

  30. Nate says:

    Of course the human eye can see more than 60 frames per second. Don’t you notice fluorescent flicker from those old tubes? Don’t you notice CRT flicker, at least in your peripheral vision?

    Besides which, if you want to draw something at 60hz (like accurate fluorescent flicker), you’ve got to sample it at double that rate, at least. Same as you need 44khz sampling to hope to represent 22khz tones (poorly, as square waves).

    So if you want fluorescent flicker to be modeled in your game, you need 120fps. If you want flickering CRTs to be modeled, you need 85*2 = 170fps. If you want to do that for each eye for stereoscopy, double it again, to 340fps.

    What if you want a CRT monitor displaying a flickering fluorescent? Well, thankfully, you only need the framerate to display the CRT. So there is an end in sight :)

  31. Sync0r says:

    Tried this out on my Alienware M15x panel (1920*1080). OMG, my panel must have been thinking “Why have I been running so slow for so long!”. 60Hz -> Maximum 123Hz! That’s more than my desktop Acer 120Hz monitor can go to, it only managed 121Hz before crapping out. Thank you Nvidia/EVGA, you got me a free 120Hz panel in my laptop! And just in case you guys are wondering, it does look as good as the refresh rates on my TN Panel ACER GD245HQ, yay! Time to try it out on my 47″ TV.