A high quality LCD panel. Or high refresh rates. Take your pick. Because you can't have both. Well, not unless you think BadgerJump Monitors (or whatever they're called) sounds like a sensible consumer electronics brand and you're thus willing to roll the dice on a dodgy Korean panel off eBay. But wait. One of the footnotes to NVIDIA's recent Titan graphics card launch is a new monitor overclocking feature. Yup, monitor overclocking. But will it give you 120Hz for free? Will it fry your panel? Do you need NVIDIA's £800 Titan? Should you actually care about high refresh? I've got the answers...
First up, if you're not interested in chewing over the broader subject of high refresh and you're keen to overclock the twangers off your monitor immediately, skip to the last third and the 'hands on' bit. There, I explain what you need, where to download it and my experiences so far. The simple version is that if you've any kind of NVIDIA graphics card, you'll probably be able to have a go.
Anywho, monitor refresh rates. They've always been a bit of a brain ache. Back in ye olde days of CRTs, you had to play off resolution and refresh. The higher the resolution, the lower the refresh was the general rule.
CRTs generate an image line by line, of course. So higher refresh makes for a more stable, less flickery image. Easier on the eye. Less chance of a headache. Plenty of obvious benefits.
With LCD panels, none of that applies. In really simple terms, you can think of an LCD panel as being always on. The image isn't generated line by line. Instead, every pixel is simply updated at a given frequency or refresh rate. Even if you reduced the refresh rate to 1Hz, there would be no flicker. You'd just have seriously crappy frame rates.
In truth, it doesn't work quite like that. But that's close enough to the reality for argument's sake. Anyway, the point is that flicker isn't an issue on LCDs. But frame rates are. It's at this point that the science of human sight enters the equation and I have to admit my limitations. Or at least frustrations. It's a subject about which I've always found the science a little unsatisfactory.
To get an idea of what I'm talking about, let's have a think about the various video formats around today. Take movies, for instance. Until recently, the standard frame rate (or effectively refresh rate, though actual projection rate or shutter speeds vary) for a movie was 24 frames per second. That's enough, I think you'd agree, for what looks like smooth, natural motion when you're plugged into a pew at the cinema.
However, if you've suffered through the impenetrable tedium that is The Hobbit in High Frame Rate (HFR) format, you'll know that doubling the frame rate to 48 frames per second makes an enormous difference to the look and feel of motion. One can argue the toss over the question of whether HFR looks better. But clearly the human eye and mind can distinguish 24fps from 48fps.
Now, consider that the standard refresh rate for a flat panel PC monitor is 60Hz or effectively 60 frames per second. Significantly higher than the new HFR movie format, then. And you might think high enough for completely fluid motion.
That's pretty much what I thought until fairly recently. I used to assume the only benefit to rendering games above 60fps was that it gave you more headroom for those occasional frame rate troughs. Rendering well above 60fps on average, in other words, makes it less likely you'll drop significantly below 60fps at any given moment.
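If you want to put numbers on that, the arithmetic is simple: a refresh rate in Hz translates into a per-frame time budget of 1000/Hz milliseconds. Here's a quick sketch (purely illustrative; the perceptual thresholds discussed here are subjective, not something these numbers prove):

```python
def frame_time_ms(hz):
    """Per-frame time budget in milliseconds at a given refresh rate."""
    return 1000.0 / hz

# Compare the formats and refresh rates mentioned in the article.
for hz in (24, 48, 60, 70, 85, 100, 120):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
```

At 60Hz you have roughly 16.7ms to render each frame; at 120Hz, only about 8.3ms. That halved budget is why feeding a high refresh monitor demands so much more from your graphics card.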
Given that all LCD monitors were limited to 60Hz, that was literally true. But I didn't give any credence to the idea of an upside to a monitor capable of higher refresh rates.
Then NVIDIA rolled out its 3D Vision tech, requiring monitors capable of delivering 60Hz to each eye and thus 120Hz overall. And I had an immediate epiphany. Getting a precise handle on exactly where the threshold is for the human eye and brain in terms of frame rates is tricky. No doubt it varies, anyway. We're analogue beasts, not digital droids.
But I can tell you this for sure. Even just pushing windows around on the desktop, the difference between 60Hz and 120Hz is obvious.
That said, I'm not sure I can tell the difference between 100Hz and 120Hz. A further complicating factor is motion blur. It's this that allows a 24fps movie to seem smooth, which on the face of it doesn't make much sense in the context of our ability to distinguish 60fps from 100fps.
Anyway, the most important point is that on the desktop, in games, basically anywhere on your PC, 100+Hz is bloody lovely. I can't stress that enough. High refresh rates make your PC feel much more responsive and look much slicker. To coin an Alan Dexter-ism (you know who you are!), high refresh makes all your games look awesome. The only snag is that you'll need a video board that can feed all those frames.
Well, that and the fact that, currently, high refresh rate monitors are limited to TN-type panel technology. Yes, a boutique industry has popped up involving 27-inch IPS Korean monitors capable of 100+Hz. But in the real world, high refresh monitors are TN. And TN has the worst image quality by every metric save response times.
That it's the quickest panel type makes it a natural fit with high refresh rates. But the latest IPS panels are pretty nippy, too. And high end HDTVs now typically offer refresh rates of 100Hz and beyond without using TN panels.
The hands-on bit:
Overall, I doubt there's any good technical reason why you can't buy an IPS 120Hz screen. It's just none of the big boys have had the balls to try it, so far. But can you make your own? Now, that's an intriguing question.
When I first heard about NVIDIA's monitor overclocking, it was supposedly limited to the new Titan board and was thus irrelevant. But no. It works with a broad range of NVIDIA graphics cards. I've tested a current GTX 680 and an ancient 7900 GTX.
The latter dates from 2006 and works fine with the monitor overclocking tool. So, I'm going to assume anything newer will be dandy. Software-wise, NVIDIA isn't providing the overclocking tool directly. It comes from board partners. I've been using the EVGA Pixel Clock tool. It works with non-EVGA cards. You can download it here.
For the record, I tested it with an AMD graphics board and no dice. It simply throws up an error decrying a missing NVIDIA API.
The real question is monitor compatibility. I've tested four monitors with variable results. Most disappointing are my Dell 3007WFP and 3007WFP-HC. Neither would run even 1Hz higher than 60Hz. Bummer.
Next up is my Samsung XL30. That will run up to 72Hz, but behaves oddly at that refresh. The fastest stable refresh it supports is 70Hz.
In many ways, the most interesting test subject is my Dell U2711. That's a 27-incher with a modern IPS panel and lots of inputs. It's exactly the sort of monitor I'd want to be overclocking for games.
Unfortunately, I found it essentially doesn't overclock at all. I tested up to 80Hz and it will render an image. But at any setting above 60Hz, the frame rate is jerky and stuttery. Something odd is going on with the image processing.
If that's disappointing, what's interesting is that I reckon I can feel the difference at 70Hz on the XL30. It's noticeably smoother. Reading around, it looks like 85Hz or thereabouts is probably where the maximum subjective smoothness kicks in, so you don't need to achieve 100+Hz to get a tangible benefit from monitor overclocking.
The proviso to all this involves the unknown risk to your hardware. My understanding is that it's actually pretty safe. But the usual small print applies. Move up in very small steps. And it's all at your own risk.
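For what it's worth, "very small steps" just means trying each refresh rate one notch at a time rather than jumping straight to a target. A toy sketch of that plan (the stock rate, ceiling and step size here are hypothetical examples, not recommendations; the actual clocking is done in the Pixel Clock tool, not a script):

```python
def overclock_steps(stock_hz=60, ceiling_hz=75, step_hz=1):
    """Refresh rates to test in order, starting from the stock rate.

    Try each one, check for artefacts, dropped frames or blanking,
    and back off to the last stable rate at the first sign of trouble.
    """
    return list(range(stock_hz, ceiling_hz + 1, step_hz))

print(overclock_steps())
```

My XL30, for instance, walked cleanly up to 70Hz this way before things got odd at 72Hz.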
Still, something for nothing is always nice. I'll probably be running my XL30 at 70Hz from here on in.
I'd also be very interested to hear how any of you guys get on overclocking your panels. Good luck!