By Jeremy Laird on November 7th, 2013 at 8:00 pm.
Suffering from headaches, tired eyes and all-round gaming fatigue? Must be that flickering LCD monitor ripping up your retinas. No idea what I’m on about? BenQ would have you believe flickering LCD monitor backlights are the new evil and it has the solution: flicker-free backlight tech. I’ve tried it and can reveal whether it’s the next big thing after 120Hz-plus panels. It’s not. Next! Graphics. AMD and Nvidia are currently squelching about and looking grumpy following one of their traditional pissing contests. An unpleasant image, but it’s good news because it means things are very closely matched. Still, we need to tidy up a few details after all the new GPU launches and some last-minute changes, including AMD’s Radeon R9 290 and its dodgy cooling and final specs on the Nvidia GeForce GTX 780 Ti.
Flicker-free LCD screens, then.
The short version:
It’s mostly twaddle, ignore it. Skip to the graphics stuff below.
The long version:
BenQ is punting a new PC monitor with alleged flicker-free LED backlight properties. Can’t say I’ve ever had an issue with flicker on an LCD monitor. CRT screens, yup. LCDs, nope. Then again, there was a time I’d have scoffed at the benefit of going beyond 60Hz refresh on an LCD panel.
I still can’t entirely compute that, what with 48fps being good enough for HFR movies. But I was simply wrong. 120Hz-plus is lovely and has become a source of some woe. I love my 30-inch panels. But I want 120Hz pretty badly, too.
Anyway, the issue here involves backlight modulation. Run a typical monitor at full brightness and the backlight is simply on. No flicker. No opportunity for flicker. However, crank it down a few notches and the problems, allegedly, appear.
Low-Hz flicker used to be a fundamental issue
That’s because lower LED backlight brightness settings are usually achieved by pulsing the backlight on and off, a technique known as pulse-width modulation. The dimmer the setting, the more time it spends off. Say hello to flicker.
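The sums behind that are simple enough to sketch. Here’s a minimal illustration of PWM dimming, with an assumed PWM frequency of 180Hz purely for the sake of example (it’s not a measured figure for any BenQ panel):

```python
# Illustrative sketch of PWM backlight dimming. Brightness is set by the
# duty cycle: the fraction of each on/off cycle the LED backlight is lit.
# The 180Hz PWM frequency below is an assumption for illustration only.

def pwm_timings(brightness_pct, pwm_freq_hz=180):
    """Return (on_ms, off_ms) per PWM cycle for a given brightness."""
    period_ms = 1000.0 / pwm_freq_hz            # one full on/off cycle
    on_ms = period_ms * (brightness_pct / 100)  # time the backlight is lit
    off_ms = period_ms - on_ms                  # dark time = potential flicker
    return on_ms, off_ms

# At 100% the backlight simply stays on; at 20% it spends most of each
# cycle dark, which is where the flicker complaints come from.
for pct in (100, 50, 20):
    on_ms, off_ms = pwm_timings(pct)
    print(f"{pct:3d}%: on {on_ms:.2f} ms, off {off_ms:.2f} ms per cycle")
```

In other words, the panel isn’t getting dimmer so much as spending more of its time switched off, faster than your eye can consciously follow.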
So says BenQ, anyway. Frankly, I can’t tell the difference. This may be a subjective observation. Some people, for instance, are more sensitive to the rainbow effect from cheapo DLP projectors than others.
But whether it’s the rainbow effect, anti-glare sparkle, inverse ghosting, gradient banding, IPS glow – whatever – I tend to find myself towards the OCD end of the sensitivity spectrum when it comes to minor display technology flaws. And I really couldn’t sense the difference with flicker-free technology.
I suspect it’s also quite telling that BenQ’s bumpf tells you that the best way to spot the benefits of flicker-free technology is to put a large fan in front of both its screen and a conventional screen. At which point the flicker on the conventional screen presumably becomes apparent.
I am not making this up, it’s the best BenQ can come up with
Oh, BenQ also suggests you take a picture with a digital camera. You’ll see the flicker in the form of banding in still images. If these two examples really are the best BenQ can come up with, I’m not sure much more needs to be said on the matter.
Also, for the record, the screen I looked at is the BenQ GW2265HM. It’s actually a damn fine 22-inch 1080p screen for a whisker over £100 thanks to a VA panel that’s claimed to be good for 3,000-to-one static contrast (the blacks are bloody brilliant). So it’s definitely worth a look, just forget the flicker-free nonsense.
That graphics stuff
As for graphics, we’ve covered the major points in recent posts. But here’s what you need to know from the very latest developments:
There’s something weird going on with the cooling on AMD’s new R9 290 boards:
1. The second-rung R9 290 looks fabulous on paper, cranks out awesome numbers for £300-ish
2. But AMD has upped the fan speed at the last minute
3. This makes the performance even better
4. But it also makes an already noisy card trend towards cacophonous
Full details on the GeForce GTX 780 Ti are out:
1. Yup, it’s the full 2,880-shader GK110 Monty
2. Memory is a ‘mere’ 3GB
3. It’s still faster than anything else, including AMD’s 290X and Titan
4. It’s stupid money (£550-ish)
Where does that leave us?
We need a bit more time for things to play out. At a little over £300 the AMD Radeon R9 290 blows everything else away at the high end for bang-for-buck and would be the obvious choice. Putting the noise to one side, I reckon it will give you a gaming experience that’s largely indistinguishable from a £550 780 Ti.
But if it’s as noisy as some say, that’s a problem. I haven’t heard it running with the shouty new fan firmware, because I’ve been too busy driving this:
A game changer with great graphics but makes the odd surprising noise
Which is quite the thing about town and happens to have a pretty nice line in graphics rendering and a few noise issues itself (two-pot range extender is, er, interesting). But isn’t going to help us get any nearer a final answer for this latest round of the GPU war.
My advice is to wait a bit for the dust to settle. AMD may make further revisions. Board makers will have their own cooling solutions, too, so any noise issues with the reference design from AMD may turn out to be moot a month from now.
Anyway, all that’s left to say for now is that only AMD could take what ought to be a winning position with the new Hawaii / R9 290 GPU and cast doubt on the whole enterprise courtesy of what is a pretty minor issue in the broad scheme of things, namely the cooler. If it was any other company, it would be a minor scandal. From AMD, it’s depressingly predictable.