27-inch IPS LCD panel? Check. 144Hz refresh rate? Yep. Some kind of frame-smoothing adaptive sync technology? Present and accounted for. 2,560 by 1,440 pixels? Count ’em. A price you can afford? Bit borderline, but that was inevitable. Is Asus’s new MG279Q therefore the perfect LCD panel, the one we’ve all been waiting for, the veritable messiah of PC monitors? I’ve been eyeballs-on. All will now be revealed…
First, let’s deal with the price problem. The Asus MG279Q goes for around £475 in the UK and $600 Stateside. That’s a lot of money. In fact, it’s more money than I’d ideally want to spend on a monitor.
But you have to weigh that up against the fact that a monitor has a huge impact on everything you do with your PC, and also remember that monitors have legs. A really good monitor lasts a very long time. I’ve mentioned this before, but I have a couple of near decade-old 30-inch Dells and it’s only in the last six months that I’ve begun to seriously contemplate replacing them.
Anyway, the point is that if this Asus is truly the one, it’s worth cracking open your piggy bank and calling in all those old debts. But is the long wait for a monitor that combines the best in gaming goodness finally over?
In terms of the basics, this thing certainly scores highly. Aesthetically and in terms of packaging and design, it’s mercifully bullshit-lite but delivers what you actually need. So that’s decent build quality, a fully adjustable stand and a 100mm VESA mount. Tick, tick, tick.
The core image quality of the 27-inch IPS LCD panel is also impeccable. Actually, it’s more than that. It’s ruddy glorious. It doesn’t matter if you’re playing games, watching movies or just shuffling boring things about the Windows desktop. Everything looks fantastic.
That’s thanks to a combo of seriously nice colours – ultra vibrant and yet accurate and natural – and outstanding contrast for an IPS panel. In terms of raw image quality, I’m not sure I’ve seen anything better.
Actually, the obvious comparison here is Asus’s own RoG Swift. That ticks a lot of the same boxes as the MG279Q – 27-inch, 2,560 by 1,440 pixels, 144Hz refresh rate – but it’s built around Nvidia’s G-Sync where the MG279Q goes with AMD’s FreeSync.

Flick that FreeSync switch…
We’ve been over a lot of this stuff on multiple occasions. But to recap briefly, both are technologies that smooth out gameplay by virtue of syncing the output of your video card with the refresh rate of the monitor itself. They both also do so while removing screen tearing.
The difference is that Nvidia’s G-Sync requires a special scaler board to be fitted to the display, while AMD FreeSync makes do with simply an up-to-date DisplayPort interface to get the job done.
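If you prefer your explanations in code form, here’s a toy sketch of why adaptive sync smooths things out. This is not any real driver API – just a simplified model comparing a fixed-refresh display, where every frame has to wait for the next refresh tick, with an adaptive-sync display that refreshes the moment a frame is ready:

```python
# Toy model (not a real driver API): when do frames actually appear on screen?

def fixed_refresh_display(frame_ready_times, refresh_hz=60):
    """Classic vsync: each frame waits for the next fixed refresh tick."""
    period = 1.0 / refresh_hz
    shown = []
    for t in frame_ready_times:
        next_tick = int(t / period) + 1  # first tick after the frame finishes
        shown.append(next_tick * period)
    return shown

def adaptive_sync_display(frame_ready_times):
    """Adaptive sync: the panel refreshes on demand, so frames show immediately."""
    return list(frame_ready_times)

# A GPU rendering at an uneven ~45fps: frames finish at irregular intervals.
frames = [0.000, 0.021, 0.047, 0.069, 0.095]
print(fixed_refresh_display(frames))   # frames bunch onto 16.7ms ticks -> judder
print(adaptive_sync_display(frames))   # frames land as rendered, no tearing
```

The fixed-refresh output quantises those irregular frames onto 16.7ms boundaries, which is where judder comes from; the adaptive output tracks the GPU exactly.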
As it turns out, AMD’s approach has proven problematical. That’s because of another tech known as ‘overdrive.’ Again, we’ve touched on this previously, but overdrive essentially involves pumping excess voltage through pixels to encourage faster response.
It works, but it also needs to be carefully tuned to avoid going too far and introducing weird ghosting artefacts that occur when pixels actually overshoot the target colour state. Anyway, long story short, G-Sync monitors have their overdrive optimised for variable refresh. Thus far, FreeSync monitors have not. I’m afraid that doesn’t change with the new Asus MG279Q. It’s still a problem.
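To see how overdrive can overshoot, here’s a crude first-order sketch of a pixel transition. The numbers are purely illustrative, not measured panel behaviour: the panel briefly aims past the target level to speed the transition, and if the overdrive amount is too aggressive the pixel sails past the target, which is exactly the inverse ghosting described above:

```python
# Crude first-order pixel model (illustrative numbers, not real panel data).

def pixel_response(start, target, overdrive, drive_steps=3, total_steps=10, rate=0.35):
    """Drive the pixel toward (target + overdrive) for a few steps, then toward target."""
    level = start
    trace = []
    for step in range(total_steps):
        aim = target + overdrive if step < drive_steps else target
        level += rate * (aim - level)  # pixel moves a fraction toward its aim point
        trace.append(round(level, 1))
    return trace

print(pixel_response(0, 128, overdrive=0))    # clean but slow approach to 128
print(pixel_response(0, 128, overdrive=40))   # faster rise, still no overshoot
print(pixel_response(0, 128, overdrive=160))  # blows past 128 -> inverse ghosting
```

Tuning overdrive is essentially picking that middle setting: enough extra drive to sharpen response, not so much that the trace overshoots the target.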
It’s not easy to achieve, but I’ve tried to capture the problem here in AMD’s own FreeSync Windmill demo:
What you’re looking for is a ghostly shadow in the wake of the windmill’s blade (it rotates clockwise in the demo). Where the blade is bright, the ghosting is dark, and vice versa. What you need to ignore is the same-colour ghost slightly ahead of the blade. That’s a photography-related issue, not something you see with the naked eye. Like I said, capturing this stuff on camera isn’t easy.
Happily, however, the MG279Q has a really rather lovely OSD menu and within it you’ll find an overdrive setting that offers five levels of operation, plus fully deactivated. The upshot is that you can tweak the overdrive level to achieve a pretty decent compromise between ghosting and response.
With overdrive set to full reheat, the ghosting is catastrophic. But dial it back a notch or two and you can achieve good response with pretty much zero real-world ghosting. Great. Slightly less great is that the FreeSync functionality is limited to an operating range of 35-90Hz. So you can’t have adaptive sync and really high refresh on this screen.
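In practice that 35-90Hz window works something like the sketch below. The range matches the MG279Q’s quoted figures, but the fallback behaviour outside the window is a simplification – exactly what happens above or below the range depends on your driver’s vsync settings:

```python
# Simplified sketch of a bounded FreeSync window (range from the MG279Q's spec;
# the out-of-range fallback behaviour is a generalisation, not a measured result).

FREESYNC_MIN_HZ, FREESYNC_MAX_HZ = 35, 90

def refresh_behaviour(fps):
    if FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ:
        return f"adaptive: panel refreshes at {fps}Hz, no tearing or judder"
    if fps > FREESYNC_MAX_HZ:
        return "above range: frames capped or torn, depending on vsync setting"
    return "below range: ordinary vsync stutter or tearing returns"

for fps in (30, 45, 75, 120):
    print(fps, "->", refresh_behaviour(fps))
```

The upshot: everything between 35 and 90 frames per second is buttery, but push past 90fps and the adaptive magic stops.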
Arguably, 90Hz is enough to get most of the benefits of high refresh, but it’s another niggle that undermines the FreeSync proposition. Oh, and you will of course need an AMD video card to use the FreeSync feature, whatever you think of it.
That’s equally true of G-Sync panels. You’ll need an Nvidia board. Bottom line is that adaptive sync comes with a level of lock-in. I don’t like it, but there it is.
On the other hand, you could argue that if you have a really quick GPU cranking out something in the region of 100 frames per second and a high-refresh monitor to go with it, the benefits of adaptive sync are pretty marginal. So, what the MG279Q has going for it is that FreeSync doesn’t add much by way of cost.
That means you can primarily view it as simply a 144Hz high-refresh IPS monitor and treat the FreeSync stuff as an interesting extra rather than a core feature. With G-Sync screens, the price premium is currently such that you need to be much more committed to the concept.
With all that in mind, is the MG279Q therefore the messiah of monitors? In the sense that it’s the best all-round, gaming-centric monitor I’ve yet seen, the answer would be a qualified yes.
34-inch superwide screens including the curved Samsung effort I eyed up recently are more dramatic but less usable as all-rounders, not to mention a lot more expensive. 40-inch 4K screens also take things to another level in some respects, but there’s really only one option right now from Philips and it comes with compromises like a slightly iffy VA panel, 60Hz refresh, no adaptive sync and the problem of needing a graphics card powerful enough to drive all those bloody pixels.
That latter point means the very notion of a perfect LCD monitor probably makes no sense. Too much depends on outside factors like GPU performance. But given the various compromises and as far as I can see, this thing is as good as it currently gets.