Asus ROG Swift PG27UQ review: 4K HDR has arrived, but good grief, HOW MUCH?

Asus ROG Swift PG27UQ

When I first saw the Asus ROG Swift PG27UQ a couple of weeks ago, I thought the future of monitors had finally arrived. After 18 months of waiting, false starts and an increasingly dizzying array of HDR standards, infuriating Windows 10 options and simply not being able to have as good a time as our console box friends with their fancy pants 4K HDR TVs, both the PG27UQ and the Acer Predator X27 (a review of which will be coming soon) promptly blew me away with their 1000cd/m2 bright, 4K IPS screens, 384 dynamic backlight zones, 144Hz refresh rates and Nvidia G-Sync HDR support (or high dynamic range, to you and me).

A month later, that rush of excitement hasn’t faded, and every moment I’ve spent with the PG27UQ has been – for the most part – an absolute delight. That is, when Windows 10’s playing ball and I’ve spent at least fifteen minutes fiddling around with the settings making sure the brightness level’s set correctly, followed by the same amount of time again in-game tweaking luminance levels to ensure the Fat Chocobo Triple-Decker sandwich in Final Fantasy XV isn’t a blown-out mess of white highlights and oversaturated colours. I was almost ready to crown it our best gaming monitor of all time. Then I found out how much it actually costs. Brace yourselves.

Available for £2300 of your fine English pounds or $2000 of your even finer American dollars (thanks Brexit), it’s no wonder that Asus have been keeping this particular bit of information close to their chest. I told you to brace yourself.

I’m still not quite sure how I feel about that. When I first saw the PG27UQ and Acer Predator X27, all Nvidia let slip was that they’d be “at least” a grand. A grand I can get behind. I’d spend a grand on a nice Ultra HD Premium-certified TV for playing games on, and I was sufficiently impressed by each monitor’s overall image quality at the time that I’d more or less convinced myself that, yes, I’d probably pay that amount of money for an equivalent monitor as well – even if these ones were only 27in instead of 55in.

Asus ROG Swift PG27UQ face on

Two grand, however (or closer to two and a half grand for those of us unlucky enough to be buying one in the UK), is a lot harder to swallow. I suspect a fair amount of that is going on the PG27UQ’s laser light show that runs the length of its height-adjustable/rotating/swivel/tilt stand. At the bottom you have the privilege of having a red ROG logo burned into the surface of your desk, while the top lets you sear the same ROG logo onto your back wall and/or ceiling. Yes, really.

LEDs aside, though, I’d be a lot happier forking out this kind of cash if HDR still wasn’t such a pain in the backside to set up and get working properly. In truth, this is more of a Windows 10 issue than a problem with the PG27UQ specifically, but the monitor doesn’t exactly help make things any easier.

Asus ROG Swift PG27UQ stand LED

Credit where it’s due. HDR on Windows 10 is now much better than it was a few months ago. The built-in brightness slider introduced in the last big April update for gauging what SDR (or regular ‘standard dynamic range’) content will look like on your desktop when HDR’s turned on is infinitely preferable to not having one at all. But when the monitor also has its own brightness controls for HDR, the number of variables in play can make it pretty tricky to get a great picture right away.

Let’s start by leaving HDR aside for a moment and focusing on what the PG27UQ is like as an everyday monitor. You’ll find six colour profiles in its easy-to-use onboard menu system here, and (unsurprisingly) a lot of them are themed around gaming. By default, it’s set to FPS, but you’ll also find an RTS/RPG mode, one for Racing, and then Cinema, Scenery and sRGB profiles. The latter is quite restrictive in the number of additional settings it lets you play with, but the rest give you pretty much free rein to customise them how you want.

Play in the dark and that rear ROG logo is strong enough to beam up to your ceiling.

For the most part, though, there’s not actually a whole lot setting them apart. On its default FPS mode, for instance, my X-Rite i1 DisplayPro calibrator showed it was capable of displaying 94.8% of the regular sRGB colour gamut, and 68.9% of the larger DCI-P3 gamut. I got the same results on RTS/RPG (95.0% and 69.2% respectively), Cinema (95.9% and 69.9%) and Scenery (95.5% and 69.6%). Racing and sRGB, meanwhile, were just a couple of per cent behind with nearer to 93% sRGB and 68% DCI-P3.

Initially, I have to admit I was quite disappointed. These results would be fine for a normal IPS panel, but when this is meant to be a super duper mega Nvidia G-Sync HDR monitor (not to mention one that’s supposed to be able to hit 97% DCI-P3 coverage by Asus’ reckoning) then barely hitting 70% DCI-P3 just isn’t going to cut it when you’re demanding north of two grand for it – especially when, I might add, the £166 / $230 BenQ EW277HDR can hit 91.9% DCI-P3 out of the box without any kind of HDR setting enabled whatsoever.

Sadly, things didn’t improve much when I actually came to switch HDR on. Yes, the brightness jumped from a maximum of around 300cd/m2 to a whopping (and sustained) 650cd/m2 in almost every case, but I didn’t see a corresponding leap in colour accuracy to wash away my lingering fears.

Slightly slim pickings on the ports front – just single DisplayPort 1.4 and HDMI 2.0 inputs, plus a two-port USB3 hub and 3.5mm headphone jack.

FPS reached a high of 97.7% sRGB and 75.1% DCI-P3, but I saw pretty much identical results across RTS/RPG, sRGB and Cinema as well. Only Racing and Scenery came close to hitting 100% sRGB (99.3% and 99.1% respectively), but even they topped out at 79.1% and 78.1% DCI-P3.

The monitor’s contrast ratio also seemed to halve from around 2500:1 to roughly 1500:1 whenever HDR was turned on, and black levels often jumped from a respectable 0.13cd/m2 (or thereabouts) without HDR to a less brilliant 0.38cd/m2 (or even as high as 0.56cd/m2) with it.

Admittedly, my testing software doesn’t actually produce its own HDR metadata, so any monitor I test doesn’t technically recognise my test patterns as proper HDR content. This means that any colour gamut figure I record with HDR turned on isn’t a completely accurate representation of what a monitor can do. Colours will therefore likely look better when you come to play an HDR game, because HDR-compatible graphics cards will be sending a proper HDR signal to the monitor in question.

It wouldn’t be an Asus ROG monitor without a giant RGB panel on the back as well as a light-up stand…

That said, the PG27UQ doesn’t help matters by lumbering you with not one, but two HDR brightness sliders – the one in Windows 10 and the one on the monitor itself. These are vital to achieving the intended HDR ‘look’ of eye-searing highlights and lovely vivid colours, but they can also upset everything you’ve spent the last hour tweaking to the nth degree in an instant depending on where they’re set and how nicely those particular settings play with the million other brightness sliders in the games themselves. Which is a very long-winded way of saying, “Good lord, I’ve just spent £2300 on this thing, why doesn’t it just all work?”

When I first turned on HDR in Windows 10’s display settings menu, for instance, the monitor was so bright that I had to turn its SDR slider right down to zero to stop my eyeballs from leaking out of their sockets. A round of testing and a PC restart later, and zero was suddenly too dim – and indeed produced even worse results when I re-tested the PG27UQ to see what had happened. At that particular moment in time, somewhere around the 70 mark seemed more appropriate.

Then there’s the PG27UQ’s own HDR-specific ‘Reference Brightness’ setting to contend with. In non-HDR mode, you just get a normal brightness option like any other monitor. But turn on HDR and this splits into two settings – a greyed out maximum of 1000cd/m2, and a so-called ‘reference white’ option that seems to be based on whatever brightness setting you had in non-HDR mode. This set itself to around 220 during that initial test phase, but when I reset everything back to its defaults again, it only came up as 80.

Asus ROG Swift PG27UQ controls

Even without HDR-enabled testing software, then, there’s a certain difficulty in trying to establish the PG27UQ’s exact gamut capacity at any given moment. My results were obtained using the combination of brightness settings I thought most appealing/correct by eye at the time, but it’s clear there’s still quite a bit of potential variation depending on how you set them up – and that’s not even taking into account the kind of brightness adjustments you’ll have to make for playing games in the day as opposed to playing at night.

Either way, I have to say that, on paper, the PG27UQ isn’t quite the incredi-monitor I thought it might be.

Onscreen, however, the PG27UQ comes alive like no other monitor I’ve ever seen – except, perhaps, for the Acer Predator X27. Numbers be damned, this is still the most jaw-dropping implementation of HDR available right now, and I’ve never seen Final Fantasy XV look this amazing on any other HDR monitor I’ve tested, whether it’s the Samsung CHG90, BenQ EW277HDR or BenQ EL2870U.

Colours just pop on the PG27UQ with such a vibrant intensity that you’ll want to dye your chocobos every colour of the rainbow just to get that mad hit of reds, greens and blues as Noctis and his pals charge across the screen. Combine that with the monitor’s frankly ridiculous brightness – I measured occasional peaks of at least 800cd/m2 when I placed my calibrator over sun flares and street lamps – and you’ll never want to drive the Regalia round during the day ever again.

Thanks to its 384 dynamic backlight zones, neon signs and fluorescent bulbs burn with the same kind of ferocity as the sun when they’re pitched against the inky blacks of the dark night sky, and it gives the entire game a sense of realism and fidelity that just isn’t there when you’ve got HDR turned off. You might be causing long-term damage to your eyeballs in the process, but for Noctis and co. it’s worth every damn second.

Asus ROG Swift PG27UQ angle

Assassin’s Creed Origins and Far Cry 5 looked equally impressive in their respective pixel flesh as well, even if both of them required even further tweaking to their own in-game brightness sliders to create the same kind of ‘Holy mother of bears and camels’ impact as Final Fantasy XV.

In this sense, I don’t really care that the numbers aren’t as good as they could be. HDR remains an absolute pain in the backside to get right, but this is still a damn fine monitor that impresses right from the word go – even taking it out of the box feels good, like this is a proper, premium bit of kit you’ve just dropped a truck’s worth of money on.

The true test of whether the PG27UQ is, in fact, worth the price of a small car, will come when Acer release their Predator X27, which shouldn’t be too much longer after this one comes out on June 29. As much as I like the PG27UQ, there’s no way I can possibly recommend it as something you should buy right this second until I’ve seen what Acer’s got in store. After all, when both monitors share such similar specs, these two screens are destined for the kind of long-term rivalry that only thorough testing could possibly hope to untangle.

If you really can’t wait, of course, then by all means go for it if you’ve got the cash. While I continue to have a few reservations about the overall quality of the panel’s DCI-P3 coverage, there’s no denying that this is still a tasty, tasty screen if, like me, you’ve been dying to play Final Fantasy XV in all its 4K triple stacked ham, battered barramundi and horntooth meat pie HDR glory ever since you found out it was coming to PC. For everyone else, though, it’s time to play the ever-enjoyable waiting game until the Predator X27 rocks up at the RPS Treehouse.


  1. geldonyetich says:

    When you’re the best in town, you can afford to charge for it. But egads, it’s not even an ultrawide! That’s some pretty steep early adapter tax.

    • napoleonic says:

A helpful scribe notes: the word is “adopter”, because they adopt the new product earlier than other people.

      • BooleanBob says:

        I’m pretty sure early adapter is right in this case, because in 3 months’ time there’ll be another 7 competing standards and you’ll need to go out and re-buy all your cables again.

  2. caff says:

    Utterly ridiculous. Makes you wonder how much those BFG displays will cost…. £10k?

  3. Don Reba says:

    $2000 is about in the middle of the range I expected, so no surprise there. I think I’ll wait for the prices to come down and for the software to catch up before contemplating an HDR monitor. This doesn’t sound like a ton of fun:

    That is, when Windows 10’s playing ball and I’ve spent at least fifteen minutes fiddling around with the settings making sure the brightness level’s set correctly, followed by the same amount of time again in-game tweaking luminance levels to ensure the Fat Chocobo Triple-Decker sandwich in Final Fantasy XV isn’t a blown-out mess of white highlights and oversaturated colours.

  4. particlese says:

    Nice to hear how it works/looks outside the company of manufacturer PR folks – thanks! But yeah, that price seems bonkers for what I’m interested in.

    And even if I had the money to justify it, beaming logos around for all to see just seems like gold-plated Ferrari levels of tackiness. I thought expensive things were supposed to be elegant and subtle and not loudly advertise themselves if their owner doesn’t switch them off – unless perhaps the owner is a heavily sponsored sportster… Are these things supposed to be marketed at e-sports teams? It would especially make sense with them perched on those arena benches with the giant illuminated logo on the back.

    • HiroTheProtagonist says:

      It’s being marketed at people willing to spend $1500+ on SLI setups that would be required to get full 4k HDR 144fps out of it. There are some monitors capable of individual parts of that, but none that can actually pull off all three at once reliably.

      As for the aesthetics, I’m generally ambivalent/indifferent. They’re stupid, but I’d be too busy watching the screen itself to care what the housing looks like. Apparently there’s enough demand for garish lines and RGB lighting that things like this are made.

  5. MiniMatt says:

    Good lord. That’s about five times more than we paid for our last car…

  6. Cederic says:

    When it’s that much effort _and_ you have the testing kit to assure you’re getting it right, that’s just no good at all as a consumer device.

    I want a wide gamut monitor that works when I switch it on, let alone at that price :(

    Interesting that you like it anyway though.

  7. sosolidshoe says:

    Goddamnit, all I want is a reasonably good quality 27″ 1440p panel, with a reasonably good image quality, a reasonably good refresh rate, and ideally Gsync, that doesn’t cost an utter fortune. I don’t need all this extraneous LED garbage and tech that isn’t even close to ready for prime-time like 4K and HDR pushing up the price.

    But it seems like there are only two kinds of monitor now – store brand-level budget stuff(and more expensive monitors that are the same quality), or the ridiculous flash-above-function wallet-toys that primarily exist for the particular kind of “enthusiast” who wants to brag about how much they spent on their rig.

    • MacTheGeek says:

      I bought a Dell S2716DGR a couple years ago. 27-inch screen, 1440p, Gsync, looks amazing to my eyes. Amazon has it for under $450 (which is significantly less than what I paid in late 2015).

    • TormDK says:

      1440p is pretty easy to get a good monitor for.

      I’m not paying 2k+ euro for a monitor though, especially not “only” 27 inches.

    • brucethemoose says:

I got that… over 6 years ago, February 2012.

      1440p, 110hz, IPS, $400. Admittedly it’s not wide gamut or variable refresh like recent monitors.

  8. iainl says:

    That is a frankly massive premium over an entry-level 55” OLED TV that utterly devastates this on every quality measure, just for the privilege of a smaller screen.

    • ThePuzzler says:

      If I give them $10,000, can I get a 13″ version?

    • Don Reba says:

      It wouldn’t be a substitute, even if the size was not an issue. A TV will have high MPRT, and an OLED TV will have burn-in to boot.

      • iainl says:

        Burn in means it’s not something I’d leave a Windows bar on for hours as a primary monitor. But the lag is only 21ms, which is fine.

      • genosse says:

        Note: This is not directly aimed at Don Reba, just a general rant. No hard feelings mate, I just got triggered. ;)

I read about the burn-in issue a lot whenever someone mentions OLED TVs, but as someone who has been using a 55″ 4K LG OLED TV for PC gaming, console gaming, streaming and regular TV for over half a year, every day for multiple hours, I cannot confirm this – and I do not care at all if a fixed image stays on screen when I get up from the couch to do something else.

I don’t know what you have to do to the TV to get any burn-in that is not fixable with the built in maintenance tools (and even those I never had to use), but I honestly doubt that it could still be considered “normal use” for a television set. 24/7 server monitoring perhaps?

In fact, this is the best TV AND monitor I have ever owned and after getting used to it I would choose OLED every time if I had to make the decision again.

        But if you want to sacrifice pristine picture quality (especially with HDR content) for fear of burn-in that may or may not actually occur, that is fine of course.

        Personally, I buy a screen for the picture, but to each his own.

And about the input lag: If you think that 21 ms holds back your pro-gaming career, you have either already hit your skill ceiling and are a living god in your game of choice (congrats in that case, you can go and buy the £2500 ASUS from your sponsor’s money), or you are one of many people that think skill can be somehow bought if you just spend enough on “pro” equipment.

        It’s still worse within the HiFi and audio community, but gamers are slowly getting there. Difference is that gamer stuff looks progressively worse the more money you spend, like this hideous monitor, cluttered with decals, LEDs and fucking floor and ceiling projection just in case you are worried that others can’t find out quite fast enough what a gigantic tool you are by spending a fortune on the screen-equivalent of a badly customized Ford Fiesta from the 1990s with under-car neon, gigantic fake muzzle, glued on “carbon” and the cheapest body kit available. Vroom!

I own an Asus ROG laptop by the way, which is also really ugly with that stupid glowing tribal logo, just so you know I don’t hate the brand, and that I can perfectly understand how this monitor can be a desirable object on specs alone, despite the Fisher Price design.

        But really, does it have to look like that?

        Is there really a market for this, or are the guys at ASUS and other manufacturers just clueless? I just don’t see the target audience. Teenagers probably like the XTREME design, but which teenager has so much money to burn? I can’t quite wrap my head around it.

        Rant over.

  9. brucethemoose says:

    One thing to keep in mind: HDMI 2.1 (aka 120hz VRR) TVs are going to make this monitor obsolete in a year or two, if not less.

    You could maybe justify $1k as a long term investment. $2k just for early access, at a quarter of the screen size? No way, not in a million years.

  10. Kexin says:

    There is no way in hell I would pay the asking price for a HDR monitor when I can buy a TV with superior HDR performance that offers a shit ton more value.

  11. Raoul Duke says:

    I don’t understand why PC HDR can’t work like normal, TV/media HDR.

On my TV, which is 4K and HDR/Dolby Vision capable, if the source is non-HDR it just… displays it like a normal image. No dicking around with sliders and so on. Then if the source is HDR it just… displays it as an HDR image.

    The fact that something is HDR or non-HDR is right there in the data being provided to the monitor. Why on earth doesn’t Windows 10 just rely on this to allow monitors to choose whether to display in HDR or not? Why have something as awkward and ridiculous as having to adjust a slider to display non-HDR content while in HDR mode?

    I mean, there’s very little reason for the desktop itself or 99% of content you would view on the desktop to be HDR. It’s all very strange.

    • Don Reba says:

      You would need HDR support on the desktop to play HDR games and view HDR movies in windowed mode or in the browser. And for HDR images, once they become a thing.

      • Raoul Duke says:

        Yeah, sure. But that is an unavoidable problem with trying to run two incompatible display standards at once on a single screen. I.e., it’s just not a practical thing to do. It’s like trying to play audio with two different equaliser settings at the same time. And it doesn’t prevent a system by which you could choose to keep your desktop in non-HDR mode and have the OS seamlessly switch to HDR for full screen HDR content.

        Even so, I would have thought Microsoft could come up with an algorithm to properly resample non-HDR content to the HDR luminance range pretty easily.

        I dunno. The current situation is just not workable IMHO. And HDR is just plug and play on consoles and for other media (UHD blu ray, Netflix for instance).

  12. melerski says:

    I as a citizen of the republic of gaming demand more blinkylights!

  13. pevonzn says:

Does the monitor support HDR over HDMI as well as DisplayPort? Might be worth the price point if I can use it with both consoles and my PC.
