Nvidia’s G-Sync HDR monitors are the real deal and will be here in just four weeks’ time

The Asus ROG Swift PG27UQ is one of Nvidia's upcoming G-Sync HDR incredi-monitors

HDR, or high dynamic range, has been around for a while, and you’ve probably heard countless barks from your console box friends about how amazing that new Dad of War looks on their giant OLED telly, or how they can never go back to a ‘normal’ screen after experiencing the wondrous glory of Final Fantasy Toast (even we’re a bit guilty of that last one, Alec and I, so please accept our belated apologies for all our Ignis-related food ramblings). PC monitors, on the other hand, have been much slower on the uptake.

Now, however, Nvidia have finally got their butts in gear and are about to release two of their long-awaited G-Sync HDR monitors: the Acer Predator X27 and the Asus ROG Swift PG27UQ. I went to see them earlier this week and I’m happy to report that they’re both bloody amazing. HDR on PC is finally here.

Until now, every HDR monitor I’d seen was underwhelming. There was a flurry of excitement when Nvidia announced their very first crop of proper HDR monitors back in January 2017 (which included the X27 and PG27UQ), but then it all went a bit quiet. I got hold of BenQ’s EL2870U, which, while not majorly expensive, was a bit disappointing, and even when Alec tried his luck with the much fancier BenQ SW320, the amount of faff involved in getting HDR to play ball with Windows 10 just wasn’t worth the tiny improvement he saw in overall image quality.

But, on first assessment, the Acer Predator X27 and Asus PG27UQ do very much look like they’ll be both worth the wait and their respective weights in gold – and may even be some of the best monitors ever made. I’ll be upfront: you’ll have to fork out a fair wodge of cash for either monitor when both of them launch over the next four weeks or so. Nvidia wouldn’t be drawn on exact pricing, but admitted that both would cost “at least $1000” each, making them as big an investment as a fancy Ultra HD Premium-certified HDR TV.

That’s a lot of money to spend on a 27in display, even if it is 4K, has 384 dynamic backlight zones, 1000cd/m2 of brightness and a 144Hz refresh rate. No matter which way you slice it, we expect $1000 to equate to something much bigger, like all those giant 55in TVs everyone keeps raving about.

It’s an Asus monitor. Of COURSE it has rear RGB lighting and a stand that beams a giant ROG logo onto your desk and ceiling…

But if you’ve ever upgraded your TV for a new console (as my husband Matthew and I did recently for the Xbox One X), it’s not too much of a leap to apply the same kind of thinking to a new PC monitor, especially if you’re in the midst of building a new system now that prices for today’s best graphics cards have finally dropped down to something vaguely palatable.

Specs-wise, there’s not a lot to choose between them. As mentioned above, both have 27in panels with 3840×2160 (4K) resolutions, 144Hz refresh rates, and the same 384 dynamic backlight zones. The latter is particularly key for impactful HDR. The more backlight zones a monitor has, the better it is at showing light and dark parts of an image simultaneously without one compromising the other, allowing for deeper blacks, brighter, more eye-popping highlights and better contrast than a screen with fewer zones.

If that all sounds a bit abstract, think of it like this. You’re playing Final Fantasy XV, and Noctis and his pals find themselves running around Lucis after dark when you see the flashlights on their jackets burst into action. On non-HDR screens (let’s go with ‘standard dynamic range’ or SDR for the sake of acronyms), which are generally ‘edge-lit’ around the sides of the display and only have a maximum brightness of around 300-400cd/m2 if you’re lucky, those flashlights would certainly pack a decent amount of punch, but the surrounding shadows wouldn’t necessarily be as dark as you might expect, perhaps appearing as a slightly murky grey or just looking a bit washed out.

This image doesn’t really do the Predator X27 (right) justice compared to looking at it in the flesh, but you can still get a rough idea of how much better it handles night scenes than one of Acer’s SDR Predator displays

With HDR, a maximum peak brightness of 1000cd/m2 and 384 backlight zones at your disposal, however, that same scene is transformed – as Nvidia were keen to show me in my demo session. Noctis’ flashlight turned into a veritable beacon of light against the night’s pitch-black hue on the Acer Predator X27, and other reflections on shiny road signs morphed from (what people in the imaging biz call) ‘clipped’ patches of dim milky white into smaller, more refined orbs that shot out of the darkness. No light bleed, no haloing; just eye-searing torch light against the inky midnight sky.
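
If you fancy seeing how full-array local dimming pulls that trick off, here’s a toy sketch in Python. It’s an illustration of the general idea only, not Nvidia’s actual algorithm: the frame is carved into a 24×16 grid of zones (384 in total, as on these monitors), and each zone’s backlight is driven only as bright as its brightest pixel demands.

```python
# Toy model of full-array local dimming (FALD) -- an illustration of the
# idea, not Nvidia's real controller logic. The frame is a 2D list of pixel
# luminances in [0.0, 1.0]; the backlight is a 24x16 grid of zones
# (24 * 16 = 384, as on the X27 and PG27UQ).

def zone_backlight(frame, zones_x=24, zones_y=16):
    height, width = len(frame), len(frame[0])
    levels = [[0.0] * zones_x for _ in range(zones_y)]
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            zy = min(y * zones_y // height, zones_y - 1)
            zx = min(x * zones_x // width, zones_x - 1)
            # Each zone only needs to be as bright as its brightest pixel.
            levels[zy][zx] = max(levels[zy][zx], pixel)
    return levels

# A mostly-black frame with one bright "flashlight" pixel: only the zone
# containing it lights up, so the rest of the screen can stay truly black.
frame = [[0.0] * 96 for _ in range(64)]
frame[10][10] = 1.0
levels = zone_backlight(frame)
lit = sum(level > 0 for row in levels for level in row)  # -> 1 lit zone
```

With only one lit zone out of 384, the rest of the backlight can switch off entirely – which is exactly why the shadows stay inky instead of glowing grey.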

Admittedly, I’m not sure what kind of long-term effect all this extreme luminance will have on my eyes when I eventually get round to testing these monitors. But because the brightest areas are just that – spots – rather than the entire screen scorching my face at 1000cd/m2, I’m hoping it should be business as usual in terms of how long I can sit at my desk and carry on playing without feeling like I need to go and sit in a dark room for the rest of the afternoon.

HDR isn’t just about an increased range of white to black, however, as within that range you’re also looking at significantly more shades of colour, too. As I mentioned briefly yesterday in my post about the upcoming BenQ EX3203R FreeSync 2 monitor, the number of colours (and by extension the amount of luminance, or white to black) you get on an SDR screen is like the long, thin bit of a funnel. HDR, on the other hand, is like the wide, outer rim bit, giving you many, many more colours than you had before.

Again, it’s hard to tell from just a photo, but trust me, this is the BEST Cup Noodle sign you’ll ever lay eyes on…

As a result, parts of an image that were once lost to clipping are now almost fully restored in all their lovely detailed glory. This includes small things like being able to see further into those sun flares and reflection spots before everything turns white, and bigger things like the white-hot heat of an Iron Giant’s massive sword – which just looked ‘white’ on the SDR screen, but when viewed in HDR suddenly burned with the same intense shade of bright, fluorescent orange you might see on a toaster filament. It sounds silly, but looking at that sword on the X27 almost gave the impression it was actually giving off its own heat, it was that convincing.
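
To put some (entirely made-up) numbers on that clipping effect, here’s a quick Python sketch – the luminance values below are hypothetical, purely for illustration:

```python
# Rough illustration of highlight clipping, using made-up scene luminances.
# An SDR panel topping out around 350 cd/m2 squashes every highlight above
# that level into the same flat white; a 1000 cd/m2 HDR panel keeps the
# gradations, so a sword edge at 900 still looks hotter than one at 500.

def displayed(scene_nits, peak):
    # Anything brighter than the panel's peak is clipped to that peak.
    return [min(value, peak) for value in scene_nits]

sword_glow = [200, 350, 500, 700, 900]   # hypothetical scene luminances
sdr = displayed(sword_glow, peak=350)    # -> [200, 350, 350, 350, 350]
hdr = displayed(sword_glow, peak=1000)   # -> [200, 350, 500, 700, 900]
```

The SDR panel keeps only two distinct levels of this highlight; the HDR one keeps all five – that’s the detail being “restored” above.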

Colours generally were more vibrant and vivid on the X27 as well, with the red and yellow decals adorning Noctis’ monster Regalia car appearing much richer than they did on the SDR comparison screen. Again, just as the darker areas might appear washed out on an SDR display, so too do its colours – which is something you’ve probably never thought too much about before because, to your eyes, they still look perfectly fine.

But ‘fine’ isn’t good enough for Nvidia, which is why both the Predator X27 and Asus PG27UQ are targeting the DCI-P3 colour gamut instead of the ordinary sRGB gamut. This is something AMD’s FreeSync 2 monitors are going after as well, and it’s also the gamut used by all those high-end TVs – every certified Ultra HD Premium TV must cover at least 90% of it.

Thankfully, Acer’s ditched the red and black colour scheme of their previous Predator line for the X27, but there’s no escaping those pointy stand angles…

Without getting too technical about it and bombarding you with gamut charts and all that jazz, sRGB covers a pretty decent portion of the known visual spectrum – about 35%, if you want to put a number on it. It’s the standard everything has used up until now, and for the most part, it’s been just that – fine. DCI-P3, on the other hand, covers around 54% of what the human eye can see, and is roughly 25% larger than sRGB – hence that funnel metaphor earlier.
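
If you want to sanity-check that comparison yourself, the relative size of the two gamuts can be estimated from the published chromaticity coordinates of their red, green and blue primaries – each gamut is just a triangle on a CIE chromaticity diagram, so the shoelace formula does the job. (Exactly how much bigger DCI-P3 comes out depends on which diagram you measure on – roughly 25% on the newer CIE 1976 diagram, a bit more on the older 1931 diagram used below – so treat the figure as a ballpark.)

```python
# Estimate relative gamut size from the standard xy chromaticities of each
# gamut's RGB primaries (CIE 1931 diagram), via the shoelace triangle area.
# Percent-of-visible-spectrum figures vary with the diagram used, so this
# ratio is only a ballpark comparison, not a precise coverage number.

def triangle_area(primaries):
    (rx, ry), (gx, gy), (bx, by) = primaries
    return abs(rx * (gy - by) + gx * (by - ry) + bx * (ry - gy)) / 2

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

ratio = triangle_area(DCI_P3) / triangle_area(SRGB)
# ratio comes out a bit over 1.3 on this diagram: DCI-P3's triangle is
# noticeably bigger than sRGB's -- the funnel widening out.
```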

Neither Acer nor Asus is putting a number on their proposed DCI-P3 gamut coverage at the moment (Acer, in fact, ignores it completely and refers to the in-between Adobe RGB gamut on its website instead), but even without my calibrator to hand to test it myself, it was clear the X27 was leagues ahead of the adjacent SDR display, offering more life-like colours that elevated my entire gaming experience. Our collective RPS dream (all right, just mine and Alec’s) of seeing Ignis’ delicious foodstuffs realised in all their drool-worthy 4K HDR glory is finally at our fingertips.

The reason I haven’t talked about the Asus ROG Swift PG27UQ very much is because, out of the two monitors, it was the one running a demo of Destiny 2 sans comparison monitor. Maybe it’s because I don’t know Destiny 2 as well as I know Final Fantasy XV, but here the effect didn’t seem quite as pronounced. Yes, Destiny 2 looked lovely, with all its bright lasers and pew pew guns piercing the alien gloom in a dizzying spectacle of light and colour, but it was hard to tell exactly how lovely it was when I didn’t have an SDR screen next to it to point and laugh at.

Taking good pictures of monitors is HARD, okay?

I don’t mean this as a slight on the PG27UQ – I imagine Final Fantasy XV will look just as pretty on that as it does on the X27 when I get it in for testing. But it suggests that some games might look better than others in HDR – and when there are currently fewer than 40 games that actually support HDR on PC, that might not be enough for some to justify dropping over $1000 on a new screen right this second (even if both monitors come with all the advantages G-Sync has always brought to PC monitors, namely Nvidia’s variable refresh rate tech for smoother, stutter- and tear-free gaming, plus low input lag).

Limited game support aside, though, seeing both the X27 and PG27UQ has genuinely made me feel a lot more excited about the rise of PC HDR. So far, one of my key complaints about the other HDR monitors I’ve tested, whether it’s the BenQ EL2870U or even the FreeSync 2-enabled Samsung CHG90, is that their peak brightness is just too low. It lessens HDR’s impact and just doesn’t compare to what you’d get on a high-end TV.

Colour accuracy and DCI-P3 coverage alone just isn’t enough to produce worthwhile HDR in my book. Your eyes adapt too quickly to pretty colours onscreen, and after a couple of weeks you’ll probably wonder why you spent such a large amount of money on a monitor that might look a bit better than your old one, but not so much better that you can really tell the difference. Really, it’s a screen’s brightness that counts in this arena – although having all those dynamic backlight zones certainly helps – which is why the X27 and PG27UQ’s peak brightness of 1000cd/m2 makes them such enticing prospects.

This is HDR on PC as it should be – Windows 10 faffing be damned. Our console couch friends have been enjoying this kind of HDR for years, so it’s high time us PC peeps had our own version of Ultra HD Premium TVs that we can actually put on our beloved desks – and from what I’ve seen so far, the X27 and PG27UQ look like they’ll deliver just that. Stay tuned for my full reviews over the coming weeks.

58 Comments

  1. Don Reba says:

    I wonder if I could have an HDR and a non-HDR display side by side.

  2. Aerothorn says:

    Katherine, would you investigate in the future whether HDR displays make sense for colorblind individuals? I imagine the ‘wider color gamut’ wouldn’t do a lot for people who already can’t distinguish a lot of gradations, and would change the calculus as to whether these are worth buying.

    • aircool says:

      As a colourblind person myself, I’d say you’d still benefit from a HDR display. The fact that we have problems seeing ~1/3 of the colour spectrum won’t change, so the quality of colour depth would still be improved for those colours we can perceive.

  3. mkotechno says:

    Nvidia should tell the prices of upcoming BFGD or STFO.

    • KaiUno says:

      Yeah.

      Also… I thought we were doing ultra-wide now. I’m not going back to 16:9. I can’t. I won’t.

      Maybe for a BFGD. But not for a measly 27″

      • Kirudub says:

        Yeah, I’ve been waiting for a new monitor for … well, a long time. My HP LP3065 was made in 2008. I need 30″ 2560×1600/1200 minimum, and for the love of crumbcake, *don’t* curve the damned thing. I want to use my monitor for work/play. Designing for my biz (print/packaging) is a big “nope” on a curved screen.

  4. LilSassy says:

    Looking forward to sending the same monitor back 4 or 5 times just to get one that doesn’t have screen tearing or dead pixels.

    • televizor says:

      I think you mean light bleeding and dead pixels.
      But yeah, that’s been the case on lots of forums I checked in regards to buying a new monitor

    • sion12 says:

      It should be alot better this time round. first its a new panel, so there should be improvement. secondly its VA, not IPS panel, and lastly, this is premium product, so QC should be much higher, and its a FALD monitor, it should have less problem than the current crappy 144hz IPS monitor

  5. iARDAs says:

    Can’t wait to buy one of these bad boys and pair it with my GTX1060

  6. falcon2001 says:

    I’d definitely check out something in ultrawide with gsync and HDR, but man I don’t want to give up my ultrawide.

    • Glubber says:

      Yeah, I also went UW recently, and cant imagine going back. Happy for the evolution, but will wait for wider displays.

    • Nolenthar says:

      They are coming too, no worries, though it will be Q3 / Q4, and we’ll have to sell a kidney or two (as long as we keep our eyes, we shall be fine) ;)
      But fully agreed, not going back to Widescreen after knowing ultra wide.

  7. MiniMatt says:

    Even the pimply lads who race Vauxhall Corsas up and down the hill every Friday have realised neon underglows are a bit naff, but still we have gaming tech with the same aesthetic. Just stick a fibreglass whale-tail spoiler and phat exhaust pipe on yer monitors why don’t you.

    Sorry, getting old. And get off my lawn.

    From the piccies can we take it there might be the odd logitech keyboard review in the works? My creaky 12 year old G15 is about given up the ghost.

    • grimdanfango says:

      I hated that shit way back when I *was* 15… it doesn’t mean you’re getting old, it just means you have some taste.

    • sion12 says:

      I am sure you can turn it off. (imo, the only acceptable RGB LED usage is on keyboard, it just looks tacky and unnecessary on anything else.)

      also if price is not a concern, Corsair K70 is one of the best and most popular mech keyboard

      • duns4t says:

        Perhaps, but if it’s like a Razer device, not before downloading some bloatware and registering yourself with their data collecting business. Ornamental LEDs can DIAF.

        • particlese says:

          Red backlight LEDs for the keys, on the other hand, are typically much better than RGB rainbowfests for low-light hand positioning since red doesn’t get your eyeballs excited quite so much (hence astrophotographers running around in the dark with red lights). And that Corsair, at least, lets you just push a dedicated button to switch between various levels of lighting – no custom software involved at all. (I still haven’t even touched the CD that came with it, assuming one even did.)

  8. cultiv8ed says:

    I’m looking forward to this tech filtering down to mid and low range. I haven’t bought a pc monitor for 8 years so I’m due an upgrade.

  9. Stromko says:

    I was almost convinced until that 4th to last paragraph that extols the wonders of G-Sync, and remembering that the monitor I’m viewing this on has G-Sync but I had to turn it off immediately because it caused a lot of stuttering and tearing, the exact thing it’s supposed to prevent. You’d think a monitor that supports NVidia’s flagship anti-tearing / anti-stuttering technology would pair well with their 1080Ti, but apparently not.

    • Nolenthar says:

      Sorry mate but something is terribly wrong with this statement / setup. Gsync works wonder. You may want to review settings/config/OS as the only time when I saw tearing with a Gsync monitor (I owned 2 with 4 different Nvidia GPU) was when my game refresh rate was not aligned with my settings (like game believing my refresh rate is 100 when it was mistakenly set up at 60 on windows)

    • Bucketear says:

      Either you have faulty equipment or incorrect settings. The problems you are facing are not indicative of gsync performance.

  10. KFee says:

    Unfortunately way too small display for me. When you are used to 40 inches, it’s hard to suddenly have 13 inches less. Too bad as I am very interested in G-Sync HDR.

    • Nolenthar says:

      Big gaming monitor (or whatever they are called) from Nvidia are following those monitor quite soon. They will be similarly specced but with up to 65 inches – though they are more intended for the living room, no one will throw you in jail for using a 65 inches screen at 1 meter from the monitor ;)

    • Don Reba says:

      It’s just the opposite for me. Having become used to a high pixel density, I would not want to go back. 27″ is really the maximum size for a 4K monitor.

  11. Thulsa Hex says:

    I have a(n) HDR question if someone wouldn’t mind indulging me: do decent HDR displays provide any general gains for “SDR” content? I’m particularly interested in black levels and contrast ratio.

    For example, I run a HDMI cable to my living room’s Panasonic plasma TV where I play most AAA games/anything that works better with a controller. Being a plasma screen, games that utilise a lot of light and shadow (e.g. Dishonored 2) look fantastic on it, greatly increasing the sense of atmosphere.

    On the other hand, I prefer playing Metro: Last Light at my desk with a KB&M, but the murky greys my BenQ LCD monitor tries to pass off as “black” do the game no favours. Knowing how much better(/spooky) these dilapidated subway tunnels would look and feel on the plasma TV pains me somewhat.

    I’ve been afraid that an upgrade to 4K TV would come at the cost of my plasma’s yummy blacks, but reading about HDR mitigates that fear–at least when it comes to content that supports the technology. What I haven’t been able to discern is whether these displays produce competent-to-excellent blacks/contrast levels for regular stuff. If so, one of these newfangled monitors might actually be worth it for me somewhere along the line, even if few PC games take advantage of the actual HDR capabilities for now.

    • DanMan says:

      You can benefit from the excellent black point of those HDR displays even for SDR content because the SDR black point is also actually defined at zero for both sRGB and Rec2020. You might want to play with the gamma setting a bit though. In any case, the much higher brightness is wasted.

  12. Dandadandan says:

    Yeah I just spent all my money on 34″ UW 1440P 100Hz screen so I’m not changing screens until this one dies.

    That said the jump from 60Hz to 100Hz made such a massive difference I’m kinda keen on trying 144Hz.

    • fray_bentos says:

      I have a 165 Hz G-sync monitor, unless I sit and watch the FPS counter, I only notice drops below 100. It is possible to tell the difference from 100, to 120 to 144 to 165, but not reliably. I can’t go back to playing 60 fps now, nevermind 30fps, it just looks like a slide show.

  13. televizor says:

    No news on ultrawide HDR monitors with G-sync?

  14. OmNomNom says:

    Meh 4k. When do the decent ultrawide (non VA, 1440+ vertical) screens arrive?

  15. Cederic says:

    It would be lovely to hear real world experience with ghosting and viewing angles with these monitors.

  16. DeadlyAvenger says:

    Can anyone explain why the games/content need to support HDR? I would have thought that the backlight control is just a function of the difference between dark/light parts of a scene…or is there more to it than that?

    • DanMan says:

      Take a rubber band and draw a few strokes on it like on a tape measure. Now pull it apart. What happens? The strokes become much wider, or in other words: less fine grained/detailed.

      Same thing with HDR. If you just stretched SDR content across a wider range of brightness, it’d look terrible (lots of banding). You need the game to render in that much higher dynamic range.

      On top of that, the brightest spots in SDR are defined to represent about 100cd/m². The added dynamic range in HDR goes on top of that. The *average* picture brightness (APL) in an HDR scene is actually very similar to SDR. Exactly so that you don’t tire out your eyes.

  17. aircool says:

    I don’t know if I’d want a 4K monitor as I only have a 1070ti and like to play games at ~120fps. I suppose I could just lower the resolution whilst playing games in exchange for a decent framerate, but I’d feel as though I’d not be getting the best from the monitor.

    I’ve only got a 24″ 1080p Asus G-sync, but it goes up to 180Hz, and running DOOM (Vulkan) with my current setup, the 1070ti can easily keep up with the max refresh rate and look terrific.

    I think a 1440p would have been a better option, as 4K gaming with decent framerates AND top tier graphics in games are still a few years away.

    • Drift Monkey says:

      I also run a similar 1080p 180hz setup that I enjoy. It’s probably unpopular, but I’d consider a 24″ HDR 144-240hz upgrade…I don’t think we’re really at reasonably affordable, fluid 144hz+ 4K yet.

  18. Zeneage says:

    Please tell me they will do 1440p versions of these screens. I can’t believe, even with the next gen of gfx cards, you will be able to run games at 4k 144hz on ultra

    • Don Reba says:

      Gaming on a 4k 27″ monitor myself, in almost all games, I would rather dial down the effects than resolution.

    • DanMan says:

      Many games these days come with resolution scaling. So you can have a native 4k resolution, but the rendering happens at a lower scale to save performance.

  19. Chrithu says:

    1000$+? I guess I’ll wait some time. I just made the jump to VR, though I am still waiting for my Samsung Odyssey to arrive from the US.

  20. JoeD2nd says:

    Yeah, HDR has already been here on PC. I guess my Samsung doesn’t count in you guys’ eyes. Of course, now Windows has to get off it’s ass and fix their worthless HDR support.

    • DanMan says:

      You only need support in Windows to show HDR content in a window, like in a video player. If you run the software in fullscreen mode, it can take full control of the display. That’s why you can have HDR in games on Windows versions <10. Unless they deliberately block you out like FF15 and the MS games do.

  21. joekiller says:

    Thanks for the explanation and preview. Looking forward to the full review too!

  22. DanMan says:

    Thanks for stressing the point that 400 cd/m² isn’t enough on a LCD. I’ve been saying the same thing from the beginning and caught a lot of flag for it. Mostly from people who hadn’t even seen HDR at its full potential yet.

    I’ve already invested in a UHD Premium TV though, so I’m not looking to spend another pile of money on a HDR monitor. Got my PC hooked up to the TV, that has to do.

  23. fish99 says:

    I’m interested in a 27″ 1440p (or 4K) 144Hz screen at some point in the future, but they currently cost a bit too much for my budget. I guess you can add HDR to that list as well now, although it’s probably gonna push the price up.

    Not bothered about g-sync though, especially given how much it adds to the cost of a monitor, and I know quite a few people with g-sync monitors who either can’t tell when it’s switched on, or find it causes problems. I already have a 120Hz screen and a high refresh rate seems to entirely fix the problems of tearing/v-sync lag, so I’m not entirely sure why g-sync/freesync are needed.

  24. Enko says:

    Got Predator X27 (144hz) for about $700. Its totally fine and I love the resolution.

    Will consider this when its around the same price tag (wtf $3000?). I guess it will be 5 years before then. But at least GPU’s should catch up. Even Dota2 doesn’t run at 144 yet at 2560 (1070ti here).
