BenQ’s EX3203R monitor joins AMD’s FreeSync 2 HDR ranks

BenQ EX3203R

After much weeping and gnashing of teeth and more than a few disappointing first looks, proper, honest-to-goodness HDR finally looks like it’s about to become a reality on PC. We’ve already seen how AMD’s FreeSync 2 tech made the Samsung CHG90 one of the best monitors I’ve ever tested, and over the coming weeks it will be joined by the rather stunning Nvidia G-Sync HDR-enabled Acer Predator X27 and Asus ROG Swift PG27UQ as well.

I’ll have more words on those two Nvidia monitors in the next day or two, but those on the FreeSync side of the HDR fence need not fret about being left behind, as AMD have announced another addition to their FreeSync 2 roster, this time in the form of the BenQ EX3203R.

To recap, HDR (or high dynamic range) is an enhanced bit of display magic that produces more vibrant, life-like colours, brighter whites and darker blacks, making an image look much closer to how we perceive it in real life than your average display can manage. An easy way to picture it is as a funnel: if the long, thin bit is how many colours you’re getting on a standard display, the wide outer rim is how many you’re getting with HDR.

Not all HDR displays are created equal, however, as a large part of their overall impact comes down to two things: the bit depth of the panel (the best HDR displays use 10-bit panels rather than regular 8-bit ones) and the screen’s maximum brightness, as the brighter things are, the more vivid we perceive them to be.
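
To put that bit depth difference into rough numbers, here’s a quick back-of-the-envelope sketch of my own (nothing from any spec sheet, just counting colour combinations per channel):

```python
# Back-of-the-envelope: distinct colours available at a given panel bit depth.
# "bits" is per colour channel (red, green and blue).
for bits in (6, 8, 10):
    shades = 2 ** bits        # shades per channel
    colours = shades ** 3     # total colour combinations
    print(f"{bits}-bit: {shades} shades/channel, ~{colours / 1e6:.1f}M colours")

# 6-bit: 64 shades/channel, ~0.3M colours
# 8-bit: 256 shades/channel, ~16.8M colours
# 10-bit: 1024 shades/channel, ~1073.7M colours
```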

This is why both AMD and Nvidia’s new HDR standards – FreeSync 2 and G-Sync HDR respectively – have their own certification processes, so you know exactly what you’re getting if and when you decide to buy one. From the look of the BenQ EX3203R’s specs, however, it would seem Nvidia’s HDR standard is a lot more stringent than AMD’s, as the EX3203R’s curved 31.5in VA panel only has an 8-bit colour depth and a peak brightness of 400cd/m².

That 400cd/m² may well turn out to be a typical brightness figure for everyday use rather than a true peak (it’s hard to tell from BenQ’s rather vague spec sheet, and I won’t know for sure until I get one in for testing), but it’s a little worrying nonetheless.

Still, with a claimed DCI-P3 colour gamut coverage of 90% – the very same DCI-P3 percentage touted by your fancy Ultra HD Premium-certified TVs – the EX3203R certainly looks like it will have colour accuracy on its side, and a 144Hz refresh rate is pretty nice as well. Bundle that in with a resolution of 2560×1440, a USB Type-C port, a height-adjustable stand and all the flicker-free / low blue light options you’d expect on a high-end monitor to help keep eye strain to a minimum – not to mention everything else that comes with being a FreeSync display, such as adaptive refresh rates and stutter-free gaming – and there’s certainly a lot to like here.

What’s more, specs like this imply the BenQ EX3203R will be a heck of a lot cheaper than either of Nvidia’s G-Sync HDR monitors (which I’ve been told will cost at least $1000 when they launch either later this month or in early June), and it will hopefully be a much more affordable entry-point into this snazzy new dawn of display goodness than anything else I’ve seen so far.

I’ll be able to give a more thorough verdict once review samples become available (which should be ‘soon’ according to AMD), but for now it’s certainly one to keep an eye on if you own an AMD graphics card and don’t fancy forking out £1075 / $1100 for the Samsung CHG90.

11 Comments

  1. davebo says:

    The difficulty is in convincing me I need an HDR monitor when all the photos and videos of them I see make me think “huh, that looks pretty good”, then I realize I’m watching on a non-HDR screen and am perfectly happy. I’m still waiting for the new HDMI standard with variable refresh rate to be implemented so I can bypass G-Sync entirely.

    • bp_968 says:

      I have all Nvidia GPUs but a FreeSync-compatible 144Hz monitor. I just can’t bring myself to cough up the huge $300-400 premium for a G-Sync-compatible display, and I’d be taking a step back on the GPU by switching to AMD (I’m using a 1080 Ti). So for now anyway I’ll be sticking with plain ole 144Hz without variable display sync.

      Since it seems unlikely Nvidia will ever play nice with open standards like FreeSync, I’m with you on the HDMI standard. The sooner the better, so we can stop this nonsense of having to choose between the Nvidia money-tax and the AMD performance-tax when looking for a variable sync monitor.

      • sion12 says:

        For what it’s worth, I got the Acer XB271HU at a discount, and I personally don’t see any difference at 100fps+ with G-Sync on or off.

        I hate how HDMI doesn’t support 1440p at 144Hz though, and yet it gets higher boot priority than DisplayPort in Windows, so if you connect any other display device like a TV, your monitor won’t display anything until Windows has fully booted.

        • bp_968 says:

          I bought the XF270HU (I love Acer monitors!). I only snagged it because I found it for $350, and the cheapest I’ve seen the G-Sync versions was $600+. Knowing what I know now about 144Hz+ gaming, if I had it to do over again I’d probably happily go as high as $500-ish for the same monitor; it really does make a huge difference!

          You’re also not the first person who has told me that once you’re past 100-120fps you don’t really notice the G-Sync that much. I also didn’t know about the Windows boot priority issue, but it explains a lot now that you mention it. I have 4 monitors attached, 2 via HDMI and 2 via DisplayPort, and it always shows the Windows boot info on my HDMI monitor and not my DP one. It’s not a huge deal but it is a little annoying.

          I previously had an Acer 27″ 2560×1440 60Hz IPS panel and grabbed this one almost exclusively because of PUBG, but it *had* to be an IPS panel (no way was I going to step backwards to a TN panel, regardless of the refresh rate). I’m glad I got this monitor, but I still plan on an upgrade in a few years depending on how adaptive sync and HDR end up panning out.

          My (current) “halo” monitor would be a curved 34-38″ ultra-wide at 3440×1440, possibly 3840×1600: IPS or OLED, HDR, 10-bit, adaptive sync (HDMI-standard or G-Sync), and 120-144Hz. At first I wanted to move up to 4K, but honestly, after playing around with the ultra-wide monitors, I feel like, for gaming anyway, the extra pixels are better spent on an ultra-wide display than simply a denser one. 4K gaming looks nice, but it’s really not *that* much nicer than 2.5K gaming and it’s vastly harder on the GPU. The 3840×1600 would be nice for desktop space but takes a 33% fps hit (2.5K@144fps ends up being about 120fps@3440×1440, 100fps@3840×1600, and all the way down to 70fps@4K!!), as the rough sketch below shows.

          Those numbers are part of why I laugh when the console crowd talk about how their new “pro” or “X” consoles are “4K” consoles. lol. Sure, 4K at 30fps with the settings all turned down, but what exactly are we trying to accomplish at that point? It’s going to be another couple of generations before 4K truly arrives on console, and probably this generation (11xx/20xx) that 4K/60fps arrives for the masses on PC. Widespread adoption of adaptive sync would really help, though, because of how useful it is for games/systems not quite able to sustain 60fps or 30fps.
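
          (That fps sketch, for the curious – naive math that just assumes frame rate scales inversely with pixel count; games are rarely purely pixel-bound, which is why the real-world figures above land a bit higher:)

          ```python
          # Naive estimate: assume fps scales inversely with total pixel count.
          # Real games are rarely purely pixel-bound, so treat these as floors.
          base_w, base_h, base_fps = 2560, 1440, 144  # the "2.5K" baseline

          targets = {
              "3440x1440 ultra-wide": (3440, 1440),
              "3840x1600 ultra-wide": (3840, 1600),
              "3840x2160 4K": (3840, 2160),
          }

          for name, (w, h) in targets.items():
              est = base_fps * (base_w * base_h) / (w * h)
              print(f"{name}: ~{est:.0f}fps")

          # 3440x1440 ultra-wide: ~107fps
          # 3840x1600 ultra-wide: ~86fps
          # 3840x2160 4K: ~64fps
          ```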

          Edit: bleh. I wrote a novel here.. sorry about that.

        • laiwm says:

          I don’t see much difference between 100, 120 and 144 either, but where my FreeSync monitor shines is in running newer or more demanding games. On a fixed refresh display the difference between 55 and 65fps is huge because you’re essentially flipping between 30Hz and 60Hz, but with variable refresh I can tune the settings so it stays at 60-ish most of the time, and I don’t notice stutter unless it dips into the low 40s.
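
          (A minimal sketch of that flipping effect, assuming simple double-buffered vsync where a frame that misses the refresh window waits for the next one – real drivers with triple buffering are messier:)

          ```python
          import math

          # Effective display rate with vsync on a fixed 60Hz monitor:
          # each rendered frame occupies a whole number of refresh intervals.
          refresh_hz = 60
          interval = 1.0 / refresh_hz  # ~16.7ms per refresh

          for render_fps in (65, 60, 55, 45):
              frame_time = 1.0 / render_fps
              refreshes = math.ceil(frame_time / interval)
              print(f"{render_fps}fps rendered -> {refresh_hz / refreshes:.0f}fps displayed")

          # 65fps rendered -> 60fps displayed
          # 60fps rendered -> 60fps displayed
          # 55fps rendered -> 30fps displayed
          # 45fps rendered -> 30fps displayed
          ```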

  2. sion12 says:

    I just want to see a cheaper FreeSync 4K 144Hz non-HDR monitor.

    • bp_968 says:

      I just want to see modern games that are playable at 144Hz @ 4K! Even at 2560×1440 with an 8700K *and* a 1080 Ti, very few modern games peg 144fps at ultra/epic settings. Based on some of the spec leaks I’ve seen so far for the 11xx/20xx series of cards, I don’t think we will be seeing 4K@144 this generation either (though it might get really close).

  3. particlese says:

    Nice to hear AMD’s getting some more love here, too, at least in the gamut department! Hopefully BenQ is receiving a 10bpc signal and doing some form of dithering with it. My old Acer does the old flickering-pixels trick with an 8bpc signal into a 6bpc panel at 60Hz, and I still find it to be fine as long as it’s not too close to my face or there’s sufficient movement going on – and I’m even one of those people who busts out a color calibration kit for every screen.

    • bp_968 says:

      I usually do the same thing (color calibration), but with my new 144Hz/IPS panel I noticed that properly color-calibrated was *not* also “game-calibrated”, and actually made games like PUBG worse because it made it more difficult to see other players. Thankfully the Acer panel has a reasonably decent settings menu, so I can have a “game” mode and a “color” mode and switch between them. Doing the same thing with Windows color profiles isn’t quite as easy, sadly.

    • Don Reba says:

      I used to be against dithering in the distant past, but then I happened to write some WinCE software and discovered that the difference between even gradient-rich, sophisticated 24-bit graphics and their well-dithered 565 16-bit versions was barely noticeable at 300ppi.

      I am now convinced that a high-pixel-density monitor would get no perceptible benefit from a 10bpc pixel driver. The VESA DisplayHDR standard agrees.

      • Don Reba says:

        To clarify, this is to agree with particlese that what you need is 10bpc output plus dithering. Many people chase numbers and insist on having 10bpc pixel drivers, but that’s not actually worth paying for. Also, the dithering algorithm’s quality matters.
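
        (For illustration, a minimal sketch of that kind of quantise-and-dither step – ordered/Bayer dithering here, which is just one algorithm among many:)

        ```python
        # Quantise one 8-bit colour channel down to fewer bits with ordered
        # (Bayer) dithering, so banding spreads out as position-dependent noise.
        BAYER_4X4 = [
            [ 0,  8,  2, 10],
            [12,  4, 14,  6],
            [ 3, 11,  1,  9],
            [15,  7, 13,  5],
        ]

        def dither_channel(value, bits, x, y):
            step = 255 / (2 ** bits - 1)                      # quantisation step size
            threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16  # 0..1, varies per pixel
            level = int(value / step + threshold)             # rounds up or down by position
            return min(level, 2 ** bits - 1)

        # A dark horizontal gradient quantised to 5 bits (the "5" in 565):
        # neighbouring pixels land on different levels, so area averages
        # roughly preserve the original value.
        print([dither_channel(v, 5, v, 0) for v in range(16)])
        ```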
