Asus PG348Q: Second Coming Of The Monitor Messiah?

OK, this is a little embarrassing. Last July I hailed, albeit with the usual journalistic qualifications, the Asus MG279Q as the Messiah of Monitors. Now I’m doing it again. And it’s another ruddy Asus monitor. But there’s nothing to be done. I cannot unsee what has been seen. And what I’ve seen is the new Asus RoG Swift PG348Q in all its 34-inch, curved-screen, IPS-panel, G-Synced and 100Hz glory. Nurse!

By way of preamble, it’s ironic that the better LCD monitors get, the more it strikes me just how unsuitable the underlying technology is for full-colour screens. I’ve mentioned this before, but LCD is a dumb idea.

Made up of a grid of tiny shutters, it attempts, imperfectly, to control the transmission of light from a rear-firing backlight. But there is always some leakage. And that not only means you can never have perfect control of contrast and colours. It’s also the source of viewing-angle-related issues.

The fact that the liquid crystals are, ultimately, moving objects and take time to respond only makes matters worse. It’s why early LCDs were chronically blurry when showing moving images. The response problem still hasn’t been entirely solved, and some of the technologies designed to accelerate response can themselves be problematic.

Of course, with the backlight plus the LCD panel itself, the whole affair is relatively bulky and complex. In other words, LCD tech is one big overcomplicated kludge. All of which only makes the new Asus RoG Swift PG348Q all the more impressive. Because the last thing you’re thinking as you soak up the visual splendour is, “this LCD stuff. It’s a bit crap, isn’t it?”

Instead, it has me marvelling at human ingenuity. This thing does things that you feel oughtn’t be possible with LCD tech. There’s the curved panel for starters. I blow a bit hot and cold regarding this aspect. But at its best, the wrap-around ambience definitely adds to the sense of gaming immersion.

That said, it’s a little difficult to unpick that from the sheer scale of the 34-inch super-wide 21:9 aspect LCD panel. I use a 40-inch 4K monitor as my daily, but the extreme aspect of this thing means it still feels dramatic. In terms of the size and format, there’s no question I’d rather game on this than a 40-inch 16:9.

Of course, all of that has been available before. It’s from here that the new PG348Q begins to diverge from the norm. The quality of the IPS LCD panel is part of it. I’m not sure if it’s the best all-round monitor panel I’ve seen. But I don’t recall better and it certainly scores in a lot of areas.

The colours seriously pop and the contrast is excellent. Actually, the latter proves just how pointless monitor specifications have become of late. The RoG is quoted at 1,000:1 for contrast (that’s the inherent contrast achieved by the panel without using tricks like dynamically adjusting the backlight). As is most of the market.

There are a few 3,000:1 screens around based on VA panel technology, of course. But you’d be very hard pressed to pick the difference subjectively. Which is odd, because older LCD monitors with 700:1 contrast ratios are very obviously bad. Whatever.

The viewing angles are fantastic, too. Like a lot of the most recent good-quality IPS screens, the extent to which colour control is maintained off centre, along with the near total banishment of that annoying IPS-glow problem, means it’s just no longer an issue.

If there is an area where the underlying LCD technology announces itself, however, it’s pixel response. It’s very good for an IPS screen, make no mistake. But as with virtually any LCD screen, a little residual blur is visible.

On a related note, the other items that set the RoG apart are of course support for both 100Hz refresh and Nvidia’s G-Sync adaptive-sync tech. As regards the former, high refresh basically rocks and 100Hz is enough to get most if not all of the benefits. Yes, there are faster screens with support for 120Hz and 144Hz. But I would be very impressed if you put a screen running at 100Hz next to one running at 120Hz and could tell which is which.

Also, as I’ve said before, high refresh is great on the desktop. It’s great in games. It’s great nearly everywhere. It just makes everything feel more slick and responsive. The one snag is generating enough frames in-game to make the most of that 100Hz ceiling.

With a pixel grid measuring 3,440 by 1,440, that’s no mean feat. Which in turn is where Nvidia’s G-Sync comes in. It’s also where I think the diminishing returns kick in. Let me explain.
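
For a sense of scale, here’s a quick back-of-envelope sketch in Python (nothing assumed beyond the published resolutions):

    # Rough pixel-count arithmetic for 3,440 x 1,440 versus other common
    # gaming resolutions (published figures only, no benchmarks implied).
    resolutions = {
        "1080p (1920x1080)":  1920 * 1080,
        "1440p (2560x1440)":  2560 * 1440,
        "PG348Q (3440x1440)": 3440 * 1440,
        "4K (3840x2160)":     3840 * 2160,
    }
    base = resolutions["1440p (2560x1440)"]
    for name, pixels in resolutions.items():
        print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x 1440p")
    # 3440x1440 works out to roughly 5.0 megapixels: about a third more
    # than 16:9 1440p and nearly 2.4x 1080p. And at 100Hz, the GPU has to
    # fill all of them up to 100 times per second.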

G-Sync really works. There are demo applications that make that abundantly clear. But that’s the problem. Outside of demo applications, it’s often hard to tell if it’s on.
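
To see what adaptive sync is actually doing, here’s a minimal frame-pacing sketch – a toy model of the general principle, not Nvidia’s implementation – assuming a hypothetical game rendering a steady 25ms (40fps) frame on a fixed 60Hz screen:

    REFRESH_MS = 1000 / 60   # fixed 60Hz scanout interval, milliseconds
    FRAME_MS = 25.0          # hypothetical steady render time: a 40fps game

    def vsync_display_times(n):
        """With v-sync, a finished frame waits for the next fixed scanout."""
        times = []
        for i in range(1, n + 1):
            ready = i * FRAME_MS
            scanouts = -(-ready // REFRESH_MS)   # ceiling division
            times.append(scanouts * REFRESH_MS)
        return times

    def adaptive_display_times(n):
        """With adaptive sync, the display refreshes when the frame is ready."""
        return [i * FRAME_MS for i in range(1, n + 1)]

    def gaps(times):
        return [round(b - a, 1) for a, b in zip(times, times[1:])]

    print(gaps(vsync_display_times(6)))     # [16.7, 33.3, 16.7, 33.3, 16.7] judder
    print(gaps(adaptive_display_times(6)))  # [25.0, 25.0, 25.0, 25.0, 25.0] even

The same 25ms frames arrive as an alternating 16.7/33.3ms shuffle on the fixed refresh, and dead even once the refresh is slaved to the frames. That evenness is the whole trick, which is partly why it’s so hard to point at.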

I confidently predict that someone will comment below how life-changing G-Sync has been for their gaming. Or quote me saying positive things about it in the past. But with time and experience, my attitude to it has cooled.

Put it this way. I’ve simply spent too much time looking at various PC setups and screens wondering if it’s been properly enabled. Not so high refresh. Give me about three seconds and I’ll tell you if a screen is running significantly above 60Hz. So G-Sync is one of those features that I’d rate as nice to have but not worth a huge premium. Unfortunately, it not only adds a hefty premium to what would already have been a pretty pricey monitor. It requires you use an Nvidia GPU, too.

You could split a few hairs elsewhere with the RoG, too. The chassis and stand design is a bit of a mixed bag. It’s very flamboyant and in places feels pleasingly expensive. But elsewhere, there’s some cheap plastic.

Likewise, the so-called ‘frameless, edge-to-edge’ design translates into a fair amount of bezel in reality, and the LED mood lighting that projects a coloured logo onto the surface below the stand is at best a bit of fun and at worst just adds unwanted cost.

At which point I feel like I’ve gotten a little grumpy. So, let’s redress the balance. I’m reluctant to draw too many sweeping conclusions – a third coming of the monitor messiah would be impolitic, after all. And in the back of my mind, the coming OLED revolution looms. But this is, very likely, the most desirable monitor for PC gaming I have ever seen. It’s a gorgeous display.

It’s just a pity it has to cost £1,000 / $1,400, or thereabouts.

96 Comments

  1. FurryLippedSquid says:

    I dunno. With your basement theatre and your curved screens…

    You’ve changed, Jeremy.

  2. PoulWrist says:

    That stupid light underneath it… Why? Just, why? Who would leave that on for more than the time it takes to turn it off? It’s a distracting element to a monitor, where the whole point is to look at the image area, not the bezels, LEDs and other stuff…

    • Banks says:

      I love it, it’s so excessive.

      The screen is gorgeous too, Asus makes great gaming monitors.

    • Xerophyte says:

      Asus’s entire “Republic of Gamers” (eww) line is like that. The hardware itself is excellent and I’m very fond of my oversized laptop and its 980M, but their styling and software is aimed directly at 15-year-old boys with rich parents and an extreme fondness for gaudy LEDs, cruddy filters and random acute angles.

      I will never understand gaming peripheral design in general but RoG is among the absolute worst offenders.

      • gritz says:

        Weird, my RoG monitor has none of that. The only LED is the power button, which is cleverly positioned such that it’s not usually visible over the edge of the bezel.

      • Maxheadroom says:

        This. Exactly this. All of this.

        It really annoys me that high-end or enthusiast gear is marketed like the sole demographic is 14-year-old boys with a fundamental need to have words like XTREME CYBER FRAGGER POWER MASTER stencilled on everything.

        I was recently looking for a new case and briefly considered Deepcool’s Genome (which, OK, is a little gaudy, but I thought the idea of the built-in water cooling was interesting). Or I was, until I saw they’ve gone and embossed GAMER STORM! on the front:
        link to tomshardware.com

        Just why??

      • TheWhippetLord says:

        Ah, RoG = “Republic of Gamers”.
        I thought he was Asus’ celebrity designer or something.

        • Zamn10210 says:

          Hey, he can’t live off that Lucozade Sport gig forever.

      • thelastpointer says:

        I grab every opportunity to post this, and I’m not going to miss this one either.
        link to amazon.com

        • fabronaut says:

          now that router… that is hilarious design.

          THE CLAW! INSERT BEATING HEART SACRIFICE HERE

          good to know that the LED on this monitor (likely?) could be disabled in a pinch. I find bright lights in my peripheral vision and such to be quite distracting, unfortunately.

          also, I could literally buy a car for the amount of money they’re asking for this monitor in CAD after price conversion from USD. (it’ll be upwards of $2000 – 2200 here, which is just absurd)

        • Qazinsky says:

          Am… Am I supposed to wear this like a crown while fighting armies of elves?

    • kendoebi says:

      You will almost certainly be able to turn it off; I have the LED light turned off on my MG279Q.

  3. Amatyr says:

    I’d say the monitor messiah is the X34 for me. Same screen, had it for some months already, looks more sensible.

  4. heretic says:

    1 grand… ouch

  5. wisnoskij says:

    Really, more monitors need to be ultra widescreen. A widescreen monitor is better for watching movies and playing games, but it offers zero benefit otherwise. What revolutionises computer use is two monitors, or a screen wide enough to handle two concurrent windows well. Give us like a third more horizontal real estate and the monitor almost doubles in its effectiveness.

    • Don Reba says:

      I don’t know, with two widescreen monitors, I can easily have four windows side by side by snapping them to the edges. With a single ultra-wide I can only have two. Seems like a noticeable inconvenience.

      • froz says:

        I’m using a little app called GridMove for my ultrawide screen. It allows you to chop your screen however you want for easy snapping of windows. In my case I usually have 2 smaller windows on the sides (for Skype and e-mail) and a big one in the center (taking 1/2 of the total width) for the browser. Works perfectly.

        However, some games still don’t play nice with ultra widescreen. And a lot of movies/TV shows are not ultra widescreen either. Netflix doesn’t support that ratio, for example (there is a plugin for that in Chrome, but then again the highest resolutions in Netflix work only in IE/Edge, if I remember correctly).

      • phuzz says:

        Well, if this wasn’t a thousand quid, I was thinking of getting one and sticking it next to one of the two widescreen monitors I have already. So, one ultrawide, one wide, best of both worlds!
        I’m not sure they’d both fit on my desk though, and of course I can’t afford it either.

  6. TacticalNuclearPenguin says:

    But it’s curved.

    • Zenicetus says:

      Yeah, if I had a dedicated simpit for flight sims, I’d get something like this in a heartbeat. But I don’t think the graphics/video work I do would be possible on a curve.

      But one of these days… if I ever build a simpit. Probably won’t happen before VR takes over though.

    • froz says:

      You get used to the curve in a few days. It’s not as big a curve as you might think, plus the human brain is perfect for not noticing such things. Think about it: in reality, a long enough straight line reaches your eye as a curve, but the brain adapts and you can still see it as a straight line. There is no problem whatsoever about this. I admit I’m not doing graphical work though.

      • TacticalNuclearPenguin says:

        Actually, with the perfect desk setup, the perfect distance, monitor size and curve radius, a curved monitor might actually end up looking flawless because it follows your cone of vision.

        I wouldn’t even need to get used to it, because it’s already tailored to my specific setup more than a flat monitor could ever hope to be. And if that’s the case, what’s the problem, if I could actually perceive it as perfectly flat?

        The problem is that I would know it’s not, and I really go OCD about this stuff, especially since the perfect-case scenario with a curved monitor is hard to achieve and only possible if you never move.

        • froz says:

          It’s not really possible in practice.

          1) The curve is only in one dimension. You would have to make the curve like the inside of a ball (curving in both dimensions).

          2) The curve is very slight; the center of that virtual ball (or, as it is now, just a circle) is at a longer distance than you would normally sit. I’m too lazy to calculate that, but I think it would be somewhere around 1-2 meters from the screen, not really comfortable. (A rough sketch of the numbers is below.)

          3) This is not why the curve was designed, though. It’s there to help achieve consistent color even at the far sides of the screen. Sure, it’s IPS, so you get very good color even from a bad angle, but at some point you can see distortion. The curve helps, as due to the length of the screen the angle at the ends would be higher if it was flat. Mind that you don’t have to sit at the ideal center to get the benefits of this; it also helps if you are slightly to the side (meaning you can happily watch a movie together with other people and it will be fine).

          I’m basing this on my experience with DELL U3415W (which has non-variable 60 Hz IPS panel).
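
          A rough back-of-envelope sketch of that geometry, assuming the ~3800mm curve radius commonly quoted for these 34-inch ultrawides, a ~794mm-wide panel and a typical 700mm viewing distance (all assumed figures):

              import math

              R = 3.8          # assumed curve radius, metres (the quoted 3800R)
              HALF_W = 0.397   # half of a ~794mm-wide 34-inch 21:9 panel, metres
              D = 0.7          # assumed eye-to-screen-centre distance, metres

              def edge_angle_flat():
                  """Off-normal angle of the eye-to-edge ray on a flat panel."""
                  return math.degrees(math.atan2(HALF_W, D))

              def edge_angle_curved():
                  """Same angle on a curved panel; the edge sits HALF_W along the arc."""
                  theta = HALF_W / R   # arc angle out to the edge
                  px, pz = R * math.sin(theta), R * (1 - math.cos(theta))  # edge point
                  to_eye = (0.0 - px, D - pz)   # ray from the edge back to the eye
                  normal = (0.0 - px, R - pz)   # edge normal points at the arc centre
                  cos_a = (to_eye[0] * normal[0] + to_eye[1] * normal[1]) / (
                      math.hypot(*to_eye) * math.hypot(*normal))
                  return math.degrees(math.acos(cos_a))

              print(f"flat:   {edge_angle_flat():.1f} deg off-axis at the edge")   # ~29.6
              print(f"curved: {edge_angle_curved():.1f} deg off-axis at the edge") # ~24.3
              # Every point is only seen dead-on from the arc centre, 3.8m away
              # (point 2 above); at desk distance the curve merely softens the
              # edge viewing angle by about five degrees (point 3 above).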

          • TacticalNuclearPenguin says:

            Of course, I was just trying to please both crowds, but I will concede that I didn’t think about needing the curve to be on both axes.

            About your color accuracy point, yeah, I guess that’s part of the reason, but I don’t see it working very well in practice, given that to make a curved edge-lit (yeah, cheap bastards) IPS monitor they most likely introduce new problems anyway.

            If you want to remove IPS glow there is absolutely nothing that will help aside from a polarizer, and to resolve some edge differences it’s still better to have a good uniformity correction algorithm, because at the end of the day the entire screen has slight uniformity issues anyway.

            I don’t need a gaming monitor, and this one costs as much as my Eizo ColorEdge, which has the aforementioned features. You see this page? link to lagom.nl

            I see it like this: link to i.imgur.com and that’s an impossible result for any non-corrected IPS. The reason you can’t see the characters on the entire screen is because gamma is a perfect 2.2 all around.

          • froz says:

            Well, Eizo ColorEdge prices go up to 5k € here, while the monitor recommended in this article goes for around 1.3k € (my Dell was a little cheaper, and still probably the most expensive part I ever bought). The cheapest ColorEdge I can find now is still around 20% more expensive than this Asus and is only 24 inch. So I would say it’s a completely different story.

          • TacticalNuclearPenguin says:

            You don’t need the CG; the CX271 is what I have, 1440p and 27 inches.

            The CG differs by a better warranty, a built-in colorimeter and a slightly better internal LUT (but still 16-bit), while the CX still has hardware calibration (non-3D LUT) and its built-in thinger only sorts the white point, but that’s enough since gamma never drifts anyway.

            It was 1,200 euro btw, which is actually cheaper now that I think of it. Well, that or Amazon was in a good mood.

  7. grimdanfango says:

    Outside of demo applications, it’s often hard to tell if it’s on.

    Personally, having tried G-Sync, I can never go back. All tearing or jittery frame delivery issues solved, simply and transparently. Actually, generally I can’t obviously tell it’s on, but that’s because it just works, mostly flawlessly. I can sure as hell tell when it’s *off*! :-P

    It’s the same effect as SSD over HDD… I stopped noticing I was using an SSD within a couple of days of getting one, but trying to use a computer with only an HDD now, I get an overwhelming compulsion to punch it.

    • aircool says:

      Yeah… I thought G-sync was all about smoothness and not framerate. Therefore, you won’t notice when it’s working, only when it’s not.

      You could have 100fps, but if the frametimes are all over the place, it’s gonna look shit :)

      My next monitor will be a G-Sync one… unless something else has superseded it by then.

      • Jeremy Laird says:

        But that’s the point. If you can tell when it’s not working, you can tell when it is working. Because you would be able to tell it’s not not working!

        And my experience of G-Sync involves a lot of standing in offices with colleagues asking each other, ‘do you think it’s working?’ Not because it’s particularly unreliable, though it can be a bit pernickety about drivers. But because it often simply isn’t obvious that it’s working.

        Anyway, like I said in the post, I’m not saying it doesn’t do anything of value. I’ve just cooled a little to how much value that is!

        • FriendlyFire says:

          I’d say that’s because, if it’s working, everything looks… normal. G-Sync is there to correct somewhat random artifacts in the graphics rendering pipeline. If it’s doing its job, you just shouldn’t notice it exists.

          That makes it really difficult to just go “WOW it’s working!” It’s not like adding a new effect or doubling the frame rate.

          • Frosty_2.0 says:

            I think it’s easier for humans to discern sudden differences, artifacts, or loss of responsiveness – i.e. when something is wrong.

            A game that was chugging a bit w/V-Sync, but now smoother & more responsive w/G-Sync – you’d probably tell the difference easily if you played them immediately one after the other.
            But if you sat and only played that game with G-Sync on, you might not be able to tell (without testing specifically); it could appear to just be running better on a better rig.

            It’s a bit like when your car has an under-inflated tire or bad suspension; you notice the bumps & looseness but you tolerate it, live with it and acclimatise.
            Then you inflate your tires back properly and the ride is suddenly smooth & responsive again, nothing in particular to notice any more.

        • grimdanfango says:

          I get the feeling some people are more sensitive to the artefacts that G-Sync solves. Screen tearing annoys me in a very visible way, but microstuttering just chips away at the back of my head… I’m acutely aware that *something* is wrong, and the moment I used G-Sync, it felt like a breath of fresh air… games suddenly stopped grating on my nerves!

          I have the same problem when watching 24Hz movies on a 60Hz screen… drives me up the wall… really can take me out of a good movie. If I’m watching anything 24Hz on my computer, I have to switch to a 72Hz refresh.

        • Frosty_2.0 says:

          ^ Agreed.
          Tearing really annoys me also, but G-Sync eliminates it down to the bottom end (where it’s worst) while the sync-related frame rate drops & stuttering are gone; that smoothness makes a tangible difference to the feel of a game for me.

          Switching G-Sync off to V-Sync and actually playing, I can feel the frequency of stutters (& see the drops); that is the most important factor, and you will not feel that difference just looking at the screen.
          The price premium is a bit steep though.

          Now if your [rig | game] is consistently pushing a great frame rate anyway, it’s not going to make a tangible difference.

          • Dizrupt says:

            G-sync was never designed to improve your viewing experience at high framerate. Fluctuations, micro-stutter, tearing (no V-sync), that’s why it exists.

            It’s general advice to turn G-Sync off and use ULMB if you are able to push high fps constantly; you eliminate another fraction of unwanted delay.

            Another piece of advice I’d give: don’t use G-Sync at 144Hz. There’s noticeable input lag in comparison to running it at 120Hz.

          • Frosty_2.0 says:

            That was my point, as the discussion was around “When & If you can tell G-Sync is on”;
            That’s why I stated both cases for when it will & won’t be tangible to a player.

          • Jediben says:

            The latest Nvidia drivers have fixed the huge problem that G-Sync had – the inability to use G-Sync and SLI and DSR at once. At last we can now turn on DSR, render 4K textures at lower resolution, enable SLI to improve the framerate and use G-Sync to make sure that if it taxes the SLI cards too much to maintain the 144Hz, the frame rate drops are NOT going to cause us misery. It’s now perfect in every way. Death to AMD!

          • Grovester says:

            The new Nvidia drivers that bork the living daylights out of loads of people’s PCs? Now withdrawn?

            link to theinquirer.net

        • Anti-Skub says:

          I think you are looking for the wrong thing, to be honest. The reason I would always go for a G-Sync monitor is that without G-Sync your options are v-sync or screen tear, both of which I hate.

          Sure, if you’re playing Counter-Strike at 160 FPS and you’re trying to spot the difference between v-sync and G-Sync you might not be able to tell, but where G-Sync really shines is when you aren’t hitting your monitor’s refresh rate.

          Say you’re playing The Witcher at 4K with all the settings cranked up… you aren’t going to be getting a smooth framerate with any normal setup. And that’s when G-Sync shines. Without it you are forced to choose between the obvious loss of image quality that comes with v-sync off or the horrible input lag of turning it on. G-Sync removes screen tear without introducing lag.

    • HothMonster says:

      I recently acquired my first G-Sync monitor (PG278Q) and did not have much love for the feature. The first few times I used it everything seemed to be working but I couldn’t really tell anything was going on.

      Sometimes though I would fire up a game and get 1/2 to 1/4 of my normal FPS; CS:GO, which I normally get around 170 FPS in, would run at about 5 FPS. I would have to restart the computer to get it to run right. Even then, games with heavy fluctuation (not in-game, but differences between menu, game, cinematics, etc.) would get hobbled to a much lower max than I know they normally run at.

      I’ve turned it off and never had a problem with tearing or anything abnormal since. I feel like all it did was add significant cost to the monitor since it seems to work wonderfully without it.

      Do you run into these issues and are just OK with restarting the PC whenever it craps out or has it been problem free for you? Did I just miss what it was adding and fighting to get it to work is worth it?

      • grimdanfango says:

        Nope, never run into any issues like that. I can’t really see what effect the monitor could have on GPU framerates… unless there’s a glaring bug somewhere in the mix. GPU performance has never been affected in the slightest for me, aside from obviously being capped at a maximum of 144fps on a 144Hz screen… it’s not like 170fps gains anything if your monitor can’t even display those frames.

        But anyway, I’d say your problem likely lies somewhere else… there’s really nothing about a G-Sync module that can reduce a game to 5fps!

        • HothMonster says:

          It’s definitely a G-Sync issue; it went away if G-Sync was turned off. I’ve googled it and other people ran into it; it goes away for a while if you restart. I don’t know if it’s just my setup or how long I normally leave my machine on, but it kept coming back at an annoying frequency. Adding in the fact that I wasn’t really noticing the value when it was on and working right, I didn’t pursue a fix for very long.

          Seemed like if the game ever ran at a low framerate, G-Sync wouldn’t get over it. CS:GO was the extreme example, I think because as the game starts up it makes a window that runs at sub-10 fps while it loads, but I couldn’t even get into an actual game because the main menu screen would be capped at 5fps. Other games would just be handicapped: XCOM 2 ran around 30 FPS under its average, and other things I booted up were 20-40 fps under their averages. V-sync was off, and the monitor was set to run at 144Hz.

          When it worked, it worked: games ran at or slightly under their normal average fps, and CS stayed at 144. It just never seemed noticeably different from when I had it disabled. That may just be my peasant eyes, but I’m not surprised when I see others go “meh” about it.

          Maybe I’ll give it a go again next time I rebuild Windows.

          • Dizrupt says:

            Problem was definitely on your end. Either driver, faulty G-sync module, or your specific hardware setup.

    • fish99 says:

      Just having a high refresh rate monitor (120Hz in my case) solves most of the problems G-Sync was designed to fix. I leave v-sync on for everything, get no tearing and no v-sync-associated framerate drop.

      • Dizrupt says:

        You still have input lag. Fuck that.

        • fish99 says:

          Why would you have input lag, and why would G-Sync fix this? I don’t need to have triple buffering on. With any setup you still have a back buffer, because otherwise you’d see the screen getting drawn.

          Also you need about 30ms of input lag to really notice it. Most people happily game on HDTVs on console and they all have >30ms of lag. My screen already has <8ms of input lag.
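
          Some rough numbers on that, counting only the worst-case double-buffered wait and ignoring the rest of the input chain (assumed figures):

              # Worst case, v-sync holds a finished frame for up to one full
              # refresh interval before scanout; higher refresh shrinks that wait.
              for hz in (60, 100, 120, 144):
                  print(f"{hz}Hz: up to {1000 / hz:.1f} ms of added wait")
              # 60Hz: up to 16.7 ms, a fair chunk of a ~30 ms noticing threshold;
              # 120Hz: up to 8.3 ms, which is why high refresh plus v-sync can feel fine.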

    • 44TZL says:

      I just traded in my 85Hz XR341CK 34″ UW for the 16:9 G-Sync RoG Swift 279Q and what an improvement! I’m now on average 5-15 secs faster in Dirt Rally stages, Quake Live rail accuracy is up by 10%, I can now catch my slides in Automobilista, and whether I get 60 or 165 frames/sec… no more stutter, and blur/overshoot is nearly gone. Easily noticeable :)

      The UW format is beautiful, but until it starts acting like a decent CRT, like the 279Q does, I am not going there! In a way I really agree with the author on LCDs being such a step back… but these types of panels still are – despite their awesome format.

      And btw many things work when you don’t notice them!

  8. pillot says:

    too small man. need 50 inches at least

  9. Wisq says:

    Waiting to see where the Rift and Vive go before I buy a new monitor. If they end up taking over most of my cockpit and first-person stuff, then the widescreen immersion factor is gone.

    For more “2D” stuff (e.g. strategy games) where I wouldn’t use VR, well … honestly, I’d rather not be having to look way off to the side to find the UI elements I usually need to be keeping a constant eye on.

    And of course, that’s all on top of the resolution compatibility issues in cheaper games — we finally got pretty much everyone supporting 2560×1440, and now we’re going to throw 34% more widescreen at them?

    • grimdanfango says:

      Well, I suspect Rift/Vive will certainly replace your monitor for cockpit experiences, but FPS… probably not. I really think FPS games are a bad fit for VR, even though half the industry seems to still think it’s going to be great. Moving your own body around with a keyboard/analogue stick feels horrible!

      • eqzitara says:

        link to youtube.com

        Should check it out if you think FPS will be bad in VR. Sure, less skilled, but fun/immersive. Certainly.

        • Anti-Skub says:

          a) That is not a product that will ever become a standard controller and therefore not any kind of proof that FPS in VR won’t be rubbish.

          b) It looks shit? The gun part of it appears to be nothing more than a trigger, with the vertical aiming controlled by the headset and the horizontal aiming controlled simultaneously by the headset and by rotating the thing attached to your hips. The fact that he repeatedly accidentally opens the menu during firefights and chooses to side-step to get his reticle lined up with the target rather than deal with the clunky rotating shows just how low-fidelity it is.

  10. brucethemoose says:

    Those Korean imports are still the messiah of monitors, IMHO. 2560×1440 IPS @ 96Hz+ is nothing to sneeze at, but it’s amazing for $300.

    You could get THREE of them for the price of this monitor.

    • Geebs says:

      Those things are fantastic – I have a Hazro which was a bit pricier when I bought it, but a damn sight cheaper than the basically-identical Cinema Display and not a single duff pixel.

      I’ve been spoiled, though, by a Retina laptop and so have just switched to a 28″ Samsung 4K monitor. It’s TN and the build quality is a bit crap, but it makes up for it: text looks great, the dot pitch is high enough that I can finally plug in the Xbox and upscale it without making a blurry mess. You lose a bit of screen real estate but 4K is a more flexible resolution.

      Now, if only Microsoft would wake up to HiDPI displays and finally fix Windows scaling so that resizing windows no longer needs you to hover over the exact, tiny single pixel…

    • Asurmen says:

      I have one, but it was from an American company. I’m honestly thinking I took the high refresh rate dive too early, because I’m stuck in DVI land if I want to keep the high refresh rate. If Pascal/Polaris don’t include DVI (which is unknown but could easily be possible), I can’t upgrade my GPU at all without moving monitors.

      Had I waited, I’d be financially better off and be looking at a high refresh 1440p screen with HDR, and a new GPU that doesn’t require me to mess around with patching drivers and fiddling with CRU.

      Nothing to do with your actual post really other than it involves an overclocked monitor.

      • brucethemoose says:

        Ya, I’m a bit worried about DL-DVI as well. But seeing how long VGA has survived, I bet it’ll be around on GPUs until HDR OLED monitors drop down to sane prices.

  11. Zanchito says:

    It’s literally the same thing as the Acer X34 Predator, but even tackier. It’s surprising it’s not mentioned in the article.

    • grimdanfango says:

      It sort of is, true, but Acer are almost always sloppier with the details. Asus screens seem to often have better default calibration, less backlight bleed, less/no overdrive pixel overshoot, etc, etc…
      They’re far from perfect, but Acer are far far FAR from perfect.

    • suibhne says:

      The Acer X34 has been downright plagued with QA issues – so severe, in fact, that Amazon de-listed the product for a while due to the high number of returns. If Asus can solve this, and especially if they can address the Acer X34’s atrocious backlight bleed, then this monitor will be the clear winner.

      • Frosty_2.0 says:

        I had that experience with the Acer XB270HU; I wouldn’t say the samples I tried were terrible in the backlight bleed department, but they weren’t good & it makes the IPS glow worse.

        It’s frustrating watching or playing anything with dark scenes in a dark room at night when you get distracting light leaks + different tints with certain panels.
        It’s a horror playing survival/horror games.

        Monitor manufacturers in general need to up their game on backlight bleed, I feel; most seem to have a quality standard where ‘If no bleed is visible in office lighting, then it’s a pass’.

  12. BellicoseBill says:

    Personally, having tried G-Sync, I can never go back.

    Ditto this. FSX used to tear awfully for me and no tweak or setting would cure it. I finally tried a G-Sync monitor and all tearing issues disappeared. RoGs are expensive but worth it for serious gaming.

  13. OmNomNom says:

    Does it have blur reduction tech? Tbh, that’s what killed the X34 for me; I just couldn’t stand how naturally blurry it was as an IPS without blur reduction and only 100Hz.

    • Jeremy Laird says:

      Yeah, it has multi-level user-configurable overdrive.

      But no LCD is perfect in this regard. The very fastest TN panels are getting there. But I have yet to see an LCD panel that is blur-free.

      • OmNomNom says:

        The finest for me so far have been BenQ monitors. You have to put up with TN colours, but ULMB (blur reduction) is truly game-changing. Far more than G-Sync is, if you have the power to push decent framerates anyway.
        The only thing that used to hold me back from blur reduction (strobing) was the brightness deficit, but the recent BenQ monitors like the XL2730Z are the brightest in their class.
        I am very sensitive to blur though, so YMMV.

        Ideally I’d like a 120Hz+ IPS 21:9 monitor with both G-Sync and ULMB able to be activated at the same time. I guess I’ll have to wait another 5 years for that.

        • OmNomNom says:

          Obviously these are just my preferences, others may be perfectly happy with other screens. It’s just for me, the blur is always the killer.

          • Jeremy Laird says:

            Yeah. On paper OLED solves the blur problem at the same time as solving the contrast and viewing angle problems. On paper…

            Will be interesting to see how it all plays out and VR could also be part of the conversation. But I think at the very least within five years we’ll know if OLED really is going to replace LCD wholesale for our big desktop panels. I think (hope) it will and that it delivers in reality all that on-paper promise.

          • OmNomNom says:

            Well I love the way the VR tech is also pushing forward the blur reduction tech.
            I personally have an OR on pre-order for late March / early April and I am crossing my fingers that they have mastered the blur and input lag as it was ruining the experience for DK2.
            I have high hopes that this, along with smartphone dominance, will help with PC displays. (Most current OLED phone displays are far too blurry for actual PC-level gaming, IMO.)

  14. Simbosan says:

    The glowing light would put me right off. Playing in the dark with that dumb distracting orange glow? Gaffa tape time, assuming it wouldn’t cause overheating… Nah, get another monitor.

  15. Warduke says:

    I know some pretty serious gamers, but I don’t think I know a one that would pay $1,400 for a monitor…

    I thought $700 for an Asus ROG was a lot…

    • UniuM says:

      This!

      There is no good monitor for everything. And giving this kind of money for a non-standard res is crazy.

      I know the subject is G-Sync, but I recently bought a better-than-standard 144Hz 24″ 1080p monitor (TN panel), cheap like carrots… plus a new card, and hopped on this brave new world of whatever-Sync.

      In my case it was an R9 390 with an AOC G2460PF FreeSync 144Hz. For it all I paid 650€.
      Now if I went the G-Sync route… that would cost me more than 1,000€ for the same specs.

  16. nimbulan says:

    I beg to differ on the contrast ratio. I got a VA monitor a couple months ago and the black level and contrast are very obviously better than standard 1000:1 displays.

  17. fish99 says:

    Not sure I want that aspect ratio for gaming, and definitely not for coding, where vertical height is everything. With the way (divide-by-Z) projection works in 3D games, you get a fattening of anything toward the horizontal edges of the screen, and the wider you make the screen, the worse this gets.

    Also LCD needs to die.

    • Zanchito says:

      The aspect ratio doesn’t matter; it’s the total vertical pixels that count. Just think of it as two 1440p vertical monitors in one single frame. I use an X34 for backend/frontend programming and it’s just peachy. Bonus points for fitting both my monitors at work in one single screen when using remote desktop.

      • fish99 says:

        Just to be clear, I’m talking about aspect ratio purely in terms of gaming, and vertical resolution in terms of coding. Personally, for both applications I would prefer a 16:9 version of the same screen size. For coding you want more than one screen anyway.

    • Don Reba says:

      I wouldn’t say vertical resolution is everything in coding. You only need so much of it. Beyond that, it’s often helpful to have several files or several views of one file open side by side; as well as documentation; and Stack Overflow; and an instance of your program. So, actually, ultrawide might be a good fit.

  18. Thants says:

    Note To Self: Read the end part with the price before the rest of it.

  19. Xzi says:

    Ehh, I’ll stick with my 34-inch non-curved ultrawide LG IPS panel @ 60Hz over that any day. Especially given it was $350 on sale.

  20. RubberbandAU says:

    Not buying premium products from ASUS again.

    There’s a PSU issue with the ROG Swift which they wouldn’t help me with unless I did a 2-hour round trip to their repair centre, which was some bloke’s dingy cellar of a shop.

    • suibhne says:

      Not useful for UK folks, maybe, but this is why I keep ordering stuff from Amazon even if I don’t like a lot of their ethics. The return policy is absolutely ace, at no cost to the buyer.

  21. Jonnyuk77 says:

    Hup….

    What about the aspect ratio and support for it in games? I had issues with DA:I (beyond the game itself) needing a downloaded fix from UWS. Cutscenes in games often try to jump back to a standard widescreen even if the game is running 21:9. I remember The Witcher 3 having issues too…

    Farm Simulator, on the other hand, runs like a charm; seems these big studios need to take some pointers from people who know how to make a game!

    • Zanchito says:

      I’ve had the X34 for a few months now, which is basically the same screen, and have had very little trouble (only Fallout 4 and XCOM 2 don’t support the right ratio correction), and tweaks for those two appeared overnight. I’ve got to say 21:9 is really a wonderful thing for immersion, and much less problematic than people make it out to be. Excellent for programming too: left half of the screen is Eclipse, right half is Stack Overflow. Or when doing web, one half for the HTML editor and the other half for the test browser. You can even go three full columns and still do perfectly fine.

  22. obowersa says:

    Something I’m always curious about with ultrawides: how do folk handle window management? On Linux I tend to use a tiling WM for things, which strikes me as being perfect for ultrawide.

    With Windows though, are there any neat applications for it? I guess I’m wondering how you split a single wide screen into columns, at least so you don’t have to spend half of your life adjusting window sizes (something like the sketch below).

    It’s a large part of why I’ve resisted getting an ultrawide for gaming/Windows usage.
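
    For what it’s worth, a minimal sketch of the idea, assuming a 3440×1440 desktop and using only the raw Win32 API via ctypes – proper tools like the GridMove mentioned above do this far more gracefully:

        import ctypes

        user32 = ctypes.windll.user32     # Windows only

        SCREEN_W, SCREEN_H = 3440, 1440   # assumed ultrawide desktop size
        COLS = 3                          # chop the screen into three columns

        def snap(col):
            """Snap the currently focused window into column `col` (0-based)."""
            hwnd = user32.GetForegroundWindow()
            width = SCREEN_W // COLS
            # MoveWindow(hwnd, x, y, width, height, repaint)
            user32.MoveWindow(hwnd, col * width, 0, width, SCREEN_H, True)

        snap(1)   # e.g. park the active window in the middle column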

  23. Neutrino says:

    Is ‘pop’ as in ‘the colours seriously pop’ just what the kids these days call saturation, or is there something more to it?

  24. Grovester says:

    I’m guessing that you need a 980 to run this at full res in games? I’ve got the ASUS PG279, which I love, but my 970 just about keeps up at 2560×1440 in close-to-ultra.

    Oh, and G-Sync did a great job for me in Shadow of Mordor. The tearing was terrible for me in that before I got the 279.

    • sandman2575 says:

      This is what I’m wondering as well. The monitor’s pricey price point is one thing, but what kind of rig do you need to run this behemoth and get anything like 60 to 100 or better FPS at high/highest settings in a game like Witcher 3? Or, per the last screenshot, Rome 2 TW, which is still a very demanding game at ultra settings even though it was released almost 3 years ago.

      Will a single 980 cut it? I’d be surprised if any single-GPU solution would be adequate — ??

  25. Elusiv3Pastry says:

    So assuming Nvidia’s new line of Pascal chips is powerful enough to run the 40-inch 4K monitor with a single card at 60 FPS, would you still use this monitor for gaming or would you go back to the monster?

    • Jeremy Laird says:

      Purely for gaming? I’d take the Asus over the 40-inch beast.

      For all-purpose PC stuff, the sheer real estate of 4K wins, especially over 1,440 vertical pixels.

      • Elusiv3Pastry says:

        Thank you, good to know. I’ve been clutching my 30″ 16:10 monitor like it was the last living creature of an endangered species (which I guess it sort of is :p), but it’s time for a high-end monitor refresh when I build a new pure gaming rig this year. I still haven’t been able to decide between obscene 4K real estate or narrower but faster refresh rates. I suppose at 34″ this nifty thing is a pretty good compromise between the two.

  26. tahru says:

    Does it have HDMI 2.0? I was reading the technical specs and they did not indicate it. I would assume it does, but the refresh rates quoted in the spec only show 50Hz for HDMI, which sounds like HDMI 1.2. I am probably just confused about it.