To 4K Or Not 4K? The Pros & Cons Of Ultra-HD Gaming

With Laird Towers currently undergoing major renovations, RPS’s hardware coverage has been forced to retreat to the vaults. But that hasn’t stopped me. No, I’ve battled through the dust, the rubble, the builders lumbering about the place at ungodly hours of the morning (I regard consciousness before 9:30am as rather uncivilised) and the relentless tea-making to bring you some reflections on 4K gaming. We’ve covered several interesting alternatives to 4K of late, including curved super-wide monitors, high refresh rates, IPS panels and frame-synced screens. So does that experience put a new spin on plain old 4K, aka gaming at a resolution of 3,840×2,160?

The context here is threefold. The first bit involves the aforementioned flotilla of interesting new screens with fascinating new technology, pretty much all of it gaming-relevant. Whether it’s 34-inch curved ubertrons, 27-inch IPS panels with 144Hz refresh capability or the Nvidia G-Sync versus AMD FreeSync thang, it’s all aimed at PC gamers.

That’s right, somebody still cares about PC gamers, and if I’m honest that’s a little against my expectations of a few years ago, when the monitor market had completed the jump to 1080p and was looking rather stagnant. The exception to all this innovation is arguably LCD panel quality itself, which has progressed gradually rather than dramatically. As for the perennial great hope for the future of displays, OLED technology, it’s nowhere to be seen.

IPS, high refresh and frame syncing from Asus – is 4K even relevant?

The next item on my 4K housekeeping list involves graphics cards. Nothing hammers a GPU like 4K. It’s easy to forget the full implications of all those pixels. But even if you remember, I’m going to remind you anyway.

For PCs, 4K typically constitutes 3,840 by 2,160 pixels, which in turn works out at over eight million pixels. For reasonably smooth gaming, let’s be conservative and say you want to average at least 45 frames per second.

That means your graphics subsystem needs to crank out no fewer than 360 million fully 3D rasterised, vertex shaded, tessellated, bump mapped, textured – whatever – pixels every bleedin’ second. And that, ladies and germs, is a huge ask.

Up the ante to 120Hz-plus refresh or flick the switch on anti-aliasing techniques that involve rendering at higher resolutions internally on the GPU and you could very well be staring a billion pixels per second in the face. The humanity.
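
If you fancy checking my sums, the arithmetic is simple enough to sketch in a few lines of Python (the function and the supersampling knob are purely illustrative):

    # Back-of-the-envelope GPU pixel throughput at a given resolution,
    # refresh rate and internal supersampling factor.
    def pixels_per_second(width, height, fps, supersample=1.0):
        # Supersampled AA renders at a higher internal resolution,
        # multiplying the pixel count in both dimensions.
        return width * height * fps * supersample ** 2

    print(f"{pixels_per_second(3840, 2160, 45):,.0f}")       # ~373 million: 4K at 45fps
    print(f"{pixels_per_second(3840, 2160, 120):,.0f}")      # ~995 million: 4K at 120Hz
    print(f"{pixels_per_second(3840, 2160, 60, 1.5):,.0f}")  # ~1.12 billion: 4K, 60Hz, 1.5x supersampled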

That dinky little thing on the right is my trusty 30-inch Samsung

Now it just so happens that I have philosophical-going-on-ideological objections to multi-GPU rendering setups. Multi-GPU splits opinion, but personally I’ve always found it unreliable and lacking in transparency even when it does work. In short, I find myself constantly wondering whether I’m getting the full multi-GPU Monty and it eventually sends me mad.

Long story short, I’m only interested in 4K gaming on a single GPU. Enter, therefore, the MSI GTX 980Ti GAMING 6G. Handily loaned to me by MSI for a little while, it doesn’t come cheap at £550 / $680. But partly thanks to a core clockspeed that’s been bumped up by about 15 per cent over the reference clocks, what it does do is give you probably 90 per cent of the gaming grunt of the fastest single GPU on the planet, Nvidia’s Titan X, for about two thirds of the cash.

Maxwell to the Max: MSI’s 980Ti with some overclocking

It’s also based on what I think is the best current graphics tech, Nvidia’s Maxwell architecture, and it has two massive (read: very quiet-running) fans that actually shut down when the card is idling. Yay. Anyway, as things stand, if any graphics card can deliver a workable single-GPU 4K gaming experience, it’s this MSI beast.

The final part of my 4K background check involves the screen itself. I am absolutely positive about this bit. If you’re going to go 4K, two things are critical. First, it needs to have a refresh rate of at least 60Hz, though the fact that 60Hz is currently as high as 4K goes remains a possible deal-breaker, more on which in a moment. Second, it needs to be huge.

Loads of full-sized DisplayPort sockets. Yay

Specifically, it needs to be huge because otherwise you lose the impact of all those pixels at normal viewing distances, and because to this day Windows operating systems are crap at scaling, so super-fine pixel pitches are problematic for everything non-gaming. How huge? I wouldn’t want less than 40 inches for 4K.

All of which means that right now you’re looking at just one monitor, at least as far as I am aware: the 40-inch Philips BDM4065UC. So I’ve got one of those in too. Huzzah. And doesn’t it make my trusty Samsung XL30 30-incher look positively puny?

Even at around £600 / $900, I reckon the Philips is fantastic value. It’s not perfect. In fact it’s far from perfect. But my god, it’s a thing to behold.

So there you have it, 4K done as well as it currently gets on the PC courtesy of Philips and MSI and arguably at a not absolutely insane price, depending of course on your own means. So what’s it actually like?

Sheer man power allowed the Romans to achieve 4K in 132 A.D. Probably…

The first thing you have to do is get over the slight borkiness of the Philips monitor. I’m not totally sure if it’s a pure viewing angle thing or also involves the challenges of backlighting such a massive screen. But the apparent brightness and vibrancy fall off towards the bottom, and especially in the bottom corners, at normal viewing distances. Viewed at an HDTV-ish 10 feet, the problem disappears. Three or four feet from your nose and it’s all too obvious.

The other issue is pixel response. With the pixel overdrive option at either of the two fastest settings, the inverse ghosting (as detailed here) is catastrophic. At the lowest setting, it’s merely a bit annoying. You can also turn it off altogether. And then the panel is just a bit slow.

Oh and the stand isn’t adjustable. At all. But let’s turn to the biggest questions of all – frame rates and quality settings.

For the meat of this discussion, I’m going to use Witcher III. Why? Because it’s right on the cusp of playability at 4K and that’s handy for framing the broader debate. There are more demanding games, Metro: Last Light for instance, some of which probably aren’t really goers at 4K. There are less demanding games, like the GRID racing series, which will fly at 4K. Witcher III is somewhere in the middle and serves as a handy hook on which to hang all this, nothing more.

Everything set to full reheat…

Is Witcher III any good as a game? Who cares, it looks ruddy glorious at 4K native and ultra settings. And yes, it is playable on the MSI 980Ti, even with anti-aliasing enabled.

In fact, toggling AA doesn’t make much difference to the frame rate, which is either good, because it means you may as well have it on, or bad, when you consider that the MSI card is knocking out marginal frame rates in the mid-30s.

Actually the frame rate itself is tolerable and it’s not the numbers that really matter. It’s the feel. The possible deal breaker for some will be the substantially increased input lag when running at 4K.


4K with AA above, 4K with no AA below…

Crush the settings to low quality and the frame rate doesn’t leap as you might expect. It only steps up to the mid-40s. The lag remains. So you lose a lot of visual detail for relatively little upside.

However, bump the resolution down to 1440p (i.e. 2560×1440) and the 980Ti shows its muscle. It’ll stay locked at 60fps with V-sync enabled at ultra quality settings. And there’s no more lag. Which then raises the question of how non-native resolutions look on a 4K panel.

Hi there. I’m interpolated…

In isolation you’d probably say it looks pretty darn good interpolated. Indeed, you might not immediately spot it was interpolated if you hadn’t been told, though you’d probably pick it if asked to make the call. But then you go back to 4K and the clarity and the detail are simply spectacular.

So these are the kinds of conundrums you’re currently going to face at 4K. To be clear, you’ll get a lot of variation from game to game. As another for instance, Total War: Rome II runs pretty nicely maxed out at 4K. Notably, it runs lag-free and looks fantabulous.

That said, running native in a lot of online shooters is probably a non-starter. For everything else, it all depends on personal preference.

The sheer scale and majesty of 4K at 40 inches is truly special, one of the wonders of the modern computing age. But the 34-inch super-wide panels do the magisterial vistas thing seriously well, too, and give single-GPU setups that critical bit of breathing room.


What happened to my fancy threads? 4K ultra quality above, 4K low quality below…

Meanwhile, as I 4K-gamed the evening away I was reminded of the bad old days of PC gaming, when you were constantly playing off quality settings against resolution. More recently, with a top-end GPU, the usual drill has involved maxing everything out without a worry in most games. That’s very liberating. With 4K, the settings stress returns.

Then there’s the whole high refresh and adaptive refresh thing. There’s no getting round it. If you are familiar with those technologies, you’ll miss them here. Then there are the basic image quality flaws that come with this particular 4K panel – slight response and viewing angle borkiness, basically.

Very long story short, I’m afraid I don’t have any easy answers. You pays your money, you takes your choice. Personally I don’t even know what I prefer between 4K, high refresh and fulsome adaptive-synced frame rates right now, so I can hardly tell you what to go for. But if a single issue was likely to swing it, then fear of input lag might be the 4K deal-breaker for me. I really hate input lag.

85 Comments

  1. craigdolphin says:

    I can’t help but read these articles from Jeremy while mentally ‘hearing’ the words in a thick Scottish brogue. It’s the surname. :)

    On topic: I don’t think 4K gaming is a realistic option for me for a few years yet. I think my next rig will be aimed at 1440p resolutions with one or other of the frame-locking techs involved. And I’m definitely waiting until the next-gen GPUs on the next manufacturing process node enter the mainstream. I figure upgrading in ~2 more years is a realistic time frame for that.

  2. Matf661 says:

    Just my 2 cents: I’m running a 780 and have a 28″ 4K monitor and it’s a fucking dream. I play GTAV on medium/high settings at 45-50fps and RO2 maxed out at 60fps+. If you mainly play the newest games on the market it’s not worth getting one, as you won’t really be able to max them, but if you play games from within the last year, definitely get one. I would highly recommend it, as the clarity in games is genuinely unbelievable.

  3. PopeRatzo says:

    I just wish my old-ass eyes could tell the difference between 1440 and 1080 at gaming distances.

    For me, it just ain’t worth the money.

  4. vorador says:

    I think I’ll wait until the end of the year before building a new rig, both for 4K and VR. Current GPUs are a bit lacking unless you’re willing to spend way too much on the 980 Ti or go SLI with its related problems.

  5. Cei says:

    In all honesty, no.

    Instead of 4K I just picked up a Dell U3415W. It’s a fancy curved 21:9 display at 3440×1440, which is a big resolution but not as much as 4K for the poor GPU to push out. It’s still huge, fills my viewing area, and looks fantastic in terms of colour and the like.

    As for GPUs, I went for SLI 980Ti <3

  6. Jediben says:

    4K is a crock. 60fps is not enough any more. I’ve tasted 144Hz and going back is unimaginable.

    • steves says:

      Word.

      Although you probably don’t need the extra 44-54Hz – about 90 is where I stop noticing the extra smoothness, but the difference over 60 is worth way more than extra pixels.

      4K is for desktop work. Extra screen space is always welcome, and you can never have too much DPI for text, but once shit starts moving…gimme the framerate.

      Witcher 3 is a kind of weird middle ground re. this because it’s action-y, but not stupidly twitch-dependent like some things. And amazingly enough, despite 7 patches, you still can’t set the FOV to something tolerable without ‘cheating’.

      • nearly says:

        This is probably a perfect summary of the dilemma that’s kept me from upgrading for quite some time. The other big concern is that I’ve found myself bumping up the text size on Windows and I worry that going to a higher resolution will mean harder to read text. While I think I’d love 4K for productivity outside of gaming, I’d worry that I’d need to visit the eye doctor, especially with the bizarrely limited FOV and text sizes in big releases like The Witcher.

  7. BlueTemplar says:

    Hmm, looks like it’s time to upgrade to 1920×1080! :)

    Well, seriously I’d prefer 1920×1200 if I can find a monitor that’s not prohibitively expensive. Anyone heard of 1920×1200 monitors with low latency IPS, 144Hz refresh or G-Sync?

    • Cei says:

      Nope. 1920×1200 is now the realm of “business” monitors, not gaming monitors, and the latter are the only ones bothered with 144Hz and Gsync. Basically it’s 16:9/21:9 or go home.

    • TacticalNuclearPenguin says:

      Older stuff only, and maybe some obscure model, but to make it simple: 16:10 is 99% abandoned and as such you’re not going to find one with new features like high refresh and the like.

      • BlueTemplar says:

        Actually, it wasn’t that hard to find:
        Asus PB248Q, 300€:
        link to translate.googleusercontent.com
        IPS, only 5-6ms switching time, and lots of other qualities, including (possibly?) being able to set the resolution (black borders, centered) in the screen menu itself and being able to push the refresh rate to about 70Hz?
        Any issues I’m missing?

      • Eleven says:

        They’re a niche product, sure, but they’re not unpopular. Dell makes a new 24-inch 16:10 monitor every year or so, and they’re always highly rated.

        They are meant for workstations and productivity, that’s what the aspect ratio shines at, but there’s enough cross-over into gaming that in the Steam hardware survey there are more people using 1920×1200 than 1440p and 4K put together.

  8. BlueTemplar says:

    By the way, what are the power requirements (min, max, average?) of the oversized screens and graphics cards mentioned in the article?

    • Awesomeclaw says:

      I have a 290X and a 28″ 4K screen and I push about 450-500W at full whack, including the PC, audio, monitor, various USB stuff, etc. The whole rig idles at about 130W with the screen on, and 90-100 with it off.

      I find that 4K is entirely feasible for most games, and provided that you dial down a few features you can push 60fps on many of them. I play Witcher 3 at 1440p with most stuff turned up, and games like Warframe and Rocket League (which are pretty but not quite as intensive) I have turned all the way up except for anti-aliasing.

      Incidentally the difference between 4k and 1080p is entirely visible on a 28″ monitor at desk distances.

    • Cei says:

      980Tis draw up to 330W, but in reality it’s <300W. With the rest of the rig you might touch 400W at best, including the monitor.

  9. syllopsium says:

    I’m not bothering with 4K – currently still running old 4:3 TFTs and CRTs, which top out at 1600×1200 and 1920×1440 (it’ll output more, but the phosphor won’t resolve it properly). 4K would make sense if I was doing image or film editing, but I’m not.

    Next upgrade will be to 1440p, possibly one of the ultrawide monitors if I can stand to chuck two of my monitors (but that’ll be ok, since it’ll have more resolution than both combined!)

    • tnzk says:

      Heh, quite recently I had to throw out my 32″ CRT TV for the inorganic rubbish collection. I gym regularly, and let me tell you I didn’t need to work out my grip or my back that week. Just a few weeks ago I also moved house, and I was lifting the 32″ LCD with the frugal tips of my fingers.

      Yeah I love(d) the colour reproduction on CRTs, but in terms of power consumption, space, and portability, I’m glad we left that era behind.

      Actually quite like LED tech (it’s honestly created a minor revolution with production filmmaking too). OLED is going to be even more awesome.

      • BlueTemplar says:

        What LED tech are you talking about? (And why is it revolutionary for production filmmaking?)

        For screens, it seems that what has recently been called “LED” is actually the technology of Liquid Crystal Displays blocking light from a Light Emitting Diode backlight (as compared to the previously common technology of a fluorescent lamp backlight).

      • syllopsium says:

        I have a big desk, and a fair few computers. The 21″ CRTs are ok to move, although, yes, a big telly is weighty. On the other hand that’s nothing compared to my 8″-tube CRT projector, which is bolted to the ceiling and weighs 75kg.

        I’m looking forward to when OLED really becomes a thing, but seeing as it’s only passable on phones unless you have a bottomless wallet (and even then, the screen sizes are limited), I’m not holding my breath.

    • phuzz says:

      You realise that if you’re sat in front of two CRTs for, say, forty hours a week, you’re getting the equivalent radiation of an arm X-ray per year to your face.
      Probably not a problem, but my friend who also refused to chuck his CRTs ended up going grey in his twenties, from the front of his head first.
      Coincidence? Well yeah, probably.

      • geisler says:

        “you’re getting the equivalent radiation of an arm x-ray per year to your face.”

        No, you aren’t. Unless your CRT device is more than 30 years old, it will not emit enough EM radiation to be even remotely harmful (less than 0.5 mR per hour has long been the FDA-required spec). Even older units would have to be unshielded, and you would have to stick your head into one of them for years to get to the radiation equivalent you’re talking about. Please don’t spread FUD like this.

        Source: I’m an electronics engineer.

      • dropbear81 says:

        You’re also getting radiation by virtue of sitting in a room, whether there is a TV in it or not. And two arm X-rays are equivalent to about 1 week of the average environmental radiation you’re going to get by simply existing. Sitting in front of your TV for hours and not moving is worse for your health than any perceived radiation dose from the TV (which, as I understand it, was almost eliminated by the invention of the transistor? Correct me if I’m wrong, I’m a radiographer not an electrical engineer).

      • syllopsium says:

        I’m in my early forties, and I’m not doing too badly on the grey – there’s a fair bit, but not as much as you’d expect at my age. I think if it did cause greyness, there are more than enough people to prove it.

        As to the X-ray dosage, even if it’s true, it’s currently a moot point because I’ve had six X-rays recently due to a broken shoulder.

        At this stage, there are many people who’ve been using computers with CRTs day in/day out for 40-50 years.

  10. OmNomNom says:

    Even if you have a beast of a machine, it is way too early for 4K gaming imo. I have 980Ti SLI and I wouldn’t touch it. I enjoy high framerates.

  11. povu says:

    I intend to upgrade to a good quality 120hz 1440p display at some point in the next two years or so. Right now that seems a good compromise. 1440p is enough of a difference over 1080p to me, and it will let me take advantage of the higher refresh rate.

  12. aircool says:

    For me, it’s just too fucking expensive for what you’re getting. If you’ve got money to burn, fair enough, but even then, I can find far better ways to piss a grand up a wall.

  13. MuscleHorse says:

    I don’t suppose we could have another round of recommendations for solid 1080p monitors? I got mine back when I didn’t have quite such money to spend on ridiculous things, and as such it pretty much shits itself when things go over 60fps and vsync is causing issues. Would be nice to have a good monitor with the RPS stamp of approval.

    • Ejia says:

      Yes please. But 23″+ (wait, is that supposed to be ‘, or “?). I just want nice smooth 1080p.

      • DelrueOfDetroit says:

        ‘ is feet, ” is inches. So 6’5″ would be 6 foot, five inches.

        I just went to 27″ and I don’t think I could go smaller. It’s about the perfect size for having everything right inside your peripheral vision.

  14. gritz says:

    Great monitors! That keyboard though…

    • DelrueOfDetroit says:

      Seriously! Why would someone voluntarily use a Mac keyboard?

      When I type I don’t hold down the shift key to capitalize. What I do is I quickly hit caps, then the letter and caps again. The Mac keyboard capslock button is absolute crap. Half the time it doesn’t even register so you end up tYPING LIKE THIS.

      • OmNomNom says:

        Mac keyboards are poop, I agree. It feels like, as with every other product, they try to remove buttons.

        But if that is how you do single capital letters… then you are doing it wrong…

    • loldrums says:

      It’s funny how people get attached to things. $1,000+ gaming rig attached to a $12 Dell mouse from 10 years ago? All too common.

    • geisler says:

      It’s obviously not a permanent setup pictured there. In other photos you can see a testbench and another keyboard. The picture featuring the Mac keyboard is a temporary setup for photographic composition – look at the depth and the lighting, the general location…

  15. melnificent says:

    I picked up this Philips monster the other week. Tweak a setting down here and there on a few games and my 290X (Tri-X OC) runs most things at 60fps more often than it drops. It’s about managing expectations, and I knew going in that single-card 4K at max settings is not going to give a locked 60 on any card.

    But for split-screen gaming and other things it’s hard to complain. The multiple-input picture-by-picture mode is a weird gimmick: select up to 4 sources, each shown as a 1080p quarter of the same monitor. I’ve found it useful for when I want completely separate desktops to work on, so I tell the PC that it’s really outputting to multiple monitors.

    • Yam10002 says:

      I got the BDM4065UC when it finally came to North America. The monitor is perfect size-wise, and text can actually be read (unlike on a 28″). Response times are good and the price was very appealing at $799 from Amazon.
      The 4K experience still needs work on the part of Windows and game developers though. Newer games do better in general; older games may have no 4K support, bad interfaces, and other issues.

  16. joa says:

    Shit, I am still waiting for OLED. They have it for TVs, but it is very expensive and only available in big sizes. I don’t really care for the 120Hz, or the syncing. Anyone thought about projectors? Can you project that shit on the wall, make it as huge as you like? How is the graphics quality?

    • OmNomNom says:

      AFAIK OLED isn’t coming soon because it is still expensive to make into large monitor-size panels, because no one really does it. Also from what I remember the ghosting wasn’t great.

      Projectors are great as long as you are prepared to fork out at least £1500 for a half-decent one. Input lag etc. usually isn’t quite as good as a monitor, and of course you have to have ideal conditions (use your PC in a cave).

    • Jeremy Laird says:

      The vault in the images will be my home cinema in the next month or so, and at that point I’ll look into playing games on a projector. I’ve done it before quite often, but not with a proper install, so it’ll be interesting to see whether it feels like a gimmick or something I’d do regularly.

  17. El_MUERkO says:

    I’m using the Philips 40″ 4K to type this on; beside it on my desk is my old Dell 30″. Despite the 40″’s larger screen size, its pixels are smaller. For me there’s no going back to a smaller screen, only onward and upward.

    My ideal… a 42″ curved 144Hz OLED. It’s all doable with current tech and DisplayPort 1.3, chop chop Philips!

  18. DeepFried says:

    I have a small desk, so my monitor is 2′ or less from my face. Does that preclude 4K as a viable choice if 4K needs a big monitor? I can imagine 27″ being fine, maybe even 30″ at a push, but 40″ at 2′ from your face!? That doesn’t sound in any way optimal. (My current monitor is 24″.)

    • OmNomNom says:

      I think needing to turn your head to see the whole screen while using your PC is a bad idea, for general use let alone for gaming.

      Also, having a screen that close doesn’t sound good for your health. Can’t you buy a bigger desk?

  19. Raoul Duke says:

    Can anyone comment on how these beasts perform with a 1080p input? I.e., given that this is an exact multiple, do they gracefully do pixel doubling, or do they become a blurry mess?

    The use of 1440p in the article to test interpolation is… confusing.

    Meanwhile, I am a happy 1440p owner. Runs most things pretty nicely at native resolution on a single Radeon 290.

    • OmNomNom says:

      I’ve never seen a TFT monitor that looks spectacular outside of its ideal resolution, really. You’re better off going with the resolution you’ll end up using.

    • Jeremy Laird says:

      So, I’m afraid that the whole pixel doubling thing has never translated to reality. Whether it’s the screen scaler or GPU scaling, you always get interpolation not pixel doubling. 1080p on this huge display looks very soft.

      So, that’s why you’d use 1440p – because it looks much sharper than 1080p and the GPU in question is well up to 1440p.

      • TacticalNuclearPenguin says:

        It’s because you can’t circumvent the fact that a pixel is made of three subpixels, so when you ask 4 pixels to act like one you simply get this problematic result: link.

        Now ignore the fact that we’re talking about 9 pixels there instead of 4, but the issue is the same. 1080p on 4k would look perfect if a pixel simply was… a pixel, and not a series of things that make it look like one.

        • Jeremy Laird says:

          Almost all of the time you don’t get a ‘problem’ at all. Four blue pixels are the same blue as one blue pixel. You don’t see the subpixels.

          The exception to this involves rendering that’s sub-pixel aware, including Windows Cleartype. I’m virtually certain that games do not use subpixel rendering of any kind, so pixel doubling would work perfectly for games.

          • TacticalNuclearPenguin says:

            You can clearly see the effect of subpixel layout, check there: link to lagom.nl

            Different layouts will look sharper with certain lines, just like you can figure out which layout you have depending on the small black border that appears on the cyan square.

            In short, stitching together multiple groups of subpixels is all you need to lose sharpness.

          • Jeremy Laird says:

            Like I said, this is only relevant for sub-pixel aware rendering. Again, like I said, this includes Windows ClearType. Thricely, like I said this does not include games.

            Fourthly, like I said, you could do pixel doubling for games and without any subpixel issues. You wouldn’t run on the desktop non native, so OS font rendering at non native isn’t a huge issue.

            In any case, you could have a solution that only used pixel doubling for full screen 3D rendering and make the subpixel font rendering limitations thoroughly moot.

            Oh, and ClearType doesn’t exist because subpixels are a problem, you are misunderstanding the issues here. ClearType uses subpixels as an opportunity to increase the effective resolution of an LCD panel.

          • TacticalNuclearPenguin says:

            Besides, cleartype wouldn’t exist if subpixels weren’t a problem.

        • Raoul Duke says:

          Sorry, but that doesn’t make sense.

          If I draw a bitmap image in which I make two blue boxes, one exactly 1×1 pixels big, and one exactly 2×2 pixels big, then what you will see on your screen are a blue dot and a slightly bigger blue dot.

          As someone else has pointed out below, you can demonstrate this using something like DOSBox. If you set it to render an exact multiple of a game’s base resolution, you will get what appears to be a pixel-perfect enlargement of the original game so long as the multiplied resolution fits on your monitor and you turn off any funky attempts to further scale that image.

          There is no logical or technical reason why a 4k monitor couldn’t do exactly this with 1080p input, i.e., for each 1×1 dot, display a 2×2 dot. Given that 1080p is pretty much the current standard for video, I find it somewhat surprising that this isn’t a very basic option on most monitors.

          The stuff you’re talking about only matters if something else is interfering between the source and monitor, such as bad scaling, cleartype, etc.

    • TacticalNuclearPenguin says:

      Keep immensely enjoying your 1440p thing, I’d say.

      • Raoul Duke says:

        Oh, I will. I actually find that 1440p is dense enough that you can turn anti-aliasing down or off on a lot of games without any really noticeable loss of quality. So getting things running at 60fps isn’t actually hard at all and looks delightfully smooth and detailed in motion.

    • Person of Interest says:

      I have an HP ZR30w (2560×1600) that can also accept 1280×800, but nothing else due to its lack of built-in scaler. The upside is, with some trickery in the graphics control panel to add a custom resolution, I was able to get any game’s 1280×800 resolution setting to bypass the video card’s bilinear scaling, and the monitor did true pixel doubling. Older games with interfaces designed for the SVGA era use a lot of bitmap text, and it looks so much better when there’s no smoothing applied by the scaler.

      It’s possible that these 4k monitors will properly pixel-double if you feed them 1080p, but you first have to convince your graphics card that your monitor will accept 1080p and doesn’t require scaling. Other options might be using something like the GeDoSaTo DirectX injector to do nearest neighbor scaling of 1920×1080 to 3840×2160, which I tried to do but failed (maybe it isn’t possible).

      I was able to create DOSBox profiles that force 640×480/800×600 games to run at integer multiples (1920×1440, 1600×1200). There are some black borders around the edges of the screen, but again these games tend to use tiny bitmap UI elements that look awful when scaled smoothly, so it’s worth the letterboxing.

      But for a modern 3D game like The Witcher 3 (caveat: I have not played it), the UI is often scaled and resolution-independent already, and the game is anti-aliased, so I’m not so sure the “sharp” 1080p would actually look better than upsampled 1440p.

      The subpixels are not the problem, only the scaling technique. To see for yourself, just run a console emulator, or pixel art game, with built-in integer scaling graphics options. I don’t notice the motion or detail looking any better or worse at 2x/3x/4x than when the game runs at 1x scaling (postage-stamp size in the middle of the screen).
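
      For what it’s worth, true integer scaling is trivial to express, which makes its general absence from monitor scalers all the more puzzling. A minimal sketch of pixel doubling, assuming numpy (the function name is mine):

        import numpy as np

        def integer_upscale(frame, factor=2):
            # Nearest-neighbour integer scaling: repeat every row and column
            # 'factor' times, so each source pixel becomes a factor x factor
            # block of identical pixels. No new colour values are invented,
            # so bitmap edges stay perfectly sharp.
            return frame.repeat(factor, axis=0).repeat(factor, axis=1)

        # A 1080p frame (height, width, RGB) doubled onto a 2160p panel:
        frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
        assert integer_upscale(frame).shape == (2160, 3840, 3)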

      • BlueTemplar says:

        I’ve always wondered how to make your PC display a specific resolution at the center of the screen (with black borders around), and THEN scale the game (at whole or simple fractional ratios) inside that display space…
        Any ideas? (besides using an emulator)

        Also, Website of Interest :
        link to dewt.org

      • Raoul Duke says:

        I’ve done this with DOSBox too, it works well.

        I just find it strange that with 4k being an exact multiple of 1080p, there is no option to gracefully pixel double 1080p content. Seems like a no-brainer to me – no other form of scaling is going to magically improve the image or add extra information, all it can do is degrade it.

        • BlueTemplar says:

          There seems to be some confusion here…
          (Marketing bending facts to sell things, who would have thought!? /s)

          link to en.wikipedia.org

          The relevant resolutions are:

          – “1080p”, or 1080 lines, progressive (rather meaningless; I’m only aware of some satellite networks that are (were?) using interlaced digital signals), 16:9 ratio assumed: 1920×1080

          – “Ultra HD” or “UHD”, 4 times “1080p” (still 16:9): 3840×2160 (could also in theory be called “2160p”)

          – Digital Cinema Initiatives’ “4k”: 4096×2160 (256:135 aspect ratio, that’s 16:8.4375 or ~17.067:9), which is NOT 4 times 16:9 “1080p”, and also has 2160 lines.

          – There also seems to exist a DCI “2k”, which is 2048×1080 (256:135 aspect ratio), so also 1080 lines.

          And since everyone is jumping on the “4k” bandwagon, and “UHD” is so close to 4k, and Cinema is probably going to win again in imposing their standards, I’m betting there’s going to be a lot of confusion as to why “4k” isn’t exactly 4 times “1080p”.
          Though the good news is that since “4k” > “UHD”, you can still fit 4 times “1080p” in it, with black borders on the sides… assuming of course you don’t have a cheap display that doesn’t know how to do that natively.
          I’m also willing to bet that in the cases where the display does do that natively, there’s going to be a lot of teeth gnashing about “how to make those ugly black bars on the side go away” (or maybe not, as that’s only 128 pixels on each side, or ~3% of screen width per side).
          OTOH, those that end up with “UHD” monitors will either have (ugly?) interpolation issues trying to display “4k” movies, or, hopefully, will have a setting to cut those extra ~6% of horizontal pixels…
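
          To put numbers on those black bars, the arithmetic as a quick Python sketch (purely illustrative):

            dci_4k, uhd = 4096, 3840
            border = (dci_4k - uhd) // 2  # 128 px of black border per side
            print(border / dci_4k)        # ~3.1% of a DCI 4K screen's width per side
            print((dci_4k - uhd) / uhd)   # DCI 4K is ~6.7% wider than UHD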

          • TacticalNuclearPenguin says:

            In theory you’re perfectly right, in practice it’s just commonly accepted that the 16:9 equivalent of 4k is 3840, and indeed many 16:9 screens marketed as 4k have that resolution.

            Yes, it’s wrong, but this idea has gained a lot of traction and it’s probably easier that way, since 16:9 is still a big factor for the foreseeable future.

      • mygaffer says:

        If you turn on GPU scaling your monitor will handle any resolution, as your GPU will do the scaling.

  20. Nereus says:

    Any chance of a side-by-side comparison of how a triple-monitor setup runs and feels compared to one single monster of a monitor? I am torn between getting some 24-inch or 27-inch monitors and going triple screen, or springing for a 30-inch-plus IPS.

    • mikejs says:

      I replaced a triple-monitor Eyefinity setup with the Philips 4K, and it’s a huge improvement. Bezel compensation helps, but it means that there are things the game is displaying that you can’t see because they’re behind the bezel. If that’s some key part of the HUD, or part of the menu system or inventory or whatever, you’re kind of stuck. It’s also an extreme aspect ratio and some games just don’t like that – and running them in a window usually just means running them on the middle monitor.

      The Philips is huge, but it’s also 16:9, which many games expect, and means that running in a window can use slightly smaller 16:9 resolutions, using most of the screen area without problems.

      My old system was 3x 1280×1024 (i.e. old-style 5:4 ratio). The Philips has the same number of pixels horizontally, more than double that vertically, much improved image quality, and takes less desk space.

      • RProxyOnly says:

        What?

        You bought a monitor where the bezel encroaches on the viewing area? There are manufacturers that do this? Really?

        Are you sure you are scaling properly, and that something isn’t causing the picture to be drawn at a larger size than the viewing area allows? (overscan)

        TVs do this and you have to manually set them to 1:1, ‘just scan’ or some such. Check your GPU settings also and tinker with which device you allow to do the scaling, but if you actually have a monitor where the bezel covers some of the viewing area then you’ve bought a pig in a poke.

        Would you share your model so we can all avoid it?

        • mikejs says:

          No – the triple-monitor setup had bezels that obscured some of the viewing area. It’s pretty much unavoidable with things like Eyefinity. All panels have some sort of bezel, and where one monitor touches the next you either have some pixels “behind” the bezel, or you have a discontinuity where moving 1 pixel in the image moves 2x the bezel thickness in space. Google bezel compensation for more info.

          Moving to the Philips 4K gives me more pixels, and roughly the same physical FOV, but without any problems caused by bezel compensation.

  21. mikejs says:

    Got the Philips 4K monster a month or so ago, and would not go back. Some things run at 60fps full screen, and some don’t, but I just run the ones that don’t in a window, at whatever res they are happiest at – which is often still stupidly high (Dark Souls at 2560×1440). For some games you want a window anyway – Spelunky on a 40″ screen is a little silly.

    The screen is big enough that you really can leave windows at normal display scaling, although the pixels are smaller than a typical 1920×1080 display, and bumping up font sizes here and there isn’t a bad idea. All that screen area does need something to manage it though, as you won’t want to run many applications full screen at that res. WinSize2 is the best of these that I’ve found, and lets you clamp particular windows to particular sizes and positions.

  22. pillot says:

    Single-GPU gaming at 4K is simply a fantasy, and that fact will not change in the foreseeable future.

  23. TormDK says:

    Not to 4K.

    I just moved to 1440p, with a GeForce 980Ti and an ACER XB270HU (or whatever it’s called) at 144Hz, and I’ll be staying there until Team Green can make a monster GPU that can give me a minimum average 60FPS at 4K at all times.

    • loldrums says:

      I made a similar decision. I had been wanting to upgrade to G-Sync, 144hz, and either 1440p or 4K. Decided most of the tech was too expensive at the moment and my GTX 970 wouldn’t be able to handle it anyway — if not with current games, then with new releases over the next year or two — and went with a nice BenQ XL2420Z 144hz screen. It’s still a nice upgrade over my old budget-y LG flatscreen and if I eventually decide to nab a big-ass 4K beauty, I can flip this one vertically and use it as a second monitor. It’s doing nicely for now, though. 144hz is a noticeable improvement, at least in games that aren’t limited.

  24. qwurp says:

    I just purchased (comes in tomorrow wooooooo) the ASUS PG278Q ROG Swift, which I will be pairing with my single-GPU GTX 780 Ti Classified. I thought about 4K but it just doesn’t sound like it’s ready yet, and I’m good with my single GPU. This is going to be a huge jump from my current “standard-fare” 1080p. The ROG has 144Hz, 1440p, 1ms response times & G-Sync? Yes, I’m ready to embrace the gaming monitor finally…

    • TormDK says:

      Sounds sexy. I was looking at the ROG Swift as well, but went with the ACER XB270HU.

      I had a BenQ 1080p TN panel monitor before, and I’m not entirely certain I can see all the big differences between TN and IPS that people were talking about prior to my purchase, so I hope you have better eyes than me :)

  25. csbear says:

    I’m going with a 48″ 4K Samsung LED TV (JS9000 series) as my next “monitor.” It is a curved screen, which helps mitigate some of the peripheral issues with being close to such a large screen. Yes, it is expensive to be used just as a gaming monitor, but I will obviously be using it to watch TV as well (what a surprise). Another good option mentioned there is the cheaper Samsung 6550 (40″). These particular Sammie series work very well as PC monitors and there is a “push” for them on computer geek sites. Long thread here with loads of info, even a setup guide to get it working optimally:
    link to hardforum.com

    One caveat is that a GTX 9xx card is required due to HDMI 2.0 support for these “monitors.” I will be getting a single 980Ti, which people say runs nicely. Quote from the site: “I am running The Witcher 3 maxed (even maxed hairworks AA) and it’s working just fine on JS9000 @ 4k. I’m obviously not on 60 fps, but the game is fluid and it’s not stuttering or lagging anywhere. not a single time.”

    My current monitor is an Eizo 27″ 2560×1440 and it has been wonderful (for a non-120Hz), but it is just too small for me now. I don’t play FP shooters and the like, so I’m not too worried about high fps. For me, a big screen and high res trumps high frame rates with the games I typically play.

    Not a cheap way to go by any means, but if you are willing to splurge (and a lot of PC builders do just that in the hobby anyways) and are ok with it not being 120Hz, people are very happy with it as mentioned in the thread above.

    • Yam10002 says:

      “Scamsung” 4K TVs of late do not support 4:4:4 chroma on most/all models despite their claimed HDMI 2.0 compliance. You may wind up at 4:2:0 @ 60fps – basically 8 million brightness samples but only 2 million colour samples. Read carefully.
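
      To put concrete numbers on that, a quick sketch of the per-frame sample counts (Python, illustrative):

        w, h = 3840, 2160
        luma = w * h                      # 8,294,400 brightness (Y) samples per frame
        chroma_420 = (w // 2) * (h // 2)  # 2,073,600 samples per colour channel at 4:2:0
        chroma_444 = w * h                # full 8,294,400 per colour channel at 4:4:4
        print(luma, chroma_420, chroma_444)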

    • mygaffer says:

      Unless that guy was OK playing The Witcher 3 at sub-30fps, no way was he getting acceptable frame rates @ 4K with everything maxed including Hairworks. The GTX 980Ti just breaks 30fps @ 4K with all settings on ultra and Hairworks turned OFF; Hairworks itself causes a further ~25% performance hit.

  26. KastaRules says:

    I’ll take a triple screen setup over a single 4k monitor any day.

  27. mygaffer says:

    4K just doesn’t make sense for most people right now, I think. I run a 2560×1440, 144Hz, G-Sync monitor with an IPS panel, and even that can be hard to max certain titles on with a single GTX 980. I also love adaptive sync and the high refresh rate; blur is virtually eliminated, as is tearing. Once I can get high refresh and adaptive sync in a 4K monitor with a high-quality, non-TN panel, I’ll still need to wait for 14nm GPUs to have even a hope of running games near max settings without spending $1,000+ on my graphics cards.

    We will get there eventually, but not for a few years yet I think.

  28. BlueTemplar says:

    You should check the difference between “4k” (~17:9) and “Ultra HD” (16:9) (see my post above).
    Since non-CRT displays require pixel-perfect resolutions, and native monitor black borders / scaling aren’t always available, this seems to be an important distinction to make…

  29. pfig says:

    I’m not sure I even care about resolution (that ‘fancy threads’ shot aside), I want 4K because of colour. But it’ll be a few years.

  30. Cryio says:

    For proper 4K gaming you’d still need SLI 980 Tis or XFire Fury Xs.

    Otherwise, it’s 30fps in most games maxed out at 4K…

    • BlueTemplar says:

      Hah, PC graphics cards are so weak!

      The PlayStation 3, due for release in 2005, will be able to output 1080p at 120Hz over 2 screens at the same time, using the 2 teraflops of its Cell CPU… it therefore should be able to run 4K at 60Hz!

      link to neogaf.com
      link to eurogamer.net

      #ps3masterrace

  31. Strabo says:

    The problem I have with 4K is the scaling issue in Windows. Macs work pretty well now (I have a Retina MBP), but Windows is – and will be, because of the implementation – pretty bad at showing stuff not at 100%. Which means I would need a monitor that is still readable without scaling. My current 27″ 2560×1440 is at what I would say is the smaller end of readable (108 ppi) for me. Which means for 4K I would need a 45″ monitor. Which in turn is far too big to see everything on without turning my head, and thus something I wouldn’t want on my desk.

    I thought about a 34″ 21:9 monitor instead and skipping the whole 4K thing for 5K/7K a few years later (5120/6880×2880), in the hope that Windows finally scales well in 2020 or so. At 34″ curved you should still be able to see the whole screen without turning your head around. But then, ultrawidescreen is supported by only a handful of games, and even with third-party tools it seems a pain in the butt to set up for others.

    So I stay with my 27″ and my 24″ as secondary screen for the time being.

    • BlueTemplar says:

      Hmm, I think I saw somewhere a mention of Windows 10 having much better support for these ultra-high resolutions?

  32. Arioch13 says:

    I have several machines. The only one I successfully use 4K with is an X99-based machine with 2 water-cooled cards – and very extreme watercooling at that, 3 huge radiators (one the Alphacool Monsta) – so the cards are incredibly overclocked. It is a massive screen. Yet, for some reason I keep going back to my rather small Asus ROG Swift monitor for the G-Sync at 1440p. Something just doesn’t feel right at 4K. If pushed on the point, it is just so damn slow after playing at 120 & 144Hz with G-Sync enabled for so long.