Week In Tech: 4K, The Complete And Unabridged Shizzle

It’s been a long, hot summer and there’s only so long one man can stare out over the Med and self-medicate-going-on-immolate on passable local vino (turns out that length of time is three weeks). The wi-fi was rubbish, anyway. So, I’m back with some regular updates on all things hardware related. And I’d like to kick off with 4K gaming. The best thing since the original bilinear-filtered graphics accelerators? Or, like stereoscopic 3D, just another over-hyped irrelevance that’ll give you a hurty head, an empty wallet and the sneaking suspicion that the tech industry is pathologically cynical? I’ve got the answers.

A bit of background
First up, let’s deal with what 4K means. In simple terms it refers to the number of horizontal pixels. Four thousand of ’em. Geddit? Unlike 1080p, with its rigidly defined grid of 1,920 by 1,080 pixels, 4K is more of a ballpark figure: around 4,000 horizontal and 2,000 vertical pixels.

Take my word that it’s not remotely critical whether you have slightly fewer or slightly more than 4,000 pixels across. What matters is that you have a shed load of them. In fact, you have four times as many of them as a 1080p panel.

Indeed, for the PC, it’s precisely quad-1080p that appears to be the target resolution. So, that’s 3,840 by 2,160 and a grand total of 8,294,400 pixels. Yup, an eight-megapixel panel. Within the industry, it’s what’s known as an arse-load of pixels and it has all manner of implications – good, bad and possibly even indifferent.
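If you fancy checking that arithmetic yourself, it’s a couple of lines of Python (nothing assumed here beyond the resolutions quoted above):

```python
# Pixel counts for 1080p versus the quad-1080p flavour of 4K.
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd = 3840 * 2160       # 8,294,400 pixels

print(f"{uhd:,} pixels - {uhd // full_hd}x a 1080p panel")
```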

But before we come to those, a quick final word on the quad-1080p thing. In some quarters (Alan, *cough*, you know who you are), there’s been some confusion over the veracity of the four-times metric. After all, it only has twice the pixels across.

Nvidia’s 3D Vision: As good as stereoscopic gets and still a cheap fairground gimmick compared to the wonders of 4K. Really.

You can see where this is going. But hopefully, you can also see we’re doubling along both axes: overlay a 1,920 by 1,080 grid top left on the 4K panel, then another top right and two more below, and the result is precisely four 1080p grids.

Image quality
Enough with the advanced non-Euclidean geometry. On to the image quality implications. This is actually a bit of a moving target given you can have 4K displays of wildly varying sizes. But we’re talking mainly about the PC, so I can give you the lowdown on what the latest 31.5-inch 4K panels made for PCs are like.

Absolutely, monumentally, life-altering, cataclysmic, general-superlatively awesome is the short answer. The long answer requires some qualification.

For experienced high-res gaming hands, there’s a significant delta between your first 4K gaming experience and your first 4K video experience. And there’s a good reason for that: video content beyond 1080p is rare. Games rendered beyond 1080p are not.

Put another way, if you have a 2,560 pixel 27 or 30-inch panel, you’ll be used to gaming beyond 1080p. But you’re unlikely to have adjusted to anything beyond 1080p for video content. Because there isn’t much. 4K content, that is.

That’s why watching 4K video on a 30-odd-inch panel with outrageous pixel density blows your mind. It’s like looking through a window into an alternate reality that’s somehow sharper and more vibrant than the real world. More than anything, it makes stereoscopic 3D look like an utter crock.

As for games, much will depend on your experience. For me, the upgrade was a bit more incremental given my daily workhorse involves a couple of 2,560 panels. But the increased sharpness and detail is still absolutely tangible and immediately and utterly wantable. On the other hand, if you’re not suffering from 2,560-pixel fatigue, then I imagine the 4K gaming initiation will be like my 4K video experience. One of a small handful of permanent mental yardsticks.

Resource hogs like Company of Heroes 2 are just a teensy bit of a problem when you’re pumping 500 million pixels per second. Or at least trying to…

Of course, more pixels means a lot more screen real estate. That’s an easy and obvious win for desktop drudgery. But if any of your favourite games have complex on-screen menus, toolbars you can toggle or any of that jazz, a 4K screen just gives you a load more options, a tonne more breathing space.

A final note on image quality involves the possible downside of that crazy pixel pitch. On this subject, it’s tricky to be objective. For me it’s a non-issue in-game or on the desktop. If for whatever reason you suspect or know you struggle with high DPI displays, the simple advice is try before you buy. Not that buying is terribly realistic at this point. More on that momentarily.

The technical bit
With great image quality comes great, well, technical challenges. Most obviously, eight million pixels is an awful lot. And remember, we’re talking 30 frames per second on average as an absolute minimum for decent gameplay. Preferably 60 frames per second.

Do the maths and we’re talking 250 million pixels per second minimum, 500 million preferably, processed and pumped out to the panel. Honestly, the numbers are so huge, I can’t really make sense of them.
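For the sceptical, here’s the sum spelled out (a rough sketch – it only counts pixels delivered to the panel, not the far greater work the GPU does per pixel):

```python
# Pixels per second at the quad-1080p resolution, for both frame rate targets.
pixels_per_frame = 3840 * 2160   # 8,294,400

for fps in (30, 60):
    rate = pixels_per_frame * fps
    print(f"{fps}fps: {rate / 1e6:.0f} million pixels per second")
```

That lands on roughly 249 and 498 million, which rounds to the quarter-billion and half-billion figures above.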

Anyway, the consequence is that there is no single graphics card that’s up to the job of driving the latest games on a 4K display at maximum engine detail. Even a pair of Nvidia Titans isn’t really up to the job. Three? Maybe. I didn’t try three.

In reality, you’ll get games running reasonably well on a Titan (or AMD’s imminent new mega-GPU) by turning down the settings a bit. But when you’ve just spent thousands on a graphics subsystem, that’s going to be hard to swallow.

The other issue is simply driving the display in 2D terms. DVI can’t cope. Nor can a single HDMI link. Only DisplayPort can do it on a single cable. Even then, it needs to run in a special multi-stream mode to achieve 60Hz (trust me on this, running at 30Hz is a total catastrophe), and that causes compatibility problems with pre-OS imaging.
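A back-of-envelope sum shows why the older links fall over. The data-rate ceilings below are approximate round numbers, and blanking intervals are ignored, so the true requirement is higher still:

```python
# Raw pixel data rate for 3840x2160 at 60Hz with 24-bit colour.
rate = 3840 * 2160 * 60 * 24   # bits per second
print(f"4K at 60Hz needs ~{rate / 1e9:.1f} Gbit/s of pixel data")

# Approximate usable data rates of the candidate links, in Gbit/s.
links = {"dual-link DVI": 7.9, "HDMI 1.4": 8.2, "DisplayPort 1.2": 17.3}
for name, gbit in links.items():
    print(f"{name}: {'copes' if gbit * 1e9 >= rate else 'cannot cope'}")
```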

Triple-Titan action: Even three of Nvidia’s finest are marginal for max-detail gaming @ 4K resolutions.

In other words, your BIOS screen might be distorted or just invisible. You can switch to a lower res or 30Hz mode briefly should the need arise. But there’s currently a little clunkiness round the edges to be aware of.

But those are utterly piffling caveats compared to the major deal breaker, which is price. The screen I’ve played with is the Asus PQ321Q (you can get my detailed thoughts on that screen over three pages in the next print edition of PC Format mag, by the by) and it costs about £3,000. Which is preposterous and makes 4K look immediately irrelevant.

A number of other big brands including Dell are going to wheel out similar panels with similar pricing, which doesn’t help much. But there’s a possible alternative option. And that’s a 4K TV.

I’ve never been a fan of cheap HDTVs as big PC monitors. 1080p on anything beyond 24 inches means big, fat ugly pixels. But what about 4K on a 39-inch panel? I’ve not experienced that, but we’re talking roughly comparable pixel pitch to a 2,560-pixel 30 incher, so it seems promising.
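The “roughly comparable” claim is easy to sanity-check. Assuming the usual 2560×1600 grid for a 30-inch panel, the pixel pitches work out within about 12 per cent of each other:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f"39-inch 4K panel:    {ppi(3840, 2160, 39):.0f} ppi")
print(f"30-inch 2,560 panel: {ppi(2560, 1600, 30):.0f} ppi")
```

That comes out at around 113 versus 101 ppi – close enough that games should look similarly crisp on both.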

Already, there are dodgy no-brand Korean 39-inch 4Kers in circulation, apparently for as little as $700. My understanding is that they don’t support beyond 30Hz in full 4K mode, which kills them as prospects for the PC. But the potential is obvious.

In truth, display pricing and graphics hardware are probably a year or two from being ready for anything even resembling the mainstream. And as soon as 4K at 60Hz is attainable, I’m going to want 4K at 120Hz. But let’s try to keep things a bit realistic. For now, if you have any appreciation of high quality visuals in PC games, you’re going to love 4K. Get saving!


  1. Tom De Roeck says:

4K is inevitable. It’s already taken the film world by storm, so the hardware will come. And since the computer crowd are hungry for ever more graphics, it is inevitable.

    • Grey Poupon says:

      While I do agree it’s inevitable, it’s also quite overhyped. Higher resolution doesn’t equal better graphics. I’d rather get more effects, animations and better textures and physics than higher display resolution. Though all these things are usually dictated by the console platforms. You’re still going to need a huge ass monitor to get any truly noticeable difference in games with 4k.

      • Snidesworth says:

        I agree. 1920×1080 is good enough for me and I’d rather put my hardware towards better graphical frills and physics interactions than cramming more pixels onto my screen.

        • GamesInquirer says:

The same for me. It’s kind of like the Apple Retina stuff. Those i-things’ screens have the resolution (or rather the pixel density; a much smaller 1080p screen would look super sharp, just as 4K does at a much larger size) but the hardware doesn’t have the juice for it, and 3D games can often perform worse, or do without some eye candy, compared with the supposedly lesser siblings in the product range that do just fine thanks to their lower resolution. Similarly, we often get PC games so demanding they don’t perform optimally even at modest resolutions; I can’t begin to imagine what kind of system I would need to run them at 4K with good (not just playable) frame rates.

          But maybe if developers optimize for such resolutions that will mean I don’t have to upgrade as often yet keep playing games with nice settings, just lower resolutions. For example if Crysis 3’s (or whatever very demanding game’s) goal had been for beastly systems with multiple GPUs and the like to run the game optimally (60+fps) at 4K, maybe my not so beastly system would be performing better at 1080p. Instead it was made for the beasts to run it optimally at more reasonable settings and systems like mine have to compromise their frame rate, eye candy or both.

          I’d personally be far more likely to jump on to something like the Oculus Rift if it takes off rather than merely higher resolution. At least until the hardware has advanced enough so that a modest single GPU system (or multi-core or whatever becomes the norm without actually having to buy two – or more – separate high end GPUs that are already expensive when you buy the one, just like a 4 or 6 core CPU doesn’t cost 4-6x what a single core CPU did) of the time can handle the games at 4K well enough.

          Shit, I only just recently upgraded to a 1080p monitor. Granted my last wasn’t that much lower at 1680×1050. But when the latter broke I had to use an older 1280×1024 monitor before I bought the new one and it wasn’t that bad really. It was a bit smaller so the pixel density wasn’t that different and image quality was still pretty decent. Of course I wouldn’t go back if I could help it, but that’s because today’s hardware is at the right level for 1080p without crazy costs, 4K won’t be there for a while yet I imagine.

          I should also add I do kind of hate aliasing still, which high pixel density monitors help a lot with, if not almost eliminate. It’s why I currently prefer games with cleaner less detailed visuals that look super crisp and slick over something like Crysis 3 that in motion is a shimmering mess almost regardless of AA settings (that I’ve only tested just to see how they look, my PC definitely can’t handle high amounts of that in such demanding games). So it will definitely be a great day when cheap hardware can handle games at 4K well. But that day is definitely not today, or next week.

          Edit: yikes, tl;dr much?

        • Flappybat says:

Don’t be silly, it’s not happening tomorrow. Remember, in 1998 we were all playing games at 1024×768. 1080p wasn’t that widely used until 2005. In the next 5 years we might move up to the 2K resolutions.

          • Artfunkel says:

            Yeah, I can’t wait until those 1920×1080 monitors start rolling out in 2018!

          • Mctittles says:

            I’m pretty sure I was playing games at 2048×1536 in 1998. Either that or soon after. “1080p” was a downgrade for me.

          • stupid_mcgee says:

I was rocking 1600×1200 back then, using a 21″ P810 ViewSonic CRT. That thing finally died last year, and I got a 1920×1200 Samsung SyncMaster T260HD for a steal on Craigslist.

            I don’t plan on making the jump to 4k until single-card GPUs can handle it. I simply don’t see any really compelling reason to get a rig with 2 or 3 top-of-the-line GPUs just so I can play games on medium to medium-high settings at a freakishly astounding resolution. The graphic designer in me drools at the idea, but the practical side says it’s just not worth it.

Yeah, I’m sure it looks great and all, but I’m getting older and my eyesight’s getting worse. I probably won’t even notice the increase in fidelity once I do make the switch. :p

          • riverman says:

I was rocking games in 320×480 on a Pentium 75 back in ’98, and I was thankful for the privilege. Didn’t get a real gaming PC until 2001, a P3 800 with a 32MB Rage, but even then I gamed at 640×480… it’s all about the framerate for me :)

        • Dowr says:

          I don’t get why everyone’s reaction to this new tech is always “it’s cool, but I don’t care about it”. This isn’t 3D, it’s a genuine step forward in graphical display technology so why can’t people just smile for once?


          • Tom De Roeck says:

            I’ll start being happy when people stop referring to stereoscopy as 3D.

            And yes, 4K is da bomb. For me, as a film person, the range you can work with to do post production is fucking amazing.

          • airmikee99 says:

            $3,000 for a monitor that needs three high end video cards to run it? What’s there to smile about? Hey guys, there’s a new technology that’s not going to be in my ballpark for at least 10 years, but some guy on RPS says I need to smile about it.

            4K resolution is the very definition of “It’s cool, but I just don’t care about it.” Kinda like a Ferrari, they’re cool, I just don’t care about them.

          • Shuck says:

            I’ll smile about it when it doesn’t cost me $3000 for the monitor and another $3000 for the video cards necessary to make it work. Although frankly my eyesight probably won’t be good enough to even notice a difference by then.

          • Faxmachinen says:

            It’s about as much a step forward in graphical display technology as the NVidia Titan is a step forward in graphical rendering technology. Why should I care if someone makes a big number bigger and a costly number costlier when there are things like the Oculus Rift to fawn over?

          • stupid_mcgee says:

            The most interesting thing is how this will completely supplant print. When cost comes down, of course. We’re already getting devices that have 300 pixel per inch displays, and the default for most print is 300 dots per inch. I know they’re not exactly comparable, but the 72ppi-gap is quickly dissolving and digital media will be expected to retain the same fidelity as high-end print.

            4k is neat. It has a lot of great implications. But 3 NVidia Titans to run most games at high settings? No thanks. I’ll stick with my 25″ 1920×1200 LCD monitor and GTX660.

        • Apocalypse says:

I play on 2,560 (that’s already about twice as many pixels as 1080p) and I can tell you that you do not want better explosions, or even worse, just better AA, instead of higher fidelity. Higher resolutions are a big deal; sure, they can’t compensate for everything, but they’re well spent in your GPU’s resource budget. It’s always better to play at higher resolutions with fewer effects than the other way around.

Another big deal with these high-resolution displays is their hunger for hardware; no doubt, an old 5870 doesn’t cut it anymore for 4K gaming, not even close. Still, even a single Titan already gives you a good experience with 4K, and 4K displays are expensive enough to justify getting 3 Titans. (My personal rule of thumb: price of GPU = price of display.)

          Prices for the display will fall, and so will GPU prices.

          • KwisatzHaderach says:

A month ago I bought a Korean 2560×1440 PLS Samsung display for €250 (tax included). The thing I’ve come to love the most is not the higher resolution and higher pixel density, but the colours! Compare a TN panel alongside a PLS panel and you will never again want to go back to TN.

        • PoulWrist says:

Nothing is worse than people who go “this is good enough for me” – try to reach a little higher. A higher DPI is a major improvement in visual fidelity. You just don’t know it because you sit with your tiny, blurry monitors and go “I don’t like new things” :|

I for one am looking forward to 4K displays at a decent price point, like £500-600.

          • Sheng-ji says:

            I read this as:

            Nothing is worse than people who go “live within their means” – try to reach a little higher “spend more than you can afford”. A higher DPI is a major improvement in visual fidelity. You just don’t know it because you sit with your tiny, blurry monitors and “feed your children properly” :|

          • P.Funk says:


Maybe people would be more welcoming of the “within your means” thing if those who presumably feel this way weren’t such Luddites about the future. You know: we can’t have it today, but one day this is gonna be wicked sick?

          • Sheng-ji says:

@P.Funk – Luddite? Really? Did you read the post where I talk about the monitor I’ve just bought, the 2.5K PLS Asus PB278Q? I’m fucking excited about the tech that’s available now, that’s here ready and waiting to be used at a reasonable price! I’m not going to get excited over a resolution rise – I’ve been watching resolutions rise for the last 30 years, and every step is nice… but only nice.

Things that excite me about the future are things like the Oculus Rift, low latency wireless streaming and the Raspberry Pi generation growing up. Higher screen resolutions are coming and I will embrace them with about as much enthusiasm as I embrace camera megapixels – i.e. none whatsoever. What I will be excited about is whether the monitor or camera makes good pictures.

Dalsa created a monster with the Origin, and the fact you have no idea what I am talking about pretty much proves that you have no idea why people are all over 4K way too early and why anyone who knows just rolls their eyes. Google it, read this: link to philipbloom.net and then get back to me.

          • harbinger says:

            You know that higher resolutions will especially be helpful for something like the Oculus Rift, right?
            link to nextpowerup.com
            In fact it is one of its largest downsides for now.

            By the way RED recently announced their new 6k imaging sensor: link to brightsideofnews.com
I’m not sure why you would take as “the truth” the words of some unknown director of TV movies, with poor grammar and a habit of ALLCAPS, poorly explaining what he barely understands.

            Might have something to do with you doing the same with almost every article John Walker writes too.

          • Sheng-ji says:

            Do share the last time I ALLCAPSED, because I don’t do that – kinda makes me believe that you think I’m someone else.

            (This post doesn’t count)

Also, just because you don’t know who Philip Bloom is doesn’t stop him being one of the most influential voices in cinematography. He knows what he is talking about and he proves that by making a tonne of content every month – you’ve basically done the equivalent of saying “who is Notch and why should I listen to anything he has to say about the games industry”, except Philip Bloom has made more than a handful of successful projects.

You still spectacularly missed my point though, which was merely that you have only chosen to get excited about the resolution known as 4K rather than 8K or 6K or… insert resolution here… because video producers and cinematographers are raving about it. They are raving about it because a stupidly high resolution (but crap) camera sold a tonne of units because it could shoot 4K (badly). Everyone jumped on the bandwagon, from ARRI (they write it all caps, fyi) to Red, and now Sony, Canon et al are making raw cameras (with the help of the Magic Lantern hack) and selling a tonne of units. Only problem is, they are selling units to the general public, not pros. The general public want “the best quality possible” even though they don’t have a clue what it means or how to use it. They certainly don’t realise that getting a great picture from a camera relies on far more than the number of photosites in the sensor – the size of them, for example, is far more important. So producers who do not understand the cinematographer’s job started requesting everything in 4K. Cinematographers complied, for the reasons Philip gives. (The price of movies quadruples overnight.) Movies were shot, edited and distributed in 4K, only to be shown in a 1080 format… duh. Then the TV manufacturers created 4K TVs despite the panels costing stupid money. As demand increases the price comes down until, in about 10 years, 4K becomes mainstream.

            I’ll get excited about it then. Well, as excited about it as I am about Reds 6K sensor… yawn. Wake me up when Red bring out a product that competes with the Black magic pocket. Then I’ll get excited!

Oh and BTW – they won’t be putting 4K inside the Oculus Rift, I can guarantee it. And the consumer version is confirmed to have a higher resolution.

    • Mctittles says:

      Personally I’d rather see increased color depth come first. I fear this will never happen since marketing managed to convince half the populace that 24bit is the highest the human eye can see, but I long for the day when there is no more color banding in the sky.

      • Sheng-ji says:

Well, the problem there is that you have to spend silly money on a Quadro graphics card, no games support it and hardly any software does.

        • Rikard Peterson says:

          It’s a good thing that I still can be happy with my old 17″ 1280×1024 monitor. It’s not amazing, but it’s working well and the limited resolution / size has yet to prevent me from running a game on it. (Though some games from the nineties have problems in the opposite direction. StarTopia for example.) Sure, it’d be nice to be able to afford something like the things this article describes, but if I had that kind of money, there’s so many other things I’d spend it on first.

Edit: Oops – I meant to post this as a reply to another post in the thread, but I guess it can stay here. Good colours are more important than huge resolution.

      • identiti_crisis says:

        Developers could try using temporal “dithering” to give the impression of extra colours. Obviously, the maths would be interesting, but we already have HDR lighting which uses extended buffer depths (effectively) but renders that into a 24-bit “window”.

        You could use log-space or floating point calculations, maybe, to get the intermediate colours, and then experiment with dithering patterns; might be fun to try.

      • Don Reba says:

        Thing is, so many monitors can’t even handle those 24 bits. Most TN displays show 18-bit colour.

      • Nate says:

        It’s not the monitors that are responsible for the color banding, it’s the fact that you’re doing a bazillion shader operations that tend to limit the color depth (eg, do a 128/128/128 multiply on 1/1/1 or on 0/0/0, you get the same thing). That’s why Carmack was pushing video manufacturers way back when for internal (only) 64 bit color. 24 bits may or may not be enough for monitors, but the most significant color banding is the fault of the video card, not the monitor.

  2. Yosharian says:

Sorry, but I have to say I don’t care about this at all. As far as graphics go, art style is far more important than pixel count, and as far as games go, gameplay is far, far more important than fancy graphics. I think in this industry we need less emphasis on hardware and graphics fidelity, and more emphasis on making good games.

    I know I’m being THAT GUY, and sorry about that, but it’s my gut response to reading the article, so there it is. I have a 21/22inch monitor and I really couldn’t care less about getting a larger one.

    • The Laughing Owl says:

Higher graphical fidelity allows for all sorts of innovation when it comes to presentation. Just look at Watch Dogs, The Witcher 3, Star Citizen, The Division. Those have terrific art style and design that would not be possible without the leap in hardware in the next gen. So hardware power can be pretty important too.

Resolutions get bigger, graphics get more realistic, effects get more pretty, displays get bigger and thinner… that’s just the way of the industry since, well… forever.

      • voidburn says:

While I don’t disagree with anything said before, I would like this tech to come out only when a single graphics card, albeit top of the line, is capable of pushing at least 60fps at max detail. I cursed at my 2560×1440 27″ monitor for 4 years of gaming, and when I bought a GTX 680 9 months ago, hoping to finally return to the land of smoothness, I was completely devastated witnessing its inability to deliver.

        In the end I bought a 27″ ips panel @ 1920×1080 and yes, I had to start using anti-aliasing again, but man it finally felt so much better.

        Having gone through all this once, I promise you, I will buy a 4k display when there will be no other option on the market, cause they can suck it as far as I’m concerned! I’d rather buy a 27″ 1920×1080 panel, 120hz+, oled, with virtually no response time and no pixel persistence, AND for the same price of a 4k.

      • Stardreamer says:

        Forgive me, but much of that sounded like Marketing Buzz-Speak rather than any actual real world effect.

“Resolutions get bigger, graphics get more realistic, effects get more pretty, displays get bigger and thinner… that’s just the way of the industry since, well… forever”

So the way forward is the way it has always been? Sounds to me like a recipe for driving into walls. If there’s anything the games industry has been learning in the last few years, it’s that the cost of producing Incredi-Realism is actually harming gameplay, with the expense stifling creativity in favour of formulaic, iterative ‘product’, leading to the Indie/Kickstarter booms. People are still buying and watching DVDs quite happily even though Blu-rays have been out for years. This industry push for ever higher resolutions is really for its own sake, to sell The Next Big Thing. There’s no need beyond that. Currently, I see no need for 4K in gaming whatsoever.

        • MykulJaxin says:

          I completely agree with this. When they announced a new wave of consoles, I was flabbergasted. “Wait, you want to add more pretty stuff when we’re still struggling to make games with innovative gameplay?”
          I’m tired of formulaic releases, and would rather encourage progression in gameplay mechanics over graphical improvement. Even things as simple as camera angles and control schemes can make a difference- Look at what Resident Evil 4 did to the gaming industry. Bells and whistles are nice, but if the game is the same underneath it all, what’s the point?

          • Vandelay says:

I can’t deny that this console generation has stagnated in terms of gameplay. However, that stagnation has mainly come about as a result of the limited hardware. The lack of RAM making level sizes smaller, slow processors unable to push more complex AI and systems, and limited graphics cards unable to deal with large amounts on screen at once all contribute to games that end up playing generically (obviously, there are indie and low budget games that disprove the idea that you need high power to create intricate systems, but that lo-fi approach isn’t going to sell many AAA games).

            Of course, attempting to dive into 4K displays isn’t going to help matters either, but better hardware doesn’t just equate to more pretties.

          • TimorousBeastie says:

Yep, upwards of 5GB of RAM on hand is a massive boon to all aspects of development, including design.

          • GamesInquirer says:

Stagnation is not because of the hardware, and it has little to do with level sizes and whatnot. There are plenty of big games these days, and that doesn’t make them any less stagnant – see your average sandbox game that’s a clone of the next, and how large games have existed for a few decades, just with lesser visuals. There are also great, non-stagnant games still being released on current consoles, like Dark Souls. Or upcoming games that I see used here as arguments for the next generation, when they will also be playable, with lesser visuals, on current consoles – like Metal Gear Solid V, Watch Dogs and the like. Anyway, it’s all about publishers and developers putting the money into something that’s a little riskier and a little different from your average big budget production (assuming that’s what we’re talking about here, not your average indie game that could run on a netbook yet still provide a great experience, which also works for my argument rather than against it). Sometimes they’re willing to take more risks at the start of a new generation as they attempt to ride the hype and establish themselves as a powerful player all over again, though, which may be fooling you.

      • PopeRatzo says:

Just look at Watch Dogs, The Witcher 3, Star Citizen, The Division.

        I can’t. They’re not out yet.

        • Apocalypse says:

          Star Citizen you can, just download it from their Website. I am not saying that the game has much content, but you can watch it ;-)

    • neems says:

      I pretty much agree with both of you to an extent :)

The real problem is the never ending upgrade cycle you can get yourself into with bigger screens / more powerful GPUs. You’ve just plonked down £300 on your brand new dream GPU, and everything is going swimmingly. And you start thinking… “I could get a bigger monitor, maybe one of those 2560×1600 fellas, they look nice…”

      So you plonk down some more cash, and it looks great, and performance is… well obviously lower than it was, but it’s fine. Oh look Crysis 8. AND Battlefield 6. Wow they look good. Maybe I could go SLI? Another one of those cards would only cost me £200 now, so it’s almost like saving money…

      Apologies for horrendous overuse of the humble ellipsis.

    • Snakejuice says:

      “gameplay is far, far more important than fancy graphics”

      Why not have both?

      • Sheng-ji says:

        Your financial situation may limit you to having to choose

    • Derppy says:

      No, you wouldn’t need a huge monitor to notice an absolutely massive difference between 1920×1080 and 3840×2160.

      Anti-aliasing exists only because our displays suck and we need to mess up the picture to make the huge pixels less obvious. When we have enough pixels per inch, it won’t be a thing anymore.

      You might be happy with your current display, but people were also happy with 800×600 monitors. That didn’t stop the technology from getting better and people realizing they were wrong and the difference is massive.

In the future, when you have a 4K 144Hz OLED IPS display on your desk, you’ll look back and think how silly you were when you thought your shitty 1080p 60Hz TN panel was everything you’d ever want.

      Also, how would better physics and higher resolution be mutually exclusive? Sure, both take computational power, but there’s nothing stopping us from advancing both at the same time.

      Higher resolution benefits everyone, it’s just as useful for a random artsy flash-game as it is for a “AAA” game shooting for realistic graphics.

      • battles_atlas says:

        OR you might look back and think “games were fucking awesome then, and it didn’t hurt my enjoyment one iota that they weren’t ultro-HD.”

        This is all just bells and whistles which invents its own need for existence. No one needed an HD TV until they saw an HD TV.

        Improved graphics tech certainly creates new possibilities for games devs, but that seems largely overwhelmed by the negative impact such tech-obsession has on creativity.

        • Viroso says:

          What is the negative impact?

          • MykulJaxin says:

            It becomes more about making a pretty picture than about making a good video game.

          • Viroso says:

            Does it really though? I think that’s a myth we convinced ourselves was true back in the 90s when consoles were making the jump to 3D and PCs were making games with jaw dropping graphics at the time.

          • Wulfram says:

            All advancing graphics tech does for me is make my old games seem worse. I don’t think it enhances my enjoyment of new games very much.

            And as for stuff that doesn’t just reflect my twisted psychology, it seems like it pushes up costs to produce games, which lots of people are saying are getting out of control. I’m not against advancing technology, but I’d gladly see games feel less need to push themselves to the edge.

          • Nate says:

            The biggest negative impact of advancing technology on games development is cost of development.

            4 times as many pixels on the screen means 4 times larger textures and 4 times more detailed models. That translates to more artist hours. Larger budgets mean a shrinking stable of developers, fewer risks taken and more cooks in each kitchen.

            That sounds more doom and gloom than I mean it to– I love great graphics, I love that there are crazy big budget games, I know that imaginative, risk-welcoming independent developers exist and will always continue to exist, and I pray that someday development paradigms shift to more procedural games and tools, which would help– but it would be a mistake to ignore the impact advancing technology has had on AAA games development.

        • Apocalypse says:

          That is a strange angle to view it from, as our monitors did 1600×1200 back when HD TV was not really a thing. And I do remember that I would have preferred even bigger and better monitors.

          Though times were better for games in this regard: there was no downscaling. If your 1600×1200 monitor was too much for your dual Voodoo 2 setup (and believe it, it was), then you could still play at 1024×768. There was simply no need for “native” resolutions, as the tech worked differently in those times.

      • takfar says:

        “You might be happy with your current display, but people were also happy with 800×600 monitors. That didn’t stop the technology from getting better and people realizing they were wrong and the difference is massive.”

        And yet, it took us 15 to 20 years of steady improvement to go from an 800×600 standard to a 1080p standard, increasing resolution and improving image quality. For the industry to try to force consumers from a recently-achieved 1080p to a 4K standard in five years is just nonsense. Hardware will not be able to scale towards more complex AND higher-res graphics that quickly. So either one of them has to be left behind, or we have to begin spending four times as much on our machines (not counting the display).

      • sophof says:

        Going 4K to fix aliasing is like shooting at a mosquito with a bazooka. Sure, it’ll get the job done, but it’s slightly overkill. Most likely 2K with anti-aliasing will look exactly the same unless you press your nose against the screen, and 1080p with anti-aliasing is already pretty good, especially when you factor in that most people’s eyesight isn’t that good anyway.

        The fact that 4K looks better doesn’t mean it is the best approach. All that extra resolution is lost after it passes the human pupil, and therefore all that processing power is wasted. There is a reason no one cares about 4K for video: aliasing doesn’t exist there.

        Screen sizes need to dramatically increase for this to make sense and I doubt they will. About the only application I can think of is the Oculus Rift.

      • drewski says:

        Or, like a lot of people are with DVDs, he might be perfectly happy with his antiquated technology!

        I’m still playing loads of games on sub-HD monitors because, you know what? It doesn’t matter.

    • Viroso says:

      Just remember how it used to be. Super short draw distance, you’d sometimes try to shoot through a fence only to realize it was actually a wall with invisible strips, very low resolution meant everything had to be larger, less environmental detail.

      When tech improves it isn’t just graphics that improve, it’s everything. Certain games we enjoy today would be impossible to exist some years ago. Let me just list some examples:

      Just Cause 2, the world in that game wouldn’t be the same without the huge draw distance. Just some years before JC2 we had a game like San Andreas, with a significantly smaller map and you couldn’t see across it.

      A game like Battlefield 3 or ArmA 3. Based on your comment I suppose you don’t like these sorts of games, but anyway: at the crappy resolutions we used to play at not long ago, sniping things at a longer distance would be impossible.

      Higher resolution also means HUDs that can fit more information in less space, more screen space, cleaner look.

      Portal and Portal 2. The first game could not exist on worse hardware, not the way it exists. Basically you could forget the see through the portal effect. Portal 2 almost didn’t get ported to the 360 and PS3 because Valve was having a hard time making the fluids work on consoles, and the fluids were gameplay.

      Minecraft on the 360 cannot create a world the same size as on the PC. Minecraft, a game that isn’t at all demanding graphics wise has to be toned down to run on a console.

      Even for story telling this is important. Since more games decided to forego cutscenes and interruptions they’ve been using the environment to tell stories. All the posters in Bioshock, which are used to characterize Rapture, they would have to look a lot different on a PS2.

      A lot of games that people praise for their art style also couldn’t look the way they do on weaker hardware. Wind Waker, which still looks fantastic today, just couldn’t exist on an N64. A lot of games with cel shading, which people gratuitously praise as having great art style, wouldn’t exist on weaker hardware.

      All of those physics-based indie puzzle games: none of them could exist on hardware from 15 years ago.

      We who play video games have this knee-jerk reaction against graphics and better tech; we think they exist at opposite ends, as if there were tons of games that sell on graphics alone and have crappy gameplay. Then a new game comes out and we love it, like Shadow of the Colossus or Portal, and we forget such a game couldn’t have existed just a few years earlier.

      • welverin says:

        “Portal and Portal 2. The first game could not exist on worse hardware, not the way it exists. Basically you could forget the see through the portal effect. Portal 2 almost didn’t get ported to the 360 and PS3 because Valve was having a hard time making the fluids work on consoles, and the fluids were gameplay.”

        I couldn’t even play Portal when I first got it. The PC I had didn’t have a dedicated video card, and while the game would launch and even run, every time I stepped through that first portal the game crashed to the desktop. It wasn’t until I got a video card, which required a better power supply, that I was finally able to play the game.

        It’s funny you also mentioned Shadow of the Colossus, because when reading about the HD re-release one common thing I saw was happiness that it would be on hardware that could better handle it because the PS2 wasn’t quite up to the challenge. And speaking of that, I should probably go start that…

        • GamesInquirer says:

          I dunno about Portal, people homebrewed a pretty convincing recreation on Nintendo DS (not 3DS which is more capable and with more modern features like support for pixel shaders, the original). You could probably get something better from a professional studio but this is still cool. It even has a level editor.


          It was just a unique idea (that Valve saw and decided to adopt and evolve) at the time, not necessarily something that couldn’t have been done before. I suppose it would be impossible on a Game Boy (:P) but nobody suggested we go backwards in time, just expressed their desire for more than shinier graphics, I’ve yet to see a “next gen” game that couldn’t have been done, with lesser visuals or other less important aspects that don’t really affect gameplay, in the last 5 years or more had anyone been willing. Hell, some of the games I look forward to the most are “cross gen” like MGSV, playable on current hardware with lesser visuals yet the same gameplay. The argument reminds me of people saying the Wii lacked large scale or whatever type of games because of its power alone (the real reason was a lack of publisher interest to invest the budget, Nintendo themselves got large scale games like Xenoblade and pretty games like Mario Galaxy etc), as if we didn’t have large scale games on inferior hardware before, from Ultima Underworld, to Elder Scrolls, Gothic 2, Grand Theft Auto and so on, never mind awesome 2D games. Compromises will always exist, because of hardware, budget, skill or any other reason, hardware alone can’t be blamed for any lack of quality or fresh ideas.

          • Viroso says:

            The thing about these homebrews is that one thing is getting it done and another is having a game where it works. Still, yeah, pretty impressive. People do very impressive things on limited hardware, but I suspect they work really hard on those tricks. In the end technology does open up new paths, even if only by making them easier to tread. Opening doors.

            And I totally agree with what you said, hardware alone can’t be blamed for what comes out. That also means it isn’t a chase for fancy graphics that’s “ruining” video games, like it’s somewhat common to think, the notion that better hardware means devs won’t care about gameplay.

    • YoungSeal says:

      Just to be THAT OTHER GUY, I’ll just point out that it’s not technically the same industry making the games and making the technology to run the games better.

      There’s no one sitting down and saying ” I could make a really perfect system of gameplay right now but instead I’m going to create a better monitor. “

    • RProxyOnly says:

      “art style far more important than pixel count”

      Can we have a medal over here, please!

  3. Sheng-ji says:

    I got the ASUS PB278Q. It is not 4K but 2560×1440, and it is wonderful – the resolution, I mean, specifically, though I would say the same about the monitor. That extra res is fantastic on the desktop and appreciable in some games, though as I’m driving it with a single GeForce 770, I have to choose between res and detail, and quite often res wins.

    That being said, it’s enough – more than enough – and I think 4K at the moment is too much for the market to support. 2560×1440 should be this generation’s resolution, in my opinion.

  4. orient says:

    But…what about running indie games in native res, like La-Mulana? They’re going to be the size of a postage stamp! Also, I’m not sure pixel density inherently improves usability. Icons and UI elements still need to be a certain size to be practical.

    • Sheng-ji says:

      Any monitor worth considering can scale content to fill the screen, and how well it scales should be an important factor in choosing one. Your monitor does this, and you can prove it right now by changing your desktop resolution: if your text gets all blurry, it scales badly; if it stays crisp, it scales well.

      • GamesInquirer says:

        No monitor or TV or anything else I have ever seen scales low-resolution content well. The GPU driver’s scaling settings do a little better at overriding the monitor’s scaling, but are still far from good. For games like La-Mulana, however, which specifically use pixel art, you only need to double, triple, quadruple (or whatever) the pixels properly and they will look absolutely fine. That’s still something the games themselves need to offer as an option (and often do just that, though not likely up to 4K these days) for the result to remain a crisp approximation rather than a fuzzy, blurry mess.

        • Sheng-ji says:

          Obviously “well” is a subjective term – what’s good enough for me may not be good enough for you and vice versa. My TV does a much better upscaling job than my sky box and I had to work quite hard to persuade the engineer who installed the system to let the TV do it. He did not agree, but I am happy. As mentioned above, I have a 2560 x 1440 monitor, but I play many games in 1080 – I can’t see the difference without breaking out my macro lens so to me, this is good enough.

        • Apocalypse says:

          You remember those big old TVs and monitors without LCDs in them ;-)
          Anyway, my old Samsung 226BW did the scaling a lot better than even today’s GPUs do, so I was fine with that when older games did not support higher resolutions.

    • MattM says:

      Games like La-Mulana usually have a pixel multiplier option where every pixel in the original game is rendered as a 2×2, 3×3, or 4×4 block. This allows you to increase the size without tricky scaling algorithms and blurring. You can just pick the largest multiplier that fits in your screen and play with small black borders. In La-Mulana’s case I think it has a background image that it uses to fill the borders.
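The pixel-multiplier idea MattM describes is simple enough to sketch in a few lines of Python (illustrative only; `integer_scale` is a made-up helper, not La-Mulana’s actual code):

```python
def integer_scale(pixels, k):
    """Nearest-neighbour integer scaling: every source pixel becomes a
    k-by-k block, so edges stay sharp and nothing gets blurred."""
    out = []
    for row in pixels:
        wide = [p for p in row for _ in range(k)]   # repeat each pixel k times
        out.extend([list(wide) for _ in range(k)])  # repeat each row k times
    return out

# Pick the largest multiplier that fits the screen, letterboxing the rest.
# e.g. a 640x480 game on a 3840x2160 panel:
k = min(3840 // 640, 2160 // 480)  # height is the limiting factor here

# A tiny 2x2 "image" scaled 3x becomes 6x6, each pixel a 3x3 block.
art = [[1, 2],
       [3, 4]]
big = integer_scale(art, 3)
```

Because every output pixel is an exact copy of a source pixel, there is no interpolation and hence none of the blur that ordinary monitor or GPU scaling introduces.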

  5. talon03 says:



  6. thestjohn says:

    If 4K is irrelevant, then 3 x 4K Eyefinity is even more irrelevant, as shown here

    • nutterguy says:

      Wow, that is insane! Good to know it can actually be done now; hopefully in a year or two this will actually be (kind of…) feasible. Thanks for the link. :-)

  7. Boozebeard says:

    Well, I’ve just gotten an EVGA Classified 780, which is about as fast as a Titan, and it can only just run some of my more demanding games at 60fps on my Dell 2560×1440. I really can’t see 4K gaming being a realistic prospect for a couple of years at least!

    In fact, what I am more interested in is the 2560×1080 ultra-widescreen monitors that have been hitting the shelves. They seem great for FPS, as you can get such a massive FOV. I’m hoping that as DisplayPort becomes more prevalent they will bring out some 120Hz versions. Would be the ultimate FPS monitor!

  8. zat0ichi says:

    So I’d be looking at at least £300 for a successor to my little GTX 660 and a grand for a not-shit 4K monitor in about two years’ time?
    That’s some serious cash, so I think I and a lot of poorer gamers will probably wait about five years for it to become affordable.

    • welverin says:

      And another five years for new consoles to come out that can manage it, because even if a PC can handle it all games are going to be designed around the inferior consoles, so none of them will take real advantage of it.

      • Sheng-ji says:

        I wonder if the new-gen consoles will release 2.5K or 4K versions of their machines later in the life cycle. This gen was 8(?) years and I fully expect this one to be longer.

      • wild_quinine says:

        This generation of consoles have been designed to support 4K output, although they’ll be limited to up-scaling in most cases.

        But if you think that’s a serious issue, bear in mind that this is all the current generation of consoles have done with the current HD standard of 1080p. Almost no games apart from the most geometrically simple render at 1080p. Many render below even 720p, and upscale for output.

        This hasn’t stopped the modern PC from playing those same games with their same effects, at higher crisper resolutions.

        In 5 or so years, you can expect a roughly analogous situation: consoles pushing at or near 1080p, but up-scaling to 4K in some circumstances, with maybe one or two ‘simple’ games running at full 4K res.

        Meanwhile the PC will be running those same games, with the same effects, at drastically higher resolutions.

    • mickygor says:

      I don’t think we’ll have to wait. Most games are developed for the 720p that current gen consoles can handle, yet we have perfectly fine textures for 1080p+, because the source material for them is a much higher resolution.

  9. golem09 says:

    I was (am) a huge advocate of 1080p; it seems pretty good in all situations for me with some FXAA.
    The 4K revolution will come, but maybe in 10 years, not now. That’s when there will be affordable hardware to play games on it.

    • Low Life says:

      So it’s you we have to thank for 1920×1080 becoming the standard for monitors instead of 1920×1200 :(

      (note: joking [at least a bit])

      • Apocalypse says:

        I am getting the torches; someone should start to gather wood.
        I really miss 16:10 :(

        On the bright side, with 27″ and up 16:9 becomes a lot better.

        • Tams80 says:

          I’ll get the wood. Who’s going to get the stake and rope?

          Larger displays at 16:9 are fine, but 24″ and below aren’t. There is some hope with some of the tablets, but that doesn’t seem to be translating up to laptops (even the small ones).

  10. wild_quinine says:

    How many people saying it’s over-hyped have actually seen it in the flesh? I haven’t seen it in the flesh, but even the 1440p video content I have seen has been genuinely staggering. Probably as big a jump as SD -> HD originally was, in my opinion.

    If you think what you’ve got is enough, you’re simply accustomed to what you’ve got. If you think there’s no more detail to see, allow me to disabuse you of that belief:

    link to unrealitymag.com

    • Apocalypse says:

      720×480, a typical SD resolution, is 345,600 pixels (Wii).

      1280×720, one of those HD-ready resolutions that many people no longer consider HD, is 921,600 pixels (PS3), or 267% of SD.
      1920×1080, what most people call Full HD, is 2,073,600 pixels, or 600% of SD.

      Meanwhile, 2560×1440 is 3,686,400 pixels, 178% of Full HD,
      and 3840×2160 is 8,294,400 pixels, 400% of Full HD.

      So the step from FHD to 4K is not as drastic as the one from SD to FHD, but it is much bigger than the one from SD gaming to 720p HD gaming. In fact, the upgrade from one of those 2560×1440 screens to a 4K display is roughly the same as it was from old SD television to 720p HD television. Quite a big leap forward if you ask me.
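The ratios above are easy to verify with a throwaway Python sketch (variable names are mine):

```python
# Pixel counts for the resolutions discussed, and the ratios between steps.
sd   = 720 * 480     # 345,600   (typical SD)
hd   = 1280 * 720    # 921,600   (720p)
fhd  = 1920 * 1080   # 2,073,600 (1080p / Full HD)
qhd  = 2560 * 1440   # 3,686,400 (1440p)
uhd4 = 3840 * 2160   # 8,294,400 (4K / UHD)

print(f"SD  -> 720p: {hd / sd:.0%}")     # 267%
print(f"SD  -> FHD:  {fhd / sd:.0%}")    # 600%
print(f"FHD -> QHD:  {qhd / fhd:.0%}")   # 178%
print(f"FHD -> 4K:   {uhd4 / fhd:.0%}")  # 400%
print(f"QHD -> 4K:   {uhd4 / qhd:.0%}")  # 225%
```

The QHD-to-4K jump (225%) is indeed in the same ballpark as the old SD-to-720p jump (267%).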

  11. harbinger says:

    “The other issue is just driving the display in 2D terms. DVI can’t cope. Nor can a single HDMI link. Only DisplayPort can do it on a single cable. Even then, it needs to run in a special multi-stream mode to achieve 60Hz (trust me on this, running at 30Hz is total catastrophe) that causes compatibility problems with pre-OS imaging.”
    HDMI 2.0 was just announced recently at IFA 2013 and that’ll support 60FPS via single cable connection: link to engadget.com
    Just have to wait for new graphics cards/displays.

    “But those are utterly piffling caveats compared to the major deal breaker, which is price. The screen I’ve played with is the Asus PQ321Q (you can get my detailed thoughts on that screen over three pages in the next print edition of PC Format mag, by the by) and it costs about £3,000. Which is preposterous and makes 4K look immediately irrelevant.”
    The problem is that all 31.5″ 4K LCD monitors on the market right now use Sharp IGZO panels; the Sharp PN-K321, the ASUS PQ321 and the upcoming Samsung model basically use the same screen. There are no alternatives out there so far, but seeing as HDMI 2.0 was just made official, I’d expect some soon.

    I wasn’t particularly impressed with this first wave of screens. It’s better, sure, but the angular dependence of the colour and the overall clarity weren’t up to par with some of the upcoming Samsung or LG 4K TV selections.

    “Already, there are dodgy no-brand Korean 39-inch 4Kers in circulation, apparently for as little as $700.”
    They’re actually Chinese (the panels are manufactured by a company called TCL, which also sells TVs in the west), but a lot of the ones on sale are branded Seiki in the US, and they apparently go for as low as $550.

    • harbinger says:

      Here are some interesting further articles for anyone interested by the way:
      link to brightsideofnews.com
      Benchmarks: link to pcper.com

      When are you going to write something about the Oculus Rift by the way? I’d expect that from a “PC Hardware” guy. :P

    • Apocalypse says:

      That is, to my knowledge, right and wrong at the same time. HDMI 2.0 supports 4K@60Hz, that is right, but the support is optional, which means HDMI 2.0 devices can, but do not have to, support 4K@60Hz.

  12. Sian says:

    I hope there’ll be 16:10 screens or similar. I always feel cramped using a 16:9 display, even the big ones.

  13. Megakoresh says:

    Gonna have to say: no, 4K doesn’t really make sense for gaming. For it to become mainstream, developers must be ready to optimise their games for this type of experience. And for that to happen they must choose between chopping down graphics (which will never happen; no artist wants his work diminished for performance if he can avoid it) or upping system requirements to a ludicrous degree (which will also never happen; no one will choose fewer sales).

    Developers have never striven for resolution, aside from Eyefinity (which is not really about resolution per se). We arrived at 1080p as a result of heading towards the optimal price/quality ratio.

    And since, while I can see screens becoming cheap enough for this to work, graphics cards of that calibre will not become “mainstream” in price (we are already seeing diminishing returns in that field: a €200 difference in price meant 10 times better performance a few years back, whereas now it barely means 1.5 times the framerate), the prospect of 4K gaming going to the masses seems extremely bleak to me.

    At least with current technology iterations. There needs to be some sort of completely new tech for this to happen, like optical computers or something (and those are about a decade away from even being fully constructed with current funding in the field).

    • PoulWrist says:

      It makes perfect sense for gaming. You obviously don’t know just how great a step it is to go from a 24″ 1920×1200 to a 27″ 2560×1440. The visual fidelity of everything is improved, and you have way more room on screen for things. Not that the latter is really needed in most games anyway, since the whole new wave of stuff favours interfaces that are as small and unintrusive as possible, and titles that really require an interface generally already support some sort of customised GUI setup, like MMOs.

      • Megakoresh says:

        You obviously didn’t read my post, yet replied to it. No matter how great the visuals are, it’s never going to cost little enough to be “mainstream”.

      • RProxyOnly says:

        It makes no sense whatsoever for gaming… all it’s going to do is give people something else to wave their e-peen at and double the cost of producing AAA games. Plus it’s still going to be the same ‘artists’, so the graphics ‘design’ will still suck… but at least it’ll suck more clearly for you and people like you.

        The 4K push simply exists to drive a new product wave… it’s got nothing to do with providing actual, you know, quality.

        4K, for the next half a decade AT LEAST, will be shit anyway… you can’t AA at those resolutions, and the pixels still aren’t so small that their edges disappear… So yeah, jaggy 4K pics for a long time yet.

      • Nate says:

        Think of it in terms of opportunity cost.

        Pick one of the following: a) four times screen resolution (2^2); b) displaying on four monitors (two for peripheral vision, two central or one central and one for UI); c) using four times as many shaders for every surface

        Is a) your first choice of those three? Because in terms of video performance, they all cost the same. Actually, b) and c) are probably a little cheaper, since you don’t need more polygons or larger textures (well, with b at least) to take advantage of them.

        In terms of user hardware, a) is the most expensive.

  14. drewski says:

    I can believe it’s amazing, but I can also believe that most people don’t really care *that* much, judging from the Scrubs, Futurama and Simpsons marathons that still get pumped into everyone’s homes in SD, with nobody particularly minding.

    I also think people will be a bit wary of the Big New Thing after the last Big New Thing (3D) flopped so spectacularly.

    But on the flip side, once 4K tvs and monitors hit the sort of mass market price that makes them just tvs, rather than something special, then of course everyone will buy them, because that’s the way technology works. I think we have a bit of time yet.

  15. traumadisaster says:

    I think the price tag has irritated a lot of posters; people are promoting their perceived negatives because they can’t afford it. I can’t afford a sports car, but I don’t say it’s not practical and that it requires expensive gas and racing tires. I’ve been playing on a 50-inch 4K and a Titan at 30Hz since the spring, and it’s fun. Not for everybody, like sports cars.

  16. fish99 says:

    ” Or, like stereoscopic 3D, just another over-hyped irrelevance that’ll give you a hurty head, an empty wallet and the sneaking suspicion that the tech industry is pathologically cynical? ”

    As someone who has played 70 hrs of Skyrim in stereo 3D in the last month, and found it a ton more immersive due to the 3D, it’s not an irrelevance to me. The month before it was Batman: Arkham Asylum which again was fantastic in stereo 3D. And before that Dead Space, Thief 1 & 2, Far Cry, Kingdoms of Amalur, Fallout 3, System Shock 2, Trine 1 & 2 etc. Next up will be Arkham City and Dead Space 2 which also work great in stereo.

    I’ve had 3D Vision for about 4 years now (?) and my opinion has never wavered, that when it works well, it genuinely adds a bunch of immersion to the experience, and gets you closer to feeling like you’re there.

    I have to ask Jeremy, do you own 3D Vision and have you played your favourite games in stereo 3D? Your attitude towards stereo 3D sounds an awful lot like ignorance to me.

    • Jeremy Laird says:

      For the avoidance of any doubt, you probably won’t actually be surprised that I do indeed possess my own 3D Vision kit and have used it extensively including testing and reviewing multiple 3D Vision monitors and also several 3D Vision projectors. In case you’re wondering, yes, this does include the latest iteration with Light Boost.

      By all means disagree with me. But to imply I’m not in a position to have an opinion is probably ill-advised.

  17. WinTurkey says:

    I’ll stick with 120Hz monitors, I prefer never having eye fatigue over marginally sharper 24″ screens.

  18. Xorkrik says:

    Do you know why they call it 4K ?
    Because that is how much it costs. Ba dum bish.

  19. MattM says:

    Tom’s Hardware recently did their own overview of the state of 4K.
    link to tomshardware.com
    Their take was that it wasn’t really ready even for the enthusiast. I do look forward to it in the future, but probably in 5 years, not 1 or 2.

    • Apocalypse says:

      “But if you have a friend with more money than patience who can’t help himself, definitely spend as much time as possible gaming at his place. Sitting in front of 3840×2160 will absolutely wreck 1920×1080 for you—even if you’re used to playing across three screens.”

      From the same review.

    • harbinger says:

      “I do look forward to it in the future, but probably in 5 years not 1 or 2.”

      The new Maxwell GPUs that’ll come out next year will focus on “4K” as a feature and I’m pretty sure both GPU manufacturers are also working on driver improvements.
      And if anything, their benchmarks have shown that even some of the most graphics-intensive games run fine on SLI Titans, aside from Crysis 3 and Arma 3, which run notoriously badly anyway. :P

      I also don’t know why they focus on the dual-HDMI/MST mode so much, since the main reason it was used, way back in 2001 with the IBM T220, was the bandwidth limitation of driving such a large image over a single cable/display connector.

      With that problem largely solved, there’ll be a lot of single panels soon similar to their TV counterparts coming out in 2013/2014: link to en.wikipedia.org

      I’d be pretty safe in saying ~1 year for enthusiasts and 2-3 years for everyone else.

      • MattM says:

        I think the next few years of GPU power increases are mostly going to be used for better lighting and shader effects. The cost of a 4K screen probably will come down, but the GPUs to drive it at high settings are going to remain very expensive. As a result I predict that in three years adoption will still be really low (<0.5% among PC gamers) and the cost of a 4K gaming setup will remain above $4,000. It would be nice to be wrong, though.

  20. Carra says:

    I’m happy to finally see some movement in the monitor department. Seeing 10″ tablets with a resolution bigger than my monitor’s is just depressing. And hopefully it will give Nvidia & ATI a new push to seriously improve their cards.

    4K is not for tomorrow, but my 27″ 2560×1440 IPS screen will hopefully last me a few years more. The upgrade from my previous 22″ 1680×1050 screen made me realise what a huge difference a nice screen makes.

  21. Grape Flavor says:

    Like much of what gets posted on RPS these days, this article is flawed. In short, here’s why:

    4K is dead in the water (for now) for a few reasons:
    1: The display expense is insane.
    2: The graphics cards required to do high end gaming at that resolution do not exist, and even if they did that expense would also be insane.
    3: The interfaces of both Windows and games are not properly designed for that kind of pixel density, resulting in a poor experience.
    4: In-game texture resolution is a completely separate thing from render resolution, meaning much of the “detail” that 4K would theoretically expose does not actually exist in the game assets. Developers could increase texture resolutions but that would make the already dire performance situation even worse.
    5: Video content that takes advantage of 4K is not available to consumers in any meaningful capacity.

    Are these problems insurmountable? No, but they are pretty significant and unlikely to be resolved soon. For shits and giggles let’s compare the obstacles facing Mr. Laird’s derided stereoscopic 3D:

    1: Poor developer support.
    2: You have to wear special glasses.
    3: 3D monitors are currently limited to 1080p TN panels.
    4: Video content that takes advantage of 3D is scarce (though still better than 4K).

    I realise consumer interest in 3D has waned considerably, so the resources needed to correct these issues may never be applied. But I would say that on a technological level they are easier to overcome than the issues facing 4K, which basically requires an enormous breakthrough in both cost reduction and GPU technology itself. In contrast, affordable 3D already exists.

    So as much as I’d like to believe that 4K is just around the corner, I’m not holding my breath.

    • Sheng-ji says:

      I agree with almost everything you say but:

      “Video content that takes advantage of 4K is not available to consumers in any meaningful capacity.”

      Most movies are shot in 8K, TV in 4K, and more and more cinemas have 4K and 8K screens. It exists, but no-one is selling it on the high street.

      Oh, and I don’t agree that the article is flawed either.

    • airmikee99 says:

      The article is flawed for those five reasons, all of which were listed and covered in the article?

      While I agree with everything else you said, I’m not sure you understand the definition of ‘flawed’.

      • Grape Flavor says:

        Perhaps it was a poor choice of words. My impression of the article was that it portrayed 4K as the next big thing when there are serious obstacles to that being so. I’ll read it again.

        • Low Life says:

          That’s why it’s the next big thing instead of the current ;)

        • airmikee99 says:

          The impression I got was that the article is saying 4K could be the next big thing, but that it’s still got some time to go; basically, early adopters need to buy it up so manufacturers can reduce pricing. I viewed this article as if it were written about HDTV in 1998. HDTV existed at the time, but was still years away from wide adoption because of price and the lack of available media. Over the next decade we will see 4K become more common, but only as the problems you listed are solved.

    • harbinger says:

      1. TVs are already down to $550 and, at the low end, within reach of a lot of consumers; I believe someone further up bought a Seiki. Monitors are $3,000-4,000, which hardly constitutes an “insane expense” for brand-new technology: manufacturers price new tech high at launch because they want to make their money back. The displays were “insanely expensive” about two years ago.
      2. The graphics cards required for high-end gaming (Battlefield 3, DiRT 2/3, Tomb Raider, Sleeping Dogs) at that resolution are 2x GTX Titan at 60FPS, for apparently everything but Arma 3 and Crysis 3. For less than high-end gaming, even a single card like a Titan would suffice (BioShock: Infinite, Skyrim).
      This problem will be largely fixed with the new graphics card generation next year.
      3. It’s not as bad as you make it out to be, Windows 7 already has some features to enlarge UI elements. Windows 8.1 is specifically designed with it in mind.
      4. So what? This has never stopped anyone from playing at higher resolutions before. Go and see how games look on the Wii versus the Dolphin emulator with the same textures.
      Even games like System Shock 2, released in 1999, still look a lot better at higher resolutions.
      5. This is truer than anything else you’ve said, at least for the moment but there are still some:
      link to engadget.com
      link to rapidtvnews.com
      And as far as I know, broadcasters in Japan and Europe have already started transmitting some content in 4K and plan to expand that in 2014 (mostly using H.265/HEVC encoding).
      All the major TV manufacturers (Samsung, LG, Sony, Toshiba, Sharp) are pushing this, and hardware giants like Intel, Nvidia and AMD are doing it too. Content will follow.

      • Grape Flavor says:

        1: $550 for a 4K TV would be awesome, never heard of that.
        2: 2xTitan is still way out of reach for most people. Next gen will be faster, yes, but it’s not like the games are going to be standing still, either. Especially with the imminent launch of the new console generation, system requirements are going up.
        3: This is a solvable problem, yes. From what I’ve heard Windows 8 isn’t there yet but perhaps 8.1 will be better.
        4: Well, distant objects will be more clearly defined, basically. Which is good, but I don’t know how much people are willing to pay for that. Because jaggies can already be solved with AA and texture detail is dependent on the assets, like I said.
        5: Yeah, this will improve but right now it’s pretty dire.

        I’m not saying 4K is impossible, just pointing out some significant hurdles in the way.

        • harbinger says:

          1. It’s these: link to amazon.com. They are far (very far) from the best 4K panels I’ve seen (Samsung and LG make those), but they’re a start and will help reduce prices for better models over time. There were deals to get them for as low as $550 a few weeks ago.

          3. It’s one of the most prominent features of Windows 8.1: link to hothardware.com

          • harbinger says:

            4. Have you ever seen a 4K display in action? Basically everything you see will be much clearer, even if the textures don’t scale up. Just look at some of the games designed for the Wii and the kind of improvement you can get using a simple emulator, despite the textures not scaling up:
            link to i.imgur.com
            link to i.imgur.com
            Believe me, there’s a great improvement in picture clarity overall with 4x the amount of pixels and you will notice it especially in cases where your monitor is right in front of your face.

          • harbinger says:

            Even simply downsampling from higher resolutions, for the sake of better AA and detail, produces better results on the displays we have now.
            link to neogaf.com
            Not to mention the whole “Dark Souls was designed for 1024×720 on consoles and anything higher than that won’t make it look better” malarkey; some people wrote long explanations about it when the PC version released, and that quickly proved to be wrong.

            And even really geometrically simple and low-res texture games like this benefit from the added clarity of higher resolution rendering:
            link to u.cubeupload.com

      • RProxyOnly says:

        LMAO… streaming 4K media… yeah? On whose connection?

        It’s not the speed you have to worry about, it’s the data cap. Currently, two 4K movies would be enough to fuck 90% of the population for a month, unless of course they use the same shitty bitrate that’s used for Netflix and YouTube HD media. (Side point: what’s the point of watching HD media on YouTube and Netflix, when it looks godawful on anything that isn’t PC-monitor sized, i.e. small?)
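The data-cap worry above is easy to put numbers on. A minimal back-of-the-envelope sketch, assuming a ~20 Mbps HEVC stream and a two-hour film (both figures are illustrative assumptions, not numbers from the thread; real 4K bitrates vary widely):

```python
# Back-of-the-envelope data usage for streamed 4K video.
# 20 Mbps is an assumed average HEVC bitrate, not a measured figure.
BITRATE_MBPS = 20
MOVIE_HOURS = 2

# Mbps -> MB/s is divide by 8; seconds per movie is hours * 3600; /1000 for GB
gigabytes = BITRATE_MBPS / 8 * MOVIE_HOURS * 3600 / 1000
print(f"One 2-hour 4K movie: ~{gigabytes:.0f} GB")
print(f"Two movies: ~{2 * gigabytes:.0f} GB")
```

At that assumed bitrate, two films come to roughly 36 GB, which would indeed eat a large chunk of the stricter monthly caps of the day.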

  22. RProxyOnly says:

    I have a 46″ 1080p 3D (passive) LG LCD TV. I use it as my PC display and I’m never more than 8 feet from it (when using it). I don’t see ANY pixels, never mind big fat ugly ones.

    I think it’s just that you had a shitty TV.

  23. RPSRSVP says:

    A year ago, I jumped on a great deal on a Korean 27″ 1440p monitor to replace my 1080p Acer 20-incher. I am all for another res bump, but there are two reasons why I will stick with my hardware for probably two more years:

    1. The price is a deterrent to early adoption. Obviously we’ll be singing a different tune two years from now, but whether I go for it in 2015 will still depend on the second reason.

    2. When I made the jump to 1440p, I upgraded my GPU. An AMD 7950 was pretty much needed to keep the frame rate my old 5770 was pushing at 1080p. I am hopeful that 2-3 generations of GPUs will bring enough progress to push the new resolution standard, but I doubt a single GPU will get the job done. Maybe 2016 will bring one along, but I’m pretty sure we’ll need two GPUs to drive this resolution.

  24. Nate says:

    It seems a mistake to bill this as four times the resolution. The pixel density isn’t really any greater than the displays to which we’re accustomed. It’s just a bigger monitor that doesn’t sacrifice resolution.

    My experience with gaming on bigger monitors is that it’s nice, but there are limits to how big a monitor one would want (I say this because I am occasionally frustrated with gaming on my 24″ :) ). Those limits depend a lot on the game. All of those games whose developers irritatingly refuse to create UIs that scale appropriately with resolution are generally a joy to play on a large monitor. It sounds like Mr. Laird is focused on those kinds of games. First-person shooters? Sometimes it gets a little hard to keep track of that much screen space, and there’s only so much FOV can do to compensate without creating its own weirdnesses.

    EDIT: Obviously, that also depends a lot on how distant from the monitor one usually plays.

    • Jeremy Laird says:

      You are incorrect. The pixel density is indeed greater than that of any LCD monitor you are likely to have regular experience of, and certainly far, far greater than your 24-incher’s.

      Mobile devices have comparable or indeed much greater pixel density than a 31.5-inch 4K panel. But not PC monitors.

      • Nate says:

        Pardon me. My estimation abilities aren’t as good as I’d hoped :)

        The pixel density is high, but it’s also a bigger monitor. Part of the difference in pixel count between the Asus and my own monitor comes from pixel density, and part from the difference in screen size. The Asus is about the density you’d get by running an old 15″ CRT at 1600×1200. That density isn’t really new, although there hasn’t been enough demand to catapult it into the consumer world.

        What’s kind of new about the Asus is the delivery method, because 8 megapixels (much clearer terminology in my mind) swamps the bus.

        With gaming, there really are two different questions. Do you want to run at a high pixel density? As long as it’s computationally practical, yes (although as long as pixels are discrete, there will always be aliasing issues). Do you want to run a larger screen? Only up to a limit. Framing monitor discussions in terms of absolute pixel count confounds the two.
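The density comparison in the exchange above is easy to sanity-check. A minimal sketch, using the displays mentioned in the thread (a 24″ 1080p monitor, the 31.5″ 4K Asus, and an old 15″ CRT at 1600×1200):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

# Displays mentioned in the thread
print(f"24in 1920x1080:     {ppi(1920, 1080, 24):.0f} ppi")
print(f"31.5in 4K:          {ppi(3840, 2160, 31.5):.0f} ppi")
print(f"15in CRT 1600x1200: {ppi(1600, 1200, 15):.0f} ppi")
```

So the 4K Asus is roughly 50% denser than a 24″ 1080p panel, but only slightly denser than the old high-res CRT, which is consistent with both Mr. Laird’s correction and Nate’s CRT comparison.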

  25. Don Reba says:

    Once we get 600 dpi screens, we won’t need antialiasing. Even 300 dpi will do, if you are not too picky, really.

    • Gargenville says:

      Which is nice because the ~13k horizontal resolution this would entail on a 23 inch monitor is going to be hell on a quad-Titan setup even with AA disabled.
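That horizontal figure can be sanity-checked with a little geometry; assuming a 16:9 panel, it comes out nearer 12k than 13k, but the ballpark holds:

```python
import math

def px_at_dpi(diagonal_in, aspect_w, aspect_h, dpi):
    """Horizontal and vertical pixel counts for a given diagonal,
    aspect ratio and pixel density."""
    diag_units = math.hypot(aspect_w, aspect_h)
    width_in = diagonal_in * aspect_w / diag_units
    height_in = diagonal_in * aspect_h / diag_units
    return round(width_in * dpi), round(height_in * dpi)

# A 23-inch 16:9 panel at 600 dpi
print(px_at_dpi(23, 16, 9, 600))
```

That works out to roughly 12,000 by 6,800 pixels, or about 81 megapixels; hell on a quad-Titan setup indeed.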

      • Don Reba says:

        Hey, there have been some very promising recent advances in carbon nanotube-based computing.

  26. Gargenville says:

    I came in expecting a review of 4000 different monitors. Blatant clickbait, I demand a refund.