Week in Tech: 4K Screens, Virtual Reality, New GPU Grunt

Let’s shake off the downbeat vibe from our last installment involving console toyboxes with some promising prospects for PC display tech. For starters, ultra high-res LCD panels look fairly likely to transition to the PC. If that’s an incremental step, could virtual reality in the form of the Oculus Rift headset deliver a real-world rendering revolution? I’ve also got a little something known as frame latency for you to think about regarding graphics performance.

First up: is 4K tech set to hit your desktop any time soon? The driving force here, once again, is trends in other areas of consumer electronics.

How so? 1080p smartphones are now popping up, which means your handset will soon have as many pixels as your desktop PC. That makes no sense. Then there’s the Google Nexus 10 and its pixel-packed tablet brethren, bringing you a 2,560 x 1,600 grid in just 10 inches.

Google’s Nexus 10 has as many pixels as a 30-inch panel

All the while the mouth-breathing masses are generally being beaten about the head with Apple’s “Retina” display marketing and 4k pixel grids are being touted as the next big thing for HDTVs. In short, expectations and comprehension of the benefits of more pixels are on the up.

So, just like the rise of IPS tech on phones and tablets presaged the appearance of cheap IPS PC monitors, could the same thing happen with high pixel densities? In some ways PC monitors have actually been regressing when it comes to resolution. 1,920 x 1,080 has largely replaced 1,920 x 1,200, and 30-inch 2,560 x 1,600 panels have dwindled in number. Not good.

But you could say the same thing about panel tech, with TN screens becoming almost ubiquitous before IPS made a resurgence. Are there any hints of a high-res revolution to come? A few monitor makers had their new 4K wares on show at CES in January. 4K in this context means 3,840 by 2,160 pixels.

The 4K monitors: Was $50,000, now $5,000, next year $500?

Unfortunately, we’re currently talking about prices around the $5,000 mark or more. Mainstream these new 4K monitors most definitely are not. But this is all about context and a few years ago such a display would have been $50,000. We’re on the way, in other words.

The other reference point is, of course, the Apple 15-inch MacBook Pro with its 2,880 by 1,800 Retina display. I hate to admit it, but just like Apple woke people up to IPS, it could be instrumental in driving higher resolutions across the PC industry.

Put another way, if your MacBook has a Retina display, shouldn’t your iMac have one too? And if panel makers are tooling up to make panels for Apple, they’ll flog them to monitor makers, too.

More pixels than any desktop PC? That’ll be the MacBook Pro with Retina display

A 4K PC would give instant benefits, too: a much bigger desktop and the prospect of running games at insane resolutions. As for 4K TV, well, there’s little to no 4K video content out there. The end.

Of course, you could argue that extra res is somewhat redundant if the underlying game assets – the geometry and textures – are of mediocre fidelity. And there’s the minor matter of driving such a high resolution display, both in terms of video outputs and pixel-pumping power. The latter becomes a major problem if you combine higher resolutions with 120Hz refresh rates.
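To put rough numbers on that pixel-pumping problem, here’s a back-of-envelope sketch. It counts raw pixel throughput only; real GPU load also depends on the shading cost per pixel, so treat it as illustrative.

```python
# Back-of-envelope pixel throughput: how much harder is 4K to drive?
# Raw pixels per second only; real GPU load also scales with shader cost.
def pixels_per_second(width, height, refresh_hz):
    return width * height * refresh_hz

full_hd_60 = pixels_per_second(1920, 1080, 60)   # a typical desktop mode
uhd_60 = pixels_per_second(3840, 2160, 60)       # 4K at the same refresh
uhd_120 = pixels_per_second(3840, 2160, 120)     # 4K at 120Hz

print(f"1080p@60: {full_hd_60 / 1e6:.0f} Mpix/s")
print(f"4K@60: {uhd_60 / 1e6:.0f} Mpix/s ({uhd_60 // full_hd_60}x)")
print(f"4K@120: {uhd_120 / 1e6:.0f} Mpix/s ({uhd_120 // full_hd_60}x)")
```

4K at 60Hz is four times the pixel rate of a standard 1080p/60 desktop, and pairing it with 120Hz doubles that again, which is why that combination is singled out as the major problem.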

But my general feeling is that the arrival of 4K, or at least higher pixel densities in general, is going to happen over the next few years, and it will be a good thing. It will push higher resolutions into smaller form factors and make existing displays cheaper. I’d like to see 2,560 by 1,600/1,440 in the 23 to 24-inch market, and also see 27-inch 2,560 by 1,440 panels become truly affordable.

27 inches of Samsung glory. If only it was cheaper

And so we come to another PC-related display prospect and one that I’m instinctively reluctant to even mention. It’s virtual reality. A bit like stereoscopic 3D, VR is a technology for which the theory is fab but the reality tends to suck.

But the signs are that’s about to change thanks to the Oculus Rift headset. To my great regret, I’ve not had a chance to try the Oculus Rift, of which you will all surely have heard. But according to people who sampled the latest version at CES, it’s a stunning experience.

Early dev kits of the Oculus are finally beginning to roll out

The overall immersion and the ability of the Oculus Rift headset to respond rapidly enough to head movements to generate an effective illusion have been big talking points. And they are critical. But I’ve also heard that another big difference is the ability of the system to generate convincing stereoscopic 3D images.

3D tech to date tends to produce a frankly piss-poor cardboard cut-out effect. By that I mean most objects in the scene appear as fairly flat objects located at different points and angles in space. So there’s some sense of overall depth to the scene, but often not the objects themselves.

A 3D movie, yesterday

Well, apparently the Oculus Rift is awesome in that regard. I like the fact that it’s come from a home-brew start-up, not a faceless corporation, too. I’m really looking forward to giving it a go. It’s potentially a massive game changer. I know. Sorry.

Anyway, the Oculus Rift has just started production in dev-kit form so it looks like the project is truly a goer.

In other fairly recent news, you lot might like to hear about one of the latest trends in graphics performance testing. As you’ll know, measurements of average frame rates are the standard metric of graphics performance.

The next step is to look at the minimum frame rate, which in many ways you’d think is the most critical measure. At worst, how slow is a given GPU when running a given game? If it’s never slower than 60fps, who cares how much faster it sometimes is? You might actually care if you have a 120Hz screen, but you get the point. You can add another layer of analysis by looking at performance over time in a graph. Surely, then you know everything?

As ever, however, it’s not quite as simple as that. For the full story, you need to look at what you might call sub-second performance. The problem is – or at least, can be – frame latency. The idea here is that individual frames can suffer an acute delay or latency in rendering. The overall frame rate for, say, a given elapsed second of rendering might look good. But there may have been moments when the rendering of an individual frame more or less locks up.
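That average-versus-latency distinction is easy to demonstrate with a toy example. The frame times below are made up, not real benchmark data: one second of rendering whose average frame rate looks perfectly healthy even though a handful of frames take long enough to read as visible stutter.

```python
# Toy frame-time trace (milliseconds) for one second-ish of rendering:
# 95 quick frames plus 5 hypothetical latency spikes.
frame_times_ms = [10] * 95 + [90] * 5

total_seconds = sum(frame_times_ms) / 1000.0   # 1.4 s for 100 frames
avg_fps = len(frame_times_ms) / total_seconds  # ~71 fps: looks healthy

# The sub-second view: sort by frame time and inspect the slowest frames.
worst = sorted(frame_times_ms)[-5:]            # five 90 ms frames, ~11 fps each
print(f"average: {avg_fps:.0f} fps, worst frames: {worst} ms")
```

The average says everything is fine; the slowest individual frames say the game hitched five times. That is the whole case for sub-second analysis.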

GPU testing, Tech Report styleee

In practice, this is something you can see in the form of visual stuttering. It’s a noticeable lack of smoothness. You can read more about it over on the Tech Report. But suffice it to say for now that the issue emerged with particular regard to AMD GPUs.

It’s something AMD is now fully aware of and has produced a beta driver that in part addresses the problem. For now, as I understand it, the fixes are on a per-game basis, so inevitably not every title will have been covered. But AMD is working towards more generalised optimisations.

For clarity, do not misinterpret all this. I’m not suddenly suggesting all AMD GPUs are stuttering piles of junk. But it is an area of graphics performance that’s worth knowing about and keeping an eye on when you come to your next video card purchase.


  1. El_MUERkO says:

    My primary monitor is a 30″ Dell 2560*1600 display. I’ve long felt the display makers have needlessly stalled development of higher resolution screens to milk users.

With 4k and 8k televisions on the horizon I’m hoping to see the display market forced to move forward, with the knock-on effect on the PC component market as we need more power to drive all the extra pixels.

    • D3xter says:

      Here’s a list of all 4K capable displays as of now: link to en.wikipedia.org

      The Sharp 32″ 4K display with IGZO panel is the first “high end consumer market” monitor of that kind for ~$5000.

Intel is generally pushing the market forward; 2013 is mainly for expensive “Premium” displays, but we should see more mass-market adoption by 2014/2015, especially with UHDTV being introduced: link to tomshardware.com

      • Jeremy Laird says:

        Hmmm. 4K projector. *Drool*

      • El_MUERkO says:

Indeed, the 2014 World Cup in Brazil is due to be broadcast in 2k, and with over one hundred 4k TVs launching in 2013 it’ll be interesting to see what CES 2014 will have to offer.

        • ResonanceCascade says:

          2k is just a pinch over 1080p though. 4k is 4 times that. So that’s still a long way to go.

I definitely think 4k is overkill for a home theater (though it’s fantastic for movie theaters). A lot of movies used a 2k digital intermediate process, so unless they want to totally remake the movie from the original prints in 4k, the highest resolution we’ll possibly get from them is upscaled 2k.

        • D3xter says:

          2K? By definition that would be closer to 1080p, it’s standardized as 2048 x 1080.
          As I understand it, there were already test-broadcasts for 8K done to test cameras, bandwidth etc. during the Olympic games in the UK last year: link to hollywoodreporter.com and broadcast to public viewing sites around the country.

          If you want to see what the crazy Japanese are working on and how the progress towards 8K displays and content goes, this is a good programme to watch: link to news.bbc.co.uk

    • Thiefsie says:

      But why do we need 4k or 8k TV’s??

The distance we sit from them makes anything much greater than what we have redundant. Monitors, on the other hand, are set to really benefit from high resolution: just as retina displays suit phones held up close, the same jump in pixel density will be excellent for monitors viewed up close (and for Rift devices etc.)

      For a telly though… I just don’t see the point. You don’t generally sit closer than say 2m to a telly to watch a movie or TV…

  2. skinlo says:

    I personally don’t care about high resolution monitors above 1080p that much, I’d much rather have OLED type displays than crazy high res’s.

    • D3xter says:

      I wouldn’t expect them any time soon, unfortunately. Samsung and LG switched their focus from developing OLED to UHD/4K as the next “new thing”, and there’s still big problems with the yield rate:
      link to hdtvtest.co.uk
      As this article says:
      “Now, for the first time, DisplaySearch has revealed just how shockingly bad these problems are, and how much work there is to be done. Recent pilot production runs of 55-inch AMOLED panels showed astonishingly low yield rates, owing to the fragility of large-sized backplanes. Straight yield (i.e. prior to repair) came in at less than 10%, meaning that at least 9 out of 10 panels were damaged in some form or another. Mending the defective panels (either physically or electronically) merely improved yield rates to under 30% – to put it another way, more than 70% of all panels had to be discarded. Even those that do make the grade are reported to have their lifespan cut short by the bonding process which causes further instability to the organic compounds.”

      Also a nice video: link to theverge.com

    • x1501 says:

      1080p may suffice for smaller screens, but try staring at a 30” display at 1080p or lower. The overall image quality is just noticeably bad.

      • skinlo says:

        That is true, I forgot about big screens.

      • frightlever says:

        At what distance are you staring at that 30″ screen? I have a 23″ 1080P screen that sits about 22 inches away from my nose. For me, personally, it’s almost too big to game on at that distance and borderline a pain in the butt for fullscreen applications. Great for multiple windows though. I’d be swivelling my head left and right if I was looking at a 30″ screen from that distance. Am I going to see any difference between a 1080P and a 4K screen on a 23″ monitor that would justify the extra strain on the computer and the extra cost? I have no desire to go any bigger than a 23″ screen on my desk.

        I will say this, I saw the difference in going from an original iPad to the retina display on the third iPad but it’s not really comparable. Going from 1024 by 768 to 2,048 by 1,536 is profound on the same size screen. Quadrupling the resolution again would be pointless. I think the same is true for HD TVs. Unless you choose to watch your 80″ 4K or 8K TV from a couple of feet away – you’ll see little or no difference from a typical 10 to 20 foot viewing distance.

        I’m probably anomalous – I’ve had the same 1280×1024 17″ TFT screen at work for years and they’ll have to pry it from my cold, dead fingers before I let them replace it.

        Anyway, ultimately I think people are perfectly happy with 1080P and have no overwhelming desire to replace them. That’s why 3D hasn’t prompted a mass upgrade of televisions that are just a few years old. In fact most TV is still being watched at standard definitions so I’m happy enough to wait this one out.
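The viewing-distance argument above can be sanity-checked with a little arithmetic. The sketch below estimates pixels per degree of visual angle; the ~60 px/deg threshold is the commonly quoted figure for normal 20/20 acuity (one arcminute per pixel), and the sizes and distances are taken from the examples in this thread.

```python
import math

# Rough "retina" arithmetic: pixels per degree of visual angle at a given
# viewing distance. ~60 px/deg is the commonly quoted 20/20 acuity limit.
def pixels_per_degree(width_px, diagonal_in, aspect, distance_in):
    # physical panel width from its diagonal and aspect ratio
    panel_width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    ppi = width_px / panel_width_in
    # pixels spanned by one degree of visual angle at this distance
    return ppi * distance_in * math.tan(math.radians(1))

# 23" 1080p monitor viewed from 22 inches (the desktop case above)
desktop = pixels_per_degree(1920, 23, 16 / 9, 22)
# 80" 1080p TV viewed from 12 feet / 144 inches (the living-room case)
tv = pixels_per_degree(1920, 80, 16 / 9, 144)
print(f"desktop: {desktop:.0f} px/deg, TV: {tv:.0f} px/deg")
```

The desktop case lands well under 60 px/deg, so extra resolution is still visible up close, while the big TV at living-room distance is already past the acuity threshold at 1080p, which is the nub of the argument.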

    • bad guy says:

      I’m with you.

If they are having problems with yield rates of 55″ screens (because they are too big), why not start with 24″ computer screens? I’m sure a lot of PC enthusiasts would like a nice OLED screen. I no understand.

      I’ve been using an OLED for over a year now (LG 15″), and I don’t want to go back. The colours and the black are PERFECT.

      • darkChozo says:

        Don’t know how applicable it is to the OLED technology, but monitors typically have smaller dot pitch for displaying stuff like small text. Compare PC output to a 1080p TV and a 1920×1080 monitor and you’ll see the difference is pretty obvious. I’d imagine that achieving a smaller dot pitch is “harder”, so if anything they’d probably make a 24″ TV first.

        Of course, that’s basically speculation on my part. And it might just be that any OLED display of reasonable size is going to be pricey enough that only home theater enthusiast types will buy it.

    • Old Rusty Dusty says:

      I’d love to have an OLED as well, but it is nice to finally see a push for higher resolutions by the big companies. Being a PC snob myself, I ended up settling on importing one of the Korean IPS panels since there still isn’t a better alternative for the high-end gamer. 27″ 2560×1440 @120Hz is really a nice upgrade when coming from a smaller 1080P panel or even if you’re used to playing on a 32″ or 42″ TV for PC gaming.

  3. Premium User Badge

    FhnuZoag says:

    I’m rather skeptical of the push to higher resolutions. Do these really make much of a difference for games? Will I be forced to squint as developers squeeze interface elements into smaller and smaller spaces? And wouldn’t it be a pain in the arse for all web designers to up-res their interface elements and push more pixels down the internet tubes at users?

    My laptop kinda chugs when pushing out data as it is – how much worse is it going to get when it has to push out graphics at a higher res? How much longer will load times get?

    • daf says:

You really just have to look at the MacBook Retina display: the UI sizes don’t change, it all looks like the previous model, yet text looks super crisp and smooth, as do pictures.

      So you need to see high resolutions not as making things smaller but as allowing more detail and refinement to the image, with a high enough resolution AA would become obsolete because you would have enough pixel density to not be able to perceive any jagged lines.

      Unfortunately that requires the pixel count to increase keeping the physical size the same which seems to escape most screen manufacturers.

    • ResonanceCascade says:

For PC gamers, who usually sit pretty damn close to our monitors, 4k monitors would definitely make a difference. But then again, the main problems I have with the graphics in PC games wouldn’t be solved by rendering them in 4k. A low res texture and globbed on “post-processing” effects will just annoy me that much more in glorious 4k-o-vision.

      As far as the living room goes, I think 4k TVs are borderline absurd.

    • Reapy says:

I’m more about higher and steady framerates in my games than anything. I honestly can’t remember the last time I could leave AA on in a game and still get acceptable framerates. First thing I always do is drop AA and lower the resolution to 1400 or 1200, depending.

      It might just be that my eyes suck or I don’t have that much against jaggies, but again I like a nice lush 60 fps + on my action games. Other types not so much a big deal, but anything that is fast or requires twitch I do whatever I can to push it up to 60.

Having a higher resolution monitor just means I’m going to get less bang for my buck from my hardware. For desktop productivity I can see how, yeah, the more pixels the merrier, but for gaming I don’t desire more pixels as much.

      Though this day and age the ‘core gamer’ is probably a lot less of the target than we used to be for top of the line hardware.

      • MrLebanon says:

        My old laptop used to have a problem with AA… but my desky has little to no performance drop with AA.. especially with FXAA which is less resource draining

        I used to not mind having AA set to 0… but once you get attached to it, you cannot go back!

    • Old Rusty Dusty says:

I was skeptical for a long while as well, but was slowly convinced by the nice people on the 120hz.net forums a few months ago that 1440P was the way to go. I always thought 1080P was sufficient, but after recently upgrading from a 1080p panel to one of the Korean 27″ 2560×1440 120Hz IPS panels, I can safely say that the extra resolution and refresh rate really makes a big difference in games. The extra pixels really add a lot of detail, especially when looking at objects, terrain, or players in the distance, which appear clearer and far less “blurry”. Additionally, shader effects, lighting, and shadows especially are a lot more refined and detailed because pixel shaders are rendered on a per-pixel basis. You could say it’s kind of subtle to a point, but once you load a game back up on an old 1080P monitor in comparison, it’s really obvious just how much detail and clarity was missing.

Additionally, if you haven’t used an IPS panel, it’s a nice upgrade over the traditional TN: the colors really pop, and everything looks much more vibrant and true to life as well. And the 120Hz really helps to bring back the smoothness you had back in the old CRT days, if you can push that 100FPS+ framerate. 60FPS in comparison now feels sluggish to me in the same way that most would say that 30FPS is less desirable than 60.

Which brings me to the caveats: upgrading from playing games at 1080p@60FPS to 1440P@120FPS really requires serious hardware if you like to play games at max, but I suppose it’s the price to pay for all that extra detail and smoothness.

  4. MrUnimport says:

    Not quite as optimistic about the Rift as you seem to be, head tracking and stereoscopy are all well and good but the way it interacts with game interfaces means its most immersive functions may not be compatible with a large number of games. “Peering”, for example, would be difficult or impossible to reflect from the viewpoint of a second player and would muck up the third-person animations something fierce, this in an era where most FPSes don’t implement leaning. Furthermore: as an immersive tool it may well be unparalleled, but it seems like it’d decrease one’s performance as a player to have the aiming crosshair decoupled from the center of one’s vision, as it would be in a game with independent head movement. The alternative, being able to turn one’s character by looking and also with the mouse, seems like it’d raise minor issues when a player turns a corner, turns their “feet” imperfectly with the mouse, and has to hold their head somewhat off-center in order to walk in a straight line.

    Although I fear the majority of games will not put in the development time to make total immersion possible, it certainly seems like a 3D wide-angle monitor would be a fine thing to have. I’ll most likely be picking one up at release.

    • Jeremy Laird says:

Some fair points, but they don’t strike me as insurmountable. Not saying that all you’ll have to do is plug in and it’ll make all your current games look awesome.

      But I’m excited by the possibilities.

    • D3xter says:

I’m probably looking forward to it more than any hardware product right now, since if it takes off its promise is immense. They’re delivering the first ~10,000 kits to developers in March to see what works and what doesn’t, build support and gather feedback on the design.
      Obviously it likely won’t work very well with some game genres and there will be specific design choices that will have to be changed, like rethinking UI and having to improve general texture detail, but still.

“Peering” or “Leaning” your body forward, or general movement tracking, isn’t possible yet as I understand it, but they’re working on resolving that for the “Consumer Kit” version. Still, 360° head tracking, including looking sideways, already makes a big difference, I can imagine.

      – 110° Field of View (instead of 45° like in the SONY HMZ-T2 or similar) covering most of your sight
      – 1000Hz low-latency custom Headtracking solution using accelerometer/gyroscope/magnetometer to make it as fluid/realistic as possible
      – Actual stereoscopic 3D rendering (a frame being rendered for each eye)
– Cheap/affordable at around ~$300 and weighing only about 225 grams
      – Actual developer support, with the likes of John Carmack, Cliff Bleszinski, Gabe Newell etc. endorsing it, and some of them working together with the Oculus guys to integrate it into the Unreal/Unity/Source engines and a lot of people are getting excited about it
      – Lots of positive reactions from the Press at the last E3/QuakeCon/PAX/GamesCom and latest CES with everybody that tried it coming away from it impressed

      Here are some great reaction videos from the last CES (where it won several “Best of Show”) to give you an impression :P

      link to youtube.com

    • Continuity says:

Yeah, IMO it’s simply not going to work well with older games that were developed without it in mind, but then that isn’t what the Rift is for; the Rift is being shipped to developers first because Oculus are fully aware that content will have to be created for it.
Some older games will be able to be adapted to work reasonably well with the Rift, but if anyone thinks it’s going to “just work” with all their favourite games then they’re living in a fantasy land.

Furthermore, yeah, the Rift isn’t a practical interface for a competitive FPS, that’s blindingly obvious; though it’s in no way mandated that your crosshair is tied to where you’re looking, so you could still aim with the mouse and have freelook with your head… but this is all pie in the sky until developers get hold of the thing and find out what really works.

      For my money the rift, and VR in general, will be best suited to single player Adventure and RPG games, and maybe some single player FPS.

    • MiniMatt says:

      Can’t help but think 3D technologies of any flavour will only really take off once either (a) someone figures out how to integrate them with or get them to work with spectacles or (b) spectacle use declines through common acceptance of laser surgery / genetic fixes / alien technology etc.

Perhaps 3D printing will enable folks to print a prescription lens for their headset.

Speccy four-eyes form what, 30% of the population? And as the “gamer” demographic increasingly merges into the “pensioner” demographic, that proportion likely rises.

      • MrUnimport says:

        Pretty sure these ones fit over eyeglasses, or at least a fellow with them was able to demo it pretty well, if memory serves. There’s been some talk about diopters and focus points at infinity but I am far from versed in such matters.

      • Continuity says:

        I don’t know if I can put a link in here, but if not then check out the latest update on the Oculus rift kickstarter page.

        link to kickstarter.com

        Long story short, the goggles have adjustable clearance for spectacles and come with 3 sets of interchangeable lenses so if you have poor sight you can probably use it without glasses anyway.

    • Reapy says:

I am worried about motion sickness. I saw in some of the test videos how if you shifted your head off axis a bit people can get really sweaty / sick looking. As Carmack explained in his super-long lecture, our heads don’t rotate on a fixed point like an FPS camera, so when we look down we shift forward a bit and/or lean.

      Still, really the potential of the rift is new styles of games will suddenly be interesting. I can imagine a slow walking / exploring game will bring a whole new level of immersion when you can put your face down on the ground and peer around and look at things. Crazy architecture / landscapes will have a bigger impact I think when scaled to encompass your peripheral vision.

If you think of 1st person vs 3rd person in MMOs, like EQ vs WoW, there is a noticeable difference to character body language and ‘getting in your face’ between first person view and 3rd. While I like 3rd a lot, I can imagine 1st person + Oculus Rift will give you a new level of ‘being there’.

What I think will be glorious with the Rift are simulations where you are in a cockpit, like MechWarrior or IL-2 / Rise of Flight etc. You already expect not to be able to rotate your body and are manipulating joysticks to interact with the world, so that described disconnect is not there. I already got an amazing sense of immersion from a TrackIR-like device (I used FaceTrackNoIR) in IL-2, so I can imagine the Rift taking it to an entirely new level.

      I’m really excited that VR is getting worked on again, and the tech is finally catching up enough to be able to do it correctly. My biggest worry honestly is the lack of real life situational awareness you’ll have with it on, my wife will have to start throwing objects at my head to get my attention with the rift + headphones strapped on.

      • Sleepymatt says:

        You make a great point at the end there… aside from your wife’s increasing aggravation, imagine her horrific revenge when she waits for you to play a bit of Amnesia, then gently taps you on the shoulder….


        • D3xter says:

          Not only that, but someone made another suggestion in one of those videos that could be potentially more terrifying.

          Putting one of them headsets on someone sleeping and having them wake up in a world like Amnesia with some sort of monster in their face. Now that would certainly trigger some sort of release of bodily functions or worse.

          Regarding the “head movement” thing, I believe they worked on doing a neck-model so it’s not literally just the camera being on a fixed point turning up/down/left/right, but you get the impression that you are actually moving your head around.

      • darkChozo says:

        It seems like the leaning issue would be rather easily addressed in software. Just offset the two axes of rotation (I’ll call them X and Y) from the camera focal point to simulate a “head” and “neck” setup. Might not be perfect, and it might not be something that can be modded into older games, but should be possible (and if it’s doable in garry’s mod, it’s totally doable in a new game).

        Also, expect the new video game scare to be someone dying of a heart attack from the above post.
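The offset-pivot idea above can be sketched in a few lines. This is illustrative only: the neck-to-eye offset below is a made-up number, not anatomical data, and a real implementation would work in the engine’s own camera code.

```python
import math

# A minimal "neck model": rotate a fixed neck-to-eye offset about the neck
# pivot instead of spinning the camera on its own focal point, so pitching
# the head down also moves the eye forward and down a little.
NECK_TO_EYE = (0.075, 0.08)  # (up, forward) offset in metres; assumed values

def eye_position(neck_pos, pitch_down_rad):
    """Eye position after pitching the head down about the neck pivot."""
    up, fwd = NECK_TO_EYE
    c, s = math.cos(pitch_down_rad), math.sin(pitch_down_rad)
    # 2D rotation of the offset in the vertical (up/forward) plane
    up_r = up * c - fwd * s
    fwd_r = up * s + fwd * c
    x, y, z = neck_pos
    return (x, y + up_r, z + fwd_r)

level = eye_position((0.0, 1.6, 0.0), 0.0)
down30 = eye_position((0.0, 1.6, 0.0), math.radians(30))
print(level, down30)  # looking down lowers the eye and pushes it forward
```

Compared with a fixed-pivot camera, the eye now translates as well as rotates when you nod, which is exactly the behaviour being described.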

        • Reapy says:

          That would be a heart attack waiting to happen!

I think getting the head motion right is why they wanted proper head tracking. Carmack talked about using different tech to track head motion so you could get rid of that problem. I recall him saying TrackIR did a really nice job until you swiveled too far, and then it was like falling off a cliff in terms of the movement; it was better to have something with more gradual degradation.

          But yeah it is a solvable problem, but will probably require the right head tracking stuff. I just am super psyched about the rift, but I’m really susceptible to motion sickness, to the point that some games (only certain ones, jedi outcast, half life 2 @ < 60 fps, lego star wars) make me feel like heaving after playing them for a while. So any kind of thing that'll send a normal person to puking is definitely going to get me there.

  5. Dimonte says:

30-inch 2,560 x 1,600 panels have dwindled in number.

    Actually, there is at least one good reason why that happens. Most bigger consumer monitors are used not only for gaming, but also for video. And right now everything is available in 1080p. And it doesn’t scale to 2560 x 1600 at all well. Sure, you can watch most things in 720p scaled x2, but why would you do that when you can get a nice monitor for a bit less money and it will play your fancy full-hd video in native resolution?

    And another point about high-density screens on mobile devices. The screens are small. You hold those screens pretty close to your eyes. Factor in the distance and you’ll be looking at much more similar pixel densities on both monitors and mobile devices. That, and Apple successfully persuaded people that “retina display” is something that they need.

    • Jeremy Laird says:

      I disagree re the scaling. In fact, not actually sure what you mean. HD content scales very, very nicely on 2,560 x 1,600 panels. Good quality 1080p content looks better on my Samsung XL30 than any other display I have seen, regardless of the fact it’s not being rendered pixel for pixel.

      • Dimonte says:

        Well, actually, it doesn’t. Try making a pixel checkerboard pattern at 1920×1080 in any graphics editor and then scale it up to 2560×1440 using any algorithm you want. You’ll see what I mean.

        • slight says:

          Yeah but video content isn’t a pixel checkerboard. Video is usually very forgiving of rescaling to non-multiple resolutions.

          • Dimonte says:

            I agree with that, but not completely. When upscaling 1080p to 2560×1440 you get 1.(3) pixels of screen per source pixel. It’s a really, really bad number. If you have really small details in your video, you’re going to notice the difference.

          • Jeremy Laird says:

            I’m afraid you are letting the theory overwhelm reality. Have you seen 1080p video on a 2560 panel? I very much doubt it. Otherwise you’d know that your suppositions are wrong in practice.

          • Dimonte says:

As a matter of fact, I did. I have a 27″ Dell 1920×1200 monitor at home and quite recently I got a newer model Dell monitor with the same diagonal, but 2560×1440 resolution, at work. Got a couple of 1080p clips with minimal compression to test it out and immediately noticed a slight and uneven blurring. It isn’t that noticeable if you look at the screen from further away, sure, but then what’s the point of having that resolution at all?

        • darkChozo says:

          Aww, and here I was looking forward to watching the Infinite Chessboard Channel in 2560×1440. Now I’m sad.

Less facetiously, if we’re talking real video, yeah, pretty much. I rather doubt that interpolating 1080p video to a higher resolution will make things look much worse (unless your video is a still image of a brick wall or something), but it probably won’t make it look better either. Upscaling does have the advantage of acting as fake antialiasing (i.e. making jaggies less prominent via being blurry), but that’s not really that relevant to movies and such. The XL30 probably looks good because it’s a $3,500 monitor, not because it’s a higher resolution.

          (Disclaimer: I secretly know very little about imaging. I probably have a better idea of aliasing in a discrete mathematics sense than a graphics sense :D)

  6. Barberetti says:

    What’s the deal with the monochrome picture of David Hasselhoff in a dress?

  7. jrr says:

    You can fake a 4k display with four 1920×1200 displays (or yucky 16:9 1920×1080 ones), tied together with AMD Eyefinity. Dota 2 runs well at this resolution on my old quad Intel / ATI 6950. Many games don’t scale well, though =

  8. Low Life says:

    I’m hoping my 24″ display at 1920×1200 lasts until 27-30 inch 4K displays are affordable (below $1,000); until then I’ll settle for upgrading my second display and/or getting a third one. That’s likely going to take a few years, though, and hopefully by then we’ll have a solution to the problem of pushing all those pixels, too.

    • sd4f says:

      It’s going to take a long time before I depart from my 16:10 aspect ratio monitor for a 16:9 monitor.

      How is 4K content going to be delivered? Are they going to use Blu-ray, or is the internet going to be the medium of choice? I live in Australia, and our internet isn’t up to the task of delivering YouTube reliably, let alone 1080p video, never mind the pipe-dream 4K stuff.

      • D3xter says:

        Blu-ray specifications don’t cover 4K content, so I believe that might be out of the question for now, since almost all existing players wouldn’t be able to play it back.

        The first consumer-ready 4K movie-content delivery system from Sony available for their new TVs is being streamed from some sort of content server: link to blog.sony.com

        The first movies available are:
        “The Amazing Spiderman
        Total Recall (2012)
        Bad Teacher
        The Karate Kid (2010)
        Battle Los Angeles
        The Other Guys
        That’s My Boy
        Taxi Driver
        The Bridge on the River Kwai”

        On the broadcasting and Internet-streaming side, it will probably go hand in hand with the new HEVC/H.265 codec, since it’s supposed to produce the same image quality as H.264 at about half the bandwidth: link to en.wikipedia.org

  9. Rian Snuff says:

    I’d most definitely sell all my monitors to chip in for 4K at 23″.
    You can pretty much turn off AA in games, get that extra frame boost and still retain crisp edges, leaving headroom for other shiny things to be cranked up.. As far as I gather.. That and everything else you could imagine is generally much more crisp.. More workspace for doing other things..

    A lot of people on other tech sites I visit seemed to miss that concept and simply focus on 3298723″ screens! Meh.. Gimme’ dat’ pixel-packed tiny one.

    I first realized how sexy 1080p PC games look when being streamed to my 7″ tablet.
    I would imagine the effect being quite the same. It was sort of mind blowing to me.

    I also noticed doubling the in-game res in Arma II and disabling AA gave me a 30+ fps boost compared to when using my native res + AA. And it looked really, really sexy.. Not quite the same but it had effect. Sometimes I can get away with double res + 2-4x AA and, wowza.. Distant leaves and such get more clarity for sure.

  10. pupsikaso says:

    Wow, reading this is like a breath of fresh air and gives me that long-lost feeling of expectation about the future. Now we’re finally going to see some action on the PC, which has been so stale (in terms of gaming) for more than 5 years.
    We’re going to get bigger and better screens with higher resolutions and faster refresh rates, meaning we’ll be able to enjoy prettier games with better visual quality, meaning we’ll need bigger and better graphics cards, which will prompt more games to utilize all that power.
    It’s like someone’s thrown a bunch of nitro into the virtuous circle to give it a boost.

  11. Didero says:

    So why are these things called ‘4k’?

    • darkChozo says:

      link to en.wikipedia.org

      Basically, it’s anything with a horizontal resolution of about 4,000 pixels. Using that terminology, 1080p (1,920 pixels across) would be about 2K, so 4K is about twice the linear resolution (and four times the pixels) of the current standard.

      • MrUnimport says:

        Nice to be briefed on the new buzzword early. I mean, I was getting tired of referring to high-res displays as “really really HD”.

  12. Ernesto says:

    Great! So we can finally ignore AA and all that related GPU-eating stuff and instead concentrate on pushing more pixels. 4K is a lot of pixels, but hopefully cheaper than AA, speed-wise. That’s the whole point of using 4K displays in gaming, right?

    • PedroBraz says:

      No, the point is to get you to buy something you don’t need.

    • LionsPhil says:

      Pumping more pixels is comparable to doing AA by oversampling, which is the “proper” way to do it. (It’s the way that actually deals with aliasing effects, not just “jaggies”, and will make sense to signal processing geeks who use terms like “Nyquist frequency”.)

      Unfortunately, gaming has ended up developing “fake” ways of doing AA instead, because proper oversampling is computationally expensive compared to smearing everything with vaseline.

      So this probably isn’t going to do wonders for your framerate.
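      (A minimal sketch of the oversampling approach described above, assuming a simple 2x2 box filter for the downsample; real SSAA/MSAA resolves are fancier, but the principle is the same:)

```python
# AA by oversampling (SSAA), sketched: "render" at 2x resolution in each
# axis, then box-filter every 2x2 block down to one output pixel. This is
# why it's expensive -- 4x the pixels get shaded for a 2x2 supersample.

def downsample_2x(img):
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

# A hard diagonal edge rendered at 2x (1 = covered, 0 = background)...
hi_res = [[1 if x < y else 0 for x in range(8)] for y in range(8)]

# ...resolves to intermediate greys along the edge instead of a hard
# 0/1 stair-step, which is exactly what softens the jaggies.
lo_res = downsample_2x(hi_res)
print(lo_res[1])  # [1.0, 0.25, 0.0, 0.0]
```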

      • jrodman says:

        On the plus side, 2x the pixel density will probably look better than 2x oversampling for approximately the same work.

    • Old Rusty Dusty says:

      Unfortunately, running at 4K will have a much, much greater impact than just turning up, say, AA at 1080p or even 1440p. It all comes down to performance scaling linearly with the number of pixels: 1920×1080 = ~2.1 million pixels, 2560×1440 = ~3.7 million pixels, and 3840×2160 (4K) = ~8.3 million pixels. I recently upgraded from a 1080p to a 1440p monitor, and subsequently my frame rates dropped by nearly half in all games, which makes sense since that’s close to double the number of pixels. And as you can see, 4K is 4 times the pixels of 1080p, so in any modern game you would need 4 times the graphics hardware to achieve the same framerate. (There are exceptions, but that’s mostly with much older games that have limited shaders and graphical bells and whistles.)

      In comparison, running AA in most games generally results in a ~25-50% drop in framerate depending on the game. Furthermore, many games are now using new types of AA that have little to no framerate impact, with similar visual quality to traditional AA. For example, FXAA and SMAA both have an FPS impact of maybe 0-2%. FXAA is a bit blurry for my liking, but SMAA is really quite nice. There is actually a free utility you can use for just about any DirectX game, called the SMAA Injector, that allows you to force this type of AA in the game. In most comparisons it looks just as good as 4x MSAA (and better in cases of foliage/grass), without any real performance impact. You can find that tool here if you’re interested in boosting your FPS by using SMAA instead of a game’s built-in AA: link to mrhaandi.blogspot.com
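      (The pixel arithmetic above is easy to check. This assumes, as the comment does, that GPU cost scales roughly linearly with pixel count, which is a rule of thumb rather than a law:)

```python
# Pixel counts behind the numbers above. Relative GPU cost is approximated
# as the pixel ratio vs 1080p -- a rough rule of thumb, since real
# performance also depends on geometry, CPU load, and so on.

resolutions = {
    "1080p":  (1920, 1080),
    "1440p":  (2560, 1440),
    "4K UHD": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} Mpx, {px / base:.2f}x the pixels of 1080p")
# 1080p:  2.1 Mpx, 1.00x
# 1440p:  3.7 Mpx, 1.78x
# 4K UHD: 8.3 Mpx, 4.00x
```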

  13. Lev Astov says:

    Good writeup. I’m really psyched for the Oculus Rift. I signed up for the dev kit, so I’ll be sure to parade it around the comments when the time comes.

    Also, where’s the staring eyes tag? Or better yet, the terrifying eyes tag…

  14. Xardas Kane says:

    4K TVs are the most pointless, stupid and overhyped advancement in TV technology this side of LED. 4K PC monitors, though, I can certainly dig: a welcome and necessary change. As for the Oculus Rift, I won’t believe it’s worth it till I check it out myself.

  15. darkChozo says:

    Not to restate the obvious, but having recently purchased a 27″ 1080p monitor, I’d say price is going to be a huge sticking point here. I’ve got a midrange PC (~$1000), and with 1440p monitors starting at like $500 (more like $700 if you’re looking for something a bit less sketchy), it doesn’t really make sense to spend so much, particularly when I’m going to have trouble pushing that many pixels in some games anyway.

    Then again, I suppose high end monitors aren’t aimed at my price bracket anyway…

    Also, the Oculus Rift is cool as always. Maybe we’ll actually get VR right for once; press reactions have been promising so far.

    • MrUnimport says:

      As >1080 monitors enter the mainstream I imagine we’ll see a corresponding drop in prices.

  16. pingu666 says:

    the Oculus Rift would/should be great for sim games, like driving, flying etc, where you really would be physically static but moving your head, so you could easily look around the cockpit.

    combine that with a Kinect-like device which tracks hand/body movement and you could have a game world where you pick stuff up and manipulate things, or mime pressing controls or something.

    or the hands in game might just mirror what your real hands are doing, how you’re holding the steering wheel for example

  17. SighmanSays says:

    I’ve looked through these comments and I can’t help but feel that I’m the only one who realizes that modern graphics cards simply aren’t up to the task of 4K resolutions at playable frame rates, particularly for multiplayer. It’s a 4x jump in pixels to push. Even SLI setups will be hard pressed for some years yet.

    • LintMan says:

      Well, it will still be at least a few years before 4K displays are generally affordable, and it will likely also be some time beyond that before games start to really take advantage of those higher resolutions, so there will be quite some time for graphics cards to catch up. (And even then, gamers can always do what they’ve done in the past when their video card can’t keep up: run at a lower resolution.)

      The near-term good news here is that this stuff might start to spur greater mainstream availability and popularity of 1440p and 1600p displays.

    • Low Life says:

      Except that it’s really not far away at all. A single GeForce 680 can already run BF3 at 2560×1600 @ 60 fps on highest settings without AA (link to media.bestofmicro.com). That’s about half the pixels of a 4K-class resolution, so even if we assume worst-case scaling (there’s much more to performance than the number of pixels) we’d get 30 fps, and that’s at highest details.

      In fact, there are quite a few people already playing at comparable resolutions using Eyefinity or the Nvidia equivalent.

      Either way, it’s still going to take a couple of GPU generations until the 4k display prices drop to reasonable levels, so we’ll have some time to get ready for that, too.
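      (The worst-case estimate above can be reproduced with the same linear-in-pixels assumption; it’s a floor, not a forecast, since not all of a frame’s cost scales with resolution:)

```python
# Worst-case framerate estimate assuming frame cost scales linearly with
# pixel count (pessimistic: geometry and CPU work don't scale with pixels).

def scaled_fps(fps, src, dst):
    return fps * (src[0] * src[1]) / (dst[0] * dst[1])

# 60 fps at 2560x1600, extrapolated to 4K UHD (3840x2160):
est = scaled_fps(60, (2560, 1600), (3840, 2160))
print(f"{est:.0f} fps")  # ~30 fps, matching the estimate above
```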

      • Old Rusty Dusty says:

        On the other hand, as newer more intensive games are released, you do really start to feel the squeeze. For example, it takes 2 SLI’d GTX 680’s to run Far Cry 3 maxed at ~60FPS at 1440P, and one doesn’t even come close to cutting it. I know this from experience.

    • Rian Snuff says:

      I can already play a lot of new games fully maxed, shy of full AA and perhaps some ambient occlusion, on my three-year-old 6950s at respectable frame rates at three times the resolution of 1080p. Seeing as 4K is ’bout double that of 1080 and nullifies the need for AA, I’d have to disagree with you.

      • Old Rusty Dusty says:

        I hate to be a stickler, but it is actually quadruple the resolution in terms of pixels: 1920×1080 = ~2.1 million pixels, and 4K (3840×2160) = ~8.3 million pixels. You could say it’s “double” in terms of doubling the horizontal and vertical pixel counts, yes, but the net effect is a 4-fold increase in pixels, which means you’ll need 4 times the graphics processing power to achieve the same FPS.

  18. LintMan says:

    I’m really hoping that 4K and the proliferation of high-res portable displays will finally free us from the unending tyranny of the 1080p “HD” monitor we’ve been stuck wallowing in for 14+ years now, by making consumers realize that “1080p” is NOT high resolution for a computer display.

    Seriously, 14+ years we’ve been stuck at this pixel count! I had a 1600×1200 monitor (a CRT!) back in 1998. That’s scarcely less pixel area than today’s standard 1920×1080 display.

    • D3xter says:

      Yes, it’s sad :P
      Carmack was coding Quake on a 28″ 1080p display back in 1995: link to geek.com
      And IBM released their first 4K monitors/displays back in 2001: link to theinquirer.net
      Why this didn’t get mass-market appeal at that point is obvious: the processing power for most of the content (including games and videos) wasn’t exactly up to snuff yet.

      Soon after, IBM gave up their hardware-manufacturing ways and sold most of their PC business to Lenovo, and we’ve since been on a downward curve toward the cheapest possible lowest-common-denominator TN displays, often without even high-quality alternatives on offer.
      It’s sad that it took the likes of Apple, their marketing and the mobile industry to finally be ready to move on.

  19. Hoaxfish says:

    I think the Rift will be pretty exciting, but a screen will always be more convenient (for talking to other people, for watching tv at the same time, for multiple people watching the same thing, not having to take it off/put it on, etc)

  20. PopeRatzo says:

    Game developers can barely keep the framerate up in 1920×1080 FPS games without nuclear-powered hardware. They’re not close to ready for 4K monitors in mainstream gaming.

    And with developers favoring consoles and hardware manufacturers going all-in on integrated graphics (something that was actually celebrated on these pages, BTW), I think it’s going to be a while.

    What’s the big incentive for developers to start to make games look good on 4k?

    • Dominic White says:

      A mid-range PC should run most things maxed out at 1080p, if you’re willing to trade FSAA for SMAA. An i5-3570k and a 560ti should do just about anything on the market right now.

  21. essentialatom says:

    The cardboard cut-out effect you see with some stereoscopic 3D is not to do with the quality of the display, whether active or passive specs, a glasses-free display or the Oculus Rift. It’s to do with the field of view – the longer a lens, the more foreshortened the perspective and thus the flatter everything looks. When depth is added to such an image, things tend to look more separated and individually flat. This is something content creators need to figure out and get to grips with.

    It’s a worse situation in cinema because fairly long-lens photography is very typical, whereas in games FoV is more malleable and tends to be shorter in order to envelop the player (thinking of FPSs in particular). The best 3D films often don’t have the same visual feel as 2D films, as they use shorter lenses more in order to provide more natural 3D. (They also tend to cut less frequently, among other things, and some films with really good 3D don’t follow any of the rules, but that’s another conversation.)

    The extreme of this in cinema is IMAX, in which the geometry of the cinemas themselves is tightly controlled so as to keep the audience within a given range of viewing positions relative to the screen, which means that you can fine-tune the 3D to provide a much, much more natural and astonishing experience. If you’ve ever been to an IMAX to see one of their 3D science or nature films you’ll know what I mean. To bring it back to games, hopefully the Oculus Rift can do the same kind of thing in acting as both the method for achieving extreme quality and as a standard by which 3D experiences in games can be created and evaluated.

  22. Curly says:

    Who are “the mouth-breathing masses” and why are you insulting them? Please be nicer to people in your articles.

    • FhnuZoag says:

      I’m mouth breathing pretty hard right now. I’ve got a blocked nose though because of a cold.