Week in Tech: 4K Screens, Virtual Reality, New GPU Grunt
The future of looking at stuff
Let's shake off the downbeat vibe from our last installment involving console toyboxes with some promising prospects for PC display tech. For starters, ultra high-res LCD panels look fairly likely to transition to the PC. If that's an incremental step, could virtual reality in the form of the Oculus Rift headset deliver a real-world rendering revolution? I've also got a little something known as frame latency for you to think about regarding graphics performance.
First up, is 4K tech set to hit your desktop any time soon? The driving force here is once again trends in other areas of consumer electronics.
How so? 1080p smartphones are now popping up, which means your handset will soon have as many pixels as your desktop PC. That makes little sense. Then there's the Google Nexus 10 and its pixel-packed tablet brethren bringing you a 2,560 x 1,600 grid in just 10 inches.
Google's Nexus 10 has as many pixels as a 30-inch panel
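To put those pixel counts in perspective, here's a quick sketch comparing total pixels and pixel density across the form factors mentioned. The diagonal sizes are rough assumptions for illustration, not official specs:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Diagonal sizes are illustrative assumptions
displays = {
    "5-inch 1080p phone":          (1920, 1080, 5.0),
    "Nexus 10 tablet":             (2560, 1600, 10.0),
    "24-inch 1080p monitor":       (1920, 1080, 24.0),
    "30-inch 2,560 x 1,600 panel": (2560, 1600, 30.0),
}
for name, (w, h, d) in displays.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels, {ppi(w, h, d):.0f} ppi")
```

The punchline is that a 10-inch tablet and a 30-inch monitor can share an identical pixel grid, with the tablet packing roughly three times the pixel density.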
All the while the mouth-breathing masses are generally being beaten about the head with Apple's “Retina” display marketing and 4k pixel grids are being touted as the next big thing for HDTVs. In short, expectations and comprehension of the benefits of more pixels are on the up.
So, just like the rise of IPS tech on phones and tablets presaged the appearance of cheap IPS PC monitors, could the same thing happen with high pixel densities? In some ways PC monitors have actually been regressing when it comes to resolution. 1,920 x 1,080 has largely replaced 1,920 x 1,200, and 30-inch 2,560 x 1,600 panels have dwindled in number. Not good.
But you could say the same thing about panel tech, with TN screens becoming near-ubiquitous before IPS made a resurgence. Are there any hints of a high-res revolution to come? A few monitor makers had their new 4K wares on show at CES in January. 4K in this context means 3,840 by 2,160 pixels.
4K monitors: was $50,000, now $5,000, next year $500?
Unfortunately, we're currently talking about prices around the $5,000 mark or more. Mainstream these new 4K monitors most definitely are not. But this is all about context and a few years ago such a display would have been $50,000. We're on the way, in other words.
The other reference point is, of course, the Apple 15-inch MacBook Pro with its 2,880 by 1,800 Retina display. I hate to admit it, but just like Apple woke people up to IPS, it could be instrumental in driving higher resolutions across the PC industry.
Put another way, if your MacBook has a Retina display, shouldn't your iMac have one too? And if panel makers are tooling up to make panels for Apple, they'll flog them to monitor makers, too.
More pixels than any desktop PC? That'll be the MacBook Pro with Retina display
A 4K PC monitor would deliver instant benefits, too: a much bigger desktop and the prospect of running games at insane resolutions. As for 4K TV, well, there's little to no 4K video content out there. The end.
Of course, you could argue that extra res is somewhat redundant if the underlying game assets – the geometry and textures – are of mediocre fidelity. And there's the minor matter of driving such a high resolution display, both in terms of video outputs and pixel-pumping power. The latter becomes a major problem if you combine higher resolutions with 120Hz refresh rates.
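That pixel-pumping concern is easy to quantify with some back-of-envelope arithmetic. The sketch below counts only raw pixel data, ignoring blanking intervals and link encoding overhead, so real-world requirements are higher still. Even so, 4K at 120Hz comfortably exceeds the roughly 17.3Gbit/s of usable payload a single DisplayPort 1.2 link provides:

```python
def raw_pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking and link overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (60, 120):
    rate = raw_pixel_rate_gbps(3840, 2160, hz)
    print(f"3,840 x 2,160 @ {hz}Hz: {rate:.1f} Gbit/s raw")
# DisplayPort 1.2 offers roughly 17.3 Gbit/s of usable payload,
# so 4K at 120Hz is out of reach for a single such link
```

At 60Hz the raw figure is just under 12Gbit/s, which a single cable can carry; double the refresh rate and you're near 24Gbit/s, which it can't.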
But my general feeling is that the arrival of 4K, or at least higher pixel densities in general, is going to happen over the next few years and it will be a good thing. It will push higher resolutions into smaller form factors and make existing displays cheaper. I'd like to see 2,560 by 1,600 or 1,440 in the 23 to 24-inch market and 27-inch 2,560 by 1,440 panels become truly affordable.
27 inches of Samsung glory. If only it was cheaper
And so we come to another PC-related display prospect and one that I'm instinctively reluctant to even mention. It's virtual reality. A bit like stereoscopic 3D, VR is a technology for which the theory is fab but the reality tends to suck.
But the signs are that's about to change thanks to the Oculus Rift headset. To my great regret, I've not had a chance to try the Oculus Rift, of which you will all surely have heard. But according to people who sampled the latest version at CES, it's a stunning experience.
Early dev kits of the Oculus are finally beginning to roll out
The overall immersion and the ability of the Oculus Rift headset to respond rapidly enough to head movements to generate an effective illusion have been big talking points. And they are critical. But I've also heard that another big difference is the ability of the system to generate convincing stereoscopic 3D images.
3D tech to date tends to produce a frankly piss-poor cardboard cut-out effect. By that I mean most objects in the scene appear as fairly flat objects located at different points and angles in space. So there's some sense of overall depth to the scene, but often not the objects themselves.
Well, apparently the Oculus Rift is awesome in that regard. I like the fact that it's come from a home-brew start-up, not a faceless corporation, too. I'm really looking forward to giving it a go. It's potentially a massive game changer. I know. Sorry.
Anyway, the Oculus Rift has just started production in dev-kit form so it looks like the project is truly a goer.
In other fairly recent news, you lot might like to hear about one of the latest trends in graphics performance testing. As you'll know, measurements of average frame rates are the standard metric of graphics performance.
The next step is to look at the minimum frame rate, which in many ways you'd think is the most critical measure. At worst, how slow is a given GPU when running a given game? If it's never slower than 60fps, who cares how much faster it sometimes is? You might actually care if you have a 120Hz screen, but you get the point. You can add another layer of analysis by looking at performance over time in a graph. Surely, then you know everything?
As ever, however, it's not quite as simple as that. For the full story, you need to look at what you might call sub-second performance. The problem is – or at least, can be – frame latency. The idea here is that individual frames can suffer an acute delay or latency in rendering. The overall frame rate for, say, a given elapsed second of rendering might look good. But there may have been moments when the rendering of an individual frame more or less locks up.
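To make the average-versus-latency distinction concrete, here's a rough sketch of the kind of per-frame analysis involved. The metrics are simplified stand-ins for the 99th-percentile-style numbers the Tech Report uses, and the frame times are invented:

```python
def frame_stats(frame_times_ms):
    """Contrast average frame rate with latency-focused metrics."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(n * 0.99))]  # 99th-percentile frame time
    # Total time spent beyond a 33.3ms (i.e. 30fps) frame budget
    beyond_budget = sum(t - 33.3 for t in frame_times_ms if t > 33.3)
    return avg_fps, p99, beyond_budget

# 99 smooth 15ms frames plus a single 120ms hitch (invented numbers)
times = [15.0] * 99 + [120.0]
avg_fps, p99, spill = frame_stats(times)
print(f"avg: {avg_fps:.0f}fps, 99th percentile: {p99:.0f}ms, "
      f"time beyond 33.3ms: {spill:.0f}ms")
```

The average still reads as a healthy 60-odd frames per second, but the 99th-percentile figure exposes the 120ms stutter that the average alone would hide entirely.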
GPU testing, Tech Report styleee
In practice, this is something you can see in the form of visual stuttering. It's a noticeable lack of smoothness. You can read more about it over on the Tech Report. But suffice it to say for now that the issue emerged with particular regard to AMD GPUs.
It's something AMD is now fully aware of and has produced a beta driver that in part addresses the problem. For now, as I understand it, the fixes are on a per-game basis, so inevitably not every title will have been covered. But AMD is working towards more generalised optimisations.
For clarity, do not misinterpret all this. I'm not suddenly suggesting all AMD GPUs are stuttering piles of junk. But it is an area of graphics performance that's worth knowing about and keeping an eye on when you come to your next video card purchase.