Graphics, graphics, graphics. It’s all you lot care about. Actually, it’s what I care about most when it comes to PC performance. So why fight it? Instead, I’ve got a couple of graphics-related titbits for you this week. Firstly, I’ve had a chat with Intel’s graphics guru, Richard Huddy. Odds are, you’ll be gaming on Intel graphics one day. What’s more, the mere fact that Intel has snapped up the likes of Mr Huddy, previously known for his dev-rel uberness at ATI, when there was an ATI, is symptomatic of Intel’s increasingly full-on attitude to graphics. The other part of this week’s awfully exciting package is NVIDIA’s new GeForce Experience. It’s an automated game settings optimisation tool. The idea is to take the headache out of graphics settings and give you the holy grail of PC performance and visuals with console levels of setup pain, which is to say zero pain.
First to Intel and its integrated graphics. We’ve previously covered the fact that Intel has been revving up its processor graphics cores in recent generations and that next year’s Haswell chips will take that even further. But just how serious is Intel about bona fide high performance gaming? And can it really deliver?
On the hardware side, I don’t doubt that Intel can wheel out big improvements in raw graphics hardware capability, generation upon generation. It’s doing that already. In recent and coming generations of Intel CPUs, most of any given increase in transistor budget has been spent on increasing graphics power, not beefing up existing cores or adding new ones.
It’s the software, silly
What I’m not so sure about is Intel’s ability to get a grip on the software side of the equation. “We’ve come a long way in the last three years,” says Huddy. “The difference now is that we’re absolutely focussed on delivering a complete solution of hardware and software.”
Of course. But how, pray tell, will that actually be delivered? After all, Intel’s track record for graphics drivers isn’t exactly stellar. Hell, AMD gets a kicking for its graphics drivers and, for most of human history, they’ve been on a different planet from Intel’s drivers.
A big part of the solution is embodied by Huddy himself. He heads up a team of engineers working with game developers across Europe. He’s been around the block doing the same job for ATI and AMD. And he knows what he’s doing. “Regular driver releases are critical,” he says, “as are improved responsiveness to game developers and quick turnarounds on driver bugs.”
Huddy serves up an example in which developer Avalanche was wowed by a demo of Just Cause 2 running on Intel integrated graphics and is now keen to work more closely with Intel. But if I’m honest, I don’t get the impression that a huge amount of per-title optimisation has been done already.
3x Richard Huddy. From his ATI days. Because it needs to be 600 pixels wide
But then Huddy has only been with Intel since the beginning of the year. Anywho, this is not immediately dramatic news. But given the trouble AMD is in, it’s worth knowing that Intel is making all the right noises about driver quality and developer relations and that it’s got the right people working on the job. Frankly, it’s a good sign that it wants to get the message across to you guys in an interview.
I’ll be intrigued to see whether Haswell chips really do deliver a half-decent gaming experience. In the meantime, it would be interesting to hear if any of you lot do significant gaming on Intel graphics. If you’ve got feedback, particularly regarding driver quality and consistency, let it rip. There’s a good chance Intel will be watching. You never know, one day integrated graphics might not suck.
It’s an experience, alright
This week’s second hot topic is GeForce Experience. As I said above, the idea is PC performance combined with console ease of use. Functionally, that means two things: automatic optimisation of game settings and driver management.
On paper, it’s a bloody good idea. Personally, I get no joy out of mucking about with graphics settings. The idea that somebody has done all the work for me sounds just peachy.
NVIDIA says playing at ridiculous non-native resolutions and craptastic graphics settings is surprisingly common because people simply install a game and fire it up without touching the settings.
I doubt too many RPSers fall into that camp. But I bet plenty of you would like a helping hand with achieving the best settings. So how’s it done? Much of it is in-house testing at NVIDIA, involving what it claims are thousands of hardware configurations. There’s an element of user feedback-cum-crowdsourcing, too, so it won’t just be NVIDIA decreeing settings. Gamers will inform the process.
GeForce Experience doing its one-click optimisation shizzle
It’s worth noting that both image quality and frame rates are targeted and the latter varies according to game type. In other words, twitchy shooters are optimised for higher frame rates than brainy strategy games. Oh, and it only works with NVIDIA graphics cards in Fermi and Kepler flavours, which for the most part means GeForce 400, 500 and 600 Series boards.
Usefully, the interface shows you what settings it’s gone for, so you can also use it as a learning tool. Right now it’s in beta with a limit of 10,000 users and 30 game titles supported. It’s only just been released and I haven’t had a chance to have a go. But if a few RPSers would like to take it for a spin, shout out and I’ll put the request in.