Week in Tech: Will It Play Crysis, & More On PS4
"one hell of a CPU-GPU fusion processor"
Ahhh, the quest for PC games with graphics so good, they look pre-rendered. When will it end? Actually, it'll end when PCs are capable of graphics that look pre-rendered. It's going to happen. And say what you want about the gameplay or narrative, but Crysis 3 is a reminder that we're getting ever closer. I also think it's worth recapping the Sony PS4 launch now that its beating, PC-derived heart has been officially revealed, and telling you why I'm increasingly convinced it's good news for the PC.
Right, will it play Crysis? The mere mention has my heart sinking. When Crysis first came out, it was breathtaking to look at. No question. I played it for at least two hours, just for the visuals.
Doesn't sound like much, but it's pretty much a record in terms of my tolerance for tedious shooting galleries. So, Crysis wasn't fun. It wasn't even very well coded. And the more you looked at it, the more the visuals felt a bit unfinished.
Ultimately, it was a pants metric of PC performance. But it became the gold standard and for years it was unrivalled for both absolute visual fidelity and homicidal levels of GPU abuse. When you think about how fast PC tech can change, it was a deeply impressive run in many ways.
Crysis 2 was a much more consoley affair. Way more polished in every regard, but dumbed down and by at least some measures, actually less detailed and less capable in graphics terms.
Now there's Crysis 3 and, dare I say, brief acquaintance suggests it might just be the best of both worlds. And like the eye-candy crack whore I so surely am, I find myself sucked into playing it just for the goggle factor.
I'm bored already, obviously. Don't ask me what I think of Crysis 3's gameplay. I'm barely aware of it and definitely don't care. And it doesn't actually pass the pre-rendered test overall. Not even close, to be honest.
Dead behind the eyes: Pixar would be proud
But, my science, it looks good. And, critically, there are elements that for fleeting moments look like the real Pixar deal. You catch a glimpse of something, maybe a face on a character, and for an instant it really does feel pre-rendered.
Needless to say, I can hardly ask you to unload £40 just to look at the pretty shaders for a few hours. But I'm a hardware guy at heart and I love seeing games being moved on in terms of graphics quality. If you can find a way of seeing it run on a powerful PC, do. Those video caps just aren't the same.
And so the PS4
As for the Sony PS4, the official launch has confirmed that the leaked specs we discussed yestermonth were bang on the money. We're talking eight AMD Jaguar cores, 1,152 GCN-style AMD graphics shaders and 8GB of actually pretty impressively quick memory.
Technically speaking, two interesting new details emerge. For starters, they've stuffed the whole shebang (well, not the memory, but the CPU and graphics) into a single chip.
I'm not sure if it fully qualifies as a system on a chip. But it's one hell of a CPU-GPU fusion processor and far more powerful in terms of graphics than any combined processor you can buy for the PC.
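To put a hedged number on that claim, here's a back-of-envelope sketch. Peak shader throughput is conventionally quoted as shaders times two ops (a multiply-add) times clock speed; the 800MHz clocks and the comparison against AMD's A10-5800K desktop APU are my own widely-reported-but-unofficial figures, not anything Sony said on stage.

```python
# Back-of-envelope peak-throughput comparison. The 800MHz clocks and the
# A10-5800K's 384 shaders are widely reported figures, not official launch
# specs, so treat this as a rough sketch.

def peak_gflops(shaders, clock_ghz):
    """Peak single-precision throughput: shaders x 2 ops (multiply-add) x clock."""
    return shaders * 2 * clock_ghz

ps4_gpu = peak_gflops(1152, 0.8)      # ~1843 GFLOPS, the oft-quoted 1.84 TFLOPS
desktop_apu = peak_gflops(384, 0.8)   # AMD A10-5800K's integrated graphics, ~614 GFLOPS

print(f"PS4: {ps4_gpu:.0f} GFLOPS, desktop APU: {desktop_apu:.0f} GFLOPS, "
      f"ratio: {ps4_gpu / desktop_apu:.1f}x")
```

Roughly three times the shader grunt of the beefiest combined processor in a PC today, in other words.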
PlayStation 4: It's a Radeon HD 7850, but not as we know it
Then there's the full shared memory space with 176GB/s of bandwidth shared between the two processor groups, so to speak.
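For a sense of where that 176GB/s comes from, and why it matters, here's a quick hedged sketch: memory bandwidth is just bus width times transfer rate. The 256-bit, 5.5Gbps GDDR5 configuration is the widely reported one rather than anything straight from Sony's mouth, and dual-channel DDR3-1600 is my own pick for a typical gaming PC comparison.

```python
# Hedged sketch: bandwidth = (bus width in bytes) x (transfer rate per pin).
# The PS4's 256-bit / 5.5Gbps GDDR5 setup is the widely reported config,
# and dual-channel DDR3-1600 is my own pick for a typical PC comparison.

def bandwidth_gbs(bus_bits, gtps):
    """Peak bandwidth in GB/s from bus width (bits) and transfer rate (GT/s)."""
    return (bus_bits / 8) * gtps

ps4_gddr5 = bandwidth_gbs(256, 5.5)   # 176 GB/s, shared by CPU and GPU
pc_ddr3 = bandwidth_gbs(128, 1.6)     # 25.6 GB/s for dual-channel DDR3-1600

print(f"PS4: {ps4_gddr5:.0f} GB/s vs typical PC system memory: {pc_ddr3:.1f} GB/s")
```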
In many regards, the PS4 is a glimpse at what PC processors will look like in a few years' time. Fusion chips like this will eventually replace the existing discrete CPU-and-GPU model, even in high-performance PCs.
The shared memory space answers one of the big questions that shift raises, which is the problem of bandwidth on fusion chips. GPUs need tonnes. CPU memory controllers aren't nearly good enough. It's also a reminder that heterogeneous computing is on the way to the PC.
Sounds like bullshit, but it really just means a big chip subdivided into areas of specialisation by workload, yet integrated in terms of programming model and memory space.
Ideally, what you want is to make some code, shove it on the chip and have it automatically run on the most effective and efficient part or parts. You don't want to muck about making special code paths, which is effectively what game devs do today for GPUs.
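If that sounds abstract, here's a toy sketch of the idea in Python. None of it is real driver, HSA or console code, and every name and threshold is made up for illustration; the point is the shape of the thing: one task description, one shared memory space, and a dispatcher that picks the right unit rather than the dev hand-writing a separate GPU path.

```python
# Toy sketch of the heterogeneous computing idea -- not real driver, HSA or
# console code; every name and threshold here is invented for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Task:
    fn: Callable           # the work to do per item
    parallel_width: int    # how many independent items the workload exposes

class Unit:
    """A stand-in for one specialised area of the chip."""
    def __init__(self, name):
        self.name = name

    def execute(self, task, data):
        # Shared memory space: no copying data into a separate pool of VRAM.
        print(f"running on {self.name}")
        return [task.fn(x) for x in data]

CPU = Unit("Jaguar cores")        # a few cores, happiest with branchy, serial work
GPU = Unit("GCN shader array")    # enormous width, happiest with data-parallel work

def dispatch(task, data):
    """The bit devs currently do by hand with special code paths."""
    unit = GPU if task.parallel_width >= 1024 else CPU
    return unit.execute(task, data)

# Same calling code either way; the 'runtime' decides where the work lands.
dispatch(Task(fn=lambda px: px * 2, parallel_width=1920 * 1080), range(4))
dispatch(Task(fn=lambda ai: ai + 1, parallel_width=8), range(4))
```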
It should make for more efficient and ultimately even more powerful and capable systems. I think we're a long way off true heterogeneous computing, but the PS4's design is a reminder that it's on the way.
The other part of the equation is the impact the PS4 will have on PC gaming in the shorter term. From what I understand, the fact that both the PS4 and the next MS Xbox are PC-based in terms of chips means that from here on in, most of the big games devs will work on PCs and then hit the recompile button for the two consoles.
It's more complex than that, but the point is that the PC makes a very natural focal point for games development now that the consoles are based on pure PC hardware. That has to be a good thing for PC gaming.
On a final note, I wanted to revisit those AMD Jaguar cores in the PS4. Like I said last time, they're horrible. Clock for clock, they probably do, I dunno, less than a third the work of Intel's Ivy Bridge CPU cores. And at 1.6GHz, they're running at roughly half the clockspeed, too.
Not to put too fine a point on it, but do the maths and single thread performance is going to blow goats. That upset me before. But I'm over it now. There are eight of them. And if devs pull their fingers out, that's probably enough to get some decent work done.
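Doing that maths with my own ballpark numbers above (a third the per-clock work, and 1.6GHz against an assumed ~3.2GHz desktop Ivy Bridge) goes something like this:

```python
# Rough sums using the estimates above -- nothing official, just ballparks.
ipc_ratio = 1 / 3        # Jaguar vs Ivy Bridge, clock for clock (my guess above)
clock_ratio = 1.6 / 3.2  # 1.6GHz vs an assumed ~3.2GHz desktop Ivy Bridge

per_core = ipc_ratio * clock_ratio   # ~0.17 of one Ivy Bridge core
all_eight = 8 * per_core             # ~1.3 cores' worth, if devs scale perfectly

print(f"one Jaguar core: ~{per_core:.2f}x Ivy Bridge, all eight: ~{all_eight:.2f}x")
```

Grim per core, workable in aggregate, provided the work actually spreads across all eight.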
What's more, it should help even things out for AMD. Its Bulldozer CPU design sucks at single-thread but is much more competitive at multi-thread. So if game devs really do crack the multi-thread problem, it will make AMD processors much more attractive for gaming PCs. In a word, yay.
Whether all this will happen in time to help AMD survive, I don't know. But it certainly won't hurt.