By Jeremy Laird on May 27th, 2013 at 1:00 pm.
Ha, sorry. Not really. But it got your attention. And there’s a thin tendril of truth in it. It’s been a busy week in hardware and in my mortal hands I hold a laptop containing AMD’s Jaguar cores. The very same cores as found in the freshly minted games consoles from Microsoft and Sony. So what are they like and what does it mean for PC gaming?
Meanwhile, Nvidia drops a price bomb of the bad kind and Intel has some new chips on the way. Read on for the gruesome details.
AMD Jaguar, then. The little lappie AMD sent out for evaluation has four Jaguar cores running at 1.5GHz. Both of the consoles are rolling eight Jaguar cores at 1.6GHz.
On a core-for-core basis, then, the AMD A4-5000 chip I have in hand (also known by its codename Kabini, if you care about that kind of thing) is extremely close to the consoles. What’s more, I’ve an inkling it might actually be close in terms of the core count available for games.
Dunno about you, but when I watched that quick-switching, multi-tasking demo for the new Xbox One, I distinctly got the impression that the trick involves keeping everything running all the time.
And that makes me think the CPU cores are going to be partitioned. In other words, games are likely only going to have access to a limited number of cores. On that note, the Jaguar CPU architecture essentially groups cores in modules of four.
The new Xbox One / PS4. Well, bloody nearly…
Communication between those modules is sub-optimal compared with core-to-core comms within a module. That only lends weight to the idea that both consoles will use one quad-core module for games and the other for everything else.
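If the consoles really do fence games onto one quad-core module, the effect is much like setting CPU affinity on a PC. Purely by way of illustration, here's a minimal Linux-only sketch using Python's os.sched_setaffinity; the core numbering and the "cores 0-3 are the game module" split are my assumptions, not anything the console makers have confirmed:

```python
import os

# Cores actually available to this process right now.
available = os.sched_getaffinity(0)

# Hypothetical split: treat cores 0-3 as the "game" module and
# everything else as the OS/apps module, per the partitioning
# theory above. Fall back to all cores if fewer than four exist.
game_module = {c for c in available if c < 4} or available

# Pin the current process (pid 0 means "self") to the game module;
# the scheduler will no longer run it on the other cores.
os.sched_setaffinity(0, game_module)
```

On a console the OS would presumably enforce this split at a much lower level, but the upshot for developers is the same: only one module's worth of cores to play with.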
If that’s true, then the thing lying on the floor next to me right now is almost bang-on in terms of the gaming CPU in the Xbox One and PS4. Is it any good?
In a word, no. Not in a gaming context. To be fair to AMD, the A4-5000 is a nice chip for its intended market – mobile devices at the cheaper end of the spectrum. It's got miles more horsepower than existing Atom chips (though Atom is due a major overhaul to its cores very soon).
But as a gaming CPU? Let me give you some numbers. In raw processing terms, these four Jaguar cores have slightly less than a quarter the grunt of a Core i5-3570K. It’s the same story on a core-by-core basis. Less than one quarter of the performance.
Really, this is no surprise. The Jaguar core is a dual-issue item running at roughly half the speed of Intel’s quad-issue desktop cores. It’s a competitor for Intel’s Atom core, not the full-fat Core, er, core. It was always going to be this way.
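You can sanity-check that "slightly less than a quarter" figure with crude arithmetic: peak instruction throughput is roughly issue width times clock speed. This ignores IPC, turbo and memory entirely, so treat it as a back-of-envelope upper bound rather than a benchmark; the 3.4GHz base clock for the i5-3570K is the one assumption not stated above.

```python
# Crude peak throughput: issue width x clock (GHz), per core.
jaguar = 2 * 1.5   # dual-issue Jaguar at 1.5GHz -> 3.0 G instr/s
ivy    = 4 * 3.4   # quad-issue i5-3570K at 3.4GHz -> 13.6 G instr/s

ratio = jaguar / ivy
print(round(ratio, 2))  # ~0.22 -- slightly less than a quarter
```

The ratio lands in the same ballpark as the measured numbers above, which is about all you can ask of arithmetic this rough.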
On the graphics side, the Kabini chip shares 3D tech with consoles, too. But the gap is pretty big in terms of functional units – just two of AMD’s GCN units to 12 for the Xbox One and 18 for the PS4. The PS4 in particular has a very different memory architecture, too, so the raw 3D rendering comparison isn’t hugely revealing.
Does make you wonder why the Xbox One is as thick as a whale omelette…
As for what it all means for the PC, well, we’ve touched on this before, but you can make the argument almost any which way you fancy. On the one hand, it rather looks like you’ll pretty much never have to upgrade your CPU to cope with the next decade of console ports. Almost any half-decent CPU you currently have will be game enough.
It might also encourage anyone who’s thinking about really pushing the envelope of gaming to focus on the PC. I just can’t see how developers are going to really ramp up game engine technology with these CPU cores. Next-gen AI, fancy physics – I don’t see how either is possible.
At this point, somebody will pitch up and opine that they’ll shove some of that onto the GPU. But the graphics grunt in the new consoles is merely OK, so where’s the spare headroom? Pinching GPU resources will limit graphical fidelity.
Then again, it might just mean nobody bothers at all and we’re doomed to suffer a future largely populated with console-compromised ports.
Nvidia and other stuff
Meanwhile, Nvidia dished the dirt on the first part of the new GeForce 700 family, the GTX 780. It’s pretty much as we discussed in posts passim, save for two details. Firstly, pricing. It’s £550 minimum, which I find pretty objectionable and robs it of much of its appeal.
Too expensive. Next!
Secondly, they’ve knobbled double precision processing. This isn’t hugely gaming relevant. It’s about preventing people from bagging 780s to use for professional number crunching. As for the rest of the 700 series, let’s assume the 780 is a harbinger of things to come and say that the technical details we’ve already been through are correct but the pricing is a lot higher.
Given that it was pricing that was the main attraction (the 700 series are not new GPUs), I’ve just gone off Nvidia’s new graphics cards in a big way. Oh well.
Finally, Intel’s new Haswell CPUs are imminent. I’ll cover them in more detail when I’ve had a proper play and the NDA has lifted. But I’ll spoil everything right now by saying that for desktop gaming it’s a case of jog on, nothing to see.
That said, Haswell might just have something to offer in the cheap gaming laptop department. Watch this space.