
Week in Tech: AMD's Mostly Not-New Graphics

Mantle Piece


All of this has happened before. And all of it will happen again. AMD has just launched its latest family of 'new' graphics boards and I feel like number two's been whispering portentous, spacey waffle in my ear. The spec lists for the new boards are shot through with galactic levels of déjà vu. But before you get completely bummed out by what mostly amounts to a major bout of rebranding from AMD, there's a wildcard in the form of this weird new thing called Mantle. It might – just might – give AMD GPUs, including every Radeon HD 7000 already in existence, an unassailable performance advantage in the bulk of new games over the next few years.

Those 'new' AMD chips in full
I'll keep this bit as snappy as possible. On paper, we have an entirely new family of graphics cards accompanied by new branding. Out goes Radeon HD, in comes Radeon R7 and Radeon R9. From here on, R7 means cheaper mainstream graphics cards and R9 the pricier performance boards.

As things stand, five GPUs have been announced, two R7s and a trio of R9s. But here's the thing. Only one of these GPUs is actually new. The rest are rebranded versions of the existing Radeon HD 7000 Series family.

So, it's dusty old chips cynically rebranded? That's certainly one way to look at it. But first, let's get the boring bit out of the way and identify exactly what we're dealing with here.

There's a low-end R7 250 we don't need to worry about. The relevant stuff starts with the R7 260X. It's based on the Bonaire GPU as found in the Radeon HD 7790. Next up is the R9 270X, which is Pitcairn reborn, otherwise known as the Radeon HD 7870.

Then there's the R9 280X. Here we're talking Tahiti and thus HD 7970. Throughout, the detailed specs appear to be identical to the outgoing boards. Clocks, shader counts, the works: it all looks very familiar.

7970 begat 280X. Actually, more accurately they're one and the same.

The only one of the above I've had a play with is the 280X and I can confirm it's a dead ringer for the 7970. Move along, nothing to see.

But wait, there is something new
That leaves us with the Radeon R9 290X. Finally, we have something that's actually new, a GPU known as Hawaii. The nasty NDA hasn't lifted on that, so full details aren't yet in the public domain. But the overwhelming weight of opinion says 2,816 shaders (up from 2,048 in AMD's previous best), a meaty 512-bit memory bus, a doubling of ROPs compared with the 7970 (or 280X, if you like) and, again, a rough doubling up on tessellation hardware. The 290X is going to be fugging quick, make no mistake.

What the 290X is not is a major step forward in architecture. It's basically a GCN chip (that's AMD's current graphics architecture) with a few detailed tweaks. So, essentially the same core architecture as the 7000 Series and, of course, the two new games consoles. This may well turn out to be a very good thing for reasons we'll come to in a moment. But even the 290X isn't a truly new chip. Think of it as a 7970 with some added horsepower.

All of that raises an immediate question. Why? It really boils down to two things. First, there's no viable new production node available for AMD to use. The TSMC 20nm node apparently isn't quite ready for beefy graphics chips, so the 290X remains a 28nm effort, as do the rest.

Secondly, with both XBone and PS4 being based on GCN, there's a strong argument for keeping AMD's PC chips closely aligned with those consoles. Hold that thought.

The sordid matter of money
What it all comes down to, when it comes to taking a view on how attractive these 'new' boards are, is pricing. With a new flagship GPU and a bunch of rebrands, you'd think the 290X enters up top and everything else falls down a peg in the price list. If that were the case, I'd be cheering the new chips to the rafters.

It's not like existing games make exhaustive use of GCN's features. So I'm happy to see it stick around for a while, especially as it's the target architecture for developers courtesy of the consoles.

AMD carrying over its GCN clobber may not be a complete bummer when you consider the console connection.

So, what I was really hoping for with the new GPUs was a return to AMD's fleeting strategy of maximum gaming grunt at around £200. That's what AMD gave us with the old Radeon HD 4870 Series and that's what it promised to keep delivering.

The 4870 gave you 80 per cent-plus of the performance of Nvidia's best for miles less money. But AMD ditched that promise within a generation and since then performance GPUs have been getting pricier and pricier. It's now frankly out of control, with Nvidia charging an idiotic £800 for Titans.

As I write this, the typical UK entry price for the 280X looks like roughly £230 to £240, which is at best the same as 7970s have been going for of late, and therefore an utter fail in terms of bringing existing tech down in price. This makes me very unhappy.

As for the 290X, with any luck pricing will undercut both the Nvidia Titan (not hard to achieve) and the Titan-derived GTX 780. At a guess, £400 to £450, then. Then again, if the 290X turns out to be really quick, it will probably be more expensive.

Of course, it's worth pointing out that Nvidia did much the same thing as regards rebranding with the GTX 700 series earlier this year. In fact, it didn't add even one new GPU. It just reconfigured and rebranded. But it did give us more performance for less money, even if its pricing towards the high end remains pretty offensive.

In the end, I can only hope that UK pricing for the 280X adjusts down a bit and in the meantime, point you guys in the direction of the cheap 7970s that are available for as long as stocks last. Scan, for instance, has an Asus 7970 clocked at 1,010MHz for £220. If I had to buy a board today, that's where my money would go. Remember, it's the same chip as the 'new' 280X. You're not buying into a defunct architecture.

The console connection
The final ray of hope in all this is that Mantle thing I mentioned earlier. There's a lot of debate over what exactly Mantle amounts to. AMD is being a bit cryptic. And frankly I don't want to get bogged down in the details. But what it might be is this: the low-level graphics API pinched from the XBone and brought to the PC.

AMD's Mantle: Low-level API. High concept stuff.

If it really is just that, the impact could be pretty dramatic. As you guys know, consoles tend to punch above their weight as regards graphics performance. And that's because developers write low-level code that hooks directly into the hardware. It's worth the effort for developers because the consoles are a static target that sticks around for years.

On PC, coding tends to be done at a higher, more generic level that works across the broader spectrum of GPUs inside PCs. But with GCN being in both consoles and sticking around in PCs, you have the tantalising prospect of PC hardware operating with efficiency and overheads similar to those of the consoles. Put another way, the very same low-level code that runs fast on consoles will run fast on PCs, too.

In theory, that could give AMD chips a major advantage with any game that's developed in parallel on PC and console. One really intriguing possibility is that Mantle might even help AMD's CPUs. Part of what Mantle is claimed to do is massively reduce the CPU overhead from draw calls. If so, we could be in a situation where CPU performance becomes much less critical for in-game frame rates when running AMD GPUs.
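To make that draw-call point a bit more concrete, here's a toy model. To be clear, this isn't Mantle code or anything from a real graphics API; the function names and numbers are made up purely to illustrate the difference between paying a chunk of driver-side CPU work on every single draw call and paying it once for a pre-built batch of commands.

```c
/* Toy model of draw-call CPU overhead. Nothing here is real Mantle or
 * Direct3D code; validate_state() simply stands in for the per-call driver
 * work a high-level API does, while the "thin" path does that work once
 * and then replays a pre-built list of cheap commands. */
#include <stdio.h>
#include <time.h>

#define DRAW_CALLS       100000
#define VALIDATION_WORK  500      /* arbitrary units of driver-side work */

static volatile unsigned sink;    /* stops the compiler optimising the loops away */

static void validate_state(void)  /* stand-in for per-call driver overhead */
{
    for (int i = 0; i < VALIDATION_WORK; i++)
        sink += i;
}

static void submit_draw(void)     /* the actual submission is cheap either way */
{
    sink += 1;
}

int main(void)
{
    clock_t t0 = clock();

    /* "Thick" API: the driver re-validates state for every draw call. */
    for (int i = 0; i < DRAW_CALLS; i++) {
        validate_state();
        submit_draw();
    }
    clock_t t1 = clock();

    /* "Thin" API: validate once, then replay a pre-built command list. */
    validate_state();
    for (int i = 0; i < DRAW_CALLS; i++)
        submit_draw();
    clock_t t2 = clock();

    printf("per-call validation: %.1f ms of CPU time\n",
           (double)(t1 - t0) * 1000.0 / CLOCKS_PER_SEC);
    printf("pre-built commands:  %.1f ms of CPU time\n",
           (double)(t2 - t1) * 1000.0 / CLOCKS_PER_SEC);
    return 0;
}
```

The absolute numbers are meaningless, but the gap between the two figures is the sort of CPU time Mantle is claimed to hand back to the game, which is why a slower CPU might matter less with an AMD GPU if the claims hold up.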

I'll stress that, for now, this is all pretty speculative. And there's already talk of some things Nvidia can do to counter all this. But we'll begin to get an idea of just what Mantle will deliver when DICE releases a Mantle-enabled build of Battlefield 4 in December. If it runs like poo off the proverbial, then things will be looking very interesting for AMD.

Battlefield 4 will be the first chance to get the measure of Mantle.

Personally, I hope Mantle delivers on the hype. Not because I want AMD to come out on top. But because there's a chance it will force a market-wide price adjustment for GPUs. If AMD chips are suddenly faster in every new console port, odds are Nvidia will have to respond with more competitive pricing. And that means we'll all get more performance for less money whatever graphics card we buy.
