By Jeremy Laird on July 10th, 2014 at 9:00 pm.
And so on this 10th day of the seventh month, the year of our Lord two thousand and 14, the final hard disk drive verily came to pass. And there was much rejoicing. Or should that be wailing and gnashing of spindles and platters? Whatever, Hitachi has unleashed what it claims is the highest performing and largest 10,000rpm HDD. Like, ever! Actually, I think an additional qualifier may be its 2.5-inch form factor. But either way, with cheap SSDs now approaching the point where you might consider one for mass storage, let alone boot drive duties, the Hitachi Ultrastar C10K1800 – ye shall know it by its name, etc – feels very much like a swansong. Meanwhile, momentum appears to actually be building for AMD’s Mantle graphics API. Does that mean performance-enhancing magic for all AMD graphics owners? Death to Nvidia? Or just a temporary blip on the road to DX12?
Of course, that Hitachi drive is something of an irrelevance to us desktop dinosaurs. It’s designed for servers and other high-density enterprise applications and uses the SAS interface. Thus, it’s not a goer for your PC.
But it’s interesting to note that even this speediest of HDDs can only manage around 250MB/s of sustained data transfer. Not too shabby, but about half of what a really good SSD can manage. The real killer will be random access performance, though.
Hitachi says clever cache technology gives a 2.5x boost in random write performance over its previous best HDD. But we’re talking 2.5x over something very feeble compared to even a budget SSD.
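The mechanical reason random access is the killer is worth a back-of-envelope sketch. This is a minimal illustrative model, not Hitachi’s spec sheet: the seek time and SSD IOPS figures are my own ballpark assumptions, while the rotational latency follows directly from the 10,000rpm spindle speed.

```python
# Why a spinning platter can't compete on random access: at 10,000rpm the
# head waits, on average, half a revolution for the right sector to arrive.

RPM = 10_000
avg_rotational_latency_s = (60 / RPM) / 2       # half a revolution: 3 ms
avg_seek_s = 0.003                               # assumed ~3 ms average seek for a fast 2.5" drive
service_time_s = avg_rotational_latency_s + avg_seek_s

hdd_random_iops = 1 / service_time_s             # mechanical ceiling, roughly
ssd_random_iops = 80_000                         # assumed ballpark for a decent SATA SSD

print(f"Rotational latency: {avg_rotational_latency_s * 1000:.1f} ms")
print(f"HDD random IOPS (mechanical limit): ~{hdd_random_iops:.0f}")
print(f"SSD advantage: ~{ssd_random_iops / hdd_random_iops:.0f}x")
```

Even granting Hitachi its 2.5x cache boost, you’re multiplying a number in the low hundreds while an SSD starts in the tens of thousands.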
The Crucial MX100 is silly value for money. Did I mention what bonkers value it is? Or just what a great bit of value it is?
What’s more, with 512GB SSDs now genuinely affordable ($225 / £150 or less), even the Ultrastar C10K1800’s 1.8TB capacity doesn’t look that spectacular. Anyway, it’s a not-terribly-misty-eyed farewell to magnetic platters from me. We knew thee well, but we won’t miss your brick-chewing impressions and sluggish performance.
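The affordability figure above is easy to sanity-check. A quick cost-per-gigabyte calculation using only the prices quoted in the paragraph:

```python
# Cost per gigabyte for a 512GB SSD at the prices quoted above.
ssd_capacity_gb = 512
usd_price, gbp_price = 225, 150

print(f"${usd_price / ssd_capacity_gb:.2f}/GB")  # $0.44/GB
print(f"£{gbp_price / ssd_capacity_gb:.2f}/GB")  # £0.29/GB
```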
On to AMD’s Mantle, then. Since Microsoft introduced DirectX back in 1995, proprietary graphics APIs have been a bit of a non-starter.
It was a blessed relief for anyone coding game engines to be able to code a single path and have it run on everything. Of course, that didn’t happen overnight. But DirectX’s assimilation of the market was pretty rapid and largely a good thing.
What’s more, as is inevitable with any maturing market, the number of companies making graphics chips consolidated dramatically. Today, there are only three players in PC graphics – AMD, Intel and Nvidia – and all conform very closely to DirectX.
It’s also true that, over the years, DirectX and its graphics-specific Direct3D subset have been among the proverbial ‘good things’ from Microsoft, a company with a great reputation for making money but perhaps a slightly less glorious track record in giving a shit about its customers.
Anywho, if we assume DirectX has generally been a success story, the question is why AMD is even bothering with Mantle and, in turn, why anyone should care about it.
AP…why? Aren’t proprietary APIs like Glide, as required for ye olde 3DFX Voodoo boards, antediluvian irrelevancies?
We’ve touched on some of this before, but the answer, in two words, is games consoles. Not so much the competition for PC gaming provided by games consoles, though that is a factor. But more the insight consoles provide into the shortcomings of the DirectX API on the PC.
To cut a long story short, developing for consoles makes games developers more aware of the limitations and overheads of coding games for the PC.
At any given point in the product cycle when comparing games consoles with PCs, it’s the latter that usually comes out on top in terms of on-paper performance potential. With the arrival of Xbox One and the PS4, that gap has arguably grown even greater. Even at launch, the new consoles were miles off the pace of a high end PC in terms of much of the hardware spec.
And yet games developers have often managed to achieve comparable results regardless of whether the end platform has been console or PC. Now, some will argue that this comes down to artificially hobbling PC ports to prevent the money-making console versions from looking second rate.
Cue the latest controversy over Watch Dogs, Ubisoft, Nvidia Gameworks and all that noise. Whatever the truth – and it doesn’t smell nice from where I’m sitting – the real long-term issue here is the different access that developers have to hardware when comparing consoles and PCs.
The relevant cliché typically wheeled out at this point is getting ‘closer to the metal’, the metal being the very circuitry inside a graphics chip. In layman’s terms, that means being able to code games directly to the hardware rather than abstracting the code to be compatible with some or other software layer.
Of course, even in games consoles, developers aren’t coding directly to the ‘metal’. But there’s a lot less garbage in between and much less overhead. Apply the same efficiencies to PCs and you’d suddenly enjoy the kinds of performance leaps AMD is claiming for Mantle.
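A toy model makes the overhead argument concrete. All numbers here are illustrative assumptions of mine, not measured figures from AMD or anyone else: the idea is simply that frame time is GPU work plus a per-draw-call CPU submission cost, and a thinner API shrinks the latter.

```python
# Illustrative model: frame time = GPU rendering work + CPU cost of
# submitting each draw call through the API/driver stack.

def frame_time_ms(draw_calls, gpu_work_ms, per_call_overhead_ms):
    """Worst case: CPU submission and GPU work don't overlap."""
    return gpu_work_ms + draw_calls * per_call_overhead_ms

draws = 10_000                  # a draw-call-heavy scene (assumed)
gpu_work = 10.0                 # ms of actual GPU rendering (assumed)

thick_api = frame_time_ms(draws, gpu_work, 0.002)    # assumed high-overhead path
thin_api = frame_time_ms(draws, gpu_work, 0.0005)    # assumed 'closer to the metal'

for name, t in (("thick API", thick_api), ("thin API", thin_api)):
    print(f"{name}: {t:.1f} ms/frame -> {1000 / t:.0f} fps")
```

The GPU does identical work in both cases; only the submission cost changes, which is exactly the kind of CPU-side bottleneck Mantle claims to attack.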
If you take AMD’s word on things, Mantle exists because of a clamour from developers who wanted the same efficiencies on the PC as on consoles. And that’s why Mantle is now being picked up by games devs in what you could arguably call unprecedented numbers for a proprietary API.
Hey dawg, I heard you like pixel shaders…
Well, I say unprecedented. Much of this is rumour, but it helps that multiple games are often built on the same core engine. With the hugely successful Frostbite 3 engine supporting Mantle, for instance, you can start with Battlefield 4 as officially supporting Mantle and infer a whole bunch of future titles as likely to support Mantle, like the next instalments of Mass Effect and Mirror’s Edge.
The CryEngine, Unreal Engine 3, the Nitrous Engine, the Cobra Engine (which underpins Elite: Dangerous) and more are all either confirmed or hotly tipped to support Mantle.
You can have a look at a list of all the confirmed and rumoured Mantle games here, but chuck in a number of in-house engines also supporting Mantle and you have a line-up that’s approaching critical mass.
What happens next is an interesting question. AMD talks about its intention to open up Mantle fully so that ‘other graphics vendors’ (ie Nvidia) may also benefit from it. And I’m sure AMD will do just that.
However, I can’t quite stomach its rather holier-than-thou attitude on this one. AMD’s Richard Huddy says he suspects Nvidia’s pride will prevent Nvidia from picking up Mantle. And that’s probably true.
On the other hand, it hasn’t happened yet. It seems to me AMD is dragging its feet on opening up Mantle, and my hunch is that’s a deliberate move to enable some PR wins in the benchmark war. Fair enough, you might argue, but it sits rather disingenuously with AMD’s open-source halo.
In the long run, Mantle might turn out to be a momentary blip. Much of what Mantle sets out to do in terms of getting closer to the metal and reducing overheads and bottlenecks is purported to be in DirectX 12. And generally, we’re all better off if the benefits are available to both Nvidia and AMD GPUs.
But while we wait for DX12 to kick in, the next year could be very interesting. If a large proportion of the best games really do turn out to be Mantle-compatible, Nvidia won’t stand still. Whether it’s cutting prices on its graphics cards or doing something with drivers, I reckon the net result is likely going to be more performance for less money. However you slice it, then, and whatever graphics card you own, AMD’s Mantle is almost definitely good for gaming.