Week in Tech: Ode To The HDD, More On AMD Mantle

And so on this tenth day of the seventh month, the year of our Lord two thousand and fourteen, the final hard disk drive verily came to pass. And there was much rejoicing. Or should that be wailing and gnashing of spindles and platters? Whatever, Hitachi has unleashed what it claims is the highest performing and largest 10,000rpm HDD. Like, ever! Actually, I think an additional qualifier may be its 2.5-inch form factor. But either way, with cheap SSDs now approaching the point where you might consider one for mass storage, let alone boot drive duties, the Hitachi Ultrastar C10K1800 – ye shall know it by its name, etc – feels very much like a swansong. Meanwhile, momentum appears to be building for AMD’s Mantle graphics API. Does that mean performance-enhancing magic for all AMD graphics owners? Death to Nvidia? Or just a temporary blip on the road to DX12?

Of course, that Hitachi drive is something of an irrelevance to us desktop dinosaurs. It’s designed for servers and other high-density enterprise applications and uses the SAS interface. Thus, it’s not a goer for your PC.

But it’s interesting to note that even this speediest of HDDs can only manage around 250MB/s of sustained data transfer. Not too shabby, but about half of what a really good SSD can manage. The real killer will be random access performance, though.
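To put that gap in perspective, here’s a back-of-the-envelope sketch. The throughput figures and the 50GB install size are illustrative assumptions, not benchmark numbers, but the arithmetic is the point:

```python
# Rough sequential-transfer times for a big game install.
# Throughput figures below are illustrative assumptions, not benchmarks.

def transfer_seconds(size_gb: float, throughput_mb_s: float) -> float:
    """Seconds to read size_gb gigabytes at throughput_mb_s MB/s."""
    return size_gb * 1000 / throughput_mb_s

hdd = transfer_seconds(50, 250)  # Ultrastar C10K1800-class HDD
ssd = transfer_seconds(50, 500)  # a really good SATA SSD
print(f"HDD: {hdd:.0f}s, SSD: {ssd:.0f}s")  # HDD: 200s, SSD: 100s
```

And that’s the flattering, purely sequential case for the HDD; throw random access into the mix and the gap only widens.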

Hitachi says clever cache technology gives a 2.5x boost in random write performance over its previous best HDD. But we’re talking 2.5x over something very feeble compared to even a budget SSD.

The Crucial MX100 is silly value for money. Did I mention what bonkers value it is? Or just what a great bit of value it is?

What’s more, with 512GB SSDs now genuinely affordable ($225 / £150 or less), even the Ultrastar C10K1800’s 1.8TB capacity doesn’t look that spectacular. Anyway, it’s a not-terribly-misty-eyed farewell to magnetic platters from me. We knew thee well, but we won’t miss your brick-chewing impressions and sluggish performance.

On to AMD’s Mantle, then. Since Microsoft introduced DirectX back in 1995, proprietary graphics APIs have been a bit of a non-starter.

It was a blessed relief for anyone coding game engines to be able to code a single path and have it run on everything. Of course, that didn’t happen overnight. But DirectX’s assimilation of the market was pretty rapid and largely a good thing.

What’s more, as is inevitable with any maturing market, the number of companies making graphics chips consolidated dramatically. Today, there are only three players in PC graphics – AMD, Intel and Nvidia – and all conform very closely to DirectX.

It’s also true that, over the years, DirectX and its graphics-specific D3D subset have been among the proverbial ‘good things’ from Microsoft, a company with a great reputation for making money but perhaps a slightly less glorious track record in giving a shit about its customers.

Anywho, if we assume DirectX has generally been a success story, the question is why AMD is even bothering with Mantle and, in turn, why anyone should care about it.

AP…why? Aren’t proprietary APIs like Glide, as required for ye olde 3DFX Voodoo boards, antediluvian irrelevancies?

We’ve touched on some of this before, but the answer, in two words, is games consoles. Not so much the competition for PC gaming provided by games consoles, though that is a factor. But more the insight consoles provide into the shortcomings of the DirectX API on the PC.

To cut a long story short, developing for consoles makes games developers more aware of the limitations and overheads of coding games for the PC.

At any given point in the product cycle when comparing games consoles with PCs, it’s the latter that usually comes out on top in terms of on-paper performance potential. With the arrival of Xbox One and the PS4, that gap has arguably grown even greater. Even at launch, the new consoles were miles off the pace of a high end PC in terms of much of the hardware spec.

And yet games developers have often managed to achieve comparable results regardless of whether the end platform has been console or PC. Now, some will argue that this comes down to artificially hobbling PC ports to prevent the money-making console versions from looking second rate.

Cue the latest controversy over Watch Dogs, Ubisoft, Nvidia Gameworks and all that noise. Whatever the truth – and it doesn’t smell nice from where I’m sitting – the real long-term issue here is the different access that developers have to hardware when comparing consoles and PCs.

The relevant cliché typically wheeled out at this point is getting ‘closer to the metal’, the metal being the very circuitry inside a graphics chip. In layman’s terms, that means being able to code games directly to the hardware rather than abstracting the code to be compatible with some or other software layer.

Of course, even in games consoles, developers aren’t coding directly to the ‘metal’. But there’s a lot less garbage in between and much less overhead. Apply the same efficiencies to PCs and you’d suddenly enjoy the kinds of performance leaps AMD is claiming for Mantle.
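You can get a feel for why that overhead matters with a toy model. The per-call cost and batch sizes below are made-up illustrative numbers, not measurements of any real API or driver, but they show why cutting the fixed CPU cost per draw call (or batching many objects into fewer calls) frees up so much processor time:

```python
# Toy model of draw-call overhead: each API call carries a fixed CPU cost
# (validation, state tracking, driver work) before the GPU sees anything.
# Both constants are assumptions for illustration only.

CALL_OVERHEAD_US = 20.0  # assumed CPU cost per draw call, microseconds
OBJECTS = 10_000         # objects to draw in one frame

def frame_cpu_cost(objects: int, objects_per_call: int) -> float:
    """CPU microseconds spent just issuing draw calls for one frame."""
    calls = -(-objects // objects_per_call)  # ceiling division
    return calls * CALL_OVERHEAD_US

naive = frame_cpu_cost(OBJECTS, 1)      # one call per object
batched = frame_cpu_cost(OBJECTS, 100)  # 100 objects per (instanced) call
print(naive, batched)  # 200000.0 vs 2000.0 microseconds of pure overhead
```

Shrink the per-call constant, as Mantle and console APIs claim to, and the naive case stops choking the CPU in the first place.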

If you take AMD’s word on things, Mantle exists because developers clamoured for the same efficiencies on the PC as they get on consoles. And that’s why Mantle is now being picked up by games devs in what you could arguably call unprecedented numbers for a proprietary API.

Hey dawg, I heard you like pixel shaders…

Well, I say unprecedented. Much of this is rumour, but it helps that multiple games are often built on the same core engine. With the hugely successful Frostbite 3 engine supporting Mantle, for instance, you can start with Battlefield 4 as officially supporting Mantle and infer a whole bunch of future titles as likely to support Mantle, like the next instalments of Mass Effect and Mirror’s Edge.

The CryEngine, Unreal Engine 3, the Nitrous Engine, the Cobra Engine (which underpins Elite: Dangerous) and more are all either confirmed or hotly tipped to support Mantle.

You can have a look at a list of all the confirmed and rumoured Mantle games here, but chuck in a number of in-house engines also supporting Mantle and you have a line up that’s approaching critical mass.

What happens next is an interesting question. AMD talks about its intention to open up Mantle fully so that ‘other graphics vendors’ (ie Nvidia) may also benefit from it. And I’m sure AMD will do just that.

However, I can’t quite stomach its rather holier-than-thou attitude on this one. AMD’s Richard Huddy says he suspects Nvidia’s pride will prevent it from picking up Mantle. And that’s probably true.

On the other hand, it hasn’t happened yet, and it seems to me AMD is dragging its feet on opening out Mantle. My hunch is that’s a deliberate move to enable some PR wins in the benchmark war. Fair enough, you might argue, but it sits rather disingenuously with AMD’s open-source halo.

In the long run, Mantle might turn out to be a momentary blip. Much of what Mantle sets out to do in terms of getting closer to the metal and reducing overheads and bottlenecks is purported to be in DirectX 12. And generally, we’re all better off if the benefits are available to both Nvidia and AMD GPUs.

But while we wait for DX12 to kick in, the next year could be very interesting. If a large proportion of the best games really do turn out to be Mantle-compatible, Nvidia won’t stand still. Whether it’s cutting prices on its graphics cards or doing something with drivers, I reckon the net result is likely going to be more performance for less money. However you slice it, then, and whatever graphics card you own, AMD’s Mantle is almost definitely good for gaming.


  1. harmen says:

    Thanks for teaching me that _lovely_ word!

  2. CookPassBabtridge says:

    I no longer understand anything about anything anymore

  3. Halk says:

    While DirectX 12 also sets out to reduce overhead and bring devs closer to the hardware, it only runs on Windows. Mantle is coming to Linux as well.

    • phelix says:

      This. Thisthisthisthis. Why did the article not mention this?

    • Kittim says:

      Not only that, you’ll have to be running the latest version of Windows to use DX12. Fat chance it’ll come to Win7.

    • DanMan says:

      Yet OpenGL has always been there, and you can achieve similar improvements, if you’re using a modern version of it. Nvidia will be pleased to teach you. They’ve held a couple of presentations about it. Steam Dev Days, for example. So no one needs Mantle but AMD (nor should anyone want it), and DX12 is just “me too” lest they become irrelevant.

      And before anyone complains, I’m not a nVidia “fanboi”. I totally prefer FreeSync over GSync, for example.

      • babajikibooti says:

Yes, you can get improvements from modern OpenGL, but the problem is it doesn’t actually fix the draw call issue; you circumvent it by using batching, instancing, etc. The API still has a lot of legacy code. A programmer from Valve has written about some of these issues: link to richg42.blogspot.co.uk

        Mantle on the other hand has been designed from ground-up for modern shader based GPU architectures. It is completely parallel and has a very predictable driver overhead. GPU programmers love something like this.

    • jrpatton says:

      “Huddy told PC World that AMD is “getting requests to deliver this high-performance layer” to Linux. Paraphrasing him, the site notes that AMD plans to “over time . . . dedicate resources to the task.” Mantle, Huddy believes, “could provide some advantages on Steam boxes,” which will be Linux-based.” – link to techreport.com

      Not exactly a huge wave of support for linux like OpenGL offers. They’re basically saying maybe/probably, but have not made any promises.

    • Neutrino says:

      Indeed, you can take your DirectX 12 and tuck it where the sun doesn’t shine.

Microsoft’s history of pulling the rug out from under developers by retiring perfectly functional APIs just to sell them the Next Big Thing, or force them onto the next version of Windows, will not be forgotten.

    • Solidstate89 says:

      Yeah, just like AMD promised to “open up” Mantle for other GPU vendors. How are either of those promises coming along, exactly?

  4. Roz says:

    However, I can’t quite stomach its rather holier-than-thou attitude on this one. AMD’s Richard Huddy says he suspects Nvidia’s pride will prevent AMD from picking up Mantle. And that’s probably true.

    Am I mis-reading this or should that be Nvidia?

  5. xrror says:

    “AMD’s Richard Huddy says he suspects Nvidia’s pride will prevent AMD from picking up Mantle.”

    wait, what? Is that supposed to be “prevent Nvidia from picking up Mantle.” ?

  6. subedii says:

    The thing I’m wondering about though is how well cutting out the API layers is actually going to work out for stability.

    There’s a reason that DirectX came to become dominant in the first place, and it’s not just that it became an all-in-one package.

    The lack of a unified API to program towards was something that caused a tremendous amount of grief and headaches for devs and players as you had a much harder time coding for different hardware configurations and making sure everything ran properly. I can appreciate the sentiment and getting more efficiencies, but I haven’t seen anyone really talk about whether or not this could lead to “the bad old days” all over again.

    I mean it was already stated in the article that Nvidia aren’t likely to go along with Mantle, and IIRC they’ve got their own equivalent in the pipeline. So are we going to see fragmentation with devs having to write more code specifically for Nvidia / AMD hardware? And end up causing more compatibility issues than what we’re still seeing even today?

    It was the increased usage of abstraction that allowed graphics hardware and games developers to “play nice” largely regardless of what you were using. I’m a bit concerned that could disappear again with a new round of petty rivalries.

    • Kittim says:

      The petty rivalries were always there. I’ve owned AMD and Nvidia, I have no leaning either way. I’ve currently got AMD because the last Nvidia based card I got literally went pop after a couple of hours use.

Nvidia were keen for programmers to sign up to Gameworks (I think), where they would get help in improving performance on Nvidia hardware. From what I understand, they were expressly forbidden to pass on any of the performance improvements to AMD hardware, thus ensuring that games always ran better on Nvidia kit.

AMD countered with an API designed to get better performance gains (sometimes).

      For me, I want a semi-pro card that I can use to play games AND do 10 bit per channel mode. I’m not bothered who does it, they’ll have my money.

      • jrpatton says:

        There’s been a lot of he-said she-said on the Gameworks thing. AMD claimed that Gameworks limited/forbade people from optimizing for AMD cards. Nvidia says this isn’t true.

    • soldant says:

      That’s my concern too – I definitely do not want to go back to the old DOS days where if the game didn’t support your particular sound card, you outright didn’t get any sound. On paper Mantle promises the world and it’s enough to get people dreaming of more from the hardware, but I don’t know if it’s really necessary or whether it’s even a desirable state of affairs, particularly since Nvidia almost certainly won’t pick it up until they’re kicked to the curb and have no other choice.

      Also I imagine Mantle is going to become more difficult to sort out when it comes to legacy support down the track, and I don’t think people want to keep lots of GPUs around just for legacy support (or suffer emulation, which is probably going to be even worse).

    • Baka says:

      Someone with more know-how than myself feel free to correct me here but it feels like the current implementation in Frostbite aka Battlefield 4 shows this really well. We’re many months in and the marginal performance gains still don’t really outweigh the problems you’ll experience if you switch the renderer.
      Maybe I simply bought into the hype, but considering the game should’ve been the great dynamo for Mantle itself the results are really underwhelming. Either DICE is fucking up royally on that front (not THAT unrealistic a premise) or this whole thing is simply not worth the effort.

      • Baines says:

One of the big problems with PC development versus consoles is the sheer variety of PC hardware versus a largely singular version of a console. That is the issue that tends to trip up, delay, and bug-riddle ports from consoles to PCs.

        Yes, getting “closer to the metal” helps performance. It also introduces new problems when “the metal” varies from person to person. Those problems are where companies end up sinking their money, time, and reputation to fix. Mantle sounds like it will just add more division.

  7. SuicideKing says:

Well, honestly I’d rather DX12 be picked up than Mantle, simply because all three will have to conform to it. Mantle is clearly a similar attempt to 3Dfx’s Glide, simply to hide the fact that AMD’s DX driver optimisation is lacking when compared to Nvidia’s (benchmarks have shown that Nvidia + DX11 = AMD + Mantle).

    I really don’t think we should be encouraging Mantle, here.

    EDIT: Also, according to AMD itself, Mantle is still in “beta”.

    • rei says:

      DX12 isn’t going to be coming to Linux, which is why I’ll be cheering on Mantle.

      • jrpatton says:

        AMD hasn’t said Mantle is coming to Linux either. They’re pushing for Windows, and have said it’ll maybe/probably come to Linux.

      • SuicideKing says:

        OpenGL, and Mantle so far isn’t open source.

  8. Geebs says:

    It would be nice if AMD were ever able to demonstrate any useful gain in benchmarks on anything other than lower-end AMD CPUs.

    • Halk says:

      Mantle gives much better performance with multi-GPU setups where you are more likely to be CPU bound:

      Better frametimes
      link to hardocp.com

      Better min fps
      link to a.disquscdn.com

      • Geebs says:

        You’ve cherry-picked a bit there; those images don’t tell us the model of processor used, and I’d already pointed out that their best gains are when bound by a weak CPU. FWIW, the same site went on to point out that D3D is better for quad-card setups, not that a sane person would care.

        I do wonder, though, whether AMD feel more bullish about their product because both AMD and Nvidia spend much of their software time stuffing new hacks into their drivers which only benefit individual games – so throwing the whole lot out might not seem like such a big deal.

    • Tatty says:

      Well, I’m getting around a 10% improvement in framerate on Thief with Mantle on my 7950 (1080p on the highest graphics settings) which is apparently not as well optimised to run the API as the 2014 cards.

      That said, when I change my GPU later in the year I’ll more than likely go with Nvidia…

    • Sakkura says:

      Why? The whole point of Mantle is to make the performance gap between Intel and AMD CPUs less relevant for gaming.

      • jrpatton says:

        You don’t need a powerful CPU to game. A mid-range i5/corresponding FX will almost never bottleneck you. For many PC gamers, Mantle offers nothing.

        • Sakkura says:

          A Core i5 is a powerful upper-midrange CPU. It will outperform an FX-8350 in most games, which is exactly why AMD needed Mantle to make their FX CPUs (as well as their APUs and Athlons) more competitive.

          And there are PLENTY of games that are CPU-bottlenecked. Almost any MMO, and many other multiplayer games.

  9. Moraven says:

    My Crucial M500 480GB I got in Dec was $279, On Sale (around Black Friday, Cyber Monday, Holiday deals), from Newegg.

Down to $227.99 now. $213 for the MX100 512GB. Prices are dropping fast on SSDs.

    • Sakkura says:

      And the MX100 is faster than the M500 too – on par with the M550.

  10. buzzmong says:

    No love for OpenGL anymore? Some games are still written for it, and it’s still supported. Personally, I’d rather have seen AMD throw their weight behind that rather than Mantle, especially with NVidia’s own API in the works, as it wouldn’t be utterly beholden to one commercial entity.

    If I were Nvidia, I’d not like to have to tie my stuff into a direct competitor’s API.

    • DanMan says:

      There’s been some movement thanks to Valve’s SteamOS emergence. Like updated OpenGL support in the drivers, at least on nVidia’s side. I’ve also heard that Sony supposedly suggests using their OpenGL variant on the PS4.

      • rockman29 says:

        The PS4 OS is based off of FreeBSD 9.0. The GUI is the PlayStation Dynamic Menu I believe it is called.

        There’s also two levels of API for PS4. One is called GNM which is a low-level API which is more difficult to program for (this is where ICE team and Naughty Dog excel). GNM for PS4 is the analogue to Mantle on PC (they are both made by AMD after all, and the Battlefield developers have remarked that it is essentially the same thing).

        Then there is a high-level wrapper for GNM called GNMX which is more akin to DirectX and OpenGL level coding. GNMX is essentially the solution for simple porting between other consoles and PC to PS4. Here is Ubisoft talking about The Crew and how they were surprised that it was so easily portable into PS4 format: link to eurogamer.net

        There’s also some shader language developed by Sony called PlayStation Shader Language (PSSL) that was developed exclusively for PS4. It is similar to HLSL. This is a document describing PSSL: link to gdcvault.com

        The PS4 browser uses WebGL which is based on OpenGL I think, probably the first actually serviceable browser in a console, especially good with HTML5 based content.

        I don’t know what 99% of that stuff means, but there you go.


        “At any given point in the product cycle when comparing games consoles with PCs, it’s the latter that usually comes out on top in terms of on-paper performance potential. With the arrival of Xbox One and the PS4, that gap has arguably grown even greater. Even at launch, the new consoles were miles off the pace of a high end PC in terms of much of the hardware spec.”

        The PS4 might surprise people whilst being “miles off the pace” of a high end GPU. In sheer throughput performance, a nice new PC GPU crushes the GPU of the PS4.

        But what that means for graphics fidelity we see in games on PS4 and PC remains to be seen. Drive Club looks pretty good as a driving game, I’ve not seen a racer on PC approach it visually for example.

It would be nice if the PS4 wasn’t used as a whipping boy simply to rally the PC crowd, and instead had some more nice and thoughtful comments about its unique benefits as well, like having a single unified RAM pool accessible equally by the CPU and GPU (the “Onion” bus accesses RAM for the CPU, and the “Garlic” bus is the faster bus for the GPU). Some of us play games on PC and consoles, you know :)

    • cardboardartisan says:

      Yeah OpenGL never quite died out – it’s been lagging behind DirectX in popularity on the PC front for a long time, but overall it’s actually kinda ascendant.

      Here’s why:
      – The mobile market (which is huge, and neither Android nor iOS really do DX)
      – WebGL (though technically Chrome and Firefox wrap the OGL API calls to DX, it’s still getting more programmers familiar with the OpenGL API)
      – Consoles (PS4, Steam box)
      – Linux (see: steam box, Valve’s general game plan and the direct push for OpenGL support they’ve been making to hardware manufacturers)

      Insofar as those markets keep growing, DirectX is going to start looking less like an obvious choice and more like a barrier for cross-platform compatibility. I modestly predict that its days are limited.

      AMD could have thrown their support behind improving OpenGL, so Mantle seems like a kind of questionable move on their part – it seems like an attempt by AMD to put themselves in a position similar to the one Microsoft is in now. Who knows how that’ll go.

  11. Kittim says:

    I’ve got 7.7TB of disk space on my PC, only the .7 part is SSD.
If you like taking photos, a big-ass HDD still offers value for money. External drives are dead cheap too and make an easy way to archive your pictures.

  12. Fritzy says:

    DirectX isn’t proprietary? Where’s the standards body for it? None, just Microsoft. DirectX doesn’t work on anything other than Microsoft products, and is a tool for monopolization.

  13. frenchy2k1 says:

Your dig on SSDs is still a few years too early to be realistic. Most people still need mass storage, and HDDs are much better at that (much cheaper per GB), and only sequential access is really needed for mass storage like pictures or video (which are multi-MB files, if not multi-GB). The “cloud” will take over in a bit, but as long as network speeds limit consumers, local HDD storage is not going anywhere. If you have photos and videos, you need HDDs… And best-in-class SSDs are on PCIe already, showing ~10x the sequential throughput of this HDD.

About Mantle, AMD is making noise about it, lots of noise about Intel’s interest (they asked, AMD said “later”) and possible gains, but so far the only gains come on processor-limited PCs, mostly using AMD APUs. Not a bad gain, but those are not what is inside most gaming PCs (see Steam hardware polls). DX12 will offer similar gains to all, and OpenGL already allows it.

    • frymaster says:

      I think we’re outliers. A couple of years ago most consumers would have a laptop with a 200 gig HDD and that would be all. Now they can have a 500 gig SSD.

      Personally, alongside my SSD I’ve got most of a 3TB hard drive used in my desktop, and I’ve got 6 of the buggers in my server, but I don’t think I represent the mass market.

    • Chaz says:

I look at the prices of SSDs every time I think about putting in a new drive, but they still just don’t offer anything near the same bang for buck as far as storage capacity goes. My other main reservation regarding SSDs is their reliability. Anecdotal evidence suggests a much higher rate of failure compared to HDDs.

      The Crucial MX100 512GB SATA III 6GB/s Solid State Drive SSD is currently £151.81 inc vat on Scan.co.uk and that is the cheapest reasonably sized SSD on their site.

      A Western Digital WD10EZEX 1TB Blue Desktop Hard Drive SATA 3 7200rpm is currently £39.84 in vat on Scan.co.uk

      That is one hell of a price difference for a drive half the size of the other.

Do I care if my Windows takes 10-20 seconds longer to boot up whilst I drink my coffee? Not enough to think the extra £110+ is worth spending.

      • MkMax says:


The benefits don’t justify that price increase. I just built a box, I didn’t have money to spare, and the SSD became the obvious thing to cut.

    • Deadly Sinner says:

      I’m surprised AMD hasn’t opened Mantle up to Intel if they are truly interested in it. Mantle’s survival depends on developers adopting it, and they will only adopt it if it affects enough of their customers. According to the Steam Hardware survey, AMD + Intel make up almost exactly half of the GPU market.

  14. The Dark One says:

    And, in actual imminent hardware news, instead of API scrying, AMD is rumoured to be replacing the R9 280X card with… the R9 280X. While standard industry procedure has been to repurpose your previous generation’s chips and sell them at a lower tier, AMD is going to replace the underlying chip in the 280X with a new design, but keep the branding intact.

  15. specialsymbol says:

    I think one of the most important aspects of Mantle *could* be that it might run native on Linux.

    Which would all of a sudden bring *big* gaming to Linux – something that a lot of people have been waiting for for decades.

DirectX is right now the main sales argument for Windows. And Windows 8 just hit the fan, at least for all desktop users. Everyone’s trying to get away from it, but it’s not possible because of DirectX (and a few other things like .NET).