AMD Claim DirectX Not That Bad After All

By John Walker on March 28th, 2011 at 11:45 am.

What a controversial little box.

Here’s an interesting about-face. Last week we told you of AMD spokesman Richard Huddy’s comments regarding the DirectX API (I now type “API” with an undue confidence), and specifically how it gets in the way of PCs realising their potential to be ten times faster than consoles. Well, now they’re saying words to the effect of, “Um, we didn’t say that, and even if we did, we didn’t mean it.” More specifically, they’re claiming the bit-tech story took the quotes out of context and, according to CRN, “exaggerated” them. Update: bit-tech responds to this below.

In the interview with CRN from last week, Huddy says that what he meant was that a “very small number of high-end developers” take issue with DirectX, and that these are the developers who have asked AMD for ways to avoid using the API. Apparently they include DICE, along with Crytek. However, “It’s not something most developers want.” Most, he claims, would happily choose DirectX or OpenGL, because “it’s a great platform.”

The interview quotes contain an awful lot of “No, Microsoft, we love you!” comments. As many of our commenters observed, the pre-API days were apparently far worse for developers and hardware types alike, says Huddy.

“Every single hardware vendor had to worry about producing their own API, or mimic another vendor’s API. But there are game developers who would very seriously consider tuning their code for a particular piece of hardware.”

But DirectX is super-stable, he insists.

“It’s hard to crash a machine with Direct X, as there’s lots of protection to make sure the game isn’t taking down the machine, which is certainly rare especially compared to ten or fifteen years ago. Stability is the reason why you wouldn’t want to move away from Direct X, and differentiation is why you might want to.”

I am left with more respect for all those developers who manage to crash my machine despite it, then. The good news is Huddy continues to insist that the PC is far more powerful than the consoles, and repeats the point that DirectX is inhibiting access to the full power of the tech in their chips. So, um, er.

This all happened a week ago, but we didn’t notice it then because we were distracted by the shiny colours of games. So thank goodness for GI.biz.

We asked bit-tech editor James Gorbold to comment on the suggestion that they’d misquoted and exaggerated Huddy’s comments. He told us,

“I think it’s possible that CRN has misunderstood our article Farewell to DirectX?, which we published nearly two weeks ago, as our feature never suggested that AMD isn’t committed to supporting DirectX – that wasn’t the angle or focus of our article at all. Instead, our article, which includes quotes from AMD’s Richard Huddy, along with leading games developers such as Crytek, was the result of Huddy making several comments about ‘direct to metal coding’ during a wider interview on the future of the OpenGL API. During this interview, Huddy revealed some of the feedback he’d had from high-end game developers about the potential for direct-to-metal coding – we also made it clear that this would only be of interest to developers of games with cutting-edge graphics.

The quotes from Richard Huddy that formed the basis of the article were taken from an exclusive telephone interview organised by one of my staff earlier in March. It’s also important to note that an AMD PR spokesperson was listening in, so the company was well aware of what was said, well before our article was published. As a result, it comes as a surprise that AMD has not complained directly to us about the article – it looks as though our original article has actually just been misunderstood by a few other sites.”


33 Comments »

  1. Meat Circus says:

    Well, it’s good that he’s retracted his idiocy, less good that apparently he’s trying to blame somebody else.

    • subedii says:

      It was a ridiculous statement regardless. We NEED abstraction layers; they exist for a purpose. Programming “direct to metal” is at best only really a viable practice on a fixed hardware platform. And even there, devs are still making use of DirectX (or OpenGL, as the case may be).

      That and he was really overstating the overhead cost of using DirectX in the first place.

    • Gap Gen says:

      Yeah, a factor of ten seems like a lot of slowdown for an established graphics API. I wonder how much improvement Crytek can get out of writing their own APIs for various cards, given that I presume they’d be doing much the same as what DirectX is already doing.

    • TillEulenspiegel says:

      It occurs to me that you could probably devise a build system that takes your Direct3D source code and builds two different binaries for each code path, one for AMD and one for NVIDIA. So you make all these little decisions at compile time instead of runtime.

      In reality though, CPU branching isn’t *that* costly. Doing it the normal, generic way was fast enough in 1995, and it’s sure as hell fast enough now on a Core i7. Let me know if you have a meaningful proof of concept where the GPU is not the bottleneck, then we can talk.
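
      For illustration, here is a minimal sketch of the two approaches being compared: a hypothetical VENDOR_AMD build flag baking the vendor choice in at compile time, versus the single runtime branch the comment argues is cheap enough. The function names are made up for the example.

      // Compile-time selection: build one binary per vendor with e.g.
      //   -DVENDOR_AMD (hypothetical flag), so no branch exists at runtime.
      #include <cstdio>

      static void renderPathAmd()    { std::puts("AMD-tuned path"); }
      static void renderPathNvidia() { std::puts("NVIDIA-tuned path"); }

      void renderFrame()
      {
      #if defined(VENDOR_AMD)
          renderPathAmd();      // baked into the AMD binary
      #else
          renderPathNvidia();   // baked into the NVIDIA binary
      #endif
      }

      // Runtime selection: detect the vendor once at startup and branch on a
      // flag each frame -- the branch the comment calls cheap enough.
      static bool g_isAmd = false;  // in practice, set once from the adapter description

      void renderFrameRuntime()
      {
          if (g_isAmd) renderPathAmd();
          else         renderPathNvidia();
      }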

    • viverravid says:

      “It occurs to me that you could probably devise a build system that takes your Direct3D source code and builds two different binaries for each code path, one for AMD and one for NVIDIA.”

      Direct X already does this. Developers write shaders in a platform independent language like HLSL, then the hardware vendors provide compilers that turn it into code optimised for their cards.

      Sometimes developers end up writing different shaders for each card, but that is how it is supposed to work.
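
      To make that concrete, here is a rough sketch of the pipeline being described, assuming an existing Direct3D 11 device: the HLSL source is compiled once to vendor-neutral bytecode, and the installed card’s driver then turns that bytecode into hardware-specific code when the shader object is created.

      #include <d3d11.h>
      #include <d3dcompiler.h>
      #include <cstring>
      #pragma comment(lib, "d3dcompiler.lib")

      // Platform-independent HLSL; nothing here is specific to AMD or NVIDIA.
      static const char* kPixelShader =
          "float4 main() : SV_Target { return float4(1.0, 0.0, 0.0, 1.0); }";

      ID3D11PixelShader* CompileAndCreate(ID3D11Device* device)
      {
          ID3DBlob* bytecode = nullptr;
          ID3DBlob* errors = nullptr;

          // Compile HLSL to vendor-neutral bytecode.
          HRESULT hr = D3DCompile(kPixelShader, std::strlen(kPixelShader),
                                  nullptr, nullptr, nullptr, "main", "ps_5_0",
                                  0, 0, &bytecode, &errors);
          if (FAILED(hr)) { if (errors) errors->Release(); return nullptr; }

          // The vendor's driver compiles this bytecode into code optimised
          // for whichever card is actually installed.
          ID3D11PixelShader* shader = nullptr;
          device->CreatePixelShader(bytecode->GetBufferPointer(),
                                    bytecode->GetBufferSize(), nullptr, &shader);
          bytecode->Release();
          return shader;
      }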

    • SeeBeeW says:

      I have it on good authority that Mr. Huddy is a clever guy who was genuinely taken wildly out of context.

      This is really more a case of some developers looking to bypass DirectX in their own middleware, due to their specific requirements, not a suggestion to permanently kill DirectX. His comments regarding the ‘lack of variety in shaders’ were apparently most likely referring to some current research on new, highly parallelized realtime raytracing techniques — the sort of thing which genuinely isn’t supported by existing shader models (since they still depend on rasterizing triangles).

      The point is, it really isn’t as atrocious a claim as it was made to look in the initial article.

  2. The Tupper says:

    I’ve definitely been playing Minecraft too much. I looked at the picture attached to the post and for a moment thought it was a new type of block.

    I am, like John Walker, not very techy, but if Direct X is partially responsible for ending the upgrade madness that had threatened to sever my ties with PC gaming once and for all, I applaud it. For a decade I was spending the best part of a grand every three years just to get games running. My present machine (bought four years ago) has only required one graphics card replacement (due to burn-out) that cost ninety quid, and I can play pretty much anything with most bells and whistles turned up.

    • Gap Gen says:

      It would be nice to have some simple programmability in Minecraft directly, I guess. It’s often a pain when you have to give yourself RSI making hundreds of glass blocks, say.

    • drewski says:

      You should build a Direct X block out of blocks in Minecraft.

    • ANeM says:

      That is less to do with DirectX and more to do with the stagnant console generation and the exponential rise in development costs to continue improving graphical quality in games. It costs too much to make a Triple A game these days, certainly too much to risk doing anything short of designing around the consoles which represent a vast part of the games market.

      Even if developers could feasibly ignore the consoles, it is hard to say that pushing graphics technology further would really have a major impact on sales. For example: Fallout: New Vegas sold very well, and it is based on possibly the most revolting game engine of this generation.

      Over the last 10 years it has become completely unviable to target the hardcore PC market.
      Mafia is a good example of this, even being over 8 years old. It tried to target the top end of the PC market and failed. Even though it got great reviews it sold poorly, because no one could really play it. In the end very shoddy, stripped-down ports had to be made for the console versions, which also sold fairly poorly, because they were crap.
      Far Cry is another good example: absolutely stunning graphics, and in the end Ubisoft stripped it down and ported it to 6 different platforms (5 consoles and an arcade cabinet) under five different names.

      I don’t mind the results though. It’s quite nice to be able to sit at a computer built for Crysis and know that not only does the sequel run at top settings, it runs BETTER.

    • KindredPhantom says:

      In the future, games will be made out of Minecraft-type blocks: you start with a DirectX box and then add on from there.

    • Gap Gen says:

      Someone should totally write an IDE mod for Minecraft.

  3. Rond says:

    Stating that “DirectX is slow” is a very dumb thing to do for pretty much anybody, not just an AMD spokesman.

  4. Navagon says:

    Bah, I was hoping for that tech demo, dammit. :P

  5. Deano2099 says:

    It’s a silly topic really.

    Direct X does introduce a whole load of stuff between the game and the chips, but it’s pretty much necessary if you want stuff to work and to have reasonable development times.

    It’s like saying car engines are throttled by the fact that they have to support a car that holds five people and is easy to control, when we could actually make them much faster. Of course we could, as you get with racing cars. It just wouldn’t be practical for day-to-day use.

    Interestingly I remember back in the Amiga days there was a huge ‘demo’ scene, which was all about coding directly to the chips to make fancy effects and ridiculously pretty stuff that wasn’t doable in any practical sense in games.

    The fact that there’s no demo-scene directed at specific types of hardware these days (or at least, not one significant enough that I’ve heard of it) goes to show how near-impossible it must be.

    • Zinic says:

      http://www.theprodukkt.com/kkrieger

      Anyways, besides that link.

      The statement they’ve made isn’t entirely wrong. There’s at least some loss of performance between the game and the hardware due to the way Windows handles hardware; however, the original (wrong) claim was that the loss was 10 times the potential performance.

      Honestly though, the only real way for developers to code to the bare metal these days is to boot the game pre-OS (like most console games do), a practice that would be completely ridiculous just considering the amount of work required to get it working, due to all the differences in hardware.

      Always keep in mind that 15 years ago, when we didn’t have DX, games were a lot smaller and far less advanced than they are today. It’s not just a matter of drivers or APIs anymore.

      However, people in the demoscene still manage to produce some interesting results, like the above-mentioned link, but not by writing to the bare metal anymore.

  6. Mad Hamish says:

    Maybe I’m too trusting, but I believe the guy. Googling his name, this is the first thing that popped up: it’s him in 2009 explaining why we should be excited for DirectX 11. What’s more likely: that the guy from AMD says something the dog in the street knows is ridiculous and that could bite him and his company in the ass, or that a website exaggerates an interview/story to get some more hits?

  7. Tom says:

    thought the article sounded a bit odd, but i dunno.
    removing standardisation from a diverse hardware platform sounds like a bad idea to me.
    surely backwards and future compatibility would be a nightmare…?
    no doubt the article generated a lot of page hits though.

  8. HeavyStorm says:

    It’s the same as when John Carmack stated that the lag we experience in online games comes from the overhead that the OSI model imposes.
    Either way, being a developer, I get their opinion. PCs have much more power than consoles. But you need an abstraction layer like DirectX to enable you to program for many different pieces of hardware and hardware combinations. And my guess is that abstracting functionality across lots of hardware pushes you to create an abstraction that best represents the lower-end stuff. After all, why are you going to support a feature that’s only available to 5% or less of your customers, when you could be spending your time and money on what 95% have?
    Finally, I guess this is the same as the Java/.Net vs native duel. Java and .Net give you a jump-start on programming, where you don’t have to care about the system it’s running on and don’t have to spend time writing vanilla code that registers your window or deals with the OS message pump. But you pay the price: a game that runs on Java, for example, can be what? 5x more power-hungry than a native one? I know of PCs that run Bad Company on medium settings but have trouble running Minecraft.

  9. Unaco says:

    C’mon people! Wake up! Can’t you see the real reason for all of this hullabaloo? You’re blind*, all of you! It’s obviously all a ploy by the major and not so major developers, the Tides foundation, the console manufacturers, the hardware manufacturers, probably MicroSoft, the Chinese and the Weathermen (the East Coast, NY based, underground backpack hip-hop super group featuring El-Producto, Cage, Aesop Rock, Breezley Brewin etc, and not the Radical Left wing terrorist Weather Underground Organisation). Their aim? Obvious… to destroy PC Gaming (and probably force socialism on us all).

    First, they’ll get rid of DirectX and any other API, like OpenGL (they’ll maybe get rid of one first, then the other). The major developers will drop them, in favour of coding direct-to-metal, and soon they’ll be forgotten, a thing (or things) of the past. Like Freedom. For six months to a year, all will be well… We’ll get a few AAA titles with spanky new graphics features, showing the benefits of coding without the API’s.

    But then, they’ll tell us that developing cutting edge graphics for PC is prohibitively expensive and difficult without the API’s. Smaller devs, without the clout of the big boys will be unable to do it… to produce PC builds with equivalent graphics to the consoles. Then the bigger devs will claim that “the PC market has shot itself in the foot getting rid of the API’s. It’s not cost effective to produce for the PC any more”. Like rampant Piracy, the imbalance of lean, our lack of Sofas, and the rapidly approaching Rapture, it’ll be another thing the devs can point to and say “See! This is why you PC people can’t have nice things”. And then, soon after, the only game releases we’ll see on PC will be Indie spreadsheet simulators, more hats for TF2, and expansions for WoW. You have been warned.

    Anyway, I’ve said enough already (probably too much in fact), and I have to go now and see if I’m still picking radio up through my fillings. I bid you a good day, and a tip of my tin-foil hat to you all.

    *Not literally, of course… I mean you’re blind to what’s really going on here, in this situation. If you are actually blind, and you’re reading this through a screen reader or similar, I do apologise for any offence, it was not intended. Also though, if you are blind, could you shout at the Hivemind about using their ‘alt’ tag properly?

  10. The Sombrero Kid says:

    As I stated before, if Crytek really wanted to do this they would have done it already, as DirectX doesn’t prevent it, today more than ever. Sorry, but even his backtracking is full of hot air.

  11. Pijama says:

    Correct me if I am wrong, but didn’t RPS cover a similar issue raised by John Carmack a few months ago?

    As in, he talked about the state of middleware and how could the industry improve in the area, IIRC.

  12. BobsLawnService says:

    Bit-tech is acting all coy now, which is a bit odd. Their headline was definitely wildly exaggerated and mischaracterised what was said on the phone. That discussion had nothing to do with the end of DirectX and everything to do with developers asking for a way to optimise certain sections of code by writing to the metal in a few specific cases. The 10X speed increase could be possible in certain instances and for certain algorithms. No studio in their right mind would try to code a whole game using low-level hardware calls.

    So bit-tech, I’m kind of with AMD here.

  13. joshg says:

    “I think it’s possible that CRN has misunderstood our article Farewell to DirectX?, which we published nearly two weeks ago, as our feature never suggested that AMD isn’t committed to supporting DirectX – that wasn’t the angle or focus of our article at all.”

    So, the title of the article is “Farewell to DirectX?”, but the angle of the article isn’t about getting rid of DirectX.

    Also, your chocolate ration has been increased to 10 grams.

  14. Malibu Stacey says:

    What he’s saying does make sense, though, in the right context. If you’re making high-end game engines you’re more likely to run up against the limitations of APIs like DirectX and OpenGL when you try to get the best performance-to-visual-fidelity ratio.
    For people like Epic, id, Unity Tech etc., who are developing engines primarily to sell to other studios to use for making games, not using APIs makes sense, because they are in effect writing their own API that also happens to be an engine.
    The current workflow is
    game -> engine -> API -> drivers -> hardware
    and merging engine & API removes a step, which can affect performance (see the sketch below). It’s not right for every situation but it’s not necessarily wrong either.
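
    For illustration, a purely conceptual sketch of that layering, using hypothetical names (DriverHandle, GraphicsApi, Engine, MergedEngine); real drivers don’t expose a public direct-to-metal interface like this, so this only shows where the extra indirection sits.

    // Each layer in the workflow above, reduced to a thin interface.
    struct DriverHandle { /* stands in for whatever the vendor's driver owns */ };

    // The generic API layer (think Direct3D/OpenGL) that engines normally target.
    class GraphicsApi {
    public:
        virtual ~GraphicsApi() = default;
        virtual void draw(DriverHandle& hw) = 0;   // API -> driver -> hardware
    };

    // Going through the API: game -> engine -> API -> drivers -> hardware.
    class Engine {
    public:
        explicit Engine(GraphicsApi& api) : api_(api) {}
        void renderFrame(DriverHandle& hw) { api_.draw(hw); }
    private:
        GraphicsApi& api_;
    };

    // Merged engine+API: game -> engine -> drivers -> hardware, one layer fewer.
    class MergedEngine {
    public:
        void renderFrame(DriverHandle& hw) { (void)hw; /* vendor-tuned path here */ }
    };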

    • mashakos says:

      Doesn’t make sense in the long term. You forgot to consider the huge QA/testing problem this will incur if hardware manufacturers are no longer in direct communication with API developers. Since only one API developer operates at present, co-ordination between Microsoft, NVIDIA and AMD is straightforward. Topple Microsoft and the house of cards goes with it. You will then end up with a situation where Epic’s “DirectUnrealness” API crashes on mid-range GPUs but runs fine on low/high end, then a patch or version later it crashes on low/high-end GPUs and runs fine on mid-range GPUs – all because of the lack of coherency between the development of the API and the graphics hardware.

    • stahlwerk says:

      It would also effectively prohibit any hardware manufacturer from entering the GPU market, since everyone would end up coding software for the big two architectures (Radeon and GeForce) only.

    • Malibu Stacey says:

      le sigh. You both still miss the point. It’s not about getting rid of APIs completely, it’s about not using Direct3D in specific situations. There’s nothing to say you can’t still use DirectInput, DirectSound etc.
      Also, one word refutes both your arguments: OpenGL.
      And mashakos, that situation you’re talking about happens all the time with new games. How does it get resolved? Quite often the vendors fix it in the drivers.

  15. dogsolitude_uk says:

    Am I the only one here who hadn’t heard the expression ‘direct to metal’ before today?

  16. kert says:

    Uh oh, Mr. Huddy.
    I’m sure you have heard of the OpenGL EXTENSION MECHANISM, no? ’Cos I’m pretty sure ATI has published a truckload of extensions throughout its history. Some of these have actually ended up, revamped, as part of the standard in later revisions of the API.
    Not sure why MS can’t be arsed to provide a decent extensibility model around core DX calls; it would certainly be possible.
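
    For what it’s worth, here is a minimal sketch of how that extension mechanism gets used in practice, assuming a current GL context and an initialised loader such as GLEW; GL_AMD_pinned_memory is one real ATI/AMD-published extension, used here purely as an example.

    #include <GL/glew.h>
    #include <cstring>

    // Returns true if the driver advertises the named extension.
    static bool hasExtension(const char* name)
    {
        GLint count = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (GLint i = 0; i < count; ++i) {
            const char* ext =
                reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
            if (ext && std::strcmp(ext, name) == 0)
                return true;
        }
        return false;
    }

    void chooseUploadPath()
    {
        if (hasExtension("GL_AMD_pinned_memory")) {
            // vendor-specific fast path exposed through the extension
        } else {
            // portable fallback using only core GL
        }
    }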