PCs Are 10x More Powerful Than Consoles

By John Walker on March 21st, 2011 at 4:14 pm.

You naughty little bugger

Much like I am ten times more powerful than Quintin, the PC is ten times more powerful than the consoles, according to AMD spokesman Richard Huddy. Which leaves him, and us, wondering why PC versions of games often feel more hobbled than bounding ahead. In fact, in some ways the PC is running at a tenth of the ability of the consoles. He told Bit-Tech,

“To a significant extent that’s because, one way or another and for good reasons and bad – mostly good – DirectX is getting in the way.”

It's not too surprising. The 360 is over five years old, the PS3 well over four. While both Microsoft and Sony maintain they're only midway through their consoles' lifetimes, the tech is achingly out of date. And while a cross-platform world hasn't demanded any massive graphical advancements from the PC, the technology has continued to advance regardless. AMD are saying that the contents of our boxes are being artificially held back from realising their true abilities.

Huddy told Bit-Tech,

“We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it’s very clear that the games don’t look ten times as good. To a significant extent, that’s because, one way or another, for good reasons and bad – mostly good, DirectX is getting in the way.”

It seems there is growing pressure on Microsoft to do away with the need for DirectX, which is restricting developers’ ability to take advantage of the hardware. On the consoles they can program for the available tech, but on PC they must negotiate the DX API (application programming interface). Huddy continues,

“By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that’s going to put pressure on Microsoft – no doubt at all. Wrapping it up in a software layer gives you safety and security, but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate.”

If you’re now thinking, “But what about shaders?” then you know far more about this subject than I can pretend to, so you should get yourself over to the Bit-Tech story to read all the gory details. But for me, and my simple ways, the following just sounds bad:

“These days we have so much horsepower on PCs that on high-resolutions you see some pretty extraordinary-looking PC games, but one of the things that you don’t see in PC gaming inside the software architecture is the kind of stuff that we see on consoles all the time. On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can’t typically draw more than 2-3,000 without getting into trouble with performance, and that’s quite surprising – the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.”

Crytek's technical director, Michael Glueck, expressed an interest in programming "direct to metal", as I'll now say with more confidence than I deserve. It's problematic, it seems, since catering for endlessly varying PC setups is exactly what the API is there to do. But it would also allow the real power inside to be realised.

Thanks to Eurogamer for bringing the story to our attention. In other news, “Huddy” is the name by which my housemate, Craig Pearson, refers to House and Cuddy.


141 Comments

  1. Backov says:

    This is really dumb.

    DirectX is absolutely not getting in the way.

    Do we really want to go back to the days before DirectX? No.

    • vodka and cookies says:

      Completely agree; in fact this is one of the dumbest things I've ever heard from someone so senior at a company. PCs are not games machines, they are general-purpose computing devices.

      The nightmare of a billion different PC configurations, and he wants to remove one of the major unifying aspects just to get more performance. Performance isn't the problem; yes, games consoles can use all their power more efficiently, but there's no way to match them without removing Windows altogether while we're at it.

      The fact that developers aren't harnessing the latest features in DirectX is the market's fault: there's simply not enough appetite or interest among consumers or developers in PC land.

    • Lord Byte says:

      Not necessarily. What we need is DirectX AND the ability to bypass DirectX and go straight for the hardware if we support that specific hardware. He never said to do away with it, only possibly to bypass it (even if only for certain function calls, this could be an enormous leap forward).
      The other avenue is that some developers support a third-party API (like, say… OpenGL), but most tend to be tied in some way to Microsoft's OS architecture (and have to go through DirectX in some ways anyway).
      They need "competition", which would breed progress. At the moment there is nearly none (only pushes from hardware developers towards new tech, and then only if Microsoft allows it in).

    • Cinek says:

      "DirectX is absolutely not getting in the way."
      100% agreed. This is pure crap.
      It's not DX that's limiting the creators – it's consoles and the trend of creating multi-platform games.

      Take a look at Crysis's history: back in the day it was said that DX9 was what limited the creators, that they couldn't do anything with it.
      So Microsoft made DX10 and everyone went "wow, now we're going to have awesome games" – and guess what? Nothing happened; DX10 games looked EXACTLY as good as DX9c ones.
      Meanwhile Crysis was released – a game that looked STUNNING compared to anything any console or PC game had offered before. Of course, at the time there were almost no computers able to handle it, but that's another topic. To keep in line with the "DX10 rules", Crytek artificially blocked many features of the game, making it impossible to play on Very High with DX9. But guess what? People figured it out, and patches were made that let the game look BETTER on DX9 than vanilla looked on DX10 – which was unbelievable to so-called developers like the guy here.

      Now we've got a slightly different situation: DX11 is on the market and PCs are capable of handling graphics MUCH better than what games offer these days. But… somehow none of the big developers capable of creating quality games release any good-looking title. Why? Again, look at Crysis 2. This game got released 3 or 4 years after Crysis 1 – do you see any progress? Not at all. Moreover, some people argue that it looks worse than the first game!!! (IMO it's a little better, but the difference is still negligible considering 3 years of progress.) What changed? Crysis 2 was released on consoles, while Crysis 1 wasn't.

      So to sum up: IMO there's no chance of any breakthrough in graphics quality until the next generation of consoles is released. DirectX has nothing to do with it; using DX as an excuse is pathetic, to say the least.

    • Masked Dave says:

      The guy says that most of the reasons DirectX "gets in the way" are good ones. He isn't calling to get rid of it at all, just explaining that it's the reason we don't see better performance on the 'better' PC hardware.

      Personally I’m happy to live with the overhead.

    • FriendlyFire says:

      Giving access to low-level hardware might sound nice, but in fact it’s the most horrible idea you could ever imagine. DirectX has one feature to rule them all: cross-compatibility!

      How can this guy imagine direct low-level programming on the *insane* variety of hardware configurations that we have on PCs? On console toys it's easy: there's just one configuration available for the entire lifetime of the console, and any hardware difference doesn't affect the programming side of things. On PCs, you have to support AMD, Nvidia, Intel, ATi (no, I'm not calling them AMD) and each of their myriad models.

      If you need a culprit for the slow growth of PC gaming visual fidelity, look at consoles. Their half-a-decade-old tech and attractiveness for most publishers make them the target platform, so visual fidelity is framed by what they can run. The game's just ported to PC after that with minimal differences – the higher quality usually coming from simple stuff like AA and AF, or sometimes higher-resolution textures because they had them during production anyway. A game geared specifically and solely for PC gives you stuff like Crysis, which was so far ahead of its time that recent games don't look all that much prettier despite it now heading towards its fourth year.

    • Rich says:

      “DirectX has one feature to rule them all”
      …and in the darkness bind them?

    • Yuri says:

      I seem to remember that we had a small army of graphics card manufacturers.
      Remember 3dfx and Matrox?
      Almost every one of them had their own API and very different architectures in general.
      These days we have two major manufacturers: AMD and Nvidia. While it would be more complicated than relying on DirectX, it certainly wouldn't be comparable to our past nightmares.
      Although, those statements are there simply to put pressure on MS.
      AMD/Nvidia probably long for the old days of the “I MUST GET THE NEWEST CARD TO PLAY THE NEWEST GAME!” mentality. Multiplatform games are not making it easy on their profits, since Crysis was the last game that really needed top of the line hardware when it was released.
      Hell, my 4870 still ran most games on max details at 40+ FPS. It was over two years old, ancient by PC standards, but still good enough to run everything.
      And it still would, if its shoddy voltage chips didn't heat up like an oven.

      Also, remember graphics quality jumps between versions of DirectX in the past?
      Just compare quality between DX7, DX8 and DX9 games.
      Then compare DX9 to DX10/11. Not much of a change, eh?

      Microsoft probably throttled the development and optimisation of DirectX in general. It makes sense.
      Why give a huge graphical boost to the PC as a platform (on which a company like MS makes less profit in general), when you have your own little money-making console? You don't want to make your product look inferior to something else, no sir.

      This is especially true in the present, when HDTVs have become common and there's really no excuse for not building yourself a powerful but quiet PC and hooking it up to your big-screen TV.
      Then get in your comfy armchair with a wireless mouse, keyboard and/or controller.
      If the PC had vastly superior graphics, why would anyone want a console?

    • Kaira- says:

      If the PC had vastly superior graphics, why would anyone want a console?

      I don't know, exclusives and hassle-free setup, perhaps? Unfortunately installing games has become quite common on consoles, and that's one thing I don't want in my console gaming. And then there are updates, which kinda became a loophole for developers to release buggy games and promise to "patch it sometime in the future", which may or may not happen (BioWare, I'm still waiting).

    • AdamT says:

      Is it legit to look at DX9 and later DX versions and complain that they are too much the same, or that developers aren't using the later versions? I think we had a lot of stagnation at DX9 because Vista was so crap for so long that people didn't upgrade. There's a big chunk of the market that can't move past DX9; as a developer, do you want to abandon those people?

    • Tacroy says:

      From what I’ve read, the problem is that DX is CPU-bound – that is, the processor has to tell the graphics card “Do this, do this, do this, then do that”. Instead, what he wants is for the processor to, essentially, load a program into the graphics card that will figure out what to do on its own.

      DX was written for stupid graphics cards that needed to be led by the nose by the CPU. Nowadays we have smart graphics cards, which could be let off the leash to do their own thing – but we’re not doing that, we’re still leading them around by the nose.
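
      If I'm reading the complaints right, the difference is roughly between these two patterns. This is only a hand-wavy, OpenGL-flavoured sketch of the idea, not real engine code; it assumes a GL 3.x context, a bound shader program, a loader like GLEW, and the names are mine:

      #include <GL/glew.h>   // assumption: GLEW (or similar) provides the GL 3.x entry points
      #include <array>
      #include <vector>

      // Pattern A: the CPU narrates every object. One uniform update and one
      // draw call per object, so the CPU is in the loop N times every frame.
      void drawOneByOne(GLint mvpLocation, GLsizei indexCount,
                        const std::vector<std::array<float, 16>>& perObjectMvp) {
          for (const auto& mvp : perObjectMvp) {
              glUniformMatrix4fv(mvpLocation, 1, GL_FALSE, mvp.data());
              glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr);
          }
      }

      // Pattern B: the CPU issues a single call and the GPU stamps out every
      // copy itself, reading per-instance data (e.g. the matrices) from a
      // buffer that was uploaded ahead of time.
      void drawInstanced(GLsizei indexCount, GLsizei instanceCount) {
          glDrawElementsInstanced(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT,
                                  nullptr, instanceCount);
      }

      The data still has to reach the card either way; the point is just how many times per frame the CPU has to pipe up.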

      (disclaimer: I’m a programmer, but I’ve never done anything involving graphics so I have no idea if this is true or not)

    • remover says:

      I hate DirectX. It’s proprietary to Winblows and the reliance on it stops innovative developers from deploying to other platforms because it’s easier or cheaper to just use libraries already built for using DirectX stuff.

    • wrath says:

      @Tacroy Have you been sleeping under a rock? Hardware tessellation – look it up. Besides, all data has to travel through the CPU; that's the point. It goes through the CPU before it goes anywhere else, and even if what he's talking about happens, some amount of data will still have to pass through the CPU.

      @remover I agree, OpenGL ftw.

      I'd say that Richard Huddy is incredibly short-sighted; it's just too easy to blame DirectX for all of PC gaming's woes. Up until recently PC gaming was considered dead or dying. It was never true, but it still meant that PC gaming did not get a lot of love, mostly ports. That's not all there is to it, but if you ignore history and context, it's easy to sell yourself on a blind statement. I bet he doesn't even know what it is he's asking for; it's probably all gumdrops and rainbows in his head.

    • chwynn says:

      DirectX doesn't communicate directly with the hardware, it just communicates with the hardware drivers, which are very uniform.
      It would be a good idea to allow direct communication with the hardware drivers, and DirectX would still be around for those developers that want to use it. It's just like buying something that does most of what you want instead of making it yourself. For the big engines that can (i.e. CryEngine 3, Source Engine, id Tech 5, Unreal Engine 4, Frostbite and similar – I'm sure there is an important one missing from that list), allowing the developers of those engines access to the hardware drivers directly, and not through an API, would grant much more creative freedom to people like Tim Sweeney and John Carmack, who can use all of that extra power.
      By extension, any game built on these engines would also get the associated features and performance boost.

    • viverravid says:

      This article is absolutely full of sh*t.

      The only way the PC can survive as a platform is for developers to have a reasonably consistent target to aim at. Or a series of them – low, medium, high and ultra-high settings enabling progressively more complex features.

      “Direct to Metal” programming is the bad old days of having to tell DOOM exactly which soundcard you have, and having it not work if the developers didn’t cater for it.

      Some DirectX overhead is the price you pay for one set of code that runs on anything, and will run on anything yet to be produced.

      The only reason I can think of for this guy saying this is that AMD has something in the works that they expect to increase their market share for a while. A "direct to metal" paradigm would very quickly lead to the manufacturer with the largest market share dominating the market, as game studios wouldn't put in the effort of writing code for hardware few people have. That becomes a feedback loop cutting smaller manufacturers out of the market, leaving only one winner.

    • chwynn says:

      Don't be daft, they aren't going to do that; they're simply proposing that game engine designers have direct access to the hardware drivers. In a lot of cases the hardware drivers are compartmentalised and general, so it's just a case of writing the engine for the 10 or so different driver systems there are.

      Essentially, what would happen is that the people who write these game engines would have to code their own version of DirectX, but integrate it into the engine, excluding features they don't require and including additional features not supported by DirectX. All the consumer would see in the basic graphics options is AA, AF, quality (low, medium, high, etc.) and screen res.

      As a consumer, this is good for you. As a developer, it's a bit of a ballache, because you would need to code your game engine from the ground up if you wanted direct access to the hardware. But developers wouldn't have to, because DirectX isn't going anywhere in either case: the argument is simply that DirectX should not be ENFORCED. If this change were implemented, you could still program a game engine using any version of DirectX you care to mention, or you could access the drivers directly.

      This change could also simplify the purchasing of graphics cards, because theoretically, as long as the shader model is consistent, any graphics card could run any feature (although granted, not as well as one designed to perform that feature).

      The "direct to metal" approach you are talking about (Doom 3 is your example, wtf?!?) is from when soundcards didn't have drivers (the drivers were directly integrated into the games/programs that used them), and so it was a pain in the ass to code for. That is not what's being proposed. The time before DirectSound was a chaotic one, but DirectSound was first implemented as basically a "universal driver" for soundcards (I know that's not technically true, but it's close enough for the sake of example). DirectSound still exists today, and it still provides support for 20-year-old soundcards directly, but its primary task is to provide a simple way to communicate with soundcard hardware drivers. Because of the massive variation between the hundreds of manufacturers making soundcards, it would be unwise to start providing direct access to soundcard drivers.

      For those that don't know, DirectSound is for sound what DirectX is for video output. DirectX doesn't actually render anything; it just takes what the engine would like to render and passes it on to the driver for the graphics card, in a format it understands. It's essentially a middleman that can speak ATi and nVidia (and 3 others). In the current system there is no flexibility to speak directly to ATi or nVidia; you can only speak to DirectX, which does the translation. What the guy in the article is proposing is that Windows stops blocking direct graphics card access (for approved programs) and allows game designers to talk directly to the hardware, cutting out the middleman (DirectX) IF THEY WANT TO. I'll just repeat: DirectX is still going to be available, and most people will develop on it. But if the 5 major engines want to move to direct hardware driver access, then they should be allowed to. It's better for everyone (except Microsoft, who lose some DirectX market share, and the poor game engine programmers who have to spend another 2 months coding the direct-to-hardware stuff).

      It's like having a dad (Microsoft) who refuses to let you learn French (ATi) and German (nVidia), because you have permanent access to Google Translate (DirectX), even though Google Translate takes a few seconds more.

      If I want to learn French or German, I will Goddammit!

      Also, all this writing, and nobody will read it. :(

  2. zergrush says:

    “Much like I am ten times more powerful than Quintin”

    Quinns hasn’t been introduced to the Pie-and-Dogfood-Walker-Diet yet, I suppose?

    • The Army of None says:

      Walker does hoard all of the metal. I guess you don’t need to be good at healing when you have all the metallurgic power.

    • westyfield says:

      Quinns's anaemia is tragic indeed.

    • President Weasel says:

      Empirical testing has shown that Quinns is the absolute worst of the “RPS Four”, and that Alec Meer is in fact the best.
      By pmpirical testing I mean naming some players in my Blood Bowl team after them, but you can’t argue with science. Iron Quinns is utterly, shamefully, hopeless. Walker is fragile, but game. Alec Meer is the undisputed star player. It’s a classic case of games imitating life.

    • zergrush says:

      Pimpirical testing is the only kind of test worth doing.

    • President Weasel says:

      Pimpirical testing ain’t easy.
      (Also, the edit function just makes a teeny tiny scrunched up box for me today, so I can’t edit),

    • Bret says:

      How did Gillen test, back in the day?

    • President Weasel says:

      Killer Geiron the rat ogre was actually on a different team. He was terrible! Not as bad as Iron Quinns, the incredible non-passing elf, but still more of an asset to the opposition than he ever was to me.

  3. Schmung says:

    All very well and good, but I'm not sure there's a practical upshot to this. Who's got the time and money to invest in writing that level of custom code for a select few (since it's specific to graphics card implementations) when it excludes the vast majority? You'd have to write a separate version for each. There's no money in it, so it ain't gonna happen. DirectX might be holding back performance, but it enables companies to develop for PC in line with consoles at relatively little extra expense. Convergence with the console market is the only way PC gamers will get AAA titles, and moaning about how they're holding us back just ignores the financial realities of the situation.

    • Backov says:

      You know what’s ACTUALLY holding back PC graphics? The XBox 360 and PS3. If they’d go ahead and update their damn hardware already then PC games would go up to that new level.

      EDIT: Wasn’t intended to be a reply to this comment. :)

  4. The Sombrero Kid says:

    I can safely say, as a game programmer, that he is talking complete and utter bullshit; not a single thing he says is true or accurate in any way whatsoever.

    In order to give what he says even a shred of credibility you have to assume he's talking about Direct3D and OpenGL, in other words a stable graphics programming interface. So that then becomes him saying that game developers want to have to rewrite this stuff themselves, and there's absolutely nothing stopping them: if I want to write my own graphics interface layer I can. I don't, because it'd be stupid.

    • Stijn says:

      That was what I thought too, which leaves the question of why he is telling us this nonsense. Does AMD have anything to gain by doing away with DirectX (or OpenGL etc)? Alternatively, does AMD have anything to gain by spreading misinformation like this? It just seems so random to me.

    • mcwill says:

      Likewise, I'd love to know what this particular bit of PC-vs-consoles FUD is all about.

    • stahlwerk says:

      Hear hear.
      Almost every statement he makes sounds really uninformed and very five-years-ago. Since you can create VBOs and shaders in both major APIs, along with deferred shading, and immediate-mode rendering is rapidly going out of style anyway (hence my cringing at the "X000 API calls"*), graphics performance is now mostly a matter of on-GPU memory speed and driver maturity, to the point that the ball is not actually in the API vendor's court anymore.

      *) Edit: rereading it, he referred to "30,000 chunks of geometry", each of which may as well be stored in a VBO. That means an average screen area of 8 by 8 pixels on an HD display for each object, which is quite a lot of visual information. What happened to optimising engines so that stuff that's not actually visible is never issued for drawing, the arcane knowledge of frustum culling and visibility computations?
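
      For anyone wondering, that arcane knowledge boils down to something like this per object, before a draw call is ever issued. A minimal sketch; extracting the six planes from the view-projection matrix is left out:

      #include <array>

      // A plane in the form dot(n, p) + d = 0, with the normal n pointing into the frustum.
      struct Plane  { float nx, ny, nz, d; };
      // A bounding sphere around one "chunk of geometry".
      struct Sphere { float x, y, z, radius; };

      // Classic sphere-vs-frustum test: cull the object (skip its draw call)
      // if the sphere lies entirely on the outside of any of the six planes.
      bool isVisible(const Sphere& s, const std::array<Plane, 6>& frustum) {
          for (const Plane& p : frustum) {
              float signedDistance = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
              if (signedDistance < -s.radius)
                  return false;   // completely behind this plane: not drawn at all
          }
          return true;            // inside or straddling: issue the draw call
      }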

    • bob_d says:

      From the article:
      “Isn’t this going to make life harder for developers?
      ” ‘Absolutely, yes, it would,’ says Huddy, ‘But they love it when we make things hard for them, the little minxes.’ ”
      (Ok, he may not have actually spoken that last bit out loud.)
      I think what he’s saying may be true for the top-tier AAA engine developers (Crytek, Epic, etc. who I imagine are the few companies actively making demands of AMD), but yeah, for the rest of us in the remaining 99.999% of computer game companies, this is hilariously false.

    • Sucram says:

      Even if a developer had the time, you wouldn't want them to write custom code for GPUs. It would be like going back to the days when, if your sound card wasn't listed, your game wouldn't have sound, plus all sorts of .dll hell.

    • Luke says:

      Yeah I’m not entirely certain I even understand what his argument is.

      The bottleneck in the DirectX/OpenGL APIs for shaders and VBOs surely only comes into play at loading time? If there are people out there who don't load the majority of stuff into GPU memory long before rendering… well, I'd like to meet them and ask them why.

      But then, maybe this is what he’s advocating? In which case he’s a little late.
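
      For the non-graphics folk, "load long before rendering" just means something along these lines. It's only a rough OpenGL-flavoured sketch (it assumes a live GL context and a loader like GLEW, and leaves out vertex attribute setup):

      #include <GL/glew.h>   // assumption: GL context + extension loader already set up
      #include <vector>

      // At load time: copy the vertex data into GPU memory once.
      GLuint uploadMesh(const std::vector<float>& vertices) {
          GLuint vbo = 0;
          glGenBuffers(1, &vbo);
          glBindBuffer(GL_ARRAY_BUFFER, vbo);
          glBufferData(GL_ARRAY_BUFFER,
                       vertices.size() * sizeof(float),
                       vertices.data(),
                       GL_STATIC_DRAW);   // hint: written once, drawn many times
          return vbo;
      }

      // Every frame: nothing heavy crosses the bus, the draw call just points
      // at data that already lives on the card.
      void drawMesh(GLuint vbo, GLsizei vertexCount) {
          glBindBuffer(GL_ARRAY_BUFFER, vbo);
          glDrawArrays(GL_TRIANGLES, 0, vertexCount);
      }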

    • pepper says:

      Well Sombre, better get out the ol' assembly books and get started on your low-level programming skills.

      To be honest, I find assembly tremendously fun. Although I'm quite alone in that.

    • arghstupid says:

      Well, “Any performance problem can be solved by removing a layer of indirection.” as some jokey little nerd once quipped. On the other hand I imagine Mr AMD would rather like people to write code that only worked on his graphics cards, so perhaps we should lightly salt his words.

    • ScubaMonster says:

      There’s a much simpler way of looking at this. Replace everything he said with the sentence “I’m an idiot”. That’s all the rationalization you really need to do.

    • Hallgrim says:

      @pepper:
      I had fun writing a program using MASM that made red squares fight black squares, but it was pretty tedious.
      Also, one of the guys in our class wrote over some critical address space and fried his Windows install. That's one way to learn to initialize pointers, right?

  5. Archonsod says:

    Kinda ironic it's AMD saying it. The entire reason for the API is to handle the differences in architecture between companies like AMD and Intel. So as a first step, how about they adopt a unified architecture with their biggest competitor?

  6. Mirqy says:

    “To a significant extent, that’s because, one way or another, for good reasons and bad – mostly good…”

    I don’t think he really hedged that enough. Needs more hedge.

  7. Phoshi says:

    I think there’s one important point that proves everything he said wrong, or at least misguided.

    Graphics aren't the way they are because they're maxing out our cards. There's plenty of additional headroom even on a midrange card. Nothing technical is holding us back, because there's plenty of room there; there is no bottleneck.

  8. Tei says:

    It's probably true that the design of DirectX is old. It's probably true that one of these graphics cards is 10x more powerful than a console.
    About everything else, I can’t comment.

  9. Navagon says:

    All I can say to that is: show us, AMD. Give us a playable tech demo of just what is possible on current hardware without DirectX. Then we'll have a reason to be interested and to take this as something other than an attempt to get people talking more about AMD.

    • OTD Razor says:

      Agreed. Until we see proof of how far DirectX (or OpenGL, for that matter) is holding back performance, this is nothing but talk with an extreme conflict of interest behind it.

    • SuperNashwanPower says:

      +1 agree. RPS, can you throw down the gauntlet?

    • dr.castle says:

      Yes, exactly what I was scrolling down to post. AMD needs to release a downloadable tech demo made direct-to-metal for current gen AMD cards. If it truly is possible to get the kind of performance improvements described here, they need to pony up and show us. I’m sure it would generate an enormous amount of publicity for them if they can make something as technically amazing as Mr. Huddy describes.

    • Ajh says:

      Yes. Prove this.

      And prove it without a prohibitive development cycle. And prove it on something that is compatible with a wide range of Windows PCs.

      Ready, Set, Go!

      When they do it, let me know so I can believe this all too.

  10. trooperdx3117 says:

    Don't know if I agree with him, although that whole part about PCs not being able to draw as many chunks of geometry as consoles does raise an interesting question. How is it that my old laptop (before it burnt out) had 8x more RAM, twice the processing speed and twice the VRAM, and yet it couldn't run games at anywhere near the speed or looks of a new 360 game? Why?

    • Stijn says:

      Because a PC is a general-purpose device, as opposed to a 360 which will always have the same hardware setup, which was put together specifically to run games and has software dedicated to that as well.

    • pakoito says:

      This is the most hardware architecture uninformed post ever. We will be holding an award ceremony 10th May, so start preparing a speech.

    • trooperdx3117 says:

      Alright, I'll make sure to have my old tux ready as well ;)

  11. mkultra says:

    DirectX knows where your children go to school.

  12. Jibb Smart says:

    ‘The funny thing about introducing shaders into games in 2002,’ says Huddy, ‘was that we expected that to create more visual variety in games, but actually people typically used shaders in the most obvious way. That means that they’ve used shaders to converge visually, and lots of games have the same kind of look and feel to them these days on the PC. If we drop the API, then people really can render everything they can imagine, not what they can see – and we’ll probably see more visual innovation in that kind of situation.’

    The API (whether DirectX or OpenGL) by no means encourages a converging graphics style. Shaders are relatively easy to write, fast, and give the programmer a tremendous amount of freedom. Converging graphic styles on brown pseudo-realism has little to do with engines or APIs — it’s simply a style that most developers want to go with. Another factor is developer laziness — re-using popular shaders. Taking away the API isn’t going to help that one bit.
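
    To put "relatively easy" in some perspective, this is a complete (if useless) fragment shader plus the boilerplate to compile it. Just a sketch, assuming a GL 3.3 context and a loader like GLEW:

    #include <GL/glew.h>   // assumption: context + extension loader already in place
    #include <cstdio>

    // About the smallest fragment shader that does anything: paint every pixel orange.
    static const char* kFragmentSource =
        "#version 330 core\n"
        "out vec4 colour;\n"
        "void main() { colour = vec4(1.0, 0.5, 0.0, 1.0); }\n";

    GLuint compileFragmentShader() {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &kFragmentSource, nullptr);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (ok != GL_TRUE) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
            std::fprintf(stderr, "shader compile failed: %s\n", log);
        }
        return shader;
    }

    From there it's the programmer's imagination, not the API, that decides whether everything ends up brown.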

    There are far more important things for a developer to worry about than the API’s very small overhead.

    I reckon they just want developers to try and do everything in OpenCL — yes, it’s another API, but much more general-purpose.

    Jibb

    • fiezi says:

      This.
      Although I have to admit that even though shaders are cool and all, you're still bound to a relatively strict pipeline, which is why we don't see voxel engines and the like being implemented on current graphics cards. Also: even though we have OpenCL, the bottleneck of reading back from GPU memory still remains, and probably will for a while, until mainboard manufacturers come out with a new PCIx standard…

  13. OTD Razor says:

    The premise and article are incredibly flawed, and the bias inherent in this coming from a video card manufacturer lackey makes the amount of credence this article is getting simply disturbing.
    Do we really want to return to the hardware compatibility chaos of the 80s and 90s? APIs like DirectX and OpenGL provide a necessary abstraction from the hardware that helps to ensure security and backwards compatibility. They make development time significantly shorter and cheaper.
    Saying that the game development community at large wants this is ludicrous. It is no surprise that developers like Crytek would LOVE this. Their engines would become the new APIs, as only larger studios could afford the time and money to program to the metal. The overwhelming majority of the game development community can't afford to be programming this way in 2011.
    AMD would love this too because it would be a step towards vendor lock-in, much like Creative enjoyed with the SoundBlaster series before Windows 95 and DirectX audio support became popular.
    Claiming DirectX is holding gaming graphics back on the PC platform is just a load of smeg. Crappy ports from consoles are more the culprit. Second to that, a game developer has to consider what portion of their audience has graphics hardware powerful enough to push pixels in their title. Crytek screwed up royally on Crysis with its super-high system requirements, then blamed piracy for why their sales figures sucked. Meanwhile titles like Minecraft are doing fine with crappy graphics but low system requirements and interesting gameplay.
    The entire computer industry has been moving deeper and deeper into higher level of abstraction from the hardware, and this trend is not going to change. Despite the overhead of such abstraction the benefits outweigh the disadvantages by a massive margin:
    1) more stable code
    2) better security
    3) greater backwards compatibility
    4) access to a larger range of hardware configurations, and not having to account for them
    5) faster development cycles
    It’s pretty clear this is a sensationalist article meant to leverage nerd rage against Microsoft to do something that would be extremely harmful to the health of gaming on the PC. Thankfully most game developers will see right through that and this joke of an idea will fade away in a week.

    Edit:

    Remember back when Microsoft released figures on which driver vendors were causing the most crashes on the Windows platform, and NVIDIA and ATI were responsible for plenty? That was software that interacted with the metal directly, written by the EXPERTS on their respective hardware ISAs. Do you really want to trust the stability of your system to Crytek?

    • bob_d says:

      Of course, the thing is, who even talks to the “GPU relations” guy at AMD? Crytek, and similar companies. So his notion of what “developers” want is likely pretty warped.

    • Lukasz says:

      UGH!

      Crysis never had super-high requirements. It ran on crappy systems on day one, and it looked pretty.
      Of course, if you wanted to max it out you would not be able to play the game until two years after release.

      The stupid misconception comes from people who whined that their beast machine couldn't handle the game at max settings like any other game, that Crysis doesn't look like it does in the videos.

      The truth is that if your computer could handle HL2: Ep2 it could handle Crysis.

    • OTD Razor says:

      I hadn't even had breakfast when I wrote that, so let me rephrase that bit about Crysis vs Minecraft:
      Crytek screwed up royally on Crysis with its super-high system requirements, mediocre gameplay and inconsistent framerates (video settings that were fine during the first chunk of the game yielded unplayable framerates once you hit those snow levels… there is something to be said for developers that manage a consistent framerate through their ENTIRE game… ever heard of r_speeds? This isn't whining about my ub3r l337 system not handling the game, it's about idiotic design decisions that killed gameplay and immersion. It is a reasonable expectation that I should not have to change video settings halfway through the game to make the last half playable! Don't get me started on that lame-ass boss fight =P), then blamed piracy for why their sales figures sucked. Meanwhile titles like Minecraft are doing fine with crappy graphics but low system requirements and interesting gameplay.
      What can I say, I think better after I have a shower =P Sue me.

    • OTD Razor says:

      One more thing…

      This past summer I built a new rig with a Core i7 and a GeForce 470. Often enough, when I upgrade to a new system I find it fun to take older games that didn't run so hot on the older machine and max them out on the new one. Crysis was obviously on that list.

      And you know what? I didn't make it past the second act because I was so bored. Having played more innovative and compelling games (Stalker: CoP was the last shooter to really captivate me), it was such a step backwards I could not continue. Maybe I'll revisit it to see all those pretty graphics, but that would be the only reason.

    • Cinek says:

      @OTD Razor – there are mixed opinions on this; some people just don't get it, like you for example. For me Crysis was a nice, solid game (say… 7/10), not just a tech benchmark as some people appear to see it. Warhead brought a more character-driven storyline to it, making the whole thing even better. I've returned to Crysis 3 times, always having fun with it, and believe me, I'm one of the guys who very rarely goes back to replay titles. And even now I play the Crysis mod called MechWarrior: Living Legends almost every day, probably the best mod ever made.

      But anyway – Lukasz said: "Crysis never had super-high requirements. It ran on crappy systems on day one, and it looked pretty." and I agree with him. At release I was playing Crysis on a motherboard's built-in GPU and it was the best-looking game I had ever seen, even despite playing on mid-low details. So unless you have some sickness forcing you to play on Very High details or die, you should never really have had problems with Crysis's performance. Just set the thing up right. I dare say that in its own time it was one of the best-optimised games on the market, capable of playing well on PCs waaay below the recommended specs, and even much below the minimum.

    • OTD Razor says:

      I didn’t like Crysis so I just don’t “get it”. Right….

    • luckystriker says:

      @Cinek Thank you for linking that mod. That’s completely awesome.

  14. po says:

    Here's an ExtremeTech interview from 2008 with one of the original creators of DirectX, in which he specifically states (in part 2) that one of the problems DirectX has developed is a lot of bloat, which affects PC performance.

    • bluebomberman says:

      Quote from the article:

      “When we created DirectX, there’s a reason it’s call DirectX. It was direct access to hardware acceleration to the developer with very little abstraction and operating system nonsense in the way. So DirectX was meant to be very fast and low-level, and push all the OS bloat out of the way. Over the years that’s been forgotten, so each subsequent generation of DirectX has had more value added from Microsoft, which makes the API more complex, more bloated, harder to understand, and so forth.”

      I don't think any sane person wants to go back to the bad old days of pre-DirectX, where your GPU's compatibility with your game was a huge crapshoot. (Anybody remember 3dfx?) But DirectX has become another example of Microsoft's massive code bloat.

      Sure, consoles force developers to cater to the lowest common denominator. That doesn't mean DirectX (and its distant cousin OpenGL) can't benefit from some serious reinvention.

    • Baines says:

      Microsoft is bad about increasing bloat, and really dropped the ball with DirectX 10. Since they had already decided to break hardware compatibility, they should have gone all the way and used 10 as a chance to completely rebuild and restart DirectX, culling all the outdated material. Maybe it would be troublesome supporting two simultaneous DirectX branches and installs (9 for legacy support and 10 for current and future development), but the work probably would have improved DirectX in the long run.

      Left on their own, Microsoft mostly just keeps tacking new material onto old. Backwards compatibility is good, but that approach leads to an ugly mess of code and often an uglier mess of documentation (Microsoft also seems to care very little about maintaining quality documentation; like code features, they just tack new docs onto the old).

      It doesn’t help that OpenGL development has seemingly stagnated, falling behind DirectX and killing any competition that Microsoft faced.

    • Starky says:

      Erm… Baines…
      …that is exactly what they did – the reason DirectX 10 was not backward compatible with XP was exactly because they removed almost all legacy support and redid their code base from the ground up, and designed it based around their new driver model.
      Windows Vista and 7 have an utterly* separate DirectX 9 built into them, which emulates DX9 as it ran on XP and isn't part of DX10+.
      *Well, not utterly – it's an add-on rather than an inherent part of the API; it could be removed and DX10 would still fully function.

      So basically Vista and 7 have DirectX 11 (assuming you updated), DirectX 10, 10.1 AND DirectX 9 installed separately.

      DX10+ doesn't work on XP because it is based on the new driver model, which would have required XP to be rewritten from the bottom up to support it, and would probably have broken 8+ years of hardware drivers. You could emulate DX10 through DX9c (as many hackers tried to do), but it would be slow, buggy and crash-prone.

  15. Moleman says:

    Also, you have to take into account who a top-level guy at AMD is going to be talking/pandering to: mostly the big companies that still produce commercially licensed engines that push the envelope on graphics – id, Epic and Crytek, basically. And losing a shared Windows API and moving to bare-metal programming would likely change the best-case PC AAA development scenario to "buy one of the current engines from the big boys and use their tools." Everyone else sees increased hardware capability and graphical fidelity as increased development costs, not a profit center; there's a reason development costs have skyrocketed while the average length of a game has gotten shorter.

    Actually, it may be nostalgia for the older state of affairs from people who profited from it- I can remember around 2000, when nearly every high profile game was either Unreal or one of the Quake engines, and upgrading your graphics card was something that everyone did, not just hobbyists (we’re ignoring the fact that 2000’s “everyone” and now’s “hobbyists” are mostly the same people, but I think it holds). Back then, a new graphics card chipset was a BIG deal, but I can’t say I look forward to the return of “buy a $50 game, and a $400 graphics card to run it.” A lot of developers opted out of the PC arms race for the consoles back then, and we got several years of stuff that was crippled to run on the PS2/Xbox 1.0, not a new golden age of PC development.

    • bob_d says:

      Yeah, absolutely. It makes sense for the top-tier engine developers to want this; not so much for everyone else. Making things “harder for developers” actually benefits them. I wonder if he realizes how warped his perceptions (of what developers want) have become based on the particular developers he’s talking to?

  16. mashakos says:

    “head of GPU developer relations”?
    What this genius basically said is:
    Screw language, let’s all just do things 10x faster by pointing at things we want and grunting!
    Except not everyone might get what you’re on about, so you’d have to work on new “gestures” and “grunts” in addition to your existing pointing and grunting. Not only that, for every other person you meet who misunderstands your latest gesture or grunt, you will have to figure out a grunt/gesture that this person can comprehend.
    Way to go n00b.

    EDIT:
    This is all pointless anyway, since the graphics layer in Windows has been abstracted since Vista. So to go this route your game would require an OS other than Vista or Windows 7. Windows XP maybe? Or DOS?

  17. stahlwerk says:

    “If we drop the API, then people really can render everything they can imagine, not what they can see – and we’ll probably see more visual innovation in that kind of situation.”

    This statement has a weird “kill em all, let god sort em out” subtext to it, doesn’t it? Also I disagree with the romantic/neocon notion that only hardship => innovation.

    • bob_d says:

      Who wants “tools” that make the job “easier”? Spending all your development resources making things work, that’s what real developers do!

    • ScubaMonster says:

      That’s one thing I never understood. In different game programming forums I’ve read people talking/asking about programming things in DirectX basically from scratch. They wanted to make their own engine. I don’t see any reason to do this instead of using an existing engine, especially if you’re a newbie/amateur. Just use UDK, XNA, etc. and call it a day.

      Note: I’m not talking about companies, but hobbyists. Though I realize that’s not really relevant to the discussion.

  18. AbyssUK says:

    But wait… wouldn't it make sense, then, if AMD/Nvidia released hardware-specific code for their cards to aid developers? i.e. Option 1 = use bog-standard DirectX shader Y, or Option 2 = use shader Y (with bells on) using hardware-specific machine code.

    So DirectX draws its screen, then the machine code draws the extra bits on before displaying… sort of like adding a bit of assembly into your C code to improve speed… does this make sense?

    Is this bloke basically saying you cannot do this at present because DirectX won't allow it? That seems very dumb if it's the case… I'm sure it's not, so surely it's then the graphics card manufacturers' fault for not releasing decent hardware-specific machine code support.

  19. Down Rodeo says:

    Ok, well the article has a few more quotes that deal with some of the issues raised here; I think one of the things he’s talking about is the procedure for rendering calls which requires elevated permissions with DirectX. So in that sense bypassing that step could have significant gains.
    On the other hand, I would not ever like to attempt such a thing. The whole point of an API such as DirectX is to make your game engine run on as much hardware as possible with as little tweaking as possible. Fair enough, increased performance, but is it worth the extra horrific mess? One of the guys from Crytek quoted in the article said he would like something like a smaller API, one that sat a little closer to the hardware.
    What’s the word on OpenGL in all this? I know it avoids that problem via its design but OpenGL 3 was a bit of a disappointment…

  20. yhancik says:

    Because the past 10 years have showed us that being able to render X times more geometry chunks has made games X times more interesting, engaging and fun.

    • Pointless Puppies says:

      Because in the past 10 years the only thing that has ever evolved is graphics.

  21. cfp says:

    The number of draw calls is not a real limiting factor in any well designed engine. By batching draw calls you can get the number down to the number of materials in your scene, and if you’re prepared to have a little memory overhead you can even get by with just a handful of materials via texture atlasing and a per-vertex “material-id” parameter. Maybe you don’t want this extreme degree of batching because you want to maximise the benefits of z-culling, but even if you avoid batching together meshes that may possibly overlap in screen space, you’re still going to have well under 1000 draw calls.
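
    In case that sounds abstract, the core of the idea fits in a few lines. A toy sketch in plain C++ (submitDraw() and the numbers are made up; they just stand in for whatever the real renderer does per draw call):

    #include <algorithm>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // Toy model: each mesh records which material (shader + textures) it needs.
    struct Mesh { int materialId; int triangleCount; };

    // Stand-in for the real per-draw-call work (state changes, API call, etc.).
    void submitDraw(int materialId, int triangles) {
        std::printf("draw call: material %d, %d triangles\n", materialId, triangles);
    }

    // Sort by material and merge runs, so each material costs exactly one draw call.
    int drawBatched(std::vector<Mesh> meshes) {
        std::sort(meshes.begin(), meshes.end(),
                  [](const Mesh& a, const Mesh& b) { return a.materialId < b.materialId; });

        int drawCalls = 0;
        for (std::size_t i = 0; i < meshes.size(); ) {
            int material = meshes[i].materialId;
            int triangles = 0;
            while (i < meshes.size() && meshes[i].materialId == material)
                triangles += meshes[i++].triangleCount;
            submitDraw(material, triangles);
            ++drawCalls;
        }
        return drawCalls;   // number of distinct materials, not number of meshes
    }

    int main() {
        std::vector<Mesh> scene(10000, Mesh{0, 12});   // ten thousand crates...
        scene[42].materialId = 1;                      // ...and one odd one out
        std::printf("%d draw calls\n", drawBatched(scene));   // prints 2, not 10,000
    }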

  22. Eclipse says:

    That guy is a bit full of shit.

    First: DirectX is widely used on the Xbox as well. Second: DirectX is NEEDED on the PC, as it serves as an abstraction layer over the hardware. And why does the PC need an abstraction layer? Simply because there are TONS of different graphics cards.

    There will always be a need for libraries such as OpenGL and DirectX if you don't want to deal with every piece of hardware possible.

    DirectX and OpenGL are not optional. If you discard them, you still need to write something very similar on your own.

    Being an AMD manager (AMD = OpenGL supporter, manager = someone who doesn't know anything about real development) puts him in the worst position ever to talk about APIs and programming.

    • Eclipse says:

      Also, every PC gamer knows what's holding back PC games: console games.
      Every new PC game is still DirectX 9-based, so developers aren't even fully using the newest DirectX APIs, let alone being restricted by them.

    • Kaira- says:

      "Also, every PC gamer knows what's holding back PC games: console games."
      And here I was thinking it was developers. Silly me. Those bad consoles just want to ruin all of our fun.

      "Every new PC game is still DirectX 9-based, so developers aren't even fully using the newest DirectX APIs, let alone being restricted by them."
      And why should they change that? Windows XP 32-bit is still the second most used OS on Steam, and with XP 64-bit that comes to a total of 23% of Steam users. That's quite a lot of people's money to say "no thanks" to.

    • Soon says:

      One million crates of power!

    • D3xter says:

      Nowhere did he say they should *drop* DirectX or, god forbid, APIs altogether. All he was saying is that the possibility, for those few who hopefully know what they're doing and are in the business of making graphics engines (and licensing them to other devs), to code at the hardware level would be a welcome change and an apparently "asked for" feature. Nobody said that every software and game developer should develop solely in assembler from now on. The guys making the engines and the people with the proper knowledge (like NVIDIA/ATI themselves) would handle that and either develop their own API and/or tools…
      The graphics market nowadays isn't what it was in the days of yore: it's mainly NVIDIA and ATI/AMD making the GPUs (even the console ones), which for a large part are very similar and can be handled with a unified driver over several graphics card generations. Of course, if not handled right, that would be an insane source of system instability…

  23. mollemannen says:

    are you sure it’s not 10.1x or even 11x stronger? :P

  24. Metonymy says:

    I'm often amazed by how young and easily influenced the users of the internet/this site are. When it was necessary to optimize code line by line, hand-pick the order in which variables were placed on the stack, choose the perfect syntax so the interpreter wouldn't waste time carrying out unnecessary instructions, and hand-code the critical bits in pure assembly, just to generate crossword puzzles in a reasonable amount of time, great coders had to know a lot of stuff.

    These days you can pay your tuition, get an 'obedient slave' degree that shows you how to use libraries that other men have written for you, and write serviceable code with zero understanding of the hardware, the API you're using, the subtleties of the compiler, etc.

    What's funny is that there are plenty of recent examples that are well known, so there's no excuse for this ignorance. Every coder should be familiar with NESticle, one of the earliest NES emulators, which had a graphics renderer written in pure assembly. The thing would run silky-smooth on a 486.

    I hate to say something so tediously obvious, but if you ever need software (unlikely, I know) that really takes advantage of the hardware you are employing, you will be compelled to know what you're doing, and you will not be able to resort to an 'easy mode' like DirectX. This writer is not incorrect, he's just older and wiser than you. Thankfully, what he is saying is beyond you, and you will never have to worry about it.

    • pakoito says:

      Now, back in the pragmatic world, nobody wants to return to THAT. You don't want to multiply coding time on the engine just to make the space marine look prettier, neglecting the game content and raising the budget through the roof.

      What you described is still there, and they are called "engine makers" (not accurate, mind my French). They make the middleware or libraries needed for game makers to make their games. If someone went and made OpenDirectX and it was good, everybody would switch, no doubt. Meanwhile, let things stay easy and keep people out of assembler.

      PS: I'm doing my Masters in Comp Science, not in Videogame Dev or anything.

    • OTD Razor says:

      Wow Met, I have a background in x86 assembly and I still find what you said silly. Big game engine developers can afford to throw time and money at writing code in assembly… the overwhelming majority of developers cannot. Pretty graphics are a waste of time if the gameplay sucks, and when you are on a budget something has to give.
      Using libraries does not make a programmer weak, it makes them smart. There is no reason to re-write and debug what someone has already done (that last one especially… using code that has been tested well outside your application's little world leads to more stable programs). Making reusable code is a best practice of programming. Libraries do just that, as do APIs like DirectX.
      I understood fully what Huddy had to say, and it's still BS designed to push hardware vendor lock-in and make engines like Crytek's the new DirectX.

    • Metonymy says:

      Sure, I can accept what you’re saying, but just as long as we’re clear about exactly what it is you’re saying: This is a BUDGET limitation, and has nothing whatsoever to do with the quality of the code, the final product, or the efficiency with which the hardware is used.

      I have to laugh a little at your implication that more money is going to improve the game itself in any way, especially when I’m sure that isn’t what you meant. Sure, it’s nice to go through additional iterations of debugging and testing, and maybe have more time to look things over, more eyes to criticize the work. Ultimately, however, it’s the genius of the original design that determines whether the game is good or not. As you can no doubt attest, big budgets do not equal great games, especially if we remove the teenager vote.

    • Gary W says:

      Ahh, there’s nothing like an obscure graphics routine authored by a self-educated manly Man, preferably in raw machine code. They’re an indispensable part of the modern videogame and definitely deserve their place alongside Einstein’s tensor theory of gravity and Milton’s “Paradise Lost” as some of the greatest achievements of our, or any, age.

      (This important message has been brought to you via the “Programmers Without Liberal Arts Education” Employment Union).

    • OTD Razor says:

      I thought I was pretty clear, Met. Let me add some more clarification: yes, it is a budget limitation. The crazy thing is, though, budgets MATTER in game development. Both money and time. Writing graphics code in a target GPU's assembly would easily stretch a game's development time. Smaller developers can't do that and get away with it. Long dev cycles mean waiting longer for a paycheck, which is one place money comes in.

      In magic unicorn land we could spend this extra time but programmers generally operate in the real world.

      And I certainly was not implying more money = better gameplay. I'm not sure why you bothered to say that and then add "I'm sure that isn't what you meant." What I was saying is that development teams have finite resources, like time and money. Spending more resources on graphics may come at the expense of gameplay. That is up to the development team.
      I’m not sure what you meant by the original design, but on the subject I can say this: A good deal of the best titles resemble little of their original concepts. Read the DOOM Bible for an example, or many of the post-mortems on Gamasutra. As a game is being built features are both added and dropped, and sometimes completely different avenues of gameplay are thought out well after the original concept.
      If I’m off on what you meant by original design, please disregard.

  25. LazyGit says:

    Seems like there are a lot of young people commenting on this who weren’t around in the days of Glide. The difference between Glide and DirectX back then was huge and there were many complaints at the time that the platform was losing a lot by going from separate APIs to a unified one.

    It’s also interesting that so many experts here know so much more than the creators of Crysis.

    • mashakos says:

      I'm not old enough to have been able to afford a PC with a kick-ass Voodoo PCI card back in '96, but I vaguely remember that 3dfx was assimilated into either the Nvidia or Microsoft Borg cubes.

      Either way, what you're talking about doesn't seem to be relevant, since:
      a) The DirectX you remember no longer exists, and in fact former 3dfx staff helped rebuild DirectX, either directly or through their hardware vendor overlord.
      b) Nobody likes format wars. We're not talking about 3dfx vs. ATI here; two different APIs essentially mean that the PC is composed of two completely separate, incompatible platforms. Yeah, being able to only play HALF the PC games out there, great for the industry!

  26. geldonyetich says:

    Sounds to me like something that could be resolved readily enough if Microsoft’s DirectX guys put some more emphasis on speed, efficiency, and harnessing the hardware better.

    If PC hardware is truly 10x faster than a platform that is beating it out, any reasons cited (such as security or compatibility concerns) are weak excuses. There’s no good reason an API should slow down the ability to utilize hardware by a ratio as drastic as 1:10.

    It’s probably deliberate. The existence of the XBox has set up a rather obvious conflict of interest for Microsoft. They would only be sabotaging themselves to facilitate the PC rapidly outperforming their gaming consoles.

    If that’s how it’s going to be, perhaps we ought to demand that our PC graphics API come from a company that does not have a conflict of interest.

  27. noodlecake says:

    PCs are 10x better than consoles!!? Lies!! Mine bloody isn’t! How much do you have to pay for one of these so-called 10x-better-than-consoles PCs? My portable one costs £600 and games look nowhere near as good as they do on my Xbox 360. Which is broken, might I add! I miss it. :( I’m stuck with this three-times-as-expensive but half-as-good gizmo here.

    I think that maybe the fact that most people aren’t incredibly rich or nerdy enough to have thousands of pounds spare to spend on computers that are actually that amazing is what is holding back PC games.

    • Lilliput King says:

      Do people read the articles anymore?

    • pakoito says:

      He bought a 900€ laptop to play, I don’t think he needs to.

    • ScubaMonster says:

      Regardless, his point does remain that generally speaking, a decent gaming computer is going to cost you more than a console. You can buy a 360 anywhere from $199 to $399 depending on what bundle you buy ($399 one is the 250gb Xbox with Kinect). I find it doubtful you’ll build a good gaming rig for that price.

    • Starky says:

      Maybe not, but for $500 you can EASILY build/buy a mid-range PC (base unit only) that will handle any game you care to choose at full 720p (1280×720) with higher-than-Xbox settings (texture detail, view distance, anti-aliasing). 720p is what most games run at on the 360 – hell, most of them are actually rendered at much less than that and upscaled to 720p.

  28. db1331 says:

    Being a PC gamer feels like going to the water park with your little cousin. You want to ride the cool new mega-slide attraction, but you have to stay in the lazy river, floating around the same shit all day because your cousin (Xbox) isn’t tall enough to go on the new ride. Also, there is pee everywhere.

  29. Gravious says:

    So… DirectX is broken? Umm… fix that, then? Not reinvent the wheel?

    When I see games like Dead Space 2 running in full 1080p at 60fps+ with 32x FSAA on, with quality exceeding the consoles, on a current-gen mid-range card (560 Ti), it makes me angry when people still compare PCs to consoles.

    I kicked on the Nvidia 3D stereo for kicks as well; it still didn’t drop from 60fps.

  30. Hunam says:

    To be honest, the only way we’d get the full use of the cards is if AMD/Nvidia only released one every three or so years and each game was made for that single card. Instead, they should design their cards to map more closely onto the DirectX/OpenGL APIs, to get more power out of those systems.

  31. Sfyreas says:

    I agree with most of the comments. Turning DirectX into the scapegoat for why PC games aren’t what they should have been is naive and probably bullshit. My view on the subject:

    1) Not every PC is high-end – quite the contrary – and if the developer/publisher wants a wide audience they’d better make sure they support at least the average rig. That means tailoring the game to be flexible performance-wise (sacrificing visual quality) without breaking the game mechanics (or introducing bugs). That’s where an API comes in handy, generalising the procedure as much as possible and forming a (well, not totally) hardware-agnostic wrapper.

    2) AMD and NVIDIA, if they’re not happy with the current situation, have all the means necessary (budget + personnel) to make an API of their own that utilises their hardware to its full potential. Of course that means working closely with Microsoft, since they don’t have control over the underlying low-level internals of the OS.

    3) Even better (although it may sound crazy), they could form a committee, to standardize and develop a gaming-oriented OS and tailor it as they wish, circumventing Microsoft all together. Let me elaborate:

    It could be open source / moddable, standardizing and supporting only the bare essentials needed to fully utilise their hardware (low-level graphics API, high-level wrapper, networking, etc.) and removing all the bloat that comes with a general-purpose OS.
    PROS:
    – A unified gaming platform embracing all PC users (even Macs, if Apple jumps on the wagon)
    – No bloat
    – Total control over gaming-essential hardware features, maybe better anti-piracy features
    – Moddable / scalable
    CONS:
    – The gaming-specific nature means little or no support for non-gaming applications -> frequent switching between OSes
    – Backwards compatibility is a significant problem; older titles would require a significant amount of work to be ported
    – It’s a leap of faith: it requires a lot of determination and cooperation between competing entities, and there’s no guarantee it will ever be embraced (plus I doubt Microsoft would ever let something like this happen)

    TL;DR, If AMD or NVIDIA are unhappy with the current situation they have all the means necessary to make something different. Blaming DirectX is not a solution to any problem, especially when DirectX isn’t the actual problem.

  32. jonfitt says:

    As someone who navigated the dark days before DirectX, I can only say:
    Nooooooooooooooooooooooooo!

    It was such a relief when everybody standardised on DX. Now it may well be that DX fundamentally needs to be improved/rewritten but going back to vendor specific architectures sounds like a nightmare.

    In fact I’m pretty sure DX needs a kick up the arse, but perhaps an open source joint venture between graphics manufacturers is the way to go.

    • OTD Razor says:

      That open source venture already exists. It’s called OpenGL. It’s up to devs to use it.

    • Starky says:

      Except OpenGL is pretty far behind DirectX, mired in committee design (lots of talking, no action) and sat on top of a horribly broken, ancient code base that needs to be thrown out, gutted and redone from scratch.

      Microsoft had that problem with 9c, but at least they had the balls to take action, even if it got them a lot of hate from users: they gutted it for DX10 and decided not to make it backward compatible (and therefore didn’t have to fill it with a load of legacy junk). I think that’s now starting to really pay off with the Windows 7 uptake and the general uptake of DX10+ graphics cards.

      Microsoft is sprinting ahead of OpenGL, and OpenGL doesn’t seem to be speeding up – quite the opposite, in fact.

    • OTD Razor says:

      I guess I was being too subtle =P I’m fully aware of why OpenGL is behind the times. To my point, any competing API will have those same issues if it’s built by the agreement of many competing vendors. Look at OpenGL extensions to see what I mean.
      OpenGL 4.1 has for the most part feature parity with DirectX 11. But look how long it took to get there.

  33. Premium User Badge

    Andy_Panthro says:

    [some stuff about House the TV show, that I’ve redacted in case it’s spoilerific]

    Also, if anyone would like an interesting take on House reviews, check this out:
    http://www.politedissent.com/house_pd.html – Medical reviews of House, by an American GP (don’t read on too far though, spoilers abound).

    [edit] That was a reply to a post that seems to have disappeared. Odd that! Have amended it somewhat.

  34. Jabberslops says:

    Quick! Somebody contact Saint Carmack and have him talk in Nerd-Geekian! We must have truth!

  35. Hoaxfish says:

    Okay… DirectX is a layer of “bureaucracy” that slows down PCs… but isn’t Windows an even bigger bureaucracy?

    It’s an age-old problem: maybe Microsoft really does need to gut Windows in order to get it back in trim… something they repeatedly try to do, but never really ship for ordinary users (backwards compatibility being one of the big issues… though hopefully virtualisation should cure a lot of that).

    There’s all kinds of weird little experiments that they do with “kernels” and “libraries” and microkernels and exokernels, etc… but then go “oh, no, not for the public”. Well, that’s cool, but wtf is the point if it’s never for the public. “research into not giving stuff to the public”.

    More power isn’t really a solution to anything… stuff like id Tech 5/Rage shows that even with technical constraints (or forced by them), people can actually look into “other solutions” and start breaking into new areas of development that can easily benefit everyone, without requiring the latest and greatest piece of hardware.

  36. Freud says:

    Why is anyone surprised that AMD, a company that is in the business of selling graphics cards, wants a market where game developers make games which require better graphics cards?

    I have been a gamer for a long time and constantly playing catch-up sucked. I am happy that we have reached a plateau for the time being and I don’t have to upgrade my computer to run Crysis 2 at more than 15 frames per second. The days when 3DMark ruled sucked for the most part.

    I’m all for scalability, and if developers want to add more optional bells and whistles, more power to them. But the fact that most PC gamers manage to get good performance out of most games these days is a positive thing. It makes the PC a better platform for gamers and developers.

    • D3xter says:

      What I find more surprising is why NVIDIA and ATI/AMD don’t put any more action behind their words, aside from helping certain developers and paying them to include certain features, when it is something so clearly interconnected with their business model. Why not found their own development teams to make HQ-graphics games, or buy a few? I mean hell, PhysX managed to do what they apparently aren’t able to (CellFactor, Warmonger etc.) in what… 2 years? They literally came out of nowhere until they got bought by NVIDIA, and now their features are being utilised for spider webs and banners in Arkham Asylum… It would be the perfect synergy. Even Intel tried when they bought the Project Offset guys to launch with Larrabee (who were let go when that ultimately failed, but anyway)…
      I’m actually yearning for the times when there was still innovation and I had to buy a new graphics card every 1 1/2 years if I wanted to keep up, but there’s just no reason to with the consoles nowadays…

    • Premium User Badge

      Cinek says:

      What I find more surprising is why NVIDIA and ATI/AMD don’t put any more action behind their words, aside from helping certain developers and paying them to include certain features, when it is something so clearly interconnected with their business model. Why not found their own development teams to make HQ-graphics games, or buy a few?

      These are very good questions. Even a story-less game with the simplest mechanics would do the job as a pure tech demo – a very basic racing game or FPS, maybe? Keep it very simple, release it for free, and SHOW THE DEVs WHAT YOUR GPUs ARE CAPABLE OF! Something outmatching the consoles by all these years, and something to move this held-back market forward.

      The only reason I can see why it hasn’t happened yet is funding – A++ graphics cost money and time… maybe nVidia doesn’t have either of those, so we won’t see anything of the sort, but IMO that’s something the market needs: a kick in the ass, away from the consolish look and from using blur as the ultimate 3D effect for everything (lol UnrealEngine).

    • OTD Razor says:

      Hell, take it a step further: maybe they should consider a full-blown IDE for game development, something comparable to Unity or UDK.

      Still, I fear returning to darker days of having to worry much more about compatibility.

    • Premium User Badge

      Cinek says:

      Yea, the outcome of this most likely would be “nVidia-only” games and “ATI-only” games. ~_~

  37. rawgr says:

    whatever happened to the linux as a gamer-platform argument…..

    JEEEEEEEEEEEEEEEEEEEEEEEEEEEZZZZZZZZZZZZZZ
    SO MUCHHHH PAIIIIIIIIIIIN!

  38. cliffski says:

    yup, agree with everyone else, this is total bullshit from someone who has forgotten what is involved in making a compatible, bug-free modern PC game.
    DirectX and its standards are probably the best thing Microsoft ever did. It doesn’t remotely get in the way. Card manufacturers need to concentrate on actually writing drivers that are consistent and stable, rather than digging at DirectX.

  39. kit89 says:

    What the gentleman says seems reasonable enough.

    Hardware is controlled through layers of abstraction; the higher your level of abstraction, the slower it will generally be. Programming on a standard CPU goes through layers of abstraction, something like so:

    ( Interpreted Languages ) : Java / JavaScript ( Slow )
    ( Compiled Languages ) : C / C++ ( Fast )
    ( Assembly Languages ) ( Fastest )
    ( CPU : Coal Face )

    Programming in a compiled language like C or C++ is exceptionally fast; however, it does not always come close to the performance you can gain by programming straight in assembly. Programming in something like C++ does give a greater level of portability (moving from the x86 architecture to x86_64 or PPC, for instance); assembly doesn’t necessarily.

    In many cases most of the code does not need to be written in assembly, as there is no need for the added performance. In some cases, however, there is. Certain small parts of Doom 3, for example, were coded in assembly to get the required performance.

    The reason most code is not written in assembly is that it is not very intuitive. Unlike higher-level languages, assembly is nowhere near as human-readable.
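
    [Illustrative sketch of the trade-off described above: the portable C++ loop runs on anything a compiler targets, while the x86-only version uses SSE intrinsics – one step above raw assembly – and trades portability and readability for throughput. The function names are made up, and a modern compiler may well auto-vectorise the portable loop anyway.]

```cpp
// Illustrative only: the same operation written portably and "closer to the metal".
#include <xmmintrin.h>  // SSE intrinsics (x86/x86_64 only)

// Portable C++: works on any architecture a compiler targets (x86, x86_64, PPC...).
void add_portable(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// x86-specific: four floats per instruction, but no longer portable or very readable.
// Assumes n is a multiple of 4 and the pointers are 16-byte aligned.
void add_sse(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_load_ps(a + i);              // load 4 floats
        __m128 vb = _mm_load_ps(b + i);
        _mm_store_ps(out + i, _mm_add_ps(va, vb));   // add and store 4 at once
    }
}
```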

    If we look at the GPU instead of the CPU, the levels of abstraction are much the same; however, you cannot get access to the assembly level. So if you have a particular shader that needs low-level optimisation to run smoothly, you can’t do it. You either remove the shader entirely, or opt for a less heavy-hitting one.

    The gentleman does make sense.

    kit89.

    • kyrieee says:

      He wasn’t talking about writing shaders in assembly, he was talking about not using function calls that have to be interpreted by drivers. That’s how I read it anyway.

    • Premium User Badge

      stahlwerk says:

      I know this wasn’t the point you made, but I find it worrying that people still would forego the portability, comfort, power and maintainability of “interpreted languages” because statistics from 1999 say that the Sun JVM is slower than native C. This is 2011, we have JITters like Hotspot for Java, Groovy, the .Net/Mono-CLI with F#, C#, and other “scripting” languages like Ruby, Python, and for graphics or high-performance computing OpenGL and OpenCL bindings in nearly every host language under the sun. Even Javascript compilation has improved considerably over the last two years, with browsers even able to do real time 3d graphics in WebGL without breaking a sweat.
      Sorry for that rant, I’m just allergic to the notion of ASM still being proposed as a silver performance bullet, when it’s not even possible (or at least sane) to program multiple threads with it. How many lines does it take in C# 4.0/PLINQ? How many in ObjC with Grand Central Dispatch?

    • kit89 says:

      You are correct kyrieee, I misread the last section of the article.

      The basic premise from my initial post is the same, though: instead of going from compiled to assembly, it’s going from interpreted to compiled. Though it would be nice to get to the assembly level. :)

      As it stands, when a developer compiles their code the DirectX or OpenGL shaders are compiled down to a “byte-code”; these instructions are then passed to the graphics drivers and translated into instructions the GPU can understand.
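
      [Loose OpenGL-flavoured sketch of that hand-off: the application ships shader source (byte-code, in Direct3D’s case), and the vendor’s driver produces the actual GPU instructions at run time – the step nobody outside the driver gets to hand-tune. Assumes a created GL 2.0+ context and the GLEW loader; illustrative only.]

```cpp
// Sketch: the developer hands shader source to the API, and the driver
// compiles it into GPU-specific instructions at run time.
#include <GL/glew.h>   // assumes a GL 2.0+ context and GLEW initialised

GLuint compileFragmentShader(const char* source) {
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &source, nullptr);  // hand the source to the driver
    glCompileShader(shader);                      // the driver does the real compile

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {                          // compilation failed inside the driver
        glDeleteShader(shader);
        return 0;
    }
    return shader;                                // the final GPU code stays opaque to us
}
```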

      As a loose example, think of Java/JavaScript as OpenGL/DirectX, the Java Virtual Machine as the graphics drivers, and the GPU as the operating system. Just as a developer would not use Java for high-end games, it stands to reason that a developer would also want to bypass the inherent slowness of the extra interpretation layer for graphics processing. Lower-end games, though, would happily run on top of the additional abstractions.

      It is possible for hardware manufacturers to standardise on a processor architecture. For instance, Intel and AMD are standardised on the x86 architecture (32-bit processors) and x86_64 (64-bit processors). The same could happen with Nvidia and AMD, though the advantages and disadvantages of that are for another topic.

      From a developer’s standpoint there could be little difference (if done correctly): OpenGL or DirectX could still be used, compiled as normal, but instead of being passed through a graphics driver it would be processed directly by the GPU – potentially in a similar way to how work is passed from the PPU to the SPUs in the Cell processor.

      kit89.

  40. dreamkin says:

    Of course what he says is moronic. But do not forget that AMD owns ATI. The industry is moving towards on-CPU integrated graphics solutions; Intel is working on its own graphics hardware and so is AMD. AMD’s only shot at being the number-one system-on-a-chip developer would be if it somehow became the de facto standard CPU manufacturer for gamers. In an ideal world for AMD, that would mean games working perfectly on AMD (ATI) hardware. This is pretty much like returning to the pre-DirectX era.

  41. Gaytard Fondue says:

    There is no ATI. It’s been almost five years. The same way as Constantinople isn’t the capital of the Ottoman Empire anymore.

    • Hoaxfish says:

      and AMD is an anagram for MAD… I’m not really sure which had the stronger brandname (i.e. for quality etc, not just name recognition)

      I just wish they’d adopt a clearer naming scheme for the graphics cards

  42. Pointless Puppies says:

    Microsoft dropping DirectX? Hah, yeah sure. That’ll be the day. Does Huddy realise that the entire purpose of the Xbox in the first place was to promote DirectX? You know, the original name of the console being DirectX Box and all? When a specific technology is baked into the console’s freaking name, I have a hard time imagining a scenario where Microsoft “buckles under pressure” and takes it out.

    You know how Nintendo says they’d never make games if, for some reason, they couldn’t make consoles anymore? I believe it’s the same for Microsoft. Without DirectX, there would be no DirectX-box.

  43. zimbabwe says:

    I want to say thank you to RPS for defining API… although I know what it means, often when I read a blog I have NFI (NO DARN IDEA) what they’re talking about, because they throw around acronyms like everybody already knows them.

    OT: seriously, I grew up in the bad old DOS days… that wasn’t fun. Can anybody argue that we need to go back to having different versions for different cards? Get a grip and love living in the future.

  44. Meathead says:

    This article is pretty uninformative. I’ve yet to see an example or a percentage figure for how much of our GPU power the evil M$ DirectX eats up. The only figures mentioned in the article are draw-call counts, which have almost nothing to do with anything unless there’s something wrong with the game dev.

    Even my ageing mid-range Radeon 5850 can run most cross-platform DX9 games at native 1080p at twice the frame rate of a console pushing out its usually upscaled 720p. Once you add in the processing power taken up by the DX10/11 features that some new games have hurriedly added to their PC versions, that’s eating up even more GPU power. So it’s not like your PC graphics card is being throttled to Xbox 360 speeds so Microsoft can sell more consoles.

    Some developers (Crytek and DICE come to mind) would like more direct access to the video card, but that’s hardly saying they want DirectX to go away, and I’m not even sure that’s a good thing for us consumers (see the shiniest, awesomest effects in Battlefield 3 ONLY with an AMD video card!!!!).

    • adonf says:

      It’s the CPU that is limited by the API, not the GPU.

      And yes the PC can run console ports faster with a higher resolution but the gain in performance is not as much as you could expect from the raw hardware performance difference between PC and console. I think that’s what Huddy meant here.

    • Premium User Badge

      Catmacey says:

      Sorry. Hit the wrong reply button. Comment moved to its own thread.

  45. Nallen says:

    Is this not the exact same thing from the Sunday Papers?

    I’d like to thank Jim for drawing it to my attention…

  46. adonf says:

    I’ve done some D3D programming on both PC and Xbox in my time, but I’d never heard of this 3,000/30,000 dichotomy in the number of draw calls. It seems odd, especially on the Xbox 360, which uses DirectX. Does anyone have references for this? I’d like to understand why the PC is limited.
    Oh, and also: I’m sure that Linux driver programmers would be thrilled if GPU manufacturers released information on how to program their chips. Game programmers… not so much.

    • adonf says:

      (Yes I’m replying to my own question after some Googling, in case anyone here is interested in this topic)

      From what I gathered, D3D on the PC does a lot of validation that it doesn’t do on the Xbox 360. I think that’s because the PC runs a general-purpose OS that cannot be allowed to go down because of a programming error (console games must pass certification tests, so there is some validation, only not at run time), and because D3D on the PC runs on a lot of different hardware, while all Xboxes are the same.

      This validation takes a lot of CPU power and adds a lot of overhead to each API call. Things have improved in DX10/11 but most PC games are still using DX9.

      As a side note, if a game runs like crap on a PC that is 10 times more powerful than an Xbox 360, that’s usually because the PC graphics programmers did a straight port of the code and did not limit the number of API calls.
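
      [Hand-wavy sketch of what “limiting the number of API calls” can look like: every glDraw* call crosses the API/driver boundary where that validation happens, so folding many identical objects into one instanced call pays the per-call cost once. Mesh, bindMaterial and bindInstanceData are hypothetical helpers, declared only so the sketch compiles.]

```cpp
// Sketch: per-object draw calls vs. one instanced call for identical objects.
#include <GL/glew.h>
#include <vector>

struct Mesh { GLuint vao; GLsizei indexCount; };

// Hypothetical helpers, declared only so the sketch compiles.
void bindMaterial(const Mesh& m);
void bindInstanceData(const Mesh& m, GLsizei instances);

// Naive straight port: one API call (plus state changes and validation) per object.
void drawNaive(const std::vector<Mesh>& objects) {
    for (std::size_t i = 0; i < objects.size(); ++i) {
        bindMaterial(objects[i]);
        glBindVertexArray(objects[i].vao);
        glDrawElements(GL_TRIANGLES, objects[i].indexCount, GL_UNSIGNED_INT, nullptr);
    }
}

// Batched: identical objects drawn with a single instanced call, so the
// per-call CPU/driver overhead is paid once instead of thousands of times.
void drawInstanced(const Mesh& m, GLsizei instances) {
    bindMaterial(m);
    bindInstanceData(m, instances);   // e.g. per-instance transforms in a buffer
    glBindVertexArray(m.vao);
    glDrawElementsInstanced(GL_TRIANGLES, m.indexCount, GL_UNSIGNED_INT,
                            nullptr, instances);
}
```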

  47. bill says:

    Don’t want to go back to the bad old days where you never knew what would work on your chipset.
    It’d also be a big pain for playing older games (though tbh DirectX has been a lot less successful at preserving our gaming legacy, by keeping games working on new hardware/OSes, than I’d hoped).

    But the guy probably knows what he’s talking about hardware-wise, in one sense. I do find that consoles seem to be able to smoothly handle things that make theoretically much more powerful PCs chug.
    But then again, consoles are dedicated machines and pcs cover a wide range of power-levels and component combinations.

    Anyway – I actually rather like the way we’ve plateaued and I no longer need to keep upgrading my PC. Don’t want to go back to the old hardware races or compatibility problems.

    So… fix DX or use opengl.

  48. Premium User Badge

    Harlander says:

    Why are AMD whining about DirectX’s dominance when their OpenGL support is and has always been monumentally terrible?

  49. Spliter says:

    I think he’s right.
    DX is holding us back from the full hardware power (though not nearly as much as the consoles are). At the same time, however, we are completely dependent on it to unify the myriad of graphics configurations out there.
    What we need, though, is a universal and OPEN API that all graphics card manufacturers must follow, and can then extend with their own extensions for the stuff only particular cards support. This would allow an incredibly optimised model where, program-wise, it works the same everywhere, but under the hood the implementation is completely different depending on the card/manufacturer/etc. The only thing holding this back is the “business”.
    The NVidia/ATI competition doesn’t allow this unification, so Microsoft must do it for them by making a general model which supports only what both those cards support, making it clunky and quite heavyweight.
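
    [Something like this already half-exists in OpenGL: a common core plus vendor extensions a game can probe at run time. A minimal sketch, assuming a GL 3.0+ context and GLEW; the extension name at the end is just an example.]

```cpp
// Sketch: one common API, with vendor-specific extras exposed as named
// extensions the game can check for at run time before using them.
#include <GL/glew.h>
#include <cstring>

bool hasExtension(const char* name) {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);          // GL 3.0+ style query
    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(
            glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i)));
        if (ext && std::strcmp(ext, name) == 0)
            return true;
    }
    return false;
}

// Example use (the extension name is only an illustration):
// if (hasExtension("GL_NV_shader_buffer_load")) { /* take the vendor fast path */ }
```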

    Still, until we start working on PC games that ignore consoles, rather than PC ports, or until consoles get to the next gen…

  50. Premium User Badge

    Catmacey says:

    Quite aside from the benefits of the common interface that DirectX gives to developers, I find it disingenuous that the article does not mention the whole resolution and image-quality factor. It’s really not comparing like for like.

    The article avoids mentioning that typical console resolutions are trivial compared to your average PC’s, with levels of AA and AF that just don’t compare. It also doesn’t mention the quality of the textures used in console games: great when you’re sitting 2m away on your sofa, but hardly suitable for typical PC viewing distances of around half a metre or so (or is that just me?).
    It also doesn’t factor in that the majority of console FPS titles use a very narrow FOV compared to PCs. A narrow FOV reduces the amount of geometry being drawn; a wide FOV requires more geometry and consequently more texturing.

    For the majority of games, screen resolution and IQ/AA/AF have a direct impact on performance. (Not including ARMA II, it always runs like a dog)

    Consoles use up-scaling to make the games look good on HD TVs. The native resolutions they run their games at are very 1995 compared to the average PCs.

    An example:

    The current Steam survey shows that roughly 77% of PC users have a primary screen resolution of at least 1.3MPixel (1280×1024 and 1440×900, or thereabouts), with around 48% having a resolution of 1.7MPixel or greater (1680×1050 and up).

    My PC is Win7 64-bit, an NVidia GT260, an AMD Phenom II X4 at 2.8GHz and 4GB RAM, with a 1680×1050 monitor. Sure, it’s a good machine, but hardly cutting edge at around two years old, and not a bank-breaking proposition these days.

    A typical example of a recent game is CODBLOPS.

    I run CODBLOPS at 1680×1050 (1.7MPixel) with 4x AA, 8xAF and texture quality on high. I have sync every frame set and get a smooth 60FPS.

    According to this page, the Xbox 360 runs the same game at an internal resolution of 1040×608 with 2xAA, and the PS3 at 960×544 with 2xAA. That’s a screen resolution of 0.6MPixel and 0.5MPixel respectively: barely half that of an average PC, and only around a third of the resolution of my PC. It’s no wonder they have good performance.
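
    [The arithmetic made explicit, using the figures quoted above – a trivial check, nothing more.]

```cpp
// Trivial check of the pixel counts quoted above.
#include <cstdio>

int main() {
    const double pc   = 1680.0 * 1050.0;  // ~1.76 MPixel
    const double x360 = 1040.0 * 608.0;   // ~0.63 MPixel
    const double ps3  = 960.0  * 544.0;   // ~0.52 MPixel
    std::printf("PC pushes %.1fx the pixels of the 360 and %.1fx the PS3\n",
                pc / x360, pc / ps3);     // roughly 2.8x and 3.4x
    return 0;
}
```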

    That’s just one game of course.
    I wonder what performance my PC would get at those resolutions?

    Oh, and not to mention that on my PC I also have Thunderbird, Firefox and often Winamp running in the background, and I can ALT-Tab to them at a moment’s notice; my wife, who is still logged in under a different user account, can ask me to pause and switch to her account, check her email, then switch back. All without a hitch – other than annoying me, of course :o)

    I’m sure that DirectX has a performance impact, but is it really a 10x hit? I’m also sure that writing directly to the metal would give performance benefits, but only to those with the right CPU/GPU, and at a hefty development/debugging price. Is it really worth going back to those bad old choose-your-renderer (Glide/OpenGL/PowerVR/Software) days?

    I know that we all know this already… It’s just so annoying to see this sort of lazy comparison from an “expert” source.