Arkham Knight Officially Abandons Multi-GPU Support

I’ve long felt that multi-GPU systems are playing with expensive fire, and even worse can, like audiophilia, lead to an anxious, ongoing preoccupation with whether you have the ‘best’ at the expense of actually enjoying what you’re playing/hearing. Even so, I sympathise with people who invested both money and time into their SLI or Crossfire setups, only to now be told that one of the biggest games of the year will effectively never support ’em. Warners/Rocksteady have officially thrown in the towel for Batman: Arkham Knight [official site]’s errant multi-GPU performance.

In response to the ongoing grievances over on the Steam forums, Warner Bros. representative ‘wb.elder.pliny’ dropped the bad news. Despite apparently working with hardware and driver manufacturers on the issue for several months, “the result was that even the best case estimates for performance improvements turned out to be relatively small given the high risk of creating new issues for all players.”

“As a result we’ve had to make the difficult decision to stop work on further multi-GPU support. We are disappointed that this was not practical and apologize to those who have been waiting for this feature.”

This follows on from the company all but abandoning the troubled Batman sequel to its messy fate, having pulled it from sale, spent months trying to fix it (not entirely successfully) and then re-released it. They did, however, claim that they’d still work on multi-GPU support; now that’s been dropped, I’d be surprised if much else happened to the game.

A sorry saga. And as much as this is a very special case, it only cements my opinion that it’s best to stick to the best single card you can comfortably afford. I’ll take slightly reduced settings over technical headaches any day of the week.

If you bought Arkham Knight from Steam at any point during its sorry release saga, unconditional refunds are being offered until the end of the year.

59 Comments

  1. OmNomNom says:

    Well that sucks because a single card setup doesn’t exist that pushes enough frames to keep me happy.
    I’d go single card if one existed.

  2. Dizrupt says:

    Inb4 SLI is useless shit anyway comments.

    • Mokinokaro says:

      SLI is great, when it works

      But often you’re better off just investing in a high end single card rather than buying two lower GPUs since both GPU companies have really screwed things up when it comes to support.

  3. RegisteredUser says:

    No surprise given that their multi-month “patching” withdrawal of the game led to basically a 0% performance increase. Yep, that’s right, the benchmarks before and after yielded essentially identical fps results. Not bad, ey?

    In German, but the numbers should be international (basically check the graph to see the before and after figures):
    link to computerbase.de

    • Capt. Bumchum McMerryweather says:

      I’m afraid I have to 1000% disagree with this. I have 2 R9 270s and the game barely ran AT ALL when I first bought it (the day it came out). After the game went back on sale, I got a fairly consistent 40-50fps with all the settings on high, but with all the nVidia guff turned off. And it was an awesome game (finally).

    • Capt. Bumchum McMerryweather says:

      Also the website benchmarks show that the ‘before’ was actually for the beta patch, not the original retail copy of the game, so these results are hugely skewed when compared to what you’re trying to say.

      • RegisteredUser says:

        Sure, you do have a point. However, from what I understand the time elapsed between the beta patch (2nd Sept-ish) and the final version (27/28th Oct) was over 7 weeks, during which apparently nothing further changed.
        If the beta patch already fixed things for everyone (not how it seems after perusing several forums for comments) and the final just kept that going, then alright, but it still raises the question of what the next 7 weeks were for.

        I expect they kept banging away at it those 2 months more and couldn’t really get anywhere and then just plain gave up and said fine, just shove it out then and offer refunds and let the chips fall where they may.
        This is of course just speculation on my part given all that I’ve read.

  4. Christian Dannie Storgaard says:

    Well, the Linux port is still on its way, so _some_ work will likely still happen. Also, if I understand the way Vulkan works on the upcoming NVIDIA drivers correctly, it might be possible to run it on multi-GPUs through OpenGL once the new drivers appear (and the game comes to Linux, obviously).

  5. ForthRight says:

    link to geforce.co.uk

    “This app supports… SLI”

    And it’s been like that for months.

  6. montorsi says:

    I’ve officially abandoned WB support, so I guess we’re even!

  7. GAmbrose says:

    Ironically Nvidia SLI allowed me to complete the game at 2160p resolution on release, but not in the way you would think.

    SLI didn’t work, so I set my second card as ‘dedicated to PhysX’ and was able to have all the Nvidia GameWorks effects on and negate the stuttering.

    • geisler says:

      Yup, exactly the same story here. Also worth noting that this only worked if you had 980 Tis or Titan Xs (if you absolutely wanted stable 60 fps). The fact remains that if no second card had been present, it would not have been possible to play at high resolutions or ultra settings.

      In the end it all depends on how picky you are about graphical fidelity. At the high end of the spectrum, there isn’t even a case to be made about “to SLI or not to SLI”, as in almost all cases a single GPU is never enough to drive the amount of pixels those people want driven. If you’re on a “budget” and don’t really care about playing on highest settings, my opinion about SLI is this: stay away.

      • GAmbrose says:

        Yup, it is dual Titan X’s

        Now I could do with a Fallout 4 SLI profile…I have a 1440p desktop monitor that I can use when games don’t work great at 2160p though.

        • geisler says:

          Sounds like we share a similar setup, dual X’s here too. I play on 1440p as my main monitor, but always downscale from 2160p. Fallout 4 runs fine mostly, but around Trinity Tower and Boston Commons (so a lot of Boston proper) fps takes a dive, so I keep it at 1440p too. So far almost all configurations have problems there, so it’s not quite clear yet if SLI compatibility will fix the performance issues; it could be an engine thing.

        • Sulpher says:

          Two Titans you say. Phwoar, what I could do with 12gb of vram (@.@)

  8. kool says:

    Should it really be up to the game developers to support this shit? It still seems like such a difficult task to get good SLI performance that Nvidia/AMD should arguably sponsor the devs specifically to support their broken multi-GPU setups.

    • polecat says:

      Completely agree. I’m sure there are people who have multi-GPU systems that work really well for them on a number of games, but the whole thing smacks of turning it up to 11 and then spending time trying to make that stable. Surely there is more world happiness in us collectively relaxing and enjoying the frankly incredible graphical fidelity you can get with single card setups (I would argue even mid-range ones). The march of tech means these will get better over time anyway. Same reason 4k monitors are amazing but not my priority right now – all requires getting into an arms race I’m happier out of! Plus think of the possibilities opened up by not buying this stuff – go on extra holiday / buy loads of new games etc.

    • Carr0t says:

      How to break up the work across multiple GPUs and then reintegrate that data for outputting to a single monitor is a very complex task that can change vastly depending on how the game engine operates, so the hardware vendors can’t provide a one-size-fits-all solution in their drivers that will work with every game engine out there.

      If a game developer doesn’t want to support multi-GPU they don’t have to, as long as they produce an engine that runs acceptably on a reasonable selection of single cards available today. If every game dev had gone “this is too hard to get right, we’re not going to bother”, or if every gamer had gone “this is too expensive and flimsy, I’m not going to buy it” then both SLI and Crossfire would have stopped being developed (at least on consumer cards). As it is it’s only a small percentage of people who do buy multiple cards, but presumably that is enough to justify the development expense.

    • OmNomNom says:

      Maybe you say this as a non-SLI user but when it works it really can be awesome, the scaling is great these days too. Can be around 95%
      Who doesn’t want a smoother game?

  9. bit.bat says:

    Is SLI support a given for most games coming out these days?

    • RegisteredUser says:

      If by most games you mean the 10 or so AAA blockbuster seuelitis titles each year with millions of budget, then yes, if you mean the thousands and thousands of normal games, then no.

      • RegisteredUser says:

        That was supposed to read “sequel-itis”. As in the disease where, instead of coming up with something new and cool with millions at risk, they’d rather make Yet Another Call Of Duty and play it safe.

    • JiminyJickers says:

      It is; the only one I currently have that doesn’t support it is Fallout 4, and that’s apparently being worked on.

      I was afraid of SLI, but it was much cheaper than upgrading to a different single-card setup, and I have had very few problems with it so far.

  10. GAmbrose says:

    DX12 should solve SLI issues, apparently.

    It does all the work for the devs, AFAIK.

    • Malcolm says:

      Apart from the bit where you have to rewrite your engine in DX12 of course :)

  11. Drunk Si says:

    For some reason I had a feeling SLI support would never come. I have 4GB 670s in SLI and a 1440p monitor; in the end I played the game at 900p@30fps (at 1080p the colours look washed out on this screen).

    It’s a shame the performance sucked because it’s a decent enough game at heart (despite being the weakest of the Arkham games).

    SLI has done the business for me for the most part but I have noticed SLI support going downhill over the last year or so. Hopefully DX12 will make multi gpu setups more viable in the future. I’m going to get a new graphics card next year and I think unless DX12 does the trick I’d have to see a real bargain on a second card before I consider going SLI again.

  12. PancakeWizard says:

    Disgraceful. I even read somewhere that they attempted to sell the game full price as a ‘bundle’ on Steam BY ITSELF, because bundles don’t show the Steam reviews for individual parts of the bundle.

    It’s the insincerity that pisses me off more than anything else, and I never even bought the damn game. I don’t get what went so wrong here, as Shadow of Mordor and to a greater extent Mad Max are just brilliant.

    • minijedimaster says:

      They hired some crap company to port the game to PC; that’s a big part of what went wrong.

      More importantly, I don’t understand why these companies are still developing their games for console then porting to PC. I would think it would make more sense to dev for the PC then scale back the graphics as needed for the consoles.

      • Jekadu says:

        Programming for consoles is very different from programming for PCs. There’s a lot to say about it and I don’t actually have any first-hand experience, but the gist of it is that consoles generally have a lot more bottlenecks that can’t be ignored than PCs do.

        It’s like… if you think of it as piping, then a console will use small, unobtrusive pipes that don’t have a lot of capacity, but don’t take up a lot of room or cost too much, either. If you then try to increase the pressure in the pipes to increase throughput, then you’ll get leaks and not a lot of gain. Conversely, a PC would use gargantuan stainless steel pipes designed to operate at very high pressures, but which take up half of your available space and generate a lot of noise and are completely overkill for a console-scale system.

        It’s not a perfect analogy.

      • LionsPhil says:

        It’s easier to work within a tight resource budget then take those limits away, than it is to work to flexible limits then suddenly have to clamp down to a specific, tight one.

        The console-du-jour gives you a very clear performant-enough-or-not cutoff line for whether you can do feature X, or whether implementation Y is good enough.

        PCs could have all the glitz, glamour and attention, and it’d still make some sense to target consoles first from a technical standpoint (not necessarily a game-design one; certainly not in a porting-effort one).

      • OmNomNom says:

        Jekadu has it right, not a console developer myself but I know those who are.
        It’s basically the case of scaling stuff back being a lot harder than bolting on extras once the poop version is done. Even though the consoles and PCs running the games are more similar than they have been in the past there is a large gap between them in terms of capability, speed and memory size.
        We won’t truly have great ports until the systems are much closer in spec or we reach some kind of gaming machine singularity

  13. Siimon says:

    I never knew the Batman games were big enough to be “one of the biggest games of the year”. Have I missed out on something? Last I heard this was, even apart from bugs/performance issues, a very mediocre game.

    • Asurmen says:

      Subjective quality aside, yes it was one of the biggest game releases of the year.

      • Siimon says:

        As far as people’s expectations? Sales numbers? That it is the only(?) “brand name” fighting game on PC? What made it such a big release?

        I assumed it wasn’t that big of a deal, especially with so many other big releases to compare to (CoD, Tomb Raider, AssCreed, Starcraft, Overwatch, JC3, Star Wars, R6:Siege, FO4, TW3, you name it)

        (If it isn’t clear: Genuinely curious, not trying to be snarky)

        • Aerothorn says:

          In terms of pre-order numbers/launch sales, how heavily it is marketed/hyped, etc. If it seems smaller to you, this may be an effect of hanging out on mostly PC gaming websites – it was a bigger deal in the largely game-starved PS4/Xbox One catalogs.

        • drewski says:

          It reviewed as a GotY contender on console.

          It’s hard to say how it would have been received on PC without the bug SNAFU, obviously. But I think you’ll see it top 5, top 10 at worst, on most outlets’ Best of 2015 lists next month.

    • SuicideKing says:

      Yeah, that’s what I was thinking. Loads of hype but it’s not like multi-GPU users can’t play TW3 instead.

    • PancakeWizard says:

      The first two won GOTY. Oranges didn’t, but it’s easily my favourite (although I miss the dark fantasy of the others when I play it).

  14. Moragami says:

    So much for “The Way it’s Meant to be Played”. Guess all that talk about Nvidia’s top guys helping them perfect the game for PC was just that, talk.

  15. TheColorUrple says:

    I don’t understand the beef against SLI. It’s an affordable way to get more performance if you’re shopping mid-range GPUs, and with the notable exception of Tomb Raider’s Tress FX tech, I’ve never experienced any headaches whatsoever. I guess this game is another argument against SLI, but frankly the game runs so poorly, it’s almost a moot point.

    So what do you guys experience that’s so problematic? I’d like to know, since I would recommend an SLI rig wholeheartedly to anybody with a decent power supply.

    • TheColorUrple says:

      I’ll also mention that I built my rig around the idea of SLI, which means beyond a 1200w supply (overkill) I bought a large roomy case with lots of cooling, and both GPUs are the EVGA supercooled versions. I think a lot of SLI paranoia comes from the hardware side of things, but if you build correctly you shouldn’t have any problems.

  16. JamesTheNumberless says:

    I only have a single GTX 570 and I seem to be able to play games at 2560×1600. Sure, I can’t use some of the more ridiculous settings, but I find the ability to see sharply into the distance unrealistic and unnatural anyway ;)

  17. racccoon says:

    I enjoy the game; so far there seems to be really nothing wrong with it for me.
    I think the work they did is cool and very well executed.
    No bandwagon refund from me.
    Cheers for a great game :)