Why You Don’t Need Multiple Graphics Cards

Stop that. It's silly

Apparently contrary forces, but suddenly complementary. Are AMD and Nvidia about to become the yin and yang of the PC gaming world? Possibly. Rumour has it the graphicsy bits of that DirectX 12 thing arriving with Windows 10 will allow for asynchronous multi-GPU (graphics processing unit) setups. In other words, you’ll be able to use AMD and Nvidia cards in the same rig at the same time to make games run faster. As rumours go, this is pretty spectacular. But it does rather remind me: multi-GPU is basically a bad idea. Here’s why.

To be clear, I’m not saying multi-GPU is entirely devoid of redeeming features. My core position is that it’s something that can come in handy in certain circumstances. But it’s not something you should prioritise over a good single graphics card for better gaming performance. It is not, in other words, recommended by default.

What, exactly, is multi-GPU?
First, though, what exactly is it? The basic principle is super-simple. Use more than one graphics card to make your games run faster. And that’s exactly what AMD claims for its Crossfire multi-GPU tech and Nvidia likewise for SLI. How and, indeed, if and when this actually works is a bit more complicated.

As soon as you have more than one graphics card (or strictly speaking more than one graphics chip, since some multi-GPU solutions are housed on a single card), you have the problem of how to share the rendering load between them. Broadly, there are two options. Split the image itself up or have each card render entire frames but alternate between them.

For option one, known as split-frame rendering, you can slice the image into big slabs, one per GPU. Or you can chop the image up into lots of little tiles and share them out. But most of the time, that’s an academic distinction since alternate-frame rendering currently dominates (though this new DX12 lark may change that; more in a bit).

Again, the philosophy behind alternate-frame rendering makes perfect sense. Have all your cards rendering frames at the same time, suitably staggered, and in theory you get double the performance with two cards. Triple with three. And so on.
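
The round-robin idea behind alternate-frame rendering can be sketched in a few lines of Python. This is purely illustrative; real drivers juggle command queues, synchronisation and buffer copies rather than anything this tidy:

```python
# Toy model of alternate-frame rendering (AFR): frames are dealt out
# to GPUs round-robin, so n GPUs can in theory deliver n times the
# frame rate -- provided every GPU always has a frame ready to start.
def afr_schedule(num_frames, num_gpus):
    """Map each frame index to the GPU that renders it."""
    return [frame % num_gpus for frame in range(num_frames)]

# Two cards: even frames on GPU 0, odd frames on GPU 1.
print(afr_schedule(6, 2))  # [0, 1, 0, 1, 0, 1]
```

The catch, as we’ll see, is everything the toy model leaves out: the cards have to be kept fed and in step, which is exactly where the stutter and reliability problems creep in.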

One frame. Then the other. It’s not complicated

What’s more, when you factor in how graphics cards are typically priced, multi-GPU looks even more attractive. High end boards can be double or more the cost of a decent mid-range card but offer perhaps only 50 per cent more performance.

Obviously, there’s a whole spectrum of variables in terms of cards, pricing and performance. But for the most part, the price premium at the high end is not matched by the performance advantage. That applies whether you’re planning a new rig with box-fresh hardware or you’re thinking of adding a second card. So the theory behind multi-GPU is sound.

Does it actually work?
But what of the practice? The problem is that it doesn’t necessarily work. I don’t mean it doesn’t work at all. I mean it doesn’t work quite well enough or quite often enough, and I’m not talking about simple peak frame rates. When everything is hooked up and humming, multi-GPU frame rates can come close enough to the per-card boost the basic theory suggests.

Well, they can if we’re talking two GPUs. The benefits drop off pretty dramatically beyond two chips. No, the real problem is reliability. Sometimes, multi-GPU doesn’t work at all, defaulting your rig back down to single-card performance.

Exactly how often does this happen? Probably not often at all if the context is well-established, mainstream game titles and graphics tech. But with anything really new, be that a game or a GPU, the odds of multi-GPU borkery go up exponentially.

If you’re going to go multi-GPU, this is the way to do it. Just two cards

For me, the really frustrating aspect is that it’s fundamentally hard to be absolutely sure it’s working. Again, for a familiar title, you may have a pretty good idea of what frame rates to expect. But with a new game you’ll be unsure and it just gnaws away at you, that doubt. The not always knowing if the multi-GPUness is working.

Then there are the image-quality issues. Multi-GPU can throw up funky things like see-through walls that aren’t supposed to be see-through, flashing textures and micro-stuttering. Yes, on both a per-game and global basis this stuff gets addressed over time, in patches or in drivers. But then new games or drivers or GPU architectures come out and stuff is broken again for a bit.

Without fail, every time I’ve reviewed a multi-GPU solution I have some kind of problem running games. In that sense, I probably have a better overview of the wider realities than someone else who has an SLI or Crossfire rig and finds it runs pretty sweet nearly all the time.

So far, we’ve been talking about software faults. But add a second card and you are increasing the odds of hardware failure. That’s true in terms of the simple maths of having more components. If you had a billion graphics cards, you’d experience constant failures.
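
That simple maths is just compounding independent failure odds. A quick sketch, using a made-up 2 per cent per-card failure rate over some period (the number is hypothetical; the shape of the curve is the point):

```python
# Chance that at least one of n independent cards fails, given each
# card fails with probability p over some period of time.
def any_failure_prob(p_single, n_cards):
    return 1 - (1 - p_single) ** n_cards

# With a (hypothetical) 2% per-card rate, two cards nearly double the odds.
print(round(any_failure_prob(0.02, 1), 4))  # 0.02
print(round(any_failure_prob(0.02, 2), 4))  # 0.0396
```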

It’s also true, though to a much lesser extent, in terms of the stress it puts on your system. It loads your power supply, can increase overall operating temps. That kind of thing. In every way, then, multi-GPU makes your rig that little bit dicier.

A single big beast remains best

Who actually uses it?
I’m also far from convinced many people really use multi-GPU. Sadly, as far as I can tell, Steam no longer breaks out stats for multi-GPU, so accurate figures aren’t available. But I reckon it’s a very small proportion. In fact, I reckon multi-GPU mainly continues to exist because it’s a handy marketing tool: it helps lock us into one vendor or the other, and it has us buying more expensive hardware with multi-GPU support on the off chance we might use it one day, even though the vast majority of us never do.

Like I said above, I’m not damning all multi-GPU tech as foul and useless. If you only play a few games or incline only towards the biggest new releases, you’re cutting down the odds of having problems dramatically.

I can also see how grabbing a second-hand card on the super-cheap can make sense if your system is otherwise ready to accept it. Let’s say you’ve got an AMD 280 or Nvidia 770. If you could pick up a used one cheap from a reliable source, that’s possibly appealing, and cheaper than upgrading.

But it would have to be very cheap. And the more promiscuous your gaming in terms of chopping and changing titles, the more likely you are to have problems. For almost anyone, I still prefer the approach of spending a bit more on the biggest, beefiest single GPU you can lay your hands on and just getting on with enjoying your gaming.

As for whether the latest rumour involving asynchronous multi-GPU will change any of this, I doubt it. OK, the prospect, as mooted, of changing the way multi-GPU works so that adding cards actually increases your available video memory sounds very interesting. It would help with one of the biggest weak points of using multi-GPU over time, namely that the frame buffers on old cards can be small enough to bork frame rates. But I doubt it will actually work, because I doubt AMD or Nvidia will want it to work. And I doubt that even if it does work it will be fundamentally more reliable than single-vendor setups. Quite the opposite.

Shout out below if you disagree: I know multi-GPU certainly has its proponents out there. At the very least, it would be interesting to get a feel for how many of you are happily powering along with multi-GPU of some flavour or other. I feel it’s few. Let’s find out.


  1. TormDK says:

    I’m staying on 1080p until a single card can power 1440p @ 60FPS. The GeForce GTX 980 is almost there, so I’m suspecting I will be purchasing the Ti, but similar to this article, I’ve always been a single card gamer.

    • Tacroy says:

      Speaking of which, when are the 8GB 9-series GPUs coming out? I was this close to buying one last week, until I realized they were all 4GB models.

      • Artist says:

        And for what do you need those 8GB cards right now?

        • airmikee says:

          Epeen bragging.

        • Continuity says:

          You may have noticed that the new gen of consoles has lots of memory, which translates into needing even more memory on PC. 8GB of VRAM is on the horizon; 4GB is already the minimum you can go for if you want to run everything (6GB if you want to use ultra textures on Shadow of Mordor).

          Frankly, at this point you want as much VRAM on your card as you can get. If 8GB is an option when I buy later this year then trust me, I won’t be getting anything less.

          • PancreaticDefect says:

            You’re definitely right about the high-res textures for Shadow of Mordor. I have a single GTX 980 and the game runs beautifully with the ultra texture pack…That is, until it starts raining….

          • jrodman says:

            Somehow I am entertained. I still own packaging that claims “over 150KB of graphics”.

          • Howard says:

            Sorry, but this just isn’t true at all. Yes, the current consoles have 8 gig of RAM, but in any realistic situation they only have access to between 4 and 5 gig of it, and that has to act as both system RAM and VRAM. Also, as to SoM, I have a 970 that runs that game flawlessly, without hitching, at 1080p at around 90fps, and most other games (minus a few such as FC4) can be run comfortably at 4k (DSR) at well over 60fps.
            The only games that have not run flawlessly on my rig are those that were just badly written – looking at you, Dying Light….

          • Zekeocolypse says:

            I have a 780ti, ran Shadows of Mordor on Ultra while downsampling (sorry Nvidia, supersampling) at 4k, and I had not a problem running it. I never noticed a FPS drop, it ran just perfectly. So did Watch Dogs for that matter.

        • Cinek says:

          8 gigs for console ports. See: Shadow of Mordor. Ultimate-mega-detail mode requires 8 gigs.

        • MadMinstrel says:

          Octane, Cycles, Redshift, baking textures, painting textures… Making games, basically.

    • StevieW says:

      Exactly this… at 1080p there should never really be a need for multi-GPU setups. I run 2x 970s and at 1080p it’s a waste, but since moving to 1440p it makes all the difference. With 1 card there is no consistent 60fps, but it flies with 2.

      • Buuurr says:

        SLI was never really for those trying to stay on the cutting edge of technology. It was more for those trying to keep up with technology’s pace without destroying the pocketbook. Anyone who had been in this industry for any amount of time would know this.

        • Sakkura says:

          For those staying on the cutting edge of monitor technology, it is necessary. Any single graphics card will struggle with demanding games at 4K resolution, and now 5K monitors are available with 8K (!!!) coming soon.

          In my opinion though, monitor technology is descending into madness. 4K is already pretty extreme, but 5K and 8K is just ridiculous. The benefit of those extra pixels is very questionable, while the downsides are obvious.

          • Buuurr says:

            I don’t know who is buying these monitors at the speed they are coming out. I would say it doesn’t even hit a percentage point… but the Titan Black is more than anyone needs to handle anything out there at any resolution.

          • TacticalNuclearPenguin says:

            Well, this is actually good, since in two years 4k monitors will be considered old tech, and that plays in favour of those willing to finally shell out for it and actually even expect to find a suitable single GPU, provided they get the best there is in 2017.

            A couple games with that setup might still hover in the 40-ish FPS, but only the huge hogs, otherwise it’s pretty nice.

    • Person of Interest says:

      I run a single GTX 970 @ 2560×1600. I had planned to use it for 1080p but came to possess a nicer monitor, and I was happily surprised to see it works at locked 60 FPS for most games I play, with some concessions.

      Tomb Raider: OK with Ultra quality preset, but not Ultimate, so no fancy hair shader or SSAA. Occasional dips to 55fps in rain scenes. i7-860 is CPU-limited to 40-50 FPS when overlooking the shanty towns.

      NBA 2K15: Some cutscene stutters (45 FPS?) with 8xMSAA, but no stutters on the court. GPU-Z reported “maxed” (3.5GB) memory use, but I think the game just aggressively caches textures.

      Most games from 5 years ago: fine with 8xMSAA but can’t keep 60 FPS @ 4xSGSSAA (they can at 1080p though). Games I’ve played: Hitman Blood Money, Fallout New Vegas, The Witcher, and Far Cry 2.

      Most AAA games have such a wide range of graphical settings that you can probably play anything 2160p @ 30 FPS or 1440p @ 60 FPS if you’re willing to turn off MSAA, ambient occlusion, and/or special effects. Take a look at the GeForce tweak guide for Far Cry 4 which shows FPS and VRAM impact for every graphics setting. You can make up the difference between a GTX 980 and a GTX 970 by lowering water reflection resolution and losing some trees in the far distance. It’s up to you whether those details are worth the wait and added expense.

    • scannerbarkly says:

      My 980 is chewing up 1440p when it comes to frames. If you want to wait for another version of it, I imagine it won’t have any issue hitting numbers the current version is already crushing.

      • TormDK says:

        link to bit-tech.net – would beg to differ, unless you enjoy gaming at below 60FPS on average. (I don’t, especially not in shooters).

        DirectX12 might do a bit, but I am certain the next generation Cards will be powerful enough to meet my requirements, which will make it an expensive round to do the upgrade because I would also need a decent 1440p monitor to go with it.

    • adwodon says:

      Depends on what games you’re playing, my 770 handles all the games I want at 1440p at 60, probably would be higher than 60 in most cases if it wasn’t a 60hz panel but there are no IPS panels over 60 that I’m aware of.

  2. Wedge says:

    I won a computer back in the day that had dual 4870’s in it. Almost always left one off because the benefits were rarely worth the random problems that would always turn up with both on. Eventually just put one of them into a friend’s computer that had a dying GPU. A 7950 was my next GPU (in a new PC) a few years ago and haven’t felt the need for anything more on my 1080 monitor since then. Waiting to see what happens with Occulus or affordable high res monitors to happen or whatever until I bother looking at anything else, and I figure by then there will be a single GPU that is good enough for those.

  3. AgoraphobicHobo says:

    I run two cards, a Radeon 7890 and a 7770. Cheap cards, but together they run everything I’ve thrown at them pretty well. It’s a pretty big improvement over running just the one card; and the Crossfiring actually cuts down on the heat produced by both cards, and I rarely see either of them get much higher than 50 degrees.

    • Sakkura says:

      Eh… what?

      There’s no 7890, maybe you mean 7870 or 7790? Anyway, neither of those can run in Crossfire with a 7770, because they use different GPUs.

      • Howard says:

        Whole point of Crossfire is that it does not need matched GPUs, so yes, they can.

        • JimmyM says:

          I think Crossfire only works with the same series, not a hardware guy so could be wrong. That’s what I’ve always worked off.

          • Howard says:

            Nope, you’re wrong, sorry. You can Crossfire just about any two AMD cards. For example, my laptop has an AMD CPU with an on-board GPU (6xxx series) that Crossfires happily with the discrete 7xxx card it also has

          • JimmyM says:

            @Howard – That’s really cool! I had no idea. Wikipedia and Tom’s Hardware have both always said that it’s only cards of the current generation, and I’ve never tried it out myself, so I just went off what they said.

        • Dale Winton says:

          You can’t crossfire two different amd gpu. They have to be the same. You can crossfire certain amd gpus with one of their apu processors

        • Sakkura says:

          You’re wrong. You need to have the same GPU for Crossfire to work. But note that GPU means the actual processor here, not the card model; they can make different models based on the same GPU. For example, the 7850 and 7870 both use the Pitcairn GPU and would work in Crossfire. But not with a 7790, since it uses the smaller Bonaire GPU.

          Here are AMD’s charts for Crossfire compatibility: link to support.amd.com

          • Dale Winton says:

            it will only function at the lower GPU’s speed though if you do that

      • JimmyM says:

        Hi Sakkura, the AMD 7890 was I think the name of the AMD 7870 Tahiti LE (which should blates have been called the 7930, as it was essentially an underpowered 7950 as far as I could tell) – maybe that’s what the OP means?

        But…then…no…still wouldn’t work I guess. It’s a hacked up Tahiti board so it doesn’t even play nice with 78xx series, only 79xx. Oh well. Must have just been a typo – maybe a 7870 Tahiti running with a 7970? I believe that would work, and would explain the “7890” as well.

        On an unrelated note, it’s extremely weird to be seeing the names of all the GPUs that were hot shit when I last bought a PC and having to remember that they’re now ancient in hardware terms.

        • JimmyM says:

          gah no edit – I meant to say “The name of the 7870 tahiti le while it was in development

      • AgoraphobicHobo says:

        More likely, it’s a 7870, I’ve had the card for a while, and just had to guess at the model beyond the whole 7800 series. But, yes, it is crossfired, and it’s made quite the difference.

    • horrorgasm says:

      Yeah. It seems so great until you begin to realize how many modern and upcoming games still don’t support SLI/Crossfire and won’t run well or be playable at all with both cards turned on. I really, really regret going dual card with my last PC.

    • Huels says:

      In late 2009, I built myself a master gaming PC rig. Total cost was almost $10,000. I put four AMD HD 5970s in that beast. They were the first dual-GPU cards that AMD offered and they ran hot and powerful. At 14 inches long each and over $1,240 each, I think I got a good deal because I’m still running them, and I upgraded to a 5K Dell monitor in January. I can’t run Crysis 3 or Battlefield 4 on ultra at those resolutions but I can run anything else I have tried so far.

      I keep thinking about buying three or four Titan Blacks. What would you suggest?

  4. Xerophyte says:

    I think the primary strength of multiple GPU setups nowadays isn’t so much games as other applications. GPU-based renderers like iray or cycles can happily use a more or less infinite number of them as long as the scene is simple enough to fit in memory. Serious professional multi-monitor setups still tend to need at least 2 cards as well, although the companies are getting better on that front.

  5. ikanreed says:

    You seem pretty well informed about how graphics cards work, but that made me very confused as to your argument.

    You suggest there are exactly 2 ways to use 2 graphics cards:

    1. Frame over frame
    2. Divide the surface

    But if you’ve programmed game graphics before, there’s something crucial you’ve overlooked: render passes.

    Rather than being concrete steps that you take sequentially, render passes are often multi-layered, ranging from prepping different stencil buffers for future render steps (like transparency, reflections, lighting/shadows) to advanced shading for special effects or vertex processing like particles or animations.

    All these different steps have complex inter-dependencies that, if you’re sufficiently interested, can be split across multiple graphics cards with ease. Moreover, you can get some distance out of the extra GPU memory by delegating certain models and textures to be processed only in certain steps.

    That’s not to say 2 graphics cards are as good as one graphics card with more cores, but there are some things you’ve glossed over.

    • souroldlemon says:

      The benefit of alternate frames is that they take roughly the same time. Parallelising a complex process is never achieved with ease; ensuring that you partition it exactly into two halves is even harder. That means that the scaling would be very inefficient.

    • Geebs says:

      Only problems: the driver and the GL don’t actually work like that.

  6. b15h09 says:

    I see one place for multi-GPU, and that’s with VR. The ability to render separate frames for separate viewpoints on separate cards seems like a good idea.

    • Continuity says:

      At the moment VR is the last place I expect to see SLI; even with just one card you get enough problems with things like shadows and reflections not rendering properly, throw in the SLI rag bag of bullshit problems like stutter and flashing textures and it’s an instant recipe for vomit on your keyboard.

      • BrickedKeyboard says:

        VR support done right is completely flawless, with no possibility of those problems. This is because when it’s done right, each card just renders a separate eye. Each card has exactly the same information, and does exactly the same thing. They do not have to be synchronized with each other, either.

        VR support inherently is probably the one good use of SLI because the scaling is perfect, it’s exactly 2:1.

        • jrodman says:

          Do we have a good way to guarantee the two cards are always in lockstep?
          I mean, sure the pcie clock exists, but that doesn’t mean the software and completion states will always align.

          I mean even if you don’t believe that the software can have trouble doing everything with the cards instantly at the same time, consider the problem of the different perspectives having different rendering costs. There’s no guarantee a different perspective will have the exact same scene complexity.

          • BrickedKeyboard says:

            You don’t keep them in lockstep.

            Each card is ordered to render the same scene with a slightly different view matrix. All resources are the same – every time you send any resource over the PCIe bus, you mirror it to the other card. (at the hardware level you might send the data to one card and transmit it over the connector as it arrives, or do 2 bus operations, or just have one card sniff the bus for the data sent to the other).

            This does mean that in terms of resource data, both cards need the information to render things not needed from their perspective, which is slightly inefficient, and you also are storing all of the resources twice, essentially.

            The big reason to do it this way is latency.

        • Person of Interest says:

          Clearly they don’t do exactly the same thing, or you wouldn’t get a stereoscopic effect.

          Are you speaking from experience? Because Timothy Lottes, who developed the FXAA algorithm for Nvidia (and obviously knows stuff about graphics, whereas I do not), says that SLI is useless for VR.

        • whorhay says:

          Why would you use SLI in that case? If the video cards are sending output to separate displays there is no reason for them to be linked.

        • souroldlemon says:

          If stereoscopic calculations are done right, then you basically produce a pair of frames, each of which is typically a minor distortion of the same picture. Thus you have a big calculation for the mono view, followed by a pair of relatively small calculations for the stereo pair. Doing the big calculation twice is very inefficient.

          • MattM says:

            Is there an accurate enough way to change the rendered scene from one perspective to another, other than just rendering it all twice?

          • Shadowcat says:

            Yeah, what MattM said. How on earth do you propose to take a single rendered mono view and convert it into a pair of views for stereo vision? Surely that is utter nonsense. (We would all have amazing goggles for viewing photographs in stereo, if this were possible. Or we would be living in Blade Runner.)

            e.g. Given a stereo view directly in front of a wall, where one ‘eye’ is just barely extending past the corner, the mono version of that, assuming a central viewpoint, will be nothing but wall. There is no “minor distortion” of an image of solid wall which will convert it into an image of what’s obscured by the wall.

          • souroldlemon says:

            You’re right, but a lot of the setting up is the same, so that it’s worth the cost of sometimes throwing things away when it’s not minor. Most stuff is just scrolled sideways a bit, and if you get fancy about efficiency then the lighting may need redoing.
            I don’t know how individual drivers handle it, that’s just what I’ve done.

  7. Zenicetus says:

    I’ve always shied away from multiple GPU’s for several reasons. Based on what I’ve seen on various game forums, the support is often either late or buggy when it’s supported at all, and many games don’t support it.

    It’s also a more complicated upgrade than just tossing an old single card and bolting in a screaming-fast new one, especially on a computer that I haven’t designed from scratch as a multi-GPU beast with oversize power supply, extra cooling, etc. Lately I’ve just been buying pre-assembled, so it’s easier to go the single card upgrade route. Currently I’m running a single GTX 970 and I’m happy with it.

    The *one* thing that would be a tipping point where I might try it is if X-Plane supported SLI/Crossfire, and it doesn’t. Something about the sim being limited by texture swapping between cards, meaning that going multi-GPU might actually slow the sim down compared to a single fast card. Info about that here, which I think is still current:
    link to x-plane.com

  8. caff says:

    I run a 4K monitor (Philips 40″) using an i5-2500K and a single GTX970.

    It seems to cope with almost everything.

    • Continuity says:

      I assume that by “almost everything” you mean almost everything you’ve tried on low settings, that or your “almost everything” covers very little indeed.
      What people are looking for is stable 60fps at ultra settings for 4k, and currently there is no single card that is close.

      • Risingson says:

        TBH, I agree with that sentiment. Maybe it’s because I myself don’t need 60fps the whole time, but even with a poor 650ti, I play everything (except Cryostasis and the first Metro 2033) at 1080p with high details.

        Or maybe it is because games take more power from the CPU than the GPU after all.

  9. noodlecake says:

    I can’t even afford one good graphics card! My card was sort of upper mid range about 3 1/2 years ago! It’s still holding out for medium settings on most new games but when the Witcher 3 comes out and I can’t play it I’m going to just cry.

  10. xoroku says:

    I’ve been running multi monitor for 5 years now and I can’t see myself using only one GPU for it.

    I tend to buy the flagship -1 card (i.e. 970, 290) and buy a second one about a year after the first one (usually when that generation is at an end).

    Then I wait about 2-3 generations (for instance I have 2 670s which I will upgrade next gen) so that one card matches the power of my two cards plus at least 25%.

    So far it’s worked fairly well but at the moment I’m really feeling that 2GB VRAM limit with the more open sandbox games of last year.

    That being said, unless you’re in a situation where you have a high resolution (multi monitor / 4k) or aim for that sweet 144fps, there aren’t many advantages in running multi GPU.

    In short, this is a question of need to attain an uncommon goal.

  11. aircool says:

    Multi GPU was supposed to be a sticky plaster to give your old card a boost when you couldn’t afford a top of the range new one. There was never actually a real need to buy two or more new ones.

    • Buuurr says:

      Yes, correct.

    • jrodman says:

      But if you’re not buying new ones, then aren’t you buying an old one as your sticky plaster? An old one with worse efficiency due to process shrinks and other factors?

      • Buuurr says:

        It really depends on the situation. There are many cases of an SLI setup that cost $400 destroying a $600 single card setup. It’s an out if you are pressed to the wall. It certainly beats having no option.

  12. Continuity says:

    Basically, IMO, SLI is about getting the best when cost isn’t much of an object; many professional YouTubers, for example, use SLI 980s. What does SLI give you? Put simply, more frames at better settings, both of which are very nice to have, especially if you have a high frame rate monitor.
    But yeah, if you want to run everything on ultra settings you need SLI, don’t let anyone tell you otherwise; there are thousands of games out there, some very demanding and some just very badly optimised, which run like shit at max settings without beastly graphics horsepower.

  13. mrhidley says:

    Totally agree with this, I had two 3gb 7950’s until the end of last year. They were great if the games had proper crossfire profiles, but most of the time, the games that really needed them, didn’t have them. Sold them both, bought a single GTX 980 and the experience has been much better.

  14. raskolnikov.mx says:

    Last Christmas I just changed my two 7870s for a single 980, which feeds a 1440p Asus monitor (that is a plain 60 Hz refresh but PLS goodness).

    Be it the lack of micro-stuttering, drivers issues, the crossfire profiles brouhaha, etc. I am totally glad I went this route.

  15. GAmbrose says:

    I run a ‘tri-fire’ AMD setup with a 295X2 and a 290X in a Haswell-E system.

    Simple. I have a Panasonic 4k TV (with DisplayPort) and it’s the only way to get decent 4k performance. I’m talking graphical bells and whistles and 60fps+. I only got Haswell-E (5930k) because it has 40 PCI-E lanes, so it’s really the only system to consider once you use more than 2 GPUs.

    It’s frustrating when the drivers aren’t updated frequently enough with dedicated Crossfire profiles for games (hello Far Cry 4, Elite and Dying Light) but on most games it works great.

    It’s also ridiculously expensive to build. But I do love sitting back on the sofa with my controller pad in hand and playing games at 4k. Most look really incredible.

    • Person of Interest says:

      Bless my average acuity: my eyes obey the one arcminute rule. From couch distance, I can barely see the difference between a 720p and 1080p source.

      • GAmbrose says:

        I agree it’s pretty pointless for TV and movies, currently.

        But I’ve done my gaming on big screens for as long as I’ve been able to, and over the past 10 years have gone from a Panasonic 50″ ‘HD Ready’ Plasma screen (1366×768), to a Panasonic 55″ 1080p Plasma, and now a Panasonic 58″ 4k LED TV.

        At 4k, lines are less pixelated and jagged, and therefore it does look better due to hyperacuity. Same reason still pictures look better at 4k (or ‘UHD’ at least).

        link to scholarpedia.org

        • mrhidley says:

          I totally disagree. If you think there isn’t a worthwhile difference between a 720p film and a 1080p film, you’re either not watching the correct content, your TV isn’t calibrated very well, or you’re sitting a hell of a long way from the display. I sit around 6 feet from a 47″ TV and there is a massive difference, games are way more noticeable.

          • mrhidley says:

            this was meant to be a response to the above post.

          • Chaz says:

            You only sit 6′ from a 47″ TV! Madness!

            My living room area is only 10×10 feet, I have a 32″ set and I’m only about 8 foot away from that, and honestly, I don’t think I’d want anything bigger. A 47″ set would just seem monstrous.

          • mrhidley says:

            It’s mostly because I live in a flat, but I like it, especially for gaming. Put on a great film with the surround cranked up and it’s better than being in a cinema IMO.

        • Person of Interest says:

          Thank you for the link. I should be able to test this for myself at home, right? I have a 101 PPI monitor, so when sitting further than three feet away the pixels will be about one arcminute apart, and from six feet the screen will be twice as sharp as I should be able to resolve (assuming I have 20/20 vision).

          What kinds of test patterns could I use to demonstrate hyperacuity?
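A quick way to sanity-check that arithmetic is to script it. A minimal sketch of the one-arcminute rule (the 101 PPI figure and viewing distances are taken from the comment above; the small-angle geometry is standard):

```python
import math

def arcmin_per_pixel(ppi: float, distance_in: float) -> float:
    """Angular size of one pixel, in arcminutes, at a viewing distance in inches."""
    pitch_in = 1.0 / ppi                         # physical pixel pitch in inches
    angle_rad = math.atan2(pitch_in, distance_in)
    return math.degrees(angle_rad) * 60.0

# A 101 PPI monitor viewed from 3 ft (36 in): just under one arcminute per
# pixel, i.e. right at the classic 20/20 acuity limit.
print(round(arcmin_per_pixel(101, 36), 2))   # -> 0.95
# From 6 ft each pixel subtends half that, so the screen out-resolves the eye.
print(round(arcmin_per_pixel(101, 72), 2))   # -> 0.47
```

Hyperacuity effects (vernier offsets, jagged-edge detection) can beat the one-arcminute figure, which is why aliasing can still be visible past this limit.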

    • Lord Byte says:

      Similar setup, though I don't sit on a couch ;) 4K right in my face ^_^ I had huge stability issues running two XFX 290s, though – issues which didn't exist before turning on Crossfire. And my setup should easily support two :(

      In my opinion most games run fine @4K, and if they don't, they don't support Crossfire anyway :( So I returned it.

  16. one2fwee says:

    We had a multi-GPU setup at work and it was awful. In most games it didn't work, no matter what mode – AFR or otherwise. On the occasions where it did "work", it was actually worse than a single card due to massive amounts of micro-stuttering.

    Unless your game is AAA from a big developer, there is no way it will work properly with a multi-GPU setup. Even then I imagine it can be awful.

    It gets even worse when you try to use it with triple screens. That's exactly the situation you would buy extra graphics cards for, yet, criminally, it's an area where multi-GPU rendering is immensely broken.

    So yes, I would strongly dissuade anyone from trying it (or even multi-chip cards). Both manufacturers are as bad as each other in this regard.

    Multi-screen gaming support is pretty atrocious all round, to be honest. Only driving sims do it properly (probably flight sims too, but I'm not up to speed on those). The rest just render as one surface, which leads to huge amounts of rectilinear distortion and looks utterly atrocious.

    Why can't games/applications run on multiple EXTENDED monitors? Why is Eyefinity/Surround necessary? Does Windows seriously not support multiple simultaneous full-screen applications at once, or a single application with multiple "full screens"? If not, why on earth not? And is Linux any better in this regard?

    Seriously, if triple-screen is to be decent, it needs to be separate viewports, running in extended mode, so you can mix monitors of any resolution, size and position. You could have a config tool in Windows where you tell it how big your screens are physically and where they sit in physical space. This would just be a set of 3D planes, obviously.

    It could then tell applications this info so they could run accordingly.
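The maths behind such a config tool is just a little per-screen trigonometry. A hypothetical sketch – the screen names, positions and the simplifying assumption that every panel sits parallel to the viewer are all mine, not anything from the thread:

```python
import math

# Hypothetical layout: each screen is (lateral offset of its centre from the
# eye, distance from the eye, physical width), all in cm. Every screen is
# assumed parallel to the viewer's face for simplicity (angled side screens
# would need an extra rotation term).
SCREENS = {
    "left":   (-65.0, 70.0, 60.0),
    "centre": (  0.0, 60.0, 60.0),
    "right":  ( 65.0, 70.0, 60.0),
}

def viewport_params(cx: float, cz: float, width: float):
    """Yaw of the screen centre and the horizontal FOV it subtends at the eye."""
    yaw = math.degrees(math.atan2(cx, cz))
    left_edge = math.degrees(math.atan2(cx - width / 2, cz))
    right_edge = math.degrees(math.atan2(cx + width / 2, cz))
    return yaw, right_edge - left_edge

for name, geom in SCREENS.items():
    yaw, hfov = viewport_params(*geom)
    print(f"{name}: yaw {yaw:+.1f} deg, hfov {hfov:.1f} deg")
```

Each (yaw, FOV) pair would become its own camera/viewport in the engine – which is essentially what the sims that do this right configure per-title, rather than it being handled once at OS level.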

    The way it is at the moment, the few applications that support this – Assetto Corsa, iRacing, rFactor 2 (though not configurable at the moment, grr) – all have to be configured in-game individually.
    Also, none support horizon positioning afaik, meaning you have to vertically centre your screens on your eyeline, which is a pretty stupid limitation.

    • GAmbrose says:

      There was a lot of hoo-hah about frame pacing in 2013, but since then AMD have really sorted out their drivers, particularly when it comes to multi-GPU Crossfire gaming, so there isn't any microstutter anymore.

      • one2fwee says:

        As far as i know, the improvements in frame pacing only apply to single-screen gaming – when you combine crossfire and eyefinity or SLI and surround, that is when it spectacularly falls apart.

    • Chirez says:

      I use two monitors as a convenience, having games running on the secondary, with the desktop on the primary.
      Or at least that’s the idea. It’s incredibly rare to find a game which even recognises there are two monitors, let alone gives you the option to run fullscreen or borderless windowed on the secondary.

      My point being that even the basic options aren’t a consideration for 95% of games, let alone actual multi display output.

  17. Frank says:

    Thank you for finally posting a “you don’t need” column

  18. Pointy says:

    I gave up on multi-GPU setups ages ago; I was spending more time comparing benchmark results than actually playing things. This goes for overclocking too.
    There is a place for needing this much grunt, but in the end only a handful of titles actually make good use of it.
    Being old and cranky, there are two things high on my priority list: running cool and running quiet.
    1080p@60fps is nice too.
    I admit I still have an SLI setup in my shed: 2x 8MB Diamond Monster Voodoo2s, 1024@100Hz on a CTX 17″ flatscreen. Some games have never looked better.

  19. racccoon says:

    I know the technology costs to produce, but once the makers of the video cards bring prices down to reasonable levels, then you'll see GPUs added en masse. I'm still waiting on the GTX 980, having changed my mind about waiting for a 970 price drop; I've still got a GTX 650 working in an i7 4.4GHz comp, so when they come down to a buyable price, I'll do it.
    That's their problem at the moment: graphics card developers seem to think it's best to make money really slowly, rather than in a massive rush by actually dropping prices like the gaming industry they supply does.

  20. TigerWolfe says:

    Some folks have mentioned that older cards in SLI can oft outperform the "better" cards. But an additional factor, at least for me, is the number of outputs. I run a minimum of 3 monitors on my desktop, so I end up using 2 cards to make sure I have enough outputs.

  21. Fontan says:

    When I assembled my computer, not knowing that much about it I got help from a friend who advised me to get two 560Ti instead of one 570. This is a move I came to regret later and when I next upgrade I will definitely get a single, more powerful GPU.

  22. mattlambertson says:

    I’m extremely sensitive to frame rate stability issues so when I discovered that my dual GTX 660 setup a few years ago created awful microstutter that I couldn’t get rid of even in graphically undemanding games (iRacing, ffs), it kind of killed any enthusiasm for multi-GPU setups. I like my GTX 970 and think I will continue to like it for a number of years. :)

  23. SpakAttack says:

    Sounds like I'm in the minority, but I've run twin Nvidia card setups for the last five years without any significant problems. I recently bought the Philips 4k monitor, and my two GTX 670s could no longer generate an acceptable frame rate at good quality settings, so they've been replaced with twin GTX 970s, which do.

    It’s expensive, sure, but this is my main hobby and I don’t spend much money on anything else.

    No regrets.

  24. triclops41 says:

    Because of my backlog in games, I usually don’t play games until they are close to a year old.

    That said, I play at 2560×1440 with sli gtx 770s, and am pretty promiscuous with the games I play…and it is glorious. I don’t have technical issues, and have a properly ventilated case with a sufficient power supply.

    I think the real issue is that with SLI and Crossfire, you should be waiting at least a few weeks on new games. (You should probably even wait that long with a single GPU, as the trend of releasing broken games is only worsening.)

    Once the bugs are fixed, multi-gpu gaming is pretty great, actually.

  25. tehfish says:

    I’m running dual GPUs not for rendering, but for other issues:

    Many GPUs are unable to drop to the minimum idle clockspeeds they are capable of if you plug a second monitor into them. The same restriction does not apply if you use one monitor per GPU.

    So I'm left with the slightly weird situation where my PC runs quieter and cooler when idling with two GPUs than it does with one.
    (Currently an ATI 6870 1GB and a passive ATI 6570 1GB I had going spare – make sure the RAM sizes are equal, though.)

    • Person of Interest says:

      Are you sure this isn’t an AMD vs Nvidia problem? My HD 5850 had higher idle power use (20 watts more?) if a second monitor was active, and when I searched for a solution, I heard it was a common AMD problem. My GTX 970 does not seem to suffer the same problem. TechPowerUp mentions in its R9 285 review that, “Multi-monitor and Blu-ray power consumption is still bad – while AMD has made small improvements here, the gap to NVIDIA cards is still gigantic.” There’s also a chart on that page that shows idle multi-monitor power consumption for a wide range of cards.

      • tehfish says:

        Possibly – I haven't bought Nvidia cards in a long time, though. I switched away due to the epic driver instability in the GeForce 4 era, and their proprietary shenanigans have kept me away ever since…

  26. MattM says:

    I've gone from GTX 260 SLI, to GTX 570 SLI, to GTX 770 SLI, and I'm pretty happy with it despite a lot of caveats. I'm not a launch-day gamer, since I like to pick up games after the worst bugs have been ironed out (or ignored), which is good since SLI profiles are often a bit delayed. I also play plenty of 2D or indie games that don't really need the extra power.
    It's when I play some graphically advanced title like The Witcher 2, Metro 2033 or Far Cry 3 that SLI really feels worth it. I get to keep most of the settings enabled, turn on AA, and still get high framerates. It adds a really nice smoothness to the whole experience, and advanced lighting and shadows can lend great atmosphere to moments.

    • MattM says:

      And firing up slightly older 3d titles and getting to play them at locked 120fps is pretty nice too.

  27. celticdr says:

    Recently upgraded my practically ancient Nvidia GTX 260 to a Radeon R9 290, which absolutely flies in pretty much every game… only problem is that it doesn't play nice with the resolution settings on my second display, a projector. Quite annoying when watching movies with black bars around the edge… can't win 'em all, I guess.

    • Colthor says:

      Check the overscan settings: last couple of AMD cards I’ve had defaulted to stupid with my telly, might be doing the same with your projector.
      (Getting the drivers to remember the setting was a battle, though.)

      Anyway, crossfire: had an x1900 setup once. Micro-stutter and input lag, fun times.

  28. drewski says:

    Finally some tech advice I can actually use!

  29. lanelor says:

    I am running an HD 7950 + R9 280 and the feeling is mostly bad – most 2014 games hardly work with a single card, let alone Crossfire/SLI. So you double your investment and power draw, while having to wait for updates/bug fixes/drivers, or some old games just crash because "2xGPU=error".

    + more FPS [sometimes]
    – new power supply
    – drivers/game optimization sucks

  30. Lord Byte says:

    I got silly: I bought that Philips 4K screen. And then I noticed that some games either looked awful at 1920×1080 or played horribly at sub-30fps @ 4K (Far Cry 4, WoT, …). So off I trundled for another exact copy of my video card (XFX R9-290X-EDBD 4GB) – I'd built my system with that option in mind.

    How wrong I was…
    First of all, the games I needed the extra card for didn't support Crossfire, so that sucked. Next up came enormous stability issues: the cards themselves would crash regularly (black screen for a few seconds), and sometimes would blue-screen or force a reboot! Now, my system is (was) solid as a rock – it just doesn't crash, and if it does I will work out what's broken and fix it. I cannot abide an unstable system.

    Normally I'd say this is a power issue, but I've got a 750W Antec TRIO TruePower and made sure every card had its own rail. The moment I took out the second card (or disabled Crossfire), the issues disappeared. It could be a fault with the second card, but simply the fact that the games I needed it for didn't support it, while most others worked fine with one card, made me return it.

    So I’d say, unless you have a specific title in mind, that supports multi-vga, and are fine with possible stability issues, DON’T!

    • lanelor says:

      I can't find a 750W Antec "TRIO" TruePower, only ordinary 750W PSUs. Unless you actually have 3 PSU units, dual R9 290Xs on a single 750W PSU is madness – hence the blank screens and restarts. Check your 12V: if it falls under 11.4V, the system restarts automatically.

      • Lord Byte says:

        o.O The 12V remained stable. Are you kidding me? A 750W triple-rail insufficient for a dual graphics card setup? What kind of power setup are you expecting for triple or quad graphics, then?

        • lanelor says:

          I don't understand what PSU you have. A single 750W PSU with three 12V rails? If so, the recommendation for an R9 290X system is already 750W – but what about the second card? My first HD 7950 was OK with 500W, but adding the R9 280 made me buy an 850W unit.

          • Lord Byte says:

            Antec TruePower Classic 750. Dual rail, apparently; should still be more than sufficient. It's got three separate sets of dual connectors for PCI-Express, which doesn't necessarily mean a thing with crappy PSUs, but I do trust Antec.

          • lanelor says:

            “The cards themselves would crash regularly (black screen for a few seconds), and sometimes would blue-screen or force a reboot!”

            This looks like the PSU is choking. I had similar complaints, and it turned out my old PSU was insufficient – when the 12V falls below 11.4V, the system resets. You can get a reading of the system and PSU load with HWMonitor: create a log of all parameters, play some heavy 3D game and check the log. If you have MSI Afterburner, the 12V reading can be displayed on screen as you game.
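That log check is easy to automate. A sketch, assuming a hypothetical CSV export with "time" and "12V" columns (the real HWMonitor log format may differ):

```python
import csv

RESET_THRESHOLD = 11.4  # volts; below this on the 12V rail, resets are likely

def rail_12v_dips(log_path: str):
    """Return (time, volts) for every logged sample where the 12V rail
    sagged below the reset threshold."""
    with open(log_path, newline="") as f:
        return [(row["time"], float(row["12V"]))
                for row in csv.DictReader(f)
                if float(row["12V"]) < RESET_THRESHOLD]
```

If this comes back non-empty for a log captured under heavy 3D load, the PSU (or how its rails are shared between the cards) is the prime suspect.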


          • Lord Byte says:

            Thanks for the info. I already sent it back, mostly because the stuff I needed the second GPU for didn't support Crossfire, and the rest ran fine. I've had dual GPUs before (HD 4890s) and they were more trouble than they were worth.

            At least now I know what to look out for if this happens again. I'm still uncertain that was the cause, because some crashes happened in games that didn't support Crossfire (which meant the second card idled at minimal power consumption). I could force the blue-screen just by switching resolutions in WoT, for instance.

  31. xcopy says:

    I've been running two GTX 670s for two years now and am very satisfied. I have a 1080p high-refresh monitor and most games run smoothly at 120fps with all the bells and whistles turned on. So far I have not encountered any graphical glitches/bugs/driver issues. (I know, this is not representative!)

    Heat was an issue, since both cards don't blow the hot air out of the case. Solution: a new configuration of the case fans and a custom fan profile via EVGA Precision and SpeedFan. Now everything is cool.

  32. CookPassBabtridge says:

    Twin 980 user here. I haven't had a chance to really push them, as I'm yet to buy my super-res monitor and recently fell in love with flight sims that only have eyes for my CPU. I really bought them for VR, as at the time Nvidia's VR SLI was looking the way to go (and may still be, with Steam VR and the Nvidia headset). It definitely made a massive difference in Elite: Dangerous VR – full settings, no judder.

  33. frightlever says:

    So I might have a spare GTX 750 TI soon – any point sticking it into my existing system (GTX 660 ti) as a dedicated PhysX processor?

  34. Ethilin says:

    I run 2x GTX 780s (not Ti) in SLI for 3D gaming @ 1080p. Yes, it appears to be a dead tech, but then I switch off SLI for all new games for at least a month or two after release, just to avoid the additional crashes and glitches that are present at launch.

  35. Stevostin says:

    1 – power
    I have what it takes, but it took me an effort

    2 – noise
    Way too much with just one, are you kidding me? And I did what I could to limit it.

    3 – room for cables
    Just getting the HDD cables to reach the mobo past the limo that is my GPU is challenging. No way I'm putting a second one in there.

  36. rowan_u says:

    As a guy who has used SLI a lot (GTX 480, 680 and 780 Ti), I agree wholeheartedly. SLI creates way more problems than it solves. At most you'll be using it for 1 or 2 games, while everything else is broken beyond use. This goes double if you like to play on release day. That said, when it does work, the boost is remarkable. Playing stuff like Far Cry 4 at 144Hz is pretty darn amazing. Is it worth it? Probably not, but that hasn't stopped me from forking out a heck of a lot of money :P

  37. Major Seventy Six says:

    Having both cards pulling separate strings seems like a lot of fun.

    With the differences in architecture, one is bound to be better at texture rendering while the other is better at geometry.

    Once this is established I would love to solve the puzzle of which Radeon/GTX combo is best at each price point.

    Imagine the hours of fun… you'd have to establish which one is the bottleneck, then either upgrade it or lower the other to stay in the targeted price range, and thus achieve the best possible performance at that price point.
    I'm thinking of using two bottom-shelf $80–100 cards in tandem to achieve higher performance, as this type of rendering would use each card for what it does best.

    Though I can hardly imagine this is something AMD and Nvidia would like.
    Hell, maybe even Intel "GPUs" might "do" something.

    Then again, pairing an AMD APU with one of their GPUs already works this way, somehow.

  38. Wisq says:

    Still running 2x GTX 690 (so 4x GPUs in SLI) and not having any major performance troubles some 3 years later. Granted, that was $2k of video card at the time, but it seems to have paid itself off.

    • RegisteredUser says:

      $700 a year would have gotten you the top-tier card every year. Sell the old one used at 50% off the next year and you could have done even better.

  39. Initialised says:

    Where multi GPU makes sense:

    One GPU per screen in a triple-screen system (4 GPUs is the current limit) or a stereoscopic system – on the assumption that the driver, API and game engine are all sensible enough to partition the workload correctly, rather than duplicate the frame buffer and then spread the load across X GPUs. So far this hasn't happened.

    Distributed number crunching aka Crypto coin mining and Folding@Home

  40. NZLion says:

    If DX12 is meant to let all fitted graphics cards combine their memory into what is in effect a common pool, is there any chance we will see add-in boards that are just more VRAM?

  41. RegisteredUser says:

    I have basically been trying to tell people all of this for years, yet they continue to try to run SLI setups.
    Simply staying in the sweet spot of "great performance for the price" and lagging the super-duper-shmooper latest games by 12 months will get you both the GPU and the games at massive discounts (20–40% for the GPU, 50–90% for the games).

    SLI, preorders and other epeenery are for people who have an issue with exactly that. Or vastly too much money and time and nothing else they wish or know to spend it on.

    • jrodman says:


      I play games 5 years old and more and I never have to upgrade anything!

    • OmNomNom says:


      I have at least as large a penis as people with more money than me.

  42. sansenoy says:

    I’m hoping adaptive sync will put multi-GPU where it belongs – in the gpu compute camp.

  43. adwodon says:

    I've got a 770 and it runs the games I want to play at 1440p, on a beautiful Dell IPS panel, with no problems. I tend to play StarCraft 2 and other PC-only titles, like Heroes of the Storm, The Sims 4 and, soon, Cities: Skylines.

    I have my PS4 for the big-budget AAA stuff. I actually prefer not having to deal with high-end graphics on PC: I want to upgrade every 2-3 years and not spend more than £3-400 when I do, and I'd rather have no options on console than get annoyed when I can't hit ultra on my PC. I know I could still get better graphics on my PC if I tried, but I prefer not worrying about it.

    As far as I've seen there are no reliable 4k panels on the market at the moment, so there's no way I'd upgrade. I work in embedded systems, specifically display walls, so I faff about with this stuff fairly regularly, and I'd not recommend anyone else do it either: you're better off buying a nice 1440p panel and supersampling, in my opinion.

  44. SuicideKing says:

    TechReport ran a hardware survey a short while back, and an exceedingly small percentage of the community had more than one GPU in their system.