Mass Effect: Andromeda system requirements confirmed

Now that Mass Effect: Andromeda [official site] has gone gold [for the benefit of younger readers: ‘gone gold’ is a reference to how developers would celebrate finishing a game by dancing around to Spandau Ballet’s Gold -ed.], BioWare have confirmed its final system requirements. If Holly’s recent preview has you interested in the new space adventure, hey, read on to discover how much power you’ll need in your datadeck. Your rig. Your GamePig. Your gameslammer. Your roxxor boxxor. Your beast. Your neon demon. Your make-the-pictures-go machine. Your computer, yeah? Is it fast enough?

The specs are listed on Origin:


MINIMUM
OS: 64-bit Windows 7, Windows 8.1 or Windows 10
PROCESSOR: Intel Core i5 3570 or AMD FX-6350
MEMORY: 8 GB RAM
HARD DRIVE: At least 55 GB of free space
DIRECTX: DirectX 11

RECOMMENDED
PROCESSOR: Intel Core i7-4790 or AMD FX-8350
MEMORY: 16 GB RAM

Of course, exactly how playable a game is on the minimum spec is always a bit of a grey area, but those are your numbers. My wee laptop should still run that, so I’m content.

March 21st is the big day.


  1. rabidwombat says:

    I’m looking for advice from a fellow game-rigger:

    Should I upgrade my RAM? I have 8GB of 1600 RAM. That fits the minimum specs, along with my video card. I plan on upgrading the GPU, but that’s a separate discussion. So. Will I notice an improvement in, well, anything, by upgrading the RAM?

    • CraigItsFryday says:

      Yes. Upgrading your RAM should help your overall performance. 16GB of RAM is pretty standard these days. Looking at the recommended specs and realizing my machine only just hits them makes me want to start shopping for parts.

      • LexW1 says:

        16gb is pretty standard? Really? I’ve almost never seen someone list 16gb unless they had some sort of ludicrous cutting-edge PC, and very few games even have it in their Recommended Spec – this is one of the first that I’m aware of.

        So I’m not sure if that’s true.

        • Bull0 says:

          16GB was cutting edge when I built my gaming PC six years ago.

          Battlefront and Battlefield 1 both listed 16GB in their recommended specs last year (in fact wasn’t Battlefront the year before?).

          It’s not an area worth skimping on; the difference between 8 and 16GB is probably about £20-30 max.

        • Unsheep says:

          I agree. 16GB is not the standard among PC gamers, especially if you are looking at it from an international perspective. I would argue that the standard is 6-8 GB, perhaps even 4.

          • Rosveen says:

            According to the Steam Survey, the standard is 8 (31%), followed by 12+ (21%) and 4 (20%). I’d guess that most dedicated gamers have either 8 or 16; the 4GB machines are probably older or not suited to gaming in the first place. I’m nowhere near the so-called “cutting edge”: I’m still using a laptop from 2011 that originally had 4GB RAM, and even that got upgraded to 8GB a while back.

          • noodlecake says:

            Hmm. I think a lot of people who have Steam are probably kids on their parents’ PCs or cheap laptops, or students who can’t afford gaming PCs. I think that would explain people having less than 8.

            I only recently upgraded to 16gb. I think my processor doesn’t meet the recommended spec but my 980 should be more or less okay.

          • Pizzzahut says:

            16gb is definitely the standard for gamers. The standard for kids using their Mum’s PC (ie; the whatever you can get your hands on spec) is probably 4gb, but that doesn’t really count.

    • Jokerme says:

      If you don’t do extra things like recording or streaming while playing, I’d say 8 GB will be the sweet spot for at least a few more years, considering the console limitations.

      • sosolidshoe says:

        I thought so myself until last year. I was having a lot of issues with games whose minimum requirements I ostensibly more than exceeded, and in all but one case (that one required a GFX upgrade) they were entirely solved when I switched up to 16 gigs of RAM.

        A lot of the time when modern games list 8GB in the minimum, they mean the game actually uses 8GB, not that a system with 8GB will run it; once you factor in OS overhead and background apps, get ready for lots of out-of-memory crashes if you stick with 8GB much longer.

        • Jokerme says:

          As far as I know, without modding, there isn’t a single game that can use 8 GB of RAM. They don’t ever use that much. If a game uses 4, your minimum should be 8 with the system and the rest included. And even then you’ll have a fair amount of free RAM, because the system optimizes itself if RAM gets full.

          I’d say you either had faulty RAM or your system was very “dirty.” I used a 3 GB system until last year, and most of the time games with a 4 GB minimum RAM requirement ran without any problems.

          More RAM doesn’t improve your performance; it just sits there without doing anything. As long as all applications have their share of the RAM, there is nothing more RAM can do. And EVEN IF you don’t have enough, the system can still tolerate that with disk swap files, though of course at the cost of performance. That’s when you upgrade your RAM.

          • sosolidshoe says:

            What are you on about chief? 64bit has been a thing for a good few years now and the assertion that no game uses more than 4-ish gigs is odd given that. You’re talking like we’re all still on Windows XP or something.

            For the record, I’ve been building and maintaining systems for over a decade and I keep my machine in good nick in both hardware and software terms. I did actually bother to check whether my RAM was working properly before I went out and spent money on more. If I have 8GB with ~10% usage for system and background apps, and games are crashing or triggering the Windows “low memory” popup while the system shows ~90% of RAM in use by that game’s executable in Process Hacker, and if after upgrading to 16GB those same games use up a bit over 8GB and then stay there running stable, it’s pretty obvious what the issue is.
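For anyone wanting to reproduce the kind of check described above without Process Hacker, here is a rough sketch in Python using the third-party psutil package (psutil is our assumption here; any process monitor will give the same picture):

```python
# Rough system/process memory check, similar to what Process Hacker
# or Task Manager shows. Requires the third-party `psutil` package.
import psutil

vm = psutil.virtual_memory()
print(f"RAM: {vm.total / 2**30:.1f} GiB total, {vm.percent}% in use")

def rss_bytes(proc):
    # memory_info can be None if access to a process was denied.
    mi = proc.info["memory_info"]
    return mi.rss if mi else 0

# The five biggest consumers by resident set size.
top = sorted(psutil.process_iter(["name", "memory_info"]),
             key=rss_bytes, reverse=True)[:5]
for proc in top:
    print(f"  {proc.info['name']}: {rss_bytes(proc) / 2**30:.2f} GiB")
```

If a single game executable sits at the top of that list eating most of the machine’s RAM, the diagnosis above follows directly.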

    • mattevansc3 says:

      Not enough to warrant the expense, because unless you’ve nearly maxed out the other components it’s not going to be a bottleneck.

      Instead I’d pump the extra cash into my GPU budget.

      • malkav11 says:

        Another 8 GB of RAM runs like $45-60, will make a real difference, and wouldn’t move the needle that much on a GPU unless you’re going real low budget on that. And if you are, then you are going to have to make performance concessions anyway.

        • LexW1 says:

          Where does 8gb of fast new ram cost $45?

          It’d have to be two 4gb sticks, of course. Plus some people would need to upgrade the whole lot, because their ram is too old or already taking up all their slots, so they’d be looking at 16gb.

          • malkav11 says:

            Amazon. It’s not a brand I know so I wouldn’t necessarily recommend it, but it’s there.

        • mattevansc3 says:

          It’s about £60 which is the difference between the 3GB and 6GB variants of the GeForce 1060 cards.

    • KastaRules says:

      Bear in mind that having more RAM does NOT make your computer faster; you know you need to upgrade it when you start reaching the 100% mark of memory in use fairly often. Otherwise it would give you no benefits at all.

      • rabidwombat says:

        That’s why I gave the MT/s rate of my DDR3. I could upgrade to 8GB of 2133 instead, which would improve speed (theoretically), but I wanted to hear from experts. I suspect that, for the purposes of gaming, it would not.

        • Jokerme says:

          No, going from 1600 to 2133 won’t make that much of a difference.
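As a footnote to the 1600-vs-2133 question, the theoretical peak bandwidth gap is easy to put a number on. This is a back-of-envelope model only: dual-channel operation and a 64-bit (8-byte) bus per channel are assumed, and real-world gaming gains are far smaller than the theoretical peak.

```python
# Back-of-envelope DDR3 bandwidth comparison for 1600 vs 2133 MT/s.
# Assumes a typical desktop dual-channel setup, 8-byte bus per channel.

def peak_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Theoretical peak memory bandwidth in GB/s."""
    return mt_per_s * 1e6 * channels * bus_bytes / 1e9

for speed in (1600, 2133):
    print(f"DDR3-{speed}: {peak_bandwidth_gbs(speed):.1f} GB/s peak")
# DDR3-1600: 25.6 GB/s peak
# DDR3-2133: 34.1 GB/s peak
```

On paper that is roughly a 33% bandwidth uplift, which is why synthetic benchmarks notice it while most games, being GPU-bound, barely do.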

      • TormDK says:

        More RAM might not, but faster RAM will.

        Please see; link to

    • Physickl says:

      If you can get another 8GB to make it 16, I would recommend it. Yes, it does make a difference, not by much in terms of FPS, but it can; it’s more to do with spikes. If your memory usage is anything past 90%, that’s bad, and with PCs that’s most likely the case. They’re not the same as a console; a console only handles small data compared to a PC, you can’t do anything on it but game, watch movies and surf the web.

      • Don Reba says:

        A console only handles small data compared to a PC; you can’t do anything on it but game, watch movies and surf the web.

        But those are pretty much the only memory-intensive activities on a PC, as well, unless you decide to play while encoding video.

        • Unsheep says:

          True. The vast majority of PC gamers use their PCs only for gaming and media, … which is very similar to how American console gamers use their consoles.

          It’s only different for people who use their PCs for actual 3D work, but most 3D designers use an iMac in my experience, unless you’re talking CAD.

    • Koozer says:

      As the chap above says 8GB is perfectly adequate unless you’re running multiple RAM heavy programs at once, or are using high res texture packs or the like. If you encounter stuttering when eg. moving quickly or turning, you may notice an improvement with more RAM.

      • mattevansc3 says:

        I might be oversimplifying this, but high-resolution textures tax the GPU’s VRAM. Frame stutter arises once you’ve maxed out the VRAM and the system starts using system RAM, which is considerably slower.

        The only way to resolve that is to have a GPU with more VRAM, generally in the region of 6-8GB.

    • Spuzzell says:

      There’s a 1 FPS advantage (55fps to 56fps) for a system with 16GB of RAM over the same system with 4GB of RAM (that’s four GB, not eight) in GTA V.

      The same system with 8GB of RAM posted identical numbers to 16GB.

      8GB is plenty. Don’t bother until you actually have to.

      • TormDK says:

        While this is true, the speed of the RAM matters quite a bit more; please see: link to

        I went from DDR3 1333MHz to DDR4 3000MHz when I recently built a new PC to be ready for ME-A, and it has helped in Battlefield 1, as an example. You might not see a higher peak FPS, but you will see a higher average FPS.

    • Jekadu says:

      I’ve found that one of the biggest benefits of extra RAM is just peace of mind. It removes a few bottlenecks and lets you do silly stuff like run two games at the same time (which actually makes sense sometimes!). It gives you room to spread out.

    • ooshp says:

      It’s worth upgrading because it’s so cheap, but no you don’t really need it and won’t notice a massive improvement. Unless you’re running a large amount of stuff in the background, that is. Low GPU RAM is far more likely to destroy your frames.

      Heavily modded Kerbal is the only game I can think of that ever crashed my 8gb before I upgraded to 16.

      edit – also what the guy above said, sometimes I totally forget to exit a game before running another one, definitely want 16gb for that

    • Hohumm4sh3d says:

      After your memory upgrade, I would advise moving to an SSD, unless you have one already. Once you go beyond 8GB of RAM (obviously more is better), the speed jump from a spinning disk to an SSD makes a large difference in loading times in a lot of games too.

    • TomA says:

      I had a memory crash a couple of days ago on my 8GB machine, with a few Chrome tabs open, Rainbow Six Siege and GeForce Instant Replay running.

    • Sagiri says:

      RAM is cheap. I have 32GB; I’ve never used anywhere near that much, but it wasn’t very expensive, and it’s not like having extra RAM is a bad thing.

  2. brucethemoose says:

    Mmm, my 7950 is inching closer and closer to minimum spec these days… And those clearance Furies are really tempting.

    • ColonelFlanders says:

      You can get a GTX970 for really not a lot of money these days, especially if you go second hand – there isn’t a great deal between it and the fury.

      • MacPoedel says:

        A GTX 970 is quite a lot slower than an R9 Fury. The GTX 970 is more comparable to an R9 290(X), while an R9 Fury can be compared with a GTX 980 Ti. See benchmark comparison: link to (the R9 Fury there is an overclocked version, but only barely, and there is no reference R9 Fury). There are a lot of side notes to that comparison: the 970 can be overclocked a lot more, closing the gap a bit, and it uses a lot less power; on the other hand the Fury has better drivers now than when the initial benchmarks were made (the 970 as well, but not to quite such an effect as AMD’s GCN GPUs have seen), a lot more memory bandwidth and much better 1440p+ performance. The bottom line for me remains that they’re a different tier of graphics card; in general the Fury is faster.

        You can’t even get a GTX 970 and an R9 Fury through the same channels: the 970 can’t be bought new anymore, but you can get a good deal second hand (though I’d still say an R9 290 is a better choice). On the other hand, I can barely find second-hand Furies, but new ones have been discounted a lot lately.

        • brucethemoose says:

          I like to ~double my perf when I upgrade, then sit on the GPU for a long time without breaking the bank, which means I’m looking at a Fury since the 980 Ti is still so expensive.

    • ooshp says:

      I can’t say enough good things about my 7950 OC, it was happily running games in 1440, and almost all VR titles, until I replaced it in December. If all I was doing was 1080p gaming I’d probably still be using it.

  3. Det. Bullock says:

    Intel Core i5 3570?
    I can accept my video card (AMD 7770 1gb) is completely inadequate but isn’t the processor a bit high for a minimum requirement?

    • kickme22 says:

      I’m sorry, but the day you call an i3 “a bit high” is the day you should know you need to upgrade…

    • mattevansc3 says:

      Open world/sandbox games are generally quite CPU intensive as they are running more AI routines than a standard action game.

      • Det. Bullock says:

        I dunno, it still feels quite excessive to me, since this thing is supposed to have a PS4 version too and the CPU in that machine isn’t exactly cutting edge. So either the CPU requirements are inflated, as sometimes happens nowadays (I remember being able to play Arkham Asylum decently with a single-core CPU and a low-tier video card by lowering the resolution a bit, and I could easily keep the textures on high), or we are going to have another Arkham Knight on our hands.

        I mean, if the thing can run on a PS4 (even allowing for the difference between a closed system and an open one), shouldn’t it be able to scale down a bit more than that?

        • Lukasz says:

          2 core vs 4 core.

          So while speed of i3 is enough the lack of cores might be a problem.

          If you program your game to take advantage of 4 separate physical cores, then even if you throw a super-powered dual-core at it, it will struggle (and the i3 is not that good anyway).

          That’s why AMD, even though slower, is still going to be the better option.

          An i3 these days is worthless as a gaming CPU.

    • Jason Moyer says:

      My favorite thing is that those two CPUs aren’t even near each other in Tom’s CPU hierarchy:

      link to

  4. Rich says:

    It’s a sad day when your CPU appears on the minimum requirements.
    It’s a shame CPUs don’t (can’t?) have long-standing standard sockets like GPUs. It’d be nice to be able to keep up with developments without needing an entirely new motherboard.

  5. Darth Gangrel says:

    I thought my PC was a beast (except for the GPU, Nvidia GTX 680) when I bought it in May. Admittedly, the guy I bought it from built it a couple years ago, but even so, I’m surprised to see the rec. specs being so high. I have 16 GB, which I thought at the time was overkill, and an i7-4770 (just below the rec. i7-4790, but probably won’t make a difference).

    Whatever, I bought it for the purpose of playing The Witcher 3 and that game is still so new (i.e. far into the future) to backlogged me that I’ll probably have upgraded my GPU once I get to Andromeda.

  6. indociso says:

    A lot of games these days seem to have 4 cores as the minimum spec. I’m using a Core 2 Duo E8500, but I’ve got 8GB RAM and (soon) an RX 470 so otherwise I’d be relatively happy with my spec. Is my lack of cores going to make this unplayable?

    • leeder krenon says:

      A Core 2 Duo CPU is practically prehistoric these days. Time for an upgrade.

    • thatdosbox says:

      I’m using a Core 2 Duo E8500

      You should give serious consideration to upgrading that. The Frostbite engine (which Andromeda and DA:I use) will make good use of additional CPU horsepower:

      link to

      This quote is particularly relevant:

      Dragon Age: Inquisition ran with 16fps on our simulated dual-core system (with major stuttering issues), while our simulated tri-core system was able to push 45fps. On the other hand, our simulated quad-core system ran that scene with 60fps

      Fortunately for you, reviews of AMD’s new Ryzen CPUs should be coming out on March 2nd, so you’ll have some time to consider whether to go with Intel or AMD for your upgrade.
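The quoted Dragon Age: Inquisition figures can be sanity-checked with a couple of lines of arithmetic (only the numbers from the quote are used here):

```python
# Scaling arithmetic for the quoted Dragon Age: Inquisition benchmark
# (16 / 45 / 60 fps on simulated 2- / 3- / 4-core systems).
fps = {2: 16, 3: 45, 4: 60}

for cores in (3, 4):
    print(f"{cores} cores: {fps[cores] / fps[2]:.2f}x over dual-core")
# 3 cores: 2.81x over dual-core
# 4 cores: 3.75x over dual-core
```

A 2.81x jump from adding one core is far beyond the 1.5x that raw core count alone would predict, which suggests the dual-core run wasn’t merely short on throughput but actively stalling, matching the “major stuttering issues” mentioned.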

      • indociso says:

        Yes, I should have mentioned that CPU, motherboard and RAM are the next on my list of upgrades! But it’s probably going to be a while until I can do it.

        Thanks for the replies though. I wasn’t sure how much difference the extra cores would make so it’s good to know. Looks like I might have to get this on *another platform*

        • TormDK says:

          Good call on considering the RAM upgrade as well; do keep the speed of the RAM in mind. The sweet spot seems to be the 3000MHz DDR4 variant.

          More info here; link to

  7. thatdosbox says:

    developers would celebrate finishing a game by dancing around to Spandau Ballet’s Gold

    I laughed.

  8. Papageno says:

    Re: the origin of “gone gold,” I’d always read that it stemmed from the developer traditionally sending the master copy of the game to the publisher on a gold-colored CD. At least that’s what the guy who ran the long-defunct site put out there.
    I guess it depends on whether the expression pre-dates the CD era for games (or maybe there were gold-colored floppies?).

  9. geldonyetich says:

    If Andromeda doesn’t work with my current rig, I’d say I’ll be picking up a Ryzen sooner rather than later.

  10. ulix says:

    I have a good old i5 2500k with a relatively recent GeForce 970, and I expect this to run well in high details. Like every game up until now has run well in high details on my old CPU.

    I may not get 60 fps, but I’m pretty sure I’ll be able to get a 30 fps lock.

    We’ll see.

    • Love Albatross says:

      You’ll be fine with the 2500k. I find it doubtful that moving one gen forward to the i5 3xxx the min specs mention would make any noticeable difference.

      Every time Intel releases a new CPU I look at it and see no compelling reason to upgrade from the trusty old 2500k. Ended up putting my CPU upgrade cash into a GTX 1070 and it’s been a huge improvement.

    • redhqs says:

      heh, guess I’ll find out if my old i5-760 can keep up, the venerable dreadnought is starting to look more like a dreadnought emeritus

  11. Disgruntled Goat says:

    Is it me, or does Andromeda look like a giant yawn?

    It is probably just me.

    • sweenish says:

      I was in the same boat, the combat gameplay video changed my mind.

      I still won’t be getting it at launch, and I won’t be getting it until I have at least a new GPU, but I will be getting it now.

    • Unsheep says:

      My impressions: a cover-based open-world shooter.
      The series sure has strayed a long way from the tactical and explorative RPG that was Mass Effect (1). The franchise went completely downhill after that first game.

      • TormDK says:

        You are going to be able to do a lot more tactical stuff in ME-A than you ever could in ME1.

        The movement options alone are going to blow ME1 out of the water, plus the whole setup around abilities that can be switched out will give great tactical flexibility.

        We don’t even know what difficulty level those videos were made on, but I won’t assume you can simply wade through it all at the highest levels.

    • Czrly says:

      It is not just you. After enjoying 1 and 2, I started playing 3 and completely lost all interest. The “character” of the game was absent and, without that, all that’s left is boring and repetitive cover-based shooting.

      I am planning to build a new desktop PC this year to replace my four-year-old laptop, but this AAA title is not my motivation – anything but.

      Probably I’ll end up with a monster and use it for Stardew Valley. There’s more innovation, originality and charm there than in the whole AAA industry put together.

    • dskzero says:

      I felt the same about the first trilogy, though. It looked incredibly uninteresting. This one, though, has an air of... a lack of transcendence, you know? We’ve already been told there is no unstoppable threat, no apocalyptic scenario; it all feels... a bit too casual.

      I’ll hold on though for reviews and opinions.

  12. Samudaya says:

    Are you going to post every little scrap of Mass Effect news over the coming 4 weeks? Then I’ll come back in a month.

    • Frosty_2.0 says:

      Hah, it’s not as if there’s been a flood of posts; they didn’t even post about the new (second) Gameplay Series diary video (link to ) as they did with the first one. They combined the gold news into this system requirements post, while most other outlets are making individual posts out of scraps from dev tweets and the like popping up on my feed.

      This is nothing compared to many other marketing campaigns (remember the deluge of marketing hype ahead of ME3)…

      • Unsheep says:

        Yes, this is rather modest compared to the explosion of videos we’re about to get from GameSpot, IGN and many others, with such delightful and informative content as:

        ‘Andromeda vs Dark Souls/Witcher3/Skyrim/DragonAge’
        ‘PC vs XBox One vs PS4’ … as if people actually choose
        ‘who can we bang first ?’
        ‘look how shiny this game is’
        ‘Top 10 silly things to do’
        ‘ … [multiplayer, multiplayer, multiplayer]’
        ‘why you should NOT buy Andromeda’ [obvious click-bait]
        ‘How you should play the game’

        • Ghostwise says:

          Well, people click on these articles. Like, a lot.

          Every article you do not run on $HUGELY_POPULAR_GAME is money left on the table.

  13. aircool says:

    I may get this as I really want a decent sci-fi (OK, sci-fantasy) RPG. I’ve had enough of fantasy RPGs, particularly the hey-nonny-nonny folk music that pervades every inn where you go to pick up rumours (quests) etc…

    • Czrly says:

      Then don’t get this. Mass Effect is *not* an RPG… at least not any more. It’s a cover-based shooter with AAA production and that’s it.

      • Nevard says:

        It’s still as much an RPG as The Witcher.

      • Von Uber says:

        In that case it never has been an RPG.

      • aircool says:

        Damn… does no-one make turn-based combat RPGs anymore?

        • Wulfram says:

          There are quite a few turn based RPGs around at the moment, like Divinity: Original Sin, Torment: I can’t remember the subtitle, Shadowrun and so forth.

          Probably some JRPGs too, but I’m not qualified to comment.

        • Janichsan says:

          There’s also Wasteland 2.

      • Zenicetus says:

        I thought the term used for these was “Guns & Conversation” games? Or don’t we use that now?

        I always liked that better than “Action/RPG” because there isn’t much room for actual role-playing with the predefined characters in games like the Witcher series, recent Tomb Raiders and Wolfensteins, and the ME series. You can make the lead character a little more gruff or a little more sympathetic and that’s about it. It’s guns/swords/bows & conversation.

  14. wombat191 says:

    this game gave me an excuse to finally get that graphics card upgrade… my baby is ready for it :D

  15. skyturnedred says:

    I have faith in my GTX470.

  16. Von Uber says:

    Looks like my GeForce4 has finally seen its day.

  17. ruaidhri.k says:

    well, it would seem my trusty Intel Core i7-960 might finally be getting long in the tooth

  18. alsoran says:

    Amazed to see my FX-8350 up there :) (Tick!)
    16 GB RAM (Tick)
    GPU is R9 380 (Wah-wah!)

    Two out of three ain’t bad. Not to worry, I won’t be getting this until it’s been tested, reviewed, updated and has a couple of DLCs behind it.

    Also I still have to finish Witcher 3 and the rest of my backlog.

    • BlueTemplar says:

      Last year when I upgraded to an FX-8320E I was betting that in future, developers would get better at using its multi-threaded processing power…

      Lo and behold, for this game they’re now placing:

      Intel Core i7-4790:
      $318 – 2,286 PassMark points in single-threaded performance, 10,002 in multi-threaded,

      at the same level as:

      AMD FX-8350:
      $150 – 1,505 PassMark points in single-threaded performance, 8,937 in multi-threaded!
      link to
      link to

      FX-6350 gets :
      $128 – 1,479 single – 6,959 multi
      (Yup, assuming they’re not just guessing, multi-threading from 6 up to 8 cores does make a difference now!)
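For what it’s worth, BlueTemplar’s figures fold neatly into a quick points-per-dollar comparison (only the prices and PassMark scores listed above are used):

```python
# Points-per-dollar and multi/single ratios for the PassMark figures above.
cpus = {
    "i7-4790": {"price": 318, "single": 2286, "multi": 10002},
    "FX-8350": {"price": 150, "single": 1505, "multi": 8937},
    "FX-6350": {"price": 128, "single": 1479, "multi": 6959},
}

for name, c in cpus.items():
    print(f"{name}: {c['multi'] / c['price']:.1f} multi points per dollar, "
          f"{c['multi'] / c['single']:.2f}x multi/single")
# i7-4790: 31.5 multi points per dollar, 4.38x multi/single
# FX-8350: 59.6 multi points per dollar, 5.94x multi/single
# FX-6350: 54.4 multi points per dollar, 4.71x multi/single
```

On value per dollar in multi-threaded work, the FX chips come out well ahead, which is exactly the bet described above.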

    • tehfish says:

      Nice to see my CPU still just about holding on there for the recommended specs, despite being almost 5 years old now!

      FX8120 @4.2ghz (so roughly FX8350 performance)
      16GB RAM (tick)
      R9 390 (tick) :D

  19. Robomonk says:

    I do wonder what the requirements will be for Cyberpunk 2077. 100GB of HD space? GTX 1080/RX 480? I7 7700K/Ryzen? 32GB RAM? Then again, they’re also making it for the console, so maybe not so out there – unless they have a PC exclusive downloadable high definition texture pack. It would be nice if they allow for finer control in the graphical settings.

    • Don Reba says:

      Hey, by the time it is released, smart phones will have more computing power than that.

    • Zenicetus says:

      Witcher 3 did an outstanding job on less than current state-of-the-art hardware, based on game design and writing quality.

      If Cyberpunk 2077 can match what they did with Witcher 3, then they don’t need to wow anyone with eye candy that needs high-end hardware.

  20. Chaoslord AJ says:

    55 GB of hard drive space. Oh goodness. Guess I have to move my Skyrim installation elsewhere.

  21. Nauallis says:

    There’s a pun buried in there somewhere… something something system specs something something game about exploring a galaxy… must be a boring game if you only have to worry about specifications for a single system… Nope. No humor there, sorry.

  22. RaunakS says:

    I game on my laptop due to always being on the move, but game hardware requirements still stupefy me. I apparently have a 4GB GT 940MX card + an i7-7500U + 16 GB RAM, all of which is fine for the amount of neural network programming I do but terrible for most AAA games. I am still looking forward to ME4, but I won’t buy it if I can’t try it out with a demo.

  23. AlteredCarbon says:

    AMD A10-7700K Radeon R7, 10 Compute Cores 4C+6G

    This is my CPU. Do you think it would be okay? Everything else is fine, but I’m not sure about my CPU.

  24. Doomshot8 says:

    Someone help!!! Will this ASUS laptop…
    Processor: Intel(R) Core(TM) i7-4720HQ CPU @ 2.60GHz
    Installed memory (RAM): 16.0 GB
    System type: 64-bit Operating System, x64-based processor
    Nvidia GeForce GTX 960M
    …run Mass Effect Andromeda well on, like, high settings (not ultra)? Everyone keeps saying “damn, I have a 970M”, well, I have a 960M…

    I would buy it for Xbox One because I prefer console to PC, but I wanna test the whole “PC master race” thing with this game. I don’t think my laptop is up at top-notch 60fps 1080p or whatever level of system requirements, but is it enough for a legit 100% good graphics experience on this laptop?