What An Eye Full: Watch Dogs’ System Requirements

By Alice O'Connor on April 7th, 2014 at 11:00 am.

You'll have someone's eye out with that thing

I’ve grown complacent, and my PC weak. I haven’t upgraded at all in a fair few years, but now face building a whole new system. See, the demands of multiplatform games pretty much stalled once console developers started pushing their hardware’s limits, but now a whole new load of consoles are out and oh, their games are hungry. One of the first all-singing, all-dancing shiny big games is Ubisoft’s Watch Dogs, and its system requirements confirm that it’s about time I take my dear old friend down to Regent’s Canal in a hessian sack. Good night, sweet prince.

Ubisoft has leaked and pulled and re-revealed system requirements for Watch Dogs before, but now that they’re on Steam, let’s pretend they’re actually real and plan accordingly (until Ubi open up pre-orders worldwide, you may need to visit the US version of the store page to see the specs).

The minimum spec demands a 2.66GHz Intel Core 2 Quad Q8400 or a 3GHz AMD Phenom II X4 940, coupled with 6GB of RAM, a 64-bit version of Windows, and a 1GB GeForce GTX 460 or Radeon HD 5770. Given that “minimum spec” essentially means “your PC will load this in half an hour then run at 15fps,” look more towards the recommended spec, which seeks 8GB of RAM, a chunkier video card with 2GB of RAM, and better CPUs with more cores. Thankfully it won’t be too expensive to build a PC that keeps pace with the early days of the new console generation, but I’d grown quite accustomed to life outside the upgrade rat race.

Watch Dogs is due to launch on May 27, barring the traditional Ubisoft last-minute surprise PC delay.


97 Comments »

  1. rhubarb says:

    The specs go up, the time since last update gets longer and the money I would need to catch up gets bigger and bigger. Maybe I should go back to just not playing new AAA games at all. Oh no, maybe I’m why big publishers don’t like PC…

    • Runty McTall says:

      Given how many unplayed games I have, the amount of free time available to me (wife, kids, career, other interests) and budgetary considerations (wife, kids, other interests, mortgage) I’ve basically decided to surf along two or three years behind the modern release schedule.

      Between Moore’s law and Steam sales this makes the gaming hobby incredibly cheap and also skips the majority of annoyances like buggy releases and launch-day server performance issues, while also giving access to things like GOTY versions with everything packed in for zero or minimal additional cost. Plus you can more easily look beyond the hype and get a feel for what is genuinely good.

      I’ve made a few exceptions, paying full price on launch day for games whose devs I liked (Dishonoured, XCOM) or which held co-op promise (love me some good co-op – bought Wargame: AirLand Battle for example), but basically I’ve still got a massive library of unplayed games on my machine and neither the games nor the equipment cost me very much at all.

      Sometimes a game comes along that makes me waver a bit but it generally passes. Also, sometimes there’s a game I’d like to play under my own rules (used to be no more than £7 in a Steam sale but now I think I would go as high as £9) which is mystifyingly never reduced very much on Steam (Black Ops 1 – I’m looking at you) but I’m pretty good at resisting these temptations.

      • Syra says:

        I’ve also been doing that on and off with single player sorts of games where I’m not desperate to play, given time constraints I play them a year or two later, pick up a GOTY in a steam sale etc. It’s just a shame the same does not work if you are at all interested in multiplayer experiences (where the community will have moved on) or playing with friends who will inevitably be chasing down the next new release.

        • Darth Gangrel says:

          That’s why you don’t play multiplayer games and why you don’t have friends, or at least not friends that you play games with :P. Without being dependent on other players, you’re free to play whenever you like.

      • aiusepsi says:

        I’m starting to wonder if there is any topic which doesn’t have a relevant xkcd: https://xkcd.com/606/

        • Runty McTall says:

          Heh, the “the cake is a lie” thing is interesting actually, because I only played Portal a few weeks ago (not really for any of the reasons above, just because I never got around to it). Knowing that the cake was a lie probably had more impact on me psychologically than in practice – it’s pretty clear from early on in the game that you are being deceived, so I’d probably have worked out about the cake anyway – but it did definitely make me feel like I was discovering it less organically.

          Also, being a bit of a completionist, knowing that the ending had been changed upon the announcement of Portal 2 made me a little sad.

          Meh, my friends aren’t on the bleeding edge either so it’s not like I’m missing out on pop-culture exchanges – they mostly play Supreme Commander 2 against absurd AIs :)

          • roryok says:

            the ending was changed?

          • LionsPhil says:

            Yes. The very last thing that happens from your PoV in Portal 1 was patched in during the Portal 2 hype windup.

            As were all the radios, actually, IIRC.

          • Runty McTall says:

            Yes, but only a few seconds, I think. Both endings are on YouTube, IIRC (don’t want to spoil anything for anyone by writing more here).

          • Surlywombat says:

            Only in a very minor way, to explain why you are back testing in Portal 2.

          • Fenix says:

            Why would anyone play Supreme Commander 2 over the [vastly-superior] original SupCom?

          • The Random One says:

            I played Portal some six months after it was released and even though I avoided outright spoilers, simply knowing that GLaDOS was meant to be a funny character spoiled most of her dialogue in the first few sections, when most of the jokes are only funny because you weren’t expecting them.

      • rexx.sabotage says:

        ^ Nailed it!

    • PoulWrist says:

      Except you have it the wrong way around. The longer you wait between upgrades, the less you actually have to pay to get the performance required to play said games. Whereas with piecemeal upgrades you generally end up not feeling any difference, limping around with some bottleneck in your system and just being screwed over by whatever.

      The specs to play it at the recommended level? About the price of an Xbox One or PS4…

  2. iainl says:

    More than four cores? Ouch.

    Realistically, I doubt my 4670k + GTX 770 with 16GB are going to be too upset, mind you.

    • TacticalNuclearPenguin says:

      Heh, a chunky processor still beats a weaker one with more cores. It’s not like the new consoles have “real” 8 cores anyway, more like 4 “modules” with sub-2GHz speeds.

      • SuicideKing says:

        That’s not correct: the consoles have Jaguar cores, not Piledriver modules. So there are 8 cores, albeit weak ones, divided into two clusters of 4 each, connected by… something like a north bridge, I don’t remember this last detail.

        Jaguar’s also used in AMD’s Kabini and Temash SoCs, and is mainly a competitor to Intel’s Silvermont uarch.

        • PoulWrist says:

          Except no, they’re connected by an internal bus in the CPU, just like every other CPU out there. The AMD CPU modules, however, are two processing cores each that share a common cache.

          Also, a northbridge is a piece of hardware that sits outside the CPU, and it hasn’t been used in modern hardware for maybe 5 years.

          • SuicideKing says:

            4 cores per cluster, not 2 – Piledriver has 2 integer cores and 1 shared FPU per module.

            I know what a northbridge is; I meant the interface connecting the 4-core clusters, which lies outside both clusters IIRC. I also said I don’t remember this detail properly.

            Though yeah, I guess I used “like a north bridge” a bit too vaguely.

      • TacticalNuclearPenguin says:

        I stand corrected, you’re right (on some parts), but then again we’re still talking about architectural gimmicks.

        I’m not saying AMD is crap either; there are applications for more cores, and games can most definitely benefit too, sometimes, just not as much as encoding, raytracing and other perfectly multithreaded workloads.

        After all, the problem with games is that they still have a main thread that the other threads have to be synched with (read: wait for), and not everything lends itself to being parallelised.

        Any good Intel quad is still going to stay on the high end of things for a long time, especially over the 4GHz mark.

        • SuicideKing says:

          Very true. Though looking at how DX12 redistributes the workload, parallelism will be increasingly favoured from next year onwards.

    • paddymaxson says:

      The listed system requirements are incorrect: they state the recommended CPU is the 3.5GHz Intel Core i7-3770 but call it 8-core, when the chip only has 4 cores with 8 threads. You’d likely be more than OK with an i5, especially an overclocked one.

  3. Artist says:

    A news article about the system requirements of a computer game? Really?

    • Syra says:

      On a website about computer games? Oh my.

      • Henke says:

        Yeah really, this is the final straw. I come to this site for 3 things, and 3 things only: rocks, papers, and shotguns. I’ve been putting up with all this videogame talk hoping that they’d eventually get to the good stuff, but now they’re bringing up system requirements as well? That is it. I’m leaving.

      • roryok says:

        On a website about computer games? Oh my.

        A website on the internet?

    • LennyLeonardo says:

      Really. Really really.

      Really.

    • 12inchPlasticToy says:

      I know, right? And next thing you know, they’ll be talking about gameplay and graphics, of all things.
      Youths these days…

    • Runty McTall says:

      I have no objection to a news post about system specs at all, unlike the OP here, I guess.

      However, I am curious – how much do you guys really scrutinise them? My PC must be coming up on 6 years old, and in that time I’ve only upgraded the graphics card once (a couple of years ago), added more RAM and put in a nice SSD. Basically it was a reasonably high spec machine (not top end, absurd prices) 5 years ago with a minimal refresh since.

      I never look at specs at all and I’ve never had a game run badly (putting aside Rage’s issues with AMD’s OpenGL drivers…) or look ugly. Maybe I miss the extra graphical bells and whistles and one day I’ll have a revelation when I do get around to building a new PC or whatever but I really can’t remember the last time I worried about system specs (beyond things like Just Cause 2 being DX11 locked and therefore not available on Windows XP).

      • SuicideKing says:

        Well, I guess when the frame rates fall below 30, you know the time for changing the CPU or GPU is near.

        I get annoyed by tearing as well, so for me 60 fps is a requirement… Not that I’ve changed my CPU since I got it in 2009, but I’ve changed everything else since, and added SSDs. My GTX 560 isn’t ideal for 1080p, so I’m considering the Haswell Refresh this year, followed by Maxwell proper next year.

      • Low Life says:

        My life hasn’t been the same since the Shader Model Wars of 2007. I go through every line, every word, every character of system requirement listings now. I don’t want to experience that ever again.

        • Darth Gangrel says:

          I haven’t ever experienced a game not running well on my computer, except for a demo of Gothic 3. Man, that felt like a badly made stop-motion movie with all the lag. Luckily it was just a demo, and that was back when demos weren’t so rare.

          I currently have a dual-core laptop and it’s all I need, since I basically don’t play many/any games younger than 5-7 years. I’m glad whenever I see a new game where my computer at least meets the min specs, but if it doesn’t, it’s no big deal. It’s not like I’m gonna be playing that game anytime soon, either way.

      • Surlywombat says:

        It may be an age thing. In the olden days we used to have to look. Sometimes there were even two versions of the same game depending on which sound card you had.

        I also bought a 3D card for Force Commander (LUCASARTS, I STILL WANT MY MONEY BACK!)

  4. Niko says:

    Is it worth upgrading though? At least Dark Souls 2 has decent requirements, so guess I’m fine for now.

    • TacticalNuclearPenguin says:

      It’s getting harder to imagine how this game will actually turn out.

      I’d wager it’s going to be a disappointment for many people, but it should at least turn out to be a decent entry with some serious production value. Then again, the latter can’t guarantee a fun game.

  5. TacticalNuclearPenguin says:

    Not too surprised. I too noticed that something was probably toned down between the first demos and the latest trailers, but then again it’s still pretty impressive.

    I mean, Doom 3 was pretty incredible back then, but it’s also true that it was set in corridors and that it still wanted some decent hardware.

  6. RedViv says:

    Guess we still have until July to shop for components then!

    (I do wonder what creates such a wide gap between the recommended specs of many games and my experience of playing them just fine on what should theoretically not allow me to have any fun at all. Is it just that I don’t feel the need to fill my entire visual space with game, and still have the same small display I’ve had for… soon seven years?)

    • TacticalNuclearPenguin says:

      It’s because you like the “minimum”. The current definition of “recommended” is something along the lines of 40+ fps at medium/high settings.

      For this game to be “maxed” at a stable 60fps you’re still looking at SLI setups, or at least something like a single 780 or better; that’s why some people don’t ever bother to actually read such specifications.

      People’s expectations and requests can vary wildly, and that’s the problem with “recommended” and “minimum” specs, as there is no fixed standard that gives them a meaning.

      • RedViv says:

        Good points. Nobody says what the recommended specs would even achieve, at which resolution and settings.
        I guess maxed options at a common 1920×1080 would be a fairly reasonable assumption, but I’ve seen games go both under and over that with their recommended specs.
        I guess my “just fine” lies in performance above 40fps at max settings – minus DoF and blurs, because that’s a lot of performance lost for something I hate the look of. That might figure into it.

  7. Revolving Ocelot says:

    I’m quite impressed that Watch_Dogs has so drastically turned popular opinion around from its initial reception and unimpressed everyone, before it’s even released. Not even Mass Effect 3 managed to achieve that.

    • Laurentius says:

      And one reason for it would be AC: Black Flag, proving that Ubisoft, even with good design and good ideas, will never stop dragging its games down into mediocrity; they’re often masters of underachieving. It’s quite hard to keep up with the Ubisoft hype about Watch Dogs, seeing how the AC series has drifted from ambitious to typical.

      • Universal Quitter says:

        I’d like to think that Ubi will take note of the number of users that own Black Flag, but no other Assassin’s Creed game released before or since, and just fund the open-world pirate game that we really want. It’s not like they wouldn’t make a gajillion dollars off of it.

        Or, probably even more likely, and probably with a better result, someone ELSE will make said pirate game.

    • fish99 says:

      Honestly, a big part of what generated the hype was the visuals – the detail and the density – and in the latest trailers it looks like they’ve had to cut the amount of detail back significantly and reduce the number of cars and civilians visible.

    • jorygriffis says:

      I wasn’t too interested in the initial reveal stuff–the game revels in fantasies I’m not particularly interested in–but you’re absolutely right. For me, the big turn-off was the story trailer, which was just laughable.

  8. Sleeping_Wolf says:

    Well, considering it is an Ubisoft game, I figure the reqs are less about graphical fidelity and more about poor optimisation (looking at you, Ass Creed). I do hope to be proved wrong though.

    • wererogue says:

      You should have seen the minspec *before* the delay.

    • Deadly Sinner says:

      AC4, at least, was optimised well. It’s certainly the best-looking game my PC has output while still keeping good frame rates, especially considering that it’s open world.

  9. Crimsoneer says:

    I’ve kind of accepted I’m going to need a whole new PC at some point in the next year or so – the case I’ve been using for the last ten years needs replacing, for one. Still, I suspect it will run this just fine on medium.

    • PopeRatzo says:

      I wouldn’t be too hasty. I’m not sure there will be any AAA games worth playing any time in the next year. Right now it’s April, and there hasn’t been a decent big-budget game since Thanksgiving.

  10. Shadrach says:

    I guess I just about manage the min specs with my 2008 Q9550 and newer GTX 580… but just barely.

    Ass Creed IV was horribly laggy for me, so I just could not play it; that’ll have to wait, I guess.

    Watch Dogs does not look as complex and doesn’t seem to have the same wide open spaces though, so maybe it’ll chug along OK.

    Don’t think I’m getting it anyway since it’ll probably require that horrible Uplay to be installed… :(

    • fish99 says:

      I had a Q9550 before upgrading to an i5-760 and then an i5-3570K. I can tell you the difference is much greater than you’d guess from just looking at the headline GHz speed. The newer chips do a lot more work per cycle thanks to much higher transistor counts and better architecture. DDR3 helps too. I picked up a load of FPS with those upgrades without changing video card, even in games which weren’t CPU limited.
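
      To put a rough number on the “work per cycle” point: per-core speed is roughly clock × IPC, so headline GHz alone undersells the jump. A minimal sketch in Python (the relative IPC figures below are rough assumptions for illustration, not measurements):

      ```python
      # Back-of-the-envelope per-core speed: clock x relative IPC.
      # The IPC factors are rough guesses for illustration, not benchmarks.
      chips = {
          "Core 2 Quad Q9550": (2.83, 1.0),  # (clock in GHz, assumed relative IPC)
          "Core i5-760":       (2.80, 1.3),  # assumed
          "Core i5-3570K":     (3.40, 1.6),  # assumed
      }

      base_clock, base_ipc = chips["Core 2 Quad Q9550"]
      for name, (clock, ipc) in chips.items():
          speedup = (clock * ipc) / (base_clock * base_ipc)
          print(f"{name}: ~{speedup:.1f}x the Q9550, per core")
      ```

      By clock alone the 3570K looks only ~1.2x faster than the Q9550; with an IPC guess folded in it’s closer to ~1.9x, which matches the “picked up a load of FPS” experience better.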

      Of course I understand it’s a big expense to replace mobo+ram+cpu.

  11. fredc says:

    Should we maybe wait and see if the game is shit before even considering PC upgrades? This was a rhetorical question.

  12. Harlander says:

    Hmm, it looks like it’s the CPU which is keeping my PC from the recommended specs.

    Shame, I kinda hoped for an excuse to replace my thunderously loud graphics card…

    • LionsPhil says:

      Getting an ASUS with its gamer-gamer-gamer DirectCU cooling gubbins turned out to be one of my best failure-induced upgrades, because it’s so throttled by my CPU that it keeps the fans nice and slow.

  13. Rao Dao Zao says:

    I wonder what screen resolution these specs are built for? I’m still rocking 1280×1024; maybe they’d not be so ridiculous for such a tiny wee monitor…

  14. Volcanu says:

    Hmmm. I am currently mulling a completely new PC build (my current machine is horribly ancient) and had been thinking of going for an i5 4670K overclocked to 4.4GHz with an overclocked Palit Jetstream GTX 780.

    My question for any RPS tech-sages is: would I be better off stretching to a GTX 780 Ti, or to an overclocked i7 4770K? I had assumed it was more or less a no-brainer to go for the better card, as received wisdom appears to be that games don’t currently use anything more than quad cores and that an i5 often performs just as well.

    Is that likely to change over the next few years, meaning i7s will start to perform better? These system requirements sort of imply that the extra cores will be important.

    And I appreciate that this is like predicting the weather, but does anyone (better informed than I) think we’re actually likely to see 20nm graphics cards this year?

    • SuicideKing says:

      Oooh. Tricky stuff.

      At 1080p, and for 60 fps, I’d say get an i7 + 780. Above 1080p, for ~60 fps, i5 + 780 Ti.

      Honestly, this is a very tricky question at the moment. DX12 and 20nm GPUs will hit next year (I’ll come back to the 20nm part of the story), and Maxwell should provide a large boost in performance over Kepler, simply because they’ll be able to cram a lot more transistors onto the chip (because of 20nm and Maxwell’s reorganisation).

      DX12 will cut per-thread overhead on the CPU, which would make the difference on the CPU side smaller and put more load on the GPU. However, that would also shift the focus to more threads over speed, so if you have both (as you do on an i7), the hardware should see a longer life span.

      That way, you can sell the older card and change to a newer one, and it’ll be much cheaper (and easier) than having to change the entire platform, though you could simply swap the CPU too.

      If you use your system for other productivity tasks as well, then obviously, an i7 is a good choice in that light as well.

      At higher than 1080p resolutions, or for 120 Hz (and thus FPS) gaming, you’ll be better served by a 780 Ti.

      Now, it’s also worth considering that Intel’s announced OC-friendly Haswells (and the 9-series chipset) for the June-August time frame, so you may want to wait for them, though they’re unproven at this point.

      The other thing I really think you should wait for is benchmarks of this game, to get a sense of how things will be moving forward. Look at Tom’s Hardware, The Tech Report, AnandTech, and others for this.

      Finally, about 20nm: it’s unlikely anything’s coming out before the very end of this year. TSMC’s node seems to be delayed so far. Evidence for this has been AMD’s complete silence on new GPUs, Nvidia changing both its Tegra and GeForce roadmaps, and Intel getting lazy on 14nm.

      Note that an i5 + 780 will be sufficient, so choose the 780 Ti for minimums of 60 fps (at 1080p) for at least the next year or two, and the i7 for a solid platform for around 5 years (during which you’ll perhaps have to change the GPU as soon as next year, depending on how things go and what you want from the system). As I wrote before, wait for Watch Dogs benchmarks.

      • Volcanu says:

        Thanks for taking the time to write such a comprehensive and helpful response. Much appreciated.

        It sounds like I should let the rational half of my brain win out and go for the ‘wait and see’ approach – at least until later this year. It does seem like there are enough new tech developments coming down the tracks over the next 12-18 months that it’d be wise to see how things are shaping up before making such a big investment.

        Of course the side of my brain that says “ooh shiny new toys now please” will need to be shouted down or otherwise distracted!

        Cheers again…

        • SuicideKing says:

          Any time, friend!

          I know, I’ve been in that same dilemma (to buy or not to buy) for some time myself. The problem with keeping track of PC hardware roadmaps is that you constantly trick yourself into waiting for the next shiny toy. :(

          But I think I’m going to do a similar thing: get that Devil’s Canyon goodness this year and grab a Maxwell card next year… my GTX 560 should stretch till then. Senior year in engineering college approaches, so there’s less time to game, and I have to save up for a laptop too… I think the decision is easiest when you don’t have money, lol.

          • Rizlar says:

            Yeah, cheers for the sweet deets. Despite my disinterest in Watch Dogs, I was browsing the comments hoping to find out if I should upgrade to a 2GB+ GPU now or wait. Guess I will wait a year…

  15. fish99 says:

    Those requirements are badly written. It only needs an 8-core if you’re going AMD. The Intel i7-3770 is a quad, so something like an i5-3570 should also be fine (hyperthreading makes little difference to games).

    I’d also like to point out that Watch Dogs is coming to PS3/360.

    • TheManko says:

      Watch Dogs on PS3/360 seems like it’ll be a completely different game, kind of like how sports games such as FIFA do it. So instead of the PC getting the last-gen version, we’re getting the next-gen one, where presumably the world simulation and game design are optimised for PS4/Xbox One. This is why the minimum requirements are so high.

      As for the 8-core thing, I guess the game uses 8 threads, so they dumbed the specs down to say “8 core”, referring to the i7 and the 8350. But they probably just mean something capable of running more than 4 threads, whether via real physical cores or hyperthreading.

      • fish99 says:

        The recommended requirement says 8 cores, not 8 threads, and then lists a 4-core CPU. They are not the same thing. Also, your OS is perfectly capable of assigning multiple threads to a core, which is why an i5 performs very close to the equivalent i7 in just about everything (remember, i7s have more cache).

        Let me put it this way – if you were using those requirements to decide on a new CPU and went to the Intel website, you’d see i7s listed as 4-core and think they weren’t good enough. I would also bet a good i5 runs the game absolutely fine.
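
        If you want to see the cores-versus-threads distinction on your own machine, here’s a minimal sketch (assuming Python 3 with the third-party psutil package installed):

        ```python
        # Physical cores vs logical processors (threads) on this machine.
        import psutil

        physical = psutil.cpu_count(logical=False)  # actual cores
        logical = psutil.cpu_count(logical=True)    # cores incl. hyperthreading

        print(f"{physical} physical cores, {logical} logical processors")
        # An i7-3770 reports 4 physical cores and 8 logical processors, so a
        # spec sheet demanding "8 cores" while listing that chip really means
        # 8 threads.
        ```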

  16. PopeRatzo says:

    Watch Dogs is due to launch on May 27, barring the traditional Ubisoft last-minute surprise PC delay.

    How you say that with a straight face, I will never understand.

  17. Themadcow says:

    …and this kind of thing is why I was hoping for Steamboxes to be a standard spec for each generation of console hardware, with upgrades optional. A standard ‘current gen’ spec Steambox could mean that rather than ‘recommended specs’, developers create an optimal Steambox configuration that takes all the second-guessing out of PC audio/visual settings on games like this. Obviously it would still be up to the individual to tweak settings if they want to – but they could rest assured they’d be getting a PS4-equivalent experience at a minimum.

    But no. Instead we (and PC port developers) get a stupid array of different manufacturers and specifications, and the extra added fun of a different OS to think about.

  18. MadMinstrel says:

    So you posted an article on system specs, and then failed to list them? Good job.

  19. c-Row says:

    Am I the only one around here who doesn’t even remember his system’s specs by now and just thinks in categories like “runs Skyrim fine in 4k unless ENBs are used”? I am seriously out of touch with hardware stuff these days… I don’t even know the model of my gfx card.

    • Darth Gangrel says:

      I know my system specs, but what I have trouble with is comparing them to a computer described in system requirements that don’t feature numbers. Like, how the hell am I supposed to know how powerful an AMD Phenom II X4 940 is compared to my dual-core laptop? At least the Intel CPU has some numbers attached to it, so I can compare GHzs and cores. It’s like grading something threeel skoots out of blerrrn grrrshh and comparing that to fesshh shtonck out of woaarrg bisf. That incomprehensible rating system says as much to me as the performance of an AMD Phenom II X4 940 compared to another computer.

  20. SuicideKing says:

    Well, my system’s become minimum spec, now (Q8400 + GTX560 + 8GB RAM).

    • trjp says:

      Amazing the life those Q chips have led, isn’t it?

      They’ve lasted through at least 3 or 4 newer generations of chips just by having 4 cores.

      OK – you’ve probably spent their cost 10 times over in electricity, but you’ve not needed to heat your home, eh? :)

      • SuicideKing says:

        Yeah, that’s true. Those who got lucky overclocking their Q-series chips must be even happier. I think Tom’s Hardware did a story once where they overclocked a Q9000-series chip to 3.5GHz or more, and it came pretty close to today’s stuff.

        Power consumption isn’t that bad… I did some rough maths: for an average of 60W over an average run time of 11 hours a day for the last year (I used my SSD’s total power-on count for this), it comes out to about 250 kWh for the year, just for the CPU… that’s about the same cost as the CPU, by the rates here.
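
        For anyone checking the maths, a minimal sketch of that sum (same assumed inputs as above):

        ```python
        # Rough yearly electricity use for the CPU alone, using the
        # estimates above: ~60W average draw, ~11 hours a day, for a year.
        avg_power_w = 60
        hours_per_day = 11
        days_per_year = 365

        kwh = avg_power_w * hours_per_day * days_per_year / 1000
        print(f"~{kwh:.0f} kWh per year")  # ~241 kWh, i.e. roughly 250
        ```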

  21. noodlecake says:

    Nooooo!!! I spent my student overdraft on my PC two and a half years ago. I don’t have any more student overdrafts to spend on upgrading it! PC games have to stay at the same graphical fidelity they were at two and a half years ago forever! These are dark days indeed.

  22. PoulWrist says:

    And yet another year will tick by as my system, despite being cheap, continues to chew through recommended settings.

  23. ChaoticPesme says:

    Well, with my budget, I think I’ll stick to the PS3 or 360 version ‘^^

  24. trjp says:

    You realise that sysreqs are just guesswork and marketing, yeah?

    I think some people imagine that developers test their game on loads of PCs and carefully determine what’s required – no-one has EVER done that! :)

    You make up something which represents the sort of PC you THINK will run it – obviously there are some actual needs such as Shader Model support etc., but for the most part you simply invent something which is

    a – likely to actually work, so you avoid loads of tech support (brush off plebs as ‘below requirements’)
    b – not likely to deter too many customers

    I believe Ubi contract out their PC ports to faraway lands, and the job they do is often best described as ‘functional’ – the odds are the sysreqs are created by Ubi’s marketing people here though, to ensure a balance of sales against ‘not too many annoying customers moaning about problems’.

    As for the reqs themselves – 6GB of RAM is a LOT and 8GB is silly; those are just made-up numbers. Quad-core isn’t an unreasonable request these days (though it might piss off people who bought the nice newer i3s, I suppose) – that said, there are ‘dual core’ i3s which would run rings around their Phenom requirement. The GPU requirement is basically ‘something which isn’t old and shit’ too – no biggie, surely?

  25. roryok says:

    My games machine is a Core 2 Quad with 4GB of RAM and an HD 6770. So it’ll just about run this. So I probably won’t buy it. Hell, I still haven’t even opened my copy of Far Cry 3, for fuck’s sake.

  26. Lemming says:

    I’m ignoring these specs. I’m pretty sure they come up with them just to keep the console execs happy.

  27. Perkelnik says:

    I’m not upgrading until Witcher 3 comes out.

  28. Deadend says:

    This is why I bought a PS4. I mean this game, but also this generation hop. My PC is 6 years old (CPU, mobo), and the launch of the 360 was when requirements shot up before stabilizing. I figured it would be cheaper to buy a console now, to get me through the marquee releases for a few years (my PC still handles everything great to fine), until it’s safe to sink $800 on a desktop, or maybe even… A GAMING LAPTOP (that was said with disgust).

  29. Sheogorath says:

    I refuse to buy Ubisoft games on Steam until they stop making me go through Uplay.

    So I probably won’t be getting this.

  30. bill says:

    Sigh. I don’t want to (and can’t afford to) go back to the days of upgrading PCs. My 2-core laptop is hopefully gonna last another 4 years. Guess I’m on indie games and clearing the huge backlog for most of that time, though.

    It was so nice to be able to assume that any game would just work on my PC for a few years back there…

  31. goettel says:

    Considering how often and how much we had to spend on our rigs in the olden days, I find the recommended spec on this non-daunting in the extreme. Just like zeds, the threat from consoles doesn’t come from strength, just numbers.

