An Exciting Post About AssCreed Unity’s PC Performance

Now that I’ve a) got code and b) got said code to run after a ton of tinkering, I’ve spent a couple of hours with gaming’s latest whipping boy, Assassin’s Creed: Unity. Given yesterday’s web brouhaha about its shonky performance on console, it seems worth sharing my technical experiences with it on PC too.

Below are a bunch of numbers, if you like that sort of thing. TLDR: it’s not disastrous, but something sure ain’t right.

Clearly, this is just how it runs on my PC. Your mileage may very well vary hugely. In particular, I haven't been able to try this on an NVIDIA card yet. Here's what I got, though.

The major observation is that I cannot get this sucker to consistently run faster than just over 40 frames per second, whatever I do. That peak is at lowest graphical quality and lowest resolution (though it will only go down to 1280×720 on my screen), but even then it spends most of the time in the 30s. It just won’t run at 60 fps no matter what I try. There’s no sign of a framerate cap; this is simply a performance issue, so far as I can tell.

What's odd is that the performance changes relatively little if I pump the resolution up to 1920×1080 (or even 2560×1440) and stick the settings on High or Very High. Generally, it's stuck around between 30 and 40. (Yes, I do have vsync turned off, the refresh rate set to 60 and the very latest graphics drivers installed).

Ultra High, which I wouldn’t have expected to be smooth on my system anyway, flickers around the late 20s at 1080p. Doesn’t look a huge amount different to Very High anyway, so no great loss. Whatever the settings, the frame rate’s all over the place – huge jumps and drops depending on what I’m looking at or what action’s going on. Clearly that’s the case in any game, but it seems especially and distractingly pronounced here.

I do prefer to play at 60fps rather than 30 where possible, but if I'm honest that's more about principle than noticing the difference in practice, at least in a game like this. So I can deal with 30fps/1080p well enough. Trouble is, it's an entirely different matter in cutscenes. These feature super-detailed versions of the character models, and slump down to about 17fps, which is far too treacly to deal with. Not that I particularly want to watch the cutscenes going on what I've seen so far, but I'll get back to that once it's review o'clock.

My concern, essentially, is that Unity remains a 30fps game even when everything’s set to rock bottom, and that the frame rate spikes so wildly whatever my settings are.

Above: ACU at lowest settings. The image at the top of this post shows ACU at Very High settings. Click through for bigger, uncropped versions.

While my PC isn't the greatest, it's pretty well-equipped – Core i7 980x [edit – this is a hexacore CPU which I've overclocked to 4GHz; while it is not maxed out during play, according to Speedfan, it's possible that its age means it lacks some tech ACU wants to use. Again though, the worst performance has been in cutscenes], Radeon 290X, 8GB RAM – so I strongly suspect this is down to the game rather than an inadequacy on my part. This would also reflect the major performance issues being reported in console land; apparently the game regularly drops below 30 frames on PS4 and Xbone.

The good news is that, on my system, I can have it sit at a 30fps minimum (bar cutscenes) without too much compromise. I suspect this is more to do with my relatively beefy graphics card than it is the PC version being in better shape than the console versions.

This, I suppose, makes the PC version the best-running version of the game, provided you have the hardware. And fifty bloody quid. And 42 gigabytes of disk space/bandwidth. And don’t mind not having it on Steam.

Hooray? No, not really.

As for the game itself, clearly it’s too soon to make even the slightest judgement, but what I’ve experienced so far has been very traditional Creed with no surprises other than a little more building interior stuff and a somewhat, but not dramatically, prettier world. Oh, and also lead character Arno has very nice eyes.

I’m not expecting the freshness of Black Flag, but I would like to see more invention in the missions later on – so far it feels excessively familiar.

Anyway, gizza couple of days and I’ll let you know more.

79 Comments

  1. DarkLiberator says:

    If the framerate isn't changing much, it might be the CPU bottlenecking the framerate? I'd assume this is the case because of the thousands of useless NPCs walking about sucking up CPU resources.

    I do notice a lot of settings don’t really apply till you restart the game completely, which is rather annoying.

    The architecture in the game looks lovely, shame the performance is just terrible. I could barely do one mission in a church that required you hopping about within a time limit because the framerate hovered around 6-7 FPS in that building.

    I think it needs an NPC density slider, it might help out a lot of people.

    • TacticalNuclearPenguin says:

      I guess I can be thankful that games are finally nodding to overclockers with extra reasons to actually use the added clock cycles, but yeah, there should be something to tweak for those people with a stock clock and heatsink ( they'll never learn ).

      Perhaps it would be useful to test that 980x with some added juice, even though it doesn't really have the IPC of the newer stuff; if an overclock can raise the bar considerably we might have an answer.

      PCG claims far better numbers with an i5 4670k and "just" a 970; unless they're lying, it's pretty clear what the bottleneck is, especially considering that Alec's performance doesn't change with graphical settings, suggesting the GPU is spot on.

      • antoineflemming says:

        From what I've noticed, it's the lighting (and environmental?) effects, the rendering of all of the buildings, and the large crowds that are negatively impacting performance, particularly the CPU performance. I've noticed that, when using eagle vision, the game tends to run a bit better. From what I can tell, the lighting effects are reduced when using eagle vision. As for the cutscenes, the Nvidia hair technology (and maybe the cloth too, but not sure) is negatively affecting GPU performance for lower end users. COD Ghosts had similar fur technology for the Riley dog character, but they had a setting to turn that feature off. We need similar settings, especially since they really are aesthetics and not required for gameplay. Oh, and it's not primarily an AMD issue, because it's not primarily a GPU issue. It's a CPU issue.

        Again, settings to reduce the quality of the building meshes in the distance, reduce the number of people in crowds (by removing NPCs from the exterior and having much smaller crowds), and reduce the lighting effects and maybe even shadows would help boost performance for those who meet only the minimum requirements.

    • FriendlyFire says:

      That sounds probable, but there’s one detail that doesn’t fit: why would the FPS tank when in a cutscene if the crowds were the issue? Depending on how they build their cutscenes, it’d either require the same CPU performance if they keep the crowds, or an awful lot less if they suspend all crowd sim to focus on the cutscene. Yet the game drops well below normal gameplay, and I doubt the higher quality models can account for such a massive drop.

      • zaphod42 says:

        Could be anything. Disk fragmentation, poor hard drive performance, some bug in some driver somewhere, some problem with decoding, who knows. It's really hard to say without seeing Meer's machine.

        But without testing on multiple configurations I wouldn’t leap to blaming the game software.

      • amateurviking says:

        Also given Alec’s pretty hardcore hexcore processor it doesn’t make much sense. The i7 980x is old but it’s still a bit of a beast. Especially compared to the APU in both consoles.

        • Alec Meer says:

          I’ve also got it overclocked to 4ghz, which I should have mentioned. The game demonstrates moderate CPU use on all cores (according to Speedfan), but nothing like maxing them out.

          I mean, it's a beefy CPU even before overclocking, but its age means it does lack some tech that perhaps ACU wants to take advantage of.

          • DanMan says:

            Unlikely. It’s not THAT old, and the game probably wouldn’t even start if something was amiss.

          • SuicideKing says:

            Alec, your CPU is far more advanced than the ones in the consoles. If they can manage 20-30 fps, yours should manage quite a bit more.

            Anyway, indications across the internet seem to suggest that it's a poorly done game with really bad optimisation.

  2. jezcentral says:

    “And don’t mind not having it on Steam.”

    Wot, no Steamworks?

    • kevmscotland says:

      Can't even buy it on Steam at the moment. And no Ubi game has ever used Steamworks; they all come with Uplay, even if you launch through Steam first.

      • Greggh says:

        That “no-Steam” shenanigan seems to be going on only in the UK (possibly some few other places).

        • Hmm-Hmm. says:

          Just checked, and sure enough it shows up on Steam for me. Huh, and here I thought they’d pulled them from Steam everywhere.

          • Sp4rkR4t says:

            It was, briefly, then Ubi changed their mind again and decided they only wanted to fuck over the UK.

          • jonahcutter says:

            @ Sp4rkR4t

            From the sounds of the performance issues, and the sameness of the bloated gameplay systems being reported in reviews, perhaps Ubi has favored the UK.

          • SuicideKing says:

            They had, for a few hours.

      • ScottTFrazer says:

        What's weirder, though: if you buy it through Steam, then shut Steam down and launch Uplay and launch it through there, it fires up Steam again.

  3. Penguin_Factory says:

    I can't even run this game since my PC is below the minimum specs (but just at the recommended for Dragon Age Inquisition and slightly below recommended for Far Cry 4, which strikes me as completely ridiculous). Although, given what some of the reviews are saying, I'm not sure I'm too upset about that any more – I've never liked the franchise before, so a six or a seven to most people would be about a three or four for me. I had hoped this game was going to fix a lot of the asscreed problems, but apparently that didn't happen.

    What I want to know is why didn’t Ubisoft just drop the visual quality of the game to get it running smoothly? Surely they must have realized prior to release that this was going to be a problem?

    • iainl says:

      According to the devs themselves, and backed up by the findings of both RPS and Digital Foundry, the problem seems to be the game being primarily CPU-bound rather than GPU. Changing graphics options doesn't really reduce the CPU load that much, because they've blown the budget on masses and masses of crowd AI.

      • SuicideKing says:

        Hitman Absolution, Sleeping Dogs, etc. did crowds pretty well. :/

    • KenTWOu says:

      "What I want to know is why didn't Ubisoft just drop the visual quality of the game to get it running smoothly?"

      They dropped the visual quality of Watch Dogs and that didn’t help either. Because it’s a damned if you do, damned if you don’t situation.

    • Hmm-Hmm. says:

      I’ll hazard a guess and say it’s because it’s a poorly optimised game. And poor QA.

      • zaphod42 says:

        Do you work on software?

        Making a game that requires a CPU from the last 4 years isn't necessarily bad design. It's just that most gamers have managed to get by on ancient CPUs since consoles are holding back the game industry right now, so when a game comes out that requires a decent CPU everybody goes "What the hell is wrong with this stupid game?"

        • Hmm-Hmm. says:

          When people with beastly PCs can't get it to run on the highest settings without performance issues, I don't think it's just that.

        • P.Funk says:

          Right, so all those people with consoles that have frame rate issues… shoulda bought a newer… err… PC?

          So basically 2/3 of the market for this game are using the wrong platform and of the other 1/3 at least half are probably in the same boat.

          Sure, makes sense. This is how one sells games.

    • TacticalNuclearPenguin says:

      Far Cry 4 is a very minor ( if any ) step forward technically compared to 3, and both that and Inquisition share the same conservative approach, especially when it comes to textures, so I wouldn't be too surprised about the different requirements.

      Unity is where they go all out, but then again the PC version is handled by the same Ubi division that was never exactly famous for masterful optimization, let alone the fact that all ACs have been locked to 16:9 only ( no clue about Unity ).

    • Stepout says:

      Holy smokes! I didn’t realize the minimum requirements for this game were a GTX 680 or an HD 7970. Apparently the 7950 I bought earlier this year is obsolete :)

  4. melnificent says:

    In console land it's the slightly faster Xbox One CPU that has the advantage over the significantly more powerful PS4 GPU. This supports the leading theory that Unity is heavily CPU bound, which makes sense considering the number of NPCs with independent AI routines at any one time.

    Ubisoft really should get to grips with GPGPU to alleviate the bottleneck.

    • woodsey says:

      That would make sense, the AC games have a tendency towards putting more on the CPU than they need to. AC3 had a massive problem with it and Black Flag as well, to a lesser extent.

    • zaphod42 says:

      Unfortunately it seems Meer has never even heard the concept of “cpu bound” before. xD

    • mukuste says:

      It's notoriously difficult to put AI on the GPU (at least on the most common current cards without the latest compute features). Too many conditional branches, not enough data parallelism.

      Also, the overhead of moving stuff to the VRAM and back will dramatically outweigh whatever actual computation you do there.
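
      A toy Python/NumPy sketch of the first half of that point (purely illustrative, with invented numbers): the per-NPC branches in update_branchy diverge from agent to agent, which is exactly what lockstep GPU execution handles badly, while update_uniform does identical whole-array maths for every agent and would parallelise far better.

          import numpy as np

          N = 5000                                  # invented crowd size
          pos = np.random.rand(N, 2) * 100.0        # NPC positions
          state = np.random.randint(0, 2, size=N)   # 0 = idle, 1 = fleeing
          player = np.array([50.0, 50.0])

          def update_branchy(dt):
              # One NPC at a time, with data-dependent branching: neighbouring
              # "threads" would take different paths, so a GPU stalls on it.
              for i in range(N):
                  if state[i] == 1:
                      away = pos[i] - player
                      pos[i] += 3.0 * dt * away / (np.linalg.norm(away) + 1e-6)
                  # idle NPCs do nothing this frame

          def update_uniform(dt):
              # The same update as branch-free, whole-array maths with a mask:
              # every element does identical work, a much better GPU/SIMD fit.
              away = pos - player
              norms = np.linalg.norm(away, axis=1, keepdims=True) + 1e-6
              pos += (3.0 * dt * away / norms) * (state == 1)[:, None]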

  5. Wyrm says:

    Never pay full price for Ubi games.. I just started AC3 which i purchased for about £7 – at that price I am willing to put up with Uplay etc.. but only just.

  6. Horg says:

    ”This, I suppose, makes the PC version the best-running version of the game, provided you have the hardware.”

    If you want to make good performance reviews (btw, do that more often RPS, PC gamers love that stuff) you should hang on to your old hardware and run tests on a variety of system specs. I strongly suspect that your i7 and 290X are not being pushed anywhere near their limit with Unity, and you would get comparable performance on a system with previous gen hardware. From what you have described, Unity sounds like an unoptimised, CPU bound mess. It was probably never intended to run over 30 FPS when development began, you know, with Ubisoft's insistence that it was ''more cinematic'', so I can't imagine optimisation was that high on their priority list.

    • TacticalNuclearPenguin says:

      If the game is so heavy on the CPU, that particular i7 is no longer a player, especially at stock clocks. We might see far better results with 4+ GHz on a newer i5/i7.

      I’m pretty certain that way you can have that 290x used pretty heavily, and even if it isn’t you can still bump the resolution even further.

      I agree with you on budget though: if it's impossible to have 2 different high end systems, it's better to have one slightly above average and a medium one than just a single machine to test with, but then again I don't think RPS is willing to specialise that much in that side of PC gaming.

  7. Janichsan says:

    My concern, essentially, is that Unity remains a 30fps game even when everything’s set to rock bottom

    You should be happy, since 30 fps is so much more cinematic than 60 fps – at least according to Ubisoft.

    • mukuste says:

      Which is, of course, utter bs since games don’t have the natural motion blur that fast movements captured on film have.

      • TacticalNuclearPenguin says:

        But more importantly, games don’t have a set number of frames always perfectly spaced at the same time interval between one another, which is another thing that helps movies since the eye can’t detect inconsistencies.

        Games instead are full of inconsistencies, especially if the hardware is struggling: you might have 3 frames very close to one another and then the next one might come with some hefty delay, and that's something your eyes will always be able to recognise even at higher framerates.

        TL;DR: "average" framerate in games doesn't tell even half the story, and it can't tell you if there is random stuttering and spikes.
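
        To put rough numbers on that, here's a tiny Python sketch (frame times are invented, purely illustrative) showing how a capture with a higher average fps can still feel far worse once you look at the worst frames:

            # Two made-up one-second captures, frame times in milliseconds.
            smooth = [16.7] * 60                          # steady ~60fps
            spiky = [10.0] * 57 + [100.0, 120.0, 140.0]   # fast frames plus three big hitches

            def report(name, frame_times_ms):
                avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
                worst_ms = max(frame_times_ms)
                # crude "1% low": the fps implied by the slowest 1% of frames
                n = max(1, len(frame_times_ms) // 100)
                slowest = sorted(frame_times_ms)[-n:]
                low_fps = 1000.0 / (sum(slowest) / len(slowest))
                print(f"{name}: avg {avg_fps:.0f}fps, worst frame {worst_ms:.0f}ms, 1% low ~{low_fps:.0f}fps")

            report("smooth", smooth)   # avg ~60fps, worst 17ms
            report("spiky", spiky)     # avg ~65fps, worst 140ms - this is the one that stutters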

    • chargen says:

      Well those lucky console players are receiving a perfectly cinematic 24 FPS.

  8. GAmbrose says:

    I knew this would be CPU bound, the engine is a POS really. It hasn’t even been optimised for multi core CPU’s

    How did I know? Well it hasn’t really changed much since Assassins Creed III, and to get that running acceptably I had to overclock my Haswell i7 4770k from 3.5Ghz to 4.5Ghz.

    Use something like MSI Afterburner (with RivaTuner) and in ACIII you can see it uses about 30% of the GPU power and 100% of ONE CORE and hardly any of the other cores.

    Black Flag runs a bit better, but not much.

    Can someone with Unity monitor CPU usage with the game and see what happens?
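
    For anyone who'd rather log it than eyeball Task Manager, a small Python script along these lines (it needs the third-party psutil package; run it in the background while playing) prints per-core usage, which is enough to spot one maxed-out core with the rest idling:

        import time
        import psutil  # pip install psutil

        # Sample per-core CPU usage every 2 seconds while the game runs.
        # One core pinned near 100% while the others idle = poor threading.
        while True:
            per_core = psutil.cpu_percent(interval=2, percpu=True)
            stamp = time.strftime("%H:%M:%S")
            print(stamp, " ".join(f"{c:5.1f}%" for c in per_core))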

    • DarkLiberator says:

      I can definitely confirm that this time Unity is using all my cores – 80% utilization on an i7 3770k for me. Nice change from Black Flag, which is a disaster CPU usage wise.

      But it's not enough for the framerate. Those thousands of NPCs are probably why the CPUs are getting proper use now.

    • Leonick says:

      AC3 and Black Flag both run at 60 fps on my PC and I only have an i5 2300. Clock is 2.8GHz on that thing, up to 3.3 in turbo I think. You must be hitting some other issue somewhere because the engine definitely isn't that bad.

  9. melnificent says:

    Can we talk about the in-game purchasing? From $10 to $100 for in-game currency; the top tier is twice the price of the game itself.

  10. scudly says:

    I think also that some of it has to do with what GPU you're using. I'm running an i5 2500k OC'd to about 4.2GHz along with a GTX 780 and I can keep it at around 60fps, with the low points being 45fps, with most settings on Ultra save for Ambient Occlusion and AA.

    • Horg says:

      The overclocked CPU might account for the entirety of the performance difference. The 780 range and 290X range are roughly comparable GPUs, so unless performance is much worse on AMD for some unknown reason, I'd put my bet on the CPU still. The i7 980x is a 3.3GHz 6 core for the stock model, so it's very probable that a 4.2GHz 4 core would outperform it, especially if Unity isn't using multiple cores efficiently.

      • TacticalNuclearPenguin says:

        Aye, not only that but the IPC is also lower, so the difference is even stronger than just lower clocks.

    • tehfish says:

      Could it be PhysX shenanigans again? That might explain why the GPU makes a difference: Nvidia went out of their way to de-optimise PhysX as much as they could on non-Nvidia setups.

  11. Jamesworkshop says:

    and what is exciting?

  12. Halk says:

    That is EXACTLY the same problem I've been having with Black Flag and AC3. Details and resolution have very little impact on it, and the AC games are the only ones that do that on my PC; I can run everything else smoothly. They are doing something really shitty with their engine. Really frustrating. Why can't they figure it out?

    Damn that’s annoying, I thought they would finally address the issue with a “next gen” game.

  13. Repeltepeltje says:

    I would say it's well worth the money.

    link to gamescoon.net

  14. Dale Winton says:

    Overclock that CPU and you might get better performance. It was released nearly five years ago now though

  15. HisDivineOrder says:

    If changing the video settings doesn’t amount to much change in framerate, that usually means either your CPU isn’t up to the task OR the game is unoptimized. Or both.

    Since it’s Ubisoft, I expect you need a patch.

    • Geebs says:

      Well, it means the game’s not fragment bound; it might still be geometry, bandwidth, CPU or texture memory though. In particular, if you calculate something on the gpu and then need the result back in main memory, the performance penalty is huge (which is why “just do it on the GPU” is not always the answer).

      Anybody have any figures on the GPU memory load?

  16. El_Emmental says:

    I wonder if people will still ask for better AI in video games after this, seeing how much difficulty devs have optimizing it, while users often have worse CPUs than GPUs in their gaming rigs.

  17. flexm says:

    The minimum CPU specs seem to be the first decent CPUs with AVX instructions, which newer game engines kinda like to use, and which the Core i7 980x doesn't have. So that might be part of the problem? The game defaulting to some terribly unoptimized path for that kind of vector op?

  18. Neutrino says:

    "Generally, it's stuck around between 30 and 40. (Yes, I do have vsync turned off, the refresh rate set to 60 and the very latest graphics drivers installed)."

    I don’t understand how anyone can play any kind of 3D game with vsync off. Do I have bionic eyes or something, does no one else notice the horizontal stepping with no vsync?

    • jonahcutter says:

      I’m in the same boat as you. Vsync is a must. I can’t help but notice screen-tearing when playing, to the point of almost constant distraction.

    • Vandelay says:

      Agree that tearing without vsync can be horrific, but isn’t that only a problem if your frame rate exceeds the refresh rate? Alec won’t be getting that problem at those fps numbers.

  19. Laurentius says:

    The idea that this game's poor performance might be caused by a Core i7 980x being brought to its knees is too ridiculous to consider seriously.

  20. suibhne says:

    This sounds bizarre. I haven’t finished Black Flag, given its incredible tedium past the 50% or so mark (I’m around 80% now), so I won’t be queueing to buy this anytime prior to a big sale. But Black Flag is Smoov as Buttah(tm) for me, at max settings. Granted, my rig was pretty cutting-edge just six months ago, so it’s relatively high-spec…but all the reports make it sound like Unity is vastly less scaleable than Black Flag.

    I get all the argy-bargy re. crowd AI…but if that turns out to be responsible for this huge performance hit, I’d say there are fundamental problems with the design priorities in this game.

    • ffordesoon says:

      If you look up any interview with anyone working on Unity pretty much since the game was announced, you can see that they didn’t have good design priorities.

  21. zaphod42 says:

    >What's odd is that the performance changes relatively little if I pump the resolution up to 1920×1080 (or even 2560×1440) and stick the settings on High or Very High. Generally, it's stuck around between 30 and 40. (Yes, I do have vsync turned off, the refresh rate set to 60 and the very latest graphics drivers installed).

    That’s not odd at all. It means you’re CPU bound, not GPU bound. That happens often. Surprised you don’t expect it. Games like GTA often bottleneck your CPU before your GPU, and no amount of turning down GPU settings is going to increase your CPU performance.

    Meer have you guys OC’d that CPU at all?
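
    A back-of-the-envelope model of why that happens (all numbers invented, purely illustrative): each frame roughly costs max(CPU time, GPU time), so once the CPU side dominates, cheaper graphics settings stop mattering.

        def fps(cpu_ms, gpu_ms):
            # Crude model: the frame is done when the slower of the two finishes.
            return 1000.0 / max(cpu_ms, gpu_ms)

        cpu_ms = 30.0  # hypothetical per-frame cost of crowds/AI/simulation
        for gpu_ms in (28.0, 20.0, 12.0):  # say Very High -> High -> Low
            print(f"gpu {gpu_ms:.0f}ms -> {fps(cpu_ms, gpu_ms):.0f}fps")
        # All three print ~33fps: lowering the GPU load never gets past the CPU wall.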

  22. Rutok says:

    Well I watched a stream on Twitch that was running the game on ultra settings at 1440p with a constant 60 fps. The guy had 4 Titan Blacks in quad SLI and a separate streaming PC that would put even beefy gaming PCs to shame.

    So it's possible to run the game smoothly... you just need to invest a small fortune in overkill hardware.

  23. reggiep says:

    I don’t think the “thousands of NPCs” have as much effect on performance as people think. Ubi is using a number of tricks here. Anytime there are “thousands of NPCs”, they are all huddled in a group and assets are duplicated heavily. Very few of them have any agency/AI/behavior that would draw on the CPU. I’m betting that the number of NPCs with agency on screen at any time is equal to what was done in any previous AC game.

    For me the game performs fine when there are lots of NPCs. When I get into battle with as little as 3 people on screen, the game stutters severely. So there’s definitely something wonky going on.

    Shadow of Mordor ran beautifully on my system even when engaging 30 orcs at a time. Ubisoft has no excuse. This is what happens when you let 5 different studios build a game and then slap it all together so you can release it for the holiday season.

    • Vandelay says:

      I recall seeing a video with a couple of the devs talking about the crowds. There are about 100 npcs at most using the higher quality models and AI routines. The rest are low quality. It worked well to create an illusion of a crowd, although those “AI routines” looked like fairly standard canned animations and they didn’t interact well with each other.

      I'm sure other games have pulled off similar tricks without it being such a huge performance hit. I also don't see why there can't be an option to lower the number of high quality models.

  24. Lobster9 says:

    I have almost the exact same rig as listed in the officially released ‘Recommended Specs’ only with slightly more RAM.

    The frame rate isn’t so bad when things actually move, but there is a heck of a lot of pausing every few seconds, and the animation is extremely jittery. I have seen a lot of NPCs sinking into the ground, or spawning on people’s shoulders, and each hiccup is followed by a little global physics explosion that sends all the PhysX hair a-flyin. The cut scenes have very choppy animation (though the camera remains smooth) and characters frequently pop-in between camera cuts. It’s really hard to fight enemies that can teleport too!

    It is quite a rare experience for me to play a AAA game from one of the big-three AAA houses that has the hardware performance of an obscure European simulator. Shoddy network performance? Sure. Ugly graphics that look nothing like the screenshots? Certainly! But a straight-up glitch-ridden mess? Certainly not in the last couple of years... I think the last one for me was Double Agent. *shivers*

    The game itself doesn’t seem too bad. I like Assassin’s Creed well enough, and even though it’s not doing anything too groundbreaking, I am fine with another trip around a historic city. I just wish the thing would work!

  25. rcguitarist says:

    The fact that the consoles are having problems proves that this is an issue with bad optimization and bad QA. Those console versions should have been able to be perfected since there are no hardware variations. So when a developer can't even get a game to run right on those, what hope does the PC version have? All Ubisoft games are $5 bargain bin items... even a new game like this.

  26. Jimbo says:

    Have you seen the microtransactions though? More like Assassin's GREED, am I right, Meer??

  27. Warbutt says:

    As always it's all about the build of your PC; the average computer is going to get wrecked by this game. However, those of us with machines in the top 5% are drooling over this game and we are happy campers.

    Here is my gameplay so far; I only do intro commentary. Running an i7 3770k, 16GB RAM, and a GTX 980 at 1440p, maxed settings, between 40-60fps. I did have to change my codec after the first few videos to smooth it out and get a little better quality, so Sequence 2 Memory 2 or 3 and onwards run very smooth and look great =)

    • Nouser says:

      Is the popping always that terrible? I can see the geometry being loaded even in the indoor scenes.

    • thekelvingreen says:

      Having never played an Assassin’s Creed game, there may be an explanation of which I’m unaware, but why are the palace guards in revolutionary France apparently from Sheffield?

      • Lobster9 says:

        Previous AC games have had the correct localized accents for all the characters. Even Black Flag last year had an array of character specific accents including Welsh, Spanish, French, etc. (Note: The player character of the first game had an American accent despite being Middle Eastern, but they kinda fixed that in Revelations.)

        For some reason they decided to ditch that kind of detail in Unity and use a variety of British accents.

        I am sure Ubisoft will have some BS public reason for doing so. ‘it’s more cinematic’ or ‘it helps us tell the story we wanted’ or something along those lines. Though the real reason is likely to be even more annoying. Some guy upstairs probably felt that the games didn’t sound enough like a TV drama, or the Les Miserables movie. Okay, maybe not.. but it’s really easy to be cynical with this company.

        • Alistair says:

          Given you can have the dialogue in French if you choose, that doesn’t seem like too much of a complaint.

          • Lobster9 says:

            True, but it still seems odd to me that they decided to throw out the old way of doing things. Like I said, it's mostly just my own cynicism given that the game is running so badly. I feel like Ubi rushed to get a Next-Gen AC game out the door in time for the 2014 season, and I'm just bundling all my frustration into every tiny problem.

  28. CookPassBabtridge says:

    Has anyone looked at the cores being used whilst this thing is running? I know some developers have been building in six core usage to make use of the consoles' extra cores (apparently the Havok physics engine does or soon will). Do you see idle cores on 4 or 6 core CPUs at all?

  29. Darthus says:

    Alec et al: I've done a lot of testing on this, and I can pretty confidently say it's not CPU bound, at least on my system. And I'm running a Core i5-750 (quad), much below Alec's Core i7. I had an Nvidia GTX 760 with 2GB of VRAM, and I have 8GB of system RAM. So my gfx card is the newest, most up-to-date component by far. The game was nearly unplayable on anything other than low settings. I ran a benchmarking tool in game to monitor CPU, RAM and gfx card usage. Even on Low, my CPU never went above around 50% usage, my GFX card was sitting at 99% the entire time and I was doing maybe 30 fps.

    I upgraded to a 4GB GTX 970 and now the game runs on all ultra, max settings at 60fps. EXCEPT for anti-aliasing. If I run TXAA, it drops to 30fps, but running FXAA via the driver, it's consistently 60fps.

    To me, on my system at least, that is proof that the game is HIGHLY gfx card sensitive, especially if my non overclocked core i5 can run everything on ultra at 60fps. Also, NVIDIA themselves say the game requires a 2GB gfx card for Low Textures, 3GB for High Textures and 4GB for Ultra Textures. So, use that as a guide for your texture settings, make sure you’re using FXAA or no anti-aliasing, and see if that helps.