An Exciting Dishonored 2 Performance Update

This is my life now

I've had a 'mare of a time with Dishonored 2 [official site]. You can tell because I've devoted an order of magnitude more words to the subject than I've written to my parents in the past two years. As I wrote yesterday, the latest patch has ameliorated but not solved the performance problem - however, I might now have found the sweet spot. Not without compromise.

I've written a lot here, to the extent that I now never, ever want to write about Dishonored 2 performance in any capacity ever again, but skip right to the last paragraph if you want a fast summary of it all and an answer to the question of whether it's now 'safe' for you to buy a videogame we like a lot.

The key, for me, was 1080p. As I've mentioned, I'm foolish enough to own a 3440x1440 ultrawide monitor (it's tax deductible for the self-employed! Also it's really, honestly handy for multi-tasking!). I'm currently powering it with a Radeon R9 Nano, which has been overclocked (via increasing the power settings) to be pretty much nose-to-nose with AMD's current flagship, the R9 Fury X. No, not exactly top of the range (I wish AMD would get on with releasing the RX 490 or whatever they're planning next - NVIDIA have currently stolen a handy march on them with the GTX 1080), but it's certainly high-end, and I'm going to write this in bold because it's very important:

Literally any other videogame runs really well at high and often even max (MSAA aside) settings at this resolution and on this card.

Yeah, I'll take it on the chin that I am playing games at a relatively rarefied resolution (and I know I've painted myself into an uncomfortable corner of always needing quite high-end cards as a result) and so absolutely cannot expect the hallowed max settings/60 FPS. That's fine. But: literally any other videogame.

In Dishonored 2, it's not even as simple as 'my framerate isn't high enough at the settings I'm happy with'. It's that my framerate fluctuates severely - less so since the latest patch, which I'm grateful for, but it still roves unpredictably everywhere between 20 and 60 within the space of moments.

I certainly don't need 60 fps, particularly since this silly monitor has FreeSync and therefore gives me a fast, tear-free image at anything from 45 to 75 frames per second, but I'd like 45ish, as it feels far less sluggish than 30. I can get 45 at 3440x1440 on Ultra settings. Then it'll be 20 in the next scene. Maybe 50 in the next. Even on the lowest of low settings, the frames are all over the place - going as high as my screen's (unnecessary, for me) 75, as low as 25, and everything in between.

Even with FreeSync, the jerky feeling is unpleasant when the frames fall below 45. I can lock the frame rate to 30 at a mix of Medium and High settings, which gets me a broadly smooth but slightly treacly feel, with a distractingly washed-out, oddly blurred-seeming image (no, I don't have adaptive res turned on). That's the best it gets at 3440x1440. Playable now, yeah, but some distance south of the sweet spot.

This isn't a point of principle. If monitors were able to display non-native resolutions without the image looking all 60s-vaseline-effect, I'd use a lower res without hesitation. But unfortunately that's not the case, and instead non-native looks sludgy - as does Dishonored 2's built-in 'adaptive resolution' option, which can lower the res on the fly to compensate for performance drops. It's a neat idea, but it really does look awful.
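For the curious, adaptive resolution schemes like this generally boil down to a feedback loop on frame time: render fewer pixels when a frame comes in slow, claw the sharpness back when there's headroom. Here's a minimal sketch of the idea in Python - a generic illustration, not Arkane's actual code; the names, thresholds and step sizes are all invented:

```python
# Generic sketch of adaptive resolution scaling - not Arkane's implementation.
# All names, thresholds and step sizes below are invented for illustration.

TARGET_FRAME_MS = 1000 / 60      # aiming for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # never render below half resolution

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the internal render resolution up or down based on frame time."""
    if last_frame_ms > TARGET_FRAME_MS * 1.1:    # running slow: shed pixels
        scale = max(MIN_SCALE, scale - 0.05)
    elif last_frame_ms < TARGET_FRAME_MS * 0.9:  # headroom: restore sharpness
        scale = min(MAX_SCALE, scale + 0.05)
    return scale

# At 3440x1440 with a scale of 0.8, the game would render at 2752x1152
# and upscale to the display - hence the soft, smeared look I'm describing.
```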

However, finding the 30 fps lock a mite distracting, I tried dropping the res to 2560x1080, the lower ultrawide standard. Yes, it gave me the whole 'playing with smudged glasses' look, but it also did magnificent things to the framerate - I don't care about the actual number, but I do care about a game feeling nice in the hand, which it now did. That's because the number was now relatively stable, instead of roaming all over the place. It won't stay at 60 on any setting, but it will spend most of its time there (and often at my screen's full 75) even at Very High, while Ultra gets me 45 almost all the time.

A choice of compromises, then. Fortunately, Dishonored 2 added several new settings in its latest patch, one of which was a sharpening post-process effect for its TXAA anti-aliasing. I whacked this all the way up to 20 instead of the default 10, and the image looked significantly less blurry. It's still obvious it's non-native, but it's OK, no longer like a dog did its bum-cleaning shuffle across the screen.
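Sharpening options like this are typically some flavour of unsharp mask: subtract a blurred copy of the frame from the frame itself, then add that high-frequency difference back in. Here's a minimal sketch, assuming a generic filter rather than Arkane's actual shader - scipy's gaussian_filter stands in for whatever blur the game really uses:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, amount: float = 0.5, sigma: float = 1.5) -> np.ndarray:
    """Generic post-process sharpen: add back the detail that upscaling smeared.

    image is a float array in [0, 1]; amount controls the strength,
    much like turning the in-game slider up from 10 to 20.
    """
    blurred = gaussian_filter(image, sigma=sigma)
    detail = image - blurred  # the high-frequency edges lost to the blur
    return np.clip(image + amount * detail, 0.0, 1.0)
```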

Why is it so much better? Well, first some maths. We're talking about rendering 4,953,600 total pixels at 3440x1440 versus 2,764,800 at 2560x1080 - a drop of roughly 44%, so we might as well say I've effectively halved the resolution. On paper, halving the pixel count can as much as double the framerate - from 30 to 60, for the sake of argument - so the improvement makes sense. But only if one takes the position that I'm not entitled to expect a certain baseline of performance at 3440x1440 in the first place.

Sure, if I were trying to do 4K, with its very silly 3840x2160 / 8,294,400 total pixels, I wouldn't expect much, but I ain't going anywhere near that foolishness. I've been playing games at 1440p for half a decade now (I had a grey-market Korean 2560x1440 monitor before this one) and it's always been fine so long as I had a decentish graphics card - I've not put myself in a bizarre niche here.
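If you'd like to sanity-check that arithmetic, it's simple enough to reproduce - here's the sum in Python:

```python
# Quick check of the pixel counts quoted above.
resolutions = {
    "3440x1440 (ultrawide)": 3440 * 1440,        # 4,953,600
    "2560x1080 (lower ultrawide)": 2560 * 1080,  # 2,764,800
    "3840x2160 (4K)": 3840 * 2160,               # 8,294,400
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

drop = 1 - resolutions["2560x1080 (lower ultrawide)"] / resolutions["3440x1440 (ultrawide)"]
print(f"Drop from 3440x1440 to 2560x1080: {drop:.0%}")  # ~44%
```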

Again, I don't expect max - but I haven't had this severity of problem or had to use this solution in literally any other game. Sure, sometimes I've had to put settings lower than I'd like, but that's OK so long as I have a non-pooey image and smooth feel. It's the fluctuation and attendant wonky game-feel at any setting that bothers me, not being able to reach a certain maximum.

I'll note here also that, prior to yesterday's 1.2 patch, I was getting very poor, unpleasant-feeling performance at 2560x1080 too, and didn't test that res yesterday because I had it in my head that I'd continue to find that 2560 and 3440 were doing more or less the same thing. I was wrong, and should have tested that first.

So, a couple of suspicions:

1) The performance-fixing patch has been specifically tuned towards getting 1080p right. That makes sense and I don't begrudge it, as it's the go-to res for gaming on most platforms now. D2 at launch caused problems for a whole mess of people, especially those with AMD cards, and I can totally appreciate that the priority has to be helping the majority. Maybe fixes for bigger resolutions will follow now that the baseline has been sorted out.

2) There remains an AMD issue. M'colleague John is playing the game happily on Ultra at 2560x1440 (i.e. standard widescreen 1440p), using a GTX 1080. Yes, that is a real beast of a card, and clearly he's going to do better than I will on my Nano at a wider res, but he's also able to play at much higher settings and without the weird fluctuations that I see.

This latter could be down to ongoing AMD problems, it could simply be down to the power of the card, or it could be that my Nano has 4GB of onboard memory rather than the 1080's 8GB (the Nano's RAM is supposed to be much faster, but I honestly don't know how that shakes out in practice). I'm increasingly aware that I'll need an 8GB card for this res in the not-too-distant future, particularly in terms of max-settings textures, and if that's the culprit, I can accept it - D2's textures sure are fancy, after all. Not that lowering them makes more than a couple of frames' worth of difference.

I think what's needed is more granularity about minimum/recommended system requirements. I have well above Dishonored 2's recommended spec (but even then had unplayable performance prior to yesterday's patch), but if the official line is that they only mean 1080p by it, I'd really like them to say so in order that I can check my expectations before going in.

Or more in-game advice. For instance, a couple of recent Square Enix games, such as Deus Ex: Mankind Divided, popped up a little message when I set textures to high saying that this would require more than 4GB of VRAM, and so I duly set 'em lower and was content.

Solutions, then, are either to accept a lower resolution or to buy a new graphics card. I can't afford the latter (plus I want to wait for AMD to get its act together and release something new, as GeForces can't use my monitor's FreeSync - I have no brand loyalty either way otherwise). I'm going to go with the first approach, as that new sharpening option makes a decent difference. I could wait it out and see if future patches help the situation, but to be honest I'm sick and tired of thinking about this, so I'll take the compromise.

The point of this post? Mostly that I don't get out much. But also to expand on yesterday's take on the latest patch, in which I broadly said 'good, but not good enough'. My revised take is 'good enough for 1080p/2560x1080, but not good enough above that'.

By which I mean Arkane have done the work and now sorted out the problem for the majority - if you've got a 1080p screen and have hitherto held off from buying the game, I believe that you should now be safe. It is a good game, and well worth your money now it's fixed for most folks.

As for relative outliers like me? Put up or shut up for now. But at least it's no longer lurching all over the place even at a lower res.