Storm in a Teacup: Anti-Anti-Aliasing

I keep seeing this story doing the rounds, and originally didn't deem it worthy of a post, but seeing as this week seems to have inadvertently been Hot Button Issue Week on RPS, I may as well give you folks a chance to have a shout about Assassin's Creed too. I know you do like to.

There's some fury amongst the more technically-minded echelons of PC gaming about the announcement of a patch for the divisive stealth/parkour/action/endless cutscene 'em up. The patch's main purpose is to fix some of the PC port's many glitches, but there's an unhappy rider attached - it removes the DirectX 10.1 codepath from the game. DX10.1 requires DX10.1 hardware - so your GeForce 8 can't run it, and will default to 10.0 instead. The latest ATI Radeon HD cards, however, do support 10.1. The differences between 10 and 10.1 are very minor, and certainly not worth upgrading your card for (especially as AssCreed is, I believe, the only game to support it so far), but there's one stand-out goodie - 10.1 corrects a snafu in 10 that caused some anti-aliasing slowdown. So, if you're running a Radeon HD 3000 series card, you can smooth out AssCreed's edges with a significantly smaller framerate hit than if you're running a GeForce (in Vista, anyway). Unless you apply the update, which coldly removes the Radeon advantage. A patch that makes a game run worse? Hmm.

(I'm avoiding the technical detail on how this works for the sake of clarity, but it's here if you want it.)
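For those who do want a quick flavour of it, the gist (as I understand it) is this: DX10.1 lets a game bind its multisampled depth buffer as a shader resource and read it back per-sample, so the engine can reuse the depth it already rendered when doing anti-aliasing and post-processing. DX10.0 forbids that, forcing an extra pass that writes depth out again into a colour buffer - hence the slowdown. Below is a minimal sketch in Direct3D terms of where the two diverge; this is not Ubisoft's actual code, and the function name is mine:

```cpp
// Sketch only: create a 4x MSAA depth buffer that can also be read in a
// shader. The key point is that this combination is legal on a DX10.1
// device but is rejected on plain DX10.0 hardware (at resource or view
// creation, depending on driver), which is the per-card difference the
// patch removes.
#include <d3d10_1.h>

HRESULT CreateReadableMsaaDepth(ID3D10Device1* device, UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;   // typeless so two views can alias it
    td.SampleDesc.Count = 4;                  // 4x MSAA
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = device->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;

    // View for rendering depth as usual.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // View for reading the same depth back in a shader - the 10.1 goodie.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS; // expose the depth bits only
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*tex, &sd, srv);
}
```

On the shader side (shader model 4.1), the pixel shader can then declare the buffer as a Texture2DMS and Load each sample directly. Without that, a 10.0 engine has to burn a whole extra pass re-rendering depth before it can anti-alias, which is roughly where the Radeon's patched-out advantage came from.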

For the sake of performance, I generally don't bother with anti-aliasing unless it's an old game or I'm taking screenshots, but crisp edges really are a big deal to image quality junkies. Thus, Angry Men On The Internet are extra-angry about the performance hit, while The Angriest Men have even accused NVIDIA of strong-arming Ubisoft into deliberately killing ATI's advantage. NV and Ubi have been sharing the AssCreed marketing campaign, and the game sports that The Way It's Meant To Be Played guff, so clearly NVIDIA wouldn't want it to run better on rival cards. Would they really go to such lengths, though? Both NVIDIA and Ubisoft strenuously deny there's any funny business going on.

According to TG Daily's extensive report (which includes comment from all involved parties) on this oh-so-specific scandal, Ubisoft have told them the game was developed mainly on 10.1 hardware and not tested on enough 10.0 systems, meaning a whole ton of gamers have been experiencing problems, including crashes. If that's true - and it's a pretty silly oversight if it is, given most every bugger seems to have a GeForce 8 these days - then simply ripping out the 10.1 path could have been the cheapest, easiest way of making the most serious bugs go away. It may well be as simple as there being no budget or motivation for a best-of-both-worlds patch. Or maybe there is something sinister going on, though there's certainly nothing to prove it as yet. Simple sloppiness, so common in console-to-PC ports, seems the most likely explanation.

At any rate, there seems to be an easy answer for us gamers - install the patch if you're running a GeForce 8/9 or an early Radeon HD, but don't install it if you're running an HD 3400/3600/3800 series card. Unless, of course, the patch is such a box of additional delights that it's worth having anyway. That said, the patch isn't out yet - when we've even got EA backing down in the face of internet rage, it's not out of the question that Ubi's plans might change.

Anti-aliasing, eh? It's the new piracy.