A/S/L/FXAA/MLAA? Edge-Smoothing’s Future

By Alec Meer on April 10th, 2012 at 12:30 pm.

As smooth as Kenny

Do you care about anti-aliasing? Do you dream of snuggling up to its sort-of-crisp edges and mild performance hit? Or are jaggies an acceptable compromise in the name of RAW INCREDIBLE SPEEDY SPEED? It’s one of those things I find it increasingly hard to go without (though not as much as anisotropic filtering, missus), yet it’s always the first thing to go if a game’s not running so well on my ageing PC. Also, so many games don’t include a decent option for it – or any at all – in their settings, requiring a fiddle in the driver control panel, with variable results. Both NVIDIA and AMD are trying to change that, with newer anti-aliasing tech and the option to force it on globally in their drivers.

(I’ll probably get a bunch of stuff wrong here, but I am just trying to address the very broadest strokes. Dave Science I am not.)

FXAA is an NVIDIA-developed alternative that’s been doing the rounds in some titles, with broadly positive results. It can muster edge-smoothing that’s not quite as nice as traditional MSAA, but with only a fraction of the performance hit, and it fares better with transparent textures. The snarkier elements of the tech audience observe that FXAA is basically a post-processing blur, but I’ve found it a decent halfway house in practice.

Those who demand the finest image quality will stick with the old ways, but rank-and-file PC gamers may be pleased to hear that NVIDIA have just introduced FXAA as a driver toggle for their legacy cards, not just the new GTX 680. While not all games will play nice with it, ‘hundreds’ apparently will – without needing to offer it as an option in their own settings.

AMD’s drivers don’t offer an FXAA toggle (hacked drivers are apparently available if you know where to look, and games such as Battlefield 3 that build FXAA in will run it on AMD hardware regardless); AMD have instead pushed their own alternative, MLAA, or Morphological Antialiasing. Version 2.0 of that was recently released with the 12.3 and 12.4 beta Catalyst drivers, and apparently offers sharper image quality than both MLAA 1.0 and FXAA.

So in theory no-one’s losing out in this new anti-aliasing party, but as with so many things NVIDian and AMDian throughout history, technology – and with it PC games – is branching out in slightly separate directions, which is going to complicate matters in all sorts of ways.

The new GeForce 301.24 beta drivers, plus a clutch of chest-thumping about what they offer (adaptive V-sync and frame-rate limiting are the two other headline features), can be found here, while AMD Catalyst 12.3 is here and the 12.4 beta here.


91 Comments

  1. Risingson says:

    Is this Michael Bolton disguised as Kenny G?

  2. indigohjones says:

    Much of a performance hit from NVIDIA’s FXAA or no?

    • D3xter says:

      It’s not really “anti-aliasing” so much as filtering: it doesn’t work by edge correction, trying to draw straight(er) lines, but instead filters the entire frame retrospectively in a single pass, checking for edges – so it can catch HUD elements and chat/text etc. too if forced on…
      So yeah, it’s faster, but it doesn’t exactly do what MSAA does. It’s probably good for consoles, as they don’t have the power for complex algorithms, or where other shaders take rendering priority.
      http://www.ngohq.com/images/articles/fxaa/FXAA_WhitePaper.pdf
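      A toy illustration of what’s being described – a hand-rolled Python/numpy sketch of the general idea (luma-based edge detection plus a blur on just those pixels), and emphatically not NVIDIA’s actual algorithm:

```python
# Toy single-pass post-process in the spirit described above: estimate
# per-pixel luminance, flag pixels whose local luma contrast is high, and
# blend only those with their neighbours. Illustrative sketch only - it is
# NOT NVIDIA's FXAA, and note it happily blurs text/HUD pixels too.
import numpy as np

def toy_post_aa(rgb, contrast_threshold=0.1):
    """rgb: float array (H, W, 3) in [0, 1]. Returns a filtered copy."""
    luma = rgb @ np.array([0.299, 0.587, 0.114])   # per-pixel luminance

    # Luma of the four axis-aligned neighbours (edges handled by padding).
    pad = np.pad(luma, 1, mode="edge")
    n, s = pad[:-2, 1:-1], pad[2:, 1:-1]
    w, e = pad[1:-1, :-2], pad[1:-1, 2:]

    contrast = (np.maximum.reduce([luma, n, s, w, e])
                - np.minimum.reduce([luma, n, s, w, e]))
    edge = contrast > contrast_threshold           # "is this an edge pixel?"

    # A plain 3x3 box blur stands in for FXAA's directional filtering.
    padc = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(padc[dy:dy + rgb.shape[0], dx:dx + rgb.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0

    out = rgb.copy()
    out[edge] = blurred[edge]                      # touch only edge pixels
    return out

# A hard black/white vertical edge gets softened; flat areas stay untouched.
img = np.zeros((4, 8, 3))
img[:, 4:] = 1.0
print(toy_post_aa(img)[0, :, 0])  # [0. 0. 0. 0.333 0.667 1. 1. 1.]
```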

  3. Derppy says:

    SMAA > FXAA/MLAA. Better quality and not owned by Nvidia or AMD. That’s what developers should use and just forget about the latter.

    I also hate to see games dropping support for MSAA entirely. The “fake” anti-aliasing blur filters are horrible for stuff like small fonts, because they treat the whole image as a bunch of pixels and aren’t aware of the context.

    The performance hit of MSAA might be huge compared to these blur filters, but the option should still be there, because it’s superior and eventually everybody will have rigs that can run it.

    • CilindroX says:

      AND you can inject it into most games! I used the SMAA injector on Skyrim and The Witcher 2. The performance benefits are worth the (minimal) image quality degradation compared to the built-in AA or AMD’s/Nvidia’s approach.

      Just remember to disable the built-in AA on the game, and do not force anything on your VGA’s control panel.

      Linky-links:

      injector – http://mrhaandi.blogspot.com/p/injectsmaa.html
      SMAA – http://www.iryoku.com/smaa/

      • Tokamak says:

        Is there a master list of games that work with SMAA? Or at least configurations for them? So far I’ve tried about a half dozen games and they’ve all crashed or simply didn’t work.

        • CilindroX says:

          I don’t recall there being one but, honestly, I’ve tried with a good part of my Steam library without issues.

          Remember you need to be using the correct wrapper for the DX version of the game executable, AND to disable any kind of *forced* anti-aliasing set in your graphics card’s control panel (the latter seems to be your problem here).

          So far I’ve used this on all the games mentioned above, GTA IV, an awful lot of Unreal Engine 3/2.5 variants, the Assassin’s Creed series, Trine, Deus Ex HR, and Darksiders.

        • Freak2121 says:

          Any game that’s DirectX 9 or above should work (or at least it does for me).
          Sadly, OpenGL isn’t supported yet. :(

      • Subucula Tertia says:

        Here’s a more recently updated SMAA injector: http://forums.guru3d.com/showpost.php?p=4389641&postcount=763

    • Khemm says:

      Totally agree. I use “SMAA injector” to enable Anti-Aliasing in games which don’t have any proper AA support (Dead Space, Settlers 7) and the results are pretty good. Supersampling and MSAA look the best, no question – they provide image sharpness and get rid of the jaggies – but SMAA also does its job, though obviously not as well as MSAA.
      http://mrhaandi.blogspot.com/p/injectsmaa.html

      FXAA and MLAA are terrible. TERRIBLE. They blur the image beyond belief to the point I can’t shake the feeling that vaseline has been spread all over the screen, not to mention the filter FXAA applies has a negative side effect on textures – the loss in detail is clearly noticeable.

      We need some equivalent of MSAA, that’s for sure.

    • royale says:

      Wow, ty for the link to SMAA. Just tried it out on Dead Space 2 and it looks fantastic, cleaning up the jaggies on distant objects without killing framerates. Wish I’d had this for Dead Space 1, back when I spent hours trying dozens of NVIDIA compatibility codes. (That game’s visuals would make me sick at times – the edges, not the aliens.)

      This is really impressive and I’m sure I’ll be following up with trying it on other games.

      • CilindroX says:

        @Royale: I’ve seen the most benefits in games based on Unreal Engine 3 (which is a PITA to get AA working on) – so far it has helped me with both Mass Effect 1 & 2, Gears of War, and Batman: Arkham (Asylum & City).

        Also, when Skyrim was compiled like ass, this granted you an extra +5fps or so, making the experience really worthwhile. Same for GTA IV and all its variants’ shitty performance.

        And granted, from time to time I’ve got to press “PrtScn” just to remind myself how awesome this thing is.

    • kalirion says:

      Hmm, I wonder if this SMAA injector would work on my laptop’s Intel GMA 4500MHD GPU, which doesn’t support any sort of AA natively.

      Obviously the performance hit would likely make all but the simplest DX9 titles unplayable, but it would be nice to have some AA in Zombie Bowl-a-Rama at least :)

    • FriendlyFire says:

      The reason more games are dropping hardware-based anti-aliasing is that many developers are switching away from forward rendering and toward techniques like deferred shading. Deferred shading doesn’t support hardware MSAA out of the box, and forcing it on causes all sorts of nasty artefacts. There are some awkward workarounds, but generally speaking hardware-based AA gets turned off under deferred shading.

      This makes a good post-processed anti-aliasing pass crucial. Further, the post-process can actually take advantage of the specifics of deferred shading, like access to screen-space normals, to better compute edges and smooth out only what is necessary. I believe we’ll see even more “PPAA” down the line, with the likes of SMAA and FXAA 4.

      And if you’re wondering why deferred shading is being used at all, the reasons are numerous, but the primary one is that it lets you cheaply compute numerous light sources. Forward rendering requires rendering all the geometry once per light source, whereas deferred shading renders the geometry once and then computes a simple “light pass” for each light, which is very lightweight. Hundreds of simultaneous lights are entirely doable with deferred shading, whereas forward rendering would be limited to a few dozen at best.
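      To make the cost structure concrete, here’s a minimal toy model in Python/numpy – an invented illustration, not any engine’s real pipeline: the geometry pass fills a G-buffer once, then each light is a cheap full-screen arithmetic pass.

```python
# Toy model of deferred shading's cost structure: geometry is rasterised
# ONCE into a G-buffer, then every light is a cheap screen-space pass.
# Names and the lighting model are invented for illustration.
import numpy as np

H, W = 270, 480  # a small "screen"

# --- Geometry pass: runs once, regardless of how many lights there are ---
rng = np.random.default_rng(0)
g_albedo = rng.uniform(0.2, 0.9, size=(H, W, 3))        # surface colour
g_normal = np.zeros((H, W, 3))
g_normal[..., 2] = 1.0                                  # all facing the camera
ys, xs = np.mgrid[0:H, 0:W]
g_pos = np.dstack([xs, ys, np.zeros((H, W))]).astype(float)  # world position

# --- Lighting: one cheap full-screen arithmetic pass per light ---
lights = [{"pos": np.array([x, y, 50.0]), "colour": rng.uniform(0, 1, 3)}
          for x, y in [(100.0, 60.0), (400.0, 200.0), (240.0, 135.0)]]

frame = np.zeros((H, W, 3))
for light in lights:            # 100 lights = 100 of these; no re-rasterising
    to_light = light["pos"] - g_pos
    dist = np.linalg.norm(to_light, axis=-1, keepdims=True)
    n_dot_l = np.clip((g_normal * to_light).sum(-1, keepdims=True) / dist, 0, 1)
    frame += g_albedo * light["colour"] * n_dot_l / (1.0 + 0.001 * dist ** 2)

print(frame.shape)  # lighting cost scaled with len(lights); geometry ran once
```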

    • Sweetz says:

      Yup. SMAA cleans up edges as well as or better than FXAA/MLAA, with only very minimal blurring within textures. The thing is, we’ve all just been using the injector and getting great results, but the injector can only apply the most basic features of the SMAA algorithm. If developers implement native SMAA, there’s even more they can do with it – but so far no one has used it in a production game. Hopefully we’ll see a native implementation soon, and it will catch on and supplant FXAA.

      As far as “games dropping support for MSAA” goes, that’s due to the near-ubiquitous prevalence of deferred rendering these days, which doesn’t mesh well with how MSAA works. Both nVidia and AMD have come up with compatibility hacks to enable MSAA in deferred-rendering games (notably UE3-based titles), but it’s still partial coverage at best.

      Deferred rendering is key to how good games look these days, so asking developers to stop using deferred rendering so their game can have MSAA isn’t reasonable. Given the results I’ve seen with the SMAA injector, and the promise of how good it can look in a native implementation with some of the other features turned on, I’m ready to give up MSAA in favor of well done post-process AA.

    • phenom_x8 says:

      Wow, this is great, guys. Thanks for the SMAA injector – it’s awesome to see my GTA 4 textures look sharper than before (I’d been using the FXAA injector). I never thought a much better post-process AA than FXAA existed. I was rather disappointed by MLAA, which affects text very visibly, and by FXAA, which blurs textures significantly; after trying SMAA on GTA 4, both problems are almost completely gone. I hope more developers take this up as an alternative AA, replacing FXAA and MLAA with their stupid exclusivity (although FXAA isn’t exclusive to NVIDIA cards, thanks to those injectors).

    • Snakejuice says:

      “The snarkier elements of the tech audience observe that FXAA is basically a post-processing blurring”

      I’m one of those guys: I love MSAA but find FXAA so bad I’d rather run with no anti-aliasing at all. Would it be worth my time to look up and experiment with SMAA, or is it just a reimplementation of FX/MLAA?

      • CilindroX says:

        From the link:

        “Our method shows for the first time how to combine morphological antialiasing (MLAA) with additional multi/supersampling strategies (MSAA, SSAA) for accurate subpixel features, and how to couple it with temporal reprojection; always preserving the sharpness of the image.”

        To my eyes it’s something along the lines of “FXAA on paper, but way crispier and with the same performance toll.”

        Try it for yourself, it won’t set your VGA on fire, and you only waste 4 clicks at the most.

    • Zyrusticae says:

      I feel compelled to point out that the noted flaws of FXAA have been largely fixed in the driver implementation by Nvidia.

      I’m actually surprised by how effective it is at getting rid of aliasing while preserving the readability of text.

  4. mckertis says:

    It’s been many years, and I’m still struggling to understand why people are so obsessed with AA in games.

    • Sheng-ji says:

      Me too. Personally I find it the equivalent of having grease rubbed in your eyes if it’s too strong – it totally destroys any crispness or sharpness your card may be capable of delivering.

    • Apples says:

      The effect where you see jagged pixels run up and down diagonal lines of models as you run around in a game is not hugely aesthetically pleasing. Too much AA makes everything look slightly weird and soupy, and probably really slow in terms of FPS, but none at all on modern games with high-poly models can also look pretty toss.

      People who go back to old games (as in 90s to early 00s) and add AA, on the other hand… I don’t really get that. That wicked jaggy look is part of the graphical charm!

      • LionsPhil says:

        Actual, real anti-aliasing – as in, it compensates for the signal-processing issue known as aliasing, e.g. supersampling – does resolve that case quite nicely. It will also catch things like a very narrow ship mast, fencepost, or the like in the distance (fences and rails going off into perspective are a great example) which are hovering around just about a pixel wide.

        Edge-detection/blur hacks cannot, by definition, handle that case, since they operate on pixmaps, not geometry, and will have said objects flickering in and out of view.

        Edit: In fact, the comment below with the animated GIF toggling between no and 4x AA in Half-Life 2 shows it nicely; look at the cables. 4x AA is effectively using geometry information to determine what fraction of the cable is in each pixel (in that it’s checking to see which points at some higher resolution the polygon hits), so can make them look consistently thick. No AA, or no-AA-just-a-blur-hack, can’t; the information is lost when you quantize it down to some integer number of pixels wide.
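        The mast/fencepost case reduces to a tiny Python/numpy sketch (purely illustrative): a feature half a pixel wide misses every pixel centre at 1x and vanishes entirely, while 4x supersampling recovers its fractional coverage.

```python
# A sub-pixel-wide feature vanishes without supersampling but survives as
# fractional coverage with it. Illustrative 1D "renderer" in numpy.
import numpy as np

def rasterize(width_px, x0, x1):
    """Pixel is lit iff its centre falls inside the feature [x0, x1)."""
    centres = np.arange(width_px) + 0.5
    return ((centres >= x0) & (centres < x1)).astype(float)

# A "mast" half a pixel wide, sitting between pixel centres.
mast_left, mast_right = 3.6, 4.1

no_aa = rasterize(8, mast_left, mast_right)

# 4x supersampling: rasterize at 4x resolution, then box-filter down.
hires = rasterize(32, mast_left * 4, mast_right * 4)
ssaa = hires.reshape(8, 4).mean(axis=1)

print(no_aa)  # [0. 0. 0. 0. 0. 0. 0. 0.]   -> the mast vanished entirely
print(ssaa)   # [0. 0. 0. 0.5 0. 0. 0. 0.]  -> fractional coverage survives
```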

    • tehfish says:

      Count me as the total opposite. AA is the single most important graphics setting to have, I think.

      It’s not quite so bad on consoles, as their usual 720p (or less) to 1080p upscaling provides a measure of smoothing anyway, plus people generally sit further from the screen.

      On the PC though, jagged lines are very noticeable and in my opinion, look terrible/distracting.

      • diebroken says:

        I rarely go above the minimum (2x/4x) for AA, and focus on textures/AF (anisotropic filtering) instead as the main point for graphics settings…

      • royale says:

        It’s absolutely necessary for me, although some games are worse than others. I found the jaggie effect in both Dead Space and GTA4 to be so bad as to render the games unplayable.

      • The Sombrero Kid says:

        Upsampling makes jaggies more pronounced, not less. This just reinforces my opinion that vocal advocates of anti-aliasing don’t actually know what it does, or don’t notice it.

        • FriendlyFire says:

          Depends on the method. Often, you’ll get upsampling with filtering (it’s still cheaper than rendering at the target resolution), which tends to make things blurry and thus filter out the aliasing.

          Of course, this is really the worst form of antialiasing; might as well apply depth of field everywhere – oh wait…

        • DrGonzo says:

          Well, jaggies do bother me quite a lot in certain games, not so much in others, but a smooth framerate always wins out over AA. If you can’t tell the difference, though, you either have a very high-resolution monitor or sit quite far away from it; it’s a very noticeable and obvious difference, in my opinion, on my 1680×1050 screen.

          edit: didn’t read your post properly, ignore me!

        • TechnicalBen says:

          Not if it’s scaled by less than double with no filters applied. Try it in an image editor: double scale = more jaggies, 1.5x = fewer. Is 1080p twice 720p? No, so some of the edges will be “smoothed” across the pixels.
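          The same experiment in code rather than an image editor – a made-up numpy sketch, with linear interpolation standing in for the editor’s resampling: an integer 2x nearest-neighbour scale reproduces the staircase exactly, while a 1.5x resample forces in-between values at the edge.

```python
# Scaling a hard 1D edge: an integer nearest-neighbour scale preserves the
# staircase exactly; a non-integer factor forces blended in-between values.
# Illustrative numpy sketch only.
import numpy as np

edge = np.array([0.0, 0.0, 1.0, 1.0])   # a hard edge across four pixels

def scale_linear(signal, factor):
    """Resample with linear interpolation at the new pixel centres."""
    n_out = int(len(signal) * factor)
    src = (np.arange(n_out) + 0.5) / factor - 0.5   # map back to source coords
    return np.interp(src, np.arange(len(signal)), signal)

print(np.repeat(edge, 2))       # [0. 0. 0. 0. 1. 1. 1. 1.] - same hard step
print(scale_linear(edge, 1.5))  # [0. 0. 0.167 0.833 1. 1.] - edge gets blended
```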

      • Sheng-ji says:

        Nah, I know what they are – in your example, 4x is fine, but get it up to 16x or 32x and it starts to make the edges look out of focus (not in the same way as bloom). Those pixel-level jaggies are responsible for the micro-contrast which creates sharpness; wash them out too strongly and that sharpness disappears. Look at any graphics card advertising: you won’t see jaggies, but everything will be crisp and clear, implying that their “best” setting is a fairly weak, possibly single-direction AA filter.

    • hatseflats says:

      AA is one of the most important graphical settings, IMO. Jagged edges are really distracting, and far more obvious than (lack of) tessellation, polygon count or even low-resolution textures. Instead of, for instance, a cable you get a few chaotic pixels. That’s far more nagging than an object with low detail, because a low-detail object is at least still an object.

      • grundus says:

        While I don’t doubt that you really notice the jaggies without AA, I struggle to see any difference between the settings in pretty much any game. I tried Dirt 3 last night: I went from 4x MSAA to whatever the highest setting is and saw no real difference. When it gets to the point where you need a loupe to tell the difference, it’s just pointless strain on the graphics card (though my 580 still got in excess of 100fps anyway). I play BF3 with 4x MSAA and it looks great; I can’t really see how it could improve further. Maybe I’ll find out when I get my 680…

        • TechnicalBen says:

          Just because you can’t see it doesn’t mean other people can’t.
          Some of us get a strobing effect due to the screen resolution. Other times it’s just that our eyesight is better. ;)

        • Fiatil says:

          It seems like a weird leap from, essentially, “anti-aliasing is pointless” to “anything over 4x anti-aliasing is pointless”. I’m one of the people who makes a huge deal out of anti-aliasing in my vidjagames, but past 4x (or 8x in some cases) I can’t really tell the difference. The complete lack of it is what makes things look so terrible and jagged; you don’t need 40x AA to notice the huge difference between no anti-aliasing and some.

    • simonh says:

      Maybe it’s just that people sit at different distances from their monitors? Personally I prefer to have my eyeballs almost pressed against mine, so that it takes up most of my field of view. I know many who keep their screens at the back of the desk, though – of course you’re not going to see the pixels then.

    • fish99 says:

      Depends on the game. In games with tons of foliage and a limited palette, like say Crysis or the Hunter, you aren’t really going to see many poly edges against a contrasting colour, so you won’t notice them. In other games it’ll stick out like a sore thumb – iRacing is a good example; it’s ugly without AA.

    • LionsPhil says:

      The ultimate end-game is for everything to have a, if you’ll excuse the Apple marketing, “retina”-grade display with a DPI so high that AA becomes pointless, because you’d need a magnifying glass to see individual pixels anyway. And more brute force power from graphics cards to fill those pixels, rather than farting around with shaders.

      This even includes text rendering, which means rendering a character could go back to being just rubber-stamping it onto the screen, rather than fancy blends onto the background (especially once sub-pixel hinting gets involved).

      • mendel says:

        Well, that’s just oversampling in hardware, sort of.

        • LionsPhil says:

          Well, once you go past the limits of the human eye, yes, sort of, although it still applies universally, including outside of games, and allows the actual content the application is rendering to be AA-ignorant. So some font-rendering complexities go away, and bad interactions between supersampling AA and various lighting/shader tricks stop mattering.

    • Metonymy says:

      I absolutely despised AA when it first appeared.

      (blurry textures? In MY Doom/Quake? It may be more common than you think)

      But now the games look good enough that you need some 2xAA just to get rid of the artifacts. Anything more than this is nonsense, of course. If only the games hadn’t lost quality over the years. Can you imagine playing a REAL Doom game with good graphics? Not garbage gameplay like Bioshock, Skyrim, CoD, Rage, etc.

  5. Apples says:

    I’ve never actually bothered with MLAA on my AMD, but apparently it blurs text, especially small text, quite badly. If FXAA is just a sort of overall blurring effect it likely has the same problem. Not a good candidate for your Morrowind-alikes but probably nice for fast-paced shooty games.

  6. CliftonSantiago says:

    I was hoping this article would be a nice description of what all those bloody acronyms stand for, and what they actually mean! The ones relating to ambient occlusion are even worse.
    I feel so intimidated by acronyms.

  7. Swabbleflange says:

    I really don’t care for it, and very often turn it off in games if it’s toggled on as a default.

    I don’t know what it is… I think it just stems from the fact that I like to ‘see graphics’ – and not just from a pure low-res pixel art point of view. I used to like seeing how things were drawn, especially back in the VGA days (modern smoothing filters on emulators are a crime against humanity). I suppose with a lot of things these days being high-res textures mapped onto polygons the nature of ‘drawing graphics’ is diminishing more and more, but I still really appreciate seeing those edges. It’s kind of what defines the visuals as a computer game.

  8. noclip says:

    Antialiasing isn’t only about smoothing edges. FXAA is nice if that’s all you care about, but nothing quite replicates FSAA in making things look “right”. The nice thing about FSAA is that it’s actually physically correct. The math of it is just a low-pass at the Nyquist frequency to remove information that shouldn’t (in the real world, couldn’t) be there.
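    The Nyquist point is easy to demonstrate with a hand-rolled numpy sketch (mine, not from the post): detail above half the sample rate doesn’t just vanish, it comes back as a false lower frequency – the crawling shimmer that filtering before sampling removes.

```python
# Aliasing in one dimension: a frequency above the Nyquist limit (half the
# sample rate) is indistinguishable from a lower one after sampling.
# Illustrative sketch, not production code.
import numpy as np

fs = 8.0                 # samples per unit of "screen" distance
f_detail = 7.0           # detail frequency, above Nyquist (fs / 2 = 4)
t = np.arange(16) / fs   # sample positions (pixel centres)

sampled = np.sin(2 * np.pi * f_detail * t)
# Sampling theory: f_detail shows up as the alias frequency fs - f_detail = 1.
alias = np.sin(2 * np.pi * (fs - f_detail) * t)

print(np.allclose(sampled, -alias))  # True: at these samples, the 7.0 signal
                                     # is exactly a (negated) 1.0 signal
```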

    • TechnicalBen says:

      It fails on transparent textures like fences etc. though (they have jaggies and no poly defining the edge of the texture).

      Which makes supersampling the true best option for quality. FSAA comes second, and makes up for it with the performance gain.

      • ohne says:

        You actually need randomized supersampling to really (probabilistically) remove all aliasing, not just regular grid supersampling.

  9. CliftonSantiago says:

    Also…Kenny G! Boo! Hiss!

    • The Tupper says:

      Kenny G is so mellow that when he relaxes the rest of the world gets stressed. Or something.

  10. AbyssUK says:

    I find half a bottle of tequila before a gaming session helps greatly with the “jaggies”. It also helped me complete Rogue Warrior without vomiting…

  11. Jams O'Donnell says:

    Tough on jaggies, tough on the causes of jaggies.

  12. amorpheous says:

    Hm, so enabling FXAA in Skyrim on my HD4850 was actually not doing anything? :(

    • FriendlyFire says:

      FXAA is just a post-processing filter. If a game supports it, then it’ll support it for all machines, regardless of GPU manufacturer.

      The article’s a bit misleading on this: what he meant to say is that FXAA is integrated into Nvidia’s control panel so that you can force it on games that don’t support it. If FXAA is natively supported by your game of choice, then it’ll work everywhere (and most likely better than control panel-based FXAA).

  13. Phinor says:

    For what little effect FXAA has on aliasing, it completely destroys image quality, and I hate even the idea of having such a filter in a game. For now I usually do have a choice, but we’re getting more and more games where the ONLY AA method is FXAA, and they don’t even bother to tell you that their AA is actually a ridiculous blur filter – as was the case with, for example, Mass Effect 3.

    Now, the easy argument is that I could disable the in-game AA and enable proper AA through, for example, CCC, but that simply doesn’t always work. With ME3, I forced 8x MSAA through CCC but nothing happened in the game, so I ended up fighting the issue for 30 minutes with Google and CCC. So by all means add FXAA as an option in your game, but please provide actual AA as an option too!

    The worrying thing is we are getting more and more game engines that don’t actually support MSAA at all, not even by forcing it through drivers. Instead, we are getting more and more support for that blur filter. I understand why some people prefer the filter: games run a lot better and it makes games look more like, well, console games with their lower resolution and shoddy image quality. Actually some people even prefer having font smoothing on in Windows and I’m fine with that even though I’d never do it myself. But goddamn, give me the option to choose!

    edit: And just like that, the post below mine is from someone who loves FXAA, and that’s great. Why not offer us both satisfaction in the form of a choice?

    • fish99 says:

      Agree about FXAA, it just looks like a blur filter to me. I tried it in Skyrim and it made the whole game blurry, like when your LCD is upscaling a lower-than-native res. Ugly and IMO worse than no AA at all.

  14. stillwater says:

    The recent trend of FXAA is a dream come true for me. It simultaneously (almost) solves three annoyances I used to have with many games:

    -massive performance hit of AA
    -AA that smooths some surfaces but ignores others
    -the unnatural ultra-sharpness of objects in most games.

    No one ever mentions the last one, which baffles me. For me, it’s right up there with stiff facial animations as one of the areas that is instantly recognisable as unrealistic and is most in need of improvement.

    • Zyrusticae says:

      Absolutely agreed!

      I actually like the slight (and it’s only slight) blurring effect of the post-processing AA solutions. Games as it is are unnaturally sharp in many situations, and this can take me out of it very quickly. The slight blurring helps sell the idea that I’m looking at the world through a camera or a pair of eyes instead of a rasterizer.

    • DrGonzo says:

      FXAA in Skyrim is wonderful, I find. No performance loss, or at least no noticeable one, and it makes the game look rather nice on my TV. But in BF3 it looks horrendous and blurry – can anyone enlighten us as to why that is?

  15. The Sombrero Kid says:

    As a developer I’m just not interested in implementing anti-aliasing. To my mind it’s supersampling or nothing; everything else gets in the way and is rendered redundant by higher-resolution displays.

  16. airtekh says:

    I have never ever been able to tell the difference between AA and non-AA gameplay.

    Because of this, I always turn it off when the game gives me the option, so I can benefit from the extra performance.

  17. Jabberslops says:

    Anti-aliasing has become less necessary over the years. Increasing monitor pixel density and higher resolutions are “quickly” making anti-aliasing obsolete.

    Although my monitor is not the best (an Asus VH226H – it was the best deal for the money I had available to replace my CRT), I find that I don’t need AA in most games at 1920×1080, and prefer to use higher graphics settings, assuming I can maintain a 50+fps average in multiplayer games on my aging computer (Q6600 OC’d to 3GHz and a GTX 460 1GB). I tend to play multiplayer games with the graphics on low for max fps, or as high as I can go before I dip below 50fps.

    Singleplayer fps is not as important to me as the eye candy. I can usually tolerate 30fps minimum as long as the game looks great. Skyrim for example is usually tolerable with lower fps, but with the recent increase in performance with the newest patches I average about 45fps in most areas with all settings maxed at 1920×1080.

    I also usually skip maxing out anisotropic filtering in multiplayer games, preferring instead to set it to 4x, since in most games it doesn’t seem to improve the textures enough to be worth the potential fps drop.

    • The Sombrero Kid says:

      There’s no perceptible overhead to using anisotropic filtering, and it doesn’t reduce visual quality like AA can; you should always max it out.

    • Jamesworkshop says:

      I find AA less important as motion blur and HDR post-processing already soften the image enough as it is.

      • TechnicalBen says:

        Cannot play with motion blur on. Messes up my vision. :P
        Post-processing though, like DOF, is wonderful!

  18. Zaxwerks says:

    I can’t stand jaggies – they taunt me from the screen!!!

    I have to have AA enabled to be at one with a game. For some reason nothing pulls me out of my immersion in a game like stepped jaggies shouting “LOOK AT US, WE’RE UNALIGNED PIXELS!!! YOU’RE NOT CONTROLLING A REAL PERSON!!!!” at me; the fact that I am somehow able to spontaneously hover 20 meters above the ground and control a bunch of Space Marines with a sodding massive mobile arrow seems perfectly acceptable and not worrying in the slightest.

    I sit 3 feet away from my 24″ 1920×1200 monitor and the jaggies are still very noticeable and distracting, even at native resolution. Perhaps it’s because I am slightly red/green colourblind, so the subtle colour blurring is not noticeable to me, but apart from minute text, as has already been mentioned, I don’t notice a blurring of the textures – just lovely smooth natural edges to things… smooth like a shaved chihuahua that has then been laminated… it’s an image that works for me, your mileage with it may vary…

  19. DickSocrates says:

    I never turn any of it on. I’m not fussed by seeing some edges; in fact I prefer the crisp look to the dulled look. I’m more interested in playing the games than stroking my e-peen. However, I’m more interested in stroking my real peen than playing games, though I’ll often combine the two, to save time.

    When games start looking exactly like real life, then I’ll start worrying about stuff that doesn’t happen in real life, like jaggies.

  20. Wut The Melon says:

    I’m getting slightly confused here – I get the impression some of you commenters confuse (MS)AA for FXAA or other post-processing solutions as discussed in the article.

    MSAA and other forms of real-time AA do take quite a performance hit and, at a high pixel density, aren’t very noticeable in some games, but they don’t ‘blur’ edges or reduce image quality. It’s called multi-sampling (MSAA) or super-sampling (SSAA) for a reason: it renders (parts of) the image at a higher resolution and then downscales it to your actual resolution. If anything, this increases sharpness and image quality (real-life digital photography works pretty similarly, actually).

    FXAA, MLAA and other post-processing solutions are intelligently applied blur filters, so those will in fact decrease image quality – though especially with the latter, or SMAA, the lack of jaggies more than makes up for the hardly noticeable decrease in image quality.

    …or that’s how I think it is, anyway.
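    That render-big-then-downscale description can be sanity-checked with a toy numpy sketch (purely illustrative): a hard diagonal edge rendered at 2x and box-filtered down gains partial-coverage greys along the stairsteps, which is all the ‘smoothing’ amounts to.

```python
# Supersampling as described: "render" at 2x, average 2x2 blocks down.
# Toy numpy illustration only.
import numpy as np

N = 8  # final image is N x N; we render at 2N x 2N
ys, xs = np.mgrid[0:2 * N, 0:2 * N]
hires = (xs > ys).astype(float)   # a hard diagonal edge, no AA

# Box-filter downsample: each output pixel is the mean of a 2x2 block.
lores = hires.reshape(N, 2, N, 2).mean(axis=(1, 3))

print(np.unique(hires))  # [0. 1.]       - hi-res image is pure black/white
print(np.unique(lores))  # [0. 0.25 1.]  - partial-coverage greys line the edge
```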

    • Jamesworkshop says:

      FXAA is compatible with SSAA – the transparency supersampling (TR SSAA) discussed below only applies to transparent surfaces, so FXAA catches the aliasing it misses.

      http://hardocp.com/article/2011/11/28/skyrim_amd_crossfirex_performance_iq_review/3
      “Now, on the bottom right image we have 8X MSAA + 8X TR SSAA + FXAA enabled, and now this texture has its aliasing reduced thanks to FXAA being enabled. This is why turning on FXAA alongside 8X TR SSAA helps reduce aliasing in the game. You get the sub-pixel accuracy of 8X TR SSAA plus the reduction of specular aliasing with FXAA. It is the best of all worlds, all aliasing is reduced with these things enabled.

      Smoothing pixels, or reducing aliasing, may give the impression that the image is ‘blurry’, but in reality the texture is exactly the same. The only difference is the reduction of aliasing. So we think this is what people are noticing with FXAA. We think the reduction in specular aliasing is making it appear areas are more ‘blurry’, when in reality all it is doing is reducing the aliasing.”

  21. Jamesworkshop says:

    I’ve never really cared for AA or even V-sync

    MSAA is fine, but developers have started using deferred rendering pipelines where MSAA is not easy to implement, especially in a multiplatform environment, as the consoles are very outdated and most games don’t use DX11 – and those that do often do little with it.

    frame-rate limiting

    They have had it for a while (not sure why it’s not an option in the driver control panel), but frame-rate limiting is the first thing I look for in .ini and .cfg type files for games, so it’s a very welcome addition to be able to lock to 60fps no matter the game.

    I was glad both the Diablo 3 beta and the fully patched Trine 2 allowed you to set frame rate as a slider from inside the game.

    My computer beeps horribly at me if the frame rate goes too high, which often happens at 2D menu screens and loading bars.

    • royale says:

      Why do you dislike V-sync?

      • Jamesworkshop says:

        Well, adaptive V-sync does solve the frame-dropping issue that V-sync has. Really, what it comes down to is input lag, and the fact that screen tearing just doesn’t bother me.

  22. Drinking with Skeletons says:

    I understand what AA does and value it highly, myself. However, why would anyone value Anisotropic Filtering above it? I’ve never been able to figure out exactly what AF does. I get the vague sense that a game looks slightly better with it cranked up, but it’s always hard to tell and mostly negligible. Maxing it out always seems to kill my performance without any appreciable boost, so what’s the point?

    • Jamesworkshop says:

      AF is a form of texture filtering

      http://en.wikipedia.org/wiki/Anisotropic_filtering

      I only avoid it in Crysis where POM does a much nicer job

      • Stellar Duck says:

        I haven’t played Crysis for a good while so what on earth is POM?

        As for AF, I usually crank it up as high as I can in what looks like the best way. I never did manage to find out what the various names of it mean though.

    • eclipse mattaru says:

      AF makes textures sharper when they’re at an angle to the camera (which is pretty much all the time in 3D games). It’s actually very easy to see in action: just look at a tiled floor or a brick wall; the lower the AF setting, the closer to you the tiles/bricks will start turning into a blurry mess (the subway level of the first Max Payne, for instance, is a great place for this test).

      By the way, the performance hit is completely unnoticeable (and this is said by someone who has been struggling with framerates for years), so why it isn’t always on by default is beyond me.
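      Mechanically, the idea is roughly this – a crude sketch of my own that follows the concept, not any driver’s actual implementation: instead of one footprint-sized (and therefore blurry) texture fetch, take several fetches spread along the long axis of the pixel’s footprint and average them.

```python
# Crude sketch of the idea behind anisotropic filtering: average several
# texture taps along the elongated axis of the pixel's footprint, instead
# of one blurry footprint-sized tap. Illustrative only.
import numpy as np

def sample_bilinear(tex, u, v):
    """Bilinear fetch from a single-channel texture, uv in [0, 1], clamped."""
    h, w = tex.shape
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0, x0] * (1 - fx) + tex[y0, x1] * fx
    bot = tex[y1, x0] * (1 - fx) + tex[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def sample_aniso(tex, u, v, du, dv, taps=8):
    """Average `taps` fetches along the footprint's major axis (du, dv)."""
    ts = (np.arange(taps) + 0.5) / taps - 0.5   # offsets in [-0.5, 0.5)
    return np.mean([sample_bilinear(tex, u + t * du, v + t * dv) for t in ts])

# A checkerboard "floor" seen at a grazing angle: the pixel's footprint is
# stretched along v, so we spread the taps along that axis.
tex = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
print(sample_aniso(tex, 0.3, 0.7, du=0.0, dv=0.2))  # ~0.5: detail averaged,
                                                    # not aliased to one texel
```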

  23. fooga44 says:

    The solution is higher resolutions: the higher your resolution, the less you need AA. You usually only need AA on big monitors when you’re running at less than 1280×1024.

    If you still need to run AA at 1920×1200 or above, there is probably something wrong with you. A great test game (if you have a large monitor) is Supreme Commander 2: it’s filled with jaggies at, say, 1024×768 – then flip to 1920×1200 and the need for AA goes away.

    • Grayvern says:

      Not necessarily: ME3 looks like complete arse at 1920×1200 on mine, even with developer-supported AA. I had to use Nvidia Inspector and some guides, and it looked so much better with a small amount of multisampling and sparse-grid supersampling.

      Then again, I do sit about 2 feet away from my 24″ monitor.

  24. arkestry says:

    This is one of the more troubling comment threads I’ve ever read on here. I can understand not knowing the difference between the various types of AA, but not understanding what it does at all? Or not knowing what AF does? Guys, there are like… a single-digit number of important graphics settings. For something you would only ever run across while already on a computer, why wouldn’t you just look it up the first time you saw it? I’m not very technically inclined or anything.

    And if I’m surprised there are people who don’t use AA, I’m astounded there are people who claim not to like it. Are you conflating not having the hardware to run something well with hating it? HD textures take a hit on performance too, but certainly you don’t dislike them or think they’re unnecessary just because you can’t use them, right? Obviously it’s better to choose whatever settings give you a smooth framerate, no one advocating AA is disagreeing with that… it’s something extra if you have the specs for it. And I don’t myself, or not for maxing it out on titles from the last couple years at least.

    And claims about post-processing AA making things very blurry are greatly exaggerated… they surely can do so on small text, but if there’s noticeable actual blurriness, and not just anti-aliasing, I think ur doin it wrong and have your graphics card and the in-game settings working at odds or something. A few differently worded image searches didn’t reveal comparisons I would agree are unduly blurring the image, but maybe I’ve just never seen it. Pics or it didn’t happen. If that is happening, of course I wouldn’t recommend something that’s putting out a blurry result on your setup, but – and I’m conscious of how condescending this sounds, and of course I hate when people respond to my internet complainings like this when I know what I’m talking about – are ya sure ya don’t hate overuse of bloom and depth of field? There seems to be confusion in at least some corners about what various effects do, so that seems not out of the question.

  25. outoffeelinsobad says:

    That is a sharp-dressed man.

    • outoffeelinsobad says:

      I like how he un-buttons two of the buttons on his sleeve to show that it’s probably bespoke. Well played, Kenny.

  26. Special Agent Brown says:

    Was expecting Saxophone Hero or something.

  27. Arachnyd says:

    MLAA pretty much saved Dead Space 2 for me. That game has unbelievable jaggies without any AA, and for whatever reason traditional AA solutions nuked my framerate. MLAA got rid of the jaggies at a *slight* expense of some blurriness around edges, but almost zero FPS hit.

  28. pawel86ck says:

    FXAA does blur textures and text, but not as much as I thought it would. You need a game with very sharp textures to notice, and then it looks like a lower aniso setting on ground textures in the distance.
    Image quality is still good, with no performance hit, so I use it in Crysis 2/GTA 4/Dead Space.
    But I have to say the FXAA in Alan Wake/Battlefield 3 is much blurrier than plain FXAA from the NV control panel – don’t know why; it looks like they aren’t the same FXAA.

    I tried the SMAA injector in Crysis 2; it misses some important lines – trees/vegetation in the distance, for example – so there it looks like no AA. FXAA is better.