Hard Choices: The Week in Tech

By Jeremy Laird on December 8th, 2012 at 5:00 pm.

The pitfalls of high-performance PC graphics. Pitfalls. Geddit? Sigh.

Graphics, graphics, graphics. It’s all you lot care about. Actually, it’s what I care about most when it comes to PC performance. So why fight it? Instead, I’ve got a couple of graphics-related titbits for you this week. Firstly, I’ve had a chat with Intel’s graphics guru, Richard Huddy. Odds are, you’ll be gaming on Intel graphics one day. What’s more, the mere fact that Intel has snapped up the likes of Mr Huddy, previously known for his dev-rel uberness at ATI, when there was an ATI, is symptomatic of Intel’s increasingly full-on attitude to graphics. The other part of this week’s awfully exciting package is NVIDIA’s new GeForce Experience. It’s an automated game settings optimisation tool. The idea is to take the headache out of graphics settings and give you the holy grail of PC performance and visuals with console levels of setup pain, which is to say zero pain.

First to Intel and its integrated graphics. We’ve previously covered the fact that Intel has been revving up its processor graphics cores in recent generations and that next year’s Haswell chips will take that even further. But just how serious is Intel about bona fide high-performance gaming? And can it really deliver?

On the hardware side, I don’t doubt that Intel can wheel out big improvements in raw graphics hardware capability, generation upon generation. It’s doing that already. In recent and coming generations of Intel CPUs, most of any given increase in transistor budget has been spent on graphics power, not on beefing up existing CPU cores or adding new ones.

It’s the software, silly

What I’m not so sure about is Intel’s ability to get a grip on the software side of the equation. “We’ve come a long way in the last three years,” says Huddy. “The difference now is that we’re absolutely focussed on delivering a complete solution of hardware and software.”

Of course. But how, pray tell, will that actually be delivered? After all, Intel’s track record for graphics drivers isn’t exactly stellar. Hell, AMD gets a kicking for its graphics drivers and, for most of human history, they’ve been on a different planet from Intel’s drivers.

A big part of the solution is embodied by Huddy himself. He heads up a team of engineers working with game developers across Europe. He’s been around the block doing the same job for ATI and AMD. And he knows what he’s doing. “Regular driver releases are critical,” he says, “as are improved responsiveness to game developers and quick turnarounds on driver bugs.”

Huddy serves up an example in which developer Avalanche was wowed by a demo of Just Cause 2 running on Intel integrated graphics and is now keen to work more closely with Intel. But if I’m honest, I don’t get the impression that a huge amount of per-title optimisation has been done so far.


3x Richard Huddy. From his ATI days. Because it needs to be 600 pixels wide

But then Huddy has only been with Intel since the beginning of the year. Anywho, this is not immediately dramatic news. But given the trouble AMD is in, it’s worth knowing that Intel is making all the right noises about driver quality and developer relations and that it’s got the right people working on the job. Frankly, it’s a good sign that it wants to get the message across to you guys in an interview.

I’ll be intrigued to see whether Haswell chips really do deliver a half-decent gaming experience. In the meantime, it would be interesting to hear if any of you lot do significant gaming on Intel graphics. If you’ve got feedback, particularly regarding driver quality and consistency, let it rip. There’s a good chance Intel will be watching. You never know, one day integrated graphics might not suck.

It’s an experience, alright

This week’s second hot topic is GeForce Experience. As I said above, the idea is PC performance combined with console ease of use. Functionally, that means two things: automatic optimisation of game settings, and driver management.

On paper, it’s a bloody good idea. Personally, I get no joy out of mucking about with graphics settings. The idea that somebody has done all the work for me sounds just peachy.

NVIDIA says playing at ridiculous non-native resolutions and craptastic graphics settings is surprisingly common because people simply install a game and fire it up without touching the settings.

I doubt too many RPSers fall into that camp. But I bet plenty of you would like a helping hand with achieving the best settings. So how’s it done? Much of it is in-house testing at NVIDIA across what it claims are thousands of hardware configurations. There’s also an element of user feedback and crowdsourcing, so it won’t just be NVIDIA decreeing settings. Gamers will inform the process.


GeForce Experience doing its one-click optimisation shizzle

It’s worth noting that both image quality and frame rates are targeted and the latter varies according to game type. In other words, twitchy shooters are optimised for higher frame rates than brainy strategy games. Oh, and it only works with NVIDIA graphics cards in Fermi and Kepler flavours, which for the most part means GeForce 400, 500 and 600 Series boards.
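
If you’re curious what that boils down to in practice, here’s my guess at the shape of the thing as a Python sketch. To be clear, this is the general idea only, not NVIDIA’s actual code, and the genre targets and field names are invented:

```python
# A guess at the shape of the problem, not NVIDIA's actual implementation.
# settings_db maps (gpu, cpu, resolution, game) to configurations that have
# been benchmarked in-house or reported back by users.

TARGET_FPS = {"shooter": 60, "strategy": 30}   # illustrative genre targets only

def recommend(settings_db, gpu, cpu, resolution, game, genre):
    candidates = settings_db.get((gpu, cpu, resolution, game), [])
    target = TARGET_FPS.get(genre, 45)
    # Of the configurations known to hit the frame-rate target, take the prettiest.
    playable = [c for c in candidates if c["measured_fps"] >= target]
    return max(playable, key=lambda c: c["quality_score"]) if playable else None
```

The clever bit, presumably, is building and maintaining that database of tested configurations, which is exactly where the in-house test rigs and the dollop of crowdsourcing come in.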

Usefully, the interface shows you what settings it’s gone for, so you can also use it as a learning tool. Right now it’s in beta with a limit of 10,000 users and 30 game titles supported. It’s only just been released and I haven’t had a chance to have a go. But if a few RPSers would like to take it for a spin, shout out and I’ll put the request in.

60 Comments

  1. Raiyan 1.0 says:

    It’s kinda appropriate we get our top-end PC hardware news from a guy who gets invited by various manufacturers to test their luxury cars, eh? :)

  2. Cardinal says:

    GeForce Experience let me download and run it without any particular registration. I think it’s unlikely to make a difference to savvy RPS readers, but it was unobtrusive and accurate.

  3. Buemba says:

    I don’t mind automatic optimization as long as the option to tweak certain effects is still there (which is why I’m all for an external program handling this, like the one Nvidia proposed, but am opposed to how Carmack wanted Rage to work). I usually don’t like DoF and motion blur in games, and the ability to turn them off in a shooter is as much a reason for me to play them on PC as mouse control.

    • MattM says:

      Rage was infuriating because it took out all the graphics settings then didn’t properly recognize my video cards. It defaulted to comically bad textures while running at max FPS and barely loading the GPU.

      • Universal Quitter says:

        You know you can force the settings from your graphics drivers, right? I’m assuming you do, but I’ll explain it for anyone that doesn’t. Right-click the desktop. It’s usually the first choice in the list that pops up. Might be last for Intel, I can never remember.

        Anyway, some kind of control or vision center should pop up. 3D application settings. Find your game. This is also a fantastic way to make a game CTD, if you don’t know what you’re doing.

    • Shivoa says:

      Agreed, auto-selected defaults that match my hardware are great (and if you can’t twist the developer’s arm on making that then the GPU vendor is the second best guy to be spending a lot of time thinking about that problem – there are only so many driver hacks and advised settings they can do to get around games not configured to run ideally on specific hardware configurations) but it cannot come at the cost of the user’s ability to configure their system as they desire. Simple things like some visual settings being marmite effects, or users who sit close to large screens needing a lot more AA to get the accuracy on an equivalent eye-arc level. Even users who would much rather sit around 30fps, or those who would crank back something to give them 60 (but do they want to crank down an effect or an antialiasing method?)

      I have on occasion used GeForce.com, which is where nVidia have previously published all this data about running different games with different hardware they make and finding what they consider the optimal playable settings. They even surface the results of this testing for 3D Vision titles by saying what effects/settings need to be disabled/turned down to avoid stereoscopic artefacts. It is certainly no bad thing to have an app out there to let all users get quick access to that and even pump the settings directly into the games if they have open config formats and the user is informed the program is going to be messing about with their config files (and making a backup so they can revert any changes).
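
      For a game with an open, INI-style config, “pumping the settings in” could be as simple as something along these lines (file name, section and keys all made up for the example, and note the backup so the user can revert):

```python
import configparser
import shutil

def apply_settings(path, new_settings, section="Display"):
    shutil.copy(path, path + ".backup")        # keep a backup so changes can be reverted
    cfg = configparser.ConfigParser()
    cfg.read(path)
    if not cfg.has_section(section):
        cfg.add_section(section)
    for key, value in new_settings.items():
        cfg.set(section, key, str(value))      # write each recommended setting
    with open(path, "w") as handle:
        cfg.write(handle)

# e.g. apply_settings("game.ini", {"msaa": 4, "shadowquality": "high"})
```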

      Hopefully Intel are going to be moving forward (although how much of this is driven by Apple and needing integrated solutions to drive 4K screens and do a decent job of quickly compositing the effects-heavy modern UI + browser render is another question, as that work will come to a conclusion well before major discrete graphics are challenged) as good drivers are an important part of the equation (drivers + AMD giving compelling super-cheapo mobile graphics solutions is what prevented my last laptop purchase being an Intel model). Intel are already a step ahead of the pack on Linux with their decision that they’re far enough behind to embrace an open driver (rather than shipping out their closed binary blob which the community can’t fix, while the community tries to build an open-source alternative that is unlikely to ever hit the performance and feature level of the manufacturer-crafted solution).

      I’m not a big fan of forcing users to buy lots of GPU transistors with their CPU, as the promise of using that block to do serious compute tasks (beyond possibly video encoding) doesn’t seem to be there yet; but then I’m unlikely to give much less than 100W to a dedicated GPU part, so obviously my choice of graphics rendering won’t have serious competition from a CPU whose main job is good branch-prediction, out-of-order (OoO), general-purpose performance with a GPU stuck on the side. Hopefully the option for GPU-less parts will become more attractive in future generations (either correctly priced to account for the saved transistors, or using roughly the same die size to offer hexacore for near to quad-core prices if you buy without a GPU).

    • MiniMatt says:

      Agreed – auto settings, no matter how good, cannot be a replacement for user tweaks. As already noted so many features are marmite settings and individual users will always want the ability to, eg, dump HDR and depth-of-field because they prefer to spend their “fps quota” on beautiful shadows.

      What I always did find useful were the tweakguides (which I think Nvidia have snapped up?) which gave rough approximations of how taxing individual features were so you could save a few FPS by dumping a particularly expensive sparkly which doesn’t float your boat and “spend” those FPS on something shiny like more anti-aliasing.

      • MattM says:

        Yeah the tweakguides.com guy has done some of his tweak guides for Nvidia. I really like them and even if you have an AMD card they are still useful. Nvidia actually has some pretty nice stuff on their site for a moderate number of games.

  4. dancingcrab says:

    I downloaded it as soon as I read about it, mostly out of curiosity. The changes it makes are sensible, although I’d tweaked my Skyrim settings to accommodate the slowdown in Markarth, which clearly NVIDIA had not, at least for my rig, because it boosted everything to Ultra MAX again. Which is fine for the rest of the game… just bloody Markarth that chugs.

    Interested to see how the BF3 settings work out – it mostly pumped them up, but I’ve struggled to maintain even FPS in that game.

    • Screamer says:

      Wasn’t that fixed in a patch? Markarth runs like vaseline on butter after patch 1.6, I think.

  5. Radiant says:

    Embarrassingly enough I don’t know what half these acronyms mean when I’m setting up the gfx on something like say… farcry 3.

    MSAA? Isn’t that immune to antibiotics?

    The best one is SSAO. Which I assume is setting off global thermal nuclear war somewhere as it does nothing noticeable in game. [Farcry actually has all sorts of wonderful letters I can select instead of zaggassao or whatever it is].

    Halp.

    • Caenorhabditis says:

      Hah! I’m just the same. Although I have been gaming for quite a number of years, I really don’t know the stuff either. At least some settings go from off to 2x, 4x etc. so they make sense. But Far Cry seems to know all the letters of the alphabet, giving me no fighting chance.

    • MattM says:

      MSAA is Multi Sampling Anti-Aliasing. Anti-Aliasing reduces the jaggy stair step you see on the edges of objects in games.
      SSAO is Screen Space Ambient Occlusion. Ambient Occlusion darkens cracks and corners of objects with diffuse shadows and makes the world seem more real.
      A lot of games don’t give much info with their settings screens but just googling the acronyms will pull up some pretty helpful pages where all the types of DOF, AA, AO, and AF are explained.
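
      If code helps, here’s a deliberately crude sketch of the SSAO idea: a pixel gets darker the more of its on-screen neighbours sit in front of it. Real engines sample in 3D, randomise the sample kernel and blur the result, but the principle is the same (this is an illustration, not how any shipping engine does it):

```python
import numpy as np

def toy_ssao(depth, radius=2, strength=1.0):
    """Crude screen-space ambient occlusion over a 2D depth buffer
    (smaller depth = closer to the camera)."""
    occlusion = np.zeros_like(depth, dtype=float)
    samples = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            samples += 1
            neighbour = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
            occlusion += (neighbour < depth - 0.01)  # a closer neighbour blocks ambient light
    ao = 1.0 - strength * occlusion / samples        # 1 = fully lit, 0 = fully shadowed
    return np.clip(ao, 0.0, 1.0)

# Multiply the ambient lighting term by toy_ssao(depth_buffer), per pixel.
```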

      • PopeRatzo says:

        OK, I’ve got SSAO, but that setting has two other choices, “HDAO” and “HBAO”. What the hell are those?

        They all sound like psychoactive pharmaceuticals to me.

        • Conor says:

          HBAO is Horizon Based Ambient Occlusion, which is a more accurate/realistic/something method when compared to SSAO. HDAO I have no clue, sadly.

          • Sakkura says:

            Google says High Definition Ambient Occlusion or Highland Dancers Association of Ontario.

          • Radiant says:

            Ah dancers!
            So that’s who I shot.

    • onestepfromlost says:

      I know, why can’t we just have “pretty” and “not so pretty but fast”?

      • Ich Will says:

        Because people with the other brand of graphics card or a different model etc would complain that the choices didn’t make any difference to them as it has been calibrated to turn off stuff their card handles with ease but not stuff their card struggles with.

        • subedii says:

          There’s also the fact that people in general have different aesthetic preferences on what does or doesn’t make the game appear “better” to them.

          Bloom or no Bloom? How strongly is it used in the title to begin with?

          Motion Blur? How much? Would it depend on what type of motion blur the game implements?

          How high an anti-aliasing setting do you really need? Can you crank it down a fair amount and still not notice? And again, what type?

          Are you going to crank the whole thing down a few notches because you really prefer a solid 60 FPS? Or would you prefer to push everything as far to the MAX as you can but run it at 30FPS with the occasional dip? Or somewhere in-between because there’s certain visual effects you just don’t want to compromise on?

          I realise it’s not for everyone, but I really like having those kinds of options. It’s one of the things that irritated me when Crysis 2 came out and they replaced everything with three completely obtuse graphics settings. And then when people found a workaround via the command console, they closed access to the command console. And only relented after a silly amount of complaining on their forums.

    • Poliphilo says:

      Really the only acronym you absolutely need to know is: DoF for Depth of Field. Because this is something you simply must turn off at all times. The same goes for Motion Blur but that’s not usually abbreviated. Other settings like anti aliasing and ambient occlusion you could always just tweak according to the performance you’re getting.

      The only possible use for DoF I can think of would be in a game where the player actually controls an actual in-game handheld camera (like wot tapes things and TV and stuff). Game developers have no excuse for still using heavy DoF on default settings. It doesn’t simulate anything at all, it’s just an utterly useless pseudo-cinematic effect.

      • Tuga says:

        It’s actually quite useful in the Total War games, as it has the effect of blurring out the heavily LoD scaled background stuff, sprite trees, sprite armies, and such, making the low-quality artifacts appear less low-quality than they are. Since the player is basically a cinematic camera in a Total War game anyway, this does not bother me at all. (unlike things like lens flare in an FPS)

      • Caiman says:

        And this is why we need options, because I like DoF where it’s used well. I’m glad your eyes have an infinite depth of field, because mine sure don’t, and when implemented properly it adds effective realism. The most obvious example is looking down the barrel of a gun, which gets used a lot these days. Of course, it’s not always used well and in that case it gets turned off. I’m trying to remember the name of a third-person game where this was the case, but I’ve blanked it from my memory using electrotherapy.

        • Aninhumer says:

          It’s not that our eyes have infinite depth of field, it’s that our brains compensate to give that effect most of the time. In real life, when I look around, everything I look at is in focus. But in a game with DoF, when I look around, most things are out of focus, unless I only ever stare down my crosshairs. If the game could actually follow my eyes and make sure what I’m looking at on screen is in focus, maybe it would be a nice effect, but without that, it’s just making most of the screen blurry.

          Looking down the barrel of a gun is probably one of the only situations it makes any sense.

      • MattM says:

        Yeah, I usually turn off DoF too; since games can’t track your eyes, they don’t really know where the focus should be. There are some games that make good use of it though. Witcher 2 used it only in directed cut-scenes where it made more sense.

    • Rinimand says:

      Here is a good site to explain Ambient Occlusion (for those who asked what SSAO is).
      http://www.independentdeveloper.com/archive/2007/11/27/what_is_ambient_occlusion
      I knew the basics (real life light reflection?), but this explains it much better than I could have. And the picture says a thousand words.

  6. povu says:

    Anyone remember GTA 4 and how it thought it knew better than you what your PC was capable of, and limited your ability to change settings? :P

    Damn silly, that was. But an optional thing like GeForce Experience, which will hopefully be a lot better than that, is a cool idea.

  7. Baines says:

    If you can’t run a game at max settings, it can be pretty frustrating to try to work out what settings actually work for you. Particularly when you can only test some things at certain parts of the game, and heavens forbid the ever annoying “Must restart game” changes.

    Will reducing grass draw distance help that slowdown? Or should I disable dynamic lighting? Do I really need FSAA? I understand it isn’t the most efficient option anyway. Oh, wait, the game doesn’t give me a choice to disable it, so it doesn’t matter. Is it the particle effects? No, maybe it is the shadows, I’ll try reducing that to “medium”. Wait, what *is* a medium shadow anyway? Hrm, that seems to work. *10 minutes later* Sigh, now I’m facing more enemies and it’s slowing down again. Wait, forums say most people having problems should just adjust this particular value, but the game doesn’t give me that… Sigh, where’s the config file? Why wouldn’t they put such a useful adjustment in the game itself where people could actually find it on their own? They took the time to make these less useful options available in-game.

    • Ich Will says:

      You’re describing my first 10 hours in Skyrim!

    • webwielder says:

      Yup. And this is why consoles make more sense for a lot of people. Although that’s negated by the fact that despite having a fixed hardware specification to code and design for, a lot of games on consoles still have poor frame rates and slowdown.

      • subedii says:

        Blighttown says hi.

      • Baines says:

        I like the ability to change settings. I just wish games did a better job of presenting that ability. Most are really bad at it.

    • Stochastic says:

      I highly recommend visiting Tweakguides.com as a resource for graphics settings. The website creator, Koroush Ghazi, now works for Nvidia and publishes graphics settings guides for popular games. Usually he’ll point out the 1 or 2 settings you’ll want to turn down to get a massive framerate boost at minimal expense of image quality.

    • Low Life says:

      One thing I liked about Far Cry 3’s menu system was that it rendered the game scene in the background, so I just went into a low framerate area, popped framerate display on (using MSI Afterburner) and started fiddling with the settings. The effect on framerate was visible right away, so it was fairly quick to find which settings had the biggest effect.

      That doesn’t eliminate the problem of variable FPS in different locations, but at least it makes changing the settings more comfortable.

    • Solskin says:

      Hehe, exactly!

      I liked how Max Payne 3 did it, where it told you how much GFX RAM you had and how much the current setting used. I don’t know if that’s all that matters, but it did give me something to go after.

    • grundus says:

      You know, I went for months trundling along in the Project CARS alpha at a rock steady 38fps or so. The other day I realised my card should be capable of way more than that, so I had a look at the settings and I was running not one, not two but THREE types of AA! 8x MSAA and then two post-processing filters, SM and FX I think they are. 8x MSAA is generally overkill in my opinion on its own, so I turned the other two types off and the game looked exactly the same… Except way smoother in motion because it was pretty much 60fps at all times. Perfect.

      That seriously honestly took me about 7 months to do. The weird thing is with any other game I get the settings dialled in first, I suppose the difference is I couldn’t just stick everything on max.

  8. Turk Anjaydee says:

    People have different opinions on how much fps they’d want from their games, which makes me think this kind of software will never be a big thing. Not to mention different people prefer to disable different things first. PCs have never been about ease of use. The flaming a studio gets for not letting users customize graphics settings individually is often quite big. While profilers are nice, I don’t see them as very newsworthy.

    I’d prefer if they’d just concentrate on screaming at the devs to make every graphical setting available at least through an .ini file. Then they could have a forum where people could trade settings files. That’s quite popular in Borderlands as it is.

  9. Scratches Beard With Pipe Stem says:

    Why are there compression artifacts in that Pitfall image? Is it my graphics card?

  10. Wixard says:

    Intel does graphics like Cyrix did CPUs, and even at that Cyrix did a good deal better.

    They don’t have the culture or engineering expertise. They didn’t back in the 90s and they still don’t. Let’s not forget their stellar CPUs prop up very shoddy IGPs. Slap that HD XXXX on an AMD and watch it all tank.

    I think it’s too late for anyone to challenge Intel in the CPU world, and too late for anyone to challenge AMD/Nvidia. The designs are complex and the software just the same. They doubled and tripled their efforts a few years back, and barely managed to get in the same solar system as Nvidia.

    The only thing that has kept Intel in the conversation at all is game consoles slowing it all down. Were it not for that, they would be as unimportant as they ever were. But as it is, there’s a static line of performance that’s “good enough”. At least until the new consoles come out.

    Just my opinion, as uninformed as it is.

    • SuicideKing says:

      I think you’re probably correct; I don’t see integrated graphics delivering the same performance as a discrete card any time soon.

      There’s a reason that a graphics card has its own separate GPU, RAM and controller. It does specialized stuff that requires the kind of memory bandwidth you simply can’t deliver with regular system RAM, unless DDR4 somehow matches the bandwidth of GDDR5.
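
      Some rough, back-of-envelope numbers using 2012-era parts as illustrations (peak theoretical figures, so approximate):

```python
# Peak theoretical memory bandwidth, roughly, for two example configurations.
ddr3_dual_channel = 1866e6 * 8 * 2        # DDR3-1866: transfers/s x 8 bytes x 2 channels
gddr5_gtx680      = 6008e6 * (256 // 8)   # GTX 680: ~6GT/s effective on a 256-bit bus
print(ddr3_dual_channel / 1e9)            # ~29.9 GB/s
print(gddr5_gtx680 / 1e9)                 # ~192 GB/s
```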

    • Nate says:

      Yeah, I agree, Intel isn’t going to beat the current graphics kings, but it’s not all about features and framerate. Cost, heat, space, noise are all large concerns for a part of the market that ATI and Nvidia aren’t paying a lot of attention to.

      At the same time, I think we’re approaching the limits of what we want graphics cards to be capable of. Keep in mind that when 3d acceleration began, we were aiming for getting 30fps at 640×480, with fewer than 1000 triangles, and maybe four screen passes worth of fillrate (not to mention texture size). With the current top-of-the-line graphics hardware, we’re talking about doing anything your game wants to do, at framerates approaching the refresh rate of your monitor, at native resolution, with anti-aliasing. With enough memory for textures appropriate to that level of resolution and AA. Taking advantage of more than that means ever larger art budgets for game developers, or else just lazier engine coding.

      I doubt that Intel wants to get into the high end graphics market, because, as you said, that’s a tough nut to crack at this point. But Intel definitely could get into the good-enough graphics market, and now is a great time to do that, because good-enough isn’t going to change nearly as fast as it has over the last fifteen years.

  11. SuicideKing says:

    Integrated graphics doesn’t really suck with AMD’s Trinity chips, especially the A10-5800K coupled to DDR3-1866 or DDR3-2133 RAM.

    And I’m surprised: you posted about that Intel rumor that they’re going to ditch sockets, but didn’t post about the company’s clarification statement?

    http://www.maximumpc.com/article/news/intel_says_company_committed_sockets2012

    • Jeremy Laird says:

      Given that the general tone of our contribution to the subject might perhaps be précised simply as “chillax”, I don’t think it’s worth a separate post. Moreover, Intel’s use of the term “foreseeable future” essentially leaves them scope to ditch LGA as they see fit.

      If they’d said “at least X years” or “at least the next X CPU generations” then you’d have something to hold them to. As it is, they can drop LGA as soon as, say, next year and dismiss any objections by simply stating that they hadn’t foreseen the need.

      That said, thanks for the heads up. I will put an update in the story with Intel’s statement.

      • SuicideKing says:

        True, everyone’s been jumping up and down about the “foreseeable future” part… I’m not quite sure they could have said “X number of years” because of the way the tech landscape changes. I mean, we still make fun of them for saying things like “10GHz chips by 2010”.

        I somehow don’t see much in all this, really. It doesn’t make a whole lot of sense economically for the mainboard vendors and OEMs. Neither does it make sense for enthusiasts, power users, gamers etc. The rest of the people won’t really care, but I doubt it’ll be viable for Intel to do that throughout the market segments. They’ll probably distract themselves from their main strength anyway (making CPUs).

        I don’t think anything’s going to change dramatically until the current known tick-tock cycle is over, i.e. Skylake/Skymont in 2015/2016. That, I think, is the limit of the “foreseeable future”.

        Also note that Broadwell = Haswell’s die shrink on the same socket, so I think that’s that.

  12. BulletSpongeRTR says:

    GeForce Experience is best used by those gamers with 400 and 500 series cards. I have a 670 and it recommended the same settings I already use for my games: Ultra.

  13. Xzi says:

    This seems like a grand waste of time and effort. We’re lucky to get five PC-specific video options out of most games these days. Which take a grand total of 30 seconds to set on your own.

    The one exception of course being games like Skyrim, but in that case, you’re going to be spending a lot more time modding to make it good than you will be tweaking settings anyway.

  14. Caiman says:

    This is a welcome initiative, frankly, but I agree with others that it shouldn’t come at the expense of user intervention for those that want it, or to work around poorly-implemented examples. Whenever I go and visit my folks, who are both retired and have bought themselves the greatest gaming rigs known to man, I die a little seeing them playing Skyrim at 800×600 with half the graphics turned down. They say they’re not bothered by it, but I know they appreciate the difference when I tweak it for them.

  15. MikoSquiz says:

    GeForce Experience is just annoying. It recommends one of two things: everything to max (on games where I already have everything set to max), or everything to low with a switch to the highest possible resolution (on games where I have everything set on high and a correspondingly lower resolution to prevent it chugging).

    Oh, and it insists that I need to stop running TF2 in a borderless window. I don’t think there’s anything I could do to make TF2 drop below about 50fps, but I refuse to run it full-screen; alt-tabbing makes it unstable as all hell.

  16. The Random One says:

    So was I the only one who went cross-eyed at the Richard Huddy x3 picture?

    Damn, my graphics card has 3D support, I don’t need to go cross-eyed. Fetch me my blinder glasses.

  17. BrightCandle says:

    I had a go with GeForce Experience on my dual GTX 680s. In all the games it recognised, it recommended settings higher than what I had set specifically for acceptable performance (never below 40fps, mostly 60). While their settings look better, the settings they provided for PlanetSide 2, for example, would produce 24fps in heavy battles, which is unplayable.

    It’s a great idea, but they need to get realistic about the settings they use and pick something that will produce a good gaming experience all the time. The last thing I need is for my FPS to plummet just as the action kicks off; at that moment I don’t need pretty, I need to aim and shoot.

  18. Initialised says:

    What I’d like to see in a modern game engine (I think it was in Giants: Citizen Kabuto) is an option for dynamic image quality based on a user-definable minimum frame rate. When you are running above this it turns up the image quality; when you’re below, it turns it down.
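
    Conceptually it’s nothing more than something like this (names and thresholds invented, obviously, and you’d call it once a second or so rather than every frame):

```python
MIN_FPS = 40      # the user-definable floor
MAX_QUALITY = 10  # 0 = ugly but fast, 10 = everything turned up

def adjust_quality(measured_fps, quality):
    """Nudge a single quality level so the frame rate stays above the floor."""
    if measured_fps < MIN_FPS and quality > 0:
        quality -= 1                       # below the floor: drop a notch
    elif measured_fps > MIN_FPS * 1.25 and quality < MAX_QUALITY:
        quality += 1                       # comfortably above it: claw detail back
    return quality
```

    The margin on the way back up is there so the setting doesn’t flip-flop every time the frame rate hovers around the floor.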

  19. Koozer says:

    I wish in-game graphics options were clearer on what they did. How hard can tooltips be, to let me know the difference between MXAA and FXAA or whatever the hell AA is called these days? And what does turning shadows down from ‘high’ to ‘low’ actually do, without me having to just try it and restart my game repeatedly? Lower their resolution? Use a different rendering method entirely? Does it reduce load on my GPU, or my CPU, or both? Pah, I say.

    • sophof says:

      I’ve never understood why there isn’t an ‘automatic’ setting. Surely it must be possible to make a minute-long ‘example’ and automatically set all the options based on what the user wants. So let’s say I want my native resolution and FPS to never drop below 30. It would then adjust everything based on that. It is not that far away from an ordinary timedemo, just with less effort for the user.

      Good settings are extremely important to the player’s experience, but they never appear to put much effort in it.
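
      It’s basically a one-off calibration pass: run the canned demo at each preset and keep the prettiest one that still hits the target. Something like this sketch, where benchmark() stands in for whatever timedemo the game provides:

```python
PRESETS = ["low", "medium", "high", "ultra"]   # cheapest first

def pick_preset(benchmark, target_fps=30):
    """benchmark(preset) runs the canned demo and returns its average FPS."""
    chosen = PRESETS[0]
    for preset in PRESETS:
        if benchmark(preset) >= target_fps:
            chosen = preset        # still hitting the target, keep climbing
        else:
            break
    return chosen
```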

  20. sophof says:

    That ‘GeForce Experience’ sounds pretty handy actually. With enough crowdsourcing I’m sure you can pick out a subset of three or so settings, and that’s more or less the only real choice one needs to make. It also means that if it is not working for you, you know you probably need to fix something.

  21. jrodman says:

    IMO, rock solid hardware implementation and open source graphics infrastructure would do a better job.

    Really, I suspect that’s the truth.
    However, it won’t maximize profits, so I don’t expect it any time in the next few decades.
