NVidia Apologises For Crummy Tomb Raider Performance

By John Walker on March 7th, 2013 at 11:00 am.

My hair!

NVidia have written a little apology note to all suffering with Tomb Raider graphics issues. Although I’ve yet to receive chocolates. I mentioned in yesterday’s Tomb Raider review that I had some issues with running the game on prettier graphics, and it seems I’m not alone. Apart from the silly hair mode reducing NVidia cards to jelly, I had peculiar problems with the OSD occasionally causing the game to judder, and couldn’t play above the normal settings. Extraordinarily, as spotted by Joystiq, this is because for some reason NVidia didn’t receive final code of the game until the weekend before release, so didn’t have a chance to create an update to accommodate it all.

They weren’t the only ones. For reasons unclear, PC code of the game wasn’t made available to reviewers before release either, with an endlessly slipping promise explained by the developers’ “tweaking until the last minute”. However, it seems that it would have made sense to ensure NVidia had access, whether they were worrying over details or not. It’s also not known at this point whether AMD – they behind the TressFX hair daftness – were given some sort of priority, since all appears to work very well with their tech. Games have long picked one side or the other to befriend, but it seems openly detrimental to put one of the two producers of your customers’ graphics cards at a disadvantage.

A statement left as a comment on the NVidia site reads:

“We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.”

So the take-home message here for NVidia users is to keep a close eye on their pages to look out for the release of the next driver, and make sure you keep your game patched to the eyeballs for when CD/Nixxes releases an update.




85 Comments

  1. SirKicksalot says:

    Looks like it’s missing graphical effects when played in fullscreen on a wide range of cards: http://steamcommunity.com/app/203160/discussions/0/846947231090179422/

    • Caiman says:

      This is the inevitable result of Crysis 3 having used ALL THE GRAPHICS. Now there aren’t enough left to go around.

    • Buemba says:

      I think I prefer the look of the game without these. Not a fan of the way games try to emulate camera effects (and shoddy cameramen, since they seem hell-bent on getting as much dust, water and lens flare as they can in every frame).

      I was actually impressed when I noticed the game wasn’t obscuring my vision with strawberry jelly whenever Lara got hurt. Shame it’s a bug.

      • TheManko says:

        I’m all for maintaining the artist’s intent, so I hope they fix this in a patch. Maybe add an on/off switch for those post effects, for the people who always complain about them on forums and such. They’ve been complaining ever since bloom got introduced in Prince of Persia: Sands of Time and haven’t shut up since!

      • buxcador says:

        Camera effects? I call it camera flaws.

        I hate how precious graphics resources are wasted making the image worse.

  2. Prime says:

    I’ve always found it odd that publishers knowingly release games that don’t work with the latest drivers. I mean surely they should be coding for stable, working drivers or else how are they managing to test their game?

    Why should we be waiting for driver updates at all? Optimisations, fine, but when issues are as serious as this then what is the publisher doing releasing a game that doesn’t work properly?

    Anyone have an answer to this?

    • RakeShark says:

      I’m not sure but this is my theory:

      The major manufacturers have development drivers they sell to developers, with extra options available like, say, triple tessellation if the developer wants to really work with that. These devkits are very stable and show off everything both the drivers and the hardware can do, but the problem is they cannot be copy-pasta’ed into game code, be it UE3, Source, or CryEngine. This is mostly because every game tends to rewrite how the engine refers to the drivers, either out of feature priority or optimization. Therefore developers tend to cobble together a jury-rigged driver and send it to the manufacturers saying “This goes here, this goes there, this isn’t working, can you fix it in two days?” This happens throughout development, and gets really busy towards the end, to the point that multiple driver candidates are being tossed around trying to squash one graphics bug. At release, the driver used to optimize the game is so disjointed that the manufacturer often needs a few iterations of drivers to fix the reference code and communication to a point where it won’t break everything else a user has.

      That’s probably why an established driver works well with previous releases, but is shit for a new release. Conversely, it’s also probably why a driver released specifically for one game works alright, but utterly breaks a few others.

      I could be wrong though, but I think this is a reasonable generalization to keep in mind.

    • Lars Westergren says:

      If a game studio releases a game with all sorts of new graphical bells and whistles, they get an edge over the competition. If a graphics card manufacturer can boast that their cards can display new graphical bells and whistles of currently hot game X, they get an edge over *their* competition.

      So there is an incentive for companies to cooperate on pushing the limit, but of course when you are on the bleeding edge, sometimes a bug knifes you. Or something.

  3. mrd says:

    Wait. NVidia are saying that the final release was less stable on their hardware than a pre-release version?

    Doesn’t something smell a tad whiffy about that?

    • Fox89 says:

      Not at all, all it means is that the code has changed in the meantime, and this can cause compatibility issues. I’ve tested online games and sometimes, upon getting a new build, you have to file bugs like “3D models no longer display on Firefox”. So it’s not like CD have intentionally sabotaged Nvidia or anything.

      It is a strange situation that Nvidia had to wait so long for final code though. Maybe that was one of AMD’s terms in exchange for their work on TressFX? We’ll never know… I doubt it will be long before a new Nvidia driver comes out to fix a lot of it anyway. And my GTX 680 is running it just fine.

      • Baines says:

        When the first news about TressFX hit, I figured Nvidia cards might have problems with Tomb Raider. It just seems to be standard nature for games that push a particular card-maker’s features.

    • JabbleWok says:

      It looks like another consequence of “release on this date” instead of “release when ready”.

    • Jonesy says:

      At least it doesn’t sound as bad as Rage was for AMD users. That one straight up sold me on an Nvidia card for my next upgrade.

  4. abandonhope says:

    The graphics card wars are a lot like the fast food wars, except in the end you don’t get to eat a lot of Taco Bell.

  5. Rich says:

    Not that I really wish this crap on anybody, but it’s nice that games can come out with bugs specific to NVidia cards, rather than AMD ones, for a change.

    • rei says:

      Kinda thought the same, with the same caveat.

    • Jason Moyer says:

      I agree. In fact, it’s been two years since War In The North came out, I’ve since upgraded from a 5770 to a 7870, and the game is *still* unplayable on AMD hardware.

    • sonofsanta says:

      Aye, I had the same thought; usually it’s nVidia tempting developers and leaving AMD playing catch up.

      What this really underlines, once again, is that you should Never Ever Ever get a game on release day. Wait a fortnight for this stuff to get sorted.

    • Sparkasaurusmex says:

      +1

    • montorsi says:

      Yes, it’s rather wonderful that a game developer left a hardware manufacturer in the dark until the last minute. We need more of this!

      • Ragnar says:

        We clearly don’t, but it also happens all the time with Nvidia’s “The Way It’s Meant To Be Played” games which perform much better on Nvidia hardware because Nvidia works together with the dev while AMD gets the code at the last minute. Now AMD did the same thing back to Nvidia.

        It’s very much an “eye for an eye” sort of thing, but maybe Nvidia will now realize that these “hardware exclusives” are pretty shitty and only end up hurting customers, and thus come to an agreement with AMD to not do so in the future? Well, we can dream, at least.

        • KenTWOu says:

          Also, nVidia should stop using proprietary tech and improve its OpenCL/DirectCompute (and therefore TressFX) support. ’Cause the 600 series sucks in this regard.

    • Urfin says:

      Yeah, the actual state of hardware support is the only reason I’m still buying nvidia. But really, I wish all these fuckers with their unending platform lock-in attempts would just die horribly.

  6. Morlock says:

    If this apology had come a few days earlier it would have been a pretty useful warning.

  7. Fox89 says:

    If you are experiencing problems on your Geforce card: the first things to try are disabling Exclusive Fullscreen and Tessellation. Doing that fixes a bunch of crashes, at least on the 600 series.

    • abandonhope says:

      If you’re experiencing problems with TressFX, do not sing the following song at your monitor. It doesn’t help.

    • Spacecraft says:

      This is truth. Tessellation is the only thing causing me to crash. Without it I’m running the game at 70+ average FPS with the TressFX hair. Dual overclocked 670s.

  8. PC-GAMER-4LIFE says:

    Get used to this crap. AMD are the ones dominating sponsorship of PC games now with their Gaming Evolved program, so they will lock out Nvidia GPUs. Nvidia appear to have retired their TWIMTBP program, which used to provide decent drivers for the latest PC games.

    • DarkLiberator says:

      Nvidia hasn’t retired it, they’re just not as active. I believe Witcher 3 and ArmA 3 are Nvidia titles. Also, it’s okay for Nvidia to sponsor games but not AMD? (Ironic, since their Gaming Evolved titles are all great ports, like Hitman, Deus Ex, Sleeping Dogs, Tomb Raider, and Bioshock soon.) Personally I think both companies should keep their noses out.

      • PC-GAMER-4LIFE says:

        Effectively they have, as their arrogant CEO decided Tegra was the way forward & who needs to spend $10M every year on the 250+ engineers they used to employ to QA the latest PC games in the TWIMTBP program! Now Nvidia have lost all the next-gen console GPU contracts & also sidelined the TWIMTBP program, meanwhile AMD are stealing market share & winning customers back. Nvidia have decided F2P garbage is the best thing to bundle with their future GPUs.

        • Sparkasaurusmex says:

          What you are calling a “program” was just a marketing slogan

          • Ragnar says:

            Nah, it was a program. It ensured that Nvidia worked with the devs to have the best performance, while locking out AMD, and thus ensuring that Nvidia hardware would perform better. And now we saw AMD do the same thing back at Nvidia.

            If corporations were people, Nvidia might at this point have thought to itself, “Wow, now that I see how it feels, and how it hurts consumers, I realize that I’ve been a huge dick all this time with that TWIMTBP program.”

  9. xellfish says:

    Did I step into some kind of interdimensional portal or something? In what weird dimension would that be NVidia’s fault? That’s like having Mozilla make a public apology because their browser isn’t rendering my page properly, on the grounds that they didn’t get a final build of my website before release and couldn’t release an update for their browser.
    How about fixing the effin’ game?

    • Hoaxfish says:

      It’s something I don’t really get either, but NVidia and AMD seem to put out a lot of patches to their drivers for specific games, when I would assume that people make games to match the standards (a la OpenGL/DirectX) that the cards support.

      • OrangyTang says:

        Speaking from experience it’s entirely possible that the game *is* written to the standard, but for the particular edge case that the game hits the nVidia drivers don’t implement the standard correctly. Which would be why it works fine on ATi cards.

        It would explain why nVidia are complaining about not having received “the latest code” from the developers – nVidia haven’t fixed the problem because they didn’t know exactly what they needed to fix yet.

        Although the phrase “these issues cannot be completely resolved by an NVIDIA driver” would imply that there’s still other bugs in the game code which may be making things less clear.

        • marach says:

          No it implies NV limits their max (non-graphics) calculation speed on consumer cards. They can’t fix that with a driver since it’s burnt into the chip during board creation.

      • Shivoa says:

        Games are coded to standards. When the driver doesn’t know what game is running, it does a reasonable approximation of complying with those standards. Running an optimised game involves looking at things like a flush command which, if ignored, gives you 10% better rendering performance, because the dev who added it only considered a subset of hardware configs and in some cases it hurts rather than helps. Add in throwing away triangles you know won’t reach the visible screen, tweaks to use less detail when it’ll only get thrown away later in the pipe, and a whole host of other things (such as rewriting a shader to be better optimised for your architecture while producing the same output, then detecting it and hot-swapping it when the game loads). Drivers are not doing what the game asks; they are doing the least work they possibly can to render something almost identical to what a standards-compliant rendering of each frame would look like.

        Also, modern GPUs (like their CPU brethren) clock dynamically, and different sections are under different loads, so re-ordering commands to best use all the resources while staying under the thermal/voltage limits is at play too; even where GPUs are out-of-order designs, the game can be detected and work re-ordered on the CPU, at the driver level, before the GPU picks it up. There are many complex things going on: games are coded to be as fast as possible, drivers work to optimise away as much as they can get away with, and in the middle the standards are there, kinda being applied by both sides. It’s no surprise that things sometimes go wrong for big games, which will influence future purchasing decisions when benchmarked, so it’s important for every manufacturer to show the best numbers they possibly can as soon as possible. Code on both sides of the table (dev and driver teams) could be slightly off and causing issues, and it sounds like nVidia are expecting both their driver optimisation and CD’s code to need another pass before they have a stable and fast implementation.
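
        To make the detect-and-hot-swap idea concrete, here is a rough, purely illustrative sketch in C++ of a driver recognising a known executable plus a known shader and quietly substituting its own rewritten version. Every name in it (the exe name, the fingerprint, the shader text) is made up for illustration; this is the shape of the idea, not how any real driver is actually implemented.

        // Illustrative sketch only: a driver hot-swapping a recognised game's shader
        // for a hand-optimised equivalent. All names and shaders are hypothetical.
        #include <iostream>
        #include <map>
        #include <string>

        // Hand-tuned replacements keyed by a fingerprint of the original shader source.
        static const std::map<std::string, std::string> kOptimisedShaders = {
            {"hair_pass_v1", "/* rewritten shader tuned for this GPU architecture */"},
        };

        // Called when the game hands the driver a shader to compile.
        std::string CompileShader(const std::string& appName,
                                  const std::string& source,
                                  const std::string& fingerprint) {
            // Only swap when we recognise both the running application and the exact
            // shader; otherwise fall back to compiling what the game actually asked for.
            if (appName == "TombRaider.exe") {
                auto it = kOptimisedShaders.find(fingerprint);
                if (it != kOptimisedShaders.end()) {
                    std::cout << "Hot-swapping shader " << fingerprint << "\n";
                    return it->second;  // the driver's optimised stand-in
                }
            }
            return source;  // standards-compliant path: compile the original as-is
        }

        int main() {
            // The game submits its shader; the driver quietly substitutes its own version.
            CompileShader("TombRaider.exe", "/* original hair shader */", "hair_pass_v1");
        }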

  10. DickSocrates says:

    Out of the 4 cards I’ve ever owned, only one has been non-Nvidia. Current card is Nvidia, yet I still think this serves them right. All the stunts they’ve pulled, especially the proprietary PhysX CRAP. They hardly have grounds for complaint here.

    I would say only the consumer suffers, but to call low performance in your video game “suffering” would be foolish.

  11. RogerMellie says:

    Not sure on the ins and outs of this, but I appreciate the sentiment behind NVidia’s post. Too many times companies employ head-in-the-sand tactics, so let’s give some credit to a company being open.

  12. Ninja Foodstuff says:

    Not that it helps anyone who is having issues of course, but I haven’t experienced any problems with an EVGA GTX 670 with the “ultimate” preset. Is this only affecting specific cards?

    EDIT: Having just looked at the steam thread someone linked above, I see that much of what’s being discussed are missing camera effects, which it’s entirely possible I just didn’t notice were missing.

  13. Diziet Sma says:

    * if (tl;dr) { what xellfish said.;} *

    Is this really NVidia’s apology to make? Surely Square should be apologising. I mean the game is clearly heavily optimised for ATI, but surely the onus is on the developer, not the graphics card manufacturer, to ensure good performance and to communicate with NVidia if there are performance issues. Unless NVidia ignored the bug reports as the game was pushing ATI as a brand.

    I’m an ATI owner so I’m pretty blown away by the graphics performance in the new TR, but I remember the problems with Rage on ATI. I never blamed ATI for them, but id Software.

  14. jimbonbon says:

    The issues are pretty substantial – running at Ultra settings (which includes tessellation) I am getting complete driver/GPU crashes resulting in a need to hard reset the whole system. I will be waiting for the next set of drivers and a patch before playing more, even though it seems I can avoid the issue by turning off some features.

    Also interesting to note that the game doesn’t work at all for me with the EVGA Precision OSD enabled – on launch I just get a big black window.

    What I have played of the game so far though, is awesome. Not quite as ‘interactive’ as you would hope but a really awesome and cinematic experience, especially in Surround.

    If AMD did get preferential access to code I wouldn’t be too surprised. It’s not very ethical and sucks for customers of the other vendor, but I have no doubt that the same has happened for NVIDIA before now.

  15. KodiaKKoolaid says:

    I have a gtx 560ti with newest beta drivers and the game runs flawlessly with tessellation/ssao on @ high settings.

    Perhaps this only affects higher-end cards.

  16. gruia says:

    never again Nvidia. this was the last straw :( ruined my purchase.

  17. squinky says:

    The hair situation bogs me down. I’ve a laptop with an AMD 6990M 2gb in it, and with the fancy hair situation it runs quite poorly, 20-30fps; with the hair off it runs flawlessly, I couldn’t say the framerate but I expect it’s at or above 60. Is the fancy hair supposed to cause so drastic a drop in performance? I thought it was supposed to just ‘work’ on AMD cards.

    • bstard says:

      ” I’ve a laptop “

    • Damn Skippy says:

      I’ve got the 6990M 2GB as well in my lappy, and yes, turning on the hair crap drops my framerate like a rock. I basically turned off that and tessellation and it’s running really well now (tessellation seems to slow me down more than it’s worth in most games, so I generally turn that off anyway). I don’t think I’m getting 60FPS out of it, though, what are your other graphics settings? I mean, the card is about equivalent to a 6850, so while high end for a laptop it’s really just midrange, so I couldn’t expect to max things out for long, but really the portability and flexibility make it a worthwhile tradeoff for me.

  18. DMStern says:

    Remember when game-specific tweaks in drivers was considered cheating? Now it seems that’s all driver updates are.

  19. Tei says:

    We probably often see errors in games that are caused by poorly programmed drivers, and we never see an apology from the makers of the video card. In these instances we always blame the game devs.

    So it’s somewhat weird to get an apology here from the video card maker, for a game that apparently is not as smooth as it should be on this particular hardware. Is this an admission of error from NVidia? It doesn’t sound like one. It sounds like taking the blame for something they never did.

    It’s cool, I suppose.

    Are the new consoles ATI or NVidia? Maybe devs will change their love interest now. Most games used to be made for NVidia first and ATI second; maybe things could change.

    • Hahaha says:

      I believe they are both using ATI

    • Moraven says:

      PS4 and Wii U are AMD.

      New Xbox is not known yet.

      • theleif says:

        Isn’t it “confirmed” that the new XBOX will use a 6670-based AMD GPU?

      • Wedge says:

        It’s all but confirmed as AMD, both processor and GPU. It’s kind of laughable because the PS4 and new Xbox thing are barely distinguishable in terms of hardware.

        • Sparkasaurusmex says:

          Almost laughable, but then really sad when you realize that despite being nearly identical they will be proprietary and can’t play the same games, and players can’t play against each other.

          • admiraltaftbar says:

            To be fair, Sony has been very open about cross-platform play. In fact I can definitely see Planetside 2 releasing on the PS4 and being able to play with the PC version of the game. The only reason the Xbox and PS4 won’t be able to play against each other is because Microsoft is very adamant about keeping their online service a walled garden. It’s easier for them to justify charging owners a premium to use it when they make it out to be something exclusive and good.

  20. simulant says:

    Jeeez…. And I was impressed that it runs very well (>=40 FPS) on a single 550TI at High settings (1280×1024), minus TressFX which appears overrated anyway. The game looks fantastic.

  21. Low Life says:

    Been playing this on my laptop (with a GeForce 640M LE), runs just fine at 60+ FPS at almost all times on medium settings, but the Shantytown section really destroys my framerate. Even on the lowest settings that part brings my FPS down to around 30.

    Apart from that, I can’t really complain about the performance.

  22. Zaxwerks says:

    “nnnnwwwwwaaaaaaaaarrrrrrrrrr…nvidia!”

  23. jrpatton says:

    For once I pick the right time to buy the console version instead of the PC version. It’s rare, but it happens.

  24. Sc0r says:

    Obvious AMD contracts are obvious

  25. fish99 says:

    The crashing is the developer’s fault, not Nvidia’s. Nvidia drivers are stable with everything I’ve got installed on my PC, close to a hundred games, most of which were written before the 600 series even existed, so there’s nothing wrong with the drivers themselves. A little bit of QA from the developers/publisher could have easily avoided this.

    TBH though, when you hear CD only gave Nvidia the game 2 days before release, there’s a niggling suspicion that the devs deliberately sabotaged the game on Nvidia cards. Why else would you leave it that late?

    Also, are we now living in a world where you need to install new drivers for every new game release? I doubt the average PC gamer updates their display drivers more than a couple of times per year.

  26. paddymaxson says:

    I was about to say “I’m glad they’re taking their time”. But they’re not, still kind of are, but with something as wonderful as TWD Game, I really want to see them do season 2 right. A game of this much importance (and it is important in a number of ways to the games industry) shouldn’t be getting yearly sequels.

    That said, If it’s just 1 episode at a time again, then I think this autumn is far enough away for 2-4 hours of game to be made in.

  27. Zitacos says:

    Not quite sure what the big deal is. I’m running old 300 series drivers on a GTX 580 because I’m lazy. Set the preset to ultimate, and turned off tessellation, high precision, and of course TressFX. Near solid 60fps at 2560×1440 with a few drops into the 50s. If I drop it down to 1080p it’s fine.

    None of those features make a drastic difference in visuals, and they’re all huge performance sinks. Game seems fine to me. Unless you’re benching with Titans or a Crossfire setup, it seems like no big deal. Wait for a driver patch; it’s still very playable.

  28. Shih Tzu says:

    So the take-home message here for NVidia users is to keep a close eye on their pages to look out for the release of the next driver, and make sure you keep your game patched to the eyeballs for when CD/Nixxes releases an update.

    Or, like me, decide not to purchase until hearing confirmation that the situation has improved. I’d been considering taking the plunge this weekend or next (after the more-positive-than-I’d-expected RPS WIT along with other reviews), but if it’s not ready yet, I don’t suppose I’ll bother. It’s not like I’ve gotten anywhere in Dishonored yet, anyway.

  29. Fallward says:

    Wow, I’m actually very impressed. Usually it is AMD with driver issues (in fact, 90% of the time it is), which is why my current card is nVidia (and I will never go back to AMD), yet on this one rare occasion when they don’t have drivers up to scratch they APOLOGISE! AMD, take note. This is how you keep customers.

  30. Kein says:

    *pets his granny GTX260*

    I’m schorry, what are these ultra setting thou is speaking of?

  31. CommanderJ says:

    Having a look at Square’s support forums, it’s painfully clear that this isn’t nVidia’s fault. Both console versions have rampant game-stopping bugs, save-corrupting bugs, etc., which leave people unable to progress or even launch the game. Clearly this game was pushed out the door without being anywhere near finished. It’s not often a game is shipped to consoles with multiple showstopping bugs, methinks.

    There was a PC patch which ruined everything instead of fixing anything; it was pulled and all references to it removed. Does not bode well… In any case I’d imagine they are working their asses off to make the game work at all on consoles; enabling the pretties for Nvidia cards is probably a distant second.

