Nvidia And AMD Butt Heads Over Watch_Dogs, GameWorks

By Nathan Grayson on May 30th, 2014 at 11:00 am.

Nvidia and AMD aren’t friends. Over the years, their game of one-upmanship has evolved into a full-on war, with proprietary tech and buzzwords whizzing every which way through the open air. The latest chapter in the ceaseless struggle? A claim from AMD’s Robert Hallock that Nvidia’s GameWorks program – used prominently by Ubisoft in Watch_Dogs, among others – represents “a clear and present threat” to PC gaming. According to Hallock, participating in Nvidia’s program often forces game developers to steer clear of AMD. Nvidia, however, says that allegation couldn’t be further from the truth.

First, here are Hallock’s claims (thanks, Forbes), which make GameWorks sound like the proprietary, competition-stomping boogie man:

“Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code – the most desirable form of optimization.”

“The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines. This change coincides with NVIDIA’s decision to remove all public Direct3D code samples from their site in favor of a ‘contact us for licensing’ page. AMD does not engage in, support, or condone such activities.”

This was especially apparent, er, apparently, with the recent release of Watch_Dogs, which saw some AMD users complaining of inexcusably poor performance.

However, Nvidia director of engineering Cem Cebenoyan – also speaking with Forbes – fired back by saying that his company doesn’t force anybody into confining agreements, and evidence of that is clear as day.

“I’ve heard that before from AMD and it’s a little mysterious to me. We don’t and we never have restricted anyone from getting access as part of our agreements. Not with Watch Dogs and not with any other titles.”

“Our agreements focus on interesting things we’re going to do together to improve the experience for all PC gamers and of course for Nvidia customers. We don’t have anything in there restricting anyone from accessing source code or binaries. Developers are free to give builds out to whoever they want. It’s their product.”

Of course, “interesting things” does leave a little wiggle room for arbitrary restrictions that could – in a roundabout way – affect game code, though not in the basic sense of restricting direct access to it.

It’s a confusing topic, and neither side seems willing to give ground. I’ve mailed a few developers who’ve been involved with both companies to see if I can find out what sorts of restrictions they encountered. Hopefully I’ll hear back from someone, though non-disclosure agreements are often a thing. Oh well. Even then, I have my ways.*

*Frowning really big and going, “Pleeeeeeeeeeeeeease,” mostly. Also anonymous tips.


100 Comments

  1. DarkLiberator says:

    http://recyclingewaste.files.wordpress.com/2011/10/nvidia-vs-ati.jpg

    Let’s just say both sides have a valid point. I think locking out a certain set of gamers is bad for everybody. However, that doesn’t mean Nvidia should just start giving their tech away, obviously. Competition in the end is what keeps this race going.

  2. FurryLippedSquid says:

    What’s with this underscore business?

    • Tams80 says:

      All Elite_Hackers are known to use underscores. It signifies a more classy type of hacker than those squirts who substitute letters for numbers.

      • DanMan says:

        No, they’re just known not to use spaces (ugly at the command line and stuff). But.dots.are.also.very.popular.

      • FurryLippedSquid says:

        Oh.

        Thanks.

      • Sleepy Will says:

        cL@$5Y? wH@7 0n 3@Rth @R3 y0u 7Rying 70 s@y? 7HIs Is 7h3 8357 W@Y 70 570p g0v3Rnm3n7s FR0M FiL73R@n@ly$i$ 0f My 3MaIL5, 7H@7 @ND my 7in f0iL h@t 0f C0uR$3, 5o tH3Y C@N’7 u$3 7H3Ir 570L3n @LI3n t3Ch 70 r3@d mY MiNd. I kNow! Y35 5ir, I Know.

        • DanMan says:

          Kudos, if you actually did type it out like that. I can’t even read it.

          Disclaimer: Of course I could. But why would I?

          • Sleepy Will says:

            I used an automated translator, so no kudos for me! It’s not worth reading.

          • Faxmachinen says:

            |͟00|̷̢   ͣ.ͭ  џΘ∪, ├┤ə×∅г:  Ʌ ϼДϮђΞт1¢ ©ʁзΔ†цяʓ   ͦ ᶠ ϻ£a₸ αﬡԃ ЪºԓЄ

          • Tagert says:

            P4nt1ng 4nd sw34t1ng 4s y0u run thr0ugh my c0rr1d0rs.

            P.S. Marry me. <3

          • hamburger_cheesedoodle says:

            For that reference? I do. :]

          • Uboa Noticed You says:

            BAD HOMESTUCK VIBES GO AWAY
            Also thank you for the SS2 reference u3u

        • Joe Deadman says:

          I’m kind of terrified by the fact that I can actually read that fairly well.

          Internet what have you done to me?!?

      • po says:

        If you’re a programmer it allows you to make the names of your variables or functions more readable, because you can’t have spaces in them. Another alternative would be camelCase. Compare:

        somevariablewithalongdescriptivename
        someVariableWithALongDescriptiveName
        some_variable_with_a_long_descriptive_name
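
        For example, here’s a quick C++ snippet showing the same declaration written in each style (the names are purely illustrative, not from any real codebase):

            // Hypothetical names, just to compare readability of the conventions.
            int main() {
                int somevariablewithalongdescriptivename = 0;          // no separators
                int someVariableWithALongDescriptiveName = 1;          // camelCase
                int some_variable_with_a_long_descriptive_name = 2;    // snake_case (underscores)
                return somevariablewithalongdescriptivename
                     + someVariableWithALongDescriptiveName
                     + some_variable_with_a_long_descriptive_name;
            }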

    • GettCouped says:

      Back in the day, file names and directories couldn’t have spaces in them.

      I think it was pre-Windows 95. But even then it was 8.3 filenames, though that’s the best I could think of.

    • Harlander says:

      This looks like the right place to set up my “pointless bickering about naming conventions” stand!

      *clears throat*

      Underscores, pah! The true elite use lowerCamelCase

    • Michael Fogg says:

      I think we were lucky. Ubi could have called it #watchdogs

  3. DanMan says:

    The third party in this is the developer. Surely they must have been aware of this. I can’t imagine them only play-testing it on NV hardware. I’m not going to use ugly words like “bribes” and the like.

    • TacticalNuclearPenguin says:

      AMD is trying too hard, really; it’s not like they can do much magic even when they have their own killer apps like BF4.

      Even with Mantle it wasn’t that impressive, and I remember some benchmarks where a properly overclocked i7 CPU and a 780 Ti did better overall.

      The reason this latest driver is so “different” is that it takes inspiration from Mantle’s idea (and OpenGL’s, really) of reducing overhead, but it does so in DirectX instead. Sure, maybe not as much, but at least it applies to pretty much any game, even those not openly listed in the driver description.

      All in all it’s yet another free performance bump.

      • Philomelle says:

        I can vouch for the performance bump. I’ve been seeing a 10-20 FPS increase in every game I’ve launched since installing the latest GeForce drivers, using the exact same settings.

        Of course, my experience is not necessarily everyone else’s.

      • Sakkura says:

        Testing Mantle with an overclocked Core i7 is moronic. The whole point of Mantle is to reduce CPU load, and obviously that improvement is going to be least apparent with the most powerful CPU around.

        • TacticalNuclearPenguin says:

          But it still means its “magic” is a mixed bag, especially on a balanced, more realistically-expected-to-hold-a-290X system.

          To notice the maximum effect you need the slowest CPU with the most cores you can find (which is both a weird design AND purchase) paired up with a beastly GPU, but then you might still have issues, let alone the fact that there’s no telling how much adoption Mantle will ever see.

          I mean, there surely are some scenarios in which someone still using a Q series or Phenom processor might get some more mileage, but this is already happening to a smaller extent on Nvidia with DirectX, with the obvious difference that it can be realistically applied to pretty much any game. Well, unless you play Rage, but then OpenGL is optimally supported as well.

          Either way, I’m thankful for this competition or these drivers wouldn’t have happened, probably.

          Random fact: I’ve used both brands, but AMD/ATI always managed to make my life more painful. Back in my 5870 days, games would ignore the software color calibration I’d made in the ATI panel and always take priority. With Nvidia there’s no problem, so I can game as close to “color correct” as I want.

          Little stupid detail? You bet it is, but the thing is that I just pointed out one out of dozens.

          • tetracycloide says:

            It’s only ‘more realistically balanced’ because Mantle wasn’t a thing before. With Mantle or something like it in place you could buy a cheaper CPU and spend the difference on a beefier GPU. The i7 isn’t really a realistic CPU for gaming in any case.

            Have to agree that Nvidia seems to be dramatically better when it comes to tweaking. Basically every time a new game is out and people start posting screenshots to one-up one another, it eventually results in someone with a custom set of Nvidia tweaks blowing everything else away. I still haven’t forgiven AMD for disabling the downsampling workaround. What’s the point of running a powerful GPU at 1080p if you can’t even downsample?

  4. Mokinokaro says:

    Sure, we can trust the makers of Mantle who want to bring back the 3dfx style silliness of the old days.

    Face it, both sides would love to have games only run on their hardware and that’s bad for us as gamers. Thankfully, the developers themselves are far more likely to want to reach the widest audience possible.

    • vorvek says:

      Mantle isn’t tied to the GCN architecture. If NVIDIA doesn’t use Mantle, it’s because they don’t want to, not because they can’t, since the API is (supposedly going to be) open.

      • DanMan says:

        I hear that all too often. You can bet that they developed Mantle to work very closely with their GCN architecture, since even their own non-GCN GPUs aren’t supported. The amount of work NV would have to put into Mantle to make it work with their hardware is anyone’s guess, but it’s probably not worth it. Besides, they’ve already shown that you can gain similar speed improvements on the driver level and by using modern OpenGL techniques.

      • Solidstate89 says:

        Because Mantle is a low-level API, it only works with GCN architecture. The only way nVidia can get Mantle to work on their cards is if they license GCN from AMD.

        • Clavus says:

          From what I read, right now it only supports GCN cards, but in the future, when AMD supposedly releases the Mantle spec, Nvidia should be able to adapt it for their new cards.

          But that spec isn’t public at the moment so nobody knows.

          • DanMan says:

            Because it’s probably not even finalized, and once they release it, they have to stick to it.

          • kaffis says:

            Sure, NVidia could “adapt” their new cards to work with it — by reverse engineering AMD’s architecture, leaving them constantly a step behind because AMD was setting the de facto standard by controlling the API. It would be the GeForce 5xxx series all over again, with ATI’s Radeon 9800 setting the gold standard implementation of DirectX 9, leaving developers ignoring features NVidia was trying to implement (like double precision, which was actually pretty cool but nobody wanted to use it because they’d been coding the engine on 9800s for months or a year by that point).

            I have a really hard time feeling sorry for AMD over NVidia still being able to work out exclusive partnership deals when AMD is intentionally trying to push API-level fragmentation on the PC market. If AMD would learn to write drivers that were worth a damn, the lack of partnership wouldn’t be nearly as big a hurdle for them.

            DirectX 12 can’t come soon enough to put this whole Mantle thing in the ground and let developers focus on making their games, not making their games twice. Or, you know, we could just try to get AMD to care about OpenGL for once in its life instead of deciding it needs to be a special snowflake.

    • Don Reba says:

      According to the GDC presentations, DirectX 12 and Mantle offer the same solutions to the same problems, so it looks like AMD forced Microsoft’s hand.

      And this is great, because the problems were glaring. AAA devs hate programming for the PC, where they are always forced to work around too-clever-by-half drivers. 4A Games programmers say they get twice as much performance out of consoles as out of equally specced PC hardware. DirectX 12 and Mantle address this grievance, giving the devs more direct control over the graphics card.

  5. Screamer says:

    AMD drivers are shit, and always late for new game releases. I don’t think they are even on time for AMD Gaming Evolved titles. NVIDIA on the other hand very often releases drivers before a new game’s release, and I have never ever had to wait a month after release to be able to play a game just because the current drivers are bad.

    AMD just can’t seem to sort it out, and it sounds like they’re just passing the blame here…

    • Horg says:

      On the other hand, all Nvidia drivers since the 314.22 release have caused my PC to lock up when using Firefox in Windows 7. I had to move to AMD for desktop stability of all things, and honestly I haven’t regretted it in the slightest. There isn’t much point in having Nvidia around for pre-release driver support if the new drivers aren’t stable.

      • Artiforg says:

        ^^ This.

        Even with the fix to desktop / Firefox freezing made in v334.67 (or was it v335.23) I still had issues. D3 crashed every 30 minutes due to gfx drivers not responding; initially I thought this was bad code on D3’s part, until I went back to the 314.22 drivers and the D3 crashes went away.

        I no longer trust Nvidia to release a driver version that works on my 560ti and it’s making me think twice about ever buying an Nvidia card again (even though I have experience of awful Catalyst drivers from ATI in the past).

        • Kinch says:

          You can fix the crashes by disabling hardware acceleration in Firefox, which is what I did and have had no crashes ever since. I’m on the latest drivers now and not having any issues (560ti, soon to be replaced anyway because it’s getting old).

          • Artiforg says:

            The v334.xx drivers fixed the crashes in Firefox but they introduced the crashes in D3 (at least for me) and Dota 2 as well. Even if I had disabled hardware acceleration I’d still have issues with D3 crashing.

            Disabling hardware acceleration isn’t a fix, it’s a workaround – something we shouldn’t have to do. The drivers should have been properly tested, and when the community told Nvidia what the issue was (issues with the power saving feature introduced post-314.22) they should have fixed it quickly rather than continuing to release broken drivers.

          • DanMan says:

            It still was a general bug that was causing cards to run at either too high frequency or too high voltage (or whatever, can’t recall atm.). I had the same problems with my GTX460 and it took them a long time to even acknowledge and then months to fix it. That said, I can’t really think of any other BIG problem in the last few years.

          • SkittleDiddler says:

            That hasn’t worked for everyone. In my humble opinion, Nvidia aren’t going to bother fixing this particular problem because it mainly affects users of older GPUs. They’ve shown a very lackadaisical attitude towards it, and that attitude is pushing at least one formerly hardcore Nvidia user (me) towards their competitor’s products.

    • Tams80 says:

      Well if we’re going to use anecdotal evidence…

      I’ve never had issues with official AMD drivers, other than a few beta ones.

      Also, if you believe AMD, their argument is that they are late with drivers because they have to work around Nvidia code, which may also make them less stable as a result.

      • Sakkura says:

        Of course, AMD most likely does the exact same thing to Nvidia. It’s a dirty war.

      • TacticalNuclearPenguin says:

        They’re often late no matter what, especially if you are a Crossfire user. I can’t count how many times I had to use RadeonPro to fix some makeshift game profiles in the driver, which sometimes requires trial and error. It’s really a painful experience and it’s not that great either way, as Crossfire on paper sometimes scales better than SLI, but only because they don’t bother to cull microstuttering as much.

        I mean, why should they? Microstuttering doesn’t appear in normal benchmarks, so why should they care, right? Too bad that at some point frame time benchmarking got invented, AMD got busted, and they tried to pass it off as an inherent flaw with their latest cards that they would fix.

        It wasn’t just their latest cards.

        With SLI it’s usually smooth sailing: you have pretty much either a beta or WHQL driver ready 1-3 days before most releases, situations of negative scaling are more or less nonexistent, microstuttering is better controlled, FPS fluctuations are less insane, and in general all games can be expected to run fine with it.

        Well, since we’re going for anecdotal evidence, this was mine.

    • Sakkura says:

      AMD’s driver with optimization for Watch Dogs was released on the 26th of May. Watch Dogs was released on the 27th of May.

      Watch Dogs is an Nvidia-sponsored game, which means AMD have had limited access to the game for driver development purposes, yet they still released their optimized driver the day before the game.

    • reggiep says:

      I call BS on that. Nvidia has released new products that don’t have official drivers for months. That’s shady business.

      • LVX156 says:

        You wanna talk about shady business? How about nVidia contacting hardware review sites and trying to get them to change their benchmark suites to only include games that were made in collaboration with nVidia, and therefore worked better on nVidia cards? They also hinted that sites who didn’t do this might not get cards to test in the future. That’s why I no longer use their cards.

    • Nenjin says:

      What pisses me off about NVidia now is, every time I update my drivers, they add like 2 more running processes to my computer. I’ve got about 7 threads for Nvidia-related shit now, none of which applies to me. Shadowplay, Nvidia User Experience, all this shit I don’t want, yet when I disable it, Windows starts acting strangely.

      Dear Nvidia: Quit putting shit on my computer when all I want is your fucking drivers. And if you’re going to do that, have the common decency to write it to msconfig where it can easily be disabled, instead of nesting deep within your own BS where I have to spend 30 minutes trying to figure out how to disable this, that and the other thing.

  6. Megakoresh says:

    I do not support locking down APIs; I think open source is the way to go, especially when it comes to graphics. I will however say that AMD have never been even close to Nvidia when it comes to engaging with the gaming industry. They didn’t invest in and develop nearly as much for games as Nvidia did, their drivers have always been late and subpar, Catalyst Control Center is a nightmare everyone knows about, yet they didn’t lift a finger to fix it or make it at least not break everything it touches…

    Nvidia does lock things down, no buzzwords will hide that. They are also more expensive. But they also care more about gaming than AMD does. For me that last bit is most important.

    • battles_atlas says:

      It’s true that the worst experience I’ve had from graphics hardware (aside from the time I bought a machine with an AGP slot which became obsolete overnight thanks to PCI Express) is the onboard gfx on an AMD board which the company decided to abandon all driver support for about 18 months after I bought it. You couldn’t even get the drivers from the AMD site; I had to find them on a third-party one. The problem led to me wasting a day uninstalling Windows 8.1 after it tried to auto-install the correct drivers and ended up ruining the machine’s performance (a media PC that suddenly couldn’t play back 720p). Wouldn’t buy a premium AMD card after that. Though I’m not sure how much of the wider problem is down to Nvidia simply having greater resources thanks to their market share. Either way, none of it looks good for an AMD customer.

      • Megakoresh says:

        Nvidia and AMD were on equal ground before the GTX 100+ series started. I would even go as far as to say AMD had the upper hand due to their lower price and CPU synergy. ATI has been losing ground over recent years and almost all of it is because they simply didn’t give a shit about the user experience with their cards.

        Coming from someone who has owned over a dozen graphics cards (5 of which were ATI), more than 8 gaming laptops (well, 8 tbh, the 9th and 10th ones are hardly “gaming”) and changed 7 PCs over the last 6 years, I can say this: the reason ATI is falling behind is that they didn’t give a damn about gaming and having good support in the beginning. And THAT is what snowballed into them being denied access even when they want to help. Lack of foresight is what’s hurting them.

        Nvidia did not pull any nasty schemes to win this. All they did was simply try to make a better experience for gamers. And what they are doing now is securing the monopoly. Certainly not the most consumer-friendly thing in the world, but in their case they at least truly earned the power they have. Not through trickery and deception, but through hard work and a passion for the gaming industry that AMD always lacked.

  7. IneptFromRussia says:

    Yeah, Nvidia has been playing dirty for a long time now. What’s funny is that people are actually buying into this, thinking Nvidia cards do, in fact, provide better performance and AMD people don’t know what they are doing. It’s sad seeing Nvidia release 10 drivers in a day improving performance for Watch Dogs while AMD struggles to decipher the code and provide at least some driver support. Even though consoles now have AMD hardware in them, I don’t think it’ll make a difference, especially since Unreal Engine 4 is basically Nvidia tech.

    • battles_atlas says:

      But Nvidia cards usually *do* provide better performance. Whether or not that is down to “dirty tricks” is AMD’s problem, not the consumer’s. AMD either needs to compete better, or if Nvidia is really doing stuff that is anti-competitive, AMD should take legal action. Just whining about it in PR statements does nothing to change my decision making on what card to pick.

      • Sakkura says:

        They usually have about the same performance overall. Sometimes one or the other pulls ahead, but it never lasts long.

        And of course, each side has games that work better with their hardware.

      • RvLeshrac says:

        nVidia is doing nothing overtly illegal.

        They’re doing the equivalent of you downing a bottle of laxative and heading to the neighbourhood pool. YOU know you’re going to be taking a massive shit in the pool, and it will RUIN the pool for everyone else for days, but no one can PROVE that you intentionally shit in the pool. From the outside, everyone has to take you at your word that it was an “accident.”

        nVidia has shit in the PC hardware pool, and it will ruin the industry for consumers, but they’re playing this little “we say/they say” game; stupid consumers are lapping it up.

  8. woodsey says:

    In either case, the game seems to run like complete shit for most people.

    • fish99 says:

      Seems OK here (I5-3570K, 660 2GB, 8GB ram). As long as you stick to high textures if your card has under 3GB vram and you don’t go too mental with the AA.

      • woodsey says:

        I’m running it on Medium with everything fancy (beyond FXAA) switched off, and it barely maintains 30. Sapphire R9 270 Dual-X, Core i5 2500 3.3GHz, 8GB DDR3, Crucial M500 SSD.

        I’ve read people with high-end Nvidia cards are experiencing the same issues.

        Also, the in-game volume is bastard low.

        • fish99 says:

          Yeah but are those nvidia high-end cards 2GB and do they have textures on ultra? I imagine a lot of people ignored the tooltip. Or maybe you’re referring to TB and his titans?

          As for your setup, your cpu is a gen older than mine, and while I don’t know the AMD product line-up at all, that card is £109 so maybe you’re expecting too much from it. Doesn’t really matter though, the point is, my hardware is pretty modest by today’s standards and the game runs fine for me.

          • woodsey says:

            No, I’ve seen people say they’ve put stuff down to low and it’s still been way below par.

            And whilst my system is on the lower end, I’ve yet to see a reason why this game of all things is taxing it so extremely. It looks like a pig on Medium and the hacking consists of blowing things up; it doesn’t seem to be running much more of a simulation than any other open-world game.

          • fish99 says:

            It isn’t extremely taxing on my system. I’m getting around the same fps as I get in GTA4, and Watch Dogs has a lot more going on graphically. Clearly the majority of people with Nvidia aren’t running the game on low and getting bad framerates or you’d be hearing about it everywhere. It’s just like with every new game release: you can go to the Steam forums and there will be a thread there with people who can’t even get the game to run. Games sell millions of copies, so there’s always going to be a significant number of people getting bad performance due to… whatever… old or broken drivers, overheating GPU, CPU heatsink clogged up with dust, pirate OS, malware etc… even if those people are <1% of the total.

            Sure the game could use some extra optimization, but how do you reach the conclusion that it's "running like shit for the majority of people"?

          • woodsey says:

            OK, it is running like shit for a seemingly significant number. Seeing as you haven’t had any problems, though, I very much doubt you’ve been keeping up to date on the matter.

          • fish99 says:

            Ok.

            You also have to consider expectations. 40 fps with FSAA might be perfectly fine for me on a 660, but someone who spent £400 on a 780 might think any dip under 60 fps with TXAAx4 is unacceptable. Also, as I said, I reckon there’s a lot of people out there with 2GB cards and textures on ultra.

          • kendoka15 says:

            Look. GTA IV ran like shit and it still runs like shit because it’s an extremely bad port and is known for it. I’ve had the same fps in it with one GTX 580, two GTX 580s, and a 780 Ti.

            And the fact that WD drops to 45-50 with my 780 Ti means there’s something wrong.

          • fish99 says:

            See what I mean?

            And yes I know GTA4 was a bad port, Watch Dogs looks and runs better. I’ve just been playing it for the last hour, it looks great, and I never saw a dip below 40fps even at night, with rain, in the middle of the city.

            (settings – 1080p, 2xMSAA, textures high, MHBAO, high shadows, everything else on full)

  9. Etherealsteel says:

    Wow.. just a lot of AMD hate I’m reading… I’ve used AMD GPUs for several years and really haven’t had a driver problem that I can remember. I don’t update my drivers every time one is released, so maybe that’s why. Performance-wise, whether Nvidia does better really just depends on the game; it’s not every game out there where Nvidia performs better, and for the most part they are equal in performance across a majority of games.

  10. rotkiw says:

    The original story posted by Forbes was NOT fact-checked in any way by the writer (as per his comments on Twitter). Everyone seems to forget the Tomb Raider TressFX day-one drama… Nvidia didn’t bitch, they just cranked out drivers ASAP.

    • HadToLogin says:

      Not really true. When TR turned out to have issues on the newest GeForces, Nvidia did bitch that “SE changed code to fit AMD stuff and forgot to send it to us”.

      • kaffis says:

        And then promptly got to work and fixed their drivers. I remember turning TressFX on, like, day 2 or 3 with a driver patch.

      • kendoka15 says:

        Well, that’s because they did prioritize AMD cards heavily in the coding.
        Normally Nvidia-optimized games at least run well on AMD.

        They’re both doing dirty stuff

      • RvLeshrac says:

        *Exactly*

        In this case, AMD is saying “nVidia refuses to give us the code, and Ubisoft says nVidia has forbidden them from giving us any of their code.”

        In the TressFX case, nVidia said “Square-Enix forgot to give us a new build.” No one was actively preventing anything.

  11. Initialised says:

    Wasn’t there a case recently of NVIDIA calling up loads of system builders and asking (maybe paying) them to slag off AMD drivers and stop selling AMD cards?

    Personally and professionally I’ve encountered more problems with Nvidia than AMD but this may be down to market share.

    Having been on the receiving end of Nvidia’s PR machine after posting some data about how one of their drivers would set fire to the GTX590 if you managed to force it to run SLi mode in Crysis 2, I have my own reasons for preferring to use AMD graphics.

    Anyone else remember Anti Aliasing and PhysX getting locked out in Batman?

    I don’t recall AMD ever using dirty marketing like I’ve seen over the years from NVIDIA.

    I also know that some board partners employ people to continually remind people on forums like this about these fictitious AMD driver problems; it gets prevalent around new GPU releases.

  12. El Goose says:

    Heh, butt heads. Almost sounds like buttheads.

  13. DanMan says:

    The elephant in the room is actually: why do we need game-optimized drivers in the first place? Why don’t games just run as they should from the get go? The APIs are the same for both camps, so why is this even a thing? Are they replacing shaders behind the scenes?

    • FD says:

      If you have in-depth knowledge of an architecture you can do a better job than you can with generic information. This example isn’t GPU-exclusive, but it should be reasonably instructive. One thing that can have a huge impact on performance is the memory mapping scheme, that is, how you map addresses onto the physical memory structures. Your memory has multiple banks which can be accessed in parallel, so you want data to be well distributed across banks so access times can be at least partially overlapped, but you also want to make sure you still have good data-level locality so you can get multiple cache hits from a single cache miss (remember that a cache block typically contains multiple 32/64-bit words).

      Address mapping is a very well-studied problem with a number of good, generalizable solutions; for main memory, page-based interleaving is a very simple approach that gives good performance in a number of applications. However, you can do better: with careful profiling of the application you can often find cases where your address mapping scheme breaks down. This may be due to object sizes, data structure traversal patterns or numerous other factors.

      Modern GPUs (and to a lesser extent modern CPUs) are highly programmable to allow for optimizations like this that can improve performance substantially. Drivers can detect when a program is running and apply a number of application specific optimizations to the GPU to improve performance. Nothing necessarily nefarious going on here.
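
      To make the interleaving idea concrete, here is a minimal C++ sketch; the 64-byte line size and 8-bank count are made-up illustrative numbers, not taken from any real GPU or driver:

          #include <cstdint>
          #include <cstdio>

          constexpr std::uint64_t kLineSize = 64;  // bytes per cache line (assumed typical value)
          constexpr std::uint64_t kNumBanks = 8;   // number of memory banks (also an assumption)

          // Bank index under simple line-granularity interleaving: consecutive
          // cache lines land in consecutive banks, so sequential accesses can overlap.
          std::uint64_t BankOf(std::uint64_t addr) {
              return (addr / kLineSize) % kNumBanks;
          }

          int main() {
              // Sequential accesses spread evenly across the banks...
              for (std::uint64_t i = 0; i < 4; ++i) {
                  std::uint64_t addr = i * kLineSize;
                  std::printf("sequential addr %6llu -> bank %llu\n",
                              static_cast<unsigned long long>(addr),
                              static_cast<unsigned long long>(BankOf(addr)));
              }
              // ...but a stride of kNumBanks * kLineSize hits the same bank every
              // time, serializing the accesses. Profiling a specific application can
              // expose patterns like this, which a tuned mapping scheme can then avoid.
              for (std::uint64_t i = 0; i < 4; ++i) {
                  std::uint64_t addr = i * kNumBanks * kLineSize;
                  std::printf("strided    addr %6llu -> bank %llu\n",
                              static_cast<unsigned long long>(addr),
                              static_cast<unsigned long long>(BankOf(addr)));
              }
              return 0;
          }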

    • TechnicalBen says:

      I’ve not as much knowledge as FD above, but I would say “drivers have access to the chips, which are actually quite programmable (shaders etc.), where the game developers do not”. So new drivers can activate new code (that is, code made after the game comes out) that uses the resources it has to better render the game.

      They can dial back things in one area not used by the game, and up things in another area…

    • DanMan says:

      Interesting. So that’s one place where competition actually works, ’cause if one party is doing this, then the other has to, or their bar charts will be lacking in reviews.

      • jalf says:

        It *can* work, yeah. It can also be misused. (For example, it can misrepresent the overall performance of a card, if they’ve tweaked the driver to be especially fast in the games they *know* reviewers are going to use in benchmarks, even though almost every other game performs worse.)

        In general, of course, there’s no reason why GPU vendors *shouldn’t* try to optimize on every level. If they can make the driver faster across the board, that’s obviously great! But if they can figure out a tweak that makes one specific game run better, why not enable that for that specific game as well?

        • DanMan says:

          What I’m really saying is that the general way to build games should be the fastest already, so driver-level optimizations for that aren’t even necessary.

          • RvLeshrac says:

            In this case, nVidia isn’t just using some deep assembly specific to the GPU. They’re breaking compatibility with DirectX, and refusing to tell anyone except the game developer what they’ve broken. They’ve also forbidden the developer from revealing what they’ve broken to anyone else, as well as forbidding them from revealing what they’ve done to bridge the broken compatibility.

            Old method: Developer wants to make game work faster > developer writes GPU-specific optimisations with help of HW Vendor > Developer does the same thing with the competing HW vendor.

            New method: Developer wants to make the game work faster > HW vendor breaks compatibility with the standard platform and pays developer to get them to sign an NDA > Developer is forbidden from getting assistance from the competing HW vendor by the NDA > Competing HW vendor has to completely reverse-engineer both the game and the breaking vendor’s API with no assistance

  14. The Random One says:

    Uuugh. If I wanted to watch two big companies insult each other and then be forced to either implicitly take sides or spend twice as much money on hardware, I’d still be playing on consoles.

  15. Shooop says:

    And AAA games continue marching towards proprietary hardware requirements. Soon we’ll have PCs divided up like the consoles, only instead of exclusives because of deals with Sony and Microsoft, it’ll be deals with nVidia and AMD/ATI.

    • DanMan says:

      It’s up to us not to buy crap like that. I’m looking at you, G-Sync (it’s nice, but it should be introduced in the shape of an industry standard).

    • misterT0AST says:

      We already have the fanboys bickering over who is loyal to the corporation with the bigger dick. It’s absolutely disgusting.

  16. Gargenville says:

    Yes, clearly developers are going out of their way to make games run like shit on AMD hardware even though three out of three current-gen game consoles run on AMD APUs.

  17. sharkh20 says:

    Just hasn’t been the same since the ATI/AMD thing. ATI was great.

  18. Bacalou says:

    I’d like to know what computer hardware it was that they used to QA this game. Whenever I see new drivers being released by AMD or nVidia for a new game, it makes me really question what they were running during development. It isn’t very difficult to make “standard” machine models for the PC gaming QA sessions. Go to Steam, check the hardware survey, develop for machines based on that information and test on those machines within the QA sessions.

    I’ve been a PC gamer since you’ve been able to be a PC gamer. With how streamlined the architecture is now compared to the past, I just have little to no patience for a game like this, especially when it is asking for a $60 entrance fee. I’d rather take that $60 and spend it on developers that aren’t operating like it is 1998.

  19. Neith says:

    I have NVidia with a great CPU, and my bf has an AMD in a far superior rig to mine. He is so jelly of my Nvidia Control Panel – my games all look SO MUCH better tweaking settings through there.

    I have seen the light- it is Nvidia.

  20. Uboa Noticed You says:

    IT’S 2011 AND THERE IS STILL NO INTEL REPRESENTATION I AM SHOCKED AND APPALLED

  21. Rapzid says:

    Since we are talking about drivers.. I switched from Nvidia to ATI way back in ohhh… 2002/2003? What led to this was my new Nvidia Ti crashing all over the show in Half-Life/Counter-Strike. Essentially, driver issues. How soon the internet forgets, but at one point in time driver stability was one of the largest differentiators between ATI and Nvidia; ATI’s were solid. I believe they were also the first with unified drivers, but I may be mistaken. Still, to this day unified drivers make me smile when it comes time to upgrade :) I’ve been using ATI/AMD ever since and have not had major driver issues since.

    • HadToLogin says:

      I changed from Nvidia to AMD/ATI for half a year around the Mass Effect 1 release (Radeons were the only AGP cards with enough RAM and I didn’t have enough $$$ to buy a whole new PC at that moment – hence half a year). 1 FPS in Half-Life 1 when the flashlight was on made me go back and never look back.

      Sometimes it feels like GPUs have minds of their own and it’s them who decide if you’re green or red…

  22. Doctor Pandafaust says:

    I don’t know if others have experienced this, but in fairness to NVIDIA I got inexcusably poor performance in Watch Dogs when using my rather modern NVIDIA card as well.

    So it’s not necessarily a monopoly issue.

    It may just be an optimisation problem.

  23. tehfish says:

    Seems the ‘AMD drivers are unstable’ astroturfers are out in force again…

    I have a lot of friends who are gamers, and buggy graphics drivers are a rarity nowadays compared to how they used to be.
    AMD and Nvidia users alike, I haven’t heard any serious driver complaints in many years.

    Personally, I’ve been running AMD/ATI cards for a very long time (I switched due to the horrifically unstable Nvidia drivers on my Geforce 4400Ti card at the time) and I can’t even remember any driver issues I’ve had since then…

    Mind you, even at their worst, Nvidia/ATI never quite got as bad as the horrors of the video drivers for my first PC: an SiS onboard GPU (slower than software rendering!), later replaced with a Creative Savage4 card (couldn’t use reference drivers, and Creative updated them only when the planets aligned) :P

  24. fish99 says:

    Pretty funny reading the people who are absolutely convinced that either AMD/ATI or nVidia have terrible drivers. I’ve been back and forth between them many times, and had some cracking cards from both companies, and the only real difference I’ve noticed is nVidia having an edge with the id software games due to having better OpenGL drivers. Other than that they’re both fine 99% of the time.

    If AMD eventually do fall behind due to smaller market share and a lack of funds to invest, it’ll be a shame for everyone. Competition is healthy.
