Silence Broken: Ubi Say NVIDIA Deal Safe For AMD Users

By John Walker on September 2nd, 2013 at 3:00 pm.

C-O-L-L-A-B-O-R-A-T-I-N-G

Hurrah! We’ve at last got a response from Ubisoft regarding our queries about how their recently announced NVIDIA deal will affect AMD customers. As AMD card owners will know, NVIDIA aren’t exactly the best when it comes to sharing tech, so when a publisher sides with them, there are potential problems ahead. Not so, say Ubi, in this case. “It will benefit AMD users as well,” we’re promised.

Explaining that the delay in replying was down to the avalanche of Gamescom, Ubi’s Michael Burk spoke to the right techie people, and this was their reply:

“The optimization done for the PC versions of Ubisoft’s games will benefit AMD users as well, since our dev teams test on many different hardware configurations in order to ensure the best possible experience. Many of the new features, such as tessellation, will benefit ATI users as well as Nvidia users. Of course, performance will vary from one graphic board to another depending on its raw power (regardless of brand). TXAA is specific to Nvidia, but ATI boards will compete with other Anti-Aliasing methods that are also available.”
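
(For the technically curious: “compete with other Anti-Aliasing methods” in practice means the game detects your GPU and falls back to something vendor-neutral when TXAA isn’t available. A purely illustrative sketch, ours rather than Ubisoft’s, of how a renderer might make that choice:)

    #include <GL/gl.h>
    #include <cstring>

    // Hypothetical AA selection: TXAA only exists on NVIDIA hardware,
    // so everyone else falls back to a vendor-neutral method.
    // (Assumes an active OpenGL context; names are ours, not Ubisoft's.)
    enum class AAMode { TXAA, MSAA, FXAA };

    AAMode pickAntiAliasing() {
        // GL_VENDOR reports e.g. "NVIDIA Corporation" or "ATI Technologies Inc."
        const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
        if (vendor && std::strstr(vendor, "NVIDIA"))
            return AAMode::TXAA; // NVIDIA-only technique
        return AAMode::MSAA;     // available on any board
    }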

So, there it is. Quite where this leaves the NVIDIA deal in terms of being so special is unclear. But of course such things are usually mutual back-scratching affairs. For instance, there’s currently a version of Splinter Cell: Blacklist bundled with NVIDIA cards, which gets a couple of extra levels and some unique guns. And in turn, NVIDIA shout about Splinter Cell on their site, and so on. But the good news for the roughly 50% of gamers who don’t run NVIDIA cards: Ubi say there’s nothing to worry about.

Now we just need to hear back about Assassin’s Creed 4’s PC delay!


56 Comments

  1. RCoon says:

    Extra levels and guns don’t change the fact Sam Fisher is a 50-year-old with a kid’s voice and young body. No iconic Ironside voice, no thanks. Plenty of others feel the same, and the BS about the motion capture is just that.

    Also amused that the Ubi techies still call it ATI…

    • Morlock says:

      What are the opinions on the new Splinter Cell? Is RPS going to do a Wot I Think after all the coverage?

      • John Walker says:

        Yes – our review is on the way.

      • RCoon says:

        The game is OK and nothing more. They ruined Sam Fisher basically to save money, and claim technology is the issue.
        The multiplayer is alright; there’s a lot of forced dual-breach, dual-boost stuff, which is entirely pointless other than to say “YEAAAAH FRIEND, YEAAAH TEAMWORK YEAAAH”. Also, it still uses Unreal Engine 2.5.

        • Morlock says:

          How much is the game like Chaos Theory, and how much like Conviction? I am talking about gameplay. Sam Fisher never interested me as a character.

          • RCoon says:

            More like Conviction, only in this game there is hardly any darkness; most of the missions are done during the daytime. It is nothing like its epic predecessor, Chaos Theory.

          • Tavrin says:

            It borrows elements from both games. The possible customisation is great and there are a lot of gadgets, the level design is clever, and the enemy AI is better than before. Most levels I played were in the dark, and even the non-dark ones didn’t force me to stop my ghost playstyle and kill or even knock out anybody (except certain places where it’s part of the story). I would even say that on Perfectionist difficulty it’s actually better than Chaos Theory.

            There is a lot of content with the side missions (like the Grim ones, my personal favorites, where you have to complete objectives without being discovered at all), and the multiplayer is nice: the classic mode is not as technical as before but it is tense, and the new ones are much more fast-paced, Conviction-style, but still fun to play. And the darkness is there, you WILL have to use your goggles at times. So there you go. It’s miles better than Conviction and (personal opinion here) it’s slightly better than CT.

    • Thermal Ions says:

      Techies?

      I think you’re confused. As far as I can tell Michael Burk is the Director of PR at Ubisoft.
      https://www.ubisoftgroup.com/en-us/press/press_contact.aspx?cid=tcm:99-28774

  2. Grey Poupon says:

    No mention of whether this deal includes spamming PhysX or not, though, which is what most were worried about. That and TXAA, which he says is going to be in the games.

    • RCoon says:

      When you get into bed with NVidia, PhysX always crawls out of the sheets in the morning.

    • DarkLiberator says:

      Can’t say I’m a massive fan of TXAA to be honest.

    • Gap Gen says:

      Yes, I’m unsure whether, with modern screen resolutions, I really need AA that much.

      Plus, I presume the PhysX thing has happened in games before? I haven’t ever really noticed a problem with not having an Nvidia card, although I have a quad-core CPU, so maybe pushing the physics onto the CPU isn’t such a big deal.

      • gunny1993 says:

        From what I’ve read, pushing PhysX (and shit like it) onto the CPU is damn near impossible, or at least not tenable in any way.

        Given that I have no idea why this is, or where I read it, I recommend getting your daily intake of sodium out of the way.

        • RCoon says:

          PhysX can be pushed onto the CPU, but only on the lowest of low settings. At high PhysX settings, your maximum framerate (on a non-NVidia system) is likely to be between 10 and 20 FPS.

        • Gap Gen says:

          I suppose another thing is that I’ve not consciously noticed better physics in games beyond a Half-Life 2 sort of level, with a few objects rolling about here and there. What sort of thing does PhysX improve in recent games, exactly?

          • gunny1993 says:

            Bugger all usually, mostly it’s just really small things that make the game look nicer and be somewhat more believable, but it’s never something that is needed.

            Borderlands had particle and water effects that were quite nice.
            I think Metro had bullet strikes that caused particle effects.

            And the FPS cost is usually more than the effects are worth.

          • VelvetFistIronGlove says:

            Metro: Last Light used PhysX for a whole lot of particle effects, smoke effects, destruction, and things like that. There’s a good comparison video here: http://www.youtube.com/watch?v=VafzR7JqO2I

            I have an AMD card though, so I didn’t get any of that. I can’t say I missed it much—and to be honest, some of those effects look a bit overdone (as is so often the case when features are added to demonstrate tech, sadly).

          • rapchee says:

            older stuff: Mirror’s Edge had more destroyable stuff, even some textiles; Mafia 2 had small stuff flying around from bins and the like, and smexy trenchcoat physics. srsly though, those look pretty amazing, i can’t go back to not-realistically-hanging-and-colliding coats anymore

          • hotmaildidntwork says:

            Planetside 2 uses it mostly to add gratuitous amounts of sparks and debris to bullet and shell impacts. It looks kind of neat, but it makes things kind of hard to see.

      • Grey Poupon says:

        Oh, I’ve got a quad-core i7 too, which is still a fraction of the power you’d want for truly good physics calculations. You want physics to be run on the GPU, and PhysX enables this only on Nvidia’s cards. Even Borderlands 2 struggles when its physics are run on the CPU (when the extra physics are turned on, obviously).

        When there are plenty of physics libraries around that support both manufacturers, using PhysX is pretty much the equivalent of giving the finger to AMD owners for the sake of saving a few bucks.
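
        Bullet is one of those vendor-neutral libraries, for the record, and it’s plain CPU code with no card lock-in. A minimal sketch (standard Bullet boilerplate, nothing game-specific):

            #include <btBulletDynamicsCommon.h>

            int main() {
                // Standard Bullet world setup: runs on any CPU, no GPU vendor lock-in.
                btDefaultCollisionConfiguration config;
                btCollisionDispatcher dispatcher(&config);
                btDbvtBroadphase broadphase;
                btSequentialImpulseConstraintSolver solver;
                btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);

                world.setGravity(btVector3(0.0f, -9.81f, 0.0f));

                // Step the simulation at 60 Hz; Bullet handles substepping internally.
                for (int i = 0; i < 60; ++i)
                    world.stepSimulation(1.0f / 60.0f);
                return 0;
            }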

        • Vandelay says:

          Borderlands 2 has been the only PhysX game that I have had any joy running with the feature enabled. I can’t tell you the actual frame rate, but I would say it was normally at about 30, which I find tolerable in a single-player game. It did drop to a slideshow during big boss battles though, so I turned it off eventually. I think this was on medium settings.

          Of course, if the physics effects were available to all, we would see games that used physics in meaningful ways, but for the time being we’ll have to put up with them being used for the (occasional) pretties.

        • Hyomoto says:

          Really, even on RPS this isn’t common knowledge? Contrary to some beliefs, the PhysX SDK offers multi-core support for CPUs and, when used correctly, comes close to the performance of a single GPU. But of course, there is a catch: PhysX handles thread distribution, and moves the load away from the CPU and onto the GPU when a compatible graphics card is active. Therefore, game developers need to shift some of the load back to the CPU, an effort that’s probably regularly ignored, especially if the game happens to have an nVidia logo.

          The GPU is almost no better at these calculations than the CPU. What IS better is a dedicated physics solution such as, wait for it, A SECOND DEDICATED nVidia PhysX GPU! That’s right! A single nVidia GPU takes a TREMENDOUS hit trying to do physics calculations, JUST LIKE THE CPU. When nVidia bought up PhysX, they basically bought a way to sell us two cards.
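
          To put a shape on that “shift the load back” point: in the PhysX 3.x SDK the CPU side is explicit, since you hand each scene a CPU dispatcher with a worker-thread count when you create it. A rough sketch, assuming the public PhysX 3.x API (error handling omitted):

              #include <PxPhysicsAPI.h>
              using namespace physx;

              // Sketch: give PhysX explicit CPU worker threads (PhysX 3.x).
              PxScene* createCpuScene(PxPhysics* physics) {
                  PxSceneDesc desc(physics->getTolerancesScale());
                  desc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
                  // The thread count here decides how much simulation work
                  // can actually be spread across the CPU's cores.
                  desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
                  desc.filterShader  = PxDefaultSimulationFilterShader;
                  return physics->createScene(desc);
              }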

        • FakeAssName says:

          Or like how using DirectX gives the finger to releasing on a non-Windows platform, despite OpenGL typically being ahead of, or equal to, DirectX.

    • Keyrock says:

      Hmm, I can’t say I’ve used TXAA yet. I do like FXAA, though. Obviously MSAA is higher quality, but the performance hit tends to be massive and my lappy (i7 3630QM, GT650M GDDR5) can’t always handle MSAA. FXAA, on the other hand, has only a fraction of the performance hit of MSAA.

      • TheMightyEthan says:

        I use FXAA even on my desktop most of the time because, like you said, the performance hit is much lower than MSAA’s, and at 1920×1080 the pixels are already small enough that you don’t need huge amounts of AA to start with. Also, in some situations FXAA does better, like high-contrast edges. About the only time I use MSAA is in a game that I can easily max out, and even then I’ll often run FXAA on top of it to catch the light/dark boundaries.
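
        That gap makes sense given where the work happens: MSAA has to be requested from the hardware up front and shades extra samples per pixel, while FXAA is a single full-screen shader pass at the end of the frame. A rough illustration using GLFW/OpenGL, not taken from any particular game:

            #include <GLFW/glfw3.h>

            #ifndef GL_MULTISAMPLE
            #define GL_MULTISAMPLE 0x809D  // missing from some older gl.h headers
            #endif

            int main() {
                glfwInit();
                // MSAA: request a 4x multisampled framebuffer up front. The
                // hardware then resolves several coverage samples per pixel,
                // which is where the large performance hit comes from.
                glfwWindowHint(GLFW_SAMPLES, 4);
                GLFWwindow* win = glfwCreateWindow(1920, 1080, "AA demo", nullptr, nullptr);
                glfwMakeContextCurrent(win);
                glEnable(GL_MULTISAMPLE);

                // FXAA, by contrast, would be a single post-process pass:
                // render the scene into a texture, then draw one full-screen
                // quad with the FXAA fragment shader. Roughly one extra draw
                // call per frame, hence the much smaller cost. (Omitted here.)

                glfwDestroyWindow(win);
                glfwTerminate();
                return 0;
            }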

    • LazyGit says:

      You can turn PhysX off.

  3. trjp says:

    It’s worth remembering that this is the same Ubisoft which published M&M Heroes 6 – a game which managed to not support whole generations of recently released GPUs for months, if not YEARS (ironically, nVidia fared worse than AMD in this too!)

    This issue was only ‘fixed’ when they released the last standalone expansion – up to that point, all they’d done in response to a landslide of complaints (Google “might magic 6 gpu problems”) was change the System Requirements to list specific (older) generations of GPUs only.

    This is also the same Ubisoft who released Trials Gold in a state best described as ‘broken’, and which still pretty much runs like shit (a game where timing is key has wonky timing, thanks to FPS issues).

    I’m guessing this is really a “we’ve teamed up with nVidia because we have no fucking clue what we’re doing”.

  4. LukeNukem says:

    I would just like to say that the accompanying image really enhanced the post, particularly the alt-text. This is what journalism should be about.

    • Hahaha says:

      LMFAO, no it isn’t, this is crap. Where are the mentions of the AMD deal that happened not so long ago? Where is the actual digging, rather than just advertising for the game?

  5. Screamer says:

    “here’s currently a bundled version of Splinter Cell: Conviction with NVIDIA cards,”

    I think you meant Splinter Cell: Blacklist? Arkham Oranges will apparently also be bundled with nVidia cards.

  6. ZappaBappa says:

    Like I’ll believe those bull lies again? They still haven’t fixed AC3 for PC with AMD. It still runs at about 15 FPS whenever I enter Boston. I won’t be buying any of their games anymore until I can confirm that they actually run properly on AMD hardware.

    • welverin says:

      I have a Radeon card and it ran just fine, though I do have an Intel processor.

      • ZappaBappa says:

        i7 960 with an ATI 6990 4GB here. All the Ass Creeds before ran flawlessly, BF3 runs flawlessly, Metro Last Light runs great, Crysis 3 ran great. And this? AC3 ran absolutely terribly. In forest areas it ran at 60 FPS all the time; the second I entered Boston, it dropped down to 15–20 FPS, no matter what settings I used. Tried countless fixes, nothing worked. Recently reinstalled to see if Ubi had taken the decent approach of fixing it themselves. Nothing, it still runs absolutely god-awful. I’ve lost a lot of trust in them.

        • basilisk says:

          Yeah, there’s something very wrong with the Boston harbour which many GPUs hate. I eventually managed to make it run somewhat bearably through a combination of several tweaks and black sorcery, but it’s rather obvious Ubi really don’t care.

  7. fish99 says:

    That’s pretty much just PR fluff, but really… what did you expect them to say?

  8. Deano2099 says:

    In cases like these, wouldn’t best practice also be to update the original post with the response?

  9. slerbal says:

    Thanks for continuing to cover The Silence – as a gamer and a former game dev, I really appreciate you refusing to stop reporting on these issues. At worst it does no harm, but at best I think it encourages better, more responsible games-making, greater transparency, and most likely a better relationship between developers and studios/publishers.

  10. cptgone says:

    silence broken
    by john talker
    a heart breaking drama
    based on real events

  11. tehfish says:

    So what about things like PhysX, where Nvidia couldn’t cripple non-Nvidia users any more than they already have, having literally de-optimised CPU PhysX in every way physically possible? http://semiaccurate.com/2010/07/07/nvidia-purposefully-hobbles-physx-cpu/

    Until Nvidia end such profoundly dickish moves, I will never buy one of their cards again…

    • Baines says:

      Thinking about it, has RPS made much in the way of negative posts towards Nvidia?

      This “Silence” issue was aimed at Ubisoft, which has for years earned the dislike of PC gamers. But PC news sites generally don’t speak out against Nvidia.

  12. AmirBan says:

    Nvidia helped them make a better PC port; it’s the first Ubisoft title this gen that has actually been ported well to PC. AMD and Nvidia card owners should both be happy with it. It is not an exclusive deal like the AMD Gaming Evolved ones.

  13. Dead guy says:

    Alas, AMD kid John Walker finally got the answer he wanted: not surprisingly, Ubisoft games will work on his favorite brand as well! I’ve read there’s no freedom of speech here (very anti-democratic), but allow me to say something. As somebody else already posted, this “brand” problem goes both ways, and it has troubled both the “good” AMD users and the “bad” NVIDIA users in recent and not-so-recent years (I could even go way back in time, to when 3DFX started the whole thing…). You guys at RPS are obviously free to post about it, but wouldn’t it be nice if an independent site like this (unless you decide to post that AMD is one of your sponsors, in which case I’ll have to shut up) called it down the middle?
    To cut things short, I’m a very horrible person (i.e. an Nvidia user) who has suffered the consequences of AMD’s aggressive partnerships with several publishers in the last few years: games like Sleeping Dogs, Tomb Raider, Alan Wake and Saints Row III took months before I could fully enjoy them on my Nvidia PC, while others (Deus Ex HR, Hitman Absolution, Far Cry 3, Bioshock Infinite) still suffer from random crashes, stuttering and sudden FPS drops.
    Why have I never read anything about that? Also, you say Nvidia has proprietary technologies, but what makes you think AMD hasn’t? Ever heard of Eyefinity, HD3D, and their AO tech which led to so many issues with Bioshock Infinite and Nvidia cards? I’ll never understand why some bloggers perceive AMD as a charitable non-profit while Nvidia is a bunch of greedy bastards. It’s all business.
    P.S. You also talked about integrated chips. Most of the games I named don’t offer any compatibility with them.

  14. Sordarias says:

    Right. It’s not an issue. It’s not an issue that, for a week after release, games that utilize PhysX often run fucking awfully on AMD machines: the framerate TANKS and they generally underperform compared to machines with NVIDIA cards. It’s not an issue at all that customers want to know what the product they buy will be like at launch, rather than a week or two later. It’s not an issue for those whose recent purchase runs worse on their AMD machine than it does on another’s NVIDIA machine.

    Yeah, it’s not an issue to optimise games to run on one card but not the other. Yes, it is often fixed within a week [except for the Boston bullshit of AC3], but that does not excuse anything. If I want to play a game I bought with my money at launch, why should I be punished for going with AMD over NVIDIA? Why should I buy your product if you don’t bother optimising it for both AMD and NVIDIA users? We both get shafted, constantly. I won’t claim to speak for everyone this happens to, as some AMD machines might not have the exact same problems others do.

    But I will say that it IS an issue, especially to those whose AMD machines experience these types of problems or worse, who bought the game they want to play, and who now may be unable to play it because of PhysX bullshit. So IT IS an issue to MANY who suffer these kinds of problems. It doesn’t always occur, obviously, but that doesn’t make it ‘correct’ or worth ignoring. By ignoring the issue, you only make it worse.

  15. Hahaha says:

    Why is the article written in such a biased way that it completely forgets this is a problem that goes both ways?

    It’s flame bait, but that’s not surprising anymore.
