The Witcher 3 Has Better Hair Than You

The Witcher 3 promises to be 'shaggy' in more ways than one.

This is why you don’t schedule meetings for 5pm on a Friday. I can just picture the team at Nvidia sitting there, itching to get home. “Last item on the agenda. What can we name the new GameWorks hair effects?” A pregnant pause and some furtive looks towards the door. Someone at the back eventually pipes up. “…HairWorks?” A growing murmur of approval and some scooting of chairs later, here we are: with a trailer showing off a bunch of cool physics effects in The Witcher 3 [official site] (including, yep, HairWorks) exclusive to those of you with Nvidia cards. Looks like AMD fans are stuck with less luscious locks than their Geforce friends.

As stupid as the name is, the tech is pretty impressive, as you can see from the trailer below. You’ll also notice vastly improved cloth physics and destructive particle effects, but the main focus seems to be on that hair. That luscious, full, shimmering hair that you just want to run your hand through, even if it does belong to the ugliest troll you’ve ever met.

To see all this in-game, you’ll need to update your Nvidia drivers. You can grab them from the Geforce website or through the Geforce Experience client, so you’ll be all set as soon as The Witcher 3 drops. Which is just a few hours away. We’ve only just received review code, so we’ll be telling you wot we think as soon as we can.

67 Comments

  1. SAeN says:

    Yeah yeah hair, when are they going to develop Nvidia HorseWorks?

  2. John O says:

    So it’s a guy with a ponytail going around stabbing hippies at a medieval market? That’s cool I guess

    • Misha says:

      Stabbing hippies is always cool, no matter what the circumstances

  3. Ejia says:

    I was wondering whether this was useful for cloth physics, too. Surely Textile Merchant Simulator can't be far behind.

  4. Rumpelstilskin says:

    Someone should tell CDPR that YouTube has supported HFR videos for more than half a year now.

  5. Sir Buildbot Winslave says:

    Bargain Bin Simulator 2015.

  6. Bernardo says:

    I guess we should be glad that Nvidia didn't feel inspired to develop… BoobWorks (Badumm-tsch).
    JK, looking forward to all the “rumpy-pumpy” (to quote Charles Dance) on a selection of stuffed mythical creatures.

    • dahools says:

      There may well be some muff that’s been given the hair works treatment. We won’t know until it’s out ;)

      • Bernardo says:

        HairWorks drops the framerate significantly on my computer, so unfortunately no wavy bush for me. Judging from Geralt's legs in the first scene, however, this is going to be the Surprisingly Well-Groomed Middle Ages Kingdom of HBO. Fantasy! Makes everything possible.

  7. montorsi says:

    Good lord that looks awful. As pointless as TressFX or whatever it was in Tomb Raider. I mean, sure, it looks all bouncy, out of place and distracting, but… who actually wants that?

    • gunny1993 says:

      Everyone who thinks that hair moving as one static object is far worse than a little bit of odd bounce.

    • jacobvandy says:

      Yep, I left that junk disabled in Tomb Raider, and not just because of the horrid framerate dips whenever there was a close-up shot of her head… They’re so excited to get hair physics working that they forget to configure the damn stuff correctly. Hair is heavy enough to not look like it’s got a leaf-blower working it in slow motion! This isn’t a beer commercial!

      • aleander says:

        This isn’t a beer commercial!

        I guess that means you haven’t read the books?

    • UncleLou says:

      The animals look terrific to me, the wolves in particular. And while the human hair is far from perfect, it still looks a lot better than when it's turned off. Pretty sure it's not worth the massive framerate hit though, mind.

      • Cinek says:

        We’ll see how “massive” that framerate drop will actually be…

        • jacobvandy says:

          According to Nvidia, it’s up to 25-30% when the furball action is thick: link to geforce.com

        • the_r says:

          I launched it just after midnight only to see how it runs on my PC (I3750, GTX 660, 16GB RAM; not stellar specs, but it passes the minimum requirements). The game runs fine until there's hair in the shot :D. It kills the framerate like the Witcher kills monsters. That said, it looks rather nice. I don't know if I should upgrade my card. I'm itching for a GTX 970, but on the other hand my card still has something like 7 years of warranty left (10 in total), and the last time it failed, EVGA sent me a 660 instead of the 460 I originally bought :P.

          • Lanessar says:

            Currently, it's 30 FPS at ultra 1440p with HairWorks enabled, 50 FPS with it turned off. I'm running a 4670K, mind you, with said 970. It's a ridiculously expensive effect. Don't buy a 970 expecting it to save your framerates.

          • the_r says:

            Thanks for your input. I ran my (very short) test at 1920×1200, with AA off and probably most settings on medium, some medium-high, and was getting maybe around 30fps (didn't measure it with anything). I know I'd probably have to upgrade to a quad-SLI Titan X to run it on ultra with HairWorks at 60fps (mind you, I'm perfectly fine with 30fps). The 970 looks like a good middle ground, especially since it has been announced as the recommended spec for the Oculus Rift. I'll probably hold off on any purchases till the end of the year though. Maybe even wait for the 1000 series (in the meantime hoping my current card will fail and they'll send me a better replacement :D, free upgrade 460>660>960 :D)

  8. Borodin says:

    And so we get a slew of monsters with hair where no monster should ever have hair.

    • TacticalNuclearPenguin says:

      Sure, that's the definition of a monster: it's something that doesn't have hair.

    • Hedgeclipper says:

      Monsters should wax

  9. Laurentius says:

    The colour palette is so run of the mill (meh), and the lighting is just as weird and awful as in Dragon Age: Inquisition. Just look at these 2013 screens: link to rockpapershotgun.com
    What the hell happened?

    • rargphlam says:

      Graphically, most likely a combination of console parity and originally using assets that struggle on less than top of the line hardware.

      As for that washed out look, no clue.

    • BobbyDylan says:

      This happened:

      link to giant.gfycat.com

    • Vandelay says:

      Pretty much my reaction to this video too. Not good for a video promoting the graphical fidelity.

      I expect it wasn't helped by YouTube compression, but that didn't even look as good as The Witcher 2. A shame if this is really representative of the game, as the previous videos and screenshots (and the requirements) made this look like a massive step forward.

    • Nouser says:

      They had to make it run in real time.

  10. kalniel says:

    Because the PC market is so huge we need proprietary market splitters…

    Although, didn’t Far Cry 4 have hairworks and it ran fine on AMD cards or something?

    • Horg says:

      As long as they restrict themselves to who can render the floppiest hair, then we can just keep saying "that's nice, Nvidia / AMD :|" and carry on regardless. If it gets to the point where some games won't run on the competitor's cards, then we've got a problem.

    • TacticalNuclearPenguin says:

      This stuff is indeed not exclusive and AMD can use it too, though it won't be as well optimized. Basically it's the same thing that happened with TressFX, but in reverse.

  11. nimbulan says:

    Actually, AMD users can use HairWorks; there's just a larger performance impact. AMD has yet to release a new driver for the game, though, so performance will likely improve.

    • Frosty Grin says:

      The big issue with HairWorks is that AMD doesn't have access to the code, so they can't optimize their drivers for it. Even worse, Nvidia explicitly prohibits game developers from optimizing HairWorks for AMD's hardware. That's why it's a little disconcerting that RPS, of all places, is promoting this nonsense.

      • TacticalNuclearPenguin says:

        I think AMD are really very good at crying and whining, fueling all those conspiracy theories and what not.

        When TressFX happened, everyone knew ( and it was said ) that it was optimised for GCN, that it would run worse for Nvidia ( by design ), and nobody was going insane.

        This happens with AMD even on their CPU side, always complaining about this or that, especially with Wildstar, apparently ignoring that they have “fake” cores and terrible single threaded performance. They have a lot of problems to solve before they start pointing fingers.

        • alms says:

          Sounds like you come from an alternate universe where Intel didn’t purposely alter their compilers to perform worse on AMD CPUs.

          • TacticalNuclearPenguin says:

            No, I come from this very universe, and indeed I know the various shady things that each manufacturer does, just like I know that PhysX runs on the new consoles with AMD hardware and could happen in PC space as well, if AMD were to license it.

            Just like Nvidia would have to pay AMD if they were to use FreeSync (yeah, it's not free; it's VESA-based but it's still AMD's own implementation). When it comes to GPU manufacturers, they're both hiding in their castles; it's just that AMD is more vocal about it, as they're trying to pass as the good open source guys, which I'm sure those who mostly run OpenGL or Linux on AMD hardware would surely agree with (not).

            When it comes to CPUs, well, if the problem you mentioned is so huge then AMD must be pretty stupid to design such a weird architecture so desperately in need of custom-tailored code while hoping for the best. Either that, or they couldn't resist the chance to market their CPUs as "octacores" when in reality they're technically not.

            The truth is that AMD is good at playing the underdog card, drawing a lot of sympathy, and their PR about Zen and the miracles of HBM for GPUs (where GPU power still matters more as long as you have over 250GB/s of bandwidth) is much more extreme than what I can see in Intel or Nvidia slides. I still remember all that crap about the efficiency of the consoles' APUs which, while true, still needs some serious horsepower and a few more years to actually be relevant.

            For some time now AMD has been focusing on gimmicky architectures rather than honest brute force, and crying when stuff doesn't go well; their stock is plummeting, just like I fear their R&D budget is, and I can't really have sympathy for their PR strategies. I too hope for some good competition, but I also hope someone else steps up instead of them, or buys them.

            In summary: no, I don't want Nvidia or Intel to reign, I just want something other than AMD to be their competitor.

          • Asurmen says:

            Using FreeSync as an argument that they're as bad as each other is poor. Nvidia have no need to license it and AMD have no reason to give it away. Nvidia can just use the open standard, but that would mean abandoning the truly proprietary G-Sync. They've also improved Nvidia performance in TressFX for free off their own backs, and they're giving them access to HBM through a licence rather than truly keeping it to themselves (which I'd like to point out has more benefits than just bandwidth, and would be a pretty weird move by Nvidia if HBM were only a PR 'more is better' move). The same does not occur in reverse.

          • TacticalNuclearPenguin says:

            There is nothing weird about it; HBM is absolutely better than what we have now, and I didn't really claim the contrary.

            What I said is that it's slightly premature to call it a game changer at this point in time, at least not to the extent it's being hyped. It has other benefits, as you said, for example power consumption, but not to such a huge degree that it will give AMD much more headroom for its next card, which will absolutely have to be a monster when it comes to die size, and THAT is where most of the power requirements happen. I'd say it's no wonder we're still in the dark about the 390X: Nvidia had enough power headroom left to spare that it wasn't hard to make a bigger chip, but AMD really need to revolutionize a lot, and HBM alone won't help much.

            Then again, nobody is saying HBM is not something great. It is, but it will be more apparent when GPUs have a lot of extra horsepower themselves.

            Also, don't talk like AMD are doing Nvidia a favour with the HBM licence; they're getting money out of that. Nvidia would also get money out of AMD licensing PhysX, but AMD doesn't want to. Nvidia is an evil empire just as much as AMD is an empire in decline, but I think that decline is also down to the bad choices I listed, not only the Illuminati or some other big conspiracy.

          • Asurmen says:

            HBM is a game changer. It's a solution to a problem that will soon rear its head, namely the power and size cost of continuing to use GDDR5, as well as the bandwidth needed for bigger games and resolutions. My comment about it being weird was that you seemed to be saying HBM is only a PR point about bandwidth, but it would be weird for Nvidia to be using it if that were true.

            I also never said they’re doing them a favour, but that’s certainly more than Nvidia give in reverse.

          • TacticalNuclearPenguin says:

            Yeah, I said it was used as a PR stunt. What I meant is that I fear they're trying to overstate its (current) importance way beyond what's realistic, in an attempt to distract people from the fact that they are going to have some huge difficulties releasing a huge chip with 4k shaders.

            This is my assumption, based on the fact that their current cards are already drawing so much power that it would be hard to remain on the same 28nm node and yet provide a 40-50% increase. Not because I necessarily care about power consumption, mind, but because it's already so high that anything more power-hungry would require some seriously specialized cooling solutions to be reliable, and I doubt HBM will cut more than 20-30W.

            Even if it does, it just means that the RAM will require less cooling, but the huge chip will still be a big problem to handle.

            And really, trust me when I say that I totally want them to succeed this time. I'm heavily interested in whatever wins, and I might even get AMD if it proves considerably better, enough that I stop caring about Nvidia's own features. But even if I end up with a 980 Ti, I still wouldn't mind AMD forcing Nvidia to lower prices.

          • Asurmen says:

            But if it’s mainly a PR stunt, why will Nvidia be using it? The cards will be smaller with HBM by quite some way.

        • Frosty Grin says:

          What you’re missing is that AMD wasn’t trying to prevent game developers from adjusting TressFX for Nvidia’s cards and Nvidia from optimizing their drivers for TressFX. Big difference.

        • emotionengine says:

          What’s your take on the whole Nvidia Project Cars GameWorks debacle? link to m.reddit.com

          If the stuff that is claimed in that thread has any merit, it sounds like Nvidia is going out of their way to make sure GameWorks runs like crap on AMD GPUs, and there isn't a thing AMD users (or AMD itself, for the time being at least) can do about it. Or could this whole issue be avoided if AMD were to license PhysX support for their hardware? I'm genuinely curious.

          • TacticalNuclearPenguin says:

            I'll give you a better answer after (and if) I can find the articles again explaining how AMD never wanted to move into the hardware physics department, PhysX or not, only to later decide to get themselves another proprietary engine that wouldn't really help the market in any way, and which bombed anyway.

            I'll post again if I find it. Meanwhile I just want to add that I'm not a fan of Nvidia's practices, but I think PhysX at least could have been a great thing if only it were standard: it would be widely adopted, better integrated, and it would change the definition of "graphics" from mere eye candy to an actual gameplay element. And yes, obviously some open standard would be better.

            But when it comes to proprietary tech, I really can't blame Nvidia if they don't want to license FreeSync after they spent so much money and time developing their own when there was no alternative on the market, and I also don't think AMD should get a free pass just because they put "free" in the name, combined with some serious guerrilla PR.

          • emotionengine says:

            Thanks for replying, but I hadn't noticed that that thread has been marked as 'misleading', seemingly because a whole lot of those claims were unsubstantiated conjecture or downright false.

            Here’s a different follow up of sorts to that thread with some (possibly) less biased commentary: link to reddit.com

            The usual he-said-she-said AMD vs. Nvidia drama, then? We'll hopefully find out sooner or later in this case.

          • TacticalNuclearPenguin says:

            Yeah, well, this whole thing is a huge mess as usual. I'm just not a big fan of AMD always playing the poor underdog; they certainly weren't when they were asking 1k dollars for their CPUs back when they actually had something that trashed Intel. That was a LOT of years ago.

            The thing I see the most is that whoever gets on top is automatically evil, and that's something I don't like. They're all big corporations trying to make the most money and nobody should ever forget that, and all of them absolutely will price something atrociously if it happens to be the uncontested winner, with the 390X the only possible exception, as they need to regain a LOT of market share first before becoming douchebags again.

            Oh, and thanks for the link. Don't get the wrong idea, but I'm interested in this as well, mostly because I too have the sneaking suspicion that my 780 Ti is getting intentionally handicapped the more drivers I download.

      • Aerothorn says:

        To be fair to Jem, it's not like the proprietary legal requirements Nvidia imposes on devs who use GameWorks are present in this trailer. But yes, it's a big issue, and I hope RPS will write on it soon!

        • Cinek says:

          IMHO things like that should be included in DirectX 12. But somehow Microsoft avoids the topic, and well… since DX9.0c pretty much the only new visual feature MS has added is tessellation.

          • Nouser says:

            I suspect you don't know what exactly DirectX is and how much it changed with the 10th iteration.

  12. Monggerel says:

    Well the thing about hair is
    you have it or you don’t

  13. Unclepauly says:

    It's the same as TressFX: both can use HairWorks, it just runs poopoo on AMD versus running poopoo for Nvidia in Tomb Raider. Graphics card parity, I say.

    • Asurmen says:

      Except Tomb Raider was an Nvidia/dev issue of not sorting out the problem, which is now fixed. Given Nvidia's track record, we'll see whether the reverse happens.

  14. dangermouse76 says:

    Very impressed by Fenceworks, wonder when they will have the power to do shedworks or building works.

    • Jediben says:

      Looks like AMD users get to use…

      The water works.

      《Yeeaaaaahhh》

  15. Chiselphane says:

    At least it’s not PubeWorks, and Witcher being what it is, that isn’t too much of a stretch

  16. aircool says:

    Does it work for bush?

  17. Synesthesia says:

    Shiny! Hair is still one of the low points for game art, so I'm all for this to keep happening. CDPR has some seriously good 3D guys. I always loved their work with materials; Geralt's armor always looks properly leather/metal/whatever.

  18. blastaz says:

    I lasted all the way up till 7pm before preordering it.

    Now it downloads in an hour.

    My pc has no hope of running it.

    I have 50 quid in the bank till payday….

  19. EhexT says:

    So is there another video where they show something that isn't done equally well (in the cloth case) or better (in the destruction case) by PhysX? Because what they show in that video is, quite frankly, crap. The destruction is hilariously bad: you can see the physics kick in (at sub-PhysX fidelity) half a second AFTER the triggering event. It looks horrible. The cloth is just straight up stuff-PhysX-has-been-doing-since-Arkham-Asylum-or-longer. The hair is OK, but hardly revolutionary; Bloodborne has been doing equally good or better hair/fur/strips-of-cloth recently, and there's been some great fur in the last year or so.

  20. iMad says:

    I wonder if it’ll work well with existing games like Skyrim.

  21. alms says:

    Can’t shake off the impression they stuck hair on every monster whether or not it made sense. You know, not a lot of reptiles have it.

  22. Zenicetus says:

    Unlocked now, and played a bit past the tutorial into the story.

    First impression — they should have spent more time making mouse & keyboard more responsive on the PC version, and less time on the hair modeling.

    I’m still toying with settings… the graphics ain’t the problem, I’m getting fast frame rates. And it probably isn’t bad on a game controller, but keyboard and mouse is a bit dicey so far. I want to hear what others are experiencing, but I guess this isn’t the thread for it.

    • Jediben says:

      I agree. Played half an hour last night and M+K feels very sluggish, almost like there's some smoothing going on of the kind you might expect with an analogue stick, which leaves it feeling very 'thick' when turning or running.

    • cpt_freakout says:

      I guess we'll see more opinions in the next few days, but I also fired it up to have a quick 'taste' and I agree the M&K controls are a bit wonky, even in the menus. Surprisingly, my laptop (don't groan, I play everything just fine!) runs the game OK, even though it's a bit below minimum specs. Almost every setting's on low, but the game still looks quite good to me. Maybe it's because I've been playing too many indie games as of late, but I think CDPR did get the optimization right this time around. Anyway, I'll try the controller when I have more time, but yes, you're not the only one who thinks M&K might need some tuning.

  23. particlese says:

    While I'm not entirely sure I didn't just watch a mashup of Skyrim and Metal Gear Solid 4 with a dash of The Last Guardian (thanks in part to YouTube compression + hotel Internets), I'll just go with the title of the article for now and declare Geralt the owner of the best dang videogame skirts I have ever seen.

  24. Spacewalk says:

    My hair doesn’t cripple performance. Advantage=me.

  25. RegisteredUser says:

    So Lara loves AMD, but Geralt is more of an Nvidia Gestalt?

    Who would have thought there were such problematic issues with girl vs man hair rendering specialization in GPU hardware.

  26. Initialised says:

    Well done, Nvidia: another entry in the "long list of reasons not to buy Nvidia cards".