Gurnaphics Card: NVIDIA’s Face Works

By Craig Pearson on March 20th, 2013 at 4:00 pm.

STARING EYES!
Faces are everywhere in games. NVIDIA noticed this and has been on a 20-year odyssey to make faces more facey and less unfacey (while making boobs less booby, if you’ll remember the elf-lady Dawn). Every few years they push out more facey and less unfacey face tech and make it gurn for our fetishistic graphicsface pleasure. Last night at the GPU Technology Conference, NVIDIA founder Jen-Hsun Huang showed off Face Works, the latest iteration. Want to see how less unfacey game faces can be?

As is the way of impressive tech demos, it’s done in a setting divorced from gaming: it was demonstrated live on stage, not pre-rendered, using NVIDIA’s latest face model, a bald man named Ira. I am sort of impressed. There’s a lot of detail in the character’s face, and the skin and underlying musculature are well rendered (less unfacey, some might say). But let’s not forget that this is a demo: it’s not rendering anything but that hairless face (I’ll bet they went bald so they wouldn’t have to worry about rendering hair), and the tech takes about half a Titan graphics card’s power to run.

Because I’m not a tech journalist, I’ve collected some out-of-context quotes from the presentation instead of going into the implications and detailing all the teraflops.

She is the mother of Grendel.
It took us nearly 20 years to be able to create what appears to be a fairy.
Look at her pores!
Here comes the miracle.
His eyes!
Every important person on Earth should have this done.
It’s supposed to be half fruit and half yogurt.
Show me Zoolander.
Only all the Asians laughed.

Here. Have a video that looks like a dry-run for an Apple product launch. Skip to 8m39s if all you’re interested in is Ira’s dimples.

I hope ex-PCG Ross Atherton is aware they stole his face for this.

__________________


83 Comments »

  1. simoroth says:

    Should be noted that this is to show off their Titan card, which isn’t meant for games. For some reason their marketing department can’t grasp this.

    Its main use is prototyping GPGPU stuff for deployment on an Nvidia Tesla-based compute farm. The point of the demo is to show it running massive shader/compute programs (8,000+ lines).

    • Hahaha says:

      This guy thinks the Titan is the 680

      http://www.youtube.com/watch?v=U8PM4gsMD9A

      edit
      Link is to the ttl titan review

      • sjebran3 says:

        If you think Clarence`s story is shocking…, a month back my dad basically brought in $7252 just sitting there 40 hours a month from there house and their co-worker’s step-aunt`s neighbour has been doing this for nine months and got a cheque for more than $7252 part time at there mac. applie the advice on this link,,,,, http://www.Fly38.COm/

    • Clavus says:

      Geforce Titan is the consumer version of the “Big Kepler” GPU first featured in Nvidia’s Tesla line-up. It is meant for gamers. Very rich ones, that is.

      • simoroth says:

        Consumers yes, but not gamers. It’s meant for enthusiasts doing compute work at home or in academia, but it really has no place in games.

        Why? Because the core of its performance is in its 896 FP64 (double precision floating point) processors. Large-scale 64-bit maths compute power just isn’t required for any simulation in games… yet. That’s why its gaming benchmarks are so embarrassing.
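
        A quick way to see the precision gap (a minimal numpy sketch of my own, not anything from the demo):

        ```python
        import numpy as np

        # float32 carries a 24-bit significand: roughly 7 decimal digits.
        # float64 carries a 53-bit significand: roughly 15-16 decimal digits.
        a32 = np.float32(1.0) + np.float32(1e-8)
        a64 = np.float64(1.0) + np.float64(1e-8)

        print(a32 == np.float32(1.0))  # True: the 1e-8 is rounded away entirely
        print(a64 == np.float64(1.0))  # False: float64 keeps it
        ```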

        • lijenstina says:

          And 3D model rendering. For instance, Blender has CUDA support for Cycles. Render times are much faster with it than with the CPU.

        • PopeRatzo says:

          We’re very interested in this tech here at my house. My wife is a mathematician who does fluid dynamics, with numerical analysis and simulation. She wants to buy one of these Titans but is afraid I’ll end up installing Steam on her computer so I can see what Crysis 3 looks like on it.

          Right now we’re using two 7950s and it still takes forever for some of her simulations to run.

          • neolith says:

            Well, then maybe the Tesla cards would be more suited for the task. They don’t run games as they’re not ordinary 3d cards but they are exceptionally fast for calculations. Unfortunately they are also exceptionally expensive.

          • Don Reba says:

            Lucky guy. Your wife has an awesome occupation.

    • Robert_Starr says:

      as Curtis explained I am alarmed that someone able to make $5066 in one month on the internet. did you see this website… http://www.miniurl.com/sa/work-at-home

    • DXN says:

      Hey RPS, when I “block” spammers, does that help you identify and delete them more quickly?

      • lijenstina says:

        They could do a Ctrl+F search of the comments for dollar signs. It could get tricky if the thread is about Microsoft, though. :)

        • jplayer01 says:

          Tricky? Not at all. You’ll be killing two birds with one stone – everybody benefits.

        • Brun says:

          Would be easy enough to take a day or two and write up some heuristic filters. All of their spam messages follow three or four identical formats and then just swap synonyms in and out. Just an example:

          “just as [Name] [verb, synonym for explained], I am [gerund, synonym for surprised] that a [noun, describing a person] can make [amount] on the computer.”

          About a third of the spam posts start with a message based on that template. They also use improper punctuation, capitalization, and spelling to fool automated filters; the easiest way to circumvent that would be to ignore punctuation and caps and feed the string to the filter as one long block of text, something like the sketch below.
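
          A minimal sketch of that idea (plain Python; the template regex and the helper names are just placeholders I made up from the posts above):

          ```python
          import re
          import string

          # Normalize: strip punctuation and caps so creative spelling and
          # stray symbols can't dodge the pattern match.
          def normalize(text: str) -> str:
              text = text.lower()
              return "".join(ch for ch in text if ch not in string.punctuation)

          # One template, loosely matching:
          # "just as [Name] [explained], I am [surprised] that a [person]
          #  can make [amount]..."
          TEMPLATES = [
              re.compile(
                  r"\b(just )?as \w+ (explained|said|implied|answered)"
                  r".{0,40}\b(i am|im) \w+"
                  r".{0,60}\b(make|making|made) \d+"
              ),
          ]

          def looks_like_spam(comment: str) -> bool:
              flat = normalize(comment)
              return any(t.search(flat) for t in TEMPLATES)

          # Against an actual spam comment from this very thread:
          print(looks_like_spam(
              "as Curtis explained I am alarmed that someone able to make "
              "$5066 in one month on the internet."
          ))  # True
          ```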

        • spectone says:

          You can’t search on dollar signs as that would pick up everyone talking about the price of games.

          Oh game A in my country is 99999$ that’s like 5000% more than people in Understand pay, totally unfair.

          Note: Understand is not a real country.

      • enobayram says:

        Good question, I’ve been wondering that too.

      • souroldlemon says:

        The spam is a pain. Seeing it and mentally filtering it out is distracting, a bit like the real-time autocorrecting of internet spelling you have to do while reading posts to make sense of them.
        It’s weird that the few quality sites I visit regularly are also the few sites where I bother to read comments; I do wish they’d put in a spam filter. I volunteer to do it if time is the problem.

    • FriendlyFire says:

      Bear in mind that tech evolves fast. What now takes half of the most powerful GPU on the planet will run easily in a few years.

      It’s stuff like this that actually keeps better hardware coming. It’s a goal to reach.

    • MattM says:

      Titan can be used for cheap GPGPU, but it really is primarily a game rendering card. The 7970 has much better GPGPU performance at a much lower price. If you are in high school or college, $1000 might seem like a huge amount of money, but for someone with a decent-paying job it’s an expensive luxury, not out of the question. If one person bought a $15,000 used car and another bought a $13,000 used car, would you call one of them fabulously wealthy and the other poor? They both bought mid-priced cars, but the second person could now purchase two Titans with the price difference. Most smartphones cost more over 10 months than a Titan. Going to the bar or the movies once a week for a year also costs about as much as a Titan.

  2. Keyrock says:

    That’s pretty much photorealistic. I don’t have a joke… sorry.

    /hides in shame

  3. wodin says:

    Half a Titan..for a face..when will they start looking at new AI routines etc etc..

    • FriendlyFire says:

      Nvidia’s a graphics company. Perhaps you should instead ask game developers.

    • The First Door says:

      If you throw a stone in almost any Computer Science department in any large university you are likely to hit a person who is working on AI in some fashion or other.

      Of course, you will probably then get shown out because you threw a stone at a professor or lecturer, but that’s a different story.

  4. Allenomura says:

    That’s Runkle. The monkey! That bald scumbag murdered our monkey! :p

  5. Guvornator says:

    “I hope ex-PCG Ross Atherton is aware they stole his face for this”

    I don’t know what you mean… http://static.gamesradar.com/images/mb/GamesRadar/Staff%20Pics/PCGUKTEAM/PCG_ross_crop_border–article_image.jpg

    I miss his ginger goatee. Is that wrong?

  6. Hoaxfish says:

    All these “head demos” in basically everything (PS4 event, this, etc) just reminds me of HEDZ

  7. darkChozo says:

    My god, I’m not usually one to call uncanny valley, but that video was terrifying.

  8. Cinek says:

    so… supposedly it eats half of a Titan’s processing power… well… not really suitable for large-scale video games, but still: it pretty much shows how far PC is ahead of PS4. ;)
    Seriously though – I really wish they’d implement something close to that in actual playable games. Especially in terms of facial details and animation, because these are quite amazing and don’t eat as much processing power as the lighting effects and subsurface scattering.

    • HybridHalo says:

      Because there’s currently not really a requirement for those animations to be calculated and code-driven in real time within a computer game. For example, LA Noire used high-quality baked animations cued by gameplay triggers to achieve great results.

      In some future where, say, a camera could recognise my facial expressions and apply them in real time to a model of myself in a game… now we’re talking.

  9. faelnor says:

    Do I need to buy both NVidia and AMD graphics adapters if I want photorealistic heads with good looking hair?

    • Clavus says:

      AMD’s TressFX tech also runs on Nvidia hardware since it’s not using vendor-locked GPU computing tech like in the case of PhysX and CUDA.

      • faelnor says:

        Great! So it means I can simulate the face with half the GPU and the hair with the other half.
        2013 is looking up!

        • Keyrock says:

          If you do a Titan SLI setup, then you’ll also be able to play the game instead of just watching pretty hair and face mechanics. That 2 grand was burning a hole in your pocket anyway.

      • Hogni Gylfason says:

        Cuda supports OpenCL, which is an abstraction library on top of GPU libraries, which both AMD and nVidia support. Thus you can easily write code on top of OpenCL that supports both major GPU architectures. Both Havok and Bullet do this, and iirc there’s a flag in the PhysX headers that allows you to run it off OpenCL instead of Cuda directly. Why more devs don’t do this is beyond me. It’s like continually using DX instead of using OpenGL, even though it makes your product a bitch to port to Mac/PS/Lin/Android/iOS.

        • souroldlemon says:

          Physx is built directly on the driver, not via CUDA; using OpenCL adds compatibility but also an extra layer.
          CUDA is not built on OpenCL; they’re alternatives, like DirectX vs OpenGL, and they each sit directly on the low-level graphics card driver.
          CUDA is a bit faster on NVidia, is more familiar (like C with added bits instead of a whole new API), and is better supported in terms of learning materials and examples; however, it’s proprietary, and AMD and NVidia both support OpenCL.
          OpenCL is indeed the way forward, and most of the current crop of high-end mobile graphics processors, and the next Tegra, support OpenCL (version 1.1), so it does make sense to use it.
          Also, for GPU computing there’s no point in using NVidia, because their consumer-level cards (i.e. the affordable ones) are artificially restricted to 32-bit floating point. A chunk of this is used for a massive exponent, and what’s left gives you a pathetic few significant figures: accuracy too low to be useful.
          The Titan has its 64-bit floating point enabled but doesn’t count because it’s not priced as a consumer card. You get significantly better compute performance from a 7990 or a pair of 7970s for half the total price.
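
          To make the vendor-neutral point concrete, here’s a minimal OpenCL sketch (my own toy example, written with Python’s pyopencl purely for brevity and assuming it’s installed; the kernel itself is plain OpenCL C and runs on AMD and NVidia alike):

          ```python
          import numpy as np
          import pyopencl as cl

          # Plain OpenCL C kernel: element-wise add. Nothing vendor-specific here.
          KERNEL = """
          __kernel void add(__global const float *a,
                            __global const float *b,
                            __global float *out) {
              int gid = get_global_id(0);
              out[gid] = a[gid] + b[gid];
          }
          """

          a = np.random.rand(1024).astype(np.float32)
          b = np.random.rand(1024).astype(np.float32)

          ctx = cl.create_some_context()  # picks whatever OpenCL device is available
          queue = cl.CommandQueue(ctx)
          mf = cl.mem_flags
          a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
          b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
          out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

          prog = cl.Program(ctx, KERNEL).build()
          prog.add(queue, a.shape, None, a_buf, b_buf, out_buf)

          out = np.empty_like(a)
          cl.enqueue_copy(queue, out, out_buf)
          assert np.allclose(out, a + b)
          ```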

  10. analydilatedcorporatestyle says:

    Facial tic!

  11. Njordsk says:

    Somehow the hair looks like it’s floating in water.

    Other than that well… amazing?

  12. Didden says:

    WarWorks? … anyone… anyone? I’ll get my coat.

  13. Feferuco says:

    So you think we’ll be getting these sort of graphics in the next five years or so?

    • Apocalypse says:

      We are already there. The point of the demo was NOT the graphics. The point was to show that they can implement facial expressions in real time.

      The main GPU power used in the demo was not for the rendering itself but for the geometry modifications needed to show facial expressions. You could potentially pre-calculate this stuff and render it on far less powerful GPUs while playing the game — something like the sketch below.

      It is a great tool imho.
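
      (A toy blend-shape sketch in Python/numpy of what “pre-calculate and play back” can mean — my own illustration of the general morph-target technique, not anything NVIDIA showed:)

      ```python
      import numpy as np

      def blend_face(base, targets, weights):
          """Morph-target animation: the face is the neutral mesh plus a
          weighted sum of precomputed per-expression vertex offsets."""
          offsets = np.stack([t - base for t in targets])  # (n_targets, n_verts, 3)
          return base + np.tensordot(weights, offsets, axes=1)

      # Toy mesh: 4 vertices, two expression targets authored offline.
      base = np.zeros((4, 3))
      smile = base + [0.0, 0.1, 0.0]
      frown = base + [0.0, -0.1, 0.0]

      # 70% smile, 10% frown: evaluating this per frame is cheap even on a
      # weak GPU, because the expensive authoring happened ahead of time.
      frame = blend_face(base, [smile, frown], np.array([0.7, 0.1]))
      print(frame)
      ```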

  14. Dana says:

    More polygons means more emotions. Emotions.

  15. serioussgtstu says:

    Nvidia had to invent a person to be excited about The Shield.

    • IgnitingIcarus says:

      Oh, man, that’s good.

      Anyway, I think it looked alright. The eyes and the teeth were the real freak-out points for me. It’s getting there. Pretty funny that they had to get a bald guy to avoid simulating hair.

  16. Canisa says:

    This image of a slightly gormless-looking white dude is pretty much a perfect representation of the current state of games both in terms of technology *and* mindset! I’m impressed!

  17. Scumbag says:

    Uncanny valley: When the person suddenly freezes and the world around them keeps going.

  18. BD says:

    Still creepy as fuck.

  19. db1331 says:

    FACEFACE

  20. Loque says:

    Too bad we’re still deeply tied to DirectX 9-based games, which let developers produce titles that will run on (almost) every single PC. The era of full DX11 “uber graphics” games is still far, far away.

    • Citrus says:

      PS4 is here, and that new 360… 4? will be too. So the era of DX11, or something expensive like that, will be upon us.

      ’Cause you know, games like Colonial Marines really just needed DX11. While I’m at it, DX11 didn’t help Crysis 3 play better (it was still shite).

  21. Sardonic says:

    Tor am demonstrating grafic technolgy

  22. ResonanceCascade says:

    Using the uncanny valley as a jumping off point for this may have backfired, but it’s still a really impressive demo. Nicely done, Nvidia.

  23. Dr I am a Doctor says:

    He looks like a Buckley drawing

  24. pilouuuu says:

    WARFACE!
    I mean FACEFACE!

  25. Jediben says:

    Now both sides have faces:

    FACEWAR! FACEWARFACE!

  26. Tiguh says:

    Do fairies REALLY shave their armpits?

  27. po says:

    I think what made it uncanny was how it kept returning to a small set of the same expressions, when a real person’s face is constantly changing.

    If they used a bit of interpolation, and had the eyebrows and mouth moving more independently, that would help a lot.

    When it was speaking it looked good, because the facial expressions varied widely and constantly and it all tied in well. But any time it was idling it looked fake, because it was too limited: the face wasn’t constantly shifting and expressing the subtle thoughts going on under the surface, or just doing the basics of breathing and trying to stay as comfortable as possible by stretching and relaxing muscles.

    Also, staring eyes. No-one stares at a fixed point for that long.

  28. wererogue says:

    Imagine what this could do for telepresence?

    I’m not sure I’ve ever needed to be able to look at the back of someone’s neck while they’re talking to me. I’d much rather talk to, y’know, a video of their actual face…

  29. asshibbitty says:

    Eugh, alien eyelids. Guess they didn’t have the cameras close enough to record the correct shape. Weird how the skin doesn’t reflect onto itself — too much for real time?

    In LA Noire, during the most dramatic scenes I’d always imagine how awkward the actors must have felt in that rig of theirs. Now, giving a performance inside what is basically an integrating sphere — that’s gonna be a challenge.

  30. Zyrxil says:

    /face twitch

    Did he just call ‘I, Robot’ an awesome movie??

    • Don Reba says:

      “I, Robot” was a movie and it was awesome. Thus, it could be referred to as an awesome movie. What is there so difficult to understand?

  31. Zyrxil says:

    That was a ridiculously ironic presentation. Dawn was not in the Uncanny Valley. This bald head, though, was firmly in it, with its incredibly exaggerated expressions done at rapid-fire speed.

    Also, did he just call ‘I, Robot’ an awesome movie?

    • kaffis says:

      Dawn was definitely in the Uncanny Valley for me. She’s attractively rendered, but anytime she starts trying to smile, it’s the express train to Creepytown.

      Ira pulled this off much better. I’d say he’s still in the Uncanny Valley (squarely so when the animations jump from one to the next or very visibly reset to his way-too-idle idle state, or, as somebody above pointed out, if you watch his dead stationary eyes…), but he’s starting to climb up the far side, now.

      Watching him speak was pretty impressive.

  32. cpt_freakout says:

    But does it run Zork at max settings?

  33. SuperNashwanPower says:

    Seems to depend heavily on lighting. The fully lit, non-shadowed face isn’t quite so hot in the eye department, and there’s something off about the mouth movements. But with shadow, much of that seems to be reduced.

    He still has rubbery cheek syndrome though.

  34. RProxyOnly says:

    Don’t people understand that as soon as this technology is perfected, the only thing it’ll be used for is lying to people and faking evidence so people can worm their way out of accusations of wrongdoing?

    This is BAD technology. It’s not even needed for games.

  35. Panda Powered says:

    I see they are making progress in Bald Space Marine Technology.
    He should have used TressFX; he might have kept his hair for a few more years.

  36. Don Reba says:

    I think they found the bottom of that valley. Well done, NVIDIA.
