Can Tech Demos Cry? Square Enix’s DX12 Witch Chapter 0

Being a cold and distant sort, I don’t know what to make of crying. It’s the thing where your face gets wet without swimming, yeah? I’ll trust Square Enix when they say “the human emotion of crying” is “one of the most difficult representations for existing real-time CG technology” but not know what to make of their fancy new tech demo with high-def crying.

Squeenix last week showed off a new DirectX 12 tech demo with a crying lady running on a chunky PC packing four Nvidia GeForce Titan X GPUs. The demo, named so-very-Squeenixly ‘WITCH CHAPTER 0 [cry]’, is some pretty fancy pixel-pushing, but even to my reptilian eyes I don’t think we’re out of the weird creepy puppet people phase yet.

This fancy new demo was revealed during Microsoft’s Build event last week. Witch Chapter 0 was created mostly by the folks behind Square Enix’s 2012 DirectX 11 tech demo Agni’s Philosophy (which may be why it looks familiar), with a little help from Nvidia and Microsoft.

“Square Enix is working on this next-generation research project as a showcase for our technology and creativity which will lead the future of Square Enix,” Square Enix president Yosuke Matsuda said in a follow-up announcement. “The results will be used to further advance the quality and technical expertise of our games. We will continue to pursue this research on cutting-edge technologies including high-end real-time graphics using the innovative DirectX 12.”

They say this project’s research feeds back into the Luminous engine at large, including helping with Final Fantasy XV – a game I really want to see come to PC because the idea of a group of rowdy anime boys going on a road trip to save the world is delightful.

Unfortunately, a large chunk of the tech demo’s flash is lost to us right now because no high-quality version is online yet. Probably the best-quality version is on the recorded stream – skip to 1:58:40, but for your convenience here’s the appropriate section clipped out and re-encoded:

33 Comments

  1. Herbal Space Program says:

    I can’t wait for my ps5 to barely render 30 frames on my 4k television.

    • Wisq says:

      4k? We’ll be lucky if games are even reliably supporting 1080p by then.

      • Asurmen says:

        They more or less are now. I’m waiting to see whether there will be a hardware refresh for both consoles that will make them 1080p guarantees rather than some compromises.

        • epeternally says:

          There’s already been a number of high profile sub-1080p Xbox One games so far this generation, and I suspect as developers try to push the hardware further and further we’ll end up seeing some on the PS4 side as well. It’s still far from something that players are able to take for granted in their games. Sure, you can argue that the Xbox One is tragically underpowered and only a fool would purchase one, but that doesn’t change that it is part of the current hardware landscape. We have a long way to go before 1080p rendering can be considered functionally universal, and I definitely expect that next gen 4k support will be – at best – as common as true 1080p was last gen, and with just as iffy framerates. Unless there’s a massive increase in the rate of graphics technology development between now and then.

  2. Pizzacheeks McFroogleburgher says:

    Meh. I find it hard to care about this.. Haven’t sqenix been harping on about graphical tish and pish they’re working on for the next deus ex game? Priorities all wrong…

    • Press X to Gary Busey says:

      They’ve been doing stuff like that since FF7 was supposed to be a Nintendo 64 title.
      Throw tons of man-hours on tech. Realise it isn’t feasible for real-time. Use it in a pre-rendered cutscene, if at all.

  3. Urthman says:

    But..but..DirectX 12 bridges the Uncanny Valley! It’s the biggest advance since the PS2 emotion engine brought emotion to video games.

  4. skalpadda says:

    8k textures, 50+ shaders on single objects and bajillions of polygons – I’d be curious to know how this stuff affects the work load of artists and animators, as well as overall budgets now that the new console generations are becoming the norm.

    It’s a matter of taste for sure, but I’d much prefer to see exciting things happening in other areas, like AI and animation, rather than leveraging pure grunt to push numbers ever higher.

    • OscarWilde1854 says:

      Couldn’t agree more, we create this amazing technology and it seems the only thing they are interested in pushing forward is the video… how about we catch up the things I can do in a game vs how the game looks…

      Not that I have anything against a game looking incredible.. but in the 6 years since I built my current rig, I have upgraded my graphics card twice, and never had to upgrade my CPU (nor do I feel like I’m even close to needing to…) I wish they would push the “thought” and “processing” in games as much as they push the rendering.

      • mattevansc3 says:

        It’s a short tech demo highlighting the difference between DX11 and DX12 on the same hardware and how much better the new API can perform.

    • Shuck says:

      The demo seems to have been running on a machine with four Titans as graphics cards, so this isn’t exactly going to show up in games any time soon – especially consoles – at least not in any meaningful way. (It does, however, neatly show how little of a graphical improvement is provided by even a quadrupling of graphical processing power at this point.) But this does ultimately represent (another) multiplying of the cost of making a AAA game, when it does finally hit games. For a while there, having high-poly models and high-res textures in games was making things a bit easier in some ways, as artists were already using tools to create high-poly, high-res models/textures and having to “de-res” them for game assets. Of course you needed smoother and more detailed animations (esp. facial animations), more shader work, more careful physics integration, etc. that increased costs. But it seems like this represents yet another increase in detail and work required beyond what’s currently being done.

      • DanMan says:

        But it’s not like they have to start from zero every time. Once you have a skin texture and the attendant shaders, you can re-use them. From then on out you only tweak what you already have.

        As the Sqenix guy said, this is going into their engine, not a specific game. Game devs will then just use this stuff like “this is a stone”, so the engine knows what shaders to apply and so forth.

        • skalpadda says:

          That’s only the case if you’re making a game where you can reuse a lot of assets. The more expensive each unique asset is to produce the greater the risk that you’ll run into a wall where the scope of your design collides with the amount of assets you can afford to create.

          That obviously happens frequently anyway, but I’m still curious what the impact of this sort of thing will be in the future. AAA game budgets already seem to be at the point where anything which doesn’t perform fantastically risks death for the studio. If this means that studios either have to push even higher budgets or limit their design scope, I’m not sure that’s a good thing for gaming, even if it is shinier.

          • Shuck says:

            Yeah, and sometimes you are starting completely from scratch (it’s been true for a number of games I’ve worked on), and you never have a full library of the assets you need even in the best of circumstances.
            The number of AAA development studios has plummeted in recent years because a) fewer and fewer studios can raise the funds to pay for all the developers needed for AAA development, and b) failure not only becomes that much more disastrous given the costs, but the revenue required to be successful increases to the point where far fewer games can get the requisite number of sales.

    • SuicideKing says:

      More elaborate AI may become possible if DX12 manages to fulfill its promises of reduced CPU overhead in real-world games.

  5. BlazeHedgehog says:

    8K textures, huh? And so begins the countdown to the first 1 terabyte PC game, I guess.

  6. FreeTom says:

    Ah, so the main difference is that if you render it in DX12, the in-game characters collapse to their knees weeping. That really will change gaming as we know it. Can’t wait to see what the next COD looks like.

    • Vacuity729 says:

      CoD? I’m more looking forward to the effect it’ll have on the upcoming Duke Nukem reboot (not that there really is one, but hey).

    • SuicideKing says:

      CRY OF DUTY 10 OUT NOW FOLKS

  7. MultiVaC says:

    I think Crytek missed a great opportunity to pioneer this technology in the next generation of their Cry Engine.

    • Darth Gangrel says:

      They were pioneers in a way with Far Cry, but what was rendered there was still a far cry from actual realistic crying. I’d like to see this tech in the next Devil May Cry game, which then might be called Devil Will Cry.

  8. Monggerel says:

    YOU DOUBLE DECKERED PINK FARTS

    I WANT UGLY GAMES! UGLY! UGLY!! UGLYYY!!!
    UGLY GAMES!!! GIVE ME THosE! SO MY COMPUTER CAN RUN THEM AND MY WALLET DOESNT EQUAL MY ENGAGEMENT! UGLY LIKE THE GOOEY CORE OF PUNK! UGLY!!!!!!

    • DanMan says:

      I think your caps-lock is broken.

      • Monggerel says:

        Thing is, I’m the kind of bumblefuck that actually holds down shift to type in allcaps.

  9. vecordae says:

    Why is Morrigan crying?

  10. Xerophyte says:

    Not to state the obvious, but engineers (and artists, although there I am less sure) do not burst fully formed from the luminiferous aether with an encyclopedic knowledge of the DX12 API. The only way the squeenix people can learn how to make a renderer using DX12 is to make a prototype and figure out what works as they go along.

    In this case that prototype could also serve double duty as a semi-neat promotional demo that was sufficient to get them on reputable video gaming blog Rock, Paper, Shotgun, but that was very likely not the primary purpose of the exercise. I also wouldn’t read it as commentary on the future of video gaming as we know it, or what sort of games squeenix intend to make in the future and so on. It’s a small vertical slice to let people learn a new rendering API, plus see what sort of art workflows will be relevant in a few years.

    • caff says:

      Yup and it’s probably targeted more at potential funders than it is at gamers. Sadly a room full of suits will probably nod and think that because the characters have got better hair and “cry” potential, it’s an advance.

      However, I’m all for developers pushing the boundaries, because it only encourages further development in the hardware space too. After all, we’re going to need meaty graphics cards to power our 4K screens and Oculus Vives.

      • Blad the impaler says:

        To be fair, better hair has been the focus of the Japanese gaming industry for nigh on 17 years now. They’re very good at it.

  11. particlese says:

    Being a warm and easily amused sort, I know all too well what crying is: Suppose something amuses you quite a bit more than it should — for example, during biology class, a fish in the tank at the side of the room repeatedly swims away from the filter inlet and drifts back toward it, clearly having the most fun of all the fish in the tank. The awareness of that disproportionate amusement and that you probably shouldn’t burst out laughing during class causes further amusement. The awareness of that cause for amusement then leads to recursive amusement. The resulting internal explosion of hilarity causes runaway, stifled, convulsive laughter which includes the production of a salty liquid exuded from the tear ducts and the teacher asking if you’re all right. This is crying.

    I therefore have to call out Squeenix for failing to represent the entire picture here.