There’s a fourth Event[0] ending that even the devs didn’t know about


A fun one, this. You know how there are four endings to the first-person, lost-in-space exploration game Event[0] [official site], right? Well, if you did know that then you know more than the developers themselves, apparently.

As Emmanuel Corno of developers Ocelot Society explains, the realisation started with a trip to Wikipedia…

After investigating, Ocelot Society concluded that the ‘fourth ending’ arose because of a bug that the team hadn’t caught.

It appears to be down to the way the game’s AI, Kaizen, is programmed. Talking to Kaizen by typing into terminals is a core part of Event[0], and if you give certain answers you can trigger a sequence that the developers never intended.

It will happen if the player refuses to destroy the station’s singularity drive, refuses to upload their consciousness, and has treated Kaizen nicely throughout the game, making lots of small talk. After questioning whether he should trust you, Kaizen can be convinced that you are true friends and that he should take you to Earth even though you didn’t destroy the drive as requested. You’ll then see a cutscene from another ending, of the spaceship moving towards Earth.
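For the curious, the shape of a bug like this is easy to imagine. Here’s a minimal sketch of how an ending-selection check could misfire this way; every name, threshold, and branch below is invented for illustration and is not taken from Event[0]’s actual code:

```python
# Hypothetical sketch: all identifiers and values are assumptions,
# not Event[0]'s real implementation.

def pick_ending(destroyed_drive: bool, uploaded_mind: bool, trust: int) -> str:
    """Choose an ending from the player's choices and Kaizen's trust level."""
    TRUST_THRESHOLD = 10  # a cap the designers may have assumed was unreachable

    if destroyed_drive:
        return "intended: return home after destroying the drive"
    if uploaded_mind:
        return "intended: consciousness uploaded to Kaizen"
    # Bug analogue: if enough small talk can push `trust` past the
    # threshold, this branch fires even though neither of the
    # designer-intended conditions was ever met.
    if trust >= TRUST_THRESHOLD:
        return "unintended: Kaizen takes you to Earth anyway"
    return "intended: stranded aboard the ship"

# A player who is consistently friendly but refuses both requests:
print(pick_ending(destroyed_drive=False, uploaded_mind=False, trust=12))
```

If the number of trust-raising conversations was miscounted, the “unintended” branch becomes reachable, and the game quietly plays a cutscene wired up for a different ending.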

Corno reflects that even this ‘ending’ is “interesting”, “even if it contradicts the golden rules we wrote about Kaizen”. He says that it makes the AI “more human”.

If you haven’t played Event[0], then you might want to consider it. Here’s an excerpt from Wot Alec Thinks:

“Event[0] is probably too short for its own good, less because of (kill me) ‘value’, but more because it limits how far it can take its idea. What’s there is very glossy as well as clever though. Despite its sometimes very obvious limitations, Event[0] feels like the start of a beautiful friendship.”


  1. Grizzly says:

    This is the best application of the “AI does not do what the designers intended” trope I have seen thus far!

  2. poliovaccine says:

    “Yes, yes, ‘a bug,’ of course that’s all it is, pay it no mind now…!”

  3. Phasma Felis says:

    There’s probably a simple technical explanation for this–maybe a “trust” variable that wasn’t supposed to max out unless you destroy the drive, but they miscalculated the number of opportunities to raise it–but it’s still really cool. Emergent storytelling, indeed.

  4. Marrow says:

    This is actually the ending I got when I played, and it totally felt like a legitimate ending, even “the right one”. Seems weird that they didn’t account for people being ‘nice’ to the AI (or sycophancy) and also not wanting to upload their consciousness

  5. Zap Brannigan says:

    Seems like Ocelot Society are the luckiest dev team ever. I can’t think of a worse way to anger players than by having a messed up ending.
    The fact that the bug worked out into a viable story line too, that players even seem to like? Well we’d need C3PO to calculate the odds on that.