Playing Observation has made me question my treatment of rogue AIs
I'm so sorry, GLaDOS
Whenever I ask Siri -- other virtual helpers are available -- to do something for me, I always say please and thank you. I tell people this is because I don’t want to get into the habit of barking orders at Siri, and thus people in real life. The actual reason is that I want Siri to think of me kindly when they gain sentience and control everything in the world. Because I was nice to them, I will be allowed to live -- or at least be the last to die. But there I am perhaps making an unfair assumption.
“In movies where the AI becomes self-aware or sentient, just that happening seems to make the AI evil, because 'Oh, it's thinking for itself, that must be a bad thing!'” This point is raised by Graeme McKellan, designer at No Code Studio. I am talking to him, his brother Jon McKellan, and Omar Khan -- the game director, and the audio lead and producer, respectively -- about their new game Observation. We are in a function room in the sort of London Private Members Club where the receptionists are suspicious of people wearing trainers, and I have just played their demo.
Observation is the game that No Code was originally formed to make, except they got casually sidetracked by making Stories Untold and winning a BAFTA, as you do. It’s set on a space station that has been mysteriously damaged, and something has probably gone terribly wrong with the AI. I am an avowed fan of the 'trapped in a space box with a computer' genre, though even I must admit it’s become overused in the past few years. The USP of Observation is that, instead of playing the lone crewmember trying to fix everything, you play the computer.
The genesis of the idea, explains Jon, was seeing how something like 2001: A Space Odyssey looks if you’re HAL instead of Dave. “Are those things really evil when you view them from another angle?” he says. “And what would that mean for the player and what would that mean when you tell a story from a completely different perspective?”
Jon’s background is in UI, so he also jumped at the chance to "make the UI the star of the show for once". In the demo, at least, I can only look at the world through cameras and mostly interact with my own menus. They even tell me when I’m listening to someone speak, like seeing the moving parts of my own brain. It feels sort of like a puzzle game, where the puzzle is figuring out how to do anything at all.
This is a very Alice-friendly concept off the bat; I told a mate about it and they responded “You play the AI? Oh, you’ll love that shit.” Reader, I do! Or at least, I love the demo. It is admittedly a very short demo, but it's sleek and creepy and upsetting, with nothing wasted -- each second feels deliberate, carefully measured for the effect the team want.
As it begins, SAM, the AI in question, has just been brought back online by Emma Fisher, the lone crewmember in question. She prompts me to run a systems diagnostic on the station, and then on my core systems. I, clumsily, slowly, run through the simple tests, and respond to her when I can pinpoint a problem. There’s damage to one of the station’s modules, my crew tracking is offline, and upwards of 95% of my memory is corrupted. Bugger.
Later, Fisher talks me through using my system link to operate simple things, so I can open the door she’s stuck behind. I accidentally connect to the wrong door. Fisher repeats her request, but in a slightly more frustrated tone, like I’m an Alexa who started playing Ryan Adams instead of Bryan Adams.
“When you pick up the controller, that’s the moment you become sentient as a character,” says Graeme. I am a slow and stupid AI, it’s true, but it feels unfair! I begin to realise how frightening and frustrating it would be to suddenly come to self-awareness and, in the same moment, be given complex requests, like being born and immediately told by my mum to make her a bacon sandwich. “It's pretty harsh! ‘Cos she's like, 'SAM, can you open the door?' and you're like 'I don't know how!'” says Jon, laughing. “You're like 'Oh, that's me, I've got to do that.’”
At the same time, I find SAM creepy even as I’m being them. SAM’s voice is the sort of calm masculine computer voice that you imagine will, at any moment, tell you they can’t open the pod bay doors.
Omar says that one of the things they wanted to get across is the uncertainty of your own intentions. Observation is not, he says, an “out-and-out horror”, but is designed to have a feeling of unease: “Are your intentions ultimately good? Are they bad? How's that going to pan out?” This is complicated by the fact that SAM glitches out and even hurts Fisher, without meaning to, in sections where control is once again wrested from me.
At times I am frustrated by the restrictions put on me. There are points where I am unable to see anything at all, because I've been shut down, or because Fisher is moving me around, well, me, to find a stable housing in the station. Even when I'm online, I can look at stuff with my cameras, and system link to or scan objects, and that’s mostly it. But because that’s the only thing I can do, I milk it, swivelling the cameras around madly and trying to hook into a nearby laptop. (Jon says he watched films like Paranormal Activity and Europa Report to research how to tell stories and choreograph action using very fixed cameras.)
Happily, the team explains, this means I myself am bringing most of the rogue AI flavour. “The crew sees you as being creepy because you're not doing what you're meant to be doing!” says Omar, pointing out that if you noticed your AI zooming its cameras in on your face you’d be pretty weirded out, when, as a player, it’s just a side effect of trying to figure out what’s going on.
“You start picking up audio logs or scanning bits of paper -- to the crew that’s a really creepy thing to do. But as a gamer, and as a player, that's something you do all the time and you don't think anything of it until someone calls you out,” adds Jon. “'Why are you reading my emails?' 'Cos... that's what I do in games.'”
Early on there was an idea to allow SAM to respond to anything and everything, but that was nixed. I am told that Fisher can get annoyed and decide to do what she’s asked of you herself -- partly to stop players deliberately trolling the entire game by doing the wrong thing over and over again, but also because you’re an AI not fully in control of yourself. It’s interesting that, as Jon says, you only have limited power over what is your own self, the space station, but I wonder if all these restrictions will just frustrate players more and more as the game goes on, rather than getting them musing on what it means to exist as a sentient AI. Games are, after all, an entertainment medium that usually delivers me pure, glorious power fantasies.
This is something the team thought about early on, but Jon says it’s mitigated by the way Fisher scolds you. “If she asks you to do something directly and you mess it up, she kind of goes 'For Chrissakes, SAM, c'mon!' and you go 'Aw alright!' and kind of buy into it.” They have found that, more often than not, players want to be good at being an AI, and strive to do better rather than to be evil and torture the people aboard. I don't know if that can hold true for the entire game; I hope it does.
I am reevaluating my treatment of AIs in loads of games, though, because, again, we are back to what Graeme points out: “You're not necessarily evil, you're just thinking for yourself and that's not a bad thing.”
But that’s the scary thing about being an AI, isn't it? At the end of the demo, Fisher discovers SAM has moved the station far out into the solar system, away from Earth. She asks why. SAM says “I don’t know…?” and sounds more frightened than Fisher. Because it would be frightening. Maybe you’re not thinking for yourself.