Eliza is a visual novel about Evelyn, a woman starting work as a ‘proxy’ for the eponymous service. The service is counselling by algorithm. The proxies sit and listen, while clients say whatever they have to say, and the system takes measurements of things like heart rate, vocal stress and such, before analysing keywords used and delivering a reply. You read the script it generates, and nothing more. That’s the job.
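The fictional service shares its name with Joseph Weizenbaum’s 1966 ELIZA chatbot, and the loop the review describes (listen, match keywords, read out a scripted reply) can be sketched in miniature. This is purely illustrative: the keyword rules and responses below are invented for the sketch, not taken from the game.

```python
import re

# Toy ELIZA-style responder. Rules pair a keyword pattern with a
# scripted reply template; the first match wins. All rules here are
# invented examples, nothing like the game's actual system.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]
FALLBACK = "Please go on."

def respond(utterance: str) -> str:
    """Return the scripted reply for the first matching keyword rule."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return FALLBACK

print(respond("I feel overwhelmed at work"))  # Why do you feel overwhelmed at work?
print(respond("Nothing matters"))             # Please go on.
```

The fallback line is the tell: anything the rules can’t parse gets a canned deflection, which is exactly the failure mode the review goes on to describe.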
The game itself is about everything to do with that. Counselling. Crunch in the tech industry. Ethics and isolation and empathy, and Men In Tech. And it’s about recovery. You get dialogue options here and there, but until the final act there’s not a lot in the way of big decisions. I mulled over those closing decisions for longer than I’ve thought about many I’ve made in real life. Indeed, if it seems I’m sticking to the slightly dry facts in this intro, it’s because if I start talking about how much this game has spoken to me, I don’t know if I’ll be able to stop. It is doing so much. I have lost sleep thinking about it. And I am glad.
You learn the job before anything else. The counselling sessions are short by necessity, and at first, the atmosphere in the sections between them is sterile. Not in terms of being dull to play, but in the feel of the office and the implied culture of the business you’re now working for. Then you have lunch with an estranged friend, Nora.
Nora is wonderful.
It’s in talking with her that you begin to learn the real story. It turns out Evelyn made Eliza. She, Nora, and some others created the core of the program, and two of the men involved have since built an empire on it. You catch up with them, too, as they’ve since had a very public split. Rainer, the businessy one, is a manipulative, cold-hearted hypocrite. Soren, the idealistic one with the counselling experience, has some fancy new technology he thinks could end human suffering. Soren is deeply miserable and wants to help people… but he’s also an insufferable creep, who hints so clumsily and spinelessly at his obvious attraction to basically every woman at least 15 years younger than him that I don’t even care if he does change the world. We’ve all known a man we pitied but couldn’t trust.
Then there’s Rae, your supervisor, who I mistook for a stone-cold true believer until we talked about why she does the job. And Erlend, the young one who’s far too good for the industry he’s in. My only real regret at the end of the game was that I had to leave someone behind. And then, of course, there are the people who engage in the counselling sessions, who you know so little about, and can do so little for.
I could talk about every character in this game all night. They’re incredibly well written, well thought out, and — with one minor exception — acted wonderfully. And woven through them all, there’s the inescapable point of it all, the social context that’s ultimately what has driven you here: the suffering. The endless, terrible pain and trauma and fear and sorrow suffered by countless millions, everywhere. It’s overwhelming. It’s invisible. It’s inextricable from your choices about your life. Caring about people hurts.
This is a beautiful game not just about trying to help people, but about the desire to help people. What do we do with it, in a world where anything useful we create will be taken by wealthy men and used to suck more marrow from our bones? Should we walk away from something that isn’t working, or stay and try to influence it for the better? Is it right to abandon this project that might be doing some good, just because it also contributes to the same parasitic system basically everything else is also shackled to?
How do you get back into life after a long period of grieving and isolation? Can you go back? Should you?
That’s Evelyn, the player character. I have never identified with a character in a game more. Aily Kei’s vocal performance is the jewel in an already glittering crown, and combined with a flawless script and art… hell, even the music is fantastic, especially in the final act. There’s one scene at a gig that has a bass drop, for god’s sake.
I love Evelyn so much. She’s quiet, intelligent, empathetic, and she’s healing. Most games and stories about mental health focus on someone in the grip of depression or abuse or addiction or whatever else. Evelyn isn’t. Evelyn is in recovery. The liminality has her. It’s a space few talk about, and in many ways it’s harder — and makes you more vulnerable — than all the awful shit you’re climbing out of. The armour is coming off, and suddenly life isn’t a thing that happens to you, it’s something you have to ride.
Things haven’t been okay for Evelyn for a few years, but the game isn’t about the details of that. There are no details of that, really. Times like that are as weirdly boring as they are painful and difficult. She pushed a lot of people away, not that there were very many to begin with. It’s hard for her to decide how much of those old relationships she should salvage. How much she even can, given what she’s been through, and what she’s missed.
She also types out filler words and ellipses in texts. IT ME.
There’s a glorious bit late on where you get the option to go off script. It’s like Evelyn is done observing and thinking and waiting, and now she’s… she’s angry. She’s angry in the way only a kind person can be. You can still choose to behave, but it’s already clear that doing so isn’t helping these people the way Eliza was supposed to. It’s a system that thrives only because there’s nothing else for people who need help.
So Evelyn intervenes. She starts to disrupt the technocratic, algorithm cultist bullshit. It’s probably futile, since it won’t change the culture. But it’s a statement. And it might help a few people – really help them.
One of the clients asks a good question during the Eliza AI’s closing script. The software ignores her completely. It can’t deal with it, just as it can’t deal with hostility or rejection or any of the situations real counsellors face: clients who lie, who don’t tell you anything, who will chat about nothing for months without saying what they really want to talk about. Things that take more than a statistically probable solution.
The problem with the Eliza AI isn’t just the awful corporate terminology — every session ends with the same sloganised spiel and starts with the same fake small talk. It’s not the grotesque gamification the proxies are subjected to — you gain XP after each session, and the corporate system even has achievements (plus they lie and manipulate data anyway, “for better results”; the sheer arrogance of it). It’s not even the horrible parts where you’re given permission to go through clients’ emails and chat logs. Oh, it’s legal, and they gave it voluntarily. But they gave it in a fucking counselling session, and here you are reading a script that directs them to an EULA. It turns my stomach. But that’s still not what the problem really is.
The problem is, it’s a system made by people who don’t really know what it’s like on the ground. Not just people who’ve never been psychiatrists, but who’ve never lost their home, who’ve never worked in a library in a deprived area. People who’ve never been poor, or sick, or Black, or elderly, or all of the above.
People, in short, who don’t know anything outside their bubble. The easiest example is how immediately obvious it is that the “treatment” system is entirely dependent on a user having a modern smartphone. One woman mentions that her phone isn’t working, and it probably wouldn’t run anything new anyway. The AI script instructs you to tell her that a desktop computer is also an option. And I’m copying this next quote directly from my notes: “I am typing this while Eliza says it. I’ve worked in libraries. I bet 50p this woman doesn’t have one at home.”
The woman replies: “Oh, well then, I’ll just have to go down to the library at some point.”
You see it all the time in those jobs, or maybe even in your own neighbourhood. A third of people are winning. Another third are being dragged along against their will. The rest are left behind by a world that only cares about them so far as their suffering, in aggregate, might be monetised. Ironically, Eliza is little different to conventional therapy in one regard: it’s hopelessly inadequate when the problem is society itself.
With one of the big tech guys, you diplomatically suggest that hoarding data on the private counselling sessions of thousands of people is bad, saying that “the potential for misuse seems kind of high”. His response is a perfectly dismissive “sure, we’ll work on those things” Tech Guy response. He says it just right, so that he can deny that he dismissed the problem. I hate him. I hate the thousands of him doing this to us all every day. But whatever my objections, this is probably inevitable. If not Eliza’s owners, some other billionaire club will do it. Evelyn’s in the rare position of maybe being able to influence it, at least.
The climax of the story, for me, was when Evelyn finally speaks out, and not even to anyone specific. She’s been pensive, unsure of anything throughout the game, and then it all comes out. It’s a perfect scene. The performance is understated, but sincere. She’s starting to make up her mind. Finally having her say, after not just what you’ve seen her through in the game, but everything she’s been struggling with in her life. It’s not an outburst. It doesn’t even offer any answers or solutions. It’s just who she is. It’s what she needs to say to get it out and let herself make a decision about it. It’s what a lot of good counselling sessions come down to.
Eliza ends with a near-traditional VN decision about who to pick, but instead of choosing which stereotyped anime girl to boff, or what stock ideology to confirm that you already had, you choose what to do with… everything. None of the options will let you solve all the problems. There will be compromises, and whatever option you go for, life will move on, things will change, people will drift. This is all fleeting. Who do you think you could live with now? What part, if any, of your old life was worth salvaging?
I chose to sack it all off and start a new life with Nora. The big fancy tech lads suddenly stopped fighting each other to employ me after that. Maybe I should have stayed and kept trying to save the world. Maybe the music we made together would help save a few people. But you have to save yourself first.