By Kieron Gillen on September 6th, 2011 at 1:23 pm.
The new Deus Ex is about many things, but ranking high amongst them is DRM. I’m not even joking. (The following article contains spoilers to the very end of the game.)
Human Revolution is a meditation on technology’s ability to make us more than human, and the dangers (both perceived and actual) therein. This is a universal, perpetual theme that’s interwoven with civilization. We have been made more than human since our great ancestor chimp picked up a rock and bashed the living shit out of something that was trying to kill it.
I’m a cyborg. I am able to write this today because the technologies of antibiotics and surgery were able to remove a malfunctioning organ when I was 24. If this were 200 years ago, I would have died of common appendicitis. Now, I sit with a hole in my belly, my physiology altered for the better as much as any of the cyborgs in Human Revolution. Equally, unless you’re crouched in the corner of my room, watching over my shoulder, you’re reading this via an internet technology developed to avoid the hammer-footsteps of nukes instantly removing the military’s ability to scream “Fucking hell! We’re fucking fucked!” at each other when the bombs started wiping them out. You could be in an office, or on a beach, or on a packed train or wherever, and my thoughts are being projected directly to you. And that in turn rests on the technology that made us more than human several thousand years ago, able to transmit thoughts across time and space. The written word allows Plato – 2400 years dead – to project his thoughts directly into the mind of anyone who bothers to read him.
And as technology changes what it means to be human, people question its merits. Let’s take Plato’s thoughts – through the mouthpiece of Ammon in his dialogues – on the very technology that allows him to reach us through the years:
“If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.”
Technology gives us power. And some of us always fear that the technology that is providing this power will also reduce us, to the point of ending us.
Right now? We’re all superhumans. And as time goes by, at least in the industrialised world, we’re all increasingly superhuman. It’s almost impossible to over-emphasise how much technology has made us titans. It’s been said the Colt was the great equaliser. For the last century, I’d favour your gran with a gun over any martial artist in the world. Today, I’d favour anyone with a basic understanding of Google over pretty much any scholar in matters of general knowledge. Frankly, after a prompt from Dan Griliopoulos, the previous two paragraphs are pretty much evidence of that. I appear more learned than I actually am by the power of a Google search. And that power is only going to increase over time.
Communal sighing for the loss of human achievement is misplaced. This is the fundamental story of humanity. We’ve only ever been what our technology has made us. And if it made us, will it destroy us? That’s been the story since Prometheus, and if we observed humanity from a distance, I suspect it’s the story aliens would tell about us. “Humanity: an animal on a little muddy ball with a nasty addiction to the steroid of technology”.
We live in the shadow of Oppenheimer. Nukes have the ability to annihilate human society in an eye blink. But nukes are nukes, held by larger bodies. What happens when people can turn themselves into something with the ability to alter society as much as a nuke? What happens when the technology means there is absolutely no need to pay anything but lip service to the society in which you find yourself?
It’s the end of the world. Even for people who are in favour of the eternal acceleration, it’s the end. The geek rapture of the Singularity is nothing if not an ending.
Human Revolution is about reactionary forces trying to put that genie back in the bottle.
Human Revolution methodically takes us through a host of ways an emerging technology is changing society. In doing so, it uses a science-fiction filter to show how technology affects any society. Take, for example, the memorable plotline in Shanghai where a trader from a poor background is in debt to a money-lender to pay for the cybernetics that let her compete on a level playing field with everyone else – people from backgrounds that can afford to pay for it. Even away from that particular incident, the wealthy draining money from those who must buy the advantages they need to prosper is all too visible in the world’s initial set-ups. While medically necessary, the anti-rejection drugs you’re on for life if you become a cyborg are fundamentally an endless mortgage on your life. You do not pay, you die. But if you don’t get the cybernetics, you sink to the bottom of society. What choice do you have but to become as much metal as meat? Technology is the key to temporal power, which is the key to economic power, which pays for the technology. It’s a vicious circle. And when bought-for advantages entirely outstrip anything a lucky genetic throw of the dice can give you, those towards the bottom of society are perpetually annihilated.
Of course, strip away the metaphor, and it’s already leaning like that. As a child, in a Midlands town, I was never aware of class on a day-to-day basis. It’s when I went to work at PC Gamer and realised that I was the only person on the magazine who went to a comprehensive rather than a private school that the penny dropped. In the decade and change since then, it’s got worse for young writers. Not willing or able to work as an unpaid intern? You may just be screwed.
Human Revolution’s point: the more technology advances, the more advantages it buys, and the more those unable to purchase it suffer.
There’s a flip side to it, though. While the bleeding edge of tech – what lets you operate as the best in the world – is beyond most, whatever’s lagging behind becomes ever more available. And older tech is still, in its own way, power beyond what you would have “naturally”. And to return to my nuke metaphor, when the gap in earnings and societal respect becomes a chasm, having a large underclass with access to even the original generation of nukes is a recipe for the aforementioned end of the world.
In Human Revolution, Hugh Darrow – Augmentation’s Oppenheimer – sees the world he’s created and can’t help but think it’s doomed. He does not trust people with the power he’s given them. Moreover, he doesn’t trust those who have power over others to act correctly. He knows what the Illuminati were planning – the ability to prevent anyone in the world mis-using the nukes beneath the skin. While we can easily say his bitterness is because he’s physically incapable of accepting augmentations himself – and therefore always going to be left behind by progress – he does have a point. This is what technology allows people to do, both the masters and the serfs. You are deluded if you believe that by putting technology into your body you make it yours. It is still theirs – and putting it into your body makes you theirs.
The Illuminati’s original position doesn’t really care about that. It holds that the masters are best for the people. If people are free to just do whatever they want, they’re going to destroy society. If everyone has a nuke, even if the vast majority can use it responsibly, it only takes a tiny minority deciding to mis-use it to bring ruination. So, by fair means or foul, there must be a way of enforcing discipline. By killswitching this world-ending technology, they maintain control. They have added entirely unnecessary functions to a piece of technology because they distrust human nature to use it responsibly and maintain a societal order.
At which point you see the DRM metaphor. The Illuminati’s plan is to put DRM into every piece of cybernetics to ensure that it’s not misused – or, if it is misused, that it can be prevented from causing widespread harm. Darrow’s murderous critique isn’t just that augmentations are dangerous – but that augmentations will leave you open to something like this. His problem is both what the augmentations let you do (“I can tear that dude’s head clean off if I feel like it”) and what they make you do (“They can make you feel like tearing that dude’s head clean off if they feel like it”). Some technology is just too dangerous for anyone to allow it to exist, because the safety-locks you “have” to add to it are just as ripe for abuse as the technology they exist to control.
David Sarif’s one of the most interesting figures in the game. Making a corporate head sinister has been a cliché (if an understandable one) for decades now, but Sarif has a Steve-Jobsian charisma to him. Of all the leaders, I like him most. Of all the leaders, my best instincts wish he was right. He believes we can trust people to push this tech as far as we can, because it’ll turn out all right in the end. Those who worry about it are just old people. The old always say the young are changing us for the worse. The young always say the old never understand. However, I find myself thinking: just because this argument will always happen – and has always happened – as long as there are young and old, it doesn’t mean that at any given time one side can’t be righter or wronger than the other.
As much as I like Sarif, I don’t trust his successor or his peers to act like him. And I don’t trust Sarif in five or ten years’ time, because time changes people. By then, he could be just like one of the Illuminati, looking for ways to maintain control. Or, to look at the world we live in, just because Google have been relatively lovely up to now, we shouldn’t necessarily assume that they’re not going to decide to be Doctor Doom in two weeks’ time and simultaneously blackmail every Google-account-owning person in the world.
How time changes people preyed on my mind as I finished Human Revolution. A decade ago, with the first game, I went with Tracer Tong and burned down civilization for hope of a more equitable, better world tomorrow. This time around, I sided with the people who wanted to put a DRM-chip inside every consumer’s head.
Yeah, I went with the Illuminati. How the young radical has changed, eh?
After seeing the world of Human Revolution, I simply saw a disaster. I was too in love with the concept of progress to accept Darrow’s option of simply nailing the door to the future shut. Equally, Sarif’s tech-utopia was mediated by corporate bodies. I trusted Sarif. I don’t trust the immortal, perfect, soulless machine of a corporation. And to leave it up to humanity? That was my second choice. My idealistic one. This is too big for any one man, especially me. However, in a world where bodies with no responsibility to the public wield so much money and power, I couldn’t see the world deciding anything other than what the corporations told it to think. Or, at least, not until it was too late.
To cite Plato’s famous peer Uncle Ben, with great power comes great responsibility. The anarchist in me would like to think that in times to come people would learn to live without the controls. But whoever I am today thinks that without some enforced restraint, the dark future of Human Revolution simply becomes no future. I simply don’t believe that enough people in Human Revolution’s world would do the right thing. Not yet. Maybe not ever.
So I guess that’s my confession. Human Revolution convinced me that in certain situations, to prevent a certain abuse of technology, I’m totally fine with the sort of draconian DRM that even certain major PC publishers would think a trifle harsh.
I’m one of them now. Shit.