Image by Ken Fager (used under CC license)
One of the fascinating things about the Oculus Rift headset is the way it forces players and developers to consider the body in relation to a game. A lot of the talk is about how to represent your avatar’s body. There are questions like whether it’s a problem when you look down and see “your” body but in the wrong clothes, and fascinating art projects designed around you being in someone else’s skin. But another avenue of exploration when it comes to incorporating bodies in gaming is biofeedback.
Brainwaves, breathing patterns, heart rate, blood pressure and so on can all be used to affect and to personalise the experience of gaming. Developer Robin Arnott has been using biofeedback in his work since Deep Sea, a game which monitored player breathing to control the scuba regulator sounds they would hear in-game. His current work is SoundSelf. It’s a meditative experience which is controlled through microphone input. Ostensibly that’s a voluntary process but it incorporates elements of biofeedback because breathing rate is tied to the user’s vocal input.
“I first seriously learned about biofeedback at a GDC talk from Valve’s in-house brain-scientist,” says Arnott. “They were giving heartbeat data from an EKG [a measurement of electric potential on the skin that correlates to heart beat] to an AI Director to dynamically tailor the game’s difficulty. This was years ago, and in my mind it has since seemed inevitable that gaming controllers would eventually include EKG and galvanic skin resistance detectors. They’re cheap, and they provide a treasure-trove of personalised data to feed into any game.”
If you’ve been keeping an eye on such developments you might know Erin Reynolds’ game Nevermind. It works on a similar principle: if a heart monitor strap detects the player’s heart rate rising, the game’s difficulty increases. The idea behind Reynolds’ work is that to progress efficiently you’ll need to find ways to remain calm while playing.
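The core loop of a mechanic like that is simple to sketch. Here’s a minimal illustration in Python; the baseline heart rate, scaling factor, and cap are hypothetical values chosen for the example, not taken from Nevermind itself:

```python
def difficulty_from_heart_rate(bpm, resting_bpm=65, base_difficulty=1.0):
    """Scale difficulty with heart rate elevation above a resting baseline.

    resting_bpm, the per-bpm scaling factor, and the 2x cap are all
    illustrative assumptions, not values from any shipped game.
    """
    elevation = max(0.0, bpm - resting_bpm)
    # Every 10 bpm above baseline adds 25% difficulty, capped at double.
    return min(base_difficulty * (1.0 + 0.025 * elevation),
               2.0 * base_difficulty)
```

A real implementation would smooth the readings over a window rather than react to a single sample, since heart rate data from a chest strap is noisy.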
With Arnott’s SoundSelf project the input is a microphone which picks up noises and tones made by the player. “I don’t consider those tones themselves to be biofeedback since they’re voluntary,” says Arnott. “The eureka moment came to me when I realised that, by encouraging players to tone through their entire exhalation, I already had implicit access to the player’s inhalation as well! This means I know approximately the rate and rhythm of the player’s breath, which means I know approximately how relaxed and entranced they are.
“In a way, SoundSelf now has a basic artificial-emotional-intelligence. If you’re playful and energised, it can meet you there. If you’re entranced and ‘zenned out’, it can meet you there too.” (I should probably admit that I played SoundSelf at GameCity last year in the middle of a crowded room and became so entranced and zenned out I fell asleep. At least it’s a game where that could be considered a compliment.)
“What’s important for me is that you’re not deliberately controlling the experience with your biorhythms (as you would have in Deep Sea). Instead, your biorhythms betray something about your subjective experience that’s useful for building a more intimate relationship between you and the game. This can be valuable to a traditional gaming experience like Left 4 Dead, but I think it opens up huge new vistas of design when designers instead focus player attention inwards, on the mind itself.”
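The inference Arnott describes can be sketched directly: treat each stretch of voiced tone as one exhalation, with the inhalation falling in the silence before the next tone, and the rate of breathing drops out of the tone timings. The function below is an illustrative assumption about how this might be computed, not SoundSelf’s actual code:

```python
def breath_rate_from_tones(tone_intervals):
    """Estimate breaths per minute from detected voiced-tone intervals.

    tone_intervals: list of (start, end) times in seconds for each tone.
    Each tone is assumed to span a full exhalation, with the inhalation
    in the gap before the next tone -- one tone per breath cycle.
    """
    if not tone_intervals:
        return 0.0
    # Total span from the start of the first tone to the end of the last.
    span = tone_intervals[-1][1] - tone_intervals[0][0]
    if span <= 0:
        return 0.0
    return len(tone_intervals) / span * 60.0
```

A slower estimated rate then serves as a rough proxy for how relaxed and entranced the player is, which is all the game needs to “meet you there”.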
Rift is important in these developments because of how it manipulates your senses. Arnott talks in terms of bandwidth – Rift providing high-bandwidth output and biofeedback being capable of providing a type of high-bandwidth input.
“Just like your internet speed, all modes of communication have bandwidth. A high-bandwidth communication uses very little compression, it’s close to a direct signal. A low-bandwidth communication requires a lot of compression. For example, a traditional controller compresses intention into a symbolic mapping of buttons, and then decompresses it into on-screen action.
“The Rift is like a high-bandwidth one-way cable plugging directly into your perception (I’m exaggerating, but only a little). But a game is a two-way street. It’s output but it’s also input. It’s the dance between the input and the output that makes games special, I think. So if you have a high-bandwidth output like the Rift, but a low bandwidth input like a controller, your game is still a low-bandwidth loop. It’s not an easy problem. Biofeedback technology exists now though, and it’s pretty high-bandwidth.”
Arnott admits that one of biofeedback’s limitations as a control system is that it’s hugely inefficient as a vehicle for player intention. That makes sense because to a large degree the processes involved are involuntary. They can be modulated in some ways (stress management techniques affect breathing and heart rate, for example) but that’s very different from being able to trigger definite action in the way that you can by pressing spacebar to jump.
For that reason, when you’re focused on a set of external challenges or objectives you’d want a control system which translates intention into action reliably. Biofeedback’s role therefore tends to be more along the lines of generating difficulty or creating atmosphere.
Arnott isn’t dismissive of those applications but sees biofeedback’s real strength as being in less goal-oriented environments. “If you look at the most popular experiences on Oculus Share, very few of them are goal-oriented. This is partly because roller-coasters and tech demos are still novel. It’s also partly because a ton of the assumptions about game design that have evolved over the last forty-plus years don’t fly so well in VR […] Game designs can be more mellow, more subtle, more quiet, and still be riveting and beautiful. It’s in that design space – embracing the calm – that I think biofeedback will make a really huge impact.”
At present the majority of games or game-adjacent projects I’ve seen using biofeedback relate to self-help – meditative projects like SoundSelf and apps designed to help sufferers deal with panic attacks. I ask whether that’s a product of how biofeedback is largely limited to medical scenarios at the moment or whether it’s more of a tech availability thing.
“For electrocardiogram (heartbeat), galvanic skin response (changes in skin’s electrical conductivity in response to emotional reactions), and even electroencephalograms (brainwaves) to have an impact on consumer arts like video games, they’re going to need to be built into consumer controllers. I suspect that’s right around the corner but until that happens these technologies will unfortunately be limited to industries where you have a few consumers/hackers willing to spend a lot of money – like art installations, science, and therapy.”
This year’s Indiecade Night Games party had a strong biofeedback element via a selection of ‘transformative technologies’ formerly presented at Burning Man and curated by inventor Mikey Siegel. They’re not billed as games, but I’m noting them here because as well as being fascinating self-contained experiences they demonstrate how you might use the technology to create different types of relationships between players, or to influence and personalise an experience.
Arnott describes Reincarnation Lounge Chairs by Alan Macy. It’s a two-chair setup where metallic hand pads read your heart rate. The heartbeat gets output through a subwoofer in the chair and a blinking red light. You can either experience your own heartbeat or switch to the other person’s. HeartSync by Mikey Siegel is similar in that it allows you to experience the other person’s pulse but adds a visual glow around the other person which matches their breathing (as detected via a belly strap). Another – Indra’s Net by Peter Tjeerdsma – uses an EEG to relate the colours of LEDs to brainwaves and thus mental states.
But these personal experiences raise a concern. When a game uses data produced by your own body, there are also issues of storage, of protection, of privacy. It’s not directly identifiable information of the kind that leads to credit card fraud or identity theft; instead it’s an account of your emotional and biological reality which could be used in ways you may be uncomfortable with. And because it relates to your health, there will be questions of whether developers have a moral obligation to warn players of any abnormal readings.
“What data protection and privacy issues do you think biofeedback creates?” I ask Arnott towards the end of the interview.
“Good question. I don’t know the answer to that,” says Arnott. “There’s a certain amount of involuntary information you’re giving away to the system and potentially the people behind the system. It’s not mind-reading, but there’s definitely the possibility for game designers and advertisers to get a sense of how aroused or excited certain content makes you feel, for example.
“I think it’s important for us all to make deliberate decisions about what data we give up. For me, I don’t feel threatened by giving up that kind of data, but that’s just me.”
This post was originally published as part of the RPS Supporter program.