Quote Originally Posted by lithander
If anyone wants to post this wall-of-text on the forum to get some discussion started I'd be eternally grateful! :)


In interviews with RPS, Bethesda's Todd Howard says: "I think people discount graphics. [...] I do feel that graphics and your ability to present something that feels new, real, and believable puts people in that environment where they can really enjoy what they’re playing."

Obsidian's CEO Feargus Urquhart says: "A lot of the other systems in role-playing games, they all work awesome and people love them. They still need to evolve and move forward a little bit, but what should combat be in that next big role-playing game? That’s one of the things we’re trying to zero in on."

We just need better graphics and a little work on some of the genre's sub-systems, and, voila, here's the pinnacle of what video games can aspire to? Really?

In my eyes, their stance lacks all ambition! I can only hope that what the games industry is "zeroing in" on is merely a local maximum.

I'm not the CEO of anything, but here's what I have to say regarding the current state and future of video games.

I've wanted to create games for the better part of my life. I enjoyed playing video games, but what fascinated me was their as-yet-unfulfilled potential. Hardware was growing more powerful at a staggering pace, and games were the only kind of consumer-oriented software that would motivate people to upgrade their systems every two years. It was this synergy between hardware and video games that allowed both industries to prosper. We were witnessing the birth of a new medium, and I wanted to have a part in shaping it.

Moore's law, the prediction made in 1965 that the complexity of integrated circuits would double every two years, remained surprisingly accurate for almost half a century. The improved capabilities of electronic devices changed our lives profoundly. When I surf the net on my smartphone, read a book on my Kindle, or rely on Amazon, Google, Wikipedia, or a navigation assistant to solve most of my day-to-day problems, I realize how technology has advanced way beyond what I could have imagined 15 years ago.
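Just to put that doubling in perspective, here's a back-of-the-envelope sketch (assuming an idealized, perfectly clean two-year doubling period, which real hardware only approximates):

```python
# Growth factor implied by an idealized Moore's law:
# complexity doubles once every two years.
def moores_law_factor(years, doubling_period=2.0):
    """Growth factor after `years` at one doubling per `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(15))  # ~181x over the 15 years I mention above
print(moores_law_factor(48))  # ~16.8 million-x since the 1965 prediction
```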

Video games, on the other hand, turned into a disappointment. Not because my career plans failed - I studied multimedia engineering and make my living in the games industry. But the product is not what 15-year-old me would have expected video games to be like in 2013. Not remotely close. That's a thought-provoking revelation for someone who devotes most of his waking hours to games.

I'm not denying that video games have come a long way. But what improved the most are surface values. And along with the audio-visual quality, our expectations rose, too. Just try to play a "timeless" classic again and you'll have a hard time immersing yourself like you used to. We got addicted to a fidelity level that comes with a high price tag attached. Losing the ability to enjoy our favorite games like we used to is the least of it.

Early video games were defined by their hardware platform. For lack of better options, the state of the art is always good enough. But it's easy to imagine what a game would be like with more colors, higher resolution, better sound samples, smaller polygons, and better textures! Improvements like that are easy to imagine and easy to sell. And last but not least, they are optional, too. Only a few developers dare to alienate the majority of potential buyers by requiring a top-end rig to play their game. The sensible approach is to target some baseline spec and add optional eye candy to keep the system busy. Hardware vendors adapt, designing gaming-oriented hardware aimed at maximizing audio-visual payoff. Rendering capabilities become the benchmark for gaming hardware, and consequently their owners prefer to buy the games that make the best use of them; both games and hardware evolve with a strong development and marketing focus on visual quality.

Iterative improvement of what's come before is the modus operandi of any industry. Hit close enough to what players know and like, just make yours a little better. Iterating on gameplay mechanics is risky, but who'd argue with higher fidelity? Unique selling points beyond presentation are entirely optional.

It's a perfectly reasonable approach for the developer to take. For the industry as a whole it leads into a dead end.

The addiction to high fidelity becomes a limiting factor of its own. All that high-quality content is very expensive to make. Development of a AAA title keeps large teams of specialists busy for years. Millions in investment are at stake in a hit-driven business. Smart money looks for franchise potential. An environment like that doesn't promote taking chances.
Worse, for the sake of quality you rely on content that is very inflexible: static level geometry, baked lighting, hours of canned animations, hand-animated or performed by human actors. Thousands of lines of text have to be voice-acted, too. Don't forget the lip syncing! In fierce competition you can't afford to let the player miss out on millions worth of content just because you want to provide some room for meaningful decisions. So the real challenge is to fool the player into believing he's in control, when in truth every turn of events has been carefully planned and scripted to maximize asset use. Interaction is predetermined or insignificant.

Of course, there's always a balance to maintain between player and authorial control. If movie-like aesthetics are your goal, the current approach makes sense, but let's not forget that photorealism isn't required to create immersion. Our mind is capable of forming a mental concept of things not actually present. A medium engages the recipient on a creative and emotional level by guiding his imagination. On the surface, books offer only language encoded in little symbols, but the story that unfolds is not constrained by that. In comic books, the action takes place between the frames. Our mind provides closure for missing elements. We build an internal model of the fictional world based on the input we receive, regardless of how abstract it is, as long as we can interpret and integrate it effortlessly. We make predictions, and the quality of our experience depends on the accuracy of these predictions as the story unfolds. All parts of a piece of fiction have to fall into place or immersion is broken. Of course there's room for surprises, but they have to make sense in hindsight. What happens has to accord with the laws of the imaginary, secondary world. The moment disbelief arises, the spell is broken, and you're back in the primary world.

In modern video games, the contrast between pseudo-realism and the emptiness behind its surface makes suspension of disbelief hard to maintain. Welcome to the Uncanny Valley!

At the time I made my career choice, I expected games to evolve towards emergent gameplay. To allow that, games need to provide systems that are inherently interesting to interact with. They shouldn't have to rely on extrinsic reward mechanisms (like achievements or, yes, the storyline) to hook the player.

Given that games are unique in how they allow us to interact with them instead of presenting their content in a fixed sequence we can only witness unfolding, the potential of video games still seems amazing. But what matters beyond surface values is that the world we're visiting forms a coherent, predictable system that always plays by the rules. Tetris meets those requirements, but humans like to relate to and interact with equals. I hoped that we would find a way for autonomous agents to be more than shooting targets. Given a simulation that can integrate our actions accordingly, there would be nothing to stop us from employing our own creativity to solve challenges or from developing real feelings towards virtual characters. This is the key to genuine player agency - everything we deal with today is fake and make-believe, a waste of resources on facades.
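To make that concrete, here's a toy sketch of the kind of rule-driven agent I mean (every name and number below is made up for illustration, not taken from any shipped game): instead of following a script, the agent scores its available actions against its current needs and picks the best one. Behavior like fleeing from a threat then emerges from the rules instead of being canned:

```python
# Toy utility-driven agent: actions are scored against current needs,
# so behavior emerges from the rules rather than from a script.
ACTIONS = {
    # action: (hunger relief, rest relief, safety relief)
    "eat":   (0.8,  0.0,  0.0),
    "sleep": (0.0,  0.9, -0.1),
    "flee":  (0.0, -0.2,  0.9),
}

class Agent:
    def __init__(self):
        # How urgent each need is, from 0 (satisfied) to 1 (desperate).
        self.needs = {"hunger": 0.5, "rest": 0.5, "safety": 0.1}

    def choose_action(self):
        # Score each action by how well its effects match the urgent needs.
        def utility(effects):
            hunger, rest, safety = effects
            return (self.needs["hunger"] * hunger
                    + self.needs["rest"] * rest
                    + self.needs["safety"] * safety)
        return max(ACTIONS, key=lambda a: utility(ACTIONS[a]))

agent = Agent()
print(agent.choose_action())  # "sleep": rest gives the best payoff right now
agent.needs["safety"] = 0.9   # a threat appears...
print(agent.choose_action())  # ..."flee" wins, without a single scripted line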

Video games were already pretty fascinating and immersive in 1990. I just assumed that they would become increasingly more fascinating and immersive at the same rate as hardware grew more powerful. This is where I erred. It's the same fallacy that led AI researchers in the 60s to boldly claim that AI would surpass human intelligence within a generation: the belief that all the constraints are technology-based.

We still have no good understanding of what intelligence and consciousness are, let alone how to simulate them. But in 1997 the chess computer Deep Blue won a match against the world champion Garry Kasparov. Deep Blue wasn't intelligent. It won thanks to sheer number-crunching power, a huge database of opening moves, static analysis of thousands of recorded games, and a lot of fine-tuning. Eventually the hardware power at their disposal sufficed to compensate for the lack of a truly intelligent approach.
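The core idea behind that kind of brute force is embarrassingly simple. Here's a minimal sketch of fixed-depth game-tree search (a generic negamax, not Deep Blue's actual code; the toy "21 subtraction game" demo is my own illustration): search every line of play to some depth and trust a handcrafted evaluation at the leaves. No understanding of chess required, just raw throughput.

```python
# Generic negamax: exhaustively search the game tree to a fixed depth
# and score leaf positions with a handcrafted evaluation function.
def negamax(state, depth, evaluate, legal_moves, apply_move):
    """Return the best achievable score for the side to move."""
    moves = legal_moves(state)
    if depth == 0 or not moves:
        return evaluate(state)
    best = float("-inf")
    for move in moves:
        child = apply_move(state, move)
        # The opponent's best score is our worst, hence the negation.
        best = max(best, -negamax(child, depth - 1,
                                  evaluate, legal_moves, apply_move))
    return best

# Demo on a toy game: players alternately remove 1-3 tokens from a pile;
# whoever takes the last token wins.
def moves(n):   return [m for m in (1, 2, 3) if m <= n]
def take(n, m): return n - m
def score(n):   return -1 if n == 0 else 0  # side to move has already lost

print(negamax(10, 10, score, moves, take))  # 1: the first player can force a win
```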

Parallels with the game industry are glaringly obvious: there's "Game Theory", but it doesn't help us understand the human heart and mind. It does explain a lot about rational decision making, so it has its merit if you want to understand the current state of global economics, which is in large part driven by algorithms and financial models. But humans - they are a lot less rational than economics used to believe. So, as far as video games are concerned, we rely on limited interactivity, canned content, storytelling tricks borrowed from other media, off-the-shelf engines, and established gameplay mechanics to produce endless variations of the same proven themes and genres.
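To illustrate that gap between rational-choice theory and actual people, here's the textbook prisoner's dilemma (the payoff numbers below are the standard illustration, not taken from any particular source):

```python
# Textbook prisoner's dilemma: payoffs[(mine, theirs)] = (my years, their years).
C, D = "cooperate", "defect"
payoffs = {
    (C, C): (-1, -1),
    (C, D): (-3,  0),
    (D, C): ( 0, -3),
    (D, D): (-2, -2),
}

def best_response(other_choice):
    """My move that maximizes my own payoff against a fixed opponent."""
    return max((C, D), key=lambda mine: payoffs[(mine, other_choice)][0])

# Defecting is the "rational" answer no matter what the other player does...
print(best_response(C), best_response(D))  # defect defect
# ...yet real people cooperate all the time - exactly the gap between
# game theory and the human heart.
```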

The big players in the industry leave it to the Indies to think outside the box. They've mostly given up on finding new solutions to unsolved challenges. But when the big budgets are spent on producing generic clones, can you expect the Indies, with their limited resources, to compensate for that? They prosper in niches, but pushing the limits on topics like AI is well beyond their scope.

It seems like the game industry has lost its ambition and its ability to evolve the medium.