In the furnished upper floor of the Epic booth at this year’s GDC I sat down with company founder, Tim Sweeney. We were there – indoors and down a flight of stairs – to discuss the great outdoors. I wanted to know more about the challenges of producing outdoor environments using a game engine, how Epic themselves approach the challenge, and what the big areas of research will be next. But to look at something concrete we started with the Unreal Engine demo whose cinematic follows a boy as he chases his kite through an area modelled on Scottish terrain but populated with flora scanned from New Zealand…
Tim Sweeney: You’ve seen the Kite demo that we debuted at last year’s GDC? That’s a huge outdoor space. We actually downloaded one of the digital elevation maps of Scotland – data gathered by planes flying around with lidar scanners to build a precise digital representation of the terrain elevation there. So we faithfully recreated an area of 48 square kilometers of Scotland. But that was just the heights of the different locations. We had to procedurally fill in the details of what kind of vegetation belongs there.
There are two steps to it. First of all, scanning a whole lot of vegetation in so we have models and matches for the plants and rocks and everything. Then determining where to place it.
Pip: Scanning like photogrammetry?
Tim Sweeney: Yes. Photogrammetry. Basically, with a standard digital camera an artist would take a hundred photos of a rock from all the different angles. Just a regular camera to do that. Then we used special pre-packaged software which analysed all the different photographs, figured out how they fit together and generated a 3D model that’s really close to what you can ship in a game. That gets you about 95% of the way to done. Then an artist goes in and cleans up any little errors in the data and polishes it up to perfection.
It’s a great reduction in the labour required and also a great increase in realism because it would take an enormous effort for an artist to recreate all the details of a rock or understand the geometry of the physics that causes it to take whatever shape it takes, whereas scanning a real one gives you the authentic thing right away.
So we scan in hundreds of objects that way. The funny thing is that we were building the demo in the middle of winter, and in Scotland the winter isn’t very pretty, so the team actually flew over to New Zealand and worked with the Weta Digital team, who allowed our folks on site. They were working on a different demo with us at the time but they helped out and we went around New Zealand scanning New Zealand rocks and vegetation, plants, all kinds of different textures to use in it. Then to combine that with the digital elevation map we had to create a very rough simulation of the ecology in an area. You know – north-facing slopes receive less sunlight and have more water on the ground, so you get lusher patches of vegetation, while tougher growing conditions elsewhere give you a lot more grasses and things like that.
Based on the slope, the aspect and the water flow across the surface we created all kinds of different procedural variations and [filled in] 48 square kilometers of Scotland with procedural vegetation in a way that looked pretty realistic – the gullies in the terrain naturally received lush vegetation, because there would be springs and water flowing down the rocks there, whereas the ridge tops would receive much sparser vegetation and have scrawny trees here and there.
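To make the idea concrete, here is a toy sketch of the kind of ecological scoring Sweeney describes: vegetation density driven by slope, aspect (sun exposure) and water flow accumulation. The function name, weights and thresholds are all invented for illustration – this is not Epic's actual system.

```python
import math

def vegetation_density(slope_deg: float, aspect_deg: float,
                       flow_accum: float) -> float:
    """Return a 0..1 vegetation density score for one terrain cell."""
    # Steep slopes hold less soil and water.
    slope_term = max(0.0, 1.0 - slope_deg / 60.0)
    # Aspect 0 deg = north-facing (least sun in the northern hemisphere),
    # 180 deg = south-facing (most sun).
    sun_exposure = 0.5 * (1.0 - math.cos(math.radians(aspect_deg)))
    # Moisture comes from accumulated water flow plus shade.
    moisture = min(1.0, flow_accum / 100.0) * 0.5 + (1.0 - sun_exposure) * 0.5
    return slope_term * (0.4 + 0.6 * moisture)

# A shaded gully with lots of water flow ends up lusher than a
# sun-baked ridge top with almost none.
gully = vegetation_density(slope_deg=10, aspect_deg=0, flow_accum=400)
ridge = vegetation_density(slope_deg=10, aspect_deg=180, flow_accum=2)
```

Evaluating a rule like this per heightmap cell is what lets ten people dress 48 square kilometers instead of placing every bush by hand.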
So we did all that and the team procedurally populated the area with animals like deer and birds and everything according to their preferences for grazing. You only find deer where there’s grass to graze on and so on. The simulation was randomised but it produced a very compelling environment and served as the basis for the demo. Then the animators went in, had a kid flying his kite in the scene, running and chasing after it, and created a story around it. But without the procedural environment simulation and all the photogrammetric model scanning it wouldn’t have been possible for our team to create such a good demo.
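The grazing rule can be sketched in the same spirit: animals are only placed in cells whose vegetation can support them. The cell format, threshold and placement chance below are invented for illustration.

```python
import random

def place_deer(cells, grass_threshold=0.5, chance=0.3, seed=7):
    """Return the ids of cells that get a deer placed in them."""
    rng = random.Random(seed)  # seeded, so the world is repeatable
    return [c["id"] for c in cells
            if c["grass"] >= grass_threshold and rng.random() < chance]

cells = [{"id": i, "grass": g} for i, g in
         enumerate([0.9, 0.1, 0.7, 0.0, 0.6, 0.8])]
deer_cells = place_deer(cells)
```

Randomised, as Sweeney says, but constrained by the ecology – no deer on bare rock.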
Pip: Is it the photogrammetry that’s key to creating these spaces or is it more about coherence or having the right external experts to tell you about things like erosion patterns or ecosystems and things…?
Tim Sweeney: I think it’s all of those things together. The really hard part of computer graphics is that people have an enormous amount of experience of how real world scenes look, so they can easily spot the weakest part of any scene. You have to have all parts be equally strong, otherwise the weaknesses stand out. Photogrammetry was the key to having objects look realistic up close. Wherever you are in this massive terrain there are a lot of objects up close, so making them look perfect was key there.
We had to be able to scan the objects photogrammetrically but then also apply realistic lighting and shading to them – specular reflections and really shiny highlights off of wet surfaces, whereas rough dry surfaces have a completely different appearance – so the lighting and shading model in the Unreal Engine was really critical for that.
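The wet-versus-dry point can be illustrated with the simplest possible specular model: a Blinn-Phong style exponent that concentrates the highlight for smooth wet surfaces and spreads it out for rough dry ones. This is a textbook toy, not the Unreal Engine's physically based shading model.

```python
import math

def specular(n_dot_h: float, shininess: float) -> float:
    """Blinn-Phong style specular term: higher exponent = tighter highlight."""
    return max(0.0, n_dot_h) ** shininess

# Sample a point 20 degrees away from the mirror direction.
angle_off_peak = math.cos(math.radians(20))
wet = specular(angle_off_peak, shininess=200)  # tight highlight, falls off fast
dry = specular(angle_off_peak, shininess=5)    # broad highlight, still visible
```

Off the highlight's peak, the wet surface goes nearly dark while the rough one stays lit – exactly the difference in appearance Sweeney describes.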
Then simulating and displaying custom vegetation and trees required a lot of custom work. But then to make the faraway parts of the scene look right, that was where the proceduralism was really critical. If you wanted to build all that by hand you might have needed a hundred people working for a year but really it was about ten people working for two months who did it.
That was a multiplying effect on people’s effort and I feel like we really only scratched the surface there. If you took this to the next level you could perhaps build an entire planet with different environments, different ecosystems, and completely different flora and fauna in different areas.
Pip: When it’s something more stylised, less a recreation of a real or real-inspired place I was wondering which things you look to or which are keystones for an area to feel “right” and “outdoorsy” instead of an enclosed space?
Tim Sweeney: That’s a great point. There’s a lot of subtlety to it. I think the lighting and the shading – just the ambient colour that you see in the environment has a big impact on that. In an outdoor scene the way the sun moves through the atmosphere has a really profound effect on things.
You have to realise the colour of the sun is what we perceive as perfect white, but when it goes through the atmosphere the direct sunlight from the sun [can look] red/orange because all of the blue light has started bouncing around the atmosphere. Blue light is scattered by the gas molecules in the atmosphere, and the sky is blue because of that. You get this interplay between the direct lighting and the indirect lighting from the atmospheric scattering.
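A minimal Rayleigh-only sketch shows why the direct beam reddens near the horizon: scattering removes short (blue) wavelengths faster than long (red) ones, and the path through the atmosphere is much longer at low sun angles. The optical depths are rough illustrative values, not a production sky model.

```python
import math

# Approximate Rayleigh optical depths at zenith for three bands
# (roughly 650 / 550 / 450 nm); scattering scales as 1/wavelength^4.
RAYLEIGH_TAU_ZENITH = {"red": 0.05, "green": 0.12, "blue": 0.25}

def direct_sun_transmittance(zenith_angle_deg: float) -> dict:
    """Fraction of each band surviving the direct path to the viewer."""
    # Flat-atmosphere approximation: path length grows as 1/cos(zenith).
    airmass = 1.0 / math.cos(math.radians(zenith_angle_deg))
    return {band: math.exp(-tau * airmass)
            for band, tau in RAYLEIGH_TAU_ZENITH.items()}

noon = direct_sun_transmittance(0)     # sun overhead: nearly white
sunset = direct_sun_transmittance(85)  # sun near horizon: red dominates
```

The blue that leaves the direct beam doesn't disappear – it becomes the indirect sky light, which is the interplay described above.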
But then depending on the clouds and how overcast it is you get a whole lot of effects. They’re really quite subtle. Light bounces from the ground to the clouds and back and so the clouds actually take on some elements of ground colour. The effects of all of that are so subtle that you don’t really notice it when you’re outside but you do notice that it’s wrong when a videogame doesn’t do it.
Pip: I was talking to [Luis Antonio] who worked on The Witness and he was saying they had architects telling him how fortified buildings would crumble so that didn’t look weird or wrong.
Tim Sweeney: It’s amazing. If you look at the most extreme, stylised artwork like a cartoon you might read in the comics, you often see a brick house represented as a blank house with like three bricks drawn in. I remember being a little kid and looking at that like, “Why are there only three bricks there?” After a little while your brain just fills in the details – “I get it, it’s a brick house”. Your brain plays a lot of tricks on you that way and it’s cool to see games picking up on those forms of stylisation. Like, extreme minimalism. One of the outdoor games here is largely a flat shaded world but there’s a few blades of grass here and there so you’re like “Okay, I’m in a meadow!” [laughs]
Pip: In your experience creating games what have been the biggest technical challenges you’ve encountered? Is it the lighting or is there any other aspect you’ve had to really work at to get it to feel right?
Tim Sweeney: Well, you know, I had the real fortune of being in real time PC graphics since the very beginning. I started on the first Unreal Engine in 1995, shortly after Doom came out. I remember building all these tools and algorithms for drawing texture mapped surfaces and basic lighting and things like that.
I’d always build these little demonstration levels where I’d place two walls together, and the amazing thing is you hand it off to an artist and after a few weeks with the tools the artist will build something that’s orders of magnitude better. It’s like “Oh my god, I had no idea the technology was capable of that”.
I think it highlights the importance of having people with really deep technical and artistic expertise at every level of a project. The deep expertise is required because all aspects of the scene have to look right otherwise there are going to be errors.
Characters are a huge challenge there. It’s extremely difficult to simulate the way lighting behaves on a character. A white painted wall is really easy to draw – we know exactly the physics of that – but people and skin are really complex.
If you look at the difference between a mannequin and a person, the most apparent difference is that light actually penetrates many millimetres into your skin and is diffused around. A lot of what you see as the colour of skin is actually the colour of light bouncing around little microscopic blood vessels and things. That’s why you see so much subtlety to the face.
It’s even more subtle when you look at eyes and the way people’s mouths move. You add motion to it, now you need proper facial animation – that’s all extremely challenging. And the way light bounces around environments is important too.
The very first Unreal Engine just figured out which surfaces were hit directly by light and those areas were bright and the other areas were black. That looked terribly wrong because in reality light bounces around many times as it traverses a scene and eventually hits your eye. Usually by the time it gets to your eye it’s gone through several bounces off walls and everything so colours are bouncing around the scene in subtle ways and no area is completely dark.
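The multi-bounce behaviour described here can be sketched with two facing patches that repeatedly exchange reflected light – an iteration in the spirit of radiosity. It's purely illustrative, not Unreal's lighting code; the patch setup and form factor are invented.

```python
def bounce_light(direct, reflectance, form_factor, bounces):
    """Iteratively accumulate indirect light between two facing patches."""
    radiosity = list(direct)  # light leaving each patch; starts as direct light
    for _ in range(bounces):
        # Each patch receives a fraction (form_factor) of what the other
        # patch currently emits, scaled by its own reflectance.
        received = [form_factor * radiosity[1], form_factor * radiosity[0]]
        radiosity = [direct[i] + reflectance[i] * received[i]
                     for i in range(2)]
    return radiosity

# Patch 0 is lit directly; patch 1 is in shadow, but it ends up
# visibly lit anyway because bounced light reaches it.
lit, shadowed = bounce_light(direct=[1.0, 0.0],
                             reflectance=[0.7, 0.7],
                             form_factor=0.5, bounces=8)
```

With zero bounces the shadowed patch would be pure black – the first Unreal Engine's look; with a few bounces it settles to a soft ambient level, which is the "no area is completely dark" result.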
Just the calculations required to do that accurately are incredibly intense. We can only do them offline in their full detail, and then we have real-time simulations that capture some of them up close. All those details, when you add them up and do each one with a sufficient level of accuracy, create a realistic scene – but it has taken dozens of people who are absolute experts in their field to achieve that.
Pip: How do you look at the suite of tools and software you have for a game engine and work out if anything is missing or you want to upgrade or tweak things? I’m guessing they interact in so many ways…
Tim Sweeney: Our building of the engine is guided in large part by our experience building games. Unreal Tournament, Paragon, all of these projects are both creating a product for consumers and really shaping the technology, pushing us to implement better characters and environments and everything else.
It’s where the rubber meets the road, right? This technology is used by a lot of the best companies in the industry but if we weren’t creating our own games we’d have a very hard time understanding what they all really need. There could be a cacophony of voices asking for different things. How do you prioritise them unless you really know demonstrably, first-hand what’s needed there?
We also have a more bleeding edge and uncertain effort through creation of tech demos where we’re trying to figure out what’s possible as we go. When we build one of these demos we have a general idea of what we want to create but we don’t really know how it’s going to work until it’s pretty close to done.
The Kite demo was like that. It started out as one thing and ended up with another. We didn’t even plan to have a character until near the end and we had no idea it would turn into a Pixar-style emotional experience.
Pip: I’m guessing that’s why you were saying it’s important that these things aren’t just restricted to one media type, it’s why you bring in other industries and fields because they feed in and bring new things?
Tim Sweeney: That’s been the most exciting part of working with partners outside the games industry. Working with McLaren, they absolutely needed a perfectly accurate representation of carbon fibre in real time, and also of the metallic anisotropic paint that makes their cars look so distinctive. An approximation isn’t good enough.
We studied these problems in a lot of detail and it’s fairly complex what they needed. So we worked with them and hand in hand we recreated all of these materials digitally as a new material library and downstairs [on the showfloor] you see a McLaren car and a McLaren 3D model in Unreal. I took a picture of each one and emailed them to my dad. I was like “guess which is which!” [laughs]
It’s really that accurate. But it’s drilling into very specific problems with customers like that that drives engine features, and once we create that feature it’ll be used in a hundred different ways by different people. But if we hadn’t done that first step of work I don’t think we would have ever gotten there.
Same thing working with architects who are creating what needs to be a physically accurate representation of an apartment or office building before they build it. They have up to a billion dollars of physical construction contingent on that 3D model being accurate.
It really blows our mind to see some of these things being done. And it certainly makes the engine better and these benefits come back to the game industry and result in better games too.
Pip: Looking to the future is there any problem or unresolved desire to represent something more accurately than currently exists that you’re personally invested in?
Tim Sweeney: There are two major areas where I think big research is needed that we are prioritising. One is realistic humans. We showed the absolute best of our state of the art work and our partners’ state of the art work here at GDC with real time facial motion capture on a photorealistic character.
Pip: The Ninja Theory one?
Tim Sweeney: Yeah, yeah, yeah! But eyes and skin and hair and all these problems are very hard to solve. And that’s really just scratching the surface. That’s a tool that’s ready for leading edge game developers, but the next step is these VR and augmented reality headsets having inward facing cameras that can do that in real time in games. You want to be in a shared 3D virtual world with other people and communicate with them on an emotional level, and that’s going to require a lot more work to bring it to the masses.
The other area that’s of great importance is a more efficient pipeline for creating large worlds. Basically world-building. You want something that has the ease of use and scalability of Minecraft so anyone can pick it up and start building something but the graphics and visual quality and fidelity of the Unreal Engine.
That means bridging some huge gaps. Right now professional game developers with large teams can create those expansive realistic environments, but there’s a huge cost to it. We want to bring it to everybody. That’s going to require a lot of new tools and technology for modeling and for procedural creation of worlds.
Like, well if you could just sculpt out some mountains by dragging geometry in the world like a sculptor might do and then say “Populate this”. You now have a forest with trees and rivers everywhere. But like, “No, I wanted more of a desert!” Okay, great – more sparse vegetation, we’ll have a lot of rocks on the ground, rattlesnakes, trees by the streams and that’s it.
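A hypothetical "populate this" step of the kind described might map a biome preset to simple placement rules, with a special case for trees clustering along desert streams. Every name and number here is invented for illustration.

```python
# Per-biome base placement probabilities (illustrative values only).
BIOMES = {
    "forest": {"tree": 0.8, "rock": 0.1},
    "desert": {"tree": 0.05, "rock": 0.7},
}

def populate(cells, biome):
    """Return per-cell placement probabilities for the chosen biome."""
    rules = BIOMES[biome]
    out = []
    for cell in cells:
        tree_p = rules["tree"]
        # In a desert, trees cluster along the streams.
        if biome == "desert" and cell.get("near_water"):
            tree_p = min(1.0, tree_p + 0.5)
        out.append({"id": cell["id"], "tree": tree_p, "rock": rules["rock"]})
    return out

cells = [{"id": 0, "near_water": True}, {"id": 1, "near_water": False}]
desert = populate(cells, "desert")
```

Swapping "forest" for "desert" rewrites the whole landscape from one high-level instruction – the level of procedural control Sweeney is after.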
You really want that level of procedural control. We’ve learned how to build a lot of the building blocks for that but nobody in the whole industry has put it together into a big solution like that. So that’s going to be an ongoing research challenge for the next several years.
Pip: Thank you for your time!
For more posts about The Great Outdoors just check out The Great Outdoors tag page