Epic Show “Mind-Blowing” Unreal Tech Demo

By Quintin Smith on March 3rd, 2011 at 1:36 pm.

I ahve troble typign when the pretty mna loks at me with both his eys like that

Epic has revealed the next generation of its Unreal technology in a behind-closed-doors showing at GDC, VG247 report. What was being demoed was not, in fact, Unreal Engine 4, but rather “Unreal 3.975,” in the jocular words of Epic vice president Mark Rein. Words and pictures follow, but not footage. I imagine that’ll be doing the rounds next week, or thereabouts.

VG24/7’s giddy summary of the showing, and the content of the demo, can be summarised thusly:

(1) DirectX 11 features added to Unreal Engine 3

(2) Several new features, including sub-surface scattering, shadowed point light reflections and bokeh depth of field, which have already been made available to Unreal Engine licensees. Here’s the extent of VG24/7’s gallery.



But until the footage itself is made available, the most interesting thing about all of this is Mark Rein’s comments as to what this technology represents. Since it’s beyond the scope of current videogame consoles, Rein said that “the whole idea behind this is to tell the hardware manufacturers that this is what you should be doing down the road,” adding that “If the next game consoles can’t do this, well, Apple increased [the graphics processing power of] their iPad by nine times today.”

One of the “awesome” things about the engine, Rein says, is that it “scales all the way from an iPhone 3GS up to next-generation hardware… you could theoretically make a game that’d run on every single one of these devices. Mobile phones to tablets to set top boxes.”

Quite how this tech would be scalable right down to the iPhone’s processing power but wouldn’t run on next-gen consoles is a little beyond me, and Alec’s out to lunch. I’ll pick his augmented brain about it when he gets back.

The purportedly “mind-blowing” tech demo that “looked like CG” was running on three Nvidia GTX 580s, which is, of course, a set-up designed to maximise the potential of the engine. I don’t expect Epic want everybody to buy another two video cards. OR WILL THEY? No, probably not.

EDIT: I’ve had a word with Alec! Rein is talking about whether the next generation of consoles can run this tech with all the visual bells and whistles. While the engine can reduce resolution, detail, lighting, shader effects and so forth to hypothetically work on an iPhone, Rein is suggesting to videogame hardware manufacturers that this is what they “should” be aiming to be able to do.


114 Comments »

  1. mrjackspade says:

    I want to see a Blade Runner sequel in this engine.

  2. Joe Maley says:

    As good as it looks, it will still play like a console port.

    • pakoito says:

      OH SNAP!

    • Calneon says:

      ‘it’ is an engine, not a game.

    • Joe Maley says:

      The current Unreal engine does not support:
      - client-side prediction in networking
      - native anti-aliasing support in DX9
      - a way for clients to change field of view

      Etcetera. It was built for consoles, and this new engine will be no different, especially because it uses deferred shading, which doesn’t handle transparency in the algorithm. That means any anti-aliasing we get is edge-detection blur instead of proper multisampling. They usually just throw tons of post-processing on the scenes afterwards; that’s why all my console ports look like a blurry mess.

    • Calneon says:

      Still no AA support? Yeah, that’s ridiculous.

    • Diziet Sma says:

      But none of that is related to ‘play’. It will still ‘look’ like a console port, yes.

    • Joe Maley says:

      To be fair, these screens are DX11. I don’t know if it’s true MSAA, but DX10+ has support for anti-aliasing with deferred rendering.

      I don’t know why they’re developing all this new tech when I haven’t even heard of a next-gen console.

      When I said ‘played’ I guess I meant the looks and networking, which tend to break games for me.

      E.g.:
      - Borderlands (PC) doesn’t support AA or vsync, the co-op is unbearably laggy, and the only way to change FOV is through a keybinding.

      - Bulletstorm (PC) requires decrypting config files to edit the FOV, runs in a letterbox resolution, has trouble running resolutions that aren’t divisible by 8, every other game I join in multiplayer has ‘bad connection’ (despite the fact that I’m running on ~30+ Mbps college internet), and mouse smoothing cannot be turned off without going through the same ini files.

      - MoH single player (PC): the unchangeable FOV with blurred edge AA.

      - Gears of War 2 (360): probably my favorite franchise ever. I loved GoW 1 on the console when I was young and naive, and figured it was just bad aiming when my clear headshots would miss. Turns out the networking is just awful, and the sequel was even worse; nearly every game I played was unplayable, with pings that felt like 400-800ms.

      So I went and monitored GoW 1 (PC), and it turns out they have ridiculously extraneous packet sending. No client-side prediction, so every frame makes a call to update player position and such and passes it through the server, yadda yadda, lots of lag.

      Bottom line: Unreal engines are built with consoles in mind and would require lots of work on the game developer’s end to correct for PC players – which never happens.
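
      For anyone wondering what client-side prediction actually involves, here’s a minimal, purely illustrative sketch – not Unreal’s networking code, and every type and function name here is made up. The idea: the client applies its own input immediately, remembers the inputs the server hasn’t acknowledged yet, and replays them on top of each authoritative update instead of snapping back.

      ```cpp
      // Illustrative sketch of client-side prediction (hypothetical types,
      // not Unreal's networking). Client and server share one deterministic
      // Simulate() step; the client predicts locally and reconciles later.
      #include <cstdint>
      #include <cstdio>
      #include <deque>

      struct Input { uint32_t seq; float move; };   // one frame of player input
      struct State { float position; };

      // Deterministic simulation step used identically on client and server.
      State Simulate(State s, const Input& in, float dt) {
          s.position += in.move * dt;
          return s;
      }

      class PredictedClient {
      public:
          // Every client frame: predict immediately, queue the input for the server.
          void ApplyLocalInput(const Input& in, float dt) {
              pending_.push_back(in);
              predicted = Simulate(predicted, in, dt);
          }

          // When the server sends its authoritative state and acknowledges inputs
          // up to ackSeq: drop acknowledged inputs, then re-simulate the rest on
          // top of the server state instead of snapping back (no rubber-banding).
          void OnServerState(const State& authoritative, uint32_t ackSeq, float dt) {
              while (!pending_.empty() && pending_.front().seq <= ackSeq)
                  pending_.pop_front();
              predicted = authoritative;
              for (const Input& in : pending_)
                  predicted = Simulate(predicted, in, dt);
          }

          State predicted{0.0f};           // what gets rendered this frame

      private:
          std::deque<Input> pending_;      // inputs the server hasn't confirmed yet
      };

      int main() {
          PredictedClient c;
          c.ApplyLocalInput({1, 1.0f}, 0.016f);       // shown on screen instantly
          c.ApplyLocalInput({2, 1.0f}, 0.016f);
          c.OnServerState(State{0.016f}, 1, 0.016f);  // server confirmed input #1
          std::printf("predicted position: %f\n", c.predicted.position);
      }
      ```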

    • Urthman says:

      On the other hand, Batman: Arkham Asylum.

    • Pointless Puppies says:

      But none of that is related to ‘play’. It will still ‘look’ like a console port, yes.

      Are you kidding? I can sniff console ports a mile away on my PC just by playing them. Clunky UI, clunky controls (that often leave aim assist on by default. WTF?), and just a general console-centric design. It’s extremely obvious when I’m playing a console port; I don’t know why you think it’s just aesthetics.

  3. Lars Westergren says:

    >If the next game consoles can’t do this, well, Apple increased [the graphics processing power of] their iPad by nine times today

    Sounds like someone is itching to get the graphics wars started again….

    For those who are grumping about the current gen consoles holding back their PCs from unleashing their full awesome power, I guess it’s nice that a major developer like Epic are announcing that they aren’t going to be held back much longer by the hardware limitations of current consoles. Personally I prefer a focus on fun in games, and plot. But then again, I cheered for the Wii to win the console wars before they were all released, and look how that turned out!

    • Urael says:

      Yeah, but it’s also a shame they can’t, y’know, focus on PCs that can already accommodate these improvements, rather than egging on the console industry to iterate their hardware.

      But then this is EPIC we’re talking about, whose name by now must stand for their oft-repeated moan: Everyone Pirates Intellectual Copyrights!

    • BAshment says:

      How did it turn out?

    • Lars Westergren says:

      @Urael

      In a way they do focus on the PC, since it will be the only platform out that can use the engine to its fullest. But if you want them to make a PC-only engine, I think Epic would consider that an economically risky proposition. As for them mainly talking about the consoles in the announcement or marketing… meh. It is how it is.

      If developers take the time to create detailed assets that make the PC version look significantly better (as is already happening, see Dragon Age 1, Arkham Asylum or Bulletstorm) the pressure will mount on Sony and Microsoft to release a new generation of console hardware.

    • Lars Westergren says:

      @BAshment

      They did win the pure numbers game, but then I realized I didn’t want to buy one since the games (from what I’ve heard) are 95% cynical shovelware, 4.95% gimmick-ware, and the remaining 0.05% Nintendo titles which, while fun to play, are the same IP they’ve been recycling since the 80s.

    • Robbert says:

      Apple didn’t start any graphics war. NVidia did when they announced the Tegra 2 and Samsung when they released the Galaxy S with the fastest graphics chip of its time and the still second fastest graphics chip in any mobile device with the exception of Tegra 2 devices. Apple is just following the trend.

    • Lars Westergren says:

      @Robbert

      I wasn’t suggesting Apple was trying to start a graphics race, I meant Epic was doing it by trying to play on console makers’ fear of Apple’s continued growth and dominance of certain markets. (“Do you know what I heard that guy say about you? Are you going to let him get away with that?”) But it’s a silly comparison; it’s easy to have orders of magnitude greater growth of GDP or graphics performance or whatever when you are a fraction of those you are being compared to.

    • DOLBYdigital says:

      I know this is a PC gaming site so I don’t expect many to know much about the consoles. However, please do not spread the pointless lie that the Wii has no good games. Quite tired of hearing that from people who try 3 games on it and jump to that conclusion. Here is a quick list of phenomenal 3rd-party games off the top of my head; there are more, and plenty of great first-party titles as well.
      – Muramasa
      – Monster Hunter
      – Red Steel 2
      – Dead Space
      – Geometry Wars: Galaxies
      – Scarface
      – Silent Hill
      – Zack and Wiki
      – Okami (best version)
      – RE4 (best version)
      – Plenty of wiiware (don’t feel like listing)

  4. Jacques says:

    Bokeh depth of field makes me happy, but only if they give devs the option to select what shape the bokeh has based on how many aperture blades are in the camera.
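
    For the curious, blade-shaped bokeh is conceptually simple: scatter the depth-of-field samples inside an N-sided polygon instead of a disc. Here’s a minimal, purely illustrative sketch (nothing to do with Epic’s implementation; the function names are invented):

    ```cpp
    // Illustrative only: build a sample kernel whose footprint approximates an
    // N-bladed aperture by distributing points inside a regular polygon.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Sample { float x, y; };

    // Scale each disc sample by the distance from the centre to the polygon's
    // edge at that angle: few blades -> visibly polygonal bokeh, many -> round.
    std::vector<Sample> MakeBokehKernel(int blades, int rings, int samplesPerRing) {
        std::vector<Sample> kernel;
        const float kPi = 3.14159265f;
        const float sector = 2.0f * kPi / blades;
        for (int r = 1; r <= rings; ++r) {
            float radius = static_cast<float>(r) / rings;
            for (int i = 0; i < samplesPerRing; ++i) {
                float theta = 2.0f * kPi * i / samplesPerRing;
                float local = std::fmod(theta, sector) - sector * 0.5f;
                float edge = std::cos(sector * 0.5f) / std::cos(local);
                kernel.push_back({radius * edge * std::cos(theta),
                                  radius * edge * std::sin(theta)});
            }
        }
        return kernel;
    }

    int main() {
        // Five blades give pentagonal highlights; try 9 for near-circular ones.
        for (const Sample& s : MakeBokehKernel(5, 3, 16))
            std::printf("% .3f % .3f\n", s.x, s.y);
    }
    ```

    In a real depth-of-field pass those offsets would be scaled by the per-pixel circle of confusion, which is where the blade-count option the comment asks for would plug in.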

    • Jonathan says:

      I’ll dust off my old anti-lens flare soapbox for this feature, I expect. If my character is looking at the game world through a camera or similar device then sure, go ahead and show some lens flare bokeh. Otherwise leave this nonsense out; eyes do not cause lens flare bokeh!

    • stele says:

      THANK YOU Jonathan for pointing this out. One of my biggest peeves in game graphics is unnecessary flare/lens effects when my character is not looking through any lens.

      Also, yeah, the graphics look nice, but if the game ends up playing like another Gears of War or Bulletstorm or CoD then WHO CARES? Give me some innovative gameplay please.

    • Jacques says:

      I never said it had to be used for in-character cameras, but it would make cutscenes look rather delicious.

    • simonh says:

      Well, actually, eyes cause just as much “bokeh” as anything else with an aperture.

      I’ve never been bothered by the “photorealistic as in shot with a camera” approach. The display technology necessary to approach human perceptual realism isn’t available yet, but true camera realism IS possible, so I think it’s a valid and more modest goal to aim for.

      This is what you’d need to approach human perceptual realism with a screen:
      *A screen bright enough that if you look at the sun in the game you’d actually be blinded
      *Good 3D vision
      *Head tracking to adjust for parallax
      *Eye-tracking to adjust focus

    • El Stevo says:

      Without some kind of eye tracking to work out what your eyes are actually focusing on, depth of field will never work. I’m not always looking at the centre of the screen.

      Edit: What simonh said. 20 minutes ago. Oh the perils of leaving a tab unread in the background.

  5. karry says:

    “as to what this technology represents.”

    I’ll tell you what this technology represents.
    – a new generation of hardware
    – which will consequently pull the emphasis away from optimizing your damn code
    – games’ mod-ability will go down some more
    – the budget percentage for graphics will increase again
    – therefore corporations’ grip on the game market will become even stronger
    – therefore nothing good will come of it, ever.

    • Quintin Smith says:

      Your passion is admirable! But perhaps a little ill-timed, what with indie games becoming a more and more widespread and noteworthy part of the games market.

    • sassy says:

      I think you are being a bit cynical there, what I would say it represents is:

      we will still be getting bad console ports, but now they will claim to be so much better, only they will have rushed the technologies into the game, ultimately making them buggier.

    • StranaMente says:

      This. A thousand times.
      It’s true that indie games are doing much to counter this trend, but big-budget games are still going all Michael Bay on us.
      Graphics are “pwetty” but let’s focus on plot and characters a bit more!

    • CMaster says:

      This is one of the reasons that CryEngine2 is more interesting in a lot of ways. They’ve put a lot of effort into trying to make asset production easier, which strikes me as much more important than improving fidelity these days.

    • Potentaint says:

      What Quintin said. All this means is that indie developers will be able to make use of “last” generation’s engines for cheap as they are forced out of the premier developers’ toolboxes. My guess is the major game engine players will loosen their grip on game engine code to allow more customization of the engines themselves. New technology is good and PC gaming is as strong as ever right now; try not to be too much of a cynic. I think developers as well as publishers are really learning from their past mistakes that involved the gameplay crisis that got a foothold when this generation’s consoles launched. We have seen a huge resurgence of very playable games lately. I’m very excited for a new graphical era.

    • Consumatopia says:

      Yeah, I’m not sure how much it matters whether games are mod-able when you can just download Unity or UDK and go to town making your own game.

      And the kinds of hardware people get to do fancy graphics/physics for games like these could also be used to do AI or vision in other games or applications.

    • crusty1 says:

      Well, sorry to p1ss on your view, but a chunk of this Unreal tech improvement is due to Enlighten integration (google Geomerics if you’ve not heard of it).
      This middleware can do stuff on CURRENT hardware, so all this rot about ‘pushing the hardware to next gen’ is poppycock.
      BTW, DICE’s Frostbite 2 engine also has Enlighten integration.
      What’s it all mean? Better lighting effects for all (except those on netbooks :P )

    • karry says:

      “games are mod-able when you can just download Unity or UDK and go to town making your own game.”
      Do you REALLY understand the difference between making a mod for a high-profile game and making your own game? The difference both effort-wise and effect-wise? I don’t think you do.

    • golden_worm says:

      The more you tighten your grip, Corporations, the more Indies will slip through your fingers.

  6. KauhuK says:

    So instead of making good games (and good-looking games) for PC, they want a new generation of consoles. Figures.

  7. ross_angus says:

    Staring eyes tag, please.

  8. Turin Turambar says:

    Looks very good, but not next-gen. It actually looks very close to what a game made with Unreal Engine 3.5, designed for a 2011 computer, would look like. In fact, it still has the kind of dirty dark color in the lighting that plagues lots of UE games.

  9. Nemon says:

    Is he even going to finish that cigarette? Or is it just for posing?

    • Quintin Smith says:

      Haha. I just checked the screenshot to see what you were talking about, saw the cigarette and physically winced at the wastefulness.

      I need more money.

    • subedii says:

      It’s symbolic. The cigarette represents current generation graphics, which he’s flicking away well before he’s finished with it.

      Maybe he’ll pull out a cigar next, representing UE4. And it’ll have LEDs built into it or something.

    • Jonathan says:

      Realistic object burning isn’t due til UE5.

    • Kdansky says:

      You could also stop smoking. That way, you

      A: Have more money.
      B: Don’t want cigarettes, which means you effectively have twice as much more money!
      C: Don’t have to listen to people telling you that you should stop smoking.

    • sana says:

      Check the pictures more carefully and you will find that the cigarette DOES burn up between shots!

  10. ghost4 says:

    Too bad nobody will use this to make any good games.

  11. DeepSleeper says:

    Bald space marines.
    Bald space marines stretching out as far as the eye can see, an unbroken shiny-headed line.
    A solid string of viscous tears on their cheeks, unable to drip or fall.
    Their chrome armor only reflects each other.

  12. Edawan says:

    It sure looks pretty, but what we most need is improvements in animation, and particularly facial animation.
    Crysis was quite impressive in this area, but so many game characters still look like stiff puppets…

    Also, SSS was already used on some characters in Mass Effect 2.

    • subedii says:

      I agree; before pushing for new hardware and raw technical POWER, things like good art design and animation will have a far greater impact.

      Good or bad, the bigger question is whether studios can actually AFFORD a new hardware generation and attempting to push production values to those kinds of levels.

      This hardware generation’s been drawn out with visuals looking “good enough” and no pressing need to push for extra-heavy hardware or visuals. And the reason for that is simply because production values are already as high as they feasibly can be right now. 99% of games won’t really achieve any sort of improved effect from adopting higher production values than they’re using now, and pushing for higher production values is simply going to cost a heck of a lot more.

      Epic can put out shiny-looking CG tech demos illustrating a Blade Runner-esque scene, but can even the major players actually afford to create visuals of that nature for a full-length, fully interactive game right now?

      I’ve said this before, but devs pushing hardware specs for their own sake was one of the worst aspects that characterised the previous generation of PC Gaming, and they learned the hard way that this doesn’t automatically translate into the increased sales necessary to fund the production values they’re going for. They simply EXPECTED that everyone would keep up with the hardware curve, and when it ended up that they had effectively priced themselves out of the reach of most of the market that way, they cried foul.

      Meanwhile across the rest of the industry, you’re constantly hearing other major developers and publishers like THQ talking about how a new hardware generation now would be a massive mistake because they simply can’t afford to push visuals any further. Outside of advertising, art assets are still probably the largest aspect of most AAA projects, and pushing them means hiring more people to do more work (and there’s a logistics issue there to begin with when increasing team sizes of hundreds are already seeing diminishing returns) for a game that isn’t going to necessarily reach a larger audience as a result of them.

      This isn’t just about pushing the best visuals, it’s about the practicality of implementing them. If it wasn’t we’d already be well into a new console generation by now, and MS and Sony wouldn’t be trying to extend things with motion controllers in the hopes of emulating the Wii’s success.

      I realise Mark Rein’s talking about the capabilities of the “next” generation of consoles, but it would be short-sighted to conflate “we could potentially make these visuals” with actually being able to.

      Is “The Last Guardian” really something that would gain more impact from having newer hardware attributed to it with “next gen” visuals? Or is it more down to how they’re making use of it to make incredibly emotive scenes and characters?

    • ghost4 says:

      subedii, I’ve recently been playing through Thief: The Dark Project, and it’s amazing how atmospheric and immersive the game is despite graphics that were a bit outdated even in 1998. Until there’s a way to dramatically reduce the amount of work required to produce cutting-edge art assets, developers should consider scaling down.

    • subedii says:

      Yeah. I mean not that I’m a luddite or anything, I loved the visuals of Crysis as much as anyone else. But recently I’ve been playing an indie point-and-click adventure game called Gemini Rue, which has visuals that would probably be considered on par with the original Monkey Island if it weren’t for the fact that it has more than 16 colours.

      The thing is, it works: as an adventure game it draws me in far more successfully with its characters and its setting than most modern games ever manage. A lot of that’s simply down to the power of abstraction; you’re viewing these blocky characters, but you can far more readily accept something like that than you can an almost-right-looking fully 3D character model with sort-of-working facial animation. It manages to have more atmosphere than most games give me with all the HDR lighting in the world.

      Oh and tips for anyone playing it: Turn off the speech, just stick with text. It just works better that way.

    • D3xter says:

      Improved Facial Animations are already in the works e.g. see L.A. Noire: http://www.youtube.com/watch?v=xvkjcDq5zqM&hd=1

      As for “Art Design”, why not do both? More often than not, certain assets are made in much better quality than what you see in games, from high-poly models to high-res textures that are just downgraded for current hardware to run… A lot of the textures are even just pictures of things made in the real world – postcards, papers, wood or whatever – and they end up blurry and unrecognizable even in the PC version of certain games… why?
      Also, DX11 features like tessellation have the potential to increase how “good” a virtual world looks tenfold, by adding depth and plasticity compared to the good old “flat” surfaces we got used to, depending on how they’re used.
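
      To illustrate that “depth and plasticity” point, here’s a toy CPU-side sketch of what the DX11 tessellator plus a displacement map roughly achieves (illustrative only, not the actual hull/domain shader pipeline; the height function stands in for a displacement texture):

      ```cpp
      // Illustrative only: "tessellation in spirit" on the CPU. A flat quad is
      // subdivided into a grid and each new vertex is displaced by a height
      // function, turning a flat surface into real geometric detail.
      #include <cmath>
      #include <cstdio>
      #include <vector>

      struct Vertex { float x, y, z; };

      // Hypothetical height function standing in for a displacement texture lookup.
      float Height(float u, float v) {
          return 0.1f * std::sin(10.0f * u) * std::cos(10.0f * v);
      }

      // Subdivide the unit quad in the XY plane into n x n cells and displace
      // along Z. More subdivision -> a genuinely bumpy silhouette, not just shading.
      std::vector<Vertex> TessellateQuad(int n) {
          std::vector<Vertex> verts;
          for (int j = 0; j <= n; ++j) {
              for (int i = 0; i <= n; ++i) {
                  float u = static_cast<float>(i) / n;
                  float v = static_cast<float>(j) / n;
                  verts.push_back({u, v, Height(u, v)});
              }
          }
          return verts;
      }

      int main() {
          std::printf("flat quad: %zu verts, tessellated: %zu verts\n",
                      TessellateQuad(1).size(), TessellateQuad(32).size());
      }
      ```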

  13. squirrel says:

    So does this imply that this upgraded Unreal Engine may be a key to lowering production costs for 3D animation? Which means more good movies in the future?

    BTW, since this is still UE3, will this be included in the UDK?

  14. V. Profane says:

    Does it do anti-aliasing now?

    • shoptroll says:

      I think a couple other blogs were saying they announced MSAA support via DX11. Which is weird because there’s already a handful of UE3 games that do native AA despite the deferred rendering.

      However, if Nvidia is helping them out with integrating DirectX11 into the engine I’m expecting them to at least shaft the ATi users again. Prime example being anti-aliasing support in Arkham Asylum which was disabled for ATi cards.

    • Joe Maley says:

      http://en.wikipedia.org/wiki/Deferred_shading
      The D3D9 library does not natively support transparency in the deferred shading algorithm, therefore the standard MSAA algorithm does not work either.

      However, the D3D10+ libraries do support transparency, so MSAA works on them.

      Also, like it says in the wiki, the only anti-aliasing available for UE3 games I’ve seen (Bulletstorm) only samples the first pass of basic geometry, so only the edges get blurred. It will not blend with stuff like internal geometry (the modeled details on a model) and lighting.
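
      As a rough illustration of that edge-detect-and-blur style of post-process AA (a toy sketch, not Epic’s or anyone’s shipping filter), the idea is to find high-contrast pixels in the finished frame and blend them with their neighbours, leaving everything else aliased:

      ```cpp
      // Illustrative only: crude post-process AA. Pixels whose local contrast
      // exceeds a threshold get blended with their neighbours; sub-pixel detail
      // inside surfaces is left untouched, which is why it looks like edge blur.
      #include <algorithm>
      #include <cmath>
      #include <cstdio>
      #include <vector>

      struct Image {
          int w, h;
          std::vector<float> lum;                       // greyscale frame, row-major
          float at(int x, int y) const {
              x = std::clamp(x, 0, w - 1);
              y = std::clamp(y, 0, h - 1);
              return lum[y * w + x];
          }
      };

      Image PostProcessAA(const Image& in, float threshold = 0.1f) {
          Image out = in;
          for (int y = 0; y < in.h; ++y) {
              for (int x = 0; x < in.w; ++x) {
                  float c = in.at(x, y);
                  float contrast = std::max({std::fabs(c - in.at(x + 1, y)),
                                             std::fabs(c - in.at(x - 1, y)),
                                             std::fabs(c - in.at(x, y + 1)),
                                             std::fabs(c - in.at(x, y - 1))});
                  if (contrast > threshold) {
                      // Simple box blend; real filters (MLAA/FXAA) estimate the
                      // edge direction and blend along it instead.
                      out.lum[y * in.w + x] =
                          0.5f * c + 0.125f * (in.at(x + 1, y) + in.at(x - 1, y) +
                                               in.at(x, y + 1) + in.at(x, y - 1));
                  }
              }
          }
          return out;
      }

      int main() {
          Image img{8, 8, std::vector<float>(64, 0.0f)};
          for (int y = 0; y < 8; ++y)                   // hard vertical edge at x == 4
              for (int x = 4; x < 8; ++x) img.lum[y * 8 + x] = 1.0f;
          Image smoothed = PostProcessAA(img);
          std::printf("edge pixel before %.2f, after %.2f\n", img.at(4, 4), smoothed.at(4, 4));
      }
      ```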

  15. shoptroll says:

    Interesting. Sounds like they’re not pleased with the current console cycle drawing out due to rising development costs and the current global economic situation. Weren’t they originally targeting Unreal Engine 4 for the next generation of consoles with an estimated release date around 2012 or 2013?

    EDIT: Yes. Yes they did: http://arstechnica.com/gaming/news/2008/07/epic-games-unreal-engine-4-ready-in-2012.ars

  16. itsallcrap says:

    These are really very pretty graphics.

    What will it look like running on a computer worth £500?

  17. kyrieee says:

    The new CryEngine 3 demo looks better
    http://www.gametrailers.com/video/gdc-11-cryengine-3/711208

  18. Javier-de-Ass says:

    looks like a meatcube man

    MEATCUBE MAN

  19. Pijama says:

    The best part, IMHO, is the scaling – if it truly works, of course. :)

  20. Ridnarhtim says:

    Why not Unreal 3.P1C?

  21. bartleby says:

    I thought “bokeh” was some sarcastic Anglicism until I googled it.

    • Harlander says:

      As in “I can’t believe they’re trying to foist this bokeh depth-of-field off on us”?

      Huh. I rather like it.

  22. Postal76 says:

    I can’t even run UE3 all that well with my 5870 — I can only imagine the hardware required for the next iteration. I had to turn off AA and turn some settings down in Bulletstorm to get it to run well.

  23. crusty1 says:

    Unreal 3.999 (whatever) & Frostbite 2 are both coming with Geomerics middleware, which doesn’t need next gen. So don’t fret about it.

    Go see http://www.geomerics.com/enlighten/
    It’s not new, but it’s in BF3 & it’s coming to others soon too.

  24. Magrippinho says:

    @Quinns

    “Quite how this tech would be scalable right down to the iPhone’s processing power but wouldn’t run on next-gen consoles is a little beyond me”

    I think Rein means that:

    a) The engine can only run at its best, with all its features turned on, on a top-of-the-line PC.

    b) It can, however, run on a 3GS iPhone, if you turn the resolution & features way down, without making it look ungodly awful.

    c) Console makers don’t care now because they think top-of-the-line PCs cost bazillions of dollars and nobody has one.

    d) Console makers should care soon, because the iPad [2 + x] will be able to run their latest Unreal engine at its best and, unlike top-of-the-line PCs, everyone has at least one iPad.

    He’s excited about making games look good, so he’s making his case on pushing graphics hardware!

    I’m in the camp that prefers better gameplay to better graphics, but I don’t think they’re mutually exclusive. You have guys like Epic who work on neater graphics, while other developers work on neater gameplay concepts. Every developer borrows the best from one another, whether it’s an efficient shader or a “karma-meter”; they all move gaming forward and we live happily ever after.

    • Sourav Chakraborty says:

      everyone has at least one iPad

      ?!

      Are you Steve Jobs’ ultimate sweet dream?

    • mrjackspade says:

      Wet dream*

    • crusty1 says:

      It’s not hardware that is being pushed, it’s new codepaths (i.e. software). Current mid-to-top GFX cards will do fine.

    • Magrippinho says:

      Amusingly, I don’t have an iPad (or think that everyone has at least one), but I do have a restraining order against Steve because he kept pestering me about his wet dreams. Things got pretty weird.

  25. Megadeth89 says:

    RED ORCHESTRA 3

  26. stahlwerk says:

    Yay for vertical engine tech! Boo for having to design and debug control schemes and assets for 15 years’ worth of technology. Just because you could target everything and the kitchen sink doesn’t make it feasible for every licensee, provided they don’t want to produce “bad console ports” (in both directions).

    </grump>

  27. fionny says:

    Unreal engine, crashing my computers for no apparent reason since its inception.

  28. SuperNashwan says:

    Maybe this time we won’t have a generation of NPCs that appear to be horrific waxwork facsimiles magicked into life.

  29. Pointless Puppies says:

    Lovely. Now at least when I play horrible console ports on my PC they’ll have shadowed point light reflections! Every PC gamer’s dream! ¬_¬

  30. sneetch says:

    Impressive, but any chance of seeing what it’ll look like when everything’s not all wet? Whenever I think of UT3 games I think “that looks a bit slimy” (obviously not all of them, but the makers of a lot of Unreal games seem to overdo the vaseline filter).

    I’m also impressed that he not only managed to light that cigarette but also that it stayed lit in that kind of rain.

  31. omicron1 says:

    I’d like to see the next generation of consoles be truly “final”. Make them good enough to render this scene interactively in real-time and then some – use 2014 PC hardware if you have to – and let us put to rest the console war once and for all.

    Then we can worry about implementing procedural generation in EVERYTHING!

    Especially, get some common algorithms for various materials and model types – skin textures, rock textures, weapon textures, cement textures; trees, procedural animation, terrain, human facial features… there’s so much possible here, if you just stop focusing on new filters and start focusing on content creation.

    /rant

  32. DestructibleEnvironments says:

    Oh look, Sam Fisher de-aged another ten years. I wanna know what drugs THAT guy is on.

  33. vodka and cookies says:

    As a multi-platform gamer, I think this is bad news for the games industry. Epic’s business model is an arms race, which they freely admitted 2 or 3 years ago.

    Pushing up development costs is not good for the whole games industry; even now many companies are still hurting from the costs of current game development. I do hope there are no new game consoles for a few more years because:
    1) It keeps graphics-whore companies like Epic or Crytek in check (who hurt the PC, not help it).
    2) It keeps costs down, allowing more games and less risk in case of failure (which is pretty high as it is).
    3) There are still plenty of people out there who will absolutely refuse to buy a game with less than stellar graphics, which is deeply disappointing, and Epic trying to raise the bar higher will only make it more difficult for everyone else.

    • hamster says:

      The graphics war that’s been going on for a decade now is, I suspect, a result of too little industry-specific expertise from management. The same “problem” (if you can call it that) is present in Hollywood and any multimedia industry. The core idea is to hedge. Graphics, famous actors, famous artists/bands, famous brand names, aggressive advertising: all of these elements invariably attract a definite return which is mathematically predictable. It is a triumph of inductive over deductive reasoning. And it works. After all, who here has watched a movie only because Angelina Jolie is in it? Or Jet Li? (Only to end up getting burned.)

      A generally more traditional approach is to supervise and ensure the quality of the product by analyzing & evaluating it on its artistic merits. However, the highly subjective nature of multimedia experience and, more importantly, the seemingly RANDOM quality of output by essentially the same development team has made this approach a bit of a gamble – notice how the Matrix sequels suck badly even though they’re directed by the same guys, and generally movies by one director have as much of a chance of enjoying widespread success as winning Razzies. And of course, dev costs are so high that corporations become “very” accountable to investors.

      That being said, I don’t think the industry will ever reach the point where it just becomes unsustainable. I suspect that initially we’ll just end up with a few oligopoly brands up top, and then we’ll reach the point where games with decent graphics aren’t too jarringly different from games with AAA graphics. Then it really becomes a free-for-all. And in the periphery of this maelstrom, RPS’ raison d’être (indies) will continue doing what they do and we’ll still be very cool.

  34. Davie says:

    The scalability is nice. If there’s one thing Epic does well, it’s optimization. The state of my PC’s hardware is tenuous at best, and it’s always nice to be able to turn things down for when my graphics card is making excuses.

    Actually–bit of a tangent–Red Alert 3 really impressed me when it came to scalability. The “low” settings made it look like the same game from 2003, rather than a half-finished alpha version of a current-gen game, which is what a lot of developers seem to be fine with. Said developers keep the specular and anisotropic filtering on permanently and just turn the model and texture detail down all the way, so you have to play with blurry, polygonal messes that are lit really well.

    I guess it’s a minor issue that only presents itself in PC games, but I’m always impressed when the devs take the time to make the game look good on crap machines.

  35. Mikko-Pentti Einari Eronen says:

    While I’m fascinated by the development of these game engines, I still can’t grasp the fact that we are generally still hanging on to DirectX. Funnily enough, Epic states that hardware companies should push forward the boundaries, but they themselves are stuck with DirectX.

    The biggest problem to me is that we see these tech demos all the time, which are fancy, sure, but they are only demos. We rarely see anything like this ending up as a full game product, because an increasing number of developers are forced to spew out games into the market too early and too fast to have time to polish any game to really look like this.

    I raise my hat to people like Frictional Games and other teams who have the guts and make the effort to develop their own technology to run their games. Sure, they don’t push any general boundaries, and their technologies don’t look anything like this. But if you dig under the hood and compare, you’ll see that the line is getting thinner and thinner between major developers and smaller companies.

    On the other hand it’s good that we have this “competition” to keep various developers motivated to do their own stuff, and most importantly to push their own boundaries forward.

    The Unreal engine is good, very good. In fact I love it, I don’t deny that. However, I seriously have a problem with the fact that the majority still “love” DirectX without knowing that it’s not that special at all, and that we would probably be better off without it since more and more people are not using Windows anyway. Not saying Windows is bad, but the gaming industry is seriously blind to the Linux community, for example, and oh, what a market they would get from there with good games. A few smart developers have already done so.

  36. foda500 says:

    How about you go back to making some fucking PC games now, Epic?

  37. Angryinternetman says:

    I swear this engine or the Crytek engine cannot handle Cannon Fodder. That’s why the game hasn’t been remade.

  38. bill says:

    This from the company that couldn’t get Unreal 3 to work on the Wii?

  39. RegisteredUser says:

    Don’t care.
    Epic has done nothing short of ruin both games themselves and PC gaming. Slavish console obedience, crappy ports and utterly stupid design choices imposed upon their game developers (hurrr 2 guns should be enough for everyone durr).

    I hope they go away and never come back.
    Oh, and the game Unreal I?
    It sucked.

  40. AIEmpire says:

  41. Lukasz says:

    Motion Blur… God. How I hate it.
    It is an artifact of the inferior production methods of film. Why the hell are they trying to implement it in games?

    • Dominic White says:

      Because it’s how the human eye works? I get a fair amount of visible blur just waving my hand in front of my face. Overdone, it’s a horrible-looking effect, but with just the right amount applied, it makes things look far more natural.

    • Lukasz says:

      It is not how human eyes work.
      If you move your hand faster than you can actually look at it, it will be blurry.
      So will an animated hand moving fast enough.

      Any application of blur in a game looks artificial to me, because I look at something and it is blurry when it should not be at all, as I am actually looking at it.

    • Dominic White says:

      If you look at your hand and track it as it moves quickly, then yes, your hand will not appear to blur. But the rest of the world will. It’s why motion blur, depth of field and object tracking are all part of the same set of problems regarding graphical fidelity.

    • Vague-rant says:

      Fun fact: moving your hand rapidly whilst trying to look at it will result in blurring, but moving your head at a comparative rate while looking at your hand results in no blurring. The eyes just aren’t very good at proper tracking, but the vestibular system is.

    • Lukasz says:

      @Dominic
      You are referring to peripheral vision.

      The same problem exists with depth of field.

      When I look at Batman, a mook far away is blurry because of depth of field, but when I look at the mook he is still blurry, which he wouldn’t be in real life. If something moves fast it will blur by itself, just like Vague said. It does not need help from some special effect.

      Depth of field looks great in screenshots. I tried to use it in Empire. My first reaction was that indeed it looked great. It took 3 minutes to realize that it is just stupid, because when I look at something it should be clearly visible; I should not think that I have to move my mouse up by 1.4cm to make that soldier come into focus.

      Can you give me any titles where you believe both depth of field and motion blur work perfectly?

  42. kennycrown says:

    thanks for your post