Here’s How Mass Effect Andromeda Will Look In 4K

Sony just held a somewhat dry press conference about the release of their new PlayStation 4 models (a skinny one called the PS4 Slim and a fat one called the PS4 Pro, if you’re interested) but the new Mass Effect Andromeda [official site] was also among the games they showed off. The footage is marked as a ‘tech demo’ to boast about the benefits of 4K, promising “crisper visuals, high dynamic range lighting… and some of the most lifelike characters we’ve ever created.” It’s also incredibly boring.

So, uh, yep. That’s definitely some footage of a dude walking around a dark place and pressing buttons. It focuses on how the game will look, not play, on the new PS4 Pro, but it’s not hard to imagine that we’ll get the same quality, if not better, on PC with 4K monitors and all the best innards.

We don’t know much about Mass Effect Andromeda, but we do know that your decisions from the previous games won’t have as strong an influence as they have previously. The final ending, for example, won’t be tied into the new story. Meanwhile, you’ll be put into the spaceboots of a new main character, Ryder.

As for the 4K stuff, do you care? I often feel people get caught up in the push for “crisper” and “clearer” images when what we have is already fairly astounding. The drive for some people to have every game be 60 frames-per-second flummoxes me. Of course, it wouldn’t be the tech industry without some good old-fashioned planned obsolescence.


Top comments

  1. newguy2012 says:


    It is still dead to me.

    Marauder Shields did not want this for us.

    Just no.

    • Sardonic says:

      He died trying to save us from the ending, a true American hero.

    • Pliqu3011 says:

      Haha, I had almost forgotten about the whole Marauder Shields thing.
      Sometimes the internet can be a nice place.

    • shde2e says:

      That top image immediately kicked off flashbacks to TIM and his eye-destroying office/show-off room/thingiemajing.

      And frankly, how it looks is just about the LAST thing anyone is worried about with this game.


      • shde2e says:

        Whoops, comment mistake.

        Also, let’s hope they fired everyone involved in writing Cerberus or 70% of ME2+3.

  2. X_kot says:

    Whoa whoa whoa…the main character is named Ryder?

  3. Flea says:

    I can see that even when they take the Frostbite engine to the next level, they still can’t render hair or faces to be any more realistic. Just compare the face and hair of the character from this video to Geralt’s in Witcher 3.

    • Asurmen says:

      Seems fine to me.

    • Zenicetus says:

      I noticed that too. The main character had that wooden puppet look. The Asari was better.

      I thought Witcher 3 would set more of a standard for believable faces in AAA games, but recent games have ranged from fairly good face modeling in Rise of the Tomb Raider, to really bad, stiff faces in DX Mankind Divided. This is Alpha footage and we didn’t see much of it, so maybe the final game will be better.

      • DwarfJuggler says:

        I suspect it’s an issue of the uncanny valley being too easily run into. Making every character as believably real as the main characters is probably quite the task.

      • ilitarist says:

        I thought LA Noire (2011) would set a new standard for facial animation. But everyone ignores this technology and doesn’t even try to come close to it. Even famous Witcher 3 is not that good compared to LA Noire.

        • laiwm says:

          L.A. Noire was hecka labour intensive, there’s no way they could have done every scene in a game like Mass Effect using that technique. Plus it still had boatloads of uncanny valley because of how disconnected people’s facial movements were from their bodies.

      • Premium User Badge

        phuzz says:

        I’m not sure how much of a difference it makes, but I assume that the player character will be customisable again, whereas the NPCs have fixed faces.
        I imagine it’s harder to make a realistic looking face that can also be customised by the player.

  4. Vandelay says:

    I feel that this sudden push from consoles for 4K is far, far too soon. Looking at the PC tech market, high-end graphics cards are still struggling to deliver comfortable 4K gameplay, so how do consoles imagine they will manage it, even when targeting lower framerates? Having a PS4 and seeing that system not even manage 1080p for most of its games makes me think they are trying to run before they can even crawl (though I would also disagree with Brendan and say they should be aiming for 60fps instead of higher resolutions).

    As for the actual footage, it was dry, and it was hard to really judge quality with YouTube compression going on. It looked pretty good on my 1440p display, but it was also a fairly uninteresting environment. And there were some inevitable framerate issues on display even in the promotional material.

    • Michael says:

      If I had to guess, the PS4 Pro will be upscaling content to 4K, not running it natively. That might allow it to get the 60fps that it needs.

      • laiwm says:

        I wonder if the kludge will be upscaling the picture but rendering certain elements in 4k for better perception of crispness? I don’t know much about render pipelines and whatnot, but I wonder if it’s possible to render the UI and those floating point lights in 4k but have the rest of the scene be upscaled from 1080p.

        Or maybe they’ll just be upscaling from 1440p – the visual improvement will still be big enough that people won’t complain.
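The hybrid idea above, rendering the expensive 3D scene at 1080p but compositing the UI at native output resolution, can be sketched with a toy compositor. (The tiny buffer sizes and pixel-doubling upscale here are purely illustrative; nothing about the PS4 Pro's actual pipeline is implied, and real upscalers use better filtering than pixel doubling.)

```python
# Toy compositor: "render" a tiny low-res scene, pixel-double it to the
# output resolution, then stamp a native-resolution UI element on top.
# 1080p -> 2160p is exactly a 2x scale per axis, which SCALE mimics.

SCENE_W, SCENE_H = 4, 4   # stand-in for 1920x1080
SCALE = 2                 # stand-in for the 1080p -> 4K doubling

scene = [["."] * SCENE_W for _ in range(SCENE_H)]  # low-res scene buffer

# Nearest-neighbour upscale: each scene pixel becomes a SCALE x SCALE block.
frame = [[scene[y // SCALE][x // SCALE] for x in range(SCENE_W * SCALE)]
         for y in range(SCENE_H * SCALE)]

# UI drawn straight into the high-res frame: it keeps single-pixel edges
# that the upscaled scene, by construction, cannot have.
for x in range(3):
    frame[0][x] = "U"

print("\n".join("".join(row) for row in frame))
```

The upscaled scene can only change colour every two output pixels, while the "U" strip has edges at full output resolution, which is exactly the crispness difference being wondered about.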

      • Enso says:

        They did say developers would be releasing patches for their existing games to add 4K support.

    • TillEulenspiegel says:

      Also, nobody owns a 4K TV.

  5. aepervius says:

    “The drive for some people to have every game be 60 frames-per-second flummoxes me.”

    While it’s true it doesn’t make sense for every game (checkers at 120fps, anyone?), when it comes down to action games (fighters, FPSes, action games in general) it makes sense, and it makes a couple of differences: 1) the animations are less janky, and some of us can spot the difference; 2) your minimum input reaction time changes. At 30fps it will be 1/30 of a second; at 120fps it will be at most 1/120 of a second. Self-evident, sure, but it can make a huge difference to the perceived “lag” between input and screen reaction.

    Mass Effect being a shooter with a lot of animation on screen (characters ducking, running and shooting), I think both points affect it.
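The arithmetic behind point 2 is simple frame-time maths; a minimal sketch (the frame time is only a lower bound, since engines and displays add their own latency on top):

```python
# The frame time puts a hard floor on input latency: a button press
# can't show up on screen any sooner than the next rendered frame.

for fps in (30, 60, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> one frame every {frame_time_ms:.1f} ms")
# 30fps -> 33.3 ms, 60fps -> 16.7 ms, 120fps -> 8.3 ms
```

Going from 30fps to 120fps shrinks that floor from roughly 33ms to roughly 8ms, which is the difference some players feel as responsiveness.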

    • PenguinJim says:

      As one of those people who isn’t overly affected by 30FPS (I played through Dark Souls that way, rather than risking ladder-falls, and had a jolly old time), I am absolutely astounded by Brendan’s comments here.

      60FPS is noticeably better. Games do play better at 60FPS. 60FPS really should have been the minimum target for the last ten years, and today we should be discussing why so few games hit 120FPS.

      As for Brendan’s suggestion that things look good enough today, why should they bother improving… I can’t even…

      The push for improvements in the acceleration of 3D rasterization has had numerous beneficial spin-offs, including in medical technology. Simply suggesting that games look good enough today, no need to improve further – well, I can imagine there were people saying similarly short-sighted things at the SNES launch.

      And then linking to “planned obsolescence” as if it’s something relatively unknown – doesn’t just about everybody know what it is nowadays?

      Such a poorly-written article with such nonsensical ideas. But on the other hand, at least it wasn’t another No Man’s Sky article.

      • Awesomeclaw says:

        Dark Souls is a great example because going from DS1 at 30 to the newer games at 60 is a really good showpiece for how much smoother 60fps looks.

      • Jabb Stardust says:

        I think Brendan’s comment was spot on, and a lot of people are knee-jerking at their hasty interpretations.

        Note that he said “crisper & clearer graphics”, so he meant the higher resolutions (and probably 3D model complexity and fidelity), which have been used as quantitative measures of visual impressiveness – and misused by marketing departments to drive design. If 4K at 60fps becomes a priority, it can affect and restrict visual design. I think Brendan simply wants to see the emphasis veer towards the design side. But it’s tough. It’s like suggesting car enthusiasts concentrate on the emotional, subjective stuff rather than horsepower charts and numbers.

        Like Brendan said, not all games need 60fps. Not all games need 4K. Anyone saying that enjoyment comes from those things is probably looking at VERY specific and small subsets of games and gaming habits.

        PS: First time poster, on a mobile, so I hope I’m replying in the correct spot.

  6. Turkey says:

    It’s funny how all the wonder just went straight out the window once he vaulted that waist-high cover.

  7. stringerdell says:

    Once you’ve gotten used to a minimum of 60fps, busting out an old console and dealing with 30fps feels like a slideshow in comparison. There is a huge difference between the two.

    Also, considering the priciest graphics cards available today still can’t manage a stable 60fps at 4K resolution, the new PS4 will definitely be upscaled 1080p or, at most, 1440p.

  8. Couchfighter says:

    I feel the focus should be to first have all the new console games running at 60 frames per second, instead of the more ‘cinematic 30 fps’.

    After that is taken care of, sure, 4K would be quite welcome, as it would hopefully lead to games relying on higher-quality textures instead of hiding really low-resolution ones behind heavy motion blur and other processing effects.

  9. milligna says:

    Cut to Chris Roberts’ face.

  10. woodsey says:

    “The drive for some people to have every game be 60 frames-per-second flummoxes me.”

    Kinda feel like you’ve never played a game at 60fps or you’ve never played a game under 60fps.

    • ComradeSnarky says:

      Yeah, coming from a PC-focused site, this and the “crisper” and “clearer” comment regarding 4K are bewildering.

    • Creeping Death says:

      I’ve played plenty of games at 60fps, and while there is a clear difference between it and 30fps, I certainly don’t MIND having to play a game at 30.

      What’s really important is that the framerate is stable.

      Having said that…. The consoles should really be trying to provide a solid 60 at 1080p long before they even consider 4k.

  11. Anti-Skub says:

    “The drive for some people to have every game be 60 frames-per-second flummoxes me”

    Uh… why? 60FPS has nothing to do with pushing tech for nicer visuals. It’s a straight-up user interface and gameplay issue. 60FPS is about the point below which lower performance introduces noticeable input lag. Unless you’re playing a turn-based game, the 60FPS push is about responsive gameplay, not pushing graphics.

    • dashausdiefrau says:

      Except that for many RTSes the game logic is totally independent of graphics updates, and the tick rate can be kept under 30 to avoid networking issues between clients.
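That decoupling is the classic fixed-timestep pattern: the simulation ticks at a constant rate while rendering runs as fast as the hardware allows. A bare-bones sketch (the 20Hz tick rate is a made-up but plausible RTS number, and real loops also interpolate between ticks when rendering):

```python
TICK_MS = 50  # fixed logic tick: 20 updates per second, in milliseconds

def run(frame_times_ms):
    """Count logic ticks produced while rendering frames of the given
    durations; the tick count depends only on elapsed time, not on how
    many frames were drawn, so the simulation stays deterministic."""
    acc = 0
    ticks = 0
    for dt in frame_times_ms:   # one loop iteration per rendered frame
        acc += dt
        while acc >= TICK_MS:   # catch the simulation up, tick by tick
            ticks += 1
            acc -= TICK_MS
    return ticks

# Roughly one second of play at ~120fps (8ms frames) vs ~30fps (33ms frames):
print(run([8] * 125), run([33] * 30))  # -> 20 19: near-identical tick counts
```

A 4x difference in frame rate barely changes the number of logic updates, which is why the renderer can chase 60fps while the networked simulation stays at a low, stable tick.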

  12. Kaldaien says:

    That picture at the top of the page isn’t going to sell a soul on the argument that the game looks good at 4K. As someone who plays games at 4K regularly, Tales of Zestiria looks better at 4K than that does, and it’s not a looker by any stretch of the imagination.

  13. Severedmeds says:

    “The drive for some people to have every game be 60 frames-per-second flummoxes me.”

    I thought this website was for PC games?

  14. Yontevnknow says:

    The drive for some developers to have every game be 28 frames-per-second flummoxes me.

  15. Premium User Badge

    particlese says:

    Having recently gotten a 2560×1440 144Hz monitor, I’d say that, for me, the frame rate difference has made a much bigger impression than the resolution difference (coming from 1920×1080 60Hz). I haven’t noticed the resolution at all aside from some numbers in options menus. Before that, a video card upgrade brought most games up to 60Hz if they weren’t already, and that was an even bigger leap.

    I’m personally looking forward to the more colorful aspects of the SuperUHD++plusomgwtfbbq Premium “standard” — higher dynamic range, a wider gamut, better color precision, … though I’ll still be waiting for color accuracy and latency reviews, and I expect I’ll be sticking with my current monitor for a while to get through the early adopter pricing.

    • Premium User Badge

      phuzz says:

      How do you judge colour accuracy in a space game? Is that Asari’s skin the right shade of blue?

      • Premium User Badge

        particlese says:

        Heh! Yeah, that’s a valid shot at color snobbery, and I mostly agree with it, but I’ll address it anyway.

        Since games can be pretty varied regarding artistic intent (which is great), my main judgements would mostly be obvious stuff: “Does everything have a blue (or whatever) tint to it? Does it have ridiculously saturated colors all the time? Are the edges/corners of the screen oddly bright when looking at empty space or a mostly-black UI? Is my spaceship gray at the top of the screen and magenta at the bottom, within the same scene/lighting? Is there clearly unintended banding, unintended dithering of solid colors, or even quickly flickering colors? And if it’s obvious enough that I’m actually thinking about it, is all that stuff fine on this other ‘good’ monitor?” (The banding and whatnot is precision rather than accuracy or whatever, but still.)

        My laptop screen and a lot of TVs I’ve seen in stores are really, really blue, and sometimes they’re set to some super-saturated preset to try to catch eyeballs, and I don’t like it, regardless of game type. On the other hand, my old desktop monitor has that top/bottom color shift and some of the precision issues, but I don’t notice them while playing most games, most of the time.

        All that stuff’s usually just the cheaper screens and the mid-range ones with tons of features I don’t care about, anyway, but since I don’t want to buy something super-expensive just to avoid it, reviews which carefully evaluate color are helpful when browsing/shopping online. It also helps with whittling down the overwhelming number of TV options, though I’m putting that off for a while, now that I have a new monitor.

        All that said, I’d be using the monitor for a lot more than just playing a space game, which is the context I had in mind when I made my initial comment, but that’s a different can of worms. :)

  16. Paul says:

    I found the demo super boring and unimpressive as well.
    Hopefully it will be a great game, I like the original trilogy.
    This though:
    “The drive for some people to have every game be 60 frames-per-second flummoxes me.”

    What nonsense is this? If you have a 60Hz monitor or TV, a 60fps game is infinitely more enjoyable than 30, since every frame is synced, there is zero judder and almost no input lag. Games in which the camera moves and pans are just vastly more enjoyable at 60 compared to 30.

    • dashausdiefrau says:

      at 30fps every frame is synced, so your argument is invalid.

  17. Mungrul says:

    Gah, don’t care about 4K. Instead, why the hell don’t more games that take place in space implement low or no gravity situations?

    Look, in TV, I can understand that doing low or no G is expensive, but games should let me have fun with one of the fundamental joys of being in space.

    • montorsi says:

      Probably because you aren’t in space, the character is, and slowly floating everywhere is novel for about two minutes before you just want to get the fuck on with what you’re doing.

      • ButteringSundays says:

        Only if done poorly…

        As long as you had some form of propellant it’d be similar to swimming (mechanically, in-game) – so something like Subnautica, but in a space station? Count me in!

  18. Dave says:

    And that quickly, it was decided. Definitely playing femshep (femryder… hmmm) for this trilogy; that bloke’s voice was shockingly off-putting.

  19. Nibblet says:

    Really don’t get the point of this trailer.
    Are we supposed to be impressed that poverty boxes are almost where PCs were 8 years ago?

    • montorsi says:

      You don’t understand why Sony is advertising 4k resolutions on their platform? Are you this dense naturally or do you have to work at it?

  20. zarnywoop says:

    4K and no shadows and minimal particle effects.

  21. Laurentius says:

    It didn’t take them long to present us with a nice crotch/butt shot of that Asari. Oh, and people say CDPR uses titillation, but BioWare…

  22. klownk says:

    “The drive for some people to have every game be 60 frames-per-second flummoxes me.”

    Because your brain is too slow to understand.

  23. Fredward says:

    I had no idea people got so upset if you mention you don’t value 60 FPS and graphics quuuuiiiiite that much.

  24. khamul says:

    Bit late in the day to join the discussion – but maybe someone will see this and find it useful.

    I know a little bit about 4K, 10-bit colour, HDR and so on. Frankly, unless you have a *really* big TV, or are sitting *really* close to it, the difference between 1080p and 2160p (4K) is not likely to be visible… but as PC gamers, we sit close to the screen – so maybe.

    High Frame Rate is probably going to be a big thing for TV, especially sports (if you really care about quality). But as PC gamers it’s nothing new to us. Frame rate is much harder with broadcast video than it is when you’re rendering the scene dynamically, locally.

    10-bit colour is … interesting. I’m still not sure if it will make a difference. I’ve looked at the human vision aspects of it, but I think the display is a bigger factor. Frankly, most displays make so much of a mess of the colour information they already have that giving them more precision is unlikely to have a big impact. But colour precision and thermal efficiency in displays have just gone through a major evolution – so this could change.

    High Dynamic Range, however, is the *big* one. Our eyes have a vast dynamic range – we can see detail clearly on a bright day, or a candle a mile away on a dark night. That’s because the eye adapts to conditions – but at any one moment, in a sweep of the eye across a scene, we can cope with a range of about 1000 nits of brightness (‘nit’, unfortunately, is the unit of brightness). Conventional TVs have a brightness range of about 100 nits.

    What this means is that when you try to show a dark area on a TV, either the dark areas are washed-out and grey, or the whole thing is too dark to see. It’s nothing like the experience of standing in a cave. High Dynamic Range makes things *much* closer to how the dark looks in real life.

    Displays with a brightness range of 1000 to 1500 nits are hitting the market now. On one of those (or an OLED, with true black levels, in a dark room), the HDR-encoded version of the video above would look *amazing*. This really is a see-it/socks-blown-off technology. But you can’t *show* it to anyone without an HDR TV!

    Unfortunately, from a standards point of view, it’s also a chuffing nightmare. There’s no agreement on how the dynamic range information will be carried or displayed, and there’s no public brand to let people know the TV they’re looking at has a decent implementation of HDR… My money would say that the new PlayStation boxes are Dolby Vision, and this will work with Dolby Vision displays – which means good luck getting your hands on any of this, PC gamers!
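A back-of-the-envelope for why HDR and 10-bit travel together, assuming naive linear coding of brightness (real HDR signals use a perceptual curve, PQ, precisely because linear coding wastes bits near black, but the sketch shows the pressure on precision):

```python
def step_nits(peak_nits, bits):
    """Brightness step between adjacent code values, coding the range
    0..peak_nits linearly across 2**bits levels."""
    return peak_nits / (2 ** bits - 1)

print(f"SDR,  100 nits,  8-bit: {step_nits(100, 8):.3f} nits per step")
print(f"HDR, 1000 nits,  8-bit: {step_nits(1000, 8):.3f} nits per step")
print(f"HDR, 1000 nits, 10-bit: {step_nits(1000, 10):.3f} nits per step")
```

Stretching the same 8 bits over a 10x brighter range makes each step roughly ten times coarser, which shows up as banding; moving to 10 bits claws most of that precision back.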

    • Premium User Badge

      particlese says:

      Oh, nice! I was reading about some of this stuff semi-recently, and the only nit number I remembered was some Philips displays with 700 nits. Good to hear there are some out or on their way which can go brighter (assuming they can remain dark where appropriate). I read something about a “UHD Alliance Premium” standard or something like that in the context of HDR, but then I got the feeling it’s all still up in the air, so I stopped reading and decided to wait until the industry gets its act together — hopefully before the new terminology is flogged into meaninglessness. I got the impression the “premium” thingy isn’t littered with optional features like the various HDMI versions, but that might be a wrong impression. Sounds like you’re more familiar with these things than I.

      Regarding color precision: I’d think it should become much more of a selling point once HDR is in the mix since banding would otherwise be even more obvious than it is now. Grainy tricks can still be played to get around imprecision, but I imagine (since I’ve never seen a 10-bit display with a full 10-bit path) it would be nice from time to time to see really clean gradients, and I’m aware that it helps even now with content creation and certain markets like medical imaging evaluation. (I’m not quite at that level of pickiness, but general accuracy issues have been known to bug the heck out of me.)

      • khamul says:

        I think actually you’re a little more up to date than I am! I’ve seen a 1000-nit TV at a trade show, but I’m not sure how close that is to a consumer model. I think that you’re right, I was over-optimistic, and 700 is still the range for what’s plausible in a home at the moment.

        That’s still a lot better than 100 nits, though.

        From the forum events I’ve been to, it’s clear 10-bit and HDR are fundamentally tied into each other. People – proper science people, from BBC research and so on – tend to talk about ‘colour volume’. I’m not sure – you may know more than me – how much of an impact HDR will have on banding. I suspect it very much depends on the gamma curve in play, and that’s what everyone was fighting over.

        I do strongly suspect that 50% of all the banding you’ve ever seen has been down to over-aggressive compression of the video – as mapping like colours onto the same value is probably one of the cheapest compression tricks you can play – and 49% down to TVs that throw away the 8th bit (or more) early on in the DAC.

        I think you’re right that the creation desks are all running 10-bit now. Which is a headache, if you think about it: how do you make sure the Orange orange stays Orange orange when you downmix to 8-bit? Especially if it’s an HDR colour volume to flat 8-bit colour space downmix – nonlinear! Fun times ahead! People really care about this stuff.

        Haven’t been tracking the UHD Alliance stuff in the last few months (no longer in that job), so not come across “UHD Alliance Premium”… sounds like a fantastic way to piss off a lot of people with UHD TV sets. There was talk of a UHD Wave 1, UHD Wave 2 some time back at the last IBC, so it may relate to that.

        Yeah, you probably want to give it another year (at least) before you buy, I’d say.

  25. Von Uber says:

    So that’s the male Not!Shepard then. Ho hum.

  26. teamcharlie says:

    I think you guys forgot the accompanying press release for the trailer:

    “Are you ready for our doofy hero Aiden ‘Doucheface’ Ryder and his plucky dumbshit friend, Manic Pixie Dream Asari, to wander around the galaxy and touch things, but worried that there might be a story or choices to make? Fear not! We know you can’t read, so the goal on every planet is to now just get to the giant glowing green ball via third person platforming with a jetpack. Green means good! (Note: there may eventually be some combat, we haven’t decided yet. Pre-alpha, baby!)

    And don’t worry! For our first round of DLC, we’re having even MORE of the writers from the first three Mass Effect games killed to make room for our newly expanded FIVE K team (bigger numbers are better)! Pre-order now and we’ll unlock the Asari girl’s ‘Yatta!’ pose idle animation during cutscenes (which is the only time you’ll see other characters onscreen).”

  27. KenTWOu says:

    I’m not a 60fps apologist, although some genres do need those 60 frames. But you should look at it this way: if all console developers start to use 60fps as a target, then when their games are ported to PC, 60fps will be far more achievable even on mediocre PCs, although the games will be less pretty.