The Framerate Debate: 30 “Feels More Cinematic” Than 60

It is pretty.

“The human eye can’t see more than 24 frames per second,” Those Internet People say. “Tests found fighter pilots watching a 250fps video of playful kittens will grow furious if you slip in one single frame of Plumbers Don’t Wear Ties,” Others insist. “If a game ran at 500fps it would seem so real that if you died in the game you would die in real life,” I’m also told. I thought I’d heard it all in The Great Framerate Debate that rages eternally across the gameosphere. Dear, sweet, naive Alice.

Some Ubisoft chaps have declared that 30fps “feels more cinematic” than 60fps. Gosh.

To be quite clear, this is in reference to the Xbox One and PS4 versions of Assassin’s Creed Unity running at 30fps. Ubi haven’t said yet whether the PC version will suffer this fate or not but, given how many PC ports of multiplatform games do run at 30fps, it’s worth highlighting this absurd attitude. I’ll accept ports running at 30fps with a grumble, but let’s not pretend that particular framerate is in itself a sound artistic decision here.

“At Ubisoft for a long time we wanted to push 60fps. I don’t think it was a good idea because you don’t gain that much from 60 fps and it doesn’t look like the real thing,” world level design director Nicolas Guérin told TechRadar. “It’s a bit like The Hobbit movie, it looked really weird.” Pushing twice as many frames is “not really that great in terms of rendering quality of the picture and the image,” he said.

Creative director Alex Amancio added, “30 was our goal, it feels more cinematic. 60 is really good for a shooter, action-adventure not so much. It actually feels better for people when it’s at that 30fps.”

Oh, what a load of old rot! It’s fine that they want to make a game look so pretty it wouldn’t run at 60fps on consoles — “If the game looks gorgeous, who cares about the number?” asks Amancio — but let’s not pretend we’re better off with low fps. To drag up these hoary old arguments again, recorded film is not the same as rendered game. Film looks dandy at 24fps because of its natural motion blur, while a low-fps game is far jerkier. Jerkiness changes how a game feels, of course, but in a very different way.

Why do these PC ports run at 30fps anyway? A shiny gaming PC could do so much more. Often, it’s because they wouldn’t work right otherwise. Some games build systems like AI and physics around updating 30 times per second, and wig out if this is changed. Hacking Dark Souls to raise the cap to 60fps, for example, will see you rolling shorter distances and possibly falling through the world on ladders. Not every game has such problems, mind. To state the obvious, 30fps limits exist because changing them would be more work than devs and pubs want, or can afford, to commit to in a port.
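
To make that Dark Souls example concrete, here’s a toy sketch of the failure pattern (a guess at the general shape, mind, not From’s actual code): movement integrates real frame time, but the roll’s duration is hardcoded as a number of frames, so doubling the framerate halves the roll.

    ROLL_SPEED = 6.0    # metres per second while in the rolling state
    ROLL_FRAMES = 15    # state length hardcoded in frames, not seconds

    def roll_distance(fps):
        dt = 1.0 / fps                        # real time covered by one frame
        return ROLL_SPEED * dt * ROLL_FRAMES

    print(roll_distance(30))  # 3.0 m, as the designers tuned it
    print(roll_distance(60))  # 1.5 m, the roll silently shortens

Fixing that means auditing every system that counts in frames, which is exactly the sort of work a port’s budget rarely covers.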

In related news, Bethesda said yesterday that The Evil Within will be locked to 30fps and a letterboxed 2.35:1 aspect ratio because devs Tango Gameworks have “worked the last four years perfecting the game experience with these settings in mind.” They will share debug commands to change this if you fancy, though. Spooky horror, now there’s a genre where low fps might be desirable to make things jarring and unpleasant.

Games running at 30fps, while less than ideal, are at least understandable from an economic perspective. Weird comparisons to film are less comprehensible.

237 Comments

  1. LegendaryTeeth says:

    Because if there’s one thing movies are known for, it’s no problems whatsoever with fast pans and quick motion.

    • Morte66 says:

      :)

      Yup.

      Movie people learn in film school to limit the rate of somebody walking across the camera to so many degrees of arc per second, and so on. 24fps only gets away with being so slow because they deliberately restrict the content.

      • timzania says:

        Which is one reason this matters more on PC, where mouse control is common, than it does on consoles using analog sticks.

        In natural looking around (with an eyeball) you do not pan slowly, ever. The eye jerks around rapidly from one point of interest to another. With a mouse, it feels natural to look around the same way when you’re getting your bearings. It’s disorienting if those abrupt pans are choppy. With a controller you can only look around slowly and the choppiness is less of an issue.

        Assassin’s Creed games are the worst about this, because the game encourages you to go to a high spot and look around, and this is when they tend to have horrible FPS issues even on the latest hardware.

        • Gap Gen says:

          Yeah, this is one issue with FPS controls, and perhaps one reason why the Oculus Rift is an interesting proposition – no-one looks directly ahead while turning their head to look at something. Various birds bob their heads while walking because they can’t smoothly integrate motion in their brain, so they have to keep their head still to interpret what they’re seeing. I mean, it depends what you’re looking for, and if they have a way of directing the action so the framerate looks OK, then fine. Another issue is perhaps that making quick decisions is harder with lower framerates, because your decision loop gets messed up.

        • mashkeyboardgetusername says:

          The thing with the eye is that your vision cuts out when the eyeball moves, so you don’t get those weird effects when it skitters about. To show this, stand in front of a mirror and look at one eye then the other. Your eyes are moving but you won’t be able to see them move. (May require a second person to confirm your eyes are moving.)

          Sorry, just something I find interesting, not really applicable to framerates specifically.

          • kament says:

            Well, when someone’s comparing mouselook to erm eyelook, why not, it’s a relevant answer. The real problem is that eye and mouse aren’t synched, then. Which, of course, means the higher fps the better. ))

          • kament says:

            Wow, that’s a good one and somewhat on topic:

            “Vision’s mostly a lie anyway,” […] “We don’t really see anything except a few hi-res degrees where the eye focuses. Everything else is just peripheral blur, just— light and motion. Motion draws the focus. And your eyes jiggle all the time, did you know that […]? Saccades, they’re called. Blurs the image, the movement’s way too fast for the brain to integrate so your eye just—shuts down between pauses. It only grabs these isolated freeze-frames, but your brain edits out the blanks and stitches an — an illusion of continuity into your head.”

          • Ravenine says:

            Your eyes jiggle all the time because otherwise you’d go blind until you looked at something else. I’m not even making this up, to boot. If you were able to look at one single unchanging point, the “sameness” of the sensory perception would make your brain shut down your eyes, as they’re not “required”. Hence the involuntary eye movement, also called a saccade. Finally, you’re technically blind during a saccade, because your eye (as you said) shuts down actual perception and fabricates images to fill in the blanks thus created.

    • Cinek says:

      30 FPS vs 60 FPS: http://www.30vs60fps.com
      That shuts down any discussion for me.

      • Dominare says:

        Oh good, now I can head off the entire debate with a simple link whenever it crops up at a dinner party.

      • kraken says:

        This one is pretty good too:
        link to boallen.com

        • JohnnyPanzer says:

          Thank you both for the links!

          Now I know for sure: there’s no difference that I can tell. Not even between 15fps and 60fps. I’m not saying it seals the argument, just that it seals it for me. The framerate still makes some difference to me, but that’s just because the counter is a constantly updated average, which means a framerate of 15fps could still include momentary drops that are noticeable as freezes, even by me.

          But in my experience 30fps is enough to be on the safe side, and I rarely detect any glitches at that framerate.

          • Cinek says:

            Wow… impressive…. you can’t see a difference even between 15 and 60 FPS?

            My jaw dropped.

          • Martel says:

            Interesting. I went into those links thinking I wouldn’t notice a big difference, and for me it was pretty drastic. I’d for sure get a headache with the 15fps one; it was straining my eyes just watching it, with all that choppiness.

          • HadToLogin says:

            Yeah, you really can’t see a difference between 15 and 60?

            I can’t really see a difference between 30 and 60 in the boxes comparison, but I can easily see it in the rotate-around CS link.

          • PsychoWedge says:

            Up to this moment I didn’t even think it was physically possible not to notice the difference between 15 and 60 fps. Proven wrong, I guess. xD

          • JamesTheNumberless says:

            Obvious FPS troll is o / b / v / i / o / u / s.

          • remon says:

            Ooh, you should get that checked, it sounds like a serious medical condition.

          • Shuck says:

            I wonder if it’s a browser issue, as I can’t tell a difference, either. (Except that in the first link, the 30fps example seems to rotate at a faster rate.) I can certainly tell if a game drops from 30fps to 15, though. If nothing else, I start feeling sick.

          • TechnicalBen says:

            In all honesty, some people do see differently. Some don’t see 3D (eye problems or other related issues), for example.

            However, we know some people are colour-blind. While those who are will say “the TV looks no better in colour”, we know not to argue with each other over it.

            Likewise, a higher framerate is currently better, until we reach ridiculous levels (we’ve not even hit the colour limits yet, so there’s a long way to go with framerates). But not everyone will “see” the improvements.

          • JohnnyPanzer says:

            I’m not trolling, and I strongly doubt it’s a serious medical condition. If I really try, I guess I’m able to spot a tiny difference between 15fps and 60fps, but absolutely not between 30 and 60. I guess I just don’t see stuff like that, just like how some people have bad peripheral vision.

            But that doesn’t mean I’m AGAINST higher framerates, and I agree that the cinematic excuse is bullshit. It just means that now I know for sure (and have always felt) that I have no need for anything higher than 20-30, which is a good thing since it often means I can crank up the graphics even when it means a large drop in fps…

          • Flopper says:

            I think you may have a vision problem if you can’t see a difference between 15 and 60… Your eyeballs are slow to update. Maybe you need to replace the GPU in your brain. You can’t keep that Voodoo 2 forever ya know.

          • Superabound says:

            You can’t tell the difference because you’re looking at a pre-recorded video, not actually playing the game. Look up “mouse lag”. And this is exactly why direct comparisons from video games to film are so obviously stupid: video games are not pre-recorded movies, they are interactive simulations. In non-visually-stylized games, low quality visuals do not make them more “cinematic” (that is done with pacing, art design, and storytelling), they simply take away from the immersion and verisimilitude.

          • drewski says:

            My head actually hurts from the choppiness of the video loading at the start and the 15fps bouncing square. Once it got to 30fps it was fine, but I could definitely notice the difference to 60fps.

            I can play games at 15-20fps, but I definitely notice improvements up to 60fps.

          • scatterbrainless says:

            I think it’s a misconception that when comparing framerates people concentrate on how it looks or appears. The real difference when it comes to framerate is actually how responsive it feels to kinesthetic inputs, that is, the lag between your mouse or button presses and what presents on screen. Even if visually there may appear to be very little difference between two videos running at different fps, once you are put in control of the game you will certainly feel a distinct difference. This is why it is less of an issue on control pads, because the inputs are slower than keyboard and mouse, so the lag time between input and on-screen response is less noticeable.

          • thebigJ_A says:

            The difference is very, very obvious between 15 and 60 in the boxes example, and highly noticeable in both examples between 30 and 60.

            If something’s not wrong with your browser or display (the more likely culprits), then there *is* something wrong with your vision. Perhaps it’s not serious, but it’s significant.

          • thebigJ_A says:

            There’s a massive difference in all those examples for people with normal vision. If you can’t see it, and the problem isn’t with your browser or display, there *is* an issue you should get checked out.

          • Mabswer says:

            Also, if you are constantly getting a “headache” from low framerates (lol), you too have an illness, it’s called a brain tumor, might be too late to have that checked out by now…
            just thought I’d point that out.

          • socrate says:

            If you don’t see the difference, you’ve been exposed to way too much 30fps and choppy gameplay in your life, since consoles have a hard time keeping up 30fps a lot of the time in the first place. If you can’t see how fluid 60fps is, you have a mental illness.

            And btw, having a headache because of low fps is totally normal, since the image is choppy and repetitive and your brain is trying to analyse it fully (since you’re probably used to higher fps) and of course failing, thus generating stress in your eyes. So it’s not your brain, but it still feels like it’s your brain… totally normal… and if you don’t see a difference you are either getting old, or are old, or you have horrible eyesight.

        • DrMcCoy says:

          Well, even that is not clear cut. It really depends on your monitor too. How its refresh rate and the pixel response time interact with the FPS changes the subjective impression.

          • Shadow says:

            I’m not even sure they make monitors with refresh rates lower than 60 Hz anymore.

          • DrMcCoy says:

            Sure, but, for example, if you run at 75Hz, and the game outputs at 60Hz, that doesn’t cleanly fit, now does it?

          • Superabound says:

            Yeah, but 30fps doesn’t fit evenly into 75Hz either, so not only will it “not cleanly fit”, it will look like total ass while failing to fit too.
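
            To put numbers on the mismatch (a toy model of mine: the display shows the newest completed frame at each refresh):

                REFRESH_HZ, GAME_FPS = 75, 60
                # which game frame is freshest at each of 15 display refreshes?
                shown = [int(k * GAME_FPS / REFRESH_HZ) for k in range(15)]
                print(shown)  # [0, 0, 1, 2, 3, 4, 4, 5, 6, 7, 8, 8, 9, 10, 11]
                # every fifth refresh repeats a frame: a steady 15 Hz judder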

        • DanMan says:

          While we’re at it, check this out: link to svp-team.com

          Apart from that, Alice has already summed it up pretty good.

          • mattlambertson says:

            I no longer watch any movie/video content without SVP running. I’m a 60FPS addict. And when occasionally the plugin doesn’t start up, I notice it within 3 seconds. Low frame rates grate my brain to shreds :)

      • Zekiel says:

        Now I have a headache.

      • Stevostin says:

        To be completely fair this should be done on a game that allows for motion blur. Which is a thing that happens in your own sight, btw.

        That being said, the whole debate is poppycock. Photons don’t reach our eyes 30 or 60 times per second but continuously. So whatever the brain does with it, it does on an infinite-fps stream (well, close to). I agree that the 60fps movie demos in stores look awful, but that has more to do with awful contrast fx than frame rate.

        • Cinek says:

          False information. No game does or will show motion blur in a way identical to the movies, and neither movies nor games show motion blur in the way the human eye registers the movement of real objects. See the reply from FriendlyFire down, down below for a good start.

        • socrate says:

          That is just a dumb thing to say, and it’s false information (aka don’t make people as stupid as you are, please).

          Motion blur is a false thing that doesn’t appear in real life in anything close to the way games or movies show it. The eye adapts surprisingly well to fast movement; if you see like that then you should go see a doctor, because you have a BIG health problem.

          It’s the same thing with bloom and other such effects: they just don’t exist in life and are just a stupid photographic “effect”, which is usually taking a bad picture and claiming it’s artistic in some dumb way.

          It’s also known that these effects actually put stress on the eyesight, which, if you didn’t know, is really bad.

      • Sothis says:

        Yeah, massive difference, and it’s most telling at the periphery of your vision. Nice link.

      • trn says:

        Thank you, that’s a fantastic link. Clearly there are some people who cannot tell the difference, but that is like night and day to me. I get incredible motion sickness in games (especially FPS games) with low frame rates, and on watching those two examples I don’t feel any nausea at 60fps. I had to stop playing Assassin’s Creed Black Flag (locked at 30fps on my PC) because of motion sickness.

        To me this is like designers refusing to implement colour-blind support in a game, or left-handed bindings – it’s not a matter of ‘PC master race blah blah blah’, but compatibility with as wide an audience as possible.

        Which I would have thought would be true parity.

      • lowscore says:

        link to testufo.com is one of the best 30 vs 60 fps comparisons I have seen so far.

        How anyone can defend 30 fps is beyond me.

    • RedViv says:

      And since it is more cinematic to have those in, we also get lovely motion blur and distortions! Hooray!

      • stele says:

        And don’t forget those cool lens flares!

      • jrodman says:

        I find defects in the film surface to be more cinematic.

        • Adrastos42 says:

          To give a truly cinematic experience, in Assassin’s Creed: Unity the bottom fifth of the screen will be given up to silhouettes of rows of seats filled with people, who will talk loudly and crunch popcorn during your movie gaming experience. Occasionally, someone will get up to go to the loo.

          In addition, the first 15-45 minutes of the game will be one unskippable cutscene of various adverts.

    • MessiahAndrw says:

      Back in the early 2000s when I had 3D shutter glasses (the precursor to nVidia 3D Vision) on a CRT monitor, you had to have your monitor running at at least 100Hz to get rid of any noticeable ‘flicker’ of the glasses. That’s 50Hz per eye.

      I now own an Oculus Rift. 75fps on the Rift is fluid and feels like you’re actually there when you look around.

      The thing with VR is that if you keep your head perfectly still, you can watch low-frame-rate stuff. But if you move your head and the movement isn’t fluid (either the view doesn’t move at all, or there’s that cinematic ‘stutter’ as you move) you get this queasy feeling in your stomach and a headache. Everyone that has used the Oculus Rift has noticed this (my wife, my in-laws, me). 60fps produces noticeable stutter, and 30fps would be unbearable unless you kept your head perfectly still!

      I know that 24p stutter is ‘cinematic’ – especially for those that grew up comparing movies and television – but I grew up playing more 60fps+ video games than watching movies, so I prefer higher-frame-rate content.

    • h_ashman says:

      It also doesn’t help that the argument about the Hobbit films ‘looking weird’ is a bit of a strange one. I found that the outdoors shots and panning shots looked amazing and it felt a lot more realistic. It only started looking off in the scenes obviously shot on a set, or with any extensive use of props, because the props looked more like props and less like actual swords / axes whatever.

      I’m pretty certain I saw the rubber weapons bending mid fight when I watched the high-frame rate version in the cinema, something the normal frame rate smooths out with the missing frames and motion blur.

      So to my eye that’s surely a sign that filming like that is actually better? As it makes real things appear more real and makes it more obvious that fake things are fake. Of course it requires better quality props to remove the downsides, but the same could be said about shooting at >4K resolution.

      • pepperfez says:

        I’d say the purpose of cinema is almost the opposite of “make real things look real and fake things look fake.” Because there’s always going to be some artifice when you’re telling a story, I don’t see the advantage of making physical artifice impossible.
        Of course, sometimes the hyper-real is exactly what you want – I can’t wait to watch Planet Earth in 120 Hz 4k.

    • Smashbox says:

      Also, if there’s anything games are known for, it’s being movies.

      What an asinine crock of ol’ shit.

  2. lowprices says:

    If I remember correctly, the human eye perceives 24 frames per second as movement, so any more than that would be wasteful, really. Save those extra frames for other games, make sure you don’t run out.

    EDIT: Goodness. Apparently, pondering that Ubisoft might run out of frames if the frame-rate of a game is too high wasn’t enough to make it clear this post isn’t serious. In the interest of seriousness then, let me just say that I know higher is generally better, though personally I don’t care whether a framerate is 30 or 60 so long as it’s consistent.

    • Garek says:

      Not sure if sarcasm, but I want to reply anyway:
      Well, it’s not like the perception just switches from not-movement to movement at 24 fps. In my experience, it’s a really gradual transition from not-movement at abysmal framerates, to stuttering movement at just below 20 fps, and then on to actual smooth movement somewhere between 24 and 60 fps, depending on content, display device (games on monitor vs. movies on tv) and also very much your definition of smooth.
      Also, what st33dd said below, nice explanation of some of the technical issues with low framerate in games.

    • Jeroen D Stout says:

      The trouble is your eyes aren’t stationary. If you track something moving on a screen in film, a low frame-rate will mean the object is blurred every single frame and you cannot see it correctly. Years of 24fps film have made us blind to this, which is why people respond so strangely to 48fps, when movement seems ‘too fast’ because things move while remaining sharp(er).

    • CommissarConn says:

      “To drag up these hoary old arguments again, recorded film is not the same as rendered game. Film looks dandy at 24fps because of its natural motion blur, while a low-fps game is far jerkier. Jerkiness changes how a game feels, of course, but in a very different way.”

      Read article before commenting plz.

    • Horg says:

      Hey, running out of frames is no laughing matter. My gran once underestimated how many frames of movement she needed to get to the shops and got caught short. She tried to make her last few frames stretch as far as possible and ended up clipping right into a wall. Took the fire brigade 2 hours to debug her out. Consistent 60 fps movement is a luxury for those who can afford it.

      • HothMonster says:

        Blood graphics mined pixel by pixel out of the earth and people want to use them all up in action adventure games! Think of the children!

      • drewski says:

        When I run games on old machines I keep getting out of frames errors.

        It’s so annoying having to restart every 20 minutes because the frame buffer filled up. You’d think they’d learn they can empty it once the frame has been used, but no.

        • jrodman says:

          When I was a lad, it was all we could do to get enough rastertime! Now the kids these days are squandering whole frames!

    • HadToLogin says:

      Like the good old rule says: if you don’t put ;) at the end or some other SARCASM tag, someone will read it as a serious post.

  3. st33dd says:

    If you wanted something to look as smooth as possible, you’d run it at twice the speed of the device picking it up. So if you say a human runs at 24fps, you’re going to want to run at over twice that speed so that you never have synchronization issues with your eyes. And then there’s allowing for pressing buttons like you’re a fucking human-hummingbird. In order to register a second key press the machine needs to see you not press a button: that’s one frame down, one frame up. So you can only hammer buttons at a maximum of half the frame rate (I know, I spent this week programming around this issue).
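
    A toy model of that polling limit, if it helps (my assumptions: input sampled once per rendered frame, presses counted on the down edge):

        def count_presses(samples):
            # one boolean "is the key down?" sample per frame
            presses, was_down = 0, False
            for down in samples:
                if down and not was_down:  # only an up-to-down edge counts
                    presses += 1
                was_down = down
            return presses

        print(count_presses([True] * 8))         # 1 press: mashing faster than
                                                 # the frame rate is wasted
        print(count_presses([True, False] * 4))  # 4 presses in 8 frames:
                                                 # the frame_rate / 2 ceiling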

    So yeah, gloss over it with as much bollocks as you like. A faster frame rate IS better. An apology for having to run it slower to let the code breathe is fine, but saying it’s better slower is horseshit.

    • e-dog says:

      Hi Aaron! It gets even worse when you have multi-threading, physics etc. which adds another frame or two of lag. And those AAA games usually have all that.

    • Asdfreak says:

      You COULD of course put the event handler, or whatever takes the keyboard input, in a separate process that works at maximum possible speed, or twice the framerate or whatever, and fill the keypress events into a queue. That’s how I handled it. That’s how a lot of event-based frameworks basically work. You can have 0.5 frames per second and still register 5 keypresses per second if you want.

    • Aninhumer says:

      What? Even if you set aside the possibility of using an input queue (which could plausibly impact performance), there’s absolutely no reason your polling frequency has to be exactly the same as your framerate. You can easily poll more than once per frame.
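
      A minimal sketch of that decoupling (the names and rates here are made up for illustration): input is sampled on its own thread and queued, so even a dreadful frame rate drains every press.

          import queue
          import threading
          import time

          events = queue.Queue()

          def poll_input():
              # stand-in for an OS input callback firing at ~1000 Hz,
              # regardless of how slowly the game renders
              for _ in range(10):
                  events.put("keydown")
                  time.sleep(0.001)

          threading.Thread(target=poll_input, daemon=True).start()

          for frame in range(2):        # a deliberately awful ~2 fps loop
              time.sleep(0.5)
              presses = 0
              while not events.empty():
                  events.get()
                  presses += 1
              print(f"frame {frame}: saw {presses} presses")  # none lost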

  4. ghor says:

    Gosh, that word. I don’t like it.

    Seen any good ludomatic films lately?

    • Ex Lion Tamer says:

      Ha. I get why Ubisoft types use “cinematic,” but it sure is useful shorthand for: “Stay away, gamers who like gamey games!” (A category I typically fall in as well.)

    • Niko says:

      Edge of Tomorrow is about save-scumming.

  5. Guvornator says:

    “30fps “feels more cinematic” than 60fps”

    No it doesn’t, it feels more old school NTSC. 24 frames per second would be more cinematic, as that’s the speed of cinematic film.

    • Orillion says:

      Also, it’d feel a lot more cinematic if you only killed like a dozen people throughout the game, tops. After all, I’ve never seen a movie where the good guy murders hundreds of people*

      *I’ve not seen Rambo 4, no.

    • pepperfez says:

      Well, 30 is closer to 24 than 60 is…

    • Shadow says:

      “30 fps (is a lot easier to cater to without having performance issues but we’ll tell you it) feels more cinematic than 60 fps (because some of you chumps will actually believe it and we’ll laugh our asses off as we swim in your money).”

  6. iainl says:

    If anyone saw the 48fps presentation of The Hobbit, they’ll be aware that it looked a lot less cinematic than the 24fps one. So it’s a valid argument. But it did look a lot more “immersive” or “realistic”. And generally I think that’s where most people are coming from with their games.

    This is all because we’ve been really poorly raised as viewers – 24fps means Film, which in turn means massive budgets and/or artistic validity. 48fps is as close to 50fps as makes no odds, and 50fps means Video, which is Cheap Television. Sometimes it means The News, which is a great look for pseudo-documentary stuff, but in terms of drama it translates as Soap Opera Crap.

    So while on a purely technical argument more frames are better, it doesn’t always work like that – the aforementioned 48fps version of The Hobbit looks so much like the behind-the-scenes stuff on the Lord Of The Rings discs that I kept waiting for the camera to swing a bit left to show Peter Jackson ordering everyone around.

    • Ex Lion Tamer says:

      It is an interesting technical/personal question. I’d love to do a bit of testing on myself.

      As much as I recall always being annoyed by 30-locked games in the past, I’m a massive proponent of 24 in film. You may be right that it’s largely a function of conditioning – though I’d contend filmmakers also haven’t figured out how to compensate for the effects of 48. I could see someone figuring it out at some point, much as Godard apparently did recently with 3D. (More than a gimmick, but an essential part of the experience, in other words.)

    • typographie says:

      I’d personally take immersive and realistic over whatever “cinematic” means any day. The Hobbit looked unusual because it looked better than other movies.

      At the end of the day, though, consoles really have never had a wide selection of 60 FPS games and I don’t frankly care if they do anytime soon. In general PC ports have gotten better to the point where one locked to 30 FPS is enough of a big deal to make the news. As long as that stays rare, I don’t care what nonsense console developers offer up as an excuse for their hardware being underpowered and their priorities reversed.

      • joa says:

        I think the more ‘real’ 60fps look might work aesthetically for some films – but it was a strange choice for The Hobbit, a fantasy film that requires a fair amount of suspension of disbelief.

      • kalirion says:

        The problem with the “Real” in this case is that it can make it look more fake. I know, weird concept.

        But a couple of episodes from my personal experience:

        1. Saw an episode of Game of Thrones on a 60fps TV with motion smoothing, and it looked like I was watching a play, or a Discovery Channel reenactment. I found low-resolution clips from YouTube more immersive than that.

        2. Saw The Avengers playing on a 120Hz TV at a Best Buy. Captain America looked like some guy cosplaying in a cheap rubber suit.

        • MaXimillion says:

          Game of Thrones is not available in 60 fps though. If you’ve seen it in 60 fps, it’s been created through interpolation, which is terrible.

          • Aninhumer says:

            I’ve had pretty positive experiences with interpolating TVs. I think the smoothness more than makes up for the occasional visual artifacts.

    • Guvornator says:

      As far as I’m aware, 50fps is NOT video, at least in terms of normal terrestrial broadcast. There’s a confusion on this based around interlacing – an interlaced PAL signal sends 2 fields (half frames, or every other line) to give better movement without exceeding bandwidth restrictions. However, 2 fields still make only one frame. So an interlaced PAL signal requires the screen to update at 50Hz, because of the 2 interlaced FIELDS, but will still only be playing at 25 FRAMES per second*.

      Film and computers, however, largely output progressive video, which is one complete frame with none of this interlacing business. Which means that Ubisoft’s better looking Assassin’s Creed jobbie will actually feature LESS smooth motion than a crappy old analogue NTSC signal, which will be sending a half frame 60 times a second, versus Ubisoft’s 1 whole frame 30 times a second.

      *capitalisation due to hopefully making this easier to understand, not snark.

      • iainl says:

        PAL video is 50 interlaced fields per second, yes – if you shot on film for PAL broadcast you either shot 25 frames and broadcast both fields of a whole frame, or sometimes just shot 24 frames and speeded it up by 4% (depending on who you consider your primary audience).

        Which then led to weirdness like the way the DVD releases of most BBC stuff is presented at the PAL speed it was meant to be, but the Blu-ray is 24fps progressive and slightly slow so Worldwide don’t need to make separate masters for different regions.

        • Synesthesia says:

          Finally, educated replies!

          Formats are weird, huh? They are echoes from the dark electronic times, when we were bound to the limitations of a new medium. God damn you, Betacam!

          But yes. People should stop saying humans see at 24 fps. It makes me want to punch kitties.

          • pepperfez says:

            Of course you’d never be able to get the drop on a kitten to punch it, as their 60 fps feline vision would give them a head start.

            That’s how this stuff works, right?

          • jrodman says:

            How fast do YOU think our scanner beam operates?

      • blind_boy_grunt says:

        that’s interesting, what do they do with the sound when they speed up the pictures?
        edit: i mean, speeding it up would be the obvious answer, but that seems just wrong.

        • Guvornator says:

          In old-style standards conversion, pretty much that, although this could be sidestepped by just filming from a monitor using a camera of the desired standard. New-style conversion tends to be digital, so essentially everything is re-encoded. Going from film to PAL doesn’t make much difference to the sound, however, so things like Cinema Tools just change timecode metadata without requiring a re-encode. So it’ll be 4%, or 0.679 semitones higher.

          Going from PAL (25fps) to NTSC (29.97) is, however, insanely complicated. Essentially, it requires slowing down PAL to 23.98fps, taking the fields from 4 frames of the 23.98 and smearing them over 5 frames of NTSC. This process looks bum, frankly, and tends to lead to judder as fields are repeated. Wikipedia has a fuller explanation here link to en.wikipedia.org if you have some time on your hands and a REALLY tight grasp of maths…
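
          If you want to check that semitone figure yourself, it falls straight out of the speed ratio (my arithmetic, not Cinema Tools’):

              import math
              # a 4% speed-up raises pitch by 12 * log2(ratio) semitones
              print(12 * math.log2(1.04))  # ~0.679, the figure above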

          • blind_boy_grunt says:

            “Wikipedia has a fuller explanation here link to en.wikipedia.org if you have some time on your hands and a REALLY tight grasp of maths…”
            yeah… that would be a no.
            But thanks for the understandable explanation.

    • blind_boy_grunt says:

      To me 24 fps (or whatever the normal framerate is) looks better; the faster framerates seem somehow cheaper, I can’t really describe it. Maybe that is conditioning (as others already mentioned). That said, games aren’t movies, and saying 30 fps for games is somehow better reminds me of that interview for Dragon Age 2 where they tried to sell the barren landscapes as features: no, we aren’t just pushing out the next game as fast as possible, it’s a design decision, minimalism’n’stuff.

      edit: that you say 48 fps is more realistic is kind of interesting, maybe for me slower fps makes my brain accept the fake-reality more easily.

      • quintesse says:

        Yes, that’s conditioning. You’re associating 24fps with movies you’ve seen, while anything higher is often video: things made especially for TV, music videos etc. In general, things with only a fraction of the budget, using lesser talent. A lifetime of that and you’re like one of Pavlov’s dogs.

    • Gap Gen says:

      Yeah, I wonder whether to some extent people have trained themselves to see things in 24fps, so when it’s in 48fps it feels odd, even if it’s better. Would be interested to see what a future generation that sees everything in 60fps+ thinks of old 24fps films, and whether they find them unwatchable.

    • Geebs says:

      48 fps was by far the least of the Hobbit’s visual problems. It looked fake because of excessive CGI, not the frame rate.

      More fps is always better. Take the word of somebody who’s just done a couple of playthroughs of the Mac version of REVENGEANCE at 15-40 fps. Zandatsu? Not so much.

      • Baines says:

        Not just excessive CG, but poor mixing of CG with real people and objects.

        Such as making the mistake of rendering CG monsters at a different depth of field than the actors were shot at. It looked particularly bad in less action-filled scenes where characters were moving around, with CG characters moving in and out of focus at a completely different distance than the human actors.

        Or not bothering to show CG creatures leaving any mark on real scenery.

    • Muzman says:

      The cinema experience is where the argument comes from, but as people point out, games aren’t movies. I don’t really think being inflexible viewers has that much to do with it. That interpolative effect of 24fps, and sometimes even slower, has a certain something that works nicely for watching something on a projected screen particularly. Exactly why higher frame rates start to look a little chintzy is a bit mysterious.

      Some of it is a matter of how video is captured altering movement (that was certainly the case with interlaced video). But you also get this effect of things seeming sped up slightly, even when they are not. My bet for why this is is noise patterns. The movement in HFR 48, 50, 60 etc is the same, but the noise is buzzing at a really high speed compared to what we are used to. I guess even then the argument can be made that it’s the comparison with the old style that’s the problem. Which is valid, but I’m still not sure there isn’t something more fundamental going on. Noise patterns in the eye are as variable as the firing of rods and cones. I suspect seeing the pattern change so consistently from frame to frame at high speed can’t help but make things buzz.
      This isn’t to say people can’t get used to this stuff, but that there will always be a difference so long as capture methods are frame-based (we’d need something like Doug Trumbull’s notion of motion-variable image capture to get away from that).

      As to what produces that odd feeling of artifice in some higher frame rate stuff, I’m really not sure. It’s weird how suddenly it seems obvious that everything is fake, that it’s just people standing in sets and so on. I have some idea that we like some motion blur, so I’ve often wondered whether, if HFR presentations just put some back in in post, the impression might be reduced. But I really don’t know.

      Anyway, the reason this 30fps argument is bogus is that games aren’t movies. And my thing about that is it’s because games don’t have noise patterns in the image (to the same degree, and unless they are put there on purpose). I can remember watching old Quake 3 games at LANs running at 120 on CRTs, and my own running much slower. There wasn’t that notable change in the feeling of watching it. It was in how it felt to play.

      • iainl says:

        Games aren’t movies, no. But we’re all used to the idea that some (generally narrative-driven) games want to look like movies. And so they use a whole bunch of techniques that aren’t relevant to the experience your eye would perceive were you actually there.

        This isn’t just film-aping low frame rates – you don’t get massive lens flares if light doesn’t go through a lens before it reaches your eye, yet some people still think it’s the 1990s and they’re cool things to add. Frankly, I’ve got a lot of time for those people, if I’m honest, because I’m a big fan of Die Hard. I see that The Evil Within is locked to a 2.35:1 aspect ratio, even though it’s pretty unlikely your monitor is that shape, etc.

    • ffordesoon says:

      Precisely.

      The comparison to The Hobbit doesn’t work for video games because gamers have been conditioned to respond to a lower fps count the way they respond to a higher fps count in films. That is to say, a lower fps count is always considered worse than a higher one.

    • Smoky_the_Bear says:

      The only reason people didn’t like 48fps The Hobbit was because it looked different from what they were used to, having watched everything previously in 24. This does NOT apply to video games in any way, as 60fps games have been around forever; therefore comparing it to film and saying it is “more cinematic” is complete bullshit, nothing more.
      I’m just astounded that people actually buy this crap; there are lots of stupid people out there, it seems.

      “At Ubisoft for a long time we wanted to push 60fps. I don’t think it was a good idea because you don’t gain that much from 60 fps and it doesn’t look like the real thing,”

      What does this even mean? “It doesn’t look like the real thing?” What real thing? Real life? Because I’m pretty sure I don’t see real life in 24fps with film imperfections flickering everywhere. That would take a hell of a lot of drugs.

      As far as things looking “real”, I’ve watched UFC and other sports at 60fps. It is objectively more realistic, i.e. more like actually being there than standard frame rates. Trying to watch it in 24fps would be unwatchable, so their “more real” BS is nothing but crap and they need to go fuck themselves.

  7. setnom says:

    Yes, frames per second are very important, gaming-wise. You want as many fps as possible; 60 should be considered standard. These two videos explain it much better than I ever could:

    link to gamespot.com (sorry for posting links from the “competition”)

  8. yhancik says:

    I thought this “human eye” thing had been debunked several times already? Remember when Carmack was ranting about it two years ago? link to dsogaming.com

    And yeah, it looks “more cinematic”, so what? Are we making 1-frame-per-hour films so they look more pictorial?

    • Cinek says:

      Never, ever use Carmack to support your argument.

      • theSeekerr says:

        Because it’s a huge mistake to cite a genius? He’s not infallible, but he’s an incredibly smart guy.

        • Cinek says:

          To start with: it’s a huge mistake to think he is a genius.
          Though I guess everything is relative…

          • MadTinkerer says:

            Artistic genius, oh no. No, no, no, no, NO.

            However, he is a programming genius. If you look at his source code, it’s pretty inarguable, but you don’t have to read C to understand his achievements. Carmack is responsible for showing that you don’t need dedicated hardware to do fast 2D scrolling. Yes, I said 2D, keep up. His fake 3D and then true 3D engines which came out within five years of his 2D engine then completely changed the direction of the entire game industry. His influence on the industry does not make him a genius, but the fact that he accomplished first what pretty much everyone else in his field wanted to accomplish, does.

            And then we hit the graphical plateau where fast graphics finally started having diminishing returns and Minecraft came out and was the most successful game of all time instead of Rage. But that doesn’t make him any less of a technical wizard.

            tl;dr: John Carmack doesn’t know everything, and frankly he’s a bit of an egomaniac. But he still knows more than you or me when it comes to computer graphics.

          • LionsPhil says:

            and frankly he’s a bit of an egomaniac

            [citation needed]

          • MadTinkerer says:

            Citation needed? How about this for a source: Doom 3 got made instead of a new franchise, and it was entirely due to Carmack’s cult of personality. The problem with that is that he wasn’t actually in a position of authority to make the decision, and he got away with it by abusing his legitimate position as the number one graphics engine guy in the business. He threatened to deprive id of its best asset: himself. Few other graphics engine guys would even think to demand creative control over a project at their studio, but Carmack did so and then bragged about it in public.

            The problem is not him acknowledging the legitimate fact of his elite skills, the problem is him leveraging that fact to undermine the authority of his bosses and publicly embarrass them.

            Now, maybe saying he is currently an egomaniac is unfair, because at least as of Rage’s failure as a licensed engine and mod platform, John Carmack has displayed the ability to actually admit mistakes. But he was acting like an egomaniac for several generations of graphics engines.

          • socrate says:

            Pretty sure Carmack didn’t accomplish true 3D first, his game was just more popular. Also, he has been hyped so much by the dumb game media who venerate him, when he has actually had tons of failures, some of them actually major, like Rage and Doom 3, which he still apologises for.

            The facts are that he was never alone in the development of these games, and the success was, as always, given to one person only, instead of to the lots of people who were as smart as or smarter than him… but the publishers who kept hiring him made him into this thing to venerate, to sell their product… check the recent games he worked on… all horrible cookie-cutter FPS (first person shooters) that didn’t have that much success even with his “genius” backing from publishers and the other dumb fanatics that venerate him… it’s not like this thing wouldn’t have happened, 3D was already going to happen… he didn’t MAKE it happen, since tons of people were doing it in their own way, and he never changed gaming history; that’s a dumb myth that was spread to make him more important than he really is.

            It’s the same thing as Peter Molyneux, who everyone thought was the genius behind Bullfrog, when in fact it was someone else, or maybe just the entire team at the time working together… who knows; the only thing we know is that he was not, in fact, the genius behind these games.

            Sadly, most things that change the world are often not big successes, and are then taken by other people who copy and refine them and then cash in on other people’s ideas.

            The only power Carmack has is how the dumb gamers of today, who know nothing of his past games and his career, give him that kind of fanatical power he shouldn’t have in the first place… just look at his programming in Rage and Doom 3… it’s archaic and bad… he hasn’t shown improvement or diversity in any of the games he worked on, EVER!

            So claiming he is a genius is beyond stupid.

      • Windows98 says:

        Ok, I’ll run all my arguments by you first instead, cheers.

      • yhancik says:

        Carmack was merely an example. But alright: why?

        • yhancik says:

          “As an online discussion on real-time computer graphics grows longer, the probability of a mention of John Carmack approaches 1”

      • DanMan says:

        I’m afraid we can not be friends. :_(

    • suibhne says:

      “And yeah, it looks ‘more cinematic’, so what? Are we making 1-frame-per-hour films so they look more pictorial?”

      By my scoring, you just won the entire discussion. Seriously, this’ll be my new go-to rejoinder on the subject.

    • GameCat says:

      Pfff. Pictures are for kids; grown-ups don’t use them, we’re READING.

      Assassin’s Creed: Unity 2 will bring the interactive fiction genre to the AAA segment of gaming.

      • Droniac says:

        Clearly games with 1 Frame Per Action are the future!

      • drewski says:

        1 frame per however long it takes me to parse the pixels on it. Anything more than that is faster than the brain can read a page of text and therefore unnecessary.

    • Smashbox says:

      This. What does cinema have to do with games again?

  9. Oakreef says:

    It undeniably does make a difference but I still don’t find it a massive deal to go back to 30fps games.

    • Smoky_the_Bear says:

      Fair enough if they came out and said “The game is 30fps because we feel that is good, and the game is an impressive leap forward graphically”. They don’t; they come out with this pure nonsense marketing speak, which the games industry just fucking invented, about 30fps feeling “more cinematic”. If this was actually true and not just a smokescreen, wouldn’t the games industry have figured it out over the last 20 years?

      The only reason they are saying this is because the “next-gen” consoles are so underpowered that they cannot get decent frame rates out of them. Because it is “next-gen” they are forced to push graphics forward so that their new games look flashy because otherwise all this hardware would be capable of is running PS3/XBox 360 games at 60fps instead of 30.

      I couldn’t give a shit about any of this; if console peasants are happy to lap up the bullshit the industry throws at them because they don’t want to educate themselves, screw them. However, the problem arises, and we are seeing enough of it (Watch Dogs not enabling high-end graphics options, The Evil Within running at 30fps in a letterbox), when these sorts of attitudes start leaking over to PC, which seems to be happening, especially with Ubisoft, because they don’t want their console versions to look like a bag of crap in comparison, so they deliberately restrict a PC version that does not need to be restricted.

      If you truly prefer 30fps gaming because you think it feels more cinematic, you can do that with any decent PC game by going into the options and switching to 30Hz, job done. When games start removing the option for 60Hz though, the reality is, it’s either a lazy, shit port (Need for Speed: Shift, which, upon being forced into 60fps, ran at double speed because the coding was so piss-poor that the game’s timing was directly linked to the framerate), or a conscious decision from the developers to restrict the PC version of the game (The Evil Within, with its “you’re on your own”, i.e. we don’t advise or support anything over 30fps). Either way it sucks monkey nuts.

  10. Zaxwerks says:

    I’ve just been playing Assassin’s Creed IV which, due to the abysmal vsync solution Ubisoft uses, fluctuates between a locked 60fps and a locked 30fps, and I can say hands down that 30fps is NOT suitable for an Assassin’s Creed game. Panning around to try and get your bearings is just a sickening smear of blur which is totally disorientating… and the same goes for fights with multiple enemies.

    Alex Amancio stop talking BS just to cover the fact you can’t get the Xbox One version of ACU to run at 60fps because it’s a cruddy underpowered device and your game engine isn’t efficient enough.

    Just shared this article with a friend who responded back “Alex Amancio is a knob”, and whilst I don’t know him personally I tend to agree with that sentiment.

  11. Oozo says:

    Well…
    On the question of forcing The Evil Within to be letterboxed: the announcement made it sound like they would use the resulting black space… for reasons. Maybe fourth-wall-breaking reasons? Maybe it’s PR speak for “We didn’t know any better”, though. We will see.

    You see, in principle I’m a staunch defender of the aspect ratio being up to the creator, not the audience (there is a reason those wonderful Masters of Cinema Blu-rays do beg you to use the correct format, which any director worth a damn took into consideration when composing her shots).

    I’m not sure that holds for the FPS, though. I’m not saying it’s impossible to make a conscious aesthetic choice to limit it, but you’d have to show me why it actually is better for the work. Throwing in a word like “cinematic” sounds much less like an argument than an excuse.

  12. Orija says:

    Another piece of console buggery being fed to PC users just because. And what makes it harder to get the point across is the trove of console users who actually advocate 30fps being the standard for games. Yuck.

    • pepperfez says:

      Widescreen monitors are great! They give you all that extra horizontal space for free!

  13. stephen-hill says:

    Movies shot at 24fps have motion blur, not because of their frame rate, but because of the camera shutter speed. The standard shutter speed for 24fps is 1/48th of a second. This long opening of the shutter records blurred motion onto the film (or sensor).

    A shorter shutter speed, for example 1/96th of a second, will result in less motion blur and a less cinematic feel. This is why The Hobbit, which was shot at 48fps and a shutter speed of 1/96, looks less cinematic.

    So for video games, the only way to make a 30fps game look cinematic is to make the game engine add motion blur while rendering the image (essentially simulating a 1/48 shutter speed on the in-game camera).

    Your eyes are actually very good at reducing motion blur, as this Wikipedia article explains: link to en.wikipedia.org
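
    Those numbers follow from the standard 180-degree shutter rule, if you want to play with them (a quick sketch of my own):

        def exposure_seconds(fps, shutter_degrees=180):
            # the shutter is open for this fraction of each frame interval
            return (shutter_degrees / 360.0) / fps

        print(1 / exposure_seconds(24))  # 48.0 -> 1/48 s at 24fps, as above
        print(1 / exposure_seconds(48))  # 96.0 -> 1/96 s, The Hobbit's setting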

    • HidingCat says:

      This needs to be a featured post. This is the reason why 24FPS movies look “cinematic” but 24FPS games look terrible. Without the motion blur to fill in the blanks, 24FPS games just look so stuttery.

  14. shinygerbil says:

    The human eye can definitely tell the difference between 24 and 60; saying that any more than 24fps is unnecessary is just plain incorrect. I believe (don’t quote me) that the upper limit of perception is nearer 100fps, beyond which humans won’t notice a difference. That could be bullshit from the “I want to justify spending so much money on my 120Hz monitor” camp though – I’m pretty sure there was an RPS article on that debate too.

    Interestingly, early cinema projectors which ran at 24fps ran into problems, as the gaps between the frames were noticeable – between each frame the shutter has to close, thus blocking off the light, while the projector is advancing to the next frame. This led to a very flickery, dim picture. The solution was actually to close the shutter twice on each frame, so each frame is technically flashed on screen twice. Admittedly, this was more to solve problems with a lack of light than a lack of “persistence of vision”, but still. The whole idea of “24fps is enough” is old-fashioned, and the only reason that people feel 30fps to be more “””””cinematic””””” than 60fps is because films are shot at a similar rate.

    If films had traditionally been shot at 48fps and TV shows broadcast at 24, people would most probably associate 60fps with “””””cinematic””””” rather than 30.

  15. Tinotoin says:

    A very telling excerpt from the full article was this, I thought:

    “So I think collectively in the video game industry we’re dropping that [60fps] standard because it’s hard to achieve, it’s twice as hard as 30fps…”

    So it’s hard, so we won’t bloody bother. Tsk Tsk.

    • pepperfez says:

      I’d even be OK with that explanation. It’s hard, you have other things to focus on as designers, fine. But the sales effort trying to make me think it’s my choice to stick with 30, that’s the part that puts me on edge.

  16. Hyoscine says:

    This is total crap.

    Film *does* look more cinematic at lower framerates, and if someone were recording their Let’s Plays on film stock, I’d be happy to watch them at 30fps. We’re not talking about film here, though. When your motion blur, bloom, and depth of field are effects-driven, frame rate is just not a factor any more. Halving the frame rate of something rendered will in no way make the output look more like film, and the people saying as much are very much aware of this.

    I haven’t been this annoyed by an outright marketing lie since the Alan Wake sofa thing of 2010.

  17. ColonelClaw says:

    I work in film-making. Here’s why 24fps can look smooth at the cinema, and 30fps looks like shit on a game:

    Real-world cameras make films by taking a series of still images very close together, known as frames. Each frame has an amount of time that the film/sensor was exposed to light, known as the frame exposure. A typical exposure might be 1/125th of a second. During this time everything can move: the camera, the subject, the surroundings, whatever. That movement creates what we know as motion-blur.
    You knew all that already.
    What you probably didn’t know is that accurate in-camera motion-blur is extremely time-consuming to replicate digitally, i.e. in a game, or any kind of CGI rendering. What you see in modern games as motion-blur is nothing like what the real thing looks like, and consequently looks bad. So far the only way to reproduce realistic motion-blur with CGI is to use specialist rendering software; the one I use day to day is called V-Ray. It takes bloody ages to produce a single frame of animation, but when you slap it all together at 25fps (the PAL standard we produce animations at) it looks completely smooth, as the motion-blur is simulated using a physically-correct model.
    No game engine has even come close to replicating accurate motion-blur, as the hardware is nowhere near powerful enough to do it in realtime. The only way to currently get smooth motion in games is to use a far higher frame rate, i.e. 60fps or more.

    Bottom line: the Ubisoft guy is talking out of his arse, as is everyone else who claims 30fps is ‘cinematic’.

    • Niko says:

      Thanks! I, personally, didn’t know that, although in retrospect it sounds kinda obvious.

      • Rizlar says:

        That was exactly my response too! Excellent post, ta muchly.

    • FriendlyFire says:

      Yeah, motion blur in games is a gross approximation, but that’s not the whole story. There are many things that make it impossible for games to properly replicate motion blur as seen in movies, one of which in particular is, to me, the main reason 30fps is unacceptable.

      First and foremost, motion blur is essentially an integration over time. You take all the light that comes in during the given exposure time and there you go, motion blur! The problem is that games work using instantaneous times, but you need to look ahead to be able to “expose”. In movie post-processing, that’s fine; you have the entire animation sequence, so the renderer can look up the position of every single object at any given time in the animation. In a game, especially with a low-ish frame rate, you can run into situations where the motion you predict (i.e. by looking at the input right now and the animations in progress) differs from the motion that actually happens the next frame, which causes a jarring step where the motion blur is incorrect.
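
      In symbols (the standard exposure equation, nothing specific to any particular engine), that integration is roughly:

        \[ I(x) \;=\; \frac{1}{T} \int_{t_0}^{t_0+T} L(x,\,t)\,\mathrm{d}t \]

      where T is the shutter-open time and L(x, t) is the light hitting pixel x at instant t. Film sums the whole interval; a game samples L at one instant and holds it for the full frame.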

      Moreover, an awful lot of games expect instant response times. I remember a talk (though I forget which game; it may have been the Assassin’s Creed games, actually) where the devs said it was extremely tricky to keep physics and motion blur working when your character had to make very large instantaneous motions so that the game would feel responsive, say during a block move or a dodge move. The motion is completely impossible physically, so it throws off the physics simulation, and it’s of an extremely large magnitude comparatively, which would cause heavy over-blurring when motion blur is computed.

      Finally, there’s one last detail that’s not mentioned enough: frame rate variability. Every movie has a fixed, guaranteed frame rate. I have never seen a game which doesn’t fluctuate somewhat. Therein lie two issues: firstly, motion blur is frame rate dependent, so a lower or higher frame rate will skew the appearance of motion blur towards overblurring or underblurring. This goes back to the assumption that frames have a certain exposure time, thus fewer frames have a longer exposure time to compensate and vice versa. Secondly, motion blur aside, you still have a numbers problem: if your game runs at 60 fps and you dip 5 fps, you’re at 55. Few people will notice that. If your game runs at 30 fps and you dip 5 fps, you’re down to 25. Everyone will notice. You have no margin of error. That to me is the largest issue with low frame rates.

    • albertino says:

      Interesting – never thought of it quite like that. I remember first seeing Crysis running on my machine (at the time) and thinking how smooth it felt (relatively speaking), considering it was only running at 20FPS. So I’d argue that the motion blur in games does go to some length to recreate the motion blur from film. Crysis also had ‘object orientated motion blur’ so it wasn’t just frame by frame. Not saying it was perfect, just that it definitely helped with the low frame rates, to a degree that outweighed it not being there at all. I remember showing my friend, who couldn’t believe it was only running at ~20FPS.

      Also I was thinking about Alien Isolation when reading this. Now here is a game that is definitely trying to capture the feel of the original film. It runs at 30FPS on consoles (with motion blur) and is therefore probably a much better contender for justifying 30FPS (on consoles at least) – better than Ass Creed, at least.

      Edit: Also I enjoyed Crysis at this kind of framerate, as well as Dark Souls at 30FPS (one of my all-time fav games), and a host of other games on PC and consoles. Games can be thoroughly enjoyed at lower frame rates – as long as it’s consistent (as mentioned in the post above). And my rig can run other games at 60FPS no probs.

      Personally I’d take a consistent 30FPS game over another game continually fluctuating between 30 and 60FPS any day.

    • jonahcutter says:

      Extremely well put.

      And exactly why motion blur is generally the first thing I turn off in games. It always looks fake and distracting to me, and does nothing to successfully mimic standard film. It looks like exactly what it is: an effect.

      • Smoky_the_Bear says:

        Also I don’t even see why they should attempt to mimic film. I want them to mimic reality, not film. I can turn my head around in real life without everything blurring into an indistinguishable mess (until I’ve had too many beers at least).

        Which phrase out of the following two have you heard from gamers?
        “It looks really realistic”
        or
        “It looks really cinematic”
        I have heard the former dozens of times, the latter, NEVER.

        Not fucking once until this last year was the idea of video games being cinematic even a thing. It’s only come about because of marketing people inventing this idea that video games need to be cinematic in order to justify their crappy frame rates on this crappy new hardware.

    • Urthman says:

      Another way of saying this is that film is mostly analogue and games are entirely digital.

      If a game is running at 24 frames per second, it shows you 24 instants, completely leaving out everything that happened between those instants.

      A film running at 24 frames per second shows you 24 chunks of a second, showing you everything that happened during the time the shutter was open in each chunk. The only thing that is left out is what happens between chunks when the shutter is closed.

    • gamma says:

      One frame of a film camera (and video, for that matter) captures a moment – as in light variation over a certain duration of time (i.e. shutter speed). CG in games usually calculates an instant of simulated light and projects it for a duration of time (the corresponding duration of a frame at a given frame rate).

      Since an instant is an interval of time of zero length, an accurate reproduction of a frame of film* light capture would require an infinite number of CG frames. (Not practical.)
      * Referring to the silver process, not the fact that there is a sequence of images of movement.

      We are used to thinking of a photo as an instant (in opposition to motion pictures), but it is always, always a moment; very short or lengthier, it is never zero length, hence the confusion.

      This is why we still find benefit from a higher frame rate above the convention of 24fps as the limit of movement perception. Now to say that 30fps feels “more cinematic”… geez, get a grip.

    • drewski says:

      10/10 would read again

    • Smoky_the_Bear says:

      This needs printing on a 50-foot-tall piece of laminated card and super-gluing to the front of the Ubisoft headquarters. Also any other games company advocating this “more cinematic” nonsense.

  18. Clavus says:

    Can’t wait for VR to come around and say “well those visuals are nice and all, but your game needs to be running at at least 90 fps or it’ll be unplayable and make the user vomit.”

  19. Lobotomist says:

    Genius marketing. LOL

    • Krull says:

      Ubisoft style, baby.

      • Lobotomist says:

        Here are other improvements we made to our game for your convenience:

        1. Bad AI – Because if the AI is too good, players will not have that “it’s a game” feeling

        2. Low quality textures – Because beautiful textures will cause players to stare at the screen, which is bad for eyesight.

        3. Draconic DRM – Spending hours installing the game improves players’ intelligence

        • pepperfez says:

          draconic DRM (1 computer, DC variable)
          This special quality makes a game’s very presence unsettling to players. Activating this ability is a free action that is usually part of a release or DLC package. Players within range who witness the action may become frightened or shaken. The range is usually 1 computer, and the duration is usually 5d6 months. This ability affects only players with fewer Torrent Links than the game’s publisher has detected. A player can resist the effects with a successful Crack save (DC 10 + 1/2 the frightful game’s genre HD + the frightful game’s DRM modifier; the exact DC is given in the game’s descriptive text). A player that succeeds on the saving throw is immune to that same game’s frightful DRM permanently. On a failed save, the player is shaken, or panicked if it has 4 games or fewer. Draconic DRM is a mind-affecting fear effect.

        • Baines says:

          In the past, there have been developers that have effectively praised bad AI as a desirable feature.

  20. derbefrier says:

    60 fps is better than 30, though 30 fps is perfectly playable and anyone who says otherwise is a choad.

    • nrvsNRG says:

      perfectly playable to choads like you.

    • MobileAssaultDuck says:

      An MP3 with 64 kbps quality is listenable.

      A movie at 360p is watchable.

      Just because a game is playable at 30 doesn’t mean it’s an enjoyable experience.

    • FriendlyFire says:

      You’re really not making yourself look smart there mate.

  21. Darth Gangrel says:

    30 fps is more cinematic? I’d argue that [buzzword] is more [buzzword] than [buzzword] and anyone saying otherwise can just buzz off.

    • Niko says:

      You only say that because you don’t care about journalistic [buzzword], you [buzzword]!

      • shinygerbil says:

        #[buzzword]gate

        • jonahcutter says:

          Feh… You’ve all [buzzword]ized the [buzzword] of the [buzzword][buzzword][buzzword]. Stop [buzzword]-[buzzword]ing!

  22. Niko says:

    Well, my Very Important Opinion is that I don’t care about FPS rate if it doesn’t jump too much. Love Dark Souls 1 w/o FPS unlock, love Dark Souls 2 as well.

  23. PsychoWedge says:

    Uh, I played AC3 on my PS3 at 30fps long after I played it on my PC at 60fps, and the PS3… experience… was not enjoyable. 30fps is just jerky and skippy and… not smooth is probably the best word I can use. You not only see it but, far more than that, you feel that it is not smooth. You feel it in every step, every jump, every fight. You feel it when you stand atop a building and watch over the land. It’s really not about being cinematic but about moving about in the world.

    But of course we all get the marketing bullshit. How could you EVER admit to customers that their new 500-dollar consoles are not good enough to run your game at 60fps, or that your engine is too shitty to run the game at 60fps on a new 500-dollar console? Better to pull some nonsensical gobbledygook about feeling cinematic out of your shit-encrusted asses and tell everybody how awesome it is to have a game run at 30fps and how bad it would be to have it run at 60fps.

    Sometimes I wonder how these people can sleep at night. And sometimes I wonder why I lack the ability to spew bullshit into people’s faces and how much easier my life would be if I could. And sometimes I really wish we could have a Bullshit-Man. link to youtu.be

    • Smoky_the_Bear says:

      30 is definitely playable…….UNTIL you’ve played 60.
      Some things cannot be unseen, once you’ve experienced superior frame rates it’s very difficult to go back.

      Think of it another way. Most people at some point or another have had “old PC syndrome”, where you have to run games on lower graphics and lower framerates for a while. When you upgrade, the first thing you do is go back to some of those games to crank the graphics up and play them at 60fps. It makes you go “Wow! This is so much better”. I’ve done this with so many games, several times in the past, and never once did I think “hmmm, I think this felt more cinematic on my old PC”. What an absolute load of bollocks.

  24. yhancik says:

    Also

    Some games build systems like AI and physics around updating 30 times per second,

    What? That sounds like pretty terrible practice D: Who does that?

    • suibhne says:

      I’m not sure about the claim of AI/physics updates, but take a look at tickrate in online games. Any competitive FPS gamer has been dealing with this for 10+ years – the potential mismatch between client-side framerate and the client’s updates to the server. Some game engines have proved more responsive when these align or relate to one another in specific ratios.

      What this really tells me is that the whole argument re. “cinematic 30fps” is even stupider than it seems on the surface. We already face a host of arbitrary technical limitations in our medium – why the heck would we choose to emulate the arbitrary technical constraints of a totally different medium, and thereby further limit our ability to deal with the actual technical constraints baked into the platforms in front of us?

      This whole conversation really boils down to gamemakers saying, “We can’t deliver the visual quality we want without sacrificing framerate”. Which would be a fine point, if they made it honestly. But that brings me to another ridiculous issue with their intellectually threadbare argument: their claim that lower framerates are defensible because they’re “more cinematic” is equally applicable to lower visual fidelity. In fact, I’d argue that it’s actually more compelling in that case, because players are much less likely to experience a stationary camera (and therefore much less likely to evaluate the visual fidelity of a scene) in a videogame than in a film.

    • iainl says:

      In essence? Lots of people who are designing for fixed hardware, i.e. consoles. Running your AI/physics clock independently of your rendering takes additional resources, so why not spend those cycles on something else?

      Mind you, not just them – to violate the Carmack Law mentioned above, Quake 3’s physics were tied to framerate, so everyone tried to hit the 125fps cap where you got the best rocket jumps for your bang. So it’s easily done.

    • FriendlyFire says:

      Everyone?

      Physics and AI *never* need to be updated more than 30 times per second. A common value is 20 times per second, which is an integer divisor of 60 frames per second. You won’t notice if the AI is a few milliseconds slower to react or if the physics engine takes a few milliseconds to update, but you definitely will appreciate the substantial performance gain of not running those expensive computations every single frame.

      Moreover, physics in particular is generally handled with a fixed time step, so you need to guarantee that it will be run a fixed number of times per second. If you can’t, you might get unexpected behavior. What console devs generally do is lock the framerate to a desired value, like 30 fps, and run the physics and AI at an integer divisor of that, such as 15 or 30 times per second. What PC devs usually do instead is a more complex loop, as illustrated in the excellent book Game Programming Patterns: link to gameprogrammingpatterns.com
      The game processes input, then it updates the game (i.e. runs the physics and AI and such) as many times as required to “catch up” any lag incurred by the last rendering step, then renders and repeats.

      Obviously, if the game was developed for consoles with a fixed framerate, porting it to the other form of looping is extremely difficult and is thus rarely done. Since doing it the right way is more complex, devs avoid it. Hence, we get this.
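
      For the curious, here’s a minimal compilable sketch of that catch-up loop, after the pattern in Game Programming Patterns – processInput, update and render are hypothetical stubs, not any real engine’s API:

        #include <chrono>

        // Fixed 30Hz simulation tick, no matter how fast rendering runs.
        const double MS_PER_UPDATE = 1000.0 / 30.0;

        void processInput() { /* read player input */ }
        void update()       { /* advance physics/AI by exactly one fixed tick */ }
        void render(double alpha) {
            // Draw. alpha in [0,1) is how far real time has crept into the
            // next tick, so the renderer can interpolate positions
            // (pos + velocity * alpha * tick length) between physics updates.
            (void)alpha;
        }

        void gameLoop(bool& running) {
            using clock = std::chrono::steady_clock;
            auto previous = clock::now();
            double lag = 0.0;  // real time not yet simulated, in ms
            while (running) {
                auto current = clock::now();
                lag += std::chrono::duration<double, std::milli>(current - previous).count();
                previous = current;

                processInput();
                // Catch up: run as many fixed ticks as real time demands,
                // however slow (or fast) the last render was.
                while (lag >= MS_PER_UPDATE) {
                    update();
                    lag -= MS_PER_UPDATE;
                }
                render(lag / MS_PER_UPDATE);  // leftover lag becomes the blend factor
            }
        }

        int main() {
            bool running = false;  // a real game would flip this on and off
            gameLoop(running);
        }

      Rendering never blocks the simulation rate: whether the machine manages 30 or 300 frames a second, update() still fires exactly 30 times a second.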

      • yhancik says:

        My bad, I misread the original line as updating physics/AI at the same rate as the framerate (which is potentially quite variable). But yes, you are correct.

        • DanMan says:

          Oh, you did read that right. There are plenty of games (mostly built for consoles and in Japan) which run almost everything synchronously, beat ’em ups being a prime example. A lot of them slow down _everything_ if your PC can’t achieve the target framerate (say, below 20fps), resulting in the whole game running in slow motion instead of becoming choppy.

          It’s ridiculous.

      • gamma says:

        Physics and AI *never* need to be updated more than 30 times per second.

        It actually does need to be updated at a higher frame rate if we are to exactly address the issue at hand. The fact that there is not enough power to put into it is another issue, and does not make it dismissible.

        • blind_boy_grunt says:

          Wouldn’t it be: physics must be updated at least as often as the fps, and AI as often as necessary (which often is not that often… maybe, I know nothing)?
          If you have physics objects but don’t update them, the drawn picture would be the same. You could just take position + (velocity × time since last physics update) for drawing, but that would mean that on collision you’d have some objects jumping around.

          • gamma says:

            Well, we could restrict the more frequent updates to only those objects that are visible, or under calculations which will result in visibility differences from one frame to the next (assuming we settle for 30fps).

            But I meant only in regards to the “feels more cinematic” expression, which is quite misleading as a defence of a 30fps rate in a game per se. It turns out the expression is being used only in ignorance of what underlies the issue, at worst serving as an excuse for a lack of processing power.

            In practice we can get away with a less-than-accurate simulation of motion blur, which by itself adds a processing cost. As with all CG, many shortcuts are taken.

  25. Tei says:

    My pet theory is that they are using an engine optimized for the limitations of the PS3 and Xbox 360. These machines have very different hardware from the next-gen consoles, so the engine is unoptimized.

    They may have to optimize it for the new architecture, or build a new one. But both these things take a lot of time, and a lot of money.

    I think they are now lying through their teeth, and it’s a sad state of things. I think RPS should not be reporting this, because it is embarrassing and ridiculous for Ubisoft, and is best left ignored. I think a better reaction is to laugh at them and then look away. The problem will solve itself in 2 or 3 years, when they manage to optimize the engines enough that they can give either decent framerate or decent resolution, maybe both in some cases.

    • iainl says:

      Not quite – Unity isn’t coming to the older machines; it’s PS4, Bone and Windows. But no matter how powerful you make a console, people are going to have only a set quantity of GPU power. They can either draw 60 frames per second of a certain complexity, or 30 frames with twice as much power available per frame. Games that aren’t desperately frame-dependent will usually choose the latter, so it looks really pretty on YouTube and in screenshots.

      And that holds no matter how much power you give developers – double it so they could do 60fps, and instead they’ll just make it even prettier still. It’s only on PC, where nobody (except, seemingly, The Evil Within) wants to put a GTX 980 as the minimum spec, that you just get more frames by throwing more hardware at the problem.

      • Tei says:

        I was not talking about the game, but the engine. Their engine is probably not up to the task… yet.

  26. Arithon says:

    30fps may be the best ever for film, but UBISOFT, you don’t make FILMS, you make GAMES!

    Even if you pad them out with hours of cut-scenes.

    Stop excusing bad technical decisions (like coding only for consoles) and pretending the half-baked result was intentional. 60fps is twice as good as 30fps and half as good as 120fps, because we’re playing interactive games not watching non-interactive cinema!

    • snowgim says:

      Yeah, I knew when I clicked this article that someone would bring up The Hobbit. That’s nothing to do with games.
      The biggest thing for me with games is the responsiveness. At 60fps you are seeing the response to your input TWICE as fast as at 30fps. I find it really hard to aim, or even move with precision, in most games if it drops too far below 60.
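
      Back-of-the-envelope, assuming a hypothetical (but typical) three-frame pipeline between input and display:

        \[ 3 \times \tfrac{1000}{30}\,\text{ms} \approx 100\,\text{ms} \qquad \text{vs.} \qquad 3 \times \tfrac{1000}{60}\,\text{ms} \approx 50\,\text{ms} \]

      Same pipeline, roughly double the input lag at 30fps.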

      Ubisoft translation: “We wanted to get 60fps, but that was too much like hard work, so we gave up and are trying desperately to convince you you don’t need it.”

      Hey, it worked for 720p vs 1080p… oh wait…

      • Zarf says:

        Yes, exactly! 30fps is more cinematic, and 24fps is even more cinematicer, but we aren’t watching movies at the cinema. We are playing interactive games, where any input lag or control sluggishness will rip apart our immersion. You don’t have that problem with movies because you don’t control anything in movies.

      • HadToLogin says:

        It works for 720 vs 1080 – ask any console player; they don’t consider 1080 to be a requirement. Nor do they find any use for 60fps.

        And they say that while buying 4K TVs…

  27. xfstef says:

    The whole industry is filled with lying bastards.

    Just tell the console users that their hardware is shit and that’s why they have to limit the games to 30FPS, and be done with it!

  28. Lacero says:

    I thought this whole thing was because they’re limiting the PS4 version to run at the same speed as the Xbox? Taking it as an actual 30fps-vs-60fps opinion seems very charitable.

    The first comment from them was something like “we’ve limited the PS4 version to the same as the Xbox to avoid debates”. Which, er, didn’t avoid a debate.

    There are still rumours of Microsoft not allowing games to release on Xbox if they have worse graphics than on PS4. And I think one company called their bluff a while back?

    But, eh, console wars aren’t really a thing for this website. I hope we get a good-looking version.

  29. Bladderfish says:

    This is actually a form of mobile truth: when companies invent a reason why *this* product is the best, when in fact there are numerous reasons why it fails compared to competitors’ products.

    Ubisoft should be ruled out of game purchases for this. They are liars.

  30. KalSkirata says:

    I love you guys! I learned so much just reading the comments and watching the links. I almost don’t feel bad that I didn’t work on my bachelor thesis.

  31. ssh83 says:

    This reminds me of the no-female-characters debacle with AC: Unity. Is Ubisoft now Ubi-lie?

    • Sam says:

      It’s really weird PR, because in both cases the truth isn’t actually that terrible.

      They don’t have playable female characters for the co-op because they want everyone to be playing as the main (male) character at all times. It would certainly be nice if the player could customise their character, but it’s generally accepted that many games have a fixed protagonist. Instead they blurble silliness about how difficult it is to animate women when they’d just released an AC game with a female protagonist.

      And now they’re speaking nonsense about cinematic feel, when the truth is they decided they’d rather have everything look a bit prettier than try to double the framerate. Which is a perfectly fair decision to make and I’m sure would be understood by players. I guess they’re not allowed to say anything to suggest the new consoles are not literally omnipotent.

      • HadToLogin says:

        AFAIK they don’t use a female character because they use the same set of models, clothing and anims for co-op players; they simply change the face structure and/or texture. And changing it all to work with a female model costs $.

  32. Lion Heart says:

    As long as it’s above 30fps I don’t care. 30fps is the minimum. 1080p+ and better visuals beat 60fps any day

  33. Person of Interest says:

    I don’t have a problem with a few Ubisoft employees admitting that they prefer 30fps to 60fps. Plenty of people I know felt uncomfortable watching the HFR Hobbit (not me: the more frames the better! Preferably frames of better movies, though). So it’s reasonable to expect some folks who work on console games for a living to feel uncomfortable playing 60 FPS content. Who am I to tell other people what they should prefer?

    Anyway, Alice already clearly explained the likely reason why the game will be 30 FPS. Unfortunately she left out the interviewees’ qualifying statements. TechRadar probably did selective editing of its own, so the original interview may have been largely forthright. A generous interpretation of, “It actually feels better for people when it’s at that 30fps. It also lets us push the limits of everything to the maximum.” would be that 30 FPS, highly detailed scenes feel better than 60 FPS, sparsely rendered scenes.

    Still, she does both us and Ubisoft a service by emphasizing the “30 FPS is okay” quotations. I’m glad to know I should avoid the game, since I much prefer smooth motion, and Ubisoft sees (from the volume of comments) that they should work to unleash the higher graphical capabilities of PCs.

  34. zaphod42 says:

    30 FPS actually DOES feel more cinematic… FOR LIVE ACTION MOVIES! You can tell with films like The Hobbit, which run at higher frame rates, that something is lost; it feels more real and as such less magical, less purposeful. The characters seem like they’re on a set, wearing costumes, instead of actually being in some fantasy world. You see more motion and detail, but that also somewhat reveals that things are fake.

    Thing is… THAT DOES NOT APPLY TO 3D. Be it CG movies or videogames, 3D renders are not real life, and as such are inherently fake. We already know that. But they’re stylized, and as such they don’t fall into the uncanny valley that live action at 60fps or higher does. In videogames you instead have a problem with trying to react quickly to the motion and action of the scene, so you really do want as high a frame rate as possible.

    Publishers saying that 30fps for games is more cinematic ARE BULLSHITTING to cover up the fact that the game is a shoddy console port, and consoles often just have to accept 30fps for technical reasons. But the PC absolutely does not need to accept 30fps, ever. They’re just covering their asses and they’re completely wrong.

  35. PopeRatzo says:

    Oh, and according to Ubisoft, they’re going with 8-bit sound because so many gamers love games with 8-bit sound.

    Anyway, this is just more proof that the companies that make games basically loathe their customers. They think we’re stupid and that we don’t deserve nice things.

    • Zaxwerks says:

      Absolutely – apparently they have also said they are going to pull all previous copies of AC from the shelves and re-release them only in 8-bit, 16-colour, 320×240 resolution, since they constantly read on forums, whenever discussions like this surface, that console owners think graphics are not important and it’s just about the gameplay.

    • Geebs says:

      UBIsoft obviously think that 60fps will allow the PC scum to pirate their games twice as fast

    • Shuck says:

      “Anyway, this is just more proof that the companies that make games basically loathe their customers. They think we’re stupid and that we don’t deserve nice things.”
      Or, you know, they develop console-centric games for the platform where they get 90%+ of their sales and would rather not completely re-engineer their product for the minority sales platform because it’s not economically practical (and having done so out of necessity then convince themselves that they did it for meaningful artistic reasons). But no, no doubt you’re right and developers are part of a loathing-filled conspiracy against the PC master race.

  36. SpacemanSpliff says:

    This article reminds me of when I installed Wing Commander One on a PC with a 486 processor. Uncontrollably fast spaceships is what you got.

    • TheBigBookOfTerror says:

      Oh yes, I was quite lucky that we had a turbo button on our PC; it certainly helped with that problem. Wish I had one for when I upgraded my card last year. Bioshock Infinite went from 24fps to something incredibly fast and I suddenly found it unplayable. I didn’t suffer that issue with any other game, but with that one it was really noticeable and I didn’t like it.

  37. w0bbl3r says:

    People who say the difference between 30FPS and 60FPS is so huge are the same graphics whores who will tell you that the HD texture pack for Shadow of Mordor needs 6GB VRAM because it looks so much better.
    There isn’t much difference that you can see, even though the difference is there. And the HD texture pack for Mordor (like many of these kinds of packs) makes little to no difference in graphics, but somehow still rapes your framerate.
    Anyone who says a game running at 30FPS is unplayable can’t have played many games recently, then. Almost every game release would be unplayable. But these people still manage to bitch about the latest CoD or rave about how great the latest Far Cry or Assassin’s Creed will be.
    30FPS is FINE. As long as it never drops BELOW that. Once you start to get below it, then you will start to notice stuttering. At 30 and above, you won’t notice whilst playing. Maybe if you record and watch it back and make a point of looking as hard as possible you might very, very rarely see the odd flicker, but whilst playing a game you would never ever see it.
    Like id said when they made Doom 3: once you start to get much over 30FPS you start to see the same image stacked again and again, so you aren’t SEEING any difference, even though there are extra frames there. So all it is doing is hogging resources needlessly.
    Sure, it’s nice when the option is there to allow the framerate to go as high as possible. Because the higher it is over 30, the further it has to fall to get BELOW 30. But as long as 30 is the absolute minimum, there is no issue, unless you want there to be. Much like a placebo effect, it’s all in the mind. If you know it’s below 60, and you believe it makes a huge difference, your mind will be convinced it’s a problem, and will even be convinced it sees the problem. It doesn’t.

  38. Scumbag says:

    Try to competitively play any Quake at 30fps.

    • Geebs says:

      Thank you! I was getting increasingly despondent that this discussion hadn’t once touched on Quake jumping physics yet. And these people call themselves PC gamers….

  39. JonClaw says:

    Games are not movies.

  40. Initialised says:

    So if you have a 120Hz screen and your game is running at a 30fps frame cap, your screen is going to display the same image four times before moving on to the next frame – and this is supposed to be a good thing that people won’t notice?

  41. MadTinkerer says:

    The real reason why Ubisoft are advocating 30 FPS is because all of the big publishers really, really want everyone to start streaming all of their games and are going to try to force everyone to make the switch. Which sounds insane until you realize that the reason why is because they hate piracy and filesharing that much. In short, forcing their games to run at 30 FPS is part of a perfectly valid method for realizing a suicidally insane point of view.

    • suibhne says:

      Great point. Another factor is that the big publishers are under lots of pressure (internally and externally) to give consumers a reason to buy into the new console generation, and static images sell that decision much more than framerate.

    • ffordesoon says:

      That makes a disturbing amount of sense.

  42. BryanTrysers says:

    I’d be interested (because I’m like that) to see how players’ perceptions are changed between 30fps and 60fps beyond the quality of the visuals. Increasing the frame rate of a visual stimulus can affect a viewer’s perception of velocity and duration. This has been suggested as a reason for some people feeling that The Hobbit @ 48fps was moving too quickly (at least initially) despite the physical timings being the same as for 24fps, and having the same soundtrack. Off the top of my head I don’t know of any research into this regarding gaming where there is more interaction and agency.

  43. Faxanadu says:

    It’s obvious I want my games to have 60 FPS instead of 30.

    The real question is, do I want my games to have 120 FPS instead of 60?

    That’s the step that requires consideration. Is 120 FPS worth the gear it takes and the power it consumes?

    (gear: 120hz screen, power: double tax on gpu etc)

    • dysomniak says:

      To my eye 90 fps is a bit better than 60, but 120 is more dubious. I still like to get my frame rate pegged at 120 if I can, so there’s plenty of room, but I’ve been playing Shadow of Mordor, which is capped at 100fps for some stupid reason, and it looks as smooth as anything.

  44. Dale Winton says:

    A couple of points. I’d want the frame rate of a game to match my monitor’s refresh rate. If it can’t match it you’ll get tearing (this is what G-Sync and FreeSync have been invented to eliminate).
    Frame rate drops: if you have a game that runs at 60fps with drops to the 40s, you’ll not notice it too much. However, if you have a game running at 30fps and it drops, well, you’re going to notice that.
    So basically it’s a load of nonsense what the Ubisoft boy said

    • Aninhumer says:

      That depends on whether you enable vsync.
      When it’s enabled, low framerates will manifest as dropped frames, which cause slight judder (since some frames will last twice as long as others) but no tearing.
      When it’s disabled, you’ll get partial frames, which cause the tearing.
      Personally I find the former much less distracting, so I always turn it on.

      • dysomniak says:

        Even without vsync I don’t notice any tearing below my monitor’s 120Hz refresh rate. And frame limiting keeps it from going over, so why ever turn vsync on?

        • Aninhumer says:

          If your computer can maintain the full framerate, then you won’t get any tearing, but that’s not going to be the case for most people.

  45. His Divine Shadow says:

    Movies run at 24fps simply because they found it’s the barest minimum for the brain to recognize it as motion, and in the early days they wanted to scrimp on film as much as possible. The idea that it looks in any way better is pure preconditioning.

    • phelix says:

      Exactly, you took the words out of my mouth. Any slower than 24 fps makes the brain see a series of images rather than one moving image.

  46. GameCat says:

    Meanwhile WiiU is getting more 1080p 60FPS games than so called next-gen consoles, lol.

  47. NothingFunny says:

    OMG not this again

    The whole problem is the control lag. It LOOKS fine at 30fps if you just WATCH, but it FEELS much worse when you PLAY.
    Especially when you look around quickly, it’s a world of difference between a steady 50-60 and a capped 30.

    The ‘cinematic’ argument is absurd. It would feel more musical with the screen off.

  48. ResonanceCascade says:

    It’s not a zero-sum game, people.

    Game development is a long chain of tradeoffs. The subjective “cinematic” look can be a valid variable in that equation — especially when you’re working within tight hardware constraints. The Last of Us on PS3 benefited from running at a lower framerate in order to get better animations and facial expressions.

    That said, The Last of Us is also a lot better in 60fps on the PS4. If there is enough breathing room for the higher framerate then it should be used. But it’s not necessarily essential for every single game.

    All of this is pretty irrelevant for a discussion about PC games though, because if you’re locking down the framerate on the PC, then ur doing it wrong. The whole point of the PC as an open platform is to allow all kinds of resolutions and refresh rates to fit the gamer’s particular hardware and preferences.

  49. buxcador says:

    Only one thing needs to be said:

    GAMES ARE NOT MOVIES.

    The more a game tries to be a movie, the worse it gets.

  50. Howl says:

    120fps or GTFO.

    It’s so noticeable that it’s worth investing in more GPU muscle, or lowering graphics settings, to achieve it. Most of my BF4 clan have 120-144Hz displays. Use a LightBoost display for a while and it’s something you just can’t downgrade from.