Framerates Matter

A while ago I was reading up on Starblade, one of the first commercial polygon-based games. What really struck me about the game was just how smooth it was compared to its spiritual successor, Star Fox (granted, the above links are YouTube videos that don’t accurately reflect framerates, but the differences are still quite noticeable).

It’s an extreme case, but one that nicely demonstrates the importance of rendering speeds.

Despite having animations that consisted of only 2-3 frames, many classic games ran at 60fps. This granularity helped to smooth out movement, including Mario’s beloved jump.

Of course no one ever complains about games being too smooth, but the debate of 30fps vs. 60fps continues to pop up. What’s more, the 60fps side keeps losing ground, often to the argument that humans can’t really detect more than 30 frames in a single second.

And that is completely untrue.

It’s an inherently flawed statement, as humans are not digital machines. The human brain is always on, always receiving input. Light hits our eyes as a wave, and the information it carries is almost instantly transmitted to the visual cortex. The brain analyzes this data, focusing on changes brought on by motion and fluctuations in intensity. Displacement is interpolated via motion blur, and identical input is discarded to avoid extraneous processing.

The “decoded” image is further analyzed by various parts of the brain, but the overall process — as complex as it is — is quite fast and versatile. Much faster than 30fps. Faster than 60fps, in fact.

So where does the myth of 30fps come from? Well, film and TV for the most part, but the framerates of those media are not analogous to those of videogames. Others have written extensively about the topic, so I won’t go too deeply into it. What I’d like to talk about, though, is why high framerates are important to games.

The Unreal Tournament series has been known for letting its players set very high FPS benchmarks.

As a preface, different titles obviously have different requirements, and some suffer more from a low FPS than others. Also, the reasons for Insomniac’s decision to move away from their 60fps standard were completely understandable, if a little painful to accept.

With that said, here’s why I think high framerates are important:

1. Granularity

The framerate of a game is usually directly tied to the processing of its logic. As a result, action games that run at 30fps cannot have the same granularity of movement as those that run at 60fps. This might not matter much for turn-based strategy titles, but it makes an awful lot of shmups technically impossible to do at lower framerates.
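
As a rough illustration (a minimal lockstep loop with made-up numbers, not any particular engine): when logic advances once per rendered frame, halving the framerate doubles the size of every movement step.

    // Minimal lockstep loop sketch: the simulation advances exactly once per
    // rendered frame, so its granularity is whatever the framerate happens to be.
    // Speeds and durations are invented purely for illustration.
    #include <cstdio>

    int main() {
        const double bulletSpeed = 600.0;                // pixels per second
        for (double fps : {60.0, 30.0}) {
            const double dt = 1.0 / fps;                 // one logic step per frame
            double x = 0.0;
            for (int frame = 0; frame < static_cast<int>(fps); ++frame)
                x += bulletSpeed * dt;                   // 10 px steps at 60fps, 20 px at 30fps
            std::printf("%gfps: step = %.0f px, after 1s x = %.0f px\n",
                        fps, bulletSpeed / fps, x);
        }
    }

A bullet pattern tuned around 10-pixel steps simply has coarser collision and dodging windows when every step becomes 20 pixels.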

2. Input Lag

Games are an inherently interactive medium, and as such, response times to input need to be virtually instant. On the hardware side this is rarely an issue, but a stuttering framerate can increase response times and greatly detract from the overall experience (especially in “twitch” titles).
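
Some back-of-the-envelope arithmetic (not a measurement of any real game): if input is polled once per frame, which is the common case, a press that lands just after a poll waits roughly one full frame before the game even sees it, and that wait grows as the framerate drops.

    // Worst-case added input delay when input is polled once per frame:
    // a press arriving just after a poll waits until the next one.
    // Illustrative arithmetic only.
    #include <cstdio>

    int main() {
        for (double fps : {60.0, 30.0, 20.0}) {
            const double frameMs = 1000.0 / fps;
            std::printf("%5.1f fps: frame time %.1f ms, worst-case added delay ~%.1f ms\n",
                        fps, frameMs, frameMs);
        }
    }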

3. Consistency

30fps isn’t bad, but what most people fail to realize is that it’s often the “ceiling” measurement, i.e., the best-case scenario. Unlike TV and film, games are dynamic, and the processing required to render any given scene can fluctuate quite significantly. As a result, “30fps” games actually tend to run at anywhere from 20 to 30fps. This sort of inconsistency can be very difficult to avoid, but it’s much less noticeable with higher benchmarks.
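
A small sketch of why the “ceiling” number is misleading (the frame times below are invented, not captured from a real game): the average can still read as nearly 30fps while the slowest frames dip to 20fps, and those dips are what the player feels.

    // Invented frame-time samples (ms): mostly ~33 ms with a few heavy frames.
    // The average still reads as close to 30fps; the worst frames do not.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        const std::vector<double> frameMs = {33, 33, 33, 33, 40, 33, 33, 33, 50, 33};
        double total = 0.0;
        for (double t : frameMs) total += t;
        const double avg   = total / frameMs.size();
        const double worst = *std::max_element(frameMs.begin(), frameMs.end());
        std::printf("average: %.1f ms (~%.0f fps), worst: %.1f ms (~%.0f fps)\n",
                    avg, 1000.0 / avg, worst, 1000.0 / worst);
    }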

Motion blur at its finest.

4. Motion Blur

Motion blur is the biggest reason TV and film get away with lower framerates. The phenomenon relies on the human brain’s ability to stitch a series of blurred images into a single, smooth animation. Until fairly recently, games had no motion blurring at all, and even today it doesn’t have quite the same effect: post-process blurring is not always accurate, and in many cases it is purposely exaggerated to create a distinctive visual effect.
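
A toy sketch of the difference (illustrative only; real engines use velocity buffers or accumulation rather than this little loop): a camera integrates the moving object across the whole exposure, while an unblurred render samples it at a single instant, and a post-process pass can only smear that one sample after the fact.

    // Toy 1D contrast: integrate an object's position over the exposure
    // (camera-style blur) versus a single sharp sample per frame.
    // Numbers are invented; this is not how any production renderer works.
    #include <cstdio>

    int main() {
        const double speed     = 240.0;       // pixels per second
        const double frameTime = 1.0 / 30.0;  // one 30fps frame as the "exposure"
        const int    samples   = 8;

        double streakCenter = 0.0;
        for (int i = 0; i < samples; ++i)     // average the sub-frame positions
            streakCenter += speed * frameTime * (i + 0.5) / samples;
        streakCenter /= samples;

        std::printf("sharp sample at 0 px; blurred streak ~%.0f px long, centered at %.0f px\n",
                    speed * frameTime, streakCenter);
    }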

To properly account for all of these factors, a high framerate is a must. And when it’s there, it creates a certain synchronization between the player and the game: a smooth flow that more developers should strive to achieve.

36 Comments

  • Point 1 is very inaccurate. You’re making huge assumptions about videogame architecture, and while that is correct for many games, there are plenty where that is not correct at all. I would suggest that you stick to the perceptual qualities of framerate instead of making assumptions about game architectures.

  • I don’t understand how anyone can think that 30fps is “enough”. I can’t even understand how people think 24fps film is fine; I’d rather have 720p60 than 1080p24, and anyone who has seen 60p must admit that it’s much smoother.

    When I purchased my Nvidia 3D Vision goggles I also tried turning the 3D off and running at 120fps, and that’s extremely nice, but let’s first try to convince everybody that 50/60fps is the lowest acceptable rate, and then we can increase from there :P 24/30 is definitely not enough!

  • “Motion blur is the biggest reason TV and film get away with smaller framerates.”

    I have noticed that films designed around interlacing are extremely ugly when it comes to set design. Look at some of the Lucas arts stuff on an upscaling DVD player with a TV that can do 120-240 hertz. The film counted on interlacing to blur the ridiculously ugly sets. Look at it upscaled to 1080p, then switch the upscale setting to 1080i, and you will see the difference. With progressive scan running at 60Hz+, the motion becomes so clear that you notice the sh*ty cardboard set design.

  • Point 3 was on the right track, but wasn’t as complete as it could have been.

    A game renders frames at speeds that fluctuate constantly. Even when using a program like Fraps you will observe that the frames per second are constantly changing. The main difference between games and TV or film is that those other media run at a perfectly smooth framerate. In other words, the frames for TV and film are nearly perfectly and evenly spaced. In a game, however, you might get 30 frames during a given second, but the gaps between them might vary wildly, and in particular you may experience some significant inter-frame gaps. One or two large gaps per second can easily create the appearance of choppiness or lag. Unfortunately, v-sync can’t really help here. With v-sync you’ll get frames at even intervals, provided that the next frame can be rendered in time. Those same moments that induce stutter without v-sync will create a skipped frame with v-sync, and you’ll wait an additional cycle for the next frame to be rendered. Depending on the fluctuations, you may have quite a few frames that are rendered and discarded without ever being displayed.

    The greater frame rates are simply indicative of the ability of the video card to keep pace. With a combination of game engine, scene, and graphics hardware that can keep pace, the frequency or likelihood of large inter-frame gaps is diminished, and therefore the experience is much better.

    If a game ran at a constant and perfectly even 30fps with each inter-frame gap being equal, then it would be hard to argue for needing higher frame rates. However, games do not run at anything resembling constant and perfectly even fps. 60fps or more is usually good enough to minimize the problems.
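
    A rough sketch of what I mean by skipped frames under v-sync (assuming a 60Hz display; the render times are invented numbers): any frame that misses the 16.7ms refresh deadline is held on screen for at least one extra refresh.

        // With v-sync the image only flips on a refresh boundary, so a frame that
        // misses the 16.7 ms deadline is held for an extra refresh (or two).
        // Render times below are invented for illustration.
        #include <cmath>
        #include <cstdio>

        int main() {
            const double refreshMs = 1000.0 / 60.0;              // 60Hz display
            for (double renderMs : {12.0, 16.0, 18.0, 25.0, 34.0}) {
                const double shownForMs = std::ceil(renderMs / refreshMs) * refreshMs;
                std::printf("render %2.0f ms -> on screen %.1f ms (%.0f fps pace)\n",
                            renderMs, shownForMs, 1000.0 / shownForMs);
            }
        }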

  • @RichC –

    Your logic doesn’t hold up. If the game engine can’t keep up at 30fps, creating choppiness, then how is doubling the maximum possible framerate going to reduce the choppiness? You’ll still have a game engine that can’t keep up.

  • Even junior game developers understand that input and game logic are in no way tied to framerate.

    I guess you’re not a developer.

  • No, that’s what junior developers think, as they can’t differentiate between logical, in-code separations and actual hardware and process separations.

    I guess you’re not a developer.

  • Actually, items #1 and #2 are pretty far off the mark.

    “The framerate of a game is usually directly tied to the processing of its logic.” That really shouldn’t be the case. When I worked on top-end race sim games back in the mid-’90s, the very first mission statement was to have separate input, logic, and graphics threads so as to decouple these issues.

  • I used to play Falcon 3, an F-16 simulator. Newer/faster computers were always valuable because that war simulator had so much going on out of sight.

    That dogfight on the horizon had implications for the game as a whole. If you added more horsepower, the game would add AI to the distant battles, pumping out chaff you couldn’t even see. The ground war you never even saw would get more intense, sending A-10s off to bust tanks. Your wingman would fly better, and the radar-suppression attacks would pave the way for you. You could get winds aloft and cloud action, even weather patterns. Enemies who ignored you before would learn how to paint you with radar. I don’t think you could ever max out the game’s logic and hunger.

    It was certainly possible to have so much off-screen stuff going on that your in-cockpit experience suffered. When you were in a furball, you could see frame rates drop to 7-10 fps. Pulling Gs and having no feedback was very bad indeed.

    But you could bias the power. Make the AI dumber, throw away unseen detail, etc, to make the frame rate better. I thought that anything consistently above 15 fps was acceptable. We’d take screencaps and crow when our framerates got above 20.

    In that game, which I think is still actively played, people willingly traded framerates for more realism. Seeing an accurately shaded lens flare on a cockpit bubble or the billowing smoke of an AMRAAM launch is only a small part of realism.

    I hope this gets at @breakbread’s question.

  • 1) Game logic does not have to run at the same rate as visuals (they share an upper bound though). The game loop can be optimized by only performing certain logic updates a certain number of times each second. AI calculations might occur every 3rd frame, physics every 2, input every 1. Logic and visual updates share the same upper bound, but not everything on the logical end has to occur every frame. Optimizing in such a way can increase your upper bound, and more importantly, reduce fps variation.

    2) This 30 vs 60 fps argument is silly. Something unmoving but updating at 120 fps is no different than something unmoving that updates at 1fps. What is important is the smoothness of the transitions between what the eye records per some very tiny unit of time for the specific content it is viewing. 5 fps may be 100% adequate to achieve the desired visual experience for one piece of content, while 47fps is necessary for another.

    There is no cut and dry x frame rate is better than y frame rate argument.
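
    A bare-bones sketch of the staggering I described in point 1 (a toy loop with stub subsystems, not any real engine):

        // Toy loop that staggers subsystem updates: input every frame, physics
        // every 2nd frame, AI every 3rd. Illustrative only; real engines schedule
        // this more carefully (and often decouple logic from rendering entirely).
        #include <cstdio>

        void pollInput(int f)   { std::printf("frame %d: input\n", f); }
        void stepPhysics(int f) { std::printf("frame %d: physics\n", f); }
        void updateAI(int f)    { std::printf("frame %d: AI\n", f); }
        void renderFrame(int f) { std::printf("frame %d: render\n", f); }

        int main() {
            for (int frame = 0; frame < 6; ++frame) {
                pollInput(frame);                        // every frame
                if (frame % 2 == 0) stepPhysics(frame);  // every 2nd frame
                if (frame % 3 == 0) updateAI(frame);     // every 3rd frame
                renderFrame(frame);
            }
        }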

  • As someone who actually owns a Starblade arcade game, it is nice to see someone point out one of the aspects of the game that is probably taken most for granted. The polygon graphics, quadraphonic sound, flash strobes and rumbler seat, and big curved projection display usually get the attention.

    But the lack of framerate issues made all the difference between it being a solid game, and not a cheap one.

  • I think a lot of you missed the point.

    A higher frame rate means more accurate motion blur, so indeed it is cut and dry. X frame rate is better than Y frame rate, because X has better motion blur.

    Maybe frames per second and motion blur isn’t that big a deal with an image that was captured from a camera, but in a digitally rendered environment frames per second and motion blur is a really big deal. Ask Pixar.

    It’s like the difference between an Atari and a PS3.

  • If you want to see a game brutalize your system into low frame rates, try Crysis. Then play it on a system that can run it at acceptable frame rates and tell me you don’t notice the difference.

    Then again getting 60fps consistently on Crysis does require quite a bit of hardware if you don’t want to turn down all of the quality settings.

  • umm Dustin…

    “A higher frame rate means more accurate motion blur, so indeed it is cut and dry. X frame rate is better than Y frame rate, because X has better motion blur.

    Maybe frames per second and motion blur isn’t that big a deal with an image that was captured from a camera, but in a digitally rendered environment frames per second and motion blur is a really big deal. Ask Pixar.

    It’s like the difference between an Atari and a PS3.”

  • The reason TVs produce smooth motion is not the blurring effect, but the fact that they really run at 60 fields per second (NTSC) or 50 (PAL).
    But the effect is achieved only with interlaced TVs (CRTs).

    Motion on film is not smooth at all, shaky at best. That’s why cameramen make sure not to move the camera too much and use a lot of tricks to avoid “motion sickness”. On the other hand, film does use full frames (aka progressive frames), but only at 24fps.

    So motion blur is not exactly “the biggest reason TV and film get away with smaller framerates”.

    However, yes, to get good motion you need a good framerate.

    You can read a very good interview with James Cameron about it:
    http://www.variety.com/article/VR1117983864.html?categoryid=2868&cs=1

    It basically says: perceived resolution = pixels × replacement rate.
    And the magic number is known: it’s 48fps.

    The main problem is not only having the game/software logic send frames to the display fast enough, but also having a display that is actually able to show them!

    CRTs were able to do it through interlaced fields, but mainstream flat-panel technologies (plasma/LCD) basically can’t. OLED should do better in the near future.

    That said when there is no motion, 1 fps is just fine :-)

  • The real issue here isn’t framerates per se, but motion-delta framerates. How “fast” something is moving on the screen translates into a certain number of pixels of distance between the same object on successively rendered frames. If the update rate is high enough (and the resolution sufficient), or the motion is slow enough, then the number of pixels “jumped” by a moving on-screen object will be small enough that it becomes imperceptible as a jump: it looks smooth. But when the object is moving faster than the screen can update for the same delta in screen space, the jumps get bigger, and soon the motion becomes not only perceptible but noticeable and distracting.

    Motion blur only obscures this.

    The ‘framerate’ resolution of the eye is around 80-100Hz for a stationary object. Beyond that, changes that are faster (like a flashing LED) are no longer perceptible as changing, and appear to be continuous (if slightly dimmer than always-on).

    Moving objects are more complex, as they also involve the resolving power of the human eye (a maximum of around 1 arc-second). So if your monitor’s resolution (and your distance from it) makes 1 pixel around 1 arc-second wide/tall, then for NO noticeable motion artifacts your framerate needs to be fast enough that no object shifts more than 1 pixel in any direction per frame.

    Of course, you also have to concern yourself with LCD or CRT response times too, as most LCDs shift on/off much slower than the GtG times they like to advertise.

    So if your LCD has a 5ms GtG response time, its on/off response time is probably around 7-8ms, which means a maximum displayable frame rate of roughly 125-140fps. Anything higher than that is invisible (your monitor simply can’t display the changes that fast).

    There are a LOT of factors that interact when it comes to perception limits, and different people have slightly better (or worse) perceptual limits as well. So claiming that 30fps is better or worse than 60fps is somewhat nonsensical without more information on the type of motion, resolution, viewing distance, and on-screen speed involved for a particular game.
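
    To put rough numbers on the “pixels jumped per frame” idea from the start of this comment (invented speed and resolution):

        // Per-frame jump of an object crossing a 1920-pixel-wide screen in one
        // second, at a few framerates. Invented numbers, purely to show the scaling.
        #include <cstdio>

        int main() {
            const double speedPxPerSec = 1920.0;   // crosses the screen in one second
            for (double fps : {30.0, 60.0, 120.0})
                std::printf("%6.0f fps: %.0f px jump between frames\n",
                            fps, speedPxPerSec / fps);
        }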

  • It may just be a personal thing. Personally, I can see the difference between 100fps and 80fps, but apparently my brother cannot; he does notice the difference between 80fps and 50fps, though.

    It’s especially apparent when the frame rate is not stable.

  • Even 60fps isn’t enough for fast games. Around 85 is required if you want to be good at a fast game like Quake 3.

  • Anything less than 50/60Hz, unless done for artistic or traditional reasons (film, and film look on video), is garbage as far as I’m concerned. This concerns not only games but internet-based video as well. I don’t care if it’s progressive or interlaced, but if it’s not delivering new chunks of temporal data fast enough, it looks choppy and clunky. For example, sports or general live-action video should be shot and presented at 50/60fps. Film sometimes looks a little bit stuttery, but after watching countless films, my brain has adapted to the slower frame rate by now.

    Video games started out running at full frame rate (for the most part, at least in systems with a programmable vertical blank facility). More frames per second generally translate into more input points, since the vast majority of games poll for input once per frame. Some games aren’t even compatible with anything lower than 50/60fps – look at Guitar Hero and Rock Band. I have seen games from both series drop every other frame occasionally, and it makes it much harder to track the scrolling gems.

    This is my own opinion, of course, but it’s something I feel quite strongly about as someone who always strives for the best video quality possible (objectively, of course: actual observable detail, spatial and temporal resolution, and the amount of unintended noise/grain and compression artifacts).

  • Well-written article; very succinct and to the point. However, personally, I don’t care. 30fps is perfectly fine, and while I can tell the difference between 30 and 60 (and lower than that – I played my share of N64 back in the day), I’ve never understood why 60fps is such a big deal.

    Maybe it’s because I don’t play a lot of fighters and don’t care for many FPSs (the most mainstream, creatively bankrupt, and ironically popular genre there is), but 30fps – and 30fps LOCKED, which is often the case these days – is more than enough for smooth presentation and responsive input. Techies love to bring up input, but I’ve never encountered an error in a game due to me hitting a button and it not registering fast enough. I suppose if you bought a computer to play something like UT3 then you care about that kind of stuff, but how much is the internet connection to blame in that example, rather than the rendering code of the game?

    Framerates ARE important, but we’re at a pretty ideal place right now. How many games, honestly, ran at around 20fps in 2009? Seriously? 30fps is fine; 60 is great, but it shouldn’t be some kind of goal pursued at the expense of other content or polish. Just like Insomniac, who recently announced that they were going to stop pursuing 60fps in their games. It’s silly to think that it’s that much of a big deal over a rock-solid 30.

    That’s why the article was good; it wasn’t about that, it was just about why framerates are important. And that makes it a very successful article.

  • The term “framerate” in and of itself is too general. Modeling a control actuator system at anything less than 1kHz is foolish. Running a flight model at less than 400Hz makes it only an approximation (of an approximation, to really make your head hurt). Checking for human response at greater than 20-25Hz is a waste; human reaction time is not that good. Both the flight control and simulation industries learned long ago that the most efficient computing model separates functions into multiple framerates based on what they require. So, to be specific, having parts of a game (high-end games today being hard to distinguish from simulations) run at very high framerates is a good thing.

    The objective for digital systems interacting with the analog world is to determine how fast digital computation must run (the framerate) to “appear” continuous. The article correctly points out that humans are analog devices. So, how fast does a digital computer have to perform human-oriented tasks, such as responding to a keystroke or displaying a new scene, for the human to believe that it is continuous? Since the discussion is focused on vision, let’s look at that. Other posters point out that angular acuity is limited to about 1 arc-second. If an object moves less than 1 arc-second, the average human won’t perceive it as having moved. The motion picture and TV industries discovered that if they presented a sequence of images at 24-30Hz, most humans would perceive it as a continuous sequence with no apparent steps. This is due in part to saccadic masking and in part to neurologic processing. Admittedly this is a lower limit, chosen to save on media in the case of film and bandwidth in the case of TV. Even so, most people don’t complain even when presented with detail-rich scenes (check out Avatar). The Nyquist rate for the vast majority of humans is likely less than 60Hz, so that number might make better sense than 30Hz.

    Higher framerates and, more importantly, multithreading, are very valuable in gaming and simulation. But, using resources to present images humans will tend to discard is a waste. Better to use resources to improve the scene content to help the human viewer suspend disbelief.

    Theory is nice but empirical measurement paired with human reporting would be an interesting set of studies.

  • Sorry, but your first point there is on the borderline of being completely outright wrong.

    ——
    1). Granularity

    The framerate of a game is usually directly tied to the processing of its logic. As a result, action games that run at 30fps cannot have the same granularity of movement as those that run at 60fps. This might not matter much for turn-based strategy titles, but it makes an awful lot of shmups technically impossible to do at lower framerates.
    ——

    You saved yourself here by not saying that all games tie their logic directly to the fps. Most games separate rendering and logic.
    The logic (game updates, physics and such) is commonly run frame-independent, in such a way that even if the rendering slows to a crawl it will still perform okay.
    (If you don’t do this, your physics are bound to blow up once in a while, and that looks really bad.)

    For a simple introduction look at:
    http://gafferongames.com/game-physics/fix-your-timestep/
    (Note: this is an introduction and in no way the ultimate solution to all timestep problems; I’m also not in any way affiliated with gafferongames.com :) )
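
    For reference, here is a minimal fixed-timestep skeleton in the spirit of that article (a sketch only; the real article adds state interpolation and clamps the frame time):

        // Minimal fixed-timestep skeleton: rendering runs as fast as it can, while
        // the simulation always advances in constant 1/60 s steps.
        #include <chrono>

        void update(double /*dtSeconds*/) { /* advance the simulation by dtSeconds */ }
        void render()                     { /* draw the current state */ }

        int main() {
            using clock = std::chrono::steady_clock;
            const double dt = 1.0 / 60.0;      // fixed logic step
            double accumulator = 0.0;
            auto previous = clock::now();

            for (int i = 0; i < 1000; ++i) {   // stand-in for the real game loop
                auto now = clock::now();
                accumulator += std::chrono::duration<double>(now - previous).count();
                previous = now;

                while (accumulator >= dt) {    // catch the simulation up in fixed steps
                    update(dt);
                    accumulator -= dt;
                }
                render();
            }
        }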

  • In response to mawilson’s insightful comment, that research has been done:

    Mark Claypool, Kajal Claypool, and Feissal Damaa, “The effects of frame rate and resolution on users playing first person shooter games,” Proceedings of SPIE, Vol. SPIE-6071, pp. 1-11, 2006.

    The authors’ conclusion:

    “Contrary to previous results for streaming video, frame rate has a marked impact on both player performance and game enjoyment while resolution has little impact on performance and some impact on enjoyment. “

  • Framerates matter, and so does the integrity of comments threads… which is pretty much trashed by including tweetbacks and linkbacks and general link spam :/ Why do bloggers and their readers tolerate this?

  • The biological process by which humans see is well understood (though not by gamers) and, depending on the individual, peaks well below 100fps. So while there might be some justification for 60fps, people who buy hardware to do 200fps are kidding themselves that it is “obviously better” than 60.

    I got a 120Hz HDTV because it is better than 60, but virtually all content is at 59.94 (or lower) and doesn’t really benefit beyond whatever post-processing happens in the set.

  • Games like Unreal Tournament can achieve a high framerate because they are older games that do not challenge a modern processor or GPU. Newly designed games have to trade off framerate against graphics intensity and gameplay: the higher the framerate, the less can be going on on the screen.
