Do cinematic video games need judder?

Just a thought. I’ve been learning a lot about television technology lately, and one of the tricky things about it is the difference between video and film. It’s generally well known that movies and film are at 24 fps, supposedly because that’s the frame rate at which we can’t distinguish it from real motion. (That’s bullshit by the way, and has nearly nothing to do with why film is at 24 fps.) Video, on the other hand, is run at 25/50 fps (PAL interlaced or progressive) or 30/60 fps (NTSC interlaced or progressive). This means that video has a distinctly different look from film, and film never ends up looking quite right on normal televisions. The introduction of 120 Hz LCD TVs on the market is partly intended to combat this problem and show film sources at their true frame rate.
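To put actual numbers on that mismatch, here’s a quick throwaway calculation (mine, not from any TV spec) of how a 24 fps source has to be spread across a 60 Hz refresh versus a 120 Hz one:

```cpp
// Back-of-the-envelope illustration (my own, not from any spec): how many
// display refreshes each 24 fps film frame has to occupy on a 60 Hz panel
// versus a 120 Hz panel.
#include <cstdio>

int main() {
    const int film_fps = 24;
    const int refresh_rates[] = {60, 120};

    for (int refresh : refresh_rates) {
        std::printf("%d Hz display, %d fps film:\n", refresh, film_fps);
        int consumed = 0;  // refreshes used so far
        for (int frame = 0; frame < 4; ++frame) {
            // Hold each film frame until its ideal end time, measured in refreshes.
            int end = (frame + 1) * refresh / film_fps;
            std::printf("  film frame %d shown for %d refreshes\n", frame, end - consumed);
            consumed = end;
        }
        std::printf("\n");
    }
    // Output: 60 Hz gives the uneven 2, 3, 2, 3 cadence (pulldown judder);
    // 120 Hz gives an even 5, 5, 5, 5, so the film's motion survives intact.
    return 0;
}
```

The 2, 3, 2, 3 pattern on a 60 Hz display is the pulldown cadence I get into below; at 120 Hz every film frame gets exactly five refreshes, so the original motion is preserved.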

However, as far as I can tell (and I haven’t looked very hard), no one in video games has thought to apply this in reverse. Film people have, and movies like Avatar have film grain and film judder (non-smooth motion) applied in post-production. We’ve been using film-grain post filters in video games for a little while now; Mass Effect and WET come to mind. I’m pretty sure they’re still running at 30/60 fps though, which isn’t right if you want to look like a movie. It’s half-hearted.

Now even though the consoles can configure their hardware to output 1080p/24 (1080 vertical progressive scan lines at 24 fps), I don’t know if it’s available (or permissible!) to run at that setting. So the thing to do if we want video games to look like movies is to render internally at 24 fps and perform a 2:3 pulldown telecine live. The altered cadence will provide the proper film feel. (Ironically, most higher-end televisions now have dedicated hardware to counteract this effect and restore the stream we started with.) That lets us approximate the desired 24 fps look while still outputting at 60 fps.

Those of you who are paying attention will have realized that this involves repeating frames in an uneven cadence, without interpolating them to the current physics state. Won’t it introduce jerkiness when things are moving around? Why yes, yes it will. That’s the whole point.
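For concreteness, here’s roughly what I’m picturing (a sketch only; the function names are stand-ins, not any real engine’s API): a 60 Hz present loop that simulates and renders at 24 fps, then holds each frame for two or three vsyncs in the pulldown cadence instead of interpolating.

```cpp
// Sketch of a 60 Hz present loop that simulates and renders at 24 fps, then
// re-presents each frame in the uneven 2:3 pulldown cadence. The three engine
// hooks below are hypothetical placeholders, not any particular engine's API.

static void UpdateSimulation(double dt) { /* advance game state by dt seconds */ }
static void RenderFrame()               { /* draw current state to the back buffer */ }
static void PresentFrame()              { /* flip at the next 60 Hz vsync */ }

void RunCinematicLoop(int total_film_frames) {
    const double kFilmDt = 1.0 / 24.0;  // one film frame of game time
    const int kCadence[2] = {2, 3};     // 2:3 pulldown: 24 fps * (2+3)/2 = 60 Hz

    for (int film_frame = 0; film_frame < total_film_frames; ++film_frame) {
        // Step the world exactly one film frame: no 60 Hz sub-steps, and no
        // interpolation toward the current physics state on repeated frames.
        UpdateSimulation(kFilmDt);
        RenderFrame();

        // Hold the rendered image for 2 or 3 vsyncs, alternating -- this uneven
        // repetition is exactly the judder that film playback has.
        const int repeats = kCadence[film_frame & 1];
        for (int i = 0; i < repeats; ++i) {
            PresentFrame();
        }
    }
}

int main() {
    RunCinematicLoop(24);  // one second of film time, presented over 60 vsyncs
    return 0;
}
```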

2 thoughts on “Do cinematic video games need judder?”

  1. We (Blizzard Cinematics) have always rendered at 24 fps, ostensibly for that reason. That said, it’ll be interesting to see where this all goes as digital cinemas catch on, and filmmakers start shooting at higher frame rates. James Cameron has been threatening to shoot at 48 fps for a while now (the tech has been there for some time). I’m guessing in the end, it’ll be a “vinyl vs. CD” thing…

    As a side note – try watching a film on one of those 120 Hz TVs that actually does frame interpolation (I think most do now?). It makes the film look like a cheap PBS video production.

    1. Yeah, I’ve been reading AVS about all the 120 Hz sets (where people are practically up in arms one way or the other), and I have one on the way. I’ve found them to be a little odd-looking too, but it seems like the store loops nowadays really suck for actually evaluating the TVs. I’ll find out firsthand — luckily the interpolation can be tweaked or disabled.
