Tom Cruise is right: motion-smoothing sucks


Kinda using this as an opportunity to write this down, so sorry for the length. Here’s a little technical history.

Film is 24 Fps probably because, when sound came in, that was roughly the slowest speed that still gave acceptable optical-soundtrack quality while keeping film stock costs down.

European PAL and American NTSC Standard Def TV were 25 and 30 frames per second (Fps) respectively because TVs were synched to the power companies’ 50 and 60Hz AC. They were also interlaced (two fields per frame), a workaround for the bandwidth limitations of the era.

Traditional animation is often effectively 12 Fps (each drawing shot twice, “on twos”) because it’s half the drawing work.

There were perceptual/psychological factors taken into consideration in the specs of that time. For example, TV did not carry 16 million colors because there wasn’t the bandwidth. Blue made up only about 11% of the NTSC luminance signal because the eye/brain does not see detail as well in blue as it does in green. The eye/brain also sees 30Fps as smoother than 24Fps.
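The 11% figure corresponds to the standard luma weighting. A minimal sketch, assuming the Rec. 601/NTSC coefficients (0.299, 0.587, 0.114):

```python
# NTSC/Rec. 601 luma weights: the brightness (Y) signal is a weighted
# sum of R, G, B, with blue contributing only ~11% -- roughly matching
# the eye's lower acuity for detail in blue.
WEIGHTS = {"R": 0.299, "G": 0.587, "B": 0.114}

def luma(r, g, b):
    """Compute the luma (Y) component from RGB values in the 0-1 range."""
    return WEIGHTS["R"] * r + WEIGHTS["G"] * g + WEIGHTS["B"] * b

print(f"blue share of luma: {WEIGHTS['B']:.0%}")  # 11%
print(f"pure white: Y = {luma(1, 1, 1):.3f}")     # weights sum to 1.000
```

The green-heavy weighting is why chroma could be squeezed into far less bandwidth than luminance without most viewers noticing.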

Computer monitors then developed higher refresh rates over time because that reduced “flicker” and eye strain (lower refresh rate = more flicker on a CRT). Graphics cards also did not have to adhere to TV standards.

All the HD formats, from early 720/60Fps P (progressive) to 1080/60Fps P, used the extra bandwidth available in more modern technology to go to higher frame rates, a wider color gamut (more colors), and more lines of resolution/pixels, both horizontal and vertical.

So the trend in technology has been ever more resolution, higher frame rates and more color. This trend continues with 4K and 8K which ups all those things again.

“Interpolation,” which is what we’re talking about with this “effect,” has actually been around for a while in various forms. Initially it was used to convert every movie (24Fps) that you’ve seen on SD TV to 30Fps. Nowadays it’s a clever algorithm that looks at adjacent frames and creates what it thinks should be the frame that exists between them (or many frames: the 60-120-240, etc. settings on your fancy TV). The idea is to make low-frame-rate original material come closer to the way we perceive motion: the higher the frame rate, the more closely the image appears to be continuous, the way the eye/brain sees (i.e. the eye does not see in frames). Given that gargantuan number-crunching task, these algorithms do a remarkable job, albeit sometimes with “artifacts.”
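At its crudest, creating a frame between two adjacent frames is just a per-pixel blend. This toy sketch (a hypothetical `blend_interpolate` helper operating on 1-D grayscale “frames”) shows the idea; real TVs use far more elaborate motion-compensated algorithms, which is where the artifacts come from:

```python
def blend_interpolate(frame_a, frame_b, t):
    """Crude frame interpolation: per-pixel linear blend of two frames.
    t=0.5 gives the 'halfway' frame a frame-rate doubler would insert.
    (Real motion smoothing estimates motion vectors instead of blending.)"""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Two tiny 1-D "frames" (rows of grayscale pixels) for illustration:
f1 = [0.0, 0.2, 0.8]
f2 = [0.2, 0.6, 0.4]
mid = blend_interpolate(f1, f2, 0.5)
print([round(p, 3) for p in mid])  # [0.1, 0.4, 0.6]
```

A blend like this produces ghosting on moving edges, which is exactly why commercial implementations went to motion estimation, and why their failures show up as smearing artifacts.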

Those are the technical reasons why there is this reach for ever higher frame rates. However, there are psychological/perceptual effects accompanying these changes. For me, 30Fps looks “live” and 24Fps looks like it’s in the past; maybe because 30Fps is closer to the way the eye sees, but also because all the live TV you’ve ever seen (including soap operas) is at least 30Fps. I can understand why seeing a movie that your brain expects at 24Fps run at 30 or 60Fps would seem unnatural. I also think the 24Fps that filmmakers prefer differentiated their “serious” work from “TV,” which until relatively recently was mostly devoid of high-quality storytelling. Likewise, Japanese public TV giant NHK likes super ultra mega high def with bazillions of colors because beauty is a core value of the culture. Its demos are usually flowers and natural beauty shots, and NHK has long been a leader in research and development toward higher and higher def.

Preference can also depend on content. The live Super Bowl looks fabulous at high frame rates; Citizen Kane (and apparently Hobbits) not so much. In fact, The Evil Comcast and others don’t compress the Superb Owl (compression is another story) the way they do reruns of The Brady Bunch, so for a brief few hours you actually see what real HD can look like over cable. (FYI, over-the-air broadcast is virtually uncompressed all the time; get an antenna and see!) Sports has long led the way for TV technology advancement and innovation because people want to feel like they are at the game. In fact, many games are shot in 4K and downconverted to HD, and some are already shot in 8K.

My conclusion is that one way is not necessarily “better” than another for creative purposes. 24P will probably long remain the preferred way to produce and watch movies for the simple fact that the lower frame rate imparts a quality that helps the storytelling in ways we may not fully understand.


That is not technically accurate. NTSC is not interpolated into a progressive picture; in the telecine process, fields from two adjacent film frames are interlaced to create every fifth video frame, converting 24 to 30 fps. The overall video stream can also be interlaced, but most DVDs follow a 3:2 pattern: out of every five frames, 3 are progressive and 2 are interlaced.
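The 3:2 cadence can be sketched as follows. This is a simplified model (`pulldown_32` is a hypothetical helper; real telecine works on interlaced fields with timing details omitted here), but it shows where the two mixed frames per group of five come from:

```python
def pulldown_32(film_frames):
    """3:2 pulldown sketch: 4 film frames -> 10 fields -> 5 video frames.
    Each film frame contributes fields in the repeating pattern 2,3,2,3;
    consecutive fields are then paired into video frames. Frames whose two
    fields come from different film frames are the 'interlaced' ones."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)
    # Pair consecutive fields into video frames.
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]

video = pulldown_32(["A", "B", "C", "D"])
print(video)
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
# -> 3 clean frames and 2 mixed frames per group of 5
```

Inverse telecine (IVTC) is the reverse operation: detect this cadence and reassemble the original 24 progressive frames, which is why it falls apart so badly when the cadence is broken by edits.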

About the only thing I hate more than “smooth” settings and vertical video is shittily done IVTC (inverse telecine), especially on YouTube.


The one thing missing from this discussion is shutter speed. I know in the world of iPhone video people have no idea of how real cameras work, but frame rate doesn’t exist in a bubble.

Just because you are recording at 60fps doesn’t mean it has to look dramatically different. You can shoot film at 24fps and open the aperture wider while using a faster shutter speed to eliminate motion blur (a movie like Crank, for example). Conversely, you can go the other way, modifying shutter speed, aperture, and frame rate to create a softer effect. A point is certainly reached where the exposure time cannot be lengthened beyond the frame interval, but I don’t think 60fps makes that unrealistic.
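The frame-rate limit on exposure works out simply. Using the cinema shutter-angle convention, exposure per frame is (angle/360)/fps, capped at 1/fps when the shutter is open for the whole frame (a sketch; `exposure_time` is a hypothetical helper):

```python
def exposure_time(fps, shutter_angle=180.0):
    """Exposure per frame for a given shutter angle.
    180 degrees (the classic film look) exposes each frame for half the
    frame interval; 360 degrees is the physical maximum of 1/fps."""
    return (shutter_angle / 360.0) / fps

print(f"24 fps @ 180 deg: 1/{1 / exposure_time(24):.0f} s")       # 1/48 s
print(f"60 fps @ 180 deg: 1/{1 / exposure_time(60):.0f} s")       # 1/120 s
print(f"60 fps @ 360 deg: 1/{1 / exposure_time(60, 360):.0f} s")  # 1/60 s (frame-rate limit)
```

So 60fps only constrains how long you can expose, not how short: you can always use a faster shutter (and open the aperture to compensate), which is how the same frame rate can look either crisp or soft.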

Old games exhibit similar things when played on systems a decade newer. A game like Descent, in all of its 320x240 pixelated glory, looks fluid; it is almost like you are seeing the computational errors in the video output, a sort of mathematical noise that causes the video to shimmer.

I think history, the lived experience of “that’s the way it has always been,” only partially explains why it looks good. At the end of the day most people are going to agree that non-smoothed 24 or 30p looks more realistic. But sometimes it is not about realism; I agree that sports look great at 60p with a quicker shutter speed. Still, creating data where it doesn’t exist (as TV smoothing does), especially on the fly in real time, leads to mediocre outcomes.

As a side note, I use a lot of Avisynth filtering and effects. You can be well below playable framerates on the best hardware around once you start throwing temporal and spatial filtering at a video stream. The forums cover a lot of the technical details if anyone is interested.


Nice description. I was already in the weeds, so I avoided going deeper into 3:2 pulldown. (Nitpicking, sorry: I didn’t actually say interpolation produced a progressive scan.)


This. There’s nothing inherently wrong with higher frame rates - it’s just that we’re so accustomed to seeing movies at 24fps that anything else looks “wrong”. Higher frame rates suit things like action films and sports really well.

Almost every modern TV runs at 60 or 120Hz (fixed), so some transformation has to happen. 120Hz is better for film because 24 divides evenly into 120: each frame is simply shown five times, without having to add extra frames at uneven intervals and cause judder. This frame repetition is not motion smoothing, as it doesn’t alter the presentation (in fact it ensures better handling of the source material).
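The arithmetic behind the 120Hz advantage: 120/24 = 5 exactly, while 60/24 = 2.5 forces an alternating repeat pattern. A sketch (`cadence` is a hypothetical helper, simplified to two-value patterns):

```python
import math

def cadence(source_fps, panel_hz):
    """Repeat counts for each source frame on a fixed-rate panel.
    An integer ratio gives an even cadence; a fractional ratio forces
    alternating repeat counts -- the judder of 24 fps on a 60 Hz panel."""
    ratio = panel_hz / source_fps
    if ratio.is_integer():
        return [int(ratio)]                       # even: every frame held equally long
    return [math.ceil(ratio), math.floor(ratio)]  # alternating pattern (simplified)

print(cadence(24, 120))  # [5]    -> each film frame held 5 refreshes, no judder
print(cadence(24, 60))   # [3, 2] -> uneven 3:2 cadence
print(cadence(30, 60))   # [2]    -> video-rate material also maps evenly
```

The uneven hold times at 60Hz are why slow pans on 24fps film can stutter there, and why a 120Hz panel can present film correctly without interpolating anything.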

Motion smoothing, like most artificial processing in TVs these days, just sucks. Some people seem to like it, though; they would rather things look slick, oversaturated, and blown out than the way they were intended.


This is why most TVs have a “game mode” setting - it turns off pretty much all unnecessary image processing to minimize input lag. When you have all that crap turned on it can add ~100-200ms of extra delay (on top of the delays already inherent to LCD technology) which is definitely perceptible to humans and will wreck gameplay for anything that requires quick response.


This topic was automatically closed after 5 days. New replies are no longer allowed.