To me, these examples look like 24 fps smooths out the motion and 60 fps captures or accentuates the actor’s less steady movements.
The next Bourne movie will be shot on GoPros dragged behind the actors on a string.
I thought the Identity had more continuity.
Everything is wrong about the info in the headline and video.
The headline is using “Uncanny Valley” incorrectly. U.V. is about CGI humans or humanoid robots appearing just NOT right enough, causing unease in the viewer.
This vid, the whole thing causes unease: no robots or CGI humans involved.
But, yeah, as others have said, the vid is more an example of video smoothing, where the effect tends to make films look like the shot-on-video quality of 1980s daytime Soap Operas. Because I don’t think they can physically take a film that was shot at 24fps and legitimately morph it, via home gizmos, into 60fps. So we’re not seeing what the vid claims, nor what the headline says.
I’ve seen this effect before (it was particularly jarring when I watched Ghostbusters, a movie I’m intimately familiar with, on a HD TV for the first time), but looking at this video, I see literally no difference whatsoever.
That’s pretty much what I saw. Most of the scenes with little motion changed little, but the scenes with lots of motion had the sense of motion increased. The boat scene was particularly jarring: the 24fps version was almost calm, while the 60fps version quickly induced motion sickness. Possibly the interpolation was overcompensating for the global motion.
Others have commented that native high frame rate video doesn’t have this kind of effect, and I would disagree. My phone can capture and play back video at a few frame rates, and I’ve recorded the same scene at different frame rates before; the playback between the two feels quite different. So I’ll dispute the assertion that high frame rate footage won’t have the ‘soap opera’ quality and will just look sharper due to less motion blur.
I think the CGI scene showed this best. It was the only scene in which I couldn’t tell the difference between the 24fps and the 60fps versions. I would guess that’s because it wasn’t rendered with any simulated shutter ‘motion blur’. That’s also probably why the CGI effects, like Scorpion’s grabber, stood out more in the 60fps version: they didn’t really change like the rest of the frame did.
Another thing I noticed is that the scenes (other than the boat) where the sense of motion increased more were poorly lit scenes, which probably means the original low frame rate footage used a longer shutter time and therefore had more blur. The outdoor scene with Scorpion didn’t seem to show much difference to me. So a useful comparison might be a low frame rate video with the same shutter time as a higher frame rate video of the same scene. That highlights a difference between natural video and CGI: with natural video, you cannot have an exposure time greater than your frame time (generally it’s limited to a fraction of it, since you need to move the film between frames and wait for it to settle before the next exposure can begin). So there’s no way for a high frame rate video to have the same shutter time as a low frame rate video, if the latter already used as long a shutter time as it could.
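To put rough numbers on that (a back-of-the-envelope sketch only; it assumes the common 180-degree shutter for the 24fps original, and the actual shutter settings of these clips are unknown):

```python
# Hypothetical shutter-time comparison; real shutter angles for these clips are unknown.
fps_film, fps_hfr = 24, 60

max_exposure_film = 1 / fps_film              # ~41.7 ms, absolute ceiling at 24 fps
typical_exposure_film = 1 / (2 * fps_film)    # ~20.8 ms with a 180-degree shutter
max_exposure_hfr = 1 / fps_hfr                # ~16.7 ms, absolute ceiling at 60 fps

# Even the *maximum* possible 60 fps exposure is shorter than a typical
# 24 fps exposure, so true high-frame-rate capture can never reproduce
# the motion blur of the low-frame-rate original.
print(max_exposure_hfr < typical_exposure_film)   # True
```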
CGI doesn’t suffer from that limitation. Its shutter is virtual and can easily extend longer than a frame time. I watch this effect on my security cameras at night. Sure, I’m seeing 15fps video, but each displayed frame is actually calculated from the last N frames of images captured from the sensor. That’s done so that static parts of the scene will have less noise and higher detail. Motion in dimmer locations will look strange–people walking by look like something from a fever dream.
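For anyone curious what that frame accumulation looks like, here’s a minimal sketch of the idea: a plain running mean over the last N captures. Real camera firmware uses motion-adaptive weighting rather than a flat average, so treat this as illustration only.

```python
import numpy as np

def temporal_denoise(frames, n=8):
    """Build each displayed frame from the mean of the last n captured frames.

    Static regions average away their sensor noise and gain apparent detail;
    anything moving gets smeared across the whole window, which produces the
    ghostly 'fever dream' trails described above.
    """
    out = []
    for i in range(len(frames)):
        window = np.stack(frames[max(0, i - n + 1): i + 1])
        out.append(window.mean(axis=0).astype(frames[0].dtype))
    return out
```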
Summary: it’s hard to say what these clips are comparing, as we don’t know much about the original source material nor how the higher frame rate images were generated. A proper comparison would need to control for a large number of variables that a simple A/B test can’t account for.
this is so great; and the VHS-distortions on top are top notch. which software did you use, if i may ask?
In this case the VHS is actual VHS, digitized at 720p… because I’m a pedantic dork. It was cut at 4:3 720p and played into a deck using a Raspberry Pi.
Partly because it wasn’t “real” 60fps; it was the original 24fps source material interpolated to 60fps. Since 24 doesn’t divide evenly into 60, frames have to be added on an uneven cadence, so it will look janky. This is why all decent modern TVs refresh at 120 or 240Hz natively: standard 24, 30, and 60 fps formats all divide cleanly into those rates.
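A quick illustration of that cadence problem (hypothetical numbers, assuming simple frame repetition rather than motion interpolation):

```python
# Why 24 fps on a 60 Hz display judders: each refresh shows whichever
# source frame is "current" at that instant.
src_fps, display_hz = 24, 60

shown = [int(i * src_fps / display_hz) for i in range(10)]
print(shown)   # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]

# Source frames are held for 3, 2, 3, 2, ... refreshes (the uneven 2:3 cadence),
# so motion advances in lurches. At 120 Hz every source frame is held exactly
# 5 refreshes, which is why integer-multiple panels avoid the problem.
```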
I call that true commitment. what did you use for the excellent motion interpolation?
At the time I used butterflow, but if I were to do it today I might check into newer tools, since I think this stuff has improved since then.
Am I the only one who doesn’t get 60fps? I don’t see a difference in this video or any others.
I don’t think it’s a physiological thing. HD blew my mind when I first saw it on a TV in Harrods (before such technology was affordable to us normal people). I love 3D movies (and also sometimes feel like I’m the only one who does). But 60fps? It looks exactly the same to me as 24fps.
I don’t know what to tell you, the difference is enormous and makes the 60FPS version pretty much unwatchable to me. Even The Hobbit, with no frame interpolation, had the same effect on me. It just looks cheap (apart from The Hobbit being a terrible film regardless of its frame rate). But then again, I often can’t tell much of a difference between different quality audio setups, so I guess sensory perception is just that different between people.
Thanks!
Was just about to go on a technical rant about FPS and conversion…
One thing I would add to the comments is that 24 fps was a momentous step in film technology, matching the human eye’s refresh rate/persistence of vision; earlier films were called “The Flicks” because lower frame rates, e.g. 8 or 12 fps, would make the motion ‘flicker’ to the eye.
The benefit of the 120Hz-and-above refresh rates of modern progressive monitors becomes obvious when you lower the rate and see your cursor start to flicker, i.e. motion without any motion blur.
I saw someone do this with some Ray Harryhausen stuff. Some of the smoothing made things look wrong because they smoothed the movement for lumbering or jerky creations. I do not expect a skeleton to move smoothly.
I think it was
While I’m indifferent to it for movie or TV sources, I highly prefer it for video games.
There are two things here: interpolation, and what we in the trade call temporal-spatial disparity.
Intelligent interpolation synthesizes data from what is there. From what I have seen, the effects of this are mixed: sometimes it looks better, sometimes it looks weird. This is not surprising as the algorithms are effectively creating data based on a ‘best guess’.
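To give a sense of what that ‘best guess’ amounts to in its simplest form, here is a toy sketch using OpenCV’s Farneback optical flow to synthesize a halfway frame. This is only an illustration of the principle; the motion engines in TVs and professional interpolators handle occlusion, bidirectional flow and blending, which this does not.

```python
import cv2
import numpy as np

def midpoint_frame(prev_bgr, next_bgr):
    """Naive motion-compensated 'in-between' frame, halfway between two frames.

    Estimate dense optical flow from prev to next, then warp prev halfway
    along the estimated motion. Where the flow guess is wrong, the result
    looks weird, which is exactly the mixed quality described above.
    """
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Pull pixels from prev along half of the estimated motion vector.
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_bgr, map_x, map_y, cv2.INTER_LINEAR)
```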
Incidentally, this also addresses the bigger industry problem: temporal-spatial disparity. In short: the pixel resolution of screens/displays has massively increased but the frame rate has not. In fast-paced action movies, this can mean a character is only on screen for a couple of frames at 24fps, which is a strain on the eyes and brain. Peter Jackson and Park Road Post tried to address this in The Hobbit movies by running at twice the frame rate, 48fps, but he also found that this ‘broke’ the cinematic suspension of disbelief, as the images looked much more televisual and it also highlighted any special effects that were being used.
At the bigger industry trade shows in Vegas (NAB) and Amsterdam (IBC) you can see ‘8K’ high-fps monitors that look so realistic it is pretty much like looking through a window, but, other than in Japan, there isn’t much interest in this worldwide yet.
To state a truism: you can’t get something for nothing.
This topic was automatically closed after 5 days. New replies are no longer allowed.