There is hope for consumer-level stuff as well. The emergence of the 4k UHDTV format and its increasing penetration into consumer-grade hardware is bringing chips (ADCs, data-path chipsets…) capable of dealing with pretty high bitrates, which can be quite reusable for high-fps, lower-resolution image acquisition.
For example, 4k (3840x2160) at 60 fps with three 8-bit channels comes to about 1.4 gigabytes per second (11.1 gigabits per second, without overheads). That equals 19,440 fps of monochrome 320x240, or 199,000 fps of 100x75. All on consumer-grade chips. Just (“just”, there will be an ant-hill of bugs to solve) swap the sensors. (Thought… would the 4k sensors work at that speed if only a small part of the frame is used in the readout?)
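The bandwidth arithmetic can be sketched quickly (nominal figures, ignoring blanking and protocol overhead):

```python
# Sketch of the raw-data-rate arithmetic above. Figures are nominal;
# real links carry blanking and protocol overhead on top of this.
def raw_rate_bytes(w, h, fps, channels=3, bytes_per_channel=1):
    """Raw pixel data rate in bytes per second."""
    return w * h * channels * bytes_per_channel * fps

uhd = raw_rate_bytes(3840, 2160, 60)                 # 4k UHD, 3x 8-bit channels
print(uhd / 2**30)                                   # ~1.39 GiB/s
print(uhd * 8 / 2**30)                               # ~11.1 Gibit/s

# The same data rate repurposed for small monochrome frames:
print(uhd // raw_rate_bytes(320, 240, 1, channels=1))   # 19440 fps
print(uhd // raw_rate_bytes(100, 75, 1, channels=1))    # ~199065 fps
```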
Edit: Then there is the possibility of overclocking the chips. For short bursts of activity, as opposed to sustained runs, the heat issues may not be prohibitive even if the added heatsinks are only rudimentary. Cooling the apparatus down in advance (Peltiers on the chips) can also help, as some additional energy can then be sunk into the thermal mass of the chip and its carrier. (Over short periods there is not enough time for the heat to travel all the way to the heatsink, so it is better to cool the whole chip down in advance, giving us more joules of waste-heat budget before the chip hits its maximum allowed temperature.) There are of course more issues in overclocking, so heat production may not actually be the biggest concern.
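A rough back-of-envelope for the pre-cooling idea. Every number here (die mass, specific heat, temperature limits) is an illustrative assumption, not a datasheet value:

```python
# Back-of-envelope: extra waste-heat budget gained by pre-cooling a chip.
# All numbers below are illustrative assumptions, not datasheet values.
C_SILICON = 0.7       # J/(g*K), approximate specific heat of silicon
DIE_MASS_G = 2.0      # assumed mass of the die + carrier that heats up fast
T_MAX = 85.0          # assumed maximum allowed die temperature, degC

def headroom_joules(t_start, t_max=T_MAX, mass=DIE_MASS_G, c=C_SILICON):
    """Energy the thermal mass can absorb before reaching t_max."""
    return mass * c * (t_max - t_start)

# Pre-cooling from 25 degC to -10 degC (Peltier) buys mass*c*35 K extra:
extra = headroom_joules(-10.0) - headroom_joules(25.0)
print(extra)    # 49 J of additional burst budget under these assumptions
```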
And there is the lighting. High-fps needs LOTS of light. LOTS AND LOTS of light. That by itself can overheat the target being shot. (What about, for short times, using e.g. a flashlamp with a sufficiently long glow? One pulse spread over the duration of the shot, and the energy delivered is more limited than with a lamp that has to be kept running hot.)
I’ve worked as a Phantom operator for about 6 years now. You aren’t kidding about the amount of light you need. In addition to the amount, there is also the consideration that anything shot over 120 fps actually sees 2 full cycles of AC current, which manifests as an exposure flicker. You see it a lot when they use them in sports. One studio I work at regularly had a 400A AC/DC inverter built to run the lights on DC to counteract it. Sometimes we still need more light! My favorite setup is when they trot out a xenon spotlight to simulate sunlight indoors at 500 fps.
The flicker sucks; it can be seen even with conventional light bulbs when a 30 fps camera is used in a place with 50 Hz power, as a low-frequency beat between the two frequencies; very noticeable. I encountered that, and if I have to deal with it again I will build a small adapter with a diode bridge and a capacitor to feed the lamp.
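The beat can be estimated by folding the lamp’s 100 Hz ripple (twice the 50 Hz mains, since the bulb brightens on both half-waves) against the frame rate; a small sketch:

```python
# Sketch: the visible flicker is the lamp ripple (2x the mains frequency)
# aliased by the camera's sampling rate down to a low "beat" frequency.
def alias_freq(signal_hz, sample_hz):
    """Apparent frequency of a signal after sampling, folded into [0, sample_hz/2]."""
    f = signal_hz % sample_hz
    return min(f, sample_hz - f)

print(alias_freq(2 * 50, 30))   # 50 Hz mains, 30 fps camera -> 10 Hz beat
print(alias_freq(2 * 60, 30))   # 60 Hz mains, 30 fps camera -> 0 Hz (no beat)
```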
How did you handle the amount of heat from the lights?
(Could the flicker be attenuated in postprocessing? Make a brightness-adjustment curve with the same frequency?)
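One naive way the brightness-adjustment-curve idea could look in post, sketched with NumPy: estimate each frame’s mean brightness, smooth it over a sliding window, and divide the flicker component out. This is only an assumed sketch of such a filter, not any studio’s actual deflicker tool:

```python
import numpy as np

# Naive global-gain deflicker sketch: slow brightness changes (the smoothed
# curve) are kept, while fast frame-to-frame flicker is divided out.
def deflicker(frames, window=15):
    """frames: float array of shape (n, h, w). Returns a gain-corrected copy."""
    means = frames.mean(axis=(1, 2))                  # per-frame brightness
    kernel = np.ones(window) / window
    smooth = np.convolve(means, kernel, mode="same")  # local average level
    gains = smooth / means                            # per-frame correction
    return frames * gains[:, None, None]
```

A real tool would at least handle the window’s edge effects and work per-region rather than with one global gain, but the principle is the same.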
What about using a laser? E.g. a multiwavelength “white light” argon-krypton one, as used on some laser shows?
I mainly work in “tabletop” commercial studios and the lights are all on dimmers or in the case of DC on a switch. When we’re ready to do the shot the lights are brought up. I’m watching a waveform monitor and when I see they are at the set level I call “speed” and the action happens and the lights are brought back down. Back in the days of shooting actual high speed film we did have a custom 30,000W fixture that if left on for more than 30 secs would catch on fire!
One other note: we rarely shoot past 750 fps. One second of 1000 fps has a real-time playback of over 30 secs, so for each shot we do, the lights are rarely fully up for more than 10 secs per take.
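The playback arithmetic behind that note, for reference:

```python
# One second of capture at 1000 fps, played back at conventional rates.
def playback_seconds(capture_fps, shot_seconds, playback_fps):
    """Real-time duration of a slow-motion shot on playback."""
    return capture_fps * shot_seconds / playback_fps

print(playback_seconds(1000, 1, 30))   # ~33.3 s at 30 fps playback
print(playback_seconds(1000, 1, 24))   # ~41.7 s at film's 24 fps
```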
I know there are certainly post options for removing the flicker, but most directors don’t feel comfortable giving a client a bunch of footage they will need to “fix”, so we do a lot to try to minimize or eliminate it. According to colorists I’ve talked to, though, the post options are actually pretty effective, and on broadcast FOX seems to be getting better at eliminating it through some sort of real-time solution, though I have no guess as to how.
It is possible to run the camera in a PIV or pulse mode for working with strobes and/or lasers as you mention, but I believe this is generally used more in science and materials analysis, where image and light quality is less of a concern. Not to say you couldn’t achieve great results that way, just that most studios have a bunch of lights already and would rather deal with the “devil they know”.
Here is a slo-mo fire video we shot on the Phantom 4k. No lights used; in setting sun and darkness, the sole source of light was the fires themselves… The other slow-motion videos on the playlist were shot on the IDT Y5 Hdiablo.
…while I still can reply… BLOODY COOL! Slo-mo fire is one of the best kinds.